[OMPI users] Gromacs run in parallel

2010-06-05 Thread lauren
Sorry for my English...

I want to know how I can run Gromacs in parallel.
When I used

mdrun &
mpiexec -np 4 mdrun_mpi -v -deffnm em

to run the minimization on 4 cores, every core ran the same job again;
they do not run together.
I want them all to work on the job in parallel so it finishes faster.

What could be wrong?

Thanks a lot!


  

[OMPI users] Res: Gromacs run in parallel

2010-06-08 Thread lauren


The version of Gromacs is 4.0.7.
This is the first time I am using Gromacs, so excuse me if I am talking nonsense.

Which part of the md.log output should I post?
The part after or before the input description?

Thanks for everything,
and sorry.




From: Carsten Kutzner
To: Open MPI Users
Sent: Sunday, June 6, 2010 9:51:26
Subject: Re: [OMPI users] Gromacs run in parallel

Hi,

which version of Gromacs is this? Could you post the first lines of 
the md.log output file?

Carsten
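
A quick way to grab those lines (a sketch; since mdrun_mpi was run with -deffnm em, the log file is typically named em.log rather than the default md.log):

   head -n 25 em.log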




[OMPI users] Res: Res: Gromacs run in parallel

2010-06-08 Thread lauren
Oh! OK.
So I installed MPI on a server with 4 nodes.
Do I have to put one process on each?
How do I do that?
What is the first step in this case, when I want to run 1 job on 4 nodes (the same server)?
Because all of them are doing the same job again.

Sorry for all the questions...




From: Jeff Squyres
To: Open MPI Users
Sent: Tuesday, June 8, 2010 10:06:25
Subject: Re: [OMPI users] Res: Gromacs run in parallel

I know nothing about Gromacs, but you might want to ensure that your Gromacs 
was compiled with Open MPI.  A common symptom of "mpirun -np 4 
my_mpi_application" running 4 1-process MPI jobs (instead of 1 4-process MPI 
job) is that you compiled my_mpi_application with one MPI implementation, but 
then used the mpirun from a different MPI implementation.
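
One way to check for such a mismatch (a sketch; it assumes mdrun_mpi is dynamically linked and that an Open MPI mpirun is on the PATH):

   which mpirun mdrun_mpi
   mpirun --version                       # Open MPI prints something like "mpirun (Open MPI) 1.4.x"
   ldd `which mdrun_mpi` | grep -i mpi    # the libmpi shown here should come from the same installation as mpirun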



[OMPI users] Res: Res: Gromacs run in parallel

2010-06-08 Thread lauren
I saw

Host:  pid:  nodeid: 0 nnodes: 1

so it really is running on 1 node,
and you all really understood my problem, thanks.

But how can I fix it?
How can I run 1 job on 4 nodes...?
I really need help.
I took a look at my files and fixed all the errors I could find, and the implementations seem correct.
Please explain it from the beginning,
because all the tutorials only explain the same things, which look right to me.
And thanks very much for this help!


 



From: Jeff Squyres
To: Open MPI Users
Sent: Tuesday, June 8, 2010 10:30:03
Subject: Re: [OMPI users] Res: Gromacs run in parallel

No, I'm sorry -- I wasn't clear.  What I meant was that if you run:

   mpirun -np 4 my_mpi_application

1. If you see a single, 4-process MPI job (regardless of how many nodes/servers 
it's spread across), then all is good.  This is what you want.

2. But if you're seeing 4 independent 1-process MPI jobs (again, regardless of 
how many nodes/servers they are spread across), it's possible that you compiled 
your application with MPI implementation X and then used the "mpirun" from MPI 
implementation Y.  

You will need X==Y to make it work properly -- i.e., to see case #1, above.  I 
mention this because your first post mentioned that you're seeing the same job 
run 4 times.  This implied to me that you are running into case #2.  If I 
misunderstood your problem, then ignore me and forgive the noise.
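
One quick way to check the launcher side of this (a sketch; OMPI_COMM_WORLD_RANK and OMPI_COMM_WORLD_SIZE are environment variables that reasonably recent Open MPI launchers set for every process they start):

   mpirun -np 4 bash -c 'echo "rank $OMPI_COMM_WORLD_RANK of $OMPI_COMM_WORLD_SIZE on $(hostname)"'

If this prints ranks 0 through 3, each with a size of 4, the mpirun you are calling is Open MPI and is launching one 4-process job; if the variables come out empty, that mpirun probably belongs to a different MPI implementation. Whether mdrun_mpi itself was built against the same Open MPI is a separate question, checked via ldd or md.log as discussed in this thread.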



On Jun 8, 2010, at 9:20 AM, Carsten Kutzner wrote:

> On Jun 8, 2010, at 3:06 PM, Jeff Squyres wrote:
> 
> > I know nothing about Gromacs, but you might want to ensure that your 
> > Gromacs was compiled with Open MPI.  A common symptom of "mpirun -np 4 
> > my_mpi_application" running 4 1-process MPI jobs (instead of 1 4-process 
> > MPI job) is that you compiled my_mpi_application with one MPI 
> > implementation, but then used the mpirun from a different MPI 
> > implementation.
> >
> Hi,
> 
> this can be checked by looking at the Gromacs output file md.log. The second 
> line should
> read something like
> 
> Host:  pid:  nodeid: 0 nnodes: 4
> 
> Lauren, you will want to ensure that nnodes is 4 in your case, and not 1.
> 
> You can also easily test that without any input file by typing
> 
> mpirun -np 4 mdrun -h
> 
> and you should then see something like
> 
> NNODES=4, MYRANK=0, HOSTNAME=<...>
> NNODES=4, MYRANK=1, HOSTNAME=<...>
> NNODES=4, MYRANK=2, HOSTNAME=<...>
> NNODES=4, MYRANK=3, HOSTNAME=<...>
> 
> 
> Carsten

[OMPI users] Res: Res: Res: Gromacs run in parallel

2010-06-08 Thread lauren
Hi,
I did it and they match.
mdrun and mpiexec are in the same place.
It seems OK...
One more suggestion?

Thank you,









From: Carsten Kutzner
To: Open MPI Users
Sent: Tuesday, June 8, 2010 13:12:35
Subject: Re: [OMPI users] Res: Res: Gromacs run in parallel

Ok,

1. Type 'which mdrun' to see where the mdrun executable resides.
2. Type ldd `which mdrun` to find out which MPI library it is linked against.
3. Type 'which mpirun' (or 'which mpiexec', whichever you use) to verify that
this is the right MPI launcher for your mdrun.
4. If the MPIs do not match, either use the right mpiexec or recompile
Gromacs with the current MPI (a concrete example follows below).

Carsten
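
For instance, if steps 2 and 3 point at different MPI installations, one workaround (a sketch; the /opt/openmpi prefix is only a placeholder for wherever the matching Open MPI actually lives) is to launch mdrun_mpi with the mpiexec that belongs to the library it is linked against:

   ldd `which mdrun_mpi` | grep libmpi
       # e.g.  libmpi.so.0 => /opt/openmpi/lib/libmpi.so.0   (placeholder path)
   /opt/openmpi/bin/mpiexec -np 4 mdrun_mpi -v -deffnm em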




[OMPI users] Res: Res: Res: Res: Gromacs run in parallel

2010-06-08 Thread lauren
Hi,
everything is linked.
What should I be looking for? Anything different?
Thanks,
and sorry for all the questions.





De: "Addepalli, Srirangam V" 
Para: Open MPI Users 
Enviadas: Terça-feira, 8 de Junho de 2010 13:59:08
Assunto: Re: [OMPI users] Res:  Res:  Res:  Gromacs run in parallel

Hello,

ldd  `which mdrun_mpi`

should list the libraries the binary is looking for.  What does the above
command show for your build?

I had a user who had a serial mdrun in his path and it did the same.

Rangam
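
One way to spot that situation (a sketch; 'type -a' is a bash builtin that lists every matching command found on the PATH):

   type -a mdrun mdrun_mpi
       # if a serial mdrun appears earlier in the PATH than the MPI-enabled binary,
       # 'mpiexec -np 4 mdrun ...' will simply start four independent serial runs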



[OMPI users] Res: Res: Res: Res: Gromacs run in parallel

2010-06-08 Thread lauren
Could a problem with versions, or an incompatibility, lead to errors like:
"Unable to start a daemon on the local node"
and
"ompi_mpi_init: orte_init failed"
?

Thanks
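
Those messages come from the Open MPI runtime rather than from Gromacs, and can indicate that mpirun is not finding a single, consistent Open MPI installation (for example, the orted daemon or libmpi from a different version). A quick sanity check (a sketch; the ssh line only applies if remote nodes are involved, and <node> is a placeholder):

   which mpirun orted
   mpirun --version
   ompi_info | head -n 5
   # ssh <node> which orted    # every node must find the same Open MPI installation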





De: "Addepalli, Srirangam V" 
Para: Open MPI Users 
Enviadas: Terça-feira, 8 de Junho de 2010 13:59:08
Assunto: Re: [OMPI users] Res:  Res:  Res:  Gromacs run in parallel

Hello,

ldd  `which mdrun_mpi`

should give you which libraries the binary is looking for.  What does the above 
command do for your build.

I had a user who had a serial mdrun in his path and it did the same.

Rangam


From: users-boun...@open-mpi.org [users-boun...@open-mpi.org] On Behalf Of 
lauren [owenl...@yahoo.com.br]
Sent: Tuesday, June 08, 2010 11:36 AM
To: Open MPI Users
Subject: [OMPI users] Res:  Res:  Res:  Gromacs run in parallel

Hi,
I did it and it match.
mdrun and mpiexec at the same place.
seems ok...
1 more suggestion?

thank you,






De: Carsten Kutzner 
Para: Open MPI Users 
Enviadas: Terça-feira, 8 de Junho de 2010 13:12:35
Assunto: Re: [OMPI users] Res: Res: Gromacs run in parallel

Ok,

1. type 'which mdrun' to see where the mdrun executable resides.
2. type ldd 'which mdrun' to find out against which mpi library it is linked
3. type which mpirun (or which mpiexec, whatever you use) to verify that
this is the right mpi launcher for your mdrun.
4. If the MPI's do not match, either use the right mpiexec or recompile
gromacs with the current mpi.

Carsten


On Jun 8, 2010, at 5:50 PM, lauren wrote:

I saw
Host:  pid:  nodeid: 0 nnodes: 1

really it`s running in 1 node
and All of you really undestood my problem, thanks

But how can I fix it.
How can I run 1 job in 4 nodes...?
I really need help,
I took a look in my files and erase all the errors and the implementations seem 
corect.
>From the beginning, please.
`case all tutorials only explain the same thing that look right.
And thanks very much for this help!




De: Jeff Squyres mailto:jsquy...@cisco.com>>
Para: Open MPI Users mailto:us...@open-mpi.org>>
Enviadas: Terça-feira, 8 de Junho de 2010 10:30:03
Assunto: Re: [OMPI users] Res: Gromacs run in parallel

No, I'm sorry -- I wasn't clear.  What I meant was, that if you run:

  mpirun -np 4 my_mpi_application

1. If you see a single, 4-process MPI job (regardless of how many nodes/servers 
it's spread across), then all is good.  This is what you want.

2. But if you're seeing 4 independent 1-process MPI jobs (again, regardless of 
how many nodes/servers they are spread across), it's possible that you compiled 
your application with MPI implementation X and then used the "mpirun" from MPI 
implementation Y.

You will need X==Y to make it work properly -- i.e., to see case #1, above.  I 
mention this because your first post mentioned that you're seeing the same job 
run 4 times.  This implied to me that you are running into case #2.  If I 
misunderstood your problem, then ignore me and forgive the noise.



On Jun 8, 2010, at 9:20 AM, Carsten Kutzner wrote:

> On Jun 8, 2010, at 3:06 PM, Jeff Squyres wrote:
>
> > I know nothing about Gromacs, but you might want to ensure that your 
> > Gromacs was compiled with Open MPI.  A common symptom of "mpirun -np 4 
> > my_mpi_application" running 4 1-process MPI jobs (instead of 1 4-process 
> > MPI job) is that you compiled my_mpi_application with one MPI 
> > implementation, but then used the mpirun from a different MPI 
> > implementation.
> >
> Hi,
>
> this can be checked by looking at the Gromacs output file md.log. The second 
> line should
> read something like
>
> Host:  pid:  nodeid: 0 nnodes: 4
>
> Lauren, you will want to ensure that nnodes is 4 in your case, and not 1.
>
> You can also easily test that without any input file by typing
>
> mpirun -np 4 mdrun -h
>
> and then should see
>
> NNODES=4, MYRANK=1, HOSTNAME=<...>
> NNODES=4, MYRANK=2, HOSTNAME=<...>
> NNODES=4, MYRANK=3, HOSTNAME=<...>
> NNODES=4, MYRANK=4, HOSTNAME=<...>
> ...
>
>
> Carsten
>
>
> >
> > On Jun 8, 2010, at 8:59 AM, lauren wrote:
> >
> >>
> >> The version of Gromacs is 4.0.7.
> >> This is the first time that I using Gromacs, then excuse me if I'm 
> >> nonsense.
> >>
> >> Wich part of md.log output  should I post?
> >> after or before the input description?
> >>
> >> thanks for all,
> >> and sorry
> >>
> >> De: Carsten Kutzner mailto:ckut...@gwdg.de>>
> >> Para: Open MPI Users mailto:us...@open-mpi.org>>
> >> Enviadas: Domingo, 6 de Junho de 2010 9:51:26
&