[OMPI users] Error run mpiexec

2008-07-21 Thread mariognu-outside
Hi all,

First, excuse my English; it isn't good :)

Well, I have two machines: a Xeon with two CPUs (64-bit) and a Pentium 4 with
only one CPU. On both machines I have installed Ubuntu 8 Server and all the
packages for Open MPI and GROMACS.

I use GROMACS for my work.

OK, on both machines, in my user folder, I have a hostfile like this:
machine1 cpu=2
machine2

Machine1 is the Xeon (192.168.0.10) and machine2 is the Pentium 4 (192.168.0.11).
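For reference, the cpu=2 form is LAM/MPI boot-schema syntax; the equivalent
Open MPI hostfile, if I understand it correctly, uses slots= instead, so the
same list for Open MPI would look like:

machine1 slots=2
machine2 slots=1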

My /etc/hosts file is configured too.

When I run mpiexec on machine2, I get this:
mariojose@machine2:~/lam-mpi$ mpiexec -n 3 hostname
machine1
machine2
-----------------------------------------------------------------------
It seems that [at least] one of the processes that was started with
mpirun did not invoke MPI_INIT before quitting (it is possible that
more than one process did not invoke MPI_INIT -- mpirun was only
notified of the first one, which was on node n0).

mpirun can *only* be used with MPI programs (i.e., programs that
invoke MPI_INIT and MPI_FINALIZE).  You can use the "lamexec" program
to run non-MPI programs over the lambooted nodes.
-----------------------------------------------------------------------
machine1
mpirun failed with exit status 252
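If that mpiexec really is LAM's (see the reply below), this warning is
expected for a non-MPI program like hostname; the message itself points at
lamexec for that case. A sketch, assuming lamexec accepts the same -np flag
as LAM's mpirun:

lamexec -np 3 hostname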

When I run it on machine1, I get this:

mariojose@machine1:~/lam-mpi$ mpiexec -n 3 hostname
machine1
machine1
machine2
-----------------------------------------------------------------------
It seems that [at least] one of the processes that was started with
mpirun did not invoke MPI_INIT before quitting (it is possible that
more than one process did not invoke MPI_INIT -- mpirun was only
notified of the first one, which was on node n0).

mpirun can *only* be used with MPI programs (i.e., programs that
invoke MPI_INIT and MPI_FINALIZE).  You can use the "lamexec" program
to run non-MPI programs over the lambooted nodes.
-----------------------------------------------------------------------
mpirun failed with exit status 252

I don't know why I get this message. I think it is an error.
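One quick check, using only standard shell tools, is to see which binary a
bare mpiexec actually resolves to, since Ubuntu routes it through
/etc/alternatives:

readlink -f "$(which mpiexec)"

If that prints /usr/bin/mpiexec.lam rather than something pointing at
orterun, the messages above are coming from LAM, not Open MPI.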

I am trying to run this with GROMACS; if anybody uses GROMACS and can help
me, I would appreciate it very much :).

mariojose@machine1:~/lam-mpi$ grompp -f run.mdp -p topol.top -c pr.gro -o run.tpr
mariojose@machine1:~/lam-mpi$ mpiexec -n 3 mdrun -v -deffnm run

It works OK. I see that the CPUs of both machines work at 100%; it looks fine
to me. But I get an error when I run mdrun_mpi, which is the binary built to
run on the cluster.

mariojose@machine1:~/lam-mpi$ grompp -f run.mdp -p topol.top -c pr.gro -o run.tpr -np 3 -sort -shuffle
mariojose@machine1:~/lam-mpi$ mpiexec -n 3 mdrun_mpi -v -deffnm run
NNODES=3, MYRANK=0, HOSTNAME=machine1
NNODES=3, MYRANK=2, HOSTNAME=machine1
NODEID=0 argc=4
NODEID=2 argc=4
NNODES=3, MYRANK=1, HOSTNAME=machine2
NODEID=1 argc=4
 :-)  G  R  O  M  A  C  S  (-:

 Gyas ROwers Mature At Cryogenic Speed

:-)  VERSION 3.3.3  (-:


  Written by David van der Spoel, Erik Lindahl, Berk Hess, and others.
   Copyright (c) 1991-2000, University of Groningen, The Netherlands.
 Copyright (c) 2001-2008, The GROMACS development team,
check out http://www.gromacs.org for more information.

 This program is free software; you can redistribute it and/or
  modify it under the terms of the GNU General Public License
 as published by the Free Software Foundation; either version 2
 of the License, or (at your option) any later version.

  :-)  mdrun_mpi  (-:

Option     Filename   Type          Description
------------------------------------------------------------
  -s       run.tpr    Input         Generic run input: tpr tpb tpa xml
  -o       run.trr    Output        Full precision trajectory: trr trj
  -x       run.xtc    Output, Opt.  Compressed trajectory (portable xdr format)
  -c       run.gro    Output        Generic structure: gro g96 pdb xml
  -e       run.edr    Output        Generic energy: edr ene
  -g       run.log    Output        Log file
  -dgdl    run.xvg    Output, Opt.  xvgr/xmgr file
  -field   run.xvg    Output, Opt.  xvgr/xmgr file
  -table   run.xvg    Input, Opt.   xvgr/xmgr file
  -tablep  run.xvg    Input, Opt.   xvgr/xmgr file
  -rerun   run.xtc    Input, Opt.   Generic trajectory: xtc trr trj gro g96 pdb
  -tpi     run.xvg    Output, Opt.  xvgr/xmgr file
  -ei      run.edi    Input, Opt.   ED sampling input
  -eo      run.edo    Output, Opt.  ED sampling output
  -j       run.gct    Input, Opt.   General coupling stuff
  -jo      run.gct    Output, Opt.  General coupling stuff
  -ffout   run.xvg    Output, Opt.  xvgr/xmgr file
  -devout  run.xvg    Output, Opt.  xvgr/xmgr file
  -runav   run.xvg    Output, Opt.  xvgr/xmgr file
  -pi      run.ppa    Input, Opt.   Pull parameters
  -po      run.ppa    Output, Opt.  Pull parameters
  -pd      run.pdo    Output, Opt.  Pull data output
  -pn      run.ndx    Input, Opt.   Index file
  -mtx     run.mtx    Output, Opt.  Hessian matrix
  -dn      run.ndx    O

Re: [OMPI users] Error run mpiexec

2008-07-21 Thread mariognu-outside
Hi,

In my /usr/bin I have:
lrwxrwxrwx 1 root root    25 2008-07-17 10:25 /usr/bin/mpiexec -> /etc/alternatives/mpiexec
-rwxr-xr-x 1 root root 19941 2008-03-23 03:36 /usr/bin/mpiexec.lam
lrwxrwxrwx 1 root root     7 2008-07-17 10:25 /usr/bin/mpiexec.openmpi -> orterun
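Since the mpiexec link goes through /etc/alternatives, Debian's alternatives
system should be managing it; it can be inspected and repointed with
update-alternatives, roughly like this (the target path is taken from the
listing above):

update-alternatives --display mpiexec
sudo update-alternatives --set mpiexec /usr/bin/mpiexec.openmpi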

I tried to run:

mpiexec.openmpi -n 3 mdrun_mpi.openmpi -n 3 -v -deffnm run

But only on machine1 is the CPU working at 100%; on machine2 it is not.

Then I ran:

mpiexec.openmpi -n 3 hostname

And I got:

machine1
machine1
machine1
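This looks consistent with Open MPI never seeing the host list: as far as I
know, Open MPI does not read LAM's boot schema, so without a hostfile it
schedules every rank on the local node. A sketch of the invocation, assuming
the host list is saved as ~/hostfile in Open MPI's slots= syntax:

mpiexec.openmpi --hostfile ~/hostfile -n 3 hostname
mpiexec.openmpi --hostfile ~/hostfile -n 3 mdrun_mpi.openmpi -v -deffnm run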

My PATH is:

mariojose@machine1:~/lam-mpi$ echo $PATH
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games

I don't know what is happening.

Thanks

Mario Jose


--- On Mon, 7/21/08, Ralph Castain wrote:

> From: Ralph Castain
> Subject: Re: [OMPI users] Error run mpiexec
> To: mariognu-outs...@yahoo.com.br, "Open MPI Users"
> Date: Monday, July 21, 2008, 11:00
> If you look closely at the error messages, you will see that you were
> executing LAM-MPI, not Open MPI. If you truly wanted to run Open MPI,
> I would check your path to ensure that mpiexec is pointing at the Open
> MPI binary.
>
> Ralph
>
> On Jul 21, 2008, at 7:47 AM, mariognu-outs...@yahoo.com.br wrote:
>
> > [quoted text trimmed; the original message appears in full above]