Thanks for the response. The output I receive is:
mpirun -n 4 mpihello.exe
Master says, Flag: 1 MyID: 0
Master says, Flag2: 2 MyID: 0
Slave: 1
Slave: 2
Slave: 3
Master says, Flag: 1 MyID: 0
Slave says, Flag: 1 MyID: 0
Slave says, Flag: 1 MyID: 0
Master says, Flag2: 2 MyID: 0
Slave says, Flag2: 0 MyID: 0
Slave says, Flag2: 0 MyID: 0
Master says, Flag: 1 MyID: 0
Slave says, Flag: 1 MyID: 0
Master says, Flag2: 2 MyID: 0
Slave says, Flag2: 0 MyID: 0
So after the first mpi_recv, myid on the slaves changes to 0. This happens on two Windows 7 64-bit machines: I compiled on one of them with the environment I described previously; the other has only Open MPI installed, and I ran the .exe with mpirun as shown above.

And if I compile the same code with Open MPI uninstalled, using Microsoft MPI instead, it works as you would expect.
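To make the clobbering easier to spot, here is a minimal probe along the same lines. This is just a sketch, not anything from Pikaia, and myid_before is a scratch variable I am introducing for the check; it prints a warning only if the receive actually changes myid:

c ----------------------------------------------
      program mpi_clobber_check
      implicit none
      include 'mpif.h'
      integer ierr, myid, nproc, flag
      integer myid_before
      integer status(MPI_STATUS_SIZE)

      call mpi_init( ierr )
      call mpi_comm_rank( MPI_COMM_WORLD, myid, ierr )
      call mpi_comm_size( MPI_COMM_WORLD, nproc, ierr )

      if (myid.eq.0) then
         flag = 1
c master sends a single integer to rank 1
         call mpi_send( flag, 1, MPI_INTEGER, 1,
     +                  1, MPI_COMM_WORLD, ierr )
      elseif (myid.eq.1) then
c keep a copy of myid so we can tell if the recv overwrites it
         myid_before = myid
         call mpi_recv( flag, 1, MPI_INTEGER, 0,
     +                  1, MPI_COMM_WORLD, status, ierr )
         if (myid.ne.myid_before) then
            print *, 'myid clobbered: was ', myid_before,
     +               ' now ', myid
         endif
      endif

      call mpi_finalize( ierr )
      end
c ----------------------------------------------

On a healthy build this should print nothing; it needs at least two processes, e.g. mpirun -n 2.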
-Amorgan
On Fri, Apr 6, 2012 at 9:25 AM, Jeffrey Squyres <jsquy...@cisco.com> wrote:
The output from that program looks fine to me on Linux:
[6:25] svbu-mpi:~/mpi % mpirun -np 4 hello
Slave: 1
Slave: 2
Slave says, Flag: 1 MyID: 2
Slave says, Flag2: 2 MyID: 2
Slave: 3
Slave says, Flag: 1 MyID: 3
Slave says, Flag2: 2 MyID: 3
Master says, Flag: 1 MyID: 0
Master says, Flag2: 2 MyID: 0
Master says, Flag: 1 MyID: 0
Master says, Flag2: 2 MyID: 0
Master says, Flag: 1 MyID: 0
Master says, Flag2: 2 MyID: 0
Slave says, Flag: 1 MyID: 1
Slave says, Flag2: 2 MyID: 1
Shiqing -- can you verify on Windows?
On Apr 5, 2012, at 6:15 PM, Anton Morgan wrote:
> A few things to add: I installed Microsoft MPI, and the issue did not occur; the program printed the correct rank/myid numbers. So it seems something may be wrong in Open MPI. I would still like to use Open MPI, so I am happy to help work toward a resolution.
>
> Also, in the example Makefile below, change the target name pikaia to mpihello so that make builds correctly.
>
> Thanks.
>
> On Thu, Apr 5, 2012 at 3:39 PM, Anton Morgan <amorgan.cart...@gmail.com> wrote:
> My setup is unfortunately somewhat convoluted, so it might itself be an issue; keep that in the back of your mind, but assume for now that it is not the problem. I am on Windows 7 64-bit with Cygwin, compiling with x86_64-w64-mingw32-gfortran, and I installed Open MPI via OpenMPI_v1.5.5-1_win64.exe. I have compiled and run several MPI test programs I wrote, but the first time I used mpi_send and mpi_recv I hit this error, or at least what looks like an error to me.
>
> Back story: I am trying to run Parallel Pikaia, an open-source genetic algorithm in Fortran that uses MPI. It should run out of the box, but it does not run all processes properly. So I started troubleshooting and found that after the first mpi_recv on a slave, myid changes to 0, even though right before the call it holds the correct myid/rank. I then wrote a simple Fortran program to test whether the problem was Pikaia or MPI, and it points to MPI.
>
> Fortran Code:
> c ----------------------------------------------
>
>       program mpi_hello
>
>       implicit none
>
>       include 'mpif.h'
>
>       integer ierr,myid,nproc,rc,flag,nrank,rank
>       integer status(MPI_STATUS_SIZE), flag2
>
> c ----------------------------------------------
> c Initialize MPI
> c ----------------------------------------------
>       call mpi_init( ierr )
>       call mpi_comm_rank( MPI_COMM_WORLD, myid, ierr )
>       call mpi_comm_size( MPI_COMM_WORLD, nproc, ierr )
>       nrank = nproc - 1
>
> c ----------------------------------------------
> c Master portion
> c ----------------------------------------------
>       if (myid.eq.0) then
>          flag  = 1
>          flag2 = 2
> c send two integers to every slave
>          do rank = 1, nrank
>             call mpi_send( flag, 1, MPI_INTEGER, rank,
>      +                     1, MPI_COMM_WORLD, ierr )
>             print 8, flag, myid
>  8          format('Master says, Flag: ',i0.1,' MyID: ',i0.1)
>             call mpi_send( flag2, 1, MPI_INTEGER, rank,
>      +                     1, MPI_COMM_WORLD, ierr )
>             print 10, flag2, myid
>  10         format('Master says, Flag2: ',i0.1,' MyID: ',i0.1)
>          enddo
> c ----------------------------------------------
> c Slave portion
> c ----------------------------------------------
>       elseif (myid.ne.0) then
> c print the ID before the first mpi_recv
>          print *, 'Slave: ', myid
>          call mpi_recv( flag, 1, MPI_INTEGER, 0,
>      +                  1, MPI_COMM_WORLD, status, ierr )
> c check myid after the recv; it becomes 0 in my environment
>          print 9, flag, myid
>  9       format('Slave says, Flag: ',i0.1,' MyID: ',i0.1)
>          call mpi_recv( flag2, 1, MPI_INTEGER, 0,
>      +                  1, MPI_COMM_WORLD, status, ierr )
>          print 11, flag2, myid
>  11      format('Slave says, Flag2: ',i0.1,' MyID: ',i0.1)
>       endif
>
>       call mpi_finalize(rc)
>       stop
>       end
>
> c ----------------------------------------------
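>
> (An untested thought, just to rule one thing out: MPI-2 also allows passing MPI_STATUS_IGNORE in Fortran, so a variant of the slave receive like the one below takes the status array out of the picture, in case a disagreement over MPI_STATUS_SIZE between the mpif.h I compile against and the runtime library is what stomps on myid:)
>
> c hypothetical variant: skip the status array entirely
>       call mpi_recv( flag, 1, MPI_INTEGER, 0,
>      +               1, MPI_COMM_WORLD, MPI_STATUS_IGNORE, ierr )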
>
> Simple makefile for my environment:
> #
> # MPI makefile
> #
> #INSTALL_DIR = ./
> F77     = x86_64-w64-mingw32-gfortran
> # Progra~2 because it is located in Program Files (x86)
> LIB     = -L/cygdrive/c/Progra~2/OpenMPI_v1.5.5-x64/bin
> INCLUDE = -I/cygdrive/c/Progra~2/OpenMPI_v1.5.5-x64/include
> FFLAGS  =
> MAKE    = make
> SHELL   = /bin/sh
> #
> ### End User configurable options ###
>
> SRC1 = mpihello
> OBJS = $(SRC1).o
>
> pikaia : $(OBJS)
> 	$(F77) $(FFLAGS) -o mpihello $(OBJS) $(LIB) -lmpi_f77
> #	rm -f *.o
>
> $(SRC1).o : $(SRC1).f
> 	$(F77) $(FFLAGS) $(INCLUDE) -c $(SRC1).f
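>
> (Running plain make here builds mpihello.exe via the pikaia target; I then launch it with mpirun, e.g. mpirun -n 4 mpihello.exe. Per my note above, renaming the target to mpihello is cleaner.)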
>
> So I am wondering whether this is an issue with the current Windows build of Open MPI, whether I am missing something, or whether it is my convoluted environment. Attached are the source and makefile from above, plus my built .exe and the libgcc_s_sjlj-1.dll needed to run it.
>
> Thank you for the help
>
> --
> AMorgan
>
--
Jeff Squyres
jsquy...@cisco.com
For corporate legal information go to:
http://www.cisco.com/web/about/doing_business/legal/cri/
--
AMorgan
_______________________________________________
users mailing list
us...@open-mpi.org
http://www.open-mpi.org/mailman/listinfo.cgi/users