Hi,
I have been using MPICH (v1.2.6) for some time, but I have just installed
Fedora Core 5, which comes with Open MPI v1.0.1. Using a program that worked
with MPICH, the code compiles with om-mpif77, but when run with om-mpirun the
program aborts with the message:
2 additional processes aborted (not shown)
1 pro
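
(For context, the compile-and-run cycle being described looks roughly like the
sketch below, using Fedora's om- prefixed Open MPI wrappers from the message;
the file name and process count are placeholders, not from the original post.)

    om-mpif77 -o hello hello.f    # compile with the Open MPI Fortran wrapper
    om-mpirun -np 3 ./hello       # launch three processes on the local host
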
If you are looking for the path of least resistance, then going back to
MPICH is probably your best bet (there is certainly merit in "if it ain't
broke, don't fix it").
However, there may be a few other factors to consider:
- Just because an app runs and completes with one MPI implementation
doe
Most excellent -- many thanks!
> -----Original Message-----
> From: users-boun...@open-mpi.org
> [mailto:users-boun...@open-mpi.org] On Behalf Of Marcelo Souza
> Sent: Thursday, May 04, 2006 5:40 PM
> To: us...@open-mpi.org
> Subject: [OMPI users] Open MPI 1.0.2 Slackware Package
>
> If inter
Dear Richard,
Could you please provide more detail on what exactly your program is doing
before it is killed?
Could you please recompile your program with "-g" and send us the call stack?
Would it be possible to share the relevant code with the list?
Thank you very much.
With best regards,
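
(A sketch of one way to produce such a call stack; the program and file names
here are placeholders, not from the thread.)

    ulimit -c unlimited           # allow a core file to be written on abort
    om-mpif77 -g -o prog prog.f   # rebuild with debugging symbols
    om-mpirun -np 3 ./prog        # reproduce the crash
    gdb ./prog core               # load the core file (it may also be named
                                  # core.<pid>), then type "bt" to print
                                  # the call stack
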
Pallas runs OK up to the Alltoall test, then we get:

/data/software/qa/MPI/openmpi-1.1a2-rhel4-`uname -m`-mvapi/bin/mpirun \
    --mca mpi_leave_pinned 1 --np 4 --hostfile /tmp/hostfile \
    env LD_LIBRARY_PATH=/data/software/qa/MPI/openmpi-1.1a2-rhel3-`uname -m`-mvapi/lib:/data/software/qa/MPI/openmpi-1.1
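
(If the Open MPI build in use supports mpirun's -x option, the LD_LIBRARY_PATH
setting can be exported to the launched processes directly instead of going
through env. Paths are abbreviated here, and "PMB-MPI1" is assumed as the
Pallas benchmark binary name; adjust both to your installation.)

    mpirun --mca mpi_leave_pinned 1 --np 4 --hostfile /tmp/hostfile \
        -x LD_LIBRARY_PATH ./PMB-MPI1
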