Quoting "Jeff Squyres (jsquyres)" :
> If you are looking for the path of least resistance, then going back to
> MPICH is probably your best bet (there is certainly merit in "it ain't
> broke, so don't fix it").
>
True - but where is your sense of adventure!!
> However, there may be a few other f
Dear Richard,
could you please provide further details on what exactly your program is doing
before it is killed?
Could you please recompile your program with "-g" and send us the call stack?
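For example, a rough sketch of one way to capture it (this assumes gdb is
available, that core dumps are enabled, and uses the om-mpif77/om-mpirun
wrapper names from your message; the core file name varies by system):

  om-mpif77 -g -o myprog myprog.f   # recompile with debug symbols
  ulimit -c unlimited               # allow a core file to be written
  om-mpirun -np 3 ./myprog          # reproduce the abort
  gdb ./myprog core                 # load the resulting core file
  (gdb) bt                          # print the call stack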
Would it be possible to share the code in question with the list?
Thank you very much.
Best regards,
1.0.2 snapshots (what will eventually become 1.0.3).
Hope this helps.
> -----Original Message-----
> From: users-boun...@open-mpi.org
> [mailto:users-boun...@open-mpi.org] On Behalf Of Richard Wait
> Sent: Tuesday, May 09, 2006 7:14 AM
> To: us...@open-mpi.org
> Subject: [OMPI us
Hi
I have been using mpich (v1.2.6) for some time, but I have just installed Fedora
Core 5, which comes with Open MPI v1.0.1. Using a program that worked with mpich,
the code compiles with om-mpif77, but when run with om-mpirun the program aborts
with the message:
2 additional processes aborted (not shown)
1 pro
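As a quick sanity check of the Open MPI installation itself, independent of the
original code, a minimal Fortran 77 test along the following lines should compile
and run. This is only a sketch, and it assumes the om-mpif77/om-mpirun wrapper
names from the message above:

        program hello
        include 'mpif.h'
        integer ierr, rank, nprocs
c       Initialize MPI, report this process's rank, then shut down cleanly
        call MPI_INIT(ierr)
        call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
        call MPI_COMM_SIZE(MPI_COMM_WORLD, nprocs, ierr)
        print *, 'hello from rank', rank, 'of', nprocs
        call MPI_FINALIZE(ierr)
        end

  om-mpif77 -o hello hello.f
  om-mpirun -np 3 ./hello

If this aborts in the same way, the problem is likely with the Open MPI
installation or how the job is launched; if it runs, the problem is more likely
in the application code.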