On Mar 9, 2006, at 12:18 PM, Pierre Valiron wrote:

- 'mpirun --help' no longer crashes.

Improvement :)

- standard output seems messy:

a) 'mpirun -np 4 pwd' randomly returns one or two lines, never four. The same behaviour occurs if the output is redirected to a file.

b) When running some simple "demo" Fortran code, the standard output is buffered within Open MPI and all results are issued at the end. No intermediate output is shown.

Ok, I know what the issue is here. We don't properly support ptys on Solaris, so the Fortran runtime is going into page-buffering mode, causing all kinds of issues. I think the same problem may be responsible for the race condition with short-lived programs. I'm working on a fix, but it might take a bit of time.
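
In the meantime, flushing the Fortran output explicitly should get the intermediate results out as they happen instead of all at once at exit. A minimal sketch, assuming your compiler provides the common flush() library extension (the Sun Studio compilers do); the program itself is just illustrative:

      program demo
      implicit none
      include 'mpif.h'
      integer ierr, rank
      call MPI_INIT(ierr)
      call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
C     intermediate result that would otherwise sit in the page buffer
      write(*,*) 'hello from rank', rank
C     push the buffered output out now rather than at program exit
      call flush(6)
      call MPI_FINALIZE(ierr)
      end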

- running a slightly more elaborate program fails:

a) compilation behaves differently with mpif77 and mpif90.

While mpif90 compiles and builds "silently", mpif77 is talkative:

valiron@icare ~/BENCHES > mpif77 -xtarget=opteron -xarch=amd64 -o all all.f
NOTICE: Invoking /opt/Studio11/SUNWspro/bin/f90 -f77 -ftrap=%none -I/users/valiron/lib/openmpi-1.1a1r9224/include -xtarget=opteron -xarch=amd64 -o all all.f -L/users/valiron/lib/openmpi-1.1a1r9224/lib -lmpi -lorte -lopal -lsocket -lnsl -lrt -lm -lthread -ldl
all.f:
       rw_sched:
MAIN all:
       lam_alltoall:
       my_alltoall1:
       my_alltoall2:
       my_alltoall3:
       my_alltoall4:
       check_buf:
       alltoall_sched_ori:
       alltoall_sched_new:

b) whether the code is compiled with mpif77 or mpif90, execution fails:

valiron@icare ~/BENCHES > mpirun -np 2 all
Signal:11 info.si_errno:0(Error 0) si_code:1(SEGV_MAPERR)
Failing at addr:40
*** End of error message ***
Signal:11 info.si_errno:0(Error 0) si_code:1(SEGV_MAPERR)
Failing at addr:40
*** End of error message ***

Compiling with -g adds no more information.

Doh, that probably shouldn't be happening. I'll try to investigate further once I have the pty issues sorted out.
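
In the meantime, a backtrace would be very helpful. Failing at an address as low as 40 usually means a dereference through a NULL pointer plus a small offset, so the faulting frame alone would tell us a lot. A sketch of how to get one, assuming a core file is actually written (the message above comes from our signal handler, so that isn't guaranteed) and that Sun Studio's dbx is in your path:

valiron@icare ~/BENCHES > mpirun -np 2 all
valiron@icare ~/BENCHES > dbx all core
(dbx) where

You may need to raise the core file limit first ('ulimit -c unlimited' in sh-style shells, 'limit coredumpsize unlimited' in csh-style shells).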

Brian


--
  Brian Barrett
  Open MPI developer
  http://www.open-mpi.org/

