I have been struggling to get a usable build of openmpi on Mac OS X Mavericks (10.9.1). I can get openmpi to configure and build without error, but after that I run into problems that depend on the openmpi version.
With 1.6.5, make check fails the opal_datatype_test, ddt_test, and ddt_raw
tests. The various atomic_* tests pass. See checklogs_1.6.5, attached
as a .gz file.
Following suggestions from the openmpi discussion lists I tried openmpi version
1.7.4rc1. In this case make check reports that all tests pass, but when I
proceeded to build a parallel code (parallel HDF5), that build failed.
Following an email exchange with the HDF5 support people, they suggested I
compile and run the attached bit of simple code, Sample_mpio.c (which they
supplied). It does not use HDF5 at all; it just attempts a parallel write
to a file followed by a parallel read. That test fails when run with more
than 1 processor -- which they say indicates a failure of the openmpi
installation. The error message was:
MPI_INIT: argc 1
MPI_INIT: argc 1
Testing simple C MPIO program with 2 processes accessing file ./mpitest.data
(Filename can be specified via program argument)
Proc 0: hostname=Ron-Cohen-MBP.local
Proc 1: hostname=Ron-Cohen-MBP.local
MPI_BARRIER[0]: comm MPI_COMM_WORLD
MPI_BARRIER[1]: comm MPI_COMM_WORLD
Proc 0: MPI_File_open with MPI_MODE_EXCL failed (MPI_ERR_FILE: invalid file)
MPI_ABORT[0]: comm MPI_COMM_WORLD errorcode 1
MPI_BCAST[1]: buffer 7fff5a483048 count 1 datatype MPI_INT root 0 comm MPI_COMM_WORLD
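One thing that occurs to me: MPI_MODE_EXCL means MPI_File_open should fail if the target file already exists, so a stale ./mpitest.data left over from an earlier run could produce this MPI_ERR_FILE on its own. Something along these lines should rule that out (the binary name below is just a placeholder for whatever Sample_mpio.c was compiled to):

```shell
# MPI_MODE_EXCL makes MPI_File_open fail with MPI_ERR_FILE when the
# target file already exists, so clear any leftover output first.
rm -f ./mpitest.data
# then rerun the test, e.g. (placeholder binary name):
# mpirun -np 2 ./sample_mpio ./mpitest.data
```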
I then went back to my openmpi directories and tried running some of the
individual tests in the test and examples directories. In particular, in
test/class I found one test that seems not to be run as part of make check
and that fails even with one processor: opal_bitmap. I am not sure whether
this is because 1.7.4rc1 is incomplete, whether something is wrong with the
installation, or whether it is a 32- vs 64-bit issue. The error message is:
mpirun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:
Process name: [[48805,1],0]
Exit code: 255
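As an aside, the exit code 255 is probably just the test's main() returning -1 (my guess: an OPAL_ERROR-style return): Unix exit statuses are truncated to 8 bits, so -1 is reported as 255. A quick check of the arithmetic:

```shell
# Unix exit statuses are 8-bit, so a return value of -1 from main()
# shows up as 255 (two's-complement truncation to one byte):
echo $(( -1 & 255 ))   # prints 255
```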
Any suggestions?
More generally, has anyone out there gotten an openmpi build on Mavericks
working well enough to build and run the attached Sample_mpio.c (or, better
yet, parallel HDF5)?
Details: Running Mac OS X 10.9.1 on a mid-2009 MacBook Pro with 4 GB of
memory; tried openmpi 1.6.5 and 1.7.4rc1. Built openmpi against the stock
gcc that comes with Xcode 5.0.2, and gfortran 4.9.0.
Files attached: config.log.gz, openmpialllog.gz (output of running
ompi_info --all), checklog2.gz (output of make check in the top openmpi
directory).
I am not attaching the make and make install logs, since those appear to
have succeeded, but I can generate them if that would be helpful.
