On Wed, 2014-03-12 at 10:52 -0400, Bennet Fauber wrote:
> My experience with Rmpi and OpenMPI is that it doesn't seem to do well
> with the dlopen or dynamic loading.  I recently installed R 3.0.3, and
> Rmpi, which failed when built against our standard OpenMPI but
> succeeded using the following 'secret recipe'.  Perhaps there is
> something here that will be helpful for you.

I have a few things to report.

First, http://www.stats.uwo.ca/faculty/yu/Rmpi/changelogs.htm says:

    It looks like that the option --disable-dlopen is not necessary to
    install Open MPI 1.6, at least on Debian. This might be R's .onLoad
    correctly loading dynamic libraries and Open MPI is not required to
    be compiled with static libraries enabled.

Second, I tried rebuilding Open MPI with --disable-dlopen WITHOUT any of
the changes to R or Rmpi.  The behavior didn't change.  Nobody said it
would, but I thought it was worth a try.

Third, the source of the double load of MPI-related libraries looks like
this code in Rmpi.c:

    if (!dlopen("libmpi.so.0", RTLD_GLOBAL | RTLD_LAZY)
        && !dlopen("libmpi.so", RTLD_GLOBAL | RTLD_LAZY)){

So libmpi.so.1 is loaded because Rmpi.so is linked against it, and
libmpi.so.0 is loaded because the code opens it explicitly.  The
motivation was that http://www.stats.uwo.ca/faculty/yu/Rmpi/changelogs.htm
notes:

    ----------------------------------
    2007-10-24, version 0.5-5:

    dlopen has been used to load libmpi.so explicitly. This is mainly
    useful for Rmpi under OpenMPI where one might see many error
    messages:
        mca: base: component_find: unable to open osc pt2pt: file not found (ignored)
    if libmpi.so is not loaded with RTLD_GLOBAL flag.
    -------------------------------------

I think I'll try changing it to try libmpi.so first, so that it picks up
libmpi.so.1 if that is what is available.
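Concretely, the change I have in mind is just to swap the order of the
two dlopen() calls in Rmpi.c, something like the following (an untested
sketch; it assumes the unversioned libmpi.so symlink is visible in the
runtime library path, as it is in a from-source Open MPI install):

    /* Untested sketch: try the unversioned name first, so dlopen()
     * normally picks up the same library Rmpi.so is already linked
     * against (libmpi.so.1 with Open MPI 1.6.x), and fall back to the
     * old libmpi.so.0 soname only if that fails.  RTLD_GLOBAL is still
     * needed so the MCA plugins can resolve MPI symbols. */
    if (!dlopen("libmpi.so", RTLD_GLOBAL | RTLD_LAZY)
        && !dlopen("libmpi.so.0", RTLD_GLOBAL | RTLD_LAZY)) {
        /* ...existing error handling unchanged... */
    }

That should avoid mapping both libmpi.so.0 and libmpi.so.1 into the same
process when a newer Open MPI is installed.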
I've already rebuilt R, though it looks as if Rmpi may have been the
source of the problems.

Ross

> ### Install openmpi 1.6.5
>
> export PREFIX=/scratch/support_flux/bennet/local
> COMPILERS='CC=gcc CXX=g++ FC=gfortran F77=gfortran'
> CONFIGURE_FLAGS='--disable-dlopen --enable-static'
> cd openmpi-1.6.5
> ./configure --prefix=${PREFIX} \
>     --mandir=${PREFIX}/man \
>     --with-tm=/usr/local/torque \
>     --with-openib --with-psm \
>     --with-io-romio-flags='--with-file-system=testfs+ufs+nfs+lustre' \
>     $CONFIGURE_FLAGS \
>     $COMPILERS
> make
> make check
> make install
>
> ### Install R 3.0.3
>
> wget http://cran.case.edu/src/base/R-3/R-3.0.3.tar.gz
> tar xzvf R-3.0.3.tar.gz
> cd R-3.0.3
>
> export MPI_HOME=/scratch/support_flux/bennet/local
> export LD_LIBRARY_PATH=$MPI_HOME/lib:${LD_LIBRARY_PATH}
> export LD_LIBRARY_PATH=$MPI_HOME/openmpi:${LD_LIBRARY_PATH}
> export PATH=${PATH}:${MPI_HOME}/bin
> export LDFLAGS='-Wl,-O1'
> export R_PAPERSIZE=letter
> export R_INST=${PREFIX}
> export FFLAGS='-O3 -mtune=native'
> export CFLAGS='-O3 -mtune=native'
> ./configure --prefix=${R_INST} --mandir=${R_INST}/man \
>     --enable-R-shlib --without-x
> make
> make check
> make install
>
> wget http://www.stats.uwo.ca/faculty/yu/Rmpi/download/linux/Rmpi_0.6-3.tar.gz
> R CMD INSTALL Rmpi_0.6-3.tar.gz \
>     --configure-args="--with-Rmpi-include=$MPI_HOME/include --with-Rmpi-libpath=$MPI_HOME/lib --with-Rmpi-type=OPENMPI"
>
> Make sure environment variables and paths are set:
>
> MPI_HOME=/home/software/rhel6/openmpi-1.6.5/gcc-4.4.7-static
> PATH=/home/software/rhel6/openmpi-1.6.5/gcc-4.4.7-static/bin
> LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/home/software/rhel6/openmpi-1.6.5/gcc-4.4.7-static/lib
> LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/home/software/rhel6/openmpi-1.6.5/gcc-4.4.7-static/lib/openmpi
> PATH=/home/software/rhel6/R/3.0.3/bin:${PATH}
> LD_LIBRARY_PATH=/home/software/rhel6/R/3.0.3/lib64/R/lib:${LD_LIBRARY_PATH}
>
> ## Then install snow with
>
> R
> > install.packages('snow')
> [ . . . .
>
> I think the key thing is the --disable-dlopen, though it might require
> both.  Jeff Squyres had a post about this quite a while ago that gives
> more detail about what's happening:
>
> http://www.open-mpi.org/community/lists/devel/2012/04/10840.php
>
> -- bennet