Re: [OMPI users] LAMA error - mpirun segfault

2015-08-10 Thread Nils Smeds
Thanks Ralph, I'm trying to find out what can be accomplished in binding using the command line and when I need to generate a mapping file. I find the command line is typically more robust; it is just too easy to forget to adapt a mapping script when moving between systems. For the sake of
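As a rough illustration of the command-line approach the poster prefers, binding and mapping can often be expressed directly with mpirun flags instead of a rankfile. This is a hedged sketch, not the thread's actual command: the process counts and executable name (`./a.out`) are placeholders, and the exact flags available depend on the Open MPI release (these are the 1.8-series spellings).

```shell
# Sketch: map 8 ranks per socket, bind each rank to a core, and print
# the resulting bindings so the layout can be checked without a rankfile.
mpirun -np 16 --map-by ppr:8:socket --bind-to core --report-bindings ./a.out
```

`--report-bindings` makes it easy to verify on each new system that the command-line layout did what was intended, which is exactly the portability concern raised above.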

Re: [OMPI users] Son of Grid Engine, Parallel Environments and OpenMPI 1.8.7

2015-08-10 Thread Lane, William
Here's a qrsh run of OpenMPI 1.8.7 that actually generated an error message; usually I get no output whatsoever (i.e., from stderr or stdout) from the job, and it eventually generates core dumps: qrsh -V -now yes -pe orte 209 mpirun -np 209 -display-devel-map --prefix /hpc/apps/mpi/openmpi/1.8.7

[OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-10 Thread David Shrader
Hello All, I'm having some trouble getting Open MPI 1.8.8 to configure correctly when hcoll is installed in system space. That is, hcoll is installed to /usr/lib64 and /usr/include/hcoll. I get an error during configure: $> ./configure --with-hcoll ...output snipped...

Re: [OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-10 Thread Gilles Gouaillardet
David, the configure help is misleading about hcoll: "--with-hcoll(=DIR) Build hcoll (Mellanox Hierarchical Collectives) support, searching for libraries in DIR". The =DIR is not really optional ... configure will not complain if you configure with --with-hcoll o
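Given Gilles's point that =DIR is effectively required, a plausible workaround for the system-space layout described above (libraries in /usr/lib64, headers in /usr/include/hcoll) would be to pass the installation prefix explicitly. This is an assumption based on the directory layout in the thread, not a confirmed fix from the list:

```shell
# Sketch: pass the hcoll prefix explicitly instead of bare --with-hcoll,
# assuming the system-space install rooted at /usr described in the thread.
./configure --with-hcoll=/usr
```

With a bare --with-hcoll, configure may silently proceed without finding the library, so checking the configure summary output for hcoll support is worthwhile either way.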