[OMPI users] Cross-compiling

2017-11-14 Thread Alberto Ortiz
Hi, I am trying to run in this type of environment: 1- A Linux PC on which I intend to compile MPI programs for ARM embedded processors 2- The ARM boards. I have OpenMPI compiled on the ARMs with dynamic libraries, both in case I compile natively and for the use of 'mpirun' when I get the cross-compil
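A minimal sketch of the cross-build setup being described (not taken from the thread itself): Open MPI can be built for the target with an autoconf --host triplet, and the program is then compiled against that tree rather than the host's mpicc. The toolchain prefix arm-linux-gnueabihf-, the install path /opt/openmpi-arm, and the source file name are assumptions for illustration.

    # Build Open MPI for the ARM target (assumed triplet and prefix)
    ./configure --host=arm-linux-gnueabihf \
                --prefix=/opt/openmpi-arm \
                CC=arm-linux-gnueabihf-gcc CXX=arm-linux-gnueabihf-g++
    make && make install

    # Cross-compile an MPI program against that tree instead of the host mpicc
    arm-linux-gnueabihf-gcc hello_mpi.c -o hello_mpi \
        -I/opt/openmpi-arm/include -L/opt/openmpi-arm/lib -lmpi

The resulting binary and the ARM-side libmpi are then copied to (or NFS-mounted on) the boards, where the natively installed mpirun launches it.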

[OMPI users] Build options

2017-11-14 Thread Bennet Fauber
We are trying SLURM for the first time, and prior to this I've always built OMPI with Torque support. I was hoping that someone with more experience than I with both OMPI and SLURM might provide a bit of up-front advice. My situation is that we are running CentOS 7.3 (soon to be 7.4), we use Mell
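A hedged configure sketch for a Slurm-based build of the kind being asked about, assuming PMI headers and libraries live under /usr; the prefix and the choice between verbs and UCX are placeholders, not details from this message:

    ./configure --prefix=/opt/openmpi \
                --with-slurm \
                --with-pmi=/usr \
                --with-verbs        # or --with-ucx on newer Mellanox stacks
    make -j8 && make install

With PMI support built in, ranks can be launched directly with srun as well as with mpirun under a Slurm allocation.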

Re: [OMPI users] Build options

2017-11-14 Thread David Lee Braun
Hi Bennet, what is the issue you have with dlopen? And what options do you use with mpirun --bind? I think the only change I make to my OpenMPI compile is to add '--with-cuda=...' and '--with-pmi=...' D On 11/14/2017 10:01 AM, Bennet Fauber wrote: > We are trying SLURM for the first time, and p
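For reference, a sketch of the build David describes together with typical run-time binding controls; the CUDA and PMI paths, the prefix, and the rank count are placeholders:

    ./configure --prefix=/opt/openmpi \
                --with-cuda=/usr/local/cuda \
                --with-pmi=/usr

    # Typical binding controls at launch time
    mpirun -np 16 --bind-to core --map-by core --report-bindings ./app

--report-bindings prints where each rank landed, which is the quickest way to check that the mapping matches what was intended.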

Re: [OMPI users] Build options

2017-11-14 Thread Bennet Fauber
David, Thanks for the reply. I believe dlopen and Rmpi don't get along because Rmpi uses fork. That's a vague recollection from several years ago. R is pretty important for us. I believe that leaving dlopen enabled also hits our NFS server harder with I/O requests for the modules. --
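The workaround Bennet is alluding to is usually to build Open MPI without its dlopen-based component loader, so everything is linked into libmpi and no plugins are opened at run time; a minimal sketch (the flag is a standard Open MPI configure option, the prefix is assumed):

    ./configure --disable-dlopen --prefix=/opt/openmpi
    make -j8 && make install

A monolithic library also avoids each rank issuing many small reads for component files, which is the NFS load concern mentioned above.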

[OMPI users] Help with binding processes correctly in Hybrid code (Openmpi +openmp)

2017-11-14 Thread Anil K. Dasanna
Hello all, I am relatively new to MPI computing. I am doing particle simulations. So far I have only used pure MPI and I never had a problem. But for my system, it's best if one uses hybrid programming. However, I always fail to correctly bind all processes and receive binding errors from the cluster. Could
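A hedged illustration of the hybrid launch pattern being asked about, assuming 2 nodes with 2 MPI ranks per node and 8 OpenMP threads per rank; the counts and the application name are placeholders:

    export OMP_NUM_THREADS=8
    mpirun -np 4 --map-by ppr:2:node:PE=8 --bind-to core \
           --report-bindings ./hybrid_app

The PE=8 modifier reserves 8 cores per rank so the OpenMP threads have somewhere to run, and --report-bindings confirms that ranks are not stacked onto the same cores.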

Re: [OMPI users] Help with binding processes correctly in Hybrid code (Openmpi +openmp)

2017-11-14 Thread Gilles Gouaillardet
Hi, per https://www2.cisl.ucar.edu/resources/computational-systems/cheyenne/running-jobs/pbs-pro-job-script-examples, you can try #PBS -l select=2:ncpus=16:mpiprocs=2:ompthreads=8 Cheers, Gilles On Tue, Nov 14, 2017 at 4:32 PM, Anil K. Dasanna wrote: > Hello all, > > I am relatively new to
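Putting Gilles' suggested resource request into a complete, hypothetical PBS Pro script (walltime, application name, and the mpirun mapping are assumptions added for illustration):

    #!/bin/bash
    #PBS -l select=2:ncpus=16:mpiprocs=2:ompthreads=8
    #PBS -l walltime=01:00:00

    cd $PBS_O_WORKDIR
    export OMP_NUM_THREADS=8
    mpirun -np 4 --map-by ppr:2:node:PE=8 --bind-to core ./hybrid_app

The select line requests 2 chunks of 16 cores with 2 MPI processes and 8 OpenMP threads each, i.e. 4 ranks x 8 threads, matching the mpirun mapping below it.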