Re: [OMPI users] Moving an installation

2020-07-24 Thread Steven Varga via users
Hi, currently I am approaching a similar problem/workflow with Spack and an AWS S3 shared storage. Mounting the storage from a laptop gives you the same layout as on each node of my AWS EC2 cluster. As others mentioned before: you still have to recompile your work to take advantage of the Xeon class c
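
A minimal sketch of that recompile step with Spack; the target name here is an assumption, so first check what spack arch reports on the compute nodes:

    # show the platform/OS/microarchitecture triple Spack detects on this node
    spack arch
    # rebuild Open MPI pinned to a Skylake-class Xeon target (assumed name)
    spack install openmpi target=skylake_avx512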

Re: [OMPI users] Singleton and Spawn

2019-09-25 Thread Steven Varga via users
As far as I know, you have to wire up the connections among MPI clients, allocate resources, etc. PMIx is a library that sets up all the processes, and it ships with Open MPI. The standard HPC method for launching tasks is through job schedulers such as SLURM or Grid Engine. SLURM srun is very similar to mpirun:
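
A minimal sketch of that equivalence; ./my_app and the rank count are placeholders:

    # launch a 4-rank job with Open MPI's own launcher
    mpirun -np 4 ./my_app
    # the SLURM equivalent, using the PMIx launch plugin
    srun -n 4 --mpi=pmix ./my_app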

Re: [OMPI users] **URGENT: Error during testing

2019-08-19 Thread Steven Varga via users
Hi, this is Steven. I am building custom clusters on AWS EC2 and had some problems in the past. I am getting good results with an external PMIx 3.1.3:

    ./autogen.sh && ./configure --prefix=/usr/local/ --with-platform=optimized --with-hwloc=/usr/local --with-libevent=/usr/local --enable-pmix-binaries --en
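
That configure line builds PMIx itself; as a hedged follow-up sketch, the Open MPI build is then pointed at that external PMIx (all paths are assumptions):

    # build Open MPI against the PMIx, hwloc, and libevent installed under /usr/local
    ./configure --prefix=/usr/local --with-pmix=/usr/local \
                --with-hwloc=/usr/local --with-libevent=/usr/local
    make -j && make install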

Re: [OMPI users] Segmentation fault when using 31 or 32 ranks

2019-07-10 Thread Steven Varga via users
Hi, I am fighting a similar issue. Did you try updating PMIx to the most recent 3.1.3 series release? On Wed, Jul 10, 2019, 12:24 Raymond Arter via users, <users@lists.open-mpi.org> wrote: > Hi, > > I have the following issue with version 4.0.1 when running on a node with > two 16-core CPUs (Intel Xeon Gol
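
A minimal sketch of the suggested update, assuming the pmix-3.1.3 release tarball is already downloaded and unpacked, with /usr/local as an assumed install prefix:

    cd pmix-3.1.3
    ./configure --prefix=/usr/local
    make -j && make install
    # then rebuild Open MPI with --with-pmix=/usr/local so it picks up the new library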