Re: [OMPI users] [EXTERNAL] Issue with mpirun inside a container

2024-09-30 Thread Jeffrey Layton via users
everything in the container, but for some reason Open MPI failed to figure out the name in the host file is the container. In this case, try without the -H option, or try using localhost in the host file. Cheers, Gilles On Mon, Sep 30, 2024 at 1:34 AM Jeffr

Re: [OMPI users] [EXTERNAL] Issue with mpirun inside a container

2024-09-29 Thread Jeffrey Layton via users
nt. Howard From: users on behalf of Jeffrey Layton via users Reply-To: Open MPI Users Date: Friday, September 27, 2024 at 1:08 PM To: Open MPI Users Cc: Jeffrey Layton Subject: [EXTERNAL] [OMPI users] Issue with mpirun inside a container Good afternoon, I'm getting an

[OMPI users] Issue with mpirun inside a container

2024-09-27 Thread Jeffrey Layton via users
Good afternoon, I'm getting an error message when I run "mpirun ... " inside a Docker container. The message: bash: line 1: /usr/local/mpi/bin/orted: No such file or directory -- ORTE was unable to reliably start one or more
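The "orted: No such file or directory" message usually means the launcher cannot find Open MPI's runtime daemon at the same path in every environment it starts processes in (here, inside the container). A minimal sketch of the usual workarounds, assuming Open MPI lives under /usr/local/mpi as the error message above suggests; the binary name and process count are placeholders:

```shell
# Tell mpirun where Open MPI is installed on the remote/containerized
# side so it can locate orted there (path taken from the error message).
mpirun --prefix /usr/local/mpi -np 4 ./my_app

# Alternatively, export the paths before launching so that
# non-interactive shells inherit them.
export PATH=/usr/local/mpi/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/mpi/lib:$LD_LIBRARY_PATH
mpirun -np 4 ./my_app
```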

[OMPI users] General problem after installation

2024-08-27 Thread Jeffrey Layton via users
Good morning, I'm building version 5.0.3 and the configuration and installation all seem to go fine. I'm installing into my home directory: /home/user/bin So I configured it with the command: CC=gcc FC=gfortran ./configure prefix=/home/user/bin gcc is version 11.4.0 and gfortran is version 11
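For reference, a typical from-source build into a home directory follows the sketch below; note that the canonical spelling of configure's install option is `--prefix=...` (with leading dashes). Paths and the parallelism level are taken from the post or illustrative:

```shell
# Configure, build, and install Open MPI into the home directory.
CC=gcc FC=gfortran ./configure --prefix=$HOME/bin
make -j 4
make install
# With this prefix, binaries land in $HOME/bin/bin and libraries in
# $HOME/bin/lib, so both must be added to the search paths:
export PATH=$HOME/bin/bin:$PATH
export LD_LIBRARY_PATH=$HOME/bin/lib:$LD_LIBRARY_PATH
```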

[OMPI users] Debug lesson #3

2024-05-16 Thread Jeffrey Layton via users
Good morning, In my debugging education I've hit lesson #3 with an error I'm not sure about. I tried reading about it, but I didn't quite get it. My command line is the following: mpirun --mca pml '^ucx' --mca btl '^openib' -np 1 -map-by ppr:8:node /home/jelayton/bin/bin/app This is just the s
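In Open MPI's MCA selection syntax, a leading caret excludes components rather than selecting them. A hedged sketch of the command above, with the binary path shortened to a placeholder:

```shell
# '^ucx' excludes the ucx PML; '^openib' excludes the openib BTL.
# ppr:8:node requests 8 processes per node, so -np is normally a
# multiple of 8 (the original post used -np 1, which launches only
# one process regardless of the mapping policy).
mpirun --mca pml '^ucx' --mca btl '^openib' \
       -np 8 --map-by ppr:8:node ./app
```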

[OMPI users] Helping interpreting error output

2024-04-16 Thread Jeffrey Layton via users
Good afternoon MPI fans of all ages, Yet again, I'm getting an error that I'm having trouble interpreting. This time, I'm trying to run ior. I've done it a thousand times but not on an NVIDIA DGX A100 with multiple NICs. The ultimate command is the following: /cm/shared/apps/openmpi4/gcc/4.1.5/
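On a machine with several NICs, a common first diagnostic step (a sketch, not necessarily the fix suggested on the list) is to pin Open MPI's TCP transport to a single interface:

```shell
# enp225s0 is a hypothetical interface name; list real ones with `ip link`.
# btl_tcp_if_include restricts the TCP BTL to the named NIC.
mpirun --mca btl_tcp_if_include enp225s0 -np 8 ./ior -w -r -t 1m -b 4g
```

The IOR flags (-w write, -r read, -t transfer size, -b block size per task) are standard; the sizes are illustrative.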

Re: [OMPI users] [EXTERNAL] Help deciphering error message

2024-03-08 Thread Jeffrey Layton via users
are/info --disable-optimizations --disable-logging --disable-debug --disable-assertions --enable-mt --disable-params-check --without-go --without-java --enable-cma --with-cuda --with-gdrcopy --with-verbs --with-knem --with-rdmacm --without-rocm --with-xpmem --without-fuse3

[OMPI users] Help deciphering error message

2024-03-07 Thread Jeffrey Layton via users
Good afternoon, I'm getting an error message I'm not sure how to use to debug an issue. I'll try to give you all of the pertinent information about the setup, but I didn't build the system nor install the software. It's an NVIDIA SuperPod system with Base Command Manager 10.0. I'm building IOR but I'm really

Re: [OMPI users] Error build Open MPI 4.1.5 with GCC 11.3

2023-07-18 Thread Jeffrey Layton via users
ndency analysis. I'm guessing that /usr/lib/gcc/x86_64-linux-gnu/9/include/float.h doesn't actually exist on your system -- but then how did it get into Open MPI's makefiles? Did you run configure on one machine and make on a different machine, perchance? --

Re: [OMPI users] Error build Open MPI 4.1.5 with GCC 11.3

2023-07-18 Thread Jeffrey Layton via users
makefiles? Did you run configure on one machine and make on a different machine, perchance? -- From: users on behalf of Jeffrey Layton via users Sent: Monday, July 17, 2023 2:05 PM To: Open MPI Users Cc: Jeffrey

[OMPI users] Error build Open MPI 4.1.5 with GCC 11.3

2023-07-17 Thread Jeffrey Layton via users
Good afternoon, I'm trying to build Open MPI 4.1.5 using GCC 11.3. However, I get an error that I'm not sure how to correct. The error is, ... CC pscatter.lo CC piscatter.lo CC pscatterv.lo CC piscatterv.lo CC psend.lo CC psend_init.lo CC ps

[OMPI users] Question about run time message

2020-03-13 Thread Jeffrey Layton via users
Good morning, I've compiled a hello world MPI code and when I run it, I get some messages I'm not familiar with. The first one is, -- WARNING: Linux kernel CMA support was requested via the btl_vader_single_copy_mechanism MCA
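The CMA warning refers to the vader shared-memory BTL's single-copy mechanism. If the kernel lacks CMA support, one hedged workaround is to select a different mechanism via the parameter named in the warning; the binary name is a placeholder for the hello-world program from the post:

```shell
# 'none' disables single-copy and falls back to copy-in/copy-out;
# other valid values include cma, xpmem, and knem where available.
mpirun --mca btl_vader_single_copy_mechanism none -np 4 ./hello_mpi
```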

Re: [OMPI users] UCX and MPI_THREAD_MULTIPLE

2019-08-23 Thread Jeffrey Layton via users
Adding the UCX list to this thread. On Fri, Aug 23, 2019 at 7:35 PM Paul Edmon via users <users@lists.open-mpi.org> wrote: I have a code using MPI_THREAD_MULTIPLE along with MPI-RMA that I'm running with Open MPI 4.0.1. Since 4.0.1 requires UCX I have it installed with MT on (1.6.0 build). The
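When debugging MPI_THREAD_MULTIPLE over UCX, it helps to confirm what both builds actually support; two standard introspection commands:

```shell
# Shows Open MPI's thread-support level (look for MPI_THREAD_MULTIPLE: yes).
ompi_info | grep -i thread
# Prints the UCX version and its configure line (look for --enable-mt).
ucx_info -v
```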