Re: [OMPI users] mpirun hangs on m1 mac w openmpi-4.1.3

2022-05-05 Thread Bennet Fauber via users
Any chance this is all due to an OS X security setting? Apple has been putting locked doors on many, many things lately. On Thu, May 5, 2022 at 8:57 AM Jeff Squyres (jsquyres) via users wrote: > > Scott -- > > Sorry; something I should have clarified in my original email: I meant you to > run

Re: [OMPI users] libmpi_mpifh.so.40 - error

2022-01-31 Thread Bennet Fauber via users
If you are running this on a cluster or other professionally supported machine, your system administrator may be able to help. You should also check whether you should be running LS-DYNA directly; I believe you should instead be running mpirun or mpiexec followed by the name of the LS-DYNA
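The launch pattern described above can be sketched as follows. The solver binary name (`mpp-dyna`) and the input-deck argument (`i=input.k`) are placeholders, not confirmed LS-DYNA invocation syntax; check the LS-DYNA documentation for the real ones.

```shell
# Sketch of launching an MPI-linked solver under mpirun.
# "mpp-dyna" and "i=input.k" are hypothetical placeholders.
NPROCS=4
LAUNCH_CMD="mpirun -np ${NPROCS} ./mpp-dyna i=input.k"
# Printed rather than executed here, since the solver binary is not present:
echo "${LAUNCH_CMD}"
```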

Re: [OMPI users] RES: OpenMPI - Intel MPI

2022-01-26 Thread Bennet Fauber via users
Luis, Can you install OpenMPI into your home directory (or another shared filesystem) and use that? You may also want to contact your cluster admins to see if they can help do that or offer another solution. On Wed, Jan 26, 2022 at 3:21 PM Luis Alfredo Pires Barbosa via users wrote: > > Hi Ralph, >
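A home-directory install usually looks like the sketch below. The version number and prefix path are assumptions; substitute your own, and note the build commands are commented out so the snippet is inert.

```shell
# Minimal sketch of a user-level (home-directory) OpenMPI install.
# Version and prefix are assumptions; adjust to your site.
PREFIX="$HOME/opt/openmpi-4.1.1"
# Download, unpack, and build (commented out so this snippet is inert):
#   tar xf openmpi-4.1.1.tar.bz2 && cd openmpi-4.1.1
#   ./configure --prefix="$PREFIX" && make -j4 && make install
# Then put the private install first on the search paths:
export PATH="$PREFIX/bin:$PATH"
export LD_LIBRARY_PATH="$PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```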

Re: [OMPI users] OpenMPI 4.1.1, CentOS 7.9, nVidia HPC-SDK, build hints?

2021-09-30 Thread Bennet Fauber via users
definition of `ompi_op_avx_3buff_functions_avx2'
> ./.libs/liblocal_ops_avx2.a(liblocal_ops_avx2_la-op_avx_functions.o):/project/muno/OpenMPI/BUILD/SRC/openmpi-4.1.1/ompi/mca/op/avx/op_avx_functions.c:651: first defined here
> make[2]: *** [mca_op_avx.la] Error 2
> make[2]: Le

Re: [OMPI users] OpenMPI 4.1.1, CentOS 7.9, nVidia HPC-SDK, build hints?

2021-09-29 Thread Bennet Fauber via users
Ray, If all the errors about not being compiled with -fPIC are still appearing, there may be a bug that is preventing the option from getting through to the compiler(s). It might be worth looking through the logs to see the full compile command for one or more of them to see whether that is true?
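Checking the logs for that can be done with a grep over the build output. The two-line "log" below is fabricated purely for illustration; on a real build you would pipe `config.log` or the `make` output through the same filter.

```shell
# Illustrative only: a two-line stand-in for make output, showing the
# kind of grep that reveals whether -fPIC reached each compile command.
BUILD_LOG='gcc -O3 -fPIC -c op_avx_functions.c -o op_avx_functions.o
gcc -O3 -c op_avx_functions.c -o op_avx_functions.o'
# Compile lines that never mention -fPIC are the suspects:
printf '%s\n' "$BUILD_LOG" | grep -v -- '-fPIC'
```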

[OMPI users] MCA parameter to disable OFI?

2021-04-20 Thread Bennet Fauber via users
We are getting this message when OpenMPI starts up.
--
WARNING: There was an error initializing an OpenFabrics device.
Local host:   gls801
Local device: mlx5_0
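On the subject line's question: MCA frameworks can be steered from the environment with variables of the form `OMPI_MCA_<framework>`, where a leading `^` excludes components. A hedged sketch of excluding the OpenFabrics paths follows; whether this silences the warning depends on which component is actually probing the device.

```shell
# Exclude the ofi (and, on older builds, openib) components so Open MPI
# does not try to initialize those OpenFabrics paths.  The "^" prefix
# means "everything except the listed components".
export OMPI_MCA_btl='^ofi,openib'
export OMPI_MCA_mtl='^ofi'
# Command-line equivalent: mpirun --mca btl '^ofi,openib' --mca mtl '^ofi' ...
```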

Re: [OMPI users] OpenMPI and maker - Multiple messages

2021-02-18 Thread Bennet Fauber via users
Thomas, I think OpenMPI is installed correctly. This $ mpiexec -mca btl ^openib -N 5 gcc --version asks OpenMPI to run `gcc --version` once for each processor assigned to the job, so if you did NOT get 5 sets of output, it would be incorrect. From your error message, it looks to me as th

Re: [OMPI users] Books/resources to learn (open)MPI from

2020-08-06 Thread Bennet Fauber via users
It covers a good deal more than MPI, but there is at least one full chapter on MPI in Scientific Programming and Computer Architecture, Divakar Viswanath (MIT Press, 2017), also available online at https://divakarvi.github.io/bk-spca/spca.html

[OMPI users] vader_single_copy_mechanism

2020-02-24 Thread Bennet Fauber via users
We are getting errors on our system that indicate that we should export
OMPI_MCA_btl_vader_single_copy_mechanism=none
Our user originally reported:
> This occurs for both GCC and PGI. The errors we get if we do not set this
> indicate something is going wrong in our communication which uses
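The workaround named above can be applied per-shell or persistently; the per-user MCA parameters file path shown is the conventional location, but verify it for your installation.

```shell
# The workaround from the report above, set for the current shell:
export OMPI_MCA_btl_vader_single_copy_mechanism=none
# To persist it, the same knob can go in a per-user MCA parameters file
# (conventional location shown; commented out here to keep this inert):
#   echo 'btl_vader_single_copy_mechanism = none' >> ~/.openmpi/mca-params.conf
```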

Re: [OMPI users] OpenFabrics

2020-02-03 Thread Bennet Fauber via users
This is what CentOS installed.
$ yum list installed hwloc\*
Loaded plugins: langpacks
Installed Packages
hwloc.x86_64        1.11.8-4.el7  @os
hwloc-devel.x86_64  1.11.8-4.el7  @os
hwloc-libs.x86_64

Re: [OMPI users] OpenFabrics

2020-02-03 Thread Bennet Fauber via users
...scheduler (Slurm), PMIx, and OpenMPI, so I am a bit muddled about how all the moving pieces work yet. On Sun, Feb 2, 2020 at 4:16 PM Jeff Squyres (jsquyres) wrote: > > Bennet -- > > Just curious: is there a reason you're not using UCX? > > > > On Feb 2, 2020, a

[OMPI users] OpenFabrics

2020-02-02 Thread Bennet Fauber via users
We get these warnings/errors from OpenMPI, versions 3.1.4 and 4.0.2:
--
WARNING: No preset parameters were found for the device that Open MPI detected:
Local host:   gl3080
Device name:  mlx5_0
Device ven

Re: [OMPI users] OpenMPI 3.1.4 and UCX

2019-09-08 Thread Bennet Fauber via users
Setting UCX_LOG_LEVEL=error suppresses the messages. There may be release eager messages. If anyone is interested, this is the GitHub Issue: https://github.com/openucx/ucx/issues/4175 On Sun, Sep 8, 2019 at 11:37 AM Bennet Fauber wrote: > > I am posting this here, first, as I think these quest
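The suppression described above is a single environment variable:

```shell
# Quiet UCX by lowering its log verbosity: "error" hides WARN-level
# messages (such as the mpool diagnostics) while still showing errors.
export UCX_LOG_LEVEL=error
```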

[OMPI users] OpenMPI 3.1.4 and UCX

2019-09-08 Thread Bennet Fauber via users
I am posting this here, first, as I think these questions are probably OpenMPI related and not related specifically to parallel HDF5. I am trying to get parallel HDF5 installed, but in the `make check`, I am getting many, many warnings of the form - mpool.c:38 UCX WARN object 0x2afbefc67f