Hi,
I have been trying to install OpenMPI v4.1.4 on a university HPC cluster.
We use Bright Cluster Manager, with SLURM v21.08.8 on RHEL 8.6. I used a
script that a former co-worker had previously used to install OpenMPI v3.0.0
successfully. I updated it for v4.1.4 and the new versions of our dependencies
(SLURM, hwloc, CUDA).

On Tue, Oct 4, 2022 at 12:33 PM Pritchard Jr., Howard wrote:
> HI JD,
>
> Could you post the configure options your script uses to build Open MPI?
>
> Howard
>
> *Jeffrey D. (JD) Tamucci*
> University of Connecticut
> Molecular & Cell Biology
> RA in Lab of Eric R. May
> PhD / MPH Candidate
> he/him
>
> shared/apps/slurm/21.08.8/lib64 \
> --with-hwloc=/cm/shared/apps/hwloc/1.11.11 \
> --with-cuda=/gpfs/sharedfs1/admin/hpc2.0/apps/cuda/11.6 \
> --enable-shared \
> --enable-static &&
> make -j 32 &&
> make -j 32 check
> make install
>
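For anyone following along, a minimal post-install sanity check in the same
spirit as the script above can help confirm the build actually picked up
SLURM and CUDA support before digging into configure options. The install
prefix and the --mpi=pmi2 choice below are placeholders and assumptions, not
values taken from JD's script:

#!/bin/bash
# Sanity-check sketch for a fresh Open MPI 4.1.4 build.
# OMPI_PREFIX is a hypothetical install prefix; adjust to your own.
OMPI_PREFIX=/path/to/openmpi-4.1.4
export PATH=$OMPI_PREFIX/bin:$PATH
export LD_LIBRARY_PATH=$OMPI_PREFIX/lib:$LD_LIBRARY_PATH

# Confirm the build was configured with SLURM and CUDA support.
ompi_info | grep -i slurm   # expect slurm plm/ras components to be listed
ompi_info | grep -i cuda    # expect CUDA to appear if --with-cuda took effect

# Compile and launch a two-rank hello world through SLURM.
cat > hello_mpi.c <<'EOF'
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF
mpicc hello_mpi.c -o hello_mpi
srun -N 1 -n 2 --mpi=pmi2 ./hello_mpi

If the hello world hangs or aborts in MPI_Init under srun, that often points
to a PMI mismatch between the Open MPI build and SLURM, which seems worth
checking given the slurm lib64 path in the configure line above.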