Did you happen to get MLNX_OFED 4.7.1, which comes with ucx-1.7.0-1.47100,
compiled against Open MPI 4.0.2?
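
For reference, this is roughly how I've been checking what a node actually has
installed -- exact output and package layout may differ on your system:

    ofed_info -s               # short MLNX_OFED release string
    ucx_info -v                # UCX version and the options it was built with
    ompi_info | grep -i ucx    # UCX components this Open MPI build picked up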

I got snagged by this:

https://github.com/open-mpi/ompi/issues/7128

which I thought would have had the fixes merged into the v4.0.2 tag,
but that doesn't seem to be the case with my build.
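
In case it helps anyone double-check, this is roughly how I've been verifying
whether a particular fix landed in a release tag (the commit hash is a
placeholder -- take it from the PR referenced in the issue above):

    git clone https://github.com/open-mpi/ompi.git
    cd ompi
    git tag --contains <commit-sha>                  # tags that already contain the fix
    git log --oneline v4.0.1..v4.0.2 | grep -i ucx   # or scan the 4.0.2 changes for UCX-related commits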


On Fri, Feb 7, 2020 at 11:34 AM Ray Muno via users
<users@lists.open-mpi.org> wrote:
>
> We're using MLNX_OFED 4.7.3, which supplies UCX 1.7.0.
>
> We have Open MPI 4.0.2 compiled against the Mellanox OFED 4.7.3-provided 
> versions of UCX, KNEM and
> HCOLL, along with HWLOC 2.1.0 from the Open MPI site.
>
> I mirrored the configuration Mellanox used to build Open MPI in HPC-X 2.5.
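>
> For anyone who wants to reproduce it, the configure line ends up looking roughly
> like this -- install prefixes are site-specific, so treat the paths as placeholders:
>
>     ./configure --prefix=<install-prefix> \
>         --with-ucx=<ucx-install-prefix> \
>         --with-knem=<knem-install-prefix> \
>         --with-hcoll=<hcoll-install-prefix> \
>         --with-hwloc=<hwloc-2.1.0-prefix>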
>
> I have users using GCC, PGI, Intel and AOCC compilers with this config.  PGI 
> was the only one that
> was a challenge to build due to conflicts with HCOLL.
>
> -Ray Muno
>
> On 2/7/20 10:04 AM, Michael Di Domenico via users wrote:
> > I haven't compiled Open MPI in a while, but I'm in the process of
> > upgrading our cluster.
> >
> > The last time I did this there were specific versions of MPI/PMIx/UCX
> > that were all tested and supposed to work together.  My understanding
> > was that this was because PMIx/UCX were under rapid development and the
> > APIs were changing.
> >
> > Is that still an issue, or can I take the latest stable branches from
> > git for each and have a relatively good shot at it all working
> > together?
> >
> > The one semi-immovable piece I have right now is UCX, which is at 1.7.0 as
> > installed by Mellanox OFED.  If the above is true, is there a matrix
> > of versions I should be using for all the others?  Nothing jumped out
> > at me on the Open MPI website.
> >
>
>
> --
>
>   Ray Muno
>   IT Manager
>   e-mail:   m...@aem.umn.edu
>   University of Minnesota
>   Aerospace Engineering and Mechanics         Mechanical Engineering
>
