Re: [OMPI users] MPI with RoCE

2022-09-06 Thread Harutyun Umrshatyan via users
…ever Infiniband adapters are more than 'super ethernet adapters'. I would run the following utilities to investigate your Infiniband fabric:

sminfo
ibhosts
ibdiagnet

Then on one of the compute nodes:

ofed_info
ompi_info
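The fabric checks suggested in this reply can be sketched as a short script. This is a hedged sketch: it assumes the OFED/rdma-core diagnostic tools are installed, and the output of each command depends entirely on the local fabric.

```shell
#!/bin/sh
# Fabric sanity checks (requires OFED / infiniband-diags installed).

sminfo        # is a subnet manager reachable?
ibhosts       # list host channel adapters visible on the fabric
ibdiagnet     # full fabric diagnostic sweep (writes reports to disk)

# Then on one of the compute nodes:
ofed_info -s              # short OFED stack version string
ompi_info | grep -i ucx   # was Open MPI built with UCX support?
```

If `ompi_info` shows no UCX component, Open MPI will not be able to use the RoCE path through UCX regardless of how the fabric itself is configured.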

Re: [OMPI users] [EXT] MPI with RoCE

2022-09-03 Thread Harutyun Umrshatyan via users
Once RoCE is configured and tested (using things like ib_send_bw -d mlx5_bond_0 -x 7 -R -T 106 -D 10), getting UCX to use RoCE is quite easy, and compiling OpenMPI to use UCX is also very easy.

Sean
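The build step Sean describes can be sketched as follows. This is a hedged sketch, not the poster's exact procedure: the install prefixes and version numbers are assumptions, and `--with-ucx` is the standard Open MPI configure flag for enabling the UCX PML.

```shell
#!/bin/sh
# Sketch: verify RoCE with perftest, then build Open MPI against UCX.
# Assumed paths/versions are placeholders; adjust for your system.

# 1. Bandwidth test over the RoCE device (run ib_send_bw on a second
#    node first, then point this client at it):
ib_send_bw -d mlx5_bond_0 -x 7 -R -T 106 -D 10 <server-hostname>

# 2. Build Open MPI with UCX support (UCX assumed installed under /usr):
./configure --prefix=/opt/openmpi --with-ucx=/usr
make -j"$(nproc)"
sudo make install

# 3. Confirm the UCX PML component was built:
/opt/openmpi/bin/ompi_info | grep -i ucx
```

The `-x 7` flag selects the GID index for the RoCE address; the right index varies per system and can be checked with `show_gids` on Mellanox/NVIDIA stacks.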

[OMPI users] MPI with RoCE

2022-09-03 Thread Harutyun Umrshatyan via users
Hi everyone, Could someone please share any experience using MPI with RoCE? I am trying to set up InfiniBand adapters (Mellanox cards, for example) and run MPI applications with RoCE (instead of TCP). As I understand, there might be some environment requirements or restrictions like kernel version, …
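For readers landing on this thread: once Open MPI is built with UCX (as the replies above describe), a typical launch forcing the RoCE path looks roughly like the sketch below. The device name `mlx5_0:1` and the application name `./app` are assumptions for illustration; `UCX_NET_DEVICES` and `UCX_TLS` are standard UCX environment variables.

```shell
# Hedged sketch: run an MPI job over RoCE via the UCX PML.
# Replace mlx5_0:1 with your adapter/port (see `ucx_info -d`).
mpirun -np 2 --mca pml ucx \
    -x UCX_NET_DEVICES=mlx5_0:1 \
    -x UCX_TLS=rc,self,sm \
    ./app
```

Selecting `--mca pml ucx` makes Open MPI fail loudly if UCX is unavailable, which is usually preferable to silently falling back to TCP when debugging a RoCE setup.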