protocols for MPI message passing (and ignore the "normal" Ethernet interfaces).
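For example (a rough sketch only; the UCX device name mlx5_0:1, the Ethernet interface name eth0, the hostfile, and the application name ./my_mpi_app are placeholders, not values from this thread), that selection typically looks like:

    mpirun --mca pml ucx -x UCX_NET_DEVICES=mlx5_0:1 --hostfile hosts -np 2 ./my_mpi_app
    # or, if you stay on the TCP BTL, at least skip the loopback and regular Ethernet interfaces:
    mpirun --mca btl_tcp_if_exclude lo,eth0 --hostfile hosts -np 2 ./my_mpi_app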
--
Jeff Squyres
jsquy...@cisco.com
From: Harutyun Umrshatyan via users
Sent: Tuesday, September 6, 2022 2:58 AM
To: Open MPI Users
Cc: Harutyun Umrshatyan
Guys,
I actually could make it work!
I had to change the Mellanox port configuration from Ethernet to Infiniband and
set up IPoIB.
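(As a rough sketch for anyone reading the archives later - the MST device path /dev/mst/mt4121_pciconf0, the IPoIB interface name ib0, and the address below are examples, not the actual values from this setup:)

    mst start
    mlxconfig -d /dev/mst/mt4121_pciconf0 set LINK_TYPE_P1=1   # 1 = Infiniband, 2 = Ethernet
    # reboot the node (or reset the adapter) so the new port type takes effect, then bring up IPoIB:
    ip addr add 192.168.100.1/24 dev ib0
    ip link set ib0 up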
That was in fact a good experience, but the issue is that not all of my
Mellanox adapters can be configured to Infiniband.
My ultimate goal is to make it work without Mellanox OFED o
Stupid reply from me. You do know that Infiniband adapters operate without
an IP address?
Yes, configuring IPoIB is a good idea - however, Infiniband adapters are
more than 'super Ethernet adapters'.
I would run the following utilities to investigate your Infiniband fabric:
sminfo
ibhosts
ibdiagnet
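A couple of per-host checks can also help confirm the port state and link type (the HCA name mlx5_0 below is just an example):

    ibstat mlx5_0          # port State should be Active and Physical state LinkUp
    ibv_devinfo -d mlx5_0  # link_layer reads InfiniBand for IB ports and Ethernet for RoCE ports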
Hi everyone
Could someone please share any experience using MPI with RoCE?
I am trying to set up Infiniband adapters (Mellanox cards, for example) and
run MPI applications with RoCE (instead of TCP).
As I understand it, there might be some environment requirements or
restrictions, such as the kernel version,