On Jul 16, 2020, at 2:56 PM, Lana Deere via users <users@lists.open-mpi.org> wrote:

> I am new to Open MPI. I built 4.0.4 on a CentOS 7 machine and tried an mpirun of a small program compiled against Open MPI. It seems to have failed because my host does not have InfiniBand. I can't figure out how to configure the build so that it does what I want: use InfiniBand if there are IB HCAs on the system, and otherwise use the Ethernet on the system.

UCX is the underlying library that Mellanox/NVIDIA prefers these days for use with MPI and InfiniBand. Meaning: you should first install UCX, and then build Open MPI with --with-ucx=/directory/of/ucx/installation.

We just hosted parts 1 and 2 of a seminar entitled "The ABCs of Open MPI" that covered topics like this. Check out:

https://www.open-mpi.org/video/?category=general#abcs-of-open-mpi-part-1
https://www.open-mpi.org/video/?category=general#abcs-of-open-mpi-part-2

In particular, you might want to look at slides 28-42 in part 2 for a bunch of discussion about how Open MPI (by default) picks the underlying network / APIs to use, and then how you can override that if you want to.

-- 
Jeff Squyres
jsquy...@cisco.com
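A minimal sketch of the build-and-run workflow described above. The install prefixes (/opt/ucx, /opt/openmpi-4.0.4) and the program name (./my_mpi_program) are assumptions for illustration; adjust them to your system:

```shell
# Build and install UCX first (the prefix /opt/ucx is an assumed example).
./configure --prefix=/opt/ucx
make -j && make install

# Then build Open MPI 4.0.4 against that UCX installation.
./configure --prefix=/opt/openmpi-4.0.4 --with-ucx=/opt/ucx
make -j && make install

# At run time, Open MPI chooses a transport automatically: UCX on hosts
# with IB HCAs, TCP otherwise. To override the default selection:
mpirun --mca pml ucx ./my_mpi_program                    # require UCX (InfiniBand)
mpirun --mca pml ob1 --mca btl tcp,self ./my_mpi_program # force TCP over Ethernet
```

The --mca flags set standard Open MPI 4.x MCA parameters; slides 28-42 of part 2 of the seminar cover this selection logic in more detail.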