Sorry, I forgot to mention that I did get my MPI app working with:
mpirun --mca oob_tcp_dynamic_ipv4_ports 46100-46117 \
       --mca btl_tcp_port_min_v4 46118 \
       --mca btl_tcp_port_range_v4 17
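(If it helps, I believe the same values can also be exported through the environment rather than on the mpirun line itself -- the -np 4 and ./my_app below are just placeholders for my real job:)

  export OMPI_MCA_oob_tcp_dynamic_ipv4_ports=46100-46117
  export OMPI_MCA_btl_tcp_port_min_v4=46118
  export OMPI_MCA_btl_tcp_port_range_v4=17
  mpirun -np 4 ./my_app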
But it’s not safe just to hard-code those port ranges in case someone else uses
those ports, or I want to run the
Hi,
Thanks for the explanation. I’m trying to restrict the port range, because if I
don’t, mpiexec doesn’t function reliably.
With 2 hosts it always works; then, as you add hosts, it becomes more and more
likely to fail, until by 16 hosts it almost always fails.
“Fails” here means that mpiexec termin
Gilles,
For some odd reason, 'self,vader' didn't seem as effective as "^tcp". I'm not
sure why, but at least I have something that seems to work.
I suppose I don't really need tcp sockets on a single laptop :D
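(For the record, the two variants I tried look roughly like this -- ./my_app and -np 2 are just stand-ins for my actual run:)

  mpirun --mca btl self,vader -np 2 ./my_app    # only the self and shared-memory BTLs
  mpirun --mca btl ^tcp -np 2 ./my_app          # everything except the tcp BTL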
Matt
On Thu, Mar 18, 2021 at 8:46 PM Gilles Gouaillardet via users <
users@lists.open-mpi.org> wrote:
Let me briefly explain how MPI jobs start. mpirun launches a set of daemons,
one per node. Each daemon has a "phone home" address passed to it on its command
line. It opens a port (obtained from its local OS) and connects back to the
port provided on its command line. This establishes a connection back to mpirun.
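In rough terms, the daemon side looks something like the sketch below. This is only an illustration of the pattern, not the actual ORTE daemon code, and the argv layout (home host/port as the two arguments) is hypothetical:

/* Illustrative sketch only -- not the actual Open MPI/ORTE source.
 * argv[1]/argv[2] stand in for the "phone home" host/port that mpirun
 * puts on the daemon's command line. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s home_host home_port\n", argv[0]);
        return 1;
    }

    /* Step 1: open a listening socket; port 0 lets the local OS pick any free port. */
    int lsock = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in local;
    memset(&local, 0, sizeof(local));
    local.sin_family = AF_INET;
    local.sin_addr.s_addr = htonl(INADDR_ANY);
    local.sin_port = htons(0);
    bind(lsock, (struct sockaddr *)&local, sizeof(local));
    listen(lsock, 16);

    socklen_t len = sizeof(local);
    getsockname(lsock, (struct sockaddr *)&local, &len);  /* learn which port the OS chose */

    /* Step 2: "phone home" -- connect back to the address given on the command line. */
    int hsock = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in home;
    memset(&home, 0, sizeof(home));
    home.sin_family = AF_INET;
    home.sin_port = htons((unsigned short)atoi(argv[2]));
    inet_pton(AF_INET, argv[1], &home.sin_addr);
    connect(hsock, (struct sockaddr *)&home, sizeof(home));

    /* Step 3: report the locally chosen port so the job knows how to reach this daemon. */
    char msg[64];
    snprintf(msg, sizeof(msg), "listening on %u\n", (unsigned)ntohs(local.sin_port));
    write(hsock, msg, strlen(msg));

    close(hsock);
    close(lsock);
    return 0;
}

The point of the sketch is that the listening port comes from the local OS of each node, which is why restricting the range only makes sense if the same range is actually free everywhere.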
No firewall between nodes in the cluster.
OMPI may be asking the local host for available ports, but is it checking that
those ports are also available on all the other hosts it’s going to run on?
On 18 Mar 2021, at 15:57, Ralph Castain via users
<users@lists.open-mpi.org> wrote:
Hmmm...then you