Hi everyone,
I'm working with OpenFOAM v5 and have been successful in getting two nodes working together
(both 18.04 LTS, connected via GbE).
As both machines have a quad-port gigabit NIC, I have been trying to persuade mpirun to use
more than a single link on each machine for its communications, but so far without success.
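For anyone trying the same thing, a minimal sketch of the kind of invocation involved, assuming
Open MPI's TCP BTL and hypothetical interface names eno1..eno4 (substitute your own names, or
CIDR subnets such as 192.168.10.0/24):

    # Limit Open MPI to TCP (plus self and shared memory) and name the NICs to use.
    # eno1..eno4, the hostfile "hosts", and ./myApp are all placeholders.
    mpirun --mca btl tcp,self,vader \
           --mca btl_tcp_if_include eno1,eno2,eno3,eno4 \
           -np 16 --hostfile hosts ./myApp

Note that the TCP BTL decides reachability per subnet, which is why the subnet point raised
below matters.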
Hi Jeff,
> How are you measuring that it hasn't been successful?
A network switch sits between the two machines and I am watching the link activity on the
ports.
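A host-side alternative is to watch the per-interface byte counters directly; a minimal sketch
using standard iproute2:

    # RX/TX byte counts for every interface, refreshed each second;
    # all four links should be climbing if striping is working.
    watch -n 1 ip -s link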
> One thing to make sure of is that your interfaces are on different subnets.
Oh, I had them all on the same subnet. Putting each port on its own subnet was the fix.
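For reference, a sketch of the one-subnet-per-port addressing Jeff is describing; the interface
names and address ranges are illustrative, not from the thread. On 18.04 this would go in a
netplan file such as /etc/netplan/01-netcfg.yaml:

    # Node 1; node 2 would use .2 in each of the four subnets.
    network:
      version: 2
      ethernets:
        eno1:
          addresses: [192.168.10.1/24]
        eno2:
          addresses: [192.168.11.1/24]
        eno3:
          addresses: [192.168.12.1/24]
        eno4:
          addresses: [192.168.13.1/24]

Apply with sudo netplan apply, then confirm each peer address pings over the expected port.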
Many, many thanks.
Couldn't see the wood for the trees!
I now have the two machines using all their 1Gb ports to talk to each other.
Cheers Jeff,
Happy holidays.
Bob. South UK.
Hi everyone,
To be honest, as an MPI / IB noob, I don't know if this falls under Open MPI or Mellanox.
I'm running a small cluster of HP DL380 G6/G7 machines.
Each runs Ubuntu Server 20.04 and has a Mellanox ConnectX-3 card; the cards are
connected through an unmanaged (dumb) IB switch.
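Before going further, a standard sanity check is confirming that the ConnectX-3 links are
actually up; a minimal sketch using the stock InfiniBand diagnostics (infiniband-diags package):

    # Each cabled port should report "State: Active" and
    # "Physical state: LinkUp".
    ibstat
    # Device, firmware, and port details:
    ibv_devinfo

If the ports sit in "Initializing" instead, note that an unmanaged switch has no embedded
subnet manager, so opensm needs to be running on one of the hosts.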
When I begin my MPI program (snappy