Re: [OMPI users] Multi-rail support

2010-03-25 Thread Jeff Squyres
On Mar 25, 2010, at 10:31 AM, Rolf Vandevaart wrote:
> You do not need to configure both IB ports or do IB bonding.

To hammer this point home: Open MPI will automatically use all ACTIVE OpenFabrics ports that it can find, regardless of whether they have IPoIB set up on them or not.

-- Jeff Squyres
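A quick way to confirm which ports count as ACTIVE is to check them with the standard InfiniBand diagnostics. This is only a sketch; the exact output depends on your OFED version and hardware:

    # Show the state of every port on every HCA (requires OFED's ibstat):
    $ ibstat
    # Any port reported as "State: Active" / "Physical state: LinkUp"
    # is a port the openib BTL can use.
    # Alternatively:
    $ ibv_devinfo | grep -e hca_id -e state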

Re: [OMPI users] Multi-rail support

2010-03-25 Thread Rolf Vandevaart
You need to ensure your cluster has some type of TCP connectivity. That can be via a non-IB interface if you have one, or by configuring one of your IB interfaces with an IP address (IPoIB). It is needed so the run-time environment can start the job and exchange endpoint information among all the processes. You do not need to configure both IB ports or do IB bonding.
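For example, if there is no separate Ethernet network, one way to provide that connectivity is to give ib0 an IP address on every node. This is only a sketch; the 192.168.10.x addresses and netmask are assumptions to be replaced with your own addressing scheme:

    # On the first node (use a unique address on each node):
    $ ifconfig ib0 192.168.10.1 netmask 255.255.255.0 up
    # Verify that the nodes can reach each other over IPoIB:
    $ ping -c 1 192.168.10.2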

Re: [OMPI users] Multi-rail support

2010-03-25 Thread PN
A quick question. Do I need to configure a different IP for each IB port before running mpirun? Or configure an IP and bond both IB ports? Or is configuring one IP for ib0 enough? Thanks a lot. PN

2010/3/25 Rolf Vandevaart:
> They will automatically be used by the library. There is nothing special that you need to do.

Re: [OMPI users] Multi-rail support

2010-03-25 Thread Rolf Vandevaart
They will automatically be used by the library. There is nothing special that you need to do. Unfortunately, there is no simple way to tell if they are being used. I would suggest that you specifically call them out in different calls to mpirun to make sure they are both working. [...]
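One way to do that, as a sketch: restrict the openib BTL to a single HCA/port per run with the btl_openib_if_include MCA parameter and run a bandwidth test over each rail in turn. The device names mlx4_0/mlx4_1, the host names n1/n2, and the osu_bw benchmark are only placeholders for whatever ibstat reports on your nodes and whatever test program you have:

    # Test the first HCA, port 1, by itself:
    $ mpirun -np 2 --host n1,n2 \
        --mca btl openib,sm,self \
        --mca btl_openib_if_include mlx4_0:1 ./osu_bw
    # Then test the second HCA, port 1, by itself:
    $ mpirun -np 2 --host n1,n2 \
        --mca btl openib,sm,self \
        --mca btl_openib_if_include mlx4_1:1 ./osu_bw
    # If both runs complete with sensible bandwidth, both rails work;
    # an ordinary run (without if_include) should then use both
    # active ports automatically.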

[OMPI users] Multi-rail support

2010-03-25 Thread Dmitry Kuzmin
Hi there, We have a cluster with 2 HCAs installed on each node (Mellanox ConnectX IB QDR cards). It's not clear from the documentation how we could use both of them for MPI communications. What should we do to enable the 2 cards, and how can we check that both are being used? Thanks in advance! Dmitry