Re: [OMPI users] OpenMPI with Gemini Interconnect

2014-04-16 Thread Ralph Castain
The Java bindings are written on top of the C bindings, so you'll be able to use those networks just fine from Java :-)

On Wed, Apr 16, 2014 at 2:27 PM, Saliya Ekanayake wrote:
> Thank you Nathan, this is what I was looking for. I'll try to build
> OpenMPI 1.8 and get back to this thread if I run into issues.
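Because the Java bindings are thin wrappers over the C library, a plain Java MPI program exercises whatever transport the underlying Open MPI build supports, Gemini included. A minimal sketch (the class name Hello is illustrative, not from the thread):

    import mpi.MPI;
    import mpi.MPIException;

    public class Hello {
        public static void main(String[] args) throws MPIException {
            MPI.Init(args);                       // boots the underlying C MPI library
            int rank = MPI.COMM_WORLD.getRank();  // delegates to MPI_Comm_rank via JNI
            int size = MPI.COMM_WORLD.getSize();  // delegates to MPI_Comm_size via JNI
            System.out.println("Hello from rank " + rank + " of " + size);
            MPI.Finalize();
        }
    }

Compile with mpijavac Hello.java and launch with mpirun java Hello; on a Cray the launcher invocation may differ.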

Re: [OMPI users] OpenMPI with Gemini Interconnect

2014-04-16 Thread Saliya Ekanayake
Thank you Nathan, this is what I was looking for. I'll try to build OpenMPI 1.8 and get back to this thread if I run into issues.

Saliya

On Wed, Apr 16, 2014 at 5:19 PM, Nathan Hjelm wrote:
> You do not need CCM to use Open MPI with Gemini and Aries. Open MPI
> has natively supported both networks since 1.7.0.

Re: [OMPI users] OpenMPI with Gemini Interconnect

2014-04-16 Thread Nathan Hjelm
You do not need CCM to use Open MPI with Gemini and Aries. Open MPI has natively supported both networks since 1.7.0. Please take a look at the platform files in contrib/platform/lanl/cray_xe6 for CLE 4.1 support. You should be able to just build using: configure --with-platform=contrib/platform/lanl/cray_xe6/…
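For reference, a hedged build sketch: the platform file name (optimized-lustre here) and the install prefix are assumptions, so list contrib/platform/lanl/cray_xe6/ for the files your CLE release actually ships; --enable-mpi-java enables the Java bindings discussed elsewhere in this thread:

    # hypothetical file name; check contrib/platform/lanl/cray_xe6/ for real options
    ./configure --with-platform=contrib/platform/lanl/cray_xe6/optimized-lustre \
                --enable-mpi-java \
                --prefix=$HOME/ompi-1.8
    make -j8 install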

Re: [OMPI users] OpenMPI with Gemini Interconnect

2014-04-16 Thread Saliya Ekanayake
I see. Also, I wanted to build OpenMPI because the provided OpenMPI didn't have the Java bindings. It seems at this point the only option on BigRed 2 is to use TCP in CCM, and if I remember correctly Mason and Quarry don't have IB either, correct?

Thank you,
Saliya

On Wed, Apr 16, 2014 at 5:01 PM, Ray Sheppard wrote:

Re: [OMPI users] OpenMPI with Gemini Interconnect

2014-04-16 Thread Ray Sheppard
Hello,

Big Red 2 provides its own MPICH-based MPI. The only case where the provided OpenMPI module becomes relevant is when you create a CCMLogin instance in Cluster Compatibility Mode (CCM). For most practical uses, those sorts of needs are better addressed on the Quarry or Mason machines.

[OMPI users] OpenMPI with Gemini Interconnect

2014-04-16 Thread Saliya Ekanayake
Hi,

We have a Cray XE6/XK7 supercomputer (BigRed II) and I was trying to get the OpenMPI Java bindings working on it. I couldn't find a way to utilize its Gemini interconnect and instead was running over TCP, which is inefficient. I see some work has been done along these lines in [1] and wonder if you could …