Yes, Ethernet and InfiniBand networks are connecting the nodes.
On Nov 22, 2013 11:55 PM, "Reuti" wrote:
> Hi,
>
> On 20.11.2013 at 21:42, Venkat Reddy wrote:
>
> > Hi Team,
> >
> > I have compiled the OpenFOAM-1.7.1, OpenFOAM-2.2.1, and OpenFOAM-2.2.2 versions.
> > All the versions have the same problem, that s
Hi,
I installed Open MPI on our small XE6 using the configure options under the /contrib
directory. It appears to be working fine, but it ignores MCA parameters (set
as environment variables). So I switched to mpirun (the one from Open MPI), and it
handles MCA parameters somehow. However, mpirun fails to allocate proc
My guess is that you aren't doing the allocation correctly - since you are
using qsub, can I assume you have Moab as your scheduler?
aprun should be forwarding the envars - do you see them if you just run "aprun
-n 1 printenv"?
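For reference, here is a quick sketch of the two ways Open MPI picks up MCA parameters (the btl value is only an illustration, not a recommendation for the XE6):
$ export OMPI_MCA_btl=self,sm,tcp            # environment-variable form: OMPI_MCA_<param>=<value>
$ aprun -n 1 printenv | grep OMPI_MCA        # check whether aprun forwards it to the compute node
$ mpirun -n 4 --mca btl self,sm,tcp ./a.out  # command-line form, parsed by mpirun itself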
On Nov 23, 2013, at 2:13 PM, Teranishi, Keita wrote:
> Hi,
>
> I
Strange - I run on Mavericks now without problems. Can you run "mpirun -n 1
hostname"?
You might also want to check your PATH and LD_LIBRARY_PATH to ensure the prefix
where you installed OMPI 1.6.5 is at the front. The Mac ships with a very old
version of OMPI, and you don't
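If it helps, a minimal sketch of putting a 1.6.5 install first on both search paths (the install prefix is just a placeholder):
$ export PATH=/opt/openmpi-1.6.5/bin:$PATH
$ export LD_LIBRARY_PATH=/opt/openmpi-1.6.5/lib:$LD_LIBRARY_PATH
$ which mpirun && mpirun --version           # path and version should both come from the 1.6.5 prefix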
Hmmm...well, it seems to work for me:
$ mpirun -n 4 ./thread_init
Calling MPI_Init_thread...
Calling MPI_Init_thread...
Calling MPI_Init_thread...
Calling MPI_Init_thread...
MPI_Init_thread returned, provided = 3
MPI_Init_thread returned, provided = 3
MPI_Init_thread returned, provided = 3
MPI_Init_thread returned, provided = 3
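For context, provided = 3 corresponds to MPI_THREAD_MULTIPLE in Open MPI's mpi.h (SINGLE=0, FUNNELED=1, SERIALIZED=2, MULTIPLE=3). A rough sketch of how such a test is built and run (the source file name is an assumption; the test presumably requests MPI_THREAD_MULTIPLE):
$ mpicc -o thread_init thread_init.c   # calls MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided)
$ mpirun -n 4 ./thread_init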
Here is the module environment; I allocate an interactive node with "qsub -I
-lmppwidth=32 -lmppnppn=16".
module list
Currently Loaded Modulefiles:
1) modules/3.2.6.7
2) craype-network-gemini
3) cray-mpich2/5.6.4
4) atp/1.6.3
5) xe-sysroot/4.1.40
6) switch/1.0-1.0401.36779.2.72.gem
7)
May have to wait for Nathan on Mon - I'm not familiar enough with the XE
environment. One thing I note: in your modules, I see cray-mpich2 but not OMPI.
Are you sure you are using the OMPI you built?
What version of OMPI is this?
You can add --display-allocation to your cmd line to see what OMPI thinks the allocation is.
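For example (sketch; hostname stands in for the real application):
$ mpirun --display-allocation -n 32 hostname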
Dominique,
It looks like you are compiling Open MPI with Homebrew. The flags they use in
the formula when --enable-mpi-thread-multiple is requested are wrong.
Cf. a similar problem with MacPorts:
https://lists.macosforge.org/pipermail/macports-tickets/2013-June/138145.html.
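As a possible workaround, a minimal sketch of building Open MPI from source with thread-multiple support enabled (the prefix is only a placeholder):
$ ./configure --prefix=$HOME/ompi-1.6.5-tm --enable-mpi-thread-multiple
$ make -j4 && make install
$ export PATH=$HOME/ompi-1.6.5-tm/bin:$PATH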
Pierre
On Nov 23, 2013, at 4:56 PM