Perfect, thanks a lot!
> Sent: Friday, June 12, 2015 at 5:13 PM
> From: "Noam Bernstein"
> To: "Open MPI Users"
> Subject: Re: [OMPI users] OpenMPI (1.8.3) and environment variable export
>
> > On Jun 12, 2015, at 11:08 AM, borno_bo...@gmx.de wrote:
> >
> > Hey there,
> >
> > I know that variable export in general can be done with the -x option of
> > mpirun, but I guess that won't help me.
Just a follow-up. RPATH was the trouble. All is well now in the land
of the climatologists again. Thanks again for the help.
Ray
On 6/12/2015 8:00 AM, Ray Sheppard wrote:
Thanks again Gilles,
You might be on to something. Dynamic libraries sound like something
a Python developer might love.
> On Jun 12, 2015, at 11:08 AM, borno_bo...@gmx.de wrote:
>
> Hey there,
>
> I know that variable export in general can be done with the -x option of
> mpirun, but I guess that won't help me.
> More precisely I have a heterogeneous cluster (number of cores per cpu) and
> one process for each node.
Is this a threaded code? If so, you should add --bind-to none to your 1.8 series
command line.
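For example, a minimal sketch of that command line (the executable name ./my_app, the rank count, and the thread count are placeholders, not taken from the thread):

    export OMP_NUM_THREADS=4
    mpirun -np 8 --bind-to none -x OMP_NUM_THREADS ./my_app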
> On Jun 12, 2015, at 7:58 AM, kishor sharma wrote:
>
> Hi There,
>
>
>
> I am facing slowness running numpy code using mpirun with openmpi 1.8.1
> version.
>
>
>
> With Open MPI (1.8.1)
>
> --
Hey there,
I know that variable export in general can be done with the -x option of
mpirun, but I guess that won't help me.
More precisely I have a heterogeneous cluster (number of cores per cpu) and one
process for each node. The application I need to launch uses hybrid MPI+OpenMP
parallelization.
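Since -x exports one value to every rank, one workaround on a heterogeneous cluster is a small wrapper that derives the thread count from the local core count on each node. This is only a sketch; the wrapper name run_hybrid.sh, the application name ./my_app, and the use of nproc are assumptions:

    #!/bin/sh
    # run_hybrid.sh: pick the OpenMP thread count from the cores actually
    # present on this node, then exec the real application.
    export OMP_NUM_THREADS=$(nproc)
    exec ./my_app "$@"

Launched with one rank per node, e.g. mpirun -npernode 1 --bind-to none ./run_hybrid.sh, each node then sets its own OMP_NUM_THREADS instead of inheriting a single exported value.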
Hi There,
I am facing slowness running numpy code using mpirun with openmpi 1.8.1
version.
With Open MPI (1.8.1)
-
> /usr/lib64/openmpi/bin/mpirun -version
mpirun (Open MPI) 1.8.1
Report bugs to http://www.open-mpi.org/community/help/
> time /usr/lib64/openmpi/bin/mpirun ...
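The 1.8 series binds processes by default, which can pin a multithreaded BLAS used by numpy to a single core. A quick way to see whether binding explains the slowdown (numpy_test.py stands in for the actual script):

    time python numpy_test.py
    time /usr/lib64/openmpi/bin/mpirun -np 1 python numpy_test.py
    time /usr/lib64/openmpi/bin/mpirun -np 1 --bind-to none python numpy_test.py

If the last two differ noticeably, the default binding policy rather than Open MPI itself is the likely cause.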
Thanks again Gilles,
You might be on to something. Dynamic libraries sound like something
a Python developer might love (no offense intended to the stereotype).
It would also explain why the build went smoothly but the test run
crashed. I am going to try putting an RPATH variable in the env
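Before rebuilding, it may be worth checking whether an RPATH is already baked into the extension; the path below is only a placeholder for wherever h5py was installed:

    # any RPATH/RUNPATH recorded at link time shows up in the dynamic section
    readelf -d /path/to/h5py/_error.so | grep -iE 'rpath|runpath'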
Hi Dave,
I use Debian/Ubuntu by default and my first approach (a number of years ago
at this stage) was to install from apt. However, if memory serves, I had
difficulty getting the packaged LAM-MPI to work with the FDS5 software at
the time.
Obviously, this is specific to the FDS5 software.
My int
Ray,
one possibility is that one of the loaded libraries was built with -rpath, and this
causes the mess.
Another option is that you have to link _error.so with libmpi.so.
Cheers,
Gilles
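A quick check for that second possibility (the module path is again a placeholder):

    # direct dependencies of the extension; libmpi.so should appear here
    # if the module was linked against it
    ldd /path/to/h5py/_error.so | grep -i libmpi

If libmpi is missing from the output, one option is to relink the extension with -lmpi and the matching -L path to the Open MPI lib directory.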
On Friday, June 12, 2015, Ray Sheppard wrote:
> Hi Gilles,
> Thanks for the reply. I completely forgot that lived in
Hi Gilles,
Thanks for the reply. I completely forgot that lived in the main
library. ldd doesn't show that it read my LD_LIBRARY_PATH (I also push
out an LPATH variable just for fun). I force modules to echo when
users initialize them. You can see OpenMPI was visible to H5py. Now I
won
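One detail that may explain ldd seeming to ignore LD_LIBRARY_PATH: a classic RPATH (DT_RPATH) recorded in the library is searched before LD_LIBRARY_PATH. The dynamic loader can show the search order it actually uses; the h5py import below is just an example:

    # trace the loader's library search while importing the module
    LD_DEBUG=libs python -c "import h5py" 2>&1 | grep -i libmpi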