On Nov 17, 2005, at 9:20 AM, Brian Barrett wrote:
Daryl -
I'm unable to replicate your problem. I was testing on a Fedora Core
3 system with Clustermatic 5. Is it possible that you have a random
dso from a previous build in your installation path? How are you
running mpirun -- maybe I'm just not hitting the same code path you
are...
On Nov 17, 2005, at 8:21 AM, David Huebner wrote:
I'm managing a small cluster running Clustermatic 5 on top of
Fedora Core 4. OMPI won't build, exiting with the following error.
gcc -O3 -DNDEBUG -fno-strict-aliasing -pthread -o .libs/orted orted.o
-Wl,--export-dynamic ../../../orte/.libs/liborte.so -lbproc /home/
dthuebner/Desktop/openmp
The MX library calls exit if the error handler is not set before
initialization. This error has been fixed; the fix will get into the
tarball shortly.
Meanwhile you can use the btl_base_exclude=mx,gm parameter in order to
force those components to be skipped.
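George's workaround is an MCA parameter, so it can be passed on the mpirun command line or through the environment. A sketch of both forms, assuming the OMPI 1.0-era MCA syntax; the binary name ./my_app is a placeholder:

```shell
# Exclude the mx and gm BTL components on the command line
mpirun --mca btl_base_exclude mx,gm -np 4 ./my_app

# Or via the environment (MCA parameters map to OMPI_MCA_<name>)
export OMPI_MCA_btl_base_exclude=mx,gm
mpirun -np 4 ./my_app
```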
Thanks,
george.
On Thu, 17 Nov 2005, Troy Telford wrote:
I wouldn't be surprised if this is simply an issue of configuration:
In my test cluster, I've got Myrinet, InfiniBand, and Gigabit Ethernet
support.
My understanding is that when you use 'mpirun' without specifying an MCA
parameter (including systemwide and/or user configurations in
~/.openmpi), Open
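The user-level configuration Troy mentions lives in a plain-text file read at startup. A sketch of what ~/.openmpi/mca-params.conf might contain; the parameter value shown is an illustrative assumption tied to the workaround earlier in the thread, not something quoted from these messages:

```
# ~/.openmpi/mca-params.conf -- one "name = value" pair per line
# Skip the Myrinet BTL components, as suggested earlier in the thread
btl_base_exclude = mx,gm
```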
> Date: Thu, 17 Nov 2005 09:17:07 -0700
> From: "Daryl W. Grunau"
> Subject: Re: [O-MPI users] OMPI 1.0 rc6 --with-bproc errors
> To: Jeff Squyres
> Cc: Open MPI Users
> Message-ID: <20051117161707.ga6...@lanl.gov>
> Content-Type: text/plain; charset=us-ascii
>
> > Date: Tue, 15 Nov 2005 08:43:58 -0800
> Date: Tue, 15 Nov 2005 08:43:58 -0800
> From: Jeff Squyres
> Subject: Re: [O-MPI users] OMPI 1.0 rc6 --with-bproc errors
> To: Open MPI Users
> Message-ID:
> Content-Type: text/plain; charset=US-ASCII; format=flowed
>
> Daryl --
>
> I don't think that anyone directly replied to you, but I sa
Thanks Jeff. The problem is solved in the latest version (8172).
Clement
Jeff Squyres wrote:
Clement --
Sorry for the delay in replying. We're running around crazy here at
SC, which pretty much keeps us away from e-mail except early in the
morning and late at night.
We fixed a bunch of