> Date: Thu, 17 Nov 2005 09:20:10 -0800
> From: Brian Barrett <brbar...@open-mpi.org>
> Subject: Re: [O-MPI users] OMPI 1.0 rc6 --with-bproc errors
> To: Open MPI Users <us...@open-mpi.org>
> Message-ID: <6e3f2f6a-fb69-4879-a2b1-e286b5db7...@open-mpi.org>
> Content-Type: text/plain; charset=US-ASCII; delsp=yes; format=flowed
> 
> Daryl -
> 
> I'm unable to replicate your problem.  I was testing on a Fedora Core  
> 3 system with Clustermatic 5.  Is it possible that you have a random
> dso from a previous build in your installation path?  How are you  
> running mpirun -- maybe I'm just not hitting the same code path you   
> are...

Brian, thanks for trying to replicate.  I'm not building any DSOs for
OMPI at all, just static libs, and then recompiling my app.  I'm running
as follows:

   mpirun -H 200,201,202,203 -np 4 ./a.out
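
For reference, ./a.out above is just a placeholder for my app; a minimal
MPI program along these lines (hello.c here is a hypothetical stand-in,
not my actual code) would exercise the same launch path:

   /* hello.c -- minimal MPI test program, a stand-in for ./a.out */
   #include <stdio.h>
   #include <mpi.h>

   int main(int argc, char **argv)
   {
       int rank, size;

       /* Initialize MPI, then report this process's rank and the
          total number of processes launched by mpirun */
       MPI_Init(&argc, &argv);
       MPI_Comm_rank(MPI_COMM_WORLD, &rank);
       MPI_Comm_size(MPI_COMM_WORLD, &size);
       printf("Hello from rank %d of %d\n", rank, size);
       MPI_Finalize();
       return 0;
   }

(compiled with the mpicc from the same static install, e.g.
"mpicc hello.c -o a.out")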

The last successful build I've had was rc4, which runs the above test
fine.  I'll try building and installing the just-announced 1.0 and let
you know.

Daryl

P.S.  I'm building OMPI with the following configure flags:

   --prefix=/opt/OpenMPI/openmpi-1.0rc8/ib
   --disable-shared --enable-static --with-bproc
   --with-mvapi=/opt/IB/ibgd-1.8.0/driver/infinihost

> Thanks,
> 
> Brian
