> Date: Fri, 18 Nov 2005 10:34:29 -0700
> From: "Daryl W. Grunau" <d...@lanl.gov>
> To: Brian Barrett <brbar...@open-mpi.org>
> Cc: us...@open-mpi.org
> Subject: Re: [O-MPI users] OMPI 1.0 rc6 --with-bproc errors
> 
> > Date: Thu, 17 Nov 2005 09:20:10 -0800
> > From: Brian Barrett <brbar...@open-mpi.org>
> > Subject: Re: [O-MPI users] OMPI 1.0 rc6 --with-bproc errors
> > To: Open MPI Users <us...@open-mpi.org>
> > Message-ID: <6e3f2f6a-fb69-4879-a2b1-e286b5db7...@open-mpi.org>
> > Content-Type: text/plain; charset=US-ASCII; delsp=yes; format=flowed
> > 
> > Daryl -
> > 
> > I'm unable to replicate your problem.  I was testing on a Fedora
> > Core 3 system with Clustermatic 5.  Is it possible that you have a
> > random dso from a previous build in your installation path?  How are
> > you running mpirun -- maybe I'm just not hitting the same code path
> > you are...
> 
> Brian, thanks for trying to replicate.  I'm actually not building any DSOs
> for OMPI, just static libs, and then recompiling my app.  I'm running as
> follows:
> 
>    mpirun -H 200,201,202,203 -np 4 ./a.out
> 
> The last successful build I've had was rc4, which succeeds in running the
> above test.  I'll try to build/install the just-announced 1.0 and let you
> know.
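
(For reference, ./a.out here is assumed to be along the lines of the
following trivial rank/hostname check -- a representative sketch only,
not the actual test source -- built with mpicc from the static-only
install and launched with the mpirun line quoted above:)

    /* Representative minimal MPI test; not the actual a.out source. */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, len;
        char host[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Get_processor_name(host, &len);
        printf("rank %d of %d running on %s\n", rank, size, host);
        MPI_Finalize();
        return 0;
    }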

Looks like this problem got fixed in 1.0!!!  Thanks,

Daryl
