Do you have a static version of libnuma (a libnuma.a) available?  If not,
the static link will likely fail.  The opal/mca/maffinity/libnuma directory
you see is just Open MPI's component that uses the system libnuma; Open MPI
does not ship libnuma itself, so the static copy has to come from your
system.
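
A quick way to check (the paths below are just the usual Linux locations,
so adjust for your system; the static library often comes from the numactl
development package, though not every distro ships the .a):

   # look for a static libnuma in the usual library directories
   ls /usr/lib/libnuma.a /usr/lib64/libnuma.a 2>/dev/null

   # or search more broadly
   find /usr -name 'libnuma.a' 2>/dev/null

If a libnuma.a turns up somewhere the linker doesn't search by default,
adding -L/path/to/that/directory to your link line (or LDFLAGS) should let
the -lnuma that is currently failing resolve.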

> -----Original Message-----
> From: users-boun...@open-mpi.org 
> [mailto:users-boun...@open-mpi.org] On Behalf Of Jeffrey B. Layton
> Sent: Wednesday, April 12, 2006 11:31 AM
> To: layto...@charter.net; Open MPI Users
> Subject: Re: [OMPI users] Problem running code with OpenMPI-1.0.1
> 
> OK, this is weird. I built 1.0.2 with the following options:
> 
> ./configure --prefix=/home/jlayton/bin/OPENMPI-1.0.2-PGI6.0-OPTERON \
>        --disable-io-romio \
>        --enable-static --enable-shared
> 
> and installed it. When I try to build a code with static linking
> (linking the libs in), it yells about not being able to find
> libnuma. I see a directory opal/mca/maffinity/libnuma, but I can't
> find libnuma itself. I can build the code fine using shared libs,
> but not static ones. Any ideas on how to fix the static lib problem?
> 
> Thanks!
> 
> Jeff
> 
> 
> > Well, yes, these nodes do have multiple TCP interfaces.
> > I'll give 1.0.2 a whirl :)
> >
> > Thanks!
> >
> > Jeff
> >
> >> Do you, perchance, have multiple TCP interfaces on at 
> least one of the
> >> nodes you're running on?
> >>
> >> We had a mistake in the TCP network matching code during 
> startup -- this
> >> is fixed in v1.0.2.  Can you give that a whirl?
> >>
> >>
> >>> -----Original Message-----
> >>> From: users-boun...@open-mpi.org 
> >>> [mailto:users-boun...@open-mpi.org] On Behalf Of Jeffrey B. Layton
> >>> Sent: Tuesday, April 11, 2006 11:25 AM
> >>> To: Open MPI Users
> >>> Subject: [OMPI users] Problem running code with OpenMPI-1.0.1
> >>>
> >>> Good morning,
> >>>
> >>>    I'm trying to run one of the NAS Parallel Benchmarks (bt) with
> >>> OpenMPI-1.0.1 that was built with PGI 6.0. The code never seems to
> >>> start (at least I don't see any output), so eventually I kill it.
> >>> Then I get the following message:
> >>>
> >>> [0,1,2][btl_tcp_endpoint.c:559:mca_btl_tcp_endpoint_complete_connect] connect() failed with errno=113
> >>> [0,1,4][btl_tcp_endpoint.c:559:mca_btl_tcp_endpoint_complete_connect] connect() failed with errno=113
> >>> [0,1,8][btl_tcp_endpoint.c:559:mca_btl_tcp_endpoint_complete_connect] connect() failed with errno=113
> >>> mpirun: killing job...
> >>>
> >>> Any ideas on this one?
> >>>
> >>> Thanks!
> >>>
> >>> Jeff
> 
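
P.S. On the original errno=113 failures (113 is EHOSTUNREACH, "no route to
host"): upgrading to 1.0.2 is the real fix, but on multi-homed nodes it can
also help to restrict the TCP BTL to a single interface with the
btl_tcp_if_include MCA parameter, assuming your build has it.  For example
(eth0 is just a placeholder; use whichever interface all the nodes share,
and substitute your normal launch command for bt.A.9):

   mpirun --mca btl_tcp_if_include eth0 -np 9 ./bt.A.9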
