1 - Do you have problems with Open MPI 1.4 too? (I don't, but I haven't
built 1.4.1 yet.)
2 - There is a bug in the PathScale compiler with -fPIC and -g that
generates incorrect DWARF2 debug data, so debuggers get confused and
have serious problems debugging the code. I'm chasing them to get a
fix...
3 - Do you have example code that has the problem?
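
For reference, a minimal reproducer along the lines of the hello-world
Rafael describes might look like this (a sketch on my part; the exact
test code is an assumption, but anything this small that hangs in
MPI_Init would do):

  /* hello.c - minimal MPI hello-world; reportedly hangs in MPI_Init
   * when Open MPI 1.4.1 is built with PathScale. */
  #include <mpi.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
      int rank, size;

      MPI_Init(&argc, &argv);               /* reported hang point */
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);
      printf("Hello World from node %d of %d\n", rank, size);
      MPI_Finalize();
      return 0;
  }

Compile and run with, e.g., "mpicc -g hello.c -o hello" and then
"mpirun -np 2 ./hello".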

On Mon, 2010-01-25 at 15:01 -0500, Jeff Squyres wrote:
> I'm afraid I don't have any clues offhand.  We *have* had problems with the 
> PathScale compiler in the past that were never resolved by their support 
> crew.  However, they were of the "variables weren't initialized and the 
> process generally aborts" kind of failure, not a "persistent hang" kind of 
> failure.
> 
> Can you tell where in MPI_Init the process is hanging?  E.g., can you build 
> Open MPI with debugging enabled (such as by passing CFLAGS=-g to OMPI's 
> configure line) and then attach a debugger to a hung process and see what 
> it's stuck on?
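> 
> If attaching to the already-hung process is awkward, one common trick
> (a sketch on my part, not something specific to Open MPI) is to park
> each process in a spin loop before the suspect call, attach gdb to one
> of the printed PIDs, and flip the flag to let it continue:
> 
>   /* debug_attach.c - hypothetical helper for attaching a debugger.
>    * Attach with "gdb -p <pid>", then "set var wait_for_debugger = 0"
>    * and step into MPI_Init to see where it blocks. */
>   #include <mpi.h>
>   #include <stdio.h>
>   #include <unistd.h>
> 
>   static volatile int wait_for_debugger = 1;
> 
>   int main(int argc, char **argv)
>   {
>       printf("pid %d waiting for debugger\n", (int)getpid());
>       fflush(stdout);
>       while (wait_for_debugger)   /* spin until gdb clears the flag */
>           sleep(1);
> 
>       MPI_Init(&argc, &argv);     /* step into here to find the hang */
>       MPI_Finalize();
>       return 0;
>   }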
> 
> 
> On Jan 25, 2010, at 7:52 AM, Rafael Arco Arredondo wrote:
> 
> > Hello:
> > 
> > I'm having some issues with Open MPI 1.4.1 and the PathScale
> > compiler (version 3.2). Open MPI builds successfully with the
> > following configure arguments:
> > 
> > ./configure --with-openib=/usr --with-openib-libdir=/usr/lib64
> > --with-sge --enable-static CC=pathcc CXX=pathCC F77=pathf90 F90=pathf90
> > FC=pathf90
> > 
> > (we have OpenFabrics 1.2 InfiniBand drivers, by the way)
> > 
> > However, applications hang in MPI_Init (or maybe in MPI_Comm_rank or
> > MPI_Comm_size; in any case, a basic hello-world never prints 'Hello
> > World from node...'). I tried running them with and without SGE;
> > same result.
> > 
> > This hello-world works flawlessly when I build Open MPI with gcc:
> > 
> > ./configure --with-openib=/usr --with-openib-libdir=/usr/lib64
> > --with-sge --enable-static
> > 
> > That successful run was on a single machine, so it should not have
> > been using InfiniBand; it also works when several nodes are used.
> > 
> > I was able to build previous versions of Open MPI (1.2.6 and 1.3.2,
> > in particular) with PathScale. I tried building version 1.4.1 with
> > both PathScale 3.2 and PathScale 3.1; no difference.
> > 
> > Any ideas?
> > 
> > Thank you in advance,
> > 
> > Rafa
> > 
> > --
> > Rafael Arco Arredondo
> > Centro de Servicios de Informática y Redes de Comunicaciones
> > Universidad de Granada
> > 
