Wow - glad you were able to dig that one out!!
> On Nov 12, 2015, at 9:47 AM, Ray Sheppard <rshep...@iu.edu> wrote:
>
> Hi All,
> I thought I would follow up with the solution. It turns out there is a bug
> in glibc that is now exposed by old versions (pre-1.8) of OpenMPI on Cray XE
> systems. Thanks to Justin Davis of Cray for figuring it out. The simple
> fix is to remount /dev/pts with the gid=5 option enabled; after that,
> everything works again.
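>
> For reference, the remount looks something like this (a minimal sketch,
> assuming the standard Linux devpts mount point; gid 5 is conventionally
> the tty group):
>
>   mount -o remount,gid=5,mode=620 /dev/pts
>
> To make it persistent across reboots, the matching /etc/fstab entry would
> be along the lines of:
>
>   devpts  /dev/pts  devpts  gid=5,mode=620  0  0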
> Ray
>
> On 10/17/2015 12:20 PM, Ralph Castain wrote:
>> I’m not sure there is a way to do it - that’s a pretty old version, and the
>> RTE in it is completely different. So it’s entirely possible that something
>> in the update exposed a latent problem in that old code.
>>
>> Out of curiosity: I’m unaware of any changes in the MPI definitions (there
>> were extensions, but no breakage). So why can’t you just build the old
>> packages against 1.8.4?
>>
>>> On Oct 17, 2015, at 7:29 AM, Ray Sheppard <rshep...@iu.edu> wrote:
>>>
>>> Hi All,
>>> We run a Cray XE/XT-7. For normal (ESM) use, Cray supplies integrated MPI
>>> libraries. However, for cluster compatibility mode, we build OpenMPI
>>> ourselves. Generally we use 1.8.4, but some old packages, like Jaguar, are
>>> tied to an old version (1.4.5). At the last maintenance, they all started
>>> breaking, so I rebuilt them. Version 1.8.4 rebuilt fine and runs fine.
>>> However, even a simple application, recompiled by the new package, fails
>>> in 1.4.5 with the error below. I have tried a number of different
>>> configure options; the current one follows this note. I am hoping someone
>>> could tell me what needs to be done to 1.4.5 so it builds the way 1.8.4
>>> did (i.e. without the pipe error). Thanks in advance for any insights.
>>> Ray
>>>
>>> ./configure CXX=g++ CC=gcc FC=gfortran CFLAGS="-O2" F77=gfortran
>>> FCFLAGS="-O2" --enable-shared --enable-static --with-tm=no
>>> --with-threads=posix --without-openib --enable-mca-no-build=btl-openib
>>> --with-gnu-ld --prefix=/N/soft/cle5/openmpi/gnu/1.4.5
>>>
>>>
>>> :~/testdir> !mpirun
>>> mpirun -np 8 -machinefile test_machine hellompi
>>> --------------------------------------------------------------------------
>>> mpirun was unable to launch the specified application as it encountered an
>>> error:
>>>
>>> Error: pipe function call failed when setting up I/O forwarding subsystem
>>> Node: nid00819
>>>
>>> while attempting to start process rank 0.
>>> --------------------------------------------------------------------------
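>>>
>>> The devpts mount options on the failing node can be inspected with
>>> something like the following (a sketch; gid 5 is conventionally the
>>> tty group on Linux, so with gid=5 set the slave ptys should appear
>>> group-owned by tty):
>>>
>>>   mount | grep devpts
>>>   ls -l /dev/pts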
>>>