Thanks Ralph,
I was afraid there wasn't a simple "oh, we started..." answer. For what
it is worth, everything up to and including 1.7.x shows the same sort of
failure. The trouble with Jaguar is that its developers embed their own
version of everything inside the code. I once took on the task of
swapping out the embedded version for a newer one; that turned out to be
much more difficult than one would think. So now I maintain one build
for the general public (currently 1.8.4) and one for them.
Ray
On 10/17/2015 12:20 PM, Ralph Castain wrote:
I’m not sure there is a way to do it - that’s a pretty old version, and the RTE
(run-time environment) in it is completely different. So it’s entirely possible
that the update exposed something in that old code that simply no longer works.
Out of curiosity: I’m unaware of any changes in the MPI definitions (there were
extensions, but no breakage). So why can’t you just build the old packages
against 1.8.4?
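Roughly something like this (just a sketch - I’m guessing the 1.8.4 install
path from the 1.4.5 prefix in your configure line below):

  export PATH=/N/soft/cle5/openmpi/gnu/1.8.4/bin:$PATH
  export LD_LIBRARY_PATH=/N/soft/cle5/openmpi/gnu/1.8.4/lib:$LD_LIBRARY_PATH
  mpicc -O2 -o hellompi hellompi.c     # rebuild the app against 1.8.4
  mpirun -np 8 -machinefile test_machine hellompi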
On Oct 17, 2015, at 7:29 AM, Ray Sheppard <rshep...@iu.edu> wrote:
Hi All,
We run a Cray XE/XT-7. For normal (ESM) use, Cray supplies integrated MPI
libraries; however, for cluster compatibility mode, we build Open MPI ourselves.
Generally we use 1.8.4, but some old packages, like Jaguar, are tied to an old
version (1.4.5). At the last maintenance, they all started breaking, so I
rebuilt them. Version 1.8.4 rebuilt fine and runs fine. However, even a simple
application, recompiled with the new 1.4.5 build, fails with the error below.
I have tried a number of different configure options; the current one follows
this note. I am hoping someone can tell me what needs to be done to 1.4.5 so
that it builds the way 1.8.4 did (i.e., without the pipe error). Thanks in
advance for any insights.
Ray
./configure CXX=g++ CC=gcc FC=gfortran CFLAGS="-O2" F77=gfortran FCFLAGS="-O2" \
  --enable-shared --enable-static --with-tm=no --with-threads=posix \
  --without-openib --enable-mca-no-build=btl-openib --with-gnu-ld \
  --prefix=/N/soft/cle5/openmpi/gnu/1.4.5
:~/testdir> !mpirun
mpirun -np 8 -machinefile test_machine hellompi
--------------------------------------------------------------------------
mpirun was unable to launch the specified application as it encountered an
error:
Error: pipe function call failed when setting up I/O forwarding subsystem
Node: nid00819
while attempting to start process rank 0.
--------------------------------------------------------------------------