Does OpenMPI support Quadrics elan3/4 interconnects?
I saw a few hits on Google suggesting that support was partial or perhaps
planned, but I couldn't find much in the OpenMPI sources to suggest any
support at all.
cheers,
robin
Thanks Jeff.
The kind of faults I am trying to trap are application/node failures. I
literally kill the application on another node in the hope of trapping the
failure and reacting accordingly. This is similar to FT-MPI shrinking the
communicator size, etc.
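For reference, a minimal sketch of this kind of trap using only the standard
error-handler mechanism (switching from the default MPI_ERRORS_ARE_FATAL to
MPI_ERRORS_RETURN). This is only an illustration, not FT-MPI-style recovery,
and whether a surviving rank actually gets an error code back instead of
hanging depends on the MPI implementation:

/* Sketch: ask MPI to return error codes instead of aborting, so a rank
 * can notice that communication with a killed peer failed and react.
 * This does not shrink the communicator the way FT-MPI does. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, rc, buf = 0;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Default is MPI_ERRORS_ARE_FATAL; switch to MPI_ERRORS_RETURN. */
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

    if (rank == 0) {
        /* If rank 1 has been killed, this receive should fail (or hang,
         * depending on the implementation) rather than abort rank 0. */
        rc = MPI_Recv(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &status);
        if (rc != MPI_SUCCESS) {
            fprintf(stderr, "rank 0: communication with peer failed (rc=%d)\n", rc);
            /* react here: log, respawn, fall back, etc. */
        }
    } else if (rank == 1) {
        MPI_Send(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}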
If you suggest a different implementation
That's great to hear! For now we'll just create local users for those
who need access to MPI on this system, but I'll keep an eye on the
list for when you do get a chance to finish that fix. Thanks again!
On 3/18/07, Ralph Castain wrote:
Excellent! Yes, we use pipe in several places, including in the run-time
during various stages of launch, so that could be a problem.
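For illustration only -- this is not OpenMPI's actual launch code, just the
usual pattern being described: a launcher opens a pipe before fork()/exec() so
the parent can read the child's output, which means a failing pipe() aborts the
launch before the application ever starts.

/* Generic launcher sketch (hypothetical, not OpenMPI source): pipe(),
 * then fork()/exec(), with the parent reading the child's stdout. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <errno.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>

static int launch(const char *cmd)
{
    int fds[2];
    pid_t pid;
    char buf[256];
    ssize_t n;

    if (pipe(fds) < 0) {                  /* the call that was failing */
        fprintf(stderr, "pipe() failed: %s\n", strerror(errno));
        return -1;
    }

    pid = fork();
    if (pid == 0) {                       /* child: stdout -> pipe, then exec */
        close(fds[0]);
        dup2(fds[1], STDOUT_FILENO);
        execlp(cmd, cmd, (char *)NULL);
        _exit(127);
    }

    close(fds[1]);                        /* parent: read what the child prints */
    while ((n = read(fds[0], buf, sizeof(buf))) > 0)
        fwrite(buf, 1, (size_t)n, stdout);
    close(fds[0]);
    waitpid(pid, NULL, 0);
    return 0;
}

int main(void)
{
    return launch("hostname") < 0 ? EXIT_FAILURE : EXIT_SUCCESS;
}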
Also, be aware that other users have reported problems on LDAP-based systems
when attempting to launch large jobs. The problem is that the OpenMPI launch
system has
I just received an email from a friend who is helping me work on
resolving this; he was able to trace the problem back to a pipe() call
in OpenMPI apparently:
The problem is with the pipe() system call (which, as far as I can tell, is
invoked by MPI_Send()) when run by an LDAP-authenticated user. Still
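One quick way to confirm that, outside of MPI entirely, is to run a trivial
pipe() test as the LDAP-authenticated user. This is just a diagnostic sketch --
if it already fails, the problem is in the account/environment setup rather
than anything OpenMPI does:

/* Minimal pipe() check to run as the affected (LDAP-authenticated) user. */
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <unistd.h>

int main(void)
{
    int fds[2];

    if (pipe(fds) != 0) {
        fprintf(stderr, "pipe() failed: %s (errno=%d)\n", strerror(errno), errno);
        return 1;
    }
    printf("pipe() succeeded (fds %d and %d)\n", fds[0], fds[1]);
    close(fds[0]);
    close(fds[1]);
    return 0;
}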
Whoops -- looks like we're missing an #include statement (we do all
of our testing on OS X 10.4, not 10.3).
I'm not sure offhand which 10.3 #include file provides the definition
for "struct iovec" -- can you try adding a few #includes to the top
of the file and let us know which one works?
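For what it's worth, on BSD-derived systems (which includes OS X) "struct
iovec" is normally declared in <sys/uio.h>, so a reasonable guess for the
extra #includes would be:

/* Likely candidates for the missing struct iovec definition on 10.3;
 * <sys/uio.h> is its usual home, with <sys/types.h> first as a
 * conservative extra. */
#include <sys/types.h>
#include <sys/uio.h>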