On Mon, 2014-03-24 at 07:59 -0700, Ralph Castain wrote:
I suspect the root cause of the problem here lies in how MPI messages are
progressed. OMPI doesn't have an async progress method (yet), and so messaging
on both send and recv ends is only progressed when the app calls the MPI
library. It sounds like your app issues an isend or recv, and then spends its
time computing before re-entering the library, so the pending message only
moves forward on the next MPI call.
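For illustration, here is a minimal C sketch of the usual workaround, using
only standard MPI calls (this is the generic pattern, not your app's actual
code; do_some_computation is a made-up stand-in for the real work): post the
receive nonblocking and poke the library with MPI_Test from inside the compute
loop, so OMPI's progress engine gets a chance to run between work units.

    #include <mpi.h>

    static void do_some_computation(void)
    {
        /* stand-in for the app's real work between MPI calls */
    }

    int main(int argc, char **argv)
    {
        int rank, size;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0) {                     /* "assembler" side */
            for (int i = 1; i < size; i++) {
                double      buf[4];
                MPI_Request req;
                int         done = 0;

                MPI_Irecv(buf, 4, MPI_DOUBLE, MPI_ANY_SOURCE, 0,
                          MPI_COMM_WORLD, &req);
                while (!done) {
                    do_some_computation();
                    /* Each MPI_Test enters the library and drives the
                       progress engine, so the message advances even
                       while this rank is "busy" computing. */
                    MPI_Test(&req, &done, MPI_STATUS_IGNORE);
                }
            }
        } else {                             /* "simulator" side */
            double buf[4] = {0};
            MPI_Send(buf, 4, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD);
        }

        MPI_Finalize();
        return 0;
    }

Without the MPI_Test in the loop, the receive would only complete once the
rank made its next blocking MPI call, which matches the growing delays you
are seeing.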
Hi, Ross,
Just out of curiosity, is Rmpi required for some package that you're
using? I only ask because, if you're mostly writing your own MPI
calls, you might want to look at pbdR/pbdMPI, if you haven't already.
They also have a pbdPROF package, which should be able to do some
profiling as well.
I have a bunch of simulators communicating results to a single
assembler. The results seem to take a long time to be received, and the
delay increases as the system runs. Here are some results:
   sent   received     delay
 70.679     94.776    24.097
 94.677    144.906    50.229
122.082    238.713   116.631
144.785
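In case it helps, here is a rough C sketch of the kind of pattern involved
(the Result struct and its fields are invented for illustration; the real
message layout isn't shown above): each simulator stamps the message with
MPI_Wtime() when it sends, and the assembler records MPI_Wtime() on receipt,
with the delay column being the difference. Note this only makes sense if
the clocks of the ranks are comparable (e.g. same node, or an MPI where
MPI_WTIME_IS_GLOBAL is true).

    #include <mpi.h>
    #include <stdio.h>

    /* Hypothetical result record: payload plus the send timestamp. */
    typedef struct {
        double sent;    /* MPI_Wtime() at the simulator when sending */
        double value;   /* stand-in for the actual result payload    */
    } Result;

    int main(int argc, char **argv)
    {
        int rank, size;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0) {                         /* assembler */
            for (int i = 1; i < size; i++) {
                Result r;
                MPI_Recv(&r, (int)sizeof r, MPI_BYTE, MPI_ANY_SOURCE, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
                double received = MPI_Wtime();
                printf("sent %.3f  received %.3f  delay %.3f\n",
                       r.sent, received, received - r.sent);
            }
        } else {                                 /* simulator */
            Result r = { MPI_Wtime(), 42.0 };
            MPI_Send(&r, (int)sizeof r, MPI_BYTE, 0, 0, MPI_COMM_WORLD);
        }

        MPI_Finalize();
        return 0;
    }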