I've resolved the problem satisfactorily by circumventing one-sided
communication entirely. That is, this issue is finally closed:
https://bitbucket.org/petsc/petsc-dev/issue/9/implement-petscsf-without-one-sided
Users can proceed anyway using the run-time option
-acknowledge_ompi_onesided_bug, which will
*Bump*
There doesn't seem to have been any progress on this. Can you at least emit
an error message saying that Open MPI one-sided does not work with
datatypes, instead of silently causing wanton corruption and deadlock?
On Thu, Dec 22, 2011 at 4:17 PM, Jed Brown wrote:
[Forgot the attachment.]
On Thu, Dec 22, 2011 at 15:16, Jed Brown wrote:
I wrote a new communication layer that we are evaluating for use in mesh
management and PDE solvers, but it is based on MPI-2 one-sided operations
(and will eventually benefit from some of the MPI-3 one-sided proposals,
especially MPI_Fetch_and_op() and dynamic windows). All the basic
functionality