You should redo it along the lines of George's suggestion; that way you also
avoid the "manual" alignment of the data. George's method is the best
generic way of doing it.
As for the -r8 thing, just do not use it :)
And check the interfaces of the routines you use to see why MPIstatus is needed.
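For reference, a minimal sketch (not from this thread, using the Fortran mpi module) of the kind of call in question: MPI_Recv takes a status array of size MPI_STATUS_SIZE in addition to ierror, because ierror only reports whether the call itself succeeded, while the status carries the source, the tag, and (via MPI_Get_count) the size of the received message.

    ! Minimal sketch: why the receive needs a status array.
    program recv_status_demo
      use mpi
      implicit none
      integer :: rank, ierror, count
      integer :: status(MPI_STATUS_SIZE)   ! holds source, tag, error code
      real    :: buf(100)

      call MPI_Init(ierror)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierror)

      if (rank == 0) then
        buf = 1.0
        call MPI_Send(buf, 100, MPI_REAL, 1, 0, MPI_COMM_WORLD, ierror)
      else if (rank == 1) then
        ! ierror only says whether the call succeeded; the status is what
        ! tells you who sent the message and how much data arrived.
        call MPI_Recv(buf, 100, MPI_REAL, MPI_ANY_SOURCE, MPI_ANY_TAG, &
                      MPI_COMM_WORLD, status, ierror)
        call MPI_Get_count(status, MPI_REAL, count, ierror)
        print *, 'received', count, 'reals from rank', status(MPI_SOURCE)
        ! If you never look at it, MPI_STATUS_IGNORE can be passed instead.
      end if

      call MPI_Finalize(ierror)
    end program recv_status_demo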
Ralph,
Thanks for the patch. It cleaned up the pmi check nicely.
Applied, configured and compiled without any problems! Great!
The configure gave me:
--- MCA component pubsub:pmi (m4 configuration macro)
checking for MCA component pubsub:pmi compile mode... dso
checking if user requested PMI sup
Dear all,
Thanks a lot. I rewrote the code starting from Nick's one.
It works.
I still have to think about the "-r8" thing; I believe the less we have to
type, the fewer errors we make.
Another question about Nick's code:
Why do I have to use MPIstatus(MPI_STATUS_SIZE) and not a simple MPI ierror?
Thanks
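On the -r8 question, a sketch of the alternative the earlier advice points at (my own illustration, not Nick's code): -r8 silently promotes every default REAL to 8 bytes, which can stop matching the MPI_REAL datatype the MPI library was built for, so declaring the kind explicitly and passing the corresponding MPI datatype is the safer habit.

    ! Minimal sketch (illustration only): declare the precision you want
    ! instead of relying on -r8, and pass the matching MPI datatype.
    program kinds_demo
      use mpi
      implicit none
      integer, parameter :: dp = selected_real_kind(15, 307)  ! 8-byte real
      integer :: ierror, rank
      real(dp) :: x(10)

      call MPI_Init(ierror)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierror)
      x = real(rank, dp)

      ! real(dp) corresponds to DOUBLE PRECISION on typical builds, so
      ! MPI_DOUBLE_PRECISION matches it regardless of flags such as -r8.
      call MPI_Bcast(x, 10, MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierror)

      call MPI_Finalize(ierror)
    end program kinds_demo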
Dear all,
I would like to use the METIS library to partition my domain. My domain is a
structured mesh, but I would like to weight the cells differently, according to
how many particles I have in each cell.
Could someone please tell me where I can find a good example in FORTRAN,
and how to link my program?
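Not an official METIS example, but a minimal sketch of what a weighted k-way partition can look like when METIS 5 is called from Fortran, assuming a METIS build with the default 32-bit idx_t; the 2x2 grid, the particle counts, and the number of parts below are made up, and the program would be linked with something like "gfortran part.f90 -lmetis". The per-cell particle counts go into the vertex-weight array vwgt, which METIS balances across the parts.

    ! Minimal sketch (hypothetical 2x2 mesh): weighted k-way partitioning
    ! with METIS 5 called through ISO_C_BINDING.
    program metis_demo
      use iso_c_binding
      implicit none

      interface
        integer(c_int) function METIS_PartGraphKway(nvtxs, ncon, xadj, adjncy, &
            vwgt, vsize, adjwgt, nparts, tpwgts, ubvec, options, edgecut, part) &
            bind(C, name="METIS_PartGraphKway")
          use iso_c_binding
          integer(c_int) :: nvtxs, ncon, nparts, edgecut
          integer(c_int) :: xadj(*), adjncy(*), vwgt(*), part(*)
          type(c_ptr), value :: vsize, adjwgt, tpwgts, ubvec, options
        end function METIS_PartGraphKway
      end interface

      integer(c_int) :: nvtxs, ncon, nparts, edgecut, ierr
      ! 2x2 grid of cells, CSR adjacency, 0-based numbering (METIS C default)
      integer(c_int) :: xadj(5)   = [0, 2, 4, 6, 8]
      integer(c_int) :: adjncy(8) = [1, 2, 0, 3, 0, 3, 1, 2]
      integer(c_int) :: vwgt(4)   = [10, 3, 7, 25]   ! particles per cell
      integer(c_int) :: part(4)

      nvtxs = 4; ncon = 1; nparts = 2
      ierr = METIS_PartGraphKway(nvtxs, ncon, xadj, adjncy, vwgt, &
                                 c_null_ptr, c_null_ptr, nparts, &
                                 c_null_ptr, c_null_ptr, c_null_ptr, &
                                 edgecut, part)
      print *, 'metis status =', ierr, ' edgecut =', edgecut
      print *, 'cell -> partition:', part
    end program metis_demo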
I've poked at this a bit and think I have all the combinations covered -
can you try the attached patch? I don't have a way to test it right now, so
I don't want to put it in the trunk.
Thanks
Ralph
On Mon, Oct 6, 2014 at 6:02 PM, Ralph Castain wrote:
> I've looked at your patch, and it isn't
Hi Howard,
We have NOT defined IPv6 on the nodes.
Actually, I was looking at the part of the code that complains, and I also
saw references to IPv6 sockets.
Thanks a lot for the suggestion! I'll try this out tomorrow.
Regards
Michael
On Mon, Oct 6, 2014 at 11:07 PM, Howard Pritchard wrote:
Hi Michael,
If you do not include --enable-ipv6 in the config line, do you still
observe the problem?
Is it possible that one or more interfaces on nodes H1 and H2 do not have
ipv6 enabled?
Howard
2014-10-06 16:51 GMT-06:00 Michael Thomadakis:
> Hello,
>
> I've configured OpenMPI1.8.3 with th