This is not fixed in the trunk. At this time MPI_THREAD_MULTIPLE will always
hang (though there may be some configurations that don't). The problem is that
when multiple threads are active, opal_condition_wait ALWAYS blocks on a
condition variable instead of calling opal_progress(). Thus we will not
progress…
I built the 1.7.x nightly tarball on 10.8 (Mountain Lion) and 10.9 (Mavericks)
and it still hangs. I tried compiling with --enable-mpi-thread-multiple alone
and with the other options Pierre mentioned. The PETSc tests hang in both cases.
I'm curious to know if the nightly tarball fixes the issue…
No surprise there - that's known behavior. As has been said, we hope to
extend the thread-multiple support in the 1.9 series.
On Mon, Dec 2, 2013 at 6:33 PM, Eric Chamberland <
eric.chamberl...@giref.ulaval.ca> wrote:
> Hi,
>
> I just opened a new "chapter" with the same subject. ;-)
>
> We are using…
Hi,
I just opened a new "chapter" with the same subject. ;-)
We are using OpenMPI 1.6.5 (compiled with --enable-thread-multiple) with
Petsc 3.4.3 (on colosse supercomputer:
http://www.calculquebec.ca/en/resources/compute-servers/colosse). We
observed a deadlock with threads within the openib btl…
I'm joining this thread late, but I think I know what is going on:
- I am able to replicate the hang with 1.7.3 on Mavericks (with threading
enabled, etc.)
- I notice that the hang has disappeared at the 1.7.x branch head (also on
Mavericks)
Meaning: can you try with the latest 1.7.x nightly tarball?
On 2013-11-25, at 9:02 PM, Ralph Castain wrote:
> On Nov 25, 2013, at 5:04 PM, Pierre Jolivet wrote:
>
>> On Nov 24, 2013, at 3:03 PM, Jed Brown wrote:
>>
>>> Ralph Castain writes:
>>>
>>>> Given that we have no idea what Homebrew uses, I don't know how we
>>>> could clarify/respond.
>
Sent from my iPhone
> On Nov 25, 2013, at 5:04 PM, Pierre Jolivet wrote:
>
>> On Nov 24, 2013, at 3:03 PM, Jed Brown wrote:
>>
>>> Ralph Castain writes:
>>>
>>>> Given that we have no idea what Homebrew uses, I don't know how we
>>>> could clarify/respond.
>
> Ralph, it is pretty easy to know…
On Nov 24, 2013, at 3:03 PM, Jed Brown wrote:
> Ralph Castain writes:
>
>> Given that we have no idea what Homebrew uses, I don't know how we
>> could clarify/respond.
>
Ralph, it is pretty easy to know what Homebrew uses, c.f.
https://github.com/mxcl/homebrew/blob/master/Library/Formula/op…
Ralph Castain writes:
> Given that we have no idea what Homebrew uses, I don't know how we
> could clarify/respond.
Pierre provided a link to MacPorts saying that all of the following
options were needed to properly enable threads:
--enable-event-thread-support --enable-opal-multi-threads
--e…
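Pieced together, a configure invocation along those lines might look like the sketch below. Only the flags actually named in this thread are shown; the quoted list is truncated, so there may be additional flags, and the install prefix here is just a placeholder.

```shell
# Hypothetical sketch of a thread-enabled Open MPI 1.7-era build.
# The flag list is reconstructed from this thread and may be incomplete.
./configure --prefix=$HOME/openmpi-mt \
    --enable-event-thread-support \
    --enable-opal-multi-threads \
    --enable-mpi-thread-multiple
make -j4 && make install
```

Note that, per the rest of the thread, even a correctly configured build of this era could still hang under MPI_THREAD_MULTIPLE.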
Given that we have no idea what Homebrew uses, I don't know how we could
clarify/respond.
On Nov 24, 2013, at 12:43 PM, Jed Brown wrote:
> Pierre Jolivet writes:
>> It looks like you are compiling Open MPI with Homebrew. The flags they use
>> in the formula when --enable-mpi-thread-multiple
Dominique Orban writes:
> My question originates from a hang similar to the one I described in
> my first message in the PETSc tests. They still hang after I corrected
> the OpenMPI compile flags. I'm in touch with the PETSc folks as well
> about this.
Do you have an updated stack trace?
Pierre Jolivet writes:
> It looks like you are compiling Open MPI with Homebrew. The flags they use in
> the formula when --enable-mpi-thread-multiple is requested are wrong.
> cf. a similar problem with MacPorts
> https://lists.macosforge.org/pipermail/macports-tickets/2013-June/138145.html.
If these "wrong…
Pierre,
Thank you for pointing out the erroneous flags. I am indeed compiling from
Homebrew. After using the flags mentioned in the link you give, this is the
output of Ralph's test program:
$ mpirun -n 2 ./testmpi2
Calling MPI_Init_thread...
Calling MPI_Init_thread...
MPI_Init_thread returned, …
Dominique,
It looks like you are compiling Open MPI with Homebrew. The flags they use in
the formula when --enable-mpi-thread-multiple is requested are wrong.
c.f. a similar problem with MacPorts
https://lists.macosforge.org/pipermail/macports-tickets/2013-June/138145.html.
Pierre
On Nov 23, 2013, at 4:56 PM…
Hmmm...well, it seems to work for me:
$ mpirun -n 4 ./thread_init
Calling MPI_Init_thread...
Calling MPI_Init_thread...
Calling MPI_Init_thread...
Calling MPI_Init_thread...
MPI_Init_thread returned, provided = 3
MPI_Init_thread returned, provided = 3
MPI_Init_thread returned, provided = 3
MPI_Init_thread returned, provided = 3
Hi,
I'm compiling the example code at the bottom of the following page, which
illustrates MPI_Init_thread():
http://mpi.deino.net/mpi_functions/mpi_init_thread.html
I have OpenMPI 1.7.3 installed on OSX 10.8.5 with --enable-thread-multiple,
compiled with clang-425.0.28. I can reproduce t…
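For reference, a minimal reproducer in the spirit of the linked page (this is my own reconstruction using only the standard MPI API, not the deino.net listing verbatim):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int provided = -1;
    /* Request full thread support; 'provided' reports what the library
       actually grants. In Open MPI, MPI_THREAD_MULTIPLE prints as 3,
       matching the "provided = 3" output quoted earlier in the thread. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    printf("MPI_Init_thread returned, provided = %d\n", provided);
    MPI_Finalize();
    return 0;
}
```

Build with `mpicc` and run under `mpirun -n 2` as shown earlier; a build with broken thread flags may report a `provided` level below MPI_THREAD_MULTIPLE, or hang later despite reporting 3.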