Re: [deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-25 Thread Jack
Many thanks for your encouragement. I am an expert neither on MPI nor on threading, and I underestimated the difficulty; this may take me a very long time to figure out. Hopefully some experts on these parallel computing paradigms will address such a curious problem soon. Thanks!

Re: [deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-24 Thread Wolfgang Bangerth
On 05/23/2017 07:24 PM, Jack wrote: Originally, I suspected that solving two linear systems simultaneously on two threads would reduce the run time. But now it seems that this idea increases the complexity of the communication between MPI communicators and of the coding, and is not certain to decrease the compu…

Re: [deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-23 Thread Jack
Hi Prof. Wolfgang, Many thanks! Originally, I suspected that solving two linear systems simultaneously on two threads would reduce the run time. But now it seems that this idea increases the complexity of the communication between MPI communicators and of the coding, and is not certain to decrease…
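
A minimal sketch of what "solving the two systems on two threads" could look like in deal.II, assuming each system already owns its matrix, vectors, and (as discussed further down the thread) its own MPI communicator; solve_temperature() and solve_stokes() are placeholder names for the two existing solve routines:

    #include <deal.II/base/thread_management.h>

    // Placeholders for the two existing solve routines; each one must use
    // only objects (matrix, vectors, solver control, preconditioner) built
    // on its own duplicated MPI communicator.
    void solve_temperature();
    void solve_stokes();

    void solve_both_concurrently()
    {
      // Launch the Stokes solve as a separate task (backed by TBB) ...
      dealii::Threads::Task<> stokes_task =
        dealii::Threads::new_task(&solve_stokes);

      // ... while the current thread solves the temperature system.
      solve_temperature();

      // Wait until the Stokes solve has finished as well.
      stokes_task.join();
    }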

Re: [deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-23 Thread Wolfgang Bangerth
Jack, “The way to do this is to clone the MPI communicator you use for your overall problem once for each linear system.” That means for my problem I have to copy the Vector and Matrix of one linear system (either the thermal diffusion or the Stokes flow) to another Vector and Matrix which are…

Re: [deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-23 Thread Jack
Hi Prof. Wolfgang, Thanks so much! “The way to do this is to clone the MPI communicator you use for your overall problem once for each linear system.” That means for my problem I have to copy the Vector and Matrix of one linear system (either the thermal diffusion or the Stokes flow) to anoth…

Re: [deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-22 Thread Wolfgang Bangerth
Jack, unrelated to the question about versions of MPI libraries: when you solve two linear systems on separate threads, you need to make sure that the two solvers (and all associated objects such as matrices, vectors, etc.) use separate MPI communicators. Otherwise you will have the two sol…
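
An illustration of the "separate communicators" point, as a minimal sketch using plain MPI calls (deal.II's Utilities::MPI::duplicate_communicator serves the same purpose); the names comm_temperature and comm_stokes are introduced here only for illustration:

    #include <mpi.h>

    // Duplicate the global communicator once per linear system so that
    // messages from the two solvers, running on different threads, can
    // never be confused with one another.
    void make_solver_communicators(MPI_Comm &comm_temperature,
                                   MPI_Comm &comm_stokes)
    {
      MPI_Comm_dup(MPI_COMM_WORLD, &comm_temperature);
      MPI_Comm_dup(MPI_COMM_WORLD, &comm_stokes);

      // Build all Trilinos matrices, vectors, and solvers of one system on
      // comm_temperature and those of the other on comm_stokes; release
      // both with MPI_Comm_free() before MPI_Finalize().
    }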

Re: [deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-20 Thread Jack
Hi Bruno, I changed the corresponding code and rebuilt the library, but unfortunately the same warning came out as before. I also tried this on another workstation that uses MPICH2, and it likewise failed while solving the two systems. I'm trying to update my OpenMPI to v2.1, but in the latest version…

Re: [deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-20 Thread Bruno Turcksin
Jack, 2017-05-19 22:06 GMT-04:00 Jack : > Should I update my OpenMPI to the latest version so that such a problem will be > avoided and I do not need to change my code for the initialization of MPI and > TBB myself? Unfortunately, you will need to initialize MPI yourself, or you can change this line https://…
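
If one takes the "initialize MPI yourself" route instead of letting Utilities::MPI::MPI_InitFinalize do it, a minimal sketch could look like the following; whether MPI_THREAD_MULTIPLE is actually granted depends on how the MPI library was built:

    #include <mpi.h>
    #include <cstdio>

    int main(int argc, char *argv[])
    {
      // Ask for full thread support so that two threads may call into MPI
      // concurrently; the library reports what it can actually provide.
      int provided = 0;
      MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

      if (provided < MPI_THREAD_MULTIPLE)
        std::fprintf(stderr,
                     "Warning: MPI only provides thread level %d\n",
                     provided);

      // ... set up and solve the two PDE systems here ...

      MPI_Finalize();
      return 0;
    }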

[deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-19 Thread Jack
Hi Bruno, I appreciate your responses very much. The matrices and vectors are of the TrilinosWrappers types, so the solvers should use MPI, too. I use OpenMPI 1.8.1; its release date is Apr 22, 2014, later than the post on GitHub. I initialized MPI as follows: try { us…
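
The code in this message is cut off by the archive; the usual deal.II pattern it presumably follows is sketched below (MyProblem is a placeholder class name):

    #include <deal.II/base/mpi.h>
    #include <iostream>

    int main(int argc, char *argv[])
    {
      try
        {
          using namespace dealii;

          // This is the call that, in the installed deal.II version,
          // requests MPI_THREAD_SERIALIZED from the MPI library, as
          // discussed in this thread.
          Utilities::MPI::MPI_InitFinalize mpi_initialization(argc, argv);

          // MyProblem problem;   // placeholder for the actual solver class
          // problem.run();
        }
      catch (std::exception &exc)
        {
          std::cerr << "Exception: " << exc.what() << std::endl;
          return 1;
        }
      return 0;
    }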

[deal.II] Re: Problems in solving two PDE systems using MPI+Thread parallel computing

2017-05-19 Thread Bruno Turcksin
Jack, are your solvers using MPI? This looks similar to this problem: https://github.com/open-mpi/ompi/issues/1081 Which version of MPI are you using? How do you initialize MPI? MPI_InitFinalize sets MPI_THREAD_SERIALIZED, which "tells MPI that we might use several threads but never call two MPI…
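
To check which thread support level the MPI library actually provides at run time (MPI_THREAD_SERIALIZED allows several threads to use MPI only one at a time, while MPI_THREAD_MULTIPLE allows truly concurrent calls), a small sketch:

    #include <mpi.h>
    #include <cstdio>

    // Print the thread support level that MPI was initialized with.
    void report_mpi_thread_level()
    {
      int provided = 0;
      MPI_Query_thread(&provided);
      std::printf("MPI thread support level: %d (MPI_THREAD_MULTIPLE = %d)\n",
                  provided, MPI_THREAD_MULTIPLE);
    }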