Jack,

“The way to do this is to clone the MPI communicator you use for your overall problem once for each linear system.”

That means for my problem I have to copy the Vector and Matrix of one linear system (either the thermal diffusion or the Stokes flow) to another Vector and Matrix that are built with a clone of the present communicator, before starting a thread to solve this linear system.

And then I have to copy the solution of this linear system back to the target Vector, which is built using the present MPI communicator used for assembling the systems.

That seems too complicated. Just set up the linear system (=matrix, vectors) with a different communicator from the beginning.
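
As a rough sketch of what I mean (the exact reinit() signatures may differ a bit between deal.II versions, and 'locally_owned_dofs' and 'dsp' stand for whatever IndexSet and sparsity pattern you already build for that system -- treat this as an illustration, not verbatim code):

  #include <deal.II/base/index_set.h>
  #include <deal.II/lac/dynamic_sparsity_pattern.h>
  #include <deal.II/lac/trilinos_sparse_matrix.h>
  #include <deal.II/lac/trilinos_vector.h>
  #include <mpi.h>

  using namespace dealii;

  // Set up one of the two systems (say, Stokes) on a duplicate of the
  // world communicator. The thermal system keeps using MPI_COMM_WORLD
  // (or its own duplicate).
  void setup_stokes_system(const IndexSet                 &locally_owned_dofs,
                           const DynamicSparsityPattern   &dsp,
                           TrilinosWrappers::SparseMatrix &stokes_matrix,
                           TrilinosWrappers::MPI::Vector  &stokes_rhs,
                           TrilinosWrappers::MPI::Vector  &stokes_solution,
                           MPI_Comm                       &stokes_comm)
  {
    // Duplicate the communicator once, at setup time. Remember to call
    // MPI_Comm_free(&stokes_comm) when the program is done with it.
    MPI_Comm_dup(MPI_COMM_WORLD, &stokes_comm);

    // Build matrix and vectors directly on the duplicated communicator,
    // so no copying between communicators is ever necessary.
    stokes_matrix.reinit(locally_owned_dofs, dsp, stokes_comm);
    stokes_rhs.reinit(locally_owned_dofs, stokes_comm);
    stokes_solution.reinit(locally_owned_dofs, stokes_comm);
  }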


By the way, is it possible to copy a Vector and Matrix of TrilinosWrappers to other ones that are initialized with another MPI communicator?

I don't remember. It is certainly not a usual operation. You'll have to investigate whether copying also copies the communicator or not by looking at the actual source.

It is not common to solve linear systems on different threads in parallel when using MPI. I don't want to say that it is a bad idea -- it does seem attractive because it can hide some of the latency associated with MPI communication -- but it is not usually done. I suspect that you will not save much time compared to just solving the two linear systems one after the other.
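
If you do want to try it, a minimal sketch could look like the following. Here 'solve_thermal()' and 'solve_stokes()' are placeholders for your two existing solver routines, and each must only touch the matrix, vectors, and communicator of its own system:

  #include <deal.II/base/thread_management.h>

  using namespace dealii;

  // Placeholders for your two existing solver routines.
  void solve_thermal();
  void solve_stokes();

  void solve_both()
  {
    // Launch the two solves as separate tasks and wait for both. Note
    // that making MPI calls from two threads concurrently requires that
    // MPI was initialized with MPI_THREAD_MULTIPLE support.
    Threads::Task<> thermal_task = Threads::new_task(&solve_thermal);
    Threads::Task<> stokes_task  = Threads::new_task(&solve_stokes);

    thermal_task.join();
    stokes_task.join();
  }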

Best
 W.

--
------------------------------------------------------------------------
Wolfgang Bangerth          email:                 bange...@colostate.edu
                           www: http://www.math.colostate.edu/~bangerth/
