Jane,
let me take this to the deal.II mailing list, since that is the place where these kinds of questions should be asked:

I recently finished a SUPG-based incompressible unsteady Navier-Stokes solver.
The test case is the flow over a cylinder. I parallelized the code. Everything works well in the serial code, but when I switch to the parallel version with an iterative linear solver (SolverFGMRES), it is about 10x slower than the direct solver (SparseDirectUMFPACK) I used in the serial code. The iterative solver typically takes 5-6 GMRES iterations to solve the system.

Out of curiosity, what is your preconditioner?
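
For reference, here is a minimal sketch of where the preconditioner enters a deal.II FGMRES solve (it is the last argument of solve()). Names such as system_matrix, solution, and system_rhs are placeholders, and I am assuming a PETSc-backed parallel setup, not your actual code:

  #include <deal.II/lac/petsc_precondition.h>
  #include <deal.II/lac/petsc_sparse_matrix.h>
  #include <deal.II/lac/petsc_vector.h>
  #include <deal.II/lac/solver_control.h>
  #include <deal.II/lac/solver_gmres.h>

  using namespace dealii;

  void solve_sketch(const PETScWrappers::MPI::SparseMatrix &system_matrix,
                    PETScWrappers::MPI::Vector             &solution,
                    const PETScWrappers::MPI::Vector       &system_rhs)
  {
    SolverControl solver_control(1000, 1e-8 * system_rhs.l2_norm());
    SolverFGMRES<PETScWrappers::MPI::Vector> fgmres(solver_control);

    // Example choice only: algebraic multigrid (BoomerAMG) applied to the
    // whole coupled system. For Navier-Stokes one often needs something
    // better, e.g. a block preconditioner; which one you use is exactly
    // what I am asking about.
    PETScWrappers::PreconditionBoomerAMG preconditioner;
    preconditioner.initialize(system_matrix);

    fgmres.solve(system_matrix, solution, system_rhs, preconditioner);
  }

Which object you pass there matters a great deal: 5-6 FGMRES iterations suggests a rather strong preconditioner, and if that preconditioner is expensive to set up or apply, it could easily dominate your run time.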


I tried to use the parallel direct solver SparseDirectMUMPS, but it was not able to solve the system.

What exactly happens?
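
For reference, a call through the PETSc wrappers typically looks roughly like the sketch below (placeholder names, not your code; it also assumes that PETSc was actually configured with MUMPS support):

  #include <deal.II/lac/petsc_solver.h>
  #include <deal.II/lac/petsc_sparse_matrix.h>
  #include <deal.II/lac/petsc_vector.h>
  #include <deal.II/lac/solver_control.h>

  using namespace dealii;

  void mumps_sketch(const PETScWrappers::MPI::SparseMatrix &system_matrix,
                    PETScWrappers::MPI::Vector             &solution,
                    const PETScWrappers::MPI::Vector       &system_rhs,
                    const MPI_Comm                          mpi_communicator)
  {
    // MUMPS is a direct solver, so the iteration count and tolerance in
    // the SolverControl object are not really used here.
    SolverControl solver_control;
    PETScWrappers::SparseDirectMUMPS solver(solver_control, mpi_communicator);
    solver.solve(system_matrix, solution, system_rhs);
  }

If the call aborts, the actual error message (for example, whether MUMPS is missing from your PETSc installation, or whether the factorization itself fails) would tell us much more than "not able to solve".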


I am wondering whether this is expected, given that the test case has a small system (2D, about 10,000 DoFs). Is there any other parallel direct solver I can try?

As a rule of thumb, direct solvers are typically faster than iterative solvers for problems with less than around 100,000 unknowns. Your problem is so small that it's not worth bothering with iterative solvers. It's probably also so small that it's not worth bothering with parallelization: parallelization only works well if you have more than around 50,000 unknowns per processor. You're still far away from that.
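
In other words, for a system of that size I would simply keep using the serial direct solver you already have. As a sketch (placeholder names; this is the usual SparseDirectUMFPACK pattern, not your actual code):

  #include <deal.II/lac/sparse_direct.h>
  #include <deal.II/lac/sparse_matrix.h>
  #include <deal.II/lac/vector.h>

  using namespace dealii;

  void umfpack_sketch(const SparseMatrix<double> &system_matrix,
                      Vector<double>             &solution,
                      const Vector<double>       &system_rhs)
  {
    SparseDirectUMFPACK direct_solver;
    direct_solver.initialize(system_matrix);   // computes the sparse LU factorization
    direct_solver.vmult(solution, system_rhs); // applies it to solve A x = b
  }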

Best
 W.

--
------------------------------------------------------------------------
Wolfgang Bangerth          email:                 bange...@colostate.edu
                           www: http://www.math.colostate.edu/~bangerth/
