Hi all,

I have parallelized a code in which I use GMRES or BiCGStab solvers with 
a BlockJacobi preconditioner for a symmetric but not positive-definite 
system matrix. The problem is that it converges for 150,000 DoFs and fewer 
(on 1 node with 16 processors) but doesn't converge for larger numbers of 
DoFs, for example 500,000 or one million (even with 64 processors). A 
minimal sketch of my solver setup is included below.
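This is roughly how the solve is set up (a sketch only; the matrix/vector 
names, the communicator, and the tolerance and iteration values are 
placeholders, not the exact ones from my code):

#include <deal.II/lac/petsc_solver.h>
#include <deal.II/lac/petsc_precondition.h>
#include <deal.II/lac/petsc_parallel_sparse_matrix.h>
#include <deal.II/lac/petsc_parallel_vector.h>

using namespace dealii;

// Solve the symmetric, indefinite system with GMRES + BlockJacobi.
void solve_system (const PETScWrappers::MPI::SparseMatrix &system_matrix,
                   PETScWrappers::MPI::Vector             &solution,
                   const PETScWrappers::MPI::Vector       &system_rhs,
                   MPI_Comm                                mpi_communicator)
{
  // Allow up to 10000 iterations, stop at an absolute residual of 1e-8
  // (placeholder values).
  SolverControl solver_control (10000, 1e-8);

  // GMRES through the PETSc wrappers; PETScWrappers::SolverBicgstab
  // is used in the same way.
  PETScWrappers::SolverGMRES solver (solver_control, mpi_communicator);

  // Block Jacobi preconditioner: one block per MPI process.
  PETScWrappers::PreconditionBlockJacobi preconditioner (system_matrix);

  solver.solve (system_matrix, solution, system_rhs, preconditioner);
}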

I would appreciate any clue about how to make it converge for a large 
number of DoFs. The following is the error I usually get.

Thanks 
Hamed

----------------------------------------------------
Exception on processing:

--------------------------------------------------------
An error occurred in line <151> of file 
</shared/hpc/deal.ii-candi/tmp/unpack/deal.II-v8.4.1/source/lac/petsc_solver.cc>
in function
    void dealii::PETScWrappers::SolverBase::solve(const 
dealii::PETScWrappers::MatrixBase&, dealii::PETScWrappers::VectorBase&, 
const dealii::PETScWrappers::VectorBase&, const 
dealii::PETScWrappers::PreconditionerBase&)
The violated condition was:
    false
The name and call sequence of the exception was:
    SolverControl::NoConvergence (solver_control.last_step(), 
solver_control.last_value())
Additional Information:
Iterative method reported convergence failure in step 10000. The residual 
in the last step was 1.54204e-06.

This error message can indicate that you have simply not allowed a 
sufficiently large number of iterations for your iterative solver to 
converge. This often happens when you increase the size of your problem. In 
such cases, the last residual will likely still be very small, and you can 
make the error go away by increasing the allowed number of iterations when 
setting up the SolverControl object that determines the maximal number of 
iterations you allow.

The other situation where this error may occur is when your matrix is not 
invertible (e.g., your matrix has a null-space), or if you try to apply the 
wrong solver to a matrix (e.g., using CG for a matrix that is not symmetric 
or not positive definite). In these cases, the residual in the last 
iteration is likely going to be large.
--------------------------------------------------------

Aborting!
----------------------------------------------------
ERROR: Uncaught exception in MPI_InitFinalize on proc 1. Skipping 
MPI_Finalize() to avoid a deadlock.
