Dear Timo,
Sorry, I did not mean to write to your personal e-mail; I just pressed the
wrong reply button by mistake.
Thank you very much for your reply. I will now check how I set up
mpi_communicator and will hopefully find the problem.
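For reference, this is the step-40 way of setting it up that I will compare
against: the communicator is stored once as a class member and handed to every
PETSc object created later (the class name MaxwellProblem is only illustrative,
not my actual code):

#include <deal.II/base/mpi.h>

class MaxwellProblem
{
public:
  MaxwellProblem()
    : mpi_communicator(MPI_COMM_WORLD) // as in step-40's constructor
  {}

private:
  MPI_Comm mpi_communicator; // reused for all PETSc matrices, vectors, solvers
};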
I changed step-40 to MUMPS myself a couple of days ago. From your e-mail,
which I got off-list (please try to use the mailing list), I see:
> [0]PETSC ERROR: #1 PetscCommDuplicate() line 137 in
> /home/anna/petsc-3.6.4/src/sys/objects/tagm.c
> An error occurred in line <724> of file
> in function
> void dealii::PETScWrappers::SparseDirectMUMPS::solve(const
>
Anna,
> The main reason I do this is that I do not understand how to reuse this
> decomposition in deal.ii.
> I am relatively new to deal.ii and C++, and I have never used MUMPS before.
Well, this has nothing to do with MUMPS or deal.II. It sounds like you
are struggling because you are not familiar
Dear Timo,
The main reason I do this is that I do not understand how to reuse this
decomposition in deal.ii.
I am relatively new to deal.ii and C++, and I have never used MUMPS before.
The way I set it up with SparseDirectUMFPACK was to use the
InnerPreconditioner structure:
template <int dim>
struct InnerPreconditioner;
Dear Timo,
Because I do not understand how to reuse this decomposition in deal.ii.
I am relatively new to deal.ii and C++, and I have never used MUMPS before.
I would appreciate any advice on this. Maybe there is some example in deal.ii
that uses MUMPS to construct a preconditioner.
Thank you very much.
> template
> void Pa_Preconditioner::
> vmult(LA::MPI::BlockVector &dst, const LA::MPI::BlockVector &src) const
> {
> SolverControl cn;
> PETScWrappers::SparseDirectMUMPS solver(cn, mpi_communicator);
> solver.set_symmetric_mode(true);
> solver.solve(B,dst.block(0),src.block(0));
> solver.solve(B,ds
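For comparison, here is a minimal sketch of the same vmult with the
SolverControl and the MUMPS solver kept as class members, so they are not
reconstructed on every application of the preconditioner. Everything besides
the deal.II/PETSc class names is an assumption based on the quoted snippet,
the solve on block(1) is a guess at the truncated last line, and whether the
MUMPS factorization itself is reused between solve() calls depends on the
deal.II version in use:

#include <deal.II/lac/generic_linear_algebra.h>
#include <deal.II/lac/petsc_solver.h>
#include <deal.II/lac/solver_control.h>

namespace LA = dealii::LinearAlgebraPETSc; // as in step-40
using namespace dealii;

class Pa_Preconditioner
{
public:
  Pa_Preconditioner(const LA::MPI::SparseMatrix &B_,
                    const MPI_Comm              &mpi_communicator)
    : B(B_)
    , solver(control, mpi_communicator)
  {
    solver.set_symmetric_mode(true);
  }

  void vmult(LA::MPI::BlockVector &dst, const LA::MPI::BlockVector &src) const
  {
    // Apply the direct solver block by block, as in the quoted code; the
    // second call is a guess at the truncated line above.
    solver.solve(B, dst.block(0), src.block(0));
    solver.solve(B, dst.block(1), src.block(1));
  }

private:
  const LA::MPI::SparseMatrix             &B;
  SolverControl                            control;
  mutable PETScWrappers::SparseDirectMUMPS solver;
};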
Dear Timo,
Thank you for your reply.
I am still having trouble implementing my code with MUMPS.
Let me briefly describe the problem:
I am solving two systems of Maxwell's equations, Ax1 = b1 and Ax2 = b2,
where A is a sparse, block-symmetric matrix (C -M; -M -C).
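Written out, the shorthand above is just (this is only a restatement of the
block structure already described, in LaTeX notation):

\[
  A x_1 = b_1, \qquad A x_2 = b_2, \qquad
  A = \begin{pmatrix} C & -M \\ -M & -C \end{pmatrix}.
\]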
Anna,
> Now with parallel implementation I would like to use
> PETScWrappers::SparseDirectMUMPS instead of SparseDirectUMFPACK.
>
> However I am getting the following error:
>
> error: ‘SparseDirectMUMPS’ in namespace ‘LA’ does not name a type
> typedef LA::SparseDirectMUMPS type;
You have to use the full namespace: SparseDirectMUMPS lives in PETScWrappers
and is not part of the LA alias used in step-40.
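For example, a minimal sketch of the corrected typedef, assuming the usual
step-40 setup where LA is an alias for the LinearAlgebraPETSc namespace
(names other than the deal.II ones are just for illustration):

#include <deal.II/lac/generic_linear_algebra.h>
#include <deal.II/lac/petsc_solver.h>

namespace LA = dealii::LinearAlgebraPETSc; // as in step-40

// The MUMPS wrapper is not re-exported through the LA alias, so it has to
// be named through the PETScWrappers namespace directly.
typedef dealii::PETScWrappers::SparseDirectMUMPS type;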
Dear All,
I have modified step-22 to solve a system of Maxwell's equations, and then
used step-40 (and step-55) to parallelize the code.
In the serial version I used
template <int dim>
struct InnerPreconditioner;

template <>
struct InnerPreconditioner<0>
{
  typedef SparseDirectUMFPACK type;
};

template <>
struct Inn
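The snippet above is cut off by the archive; for completeness, here is a
sketch of how the full pattern from step-22 might continue, together with a
hypothetical second specialization that selects the parallel MUMPS wrapper
(the <0>/<1> index values and the MUMPS specialization are assumptions, not
code from the original message):

#include <deal.II/lac/sparse_direct.h> // SparseDirectUMFPACK
#include <deal.II/lac/petsc_solver.h>  // PETScWrappers::SparseDirectMUMPS

using namespace dealii;

template <int dim>
struct InnerPreconditioner;

// Serial variant, as in the step-22-based code above.
template <>
struct InnerPreconditioner<0>
{
  typedef SparseDirectUMFPACK type;
};

// Hypothetical parallel variant using the PETSc MUMPS wrapper.
template <>
struct InnerPreconditioner<1>
{
  typedef PETScWrappers::SparseDirectMUMPS type;
};

Note that swapping the typedef alone is not enough: SparseDirectUMFPACK is
used through initialize() and vmult(), while the PETSc MUMPS wrapper in this
deal.II version is used through solve(matrix, dst, src), which is why a small
wrapper class providing vmult(), like the one quoted earlier in the thread,
is needed.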