The documentation states that Tpetra supports
- MPI
- Shared-memory parallelization (OpenMP, CUDA, POSIX threads)

and:
Scalar: A Scalar is the type of the values in the sparse matrix or dense
vector. This is the type most likely to be changed by many users. The most
common use cases are float, double, std::complex<float>, and
std::complex<double>.

and it contains:

   - Parallel distributions: Tpetra::Map
     <https://docs.trilinos.org/dev/packages/tpetra/doc/html/classTpetra_1_1Map.html>
     Contains information used to distribute vectors, matrices, and other
     objects. This class is analogous to Epetra's Epetra_Map class.

   - Distributed dense vectors: Tpetra::MultiVector
     <https://docs.trilinos.org/dev/packages/tpetra/doc/html/classTpetra_1_1MultiVector.html>,
     Tpetra::Vector
     <https://docs.trilinos.org/dev/packages/tpetra/doc/html/classTpetra_1_1Vector.html>
     Provides vector services such as scaling, norms, and dot products.

   - Distributed sparse matrices: Tpetra::RowMatrix
     <https://docs.trilinos.org/dev/packages/tpetra/doc/html/classTpetra_1_1RowMatrix.html>,
     Tpetra::CrsMatrix
     <https://docs.trilinos.org/dev/packages/tpetra/doc/html/classTpetra_1_1CrsMatrix.html>
     Tpetra::RowMatrix is an abstract interface for row-distributed sparse
     matrices; Tpetra::CrsMatrix is a concrete implementation of that
     interface using the compressed row storage format. Both classes derive
     from Tpetra::Operator
     <https://docs.trilinos.org/dev/packages/tpetra/doc/html/classTpetra_1_1Operator.html>,
     the base class for linear operators.

See https://docs.trilinos.org/dev/packages/tpetra/doc/html/index.html

Pascal Kraft wrote on Sunday, July 26, 2020, at 10:57:36 UTC+2:

> Hi Wolfgang,
>
> here is what I found out about the topic:
> Originally, I only knew Trilinos because I used the distributed matrices 
> and vectors in my application. I also knew that there is a configuration of 
> Trilinos that makes complex numbers available in all packages that support 
> them. However, from what I can tell, that only affects Tpetra data types, 
> not Epetra. From what I have seen in the deal.II wrappers, they only use 
> Epetra. An interesting detail here is the Komplex package, which is 
> described as an Epetra-based solver for complex systems; it wraps Epetra 
> matrices and stores the real and imaginary parts as blocks. (see here:  
> https://docs.trilinos.org/dev/packages/komplex/doc/html/index.html )
> On GitHub I can see that project 4 deals with adding Tpetra support, which 
> would make complex numbers in Tpetra usable in deal.II if the interface is 
> built to support them.
>
> About GMRES: I will be using PETSc GMRES to solve my system, but if 
> possible I will try to also solve it with dealii::SolverGMRES and let you 
> know what happens.
>
> Kind regards,
> Pascal
>
> Wolfgang Bangerth wrote on Sunday, July 26, 2020, at 01:43:44 UTC+2:
>
>> On 7/23/20 10:42 AM, Pascal Kraft wrote:
>> > 
>> > I have Trilinos compiled with support for complex numbers and have also 
>> > searched through the LinearAlgebra documentation.
>>
>> I don't think I knew that one can compile Trilinos with complex numbers. 
>> How 
>> do you do that?
>>
>> It does not greatly surprise me that we use TrilinosScalar and double 
>> interchangeably. If Trilinos can indeed be compiled with complex numbers, 
>> then we ought to find a way to (i) make TrilinosScalar dependent on 
>> whatever Trilinos was compiled for, and (ii) ensure that all of the places 
>> that currently don't compile because we use double in place of 
>> TrilinosScalar are fixed.
>>
>> Patches are, as always, very welcome!
>>
>>
>> > I require GMRES as a solver (which should be possible, because the GMRES 
>> > versions all use a templated Vector which can take complex components) 
>> > and MPI distribution of a sparse system. I have so far only seen 
>> > FullMatrix accept complex numbers.
>>
>> I believe that GMRES could indeed be made to work for complex-valued 
>> problems, but I'm not sure any of us has ever tried. When writing step-58, 
>> I toyed with the idea of looking up in the literature what one would need 
>> for a complex GMRES, but in the end decided to just make 
>> SparseDirectUMFPACK work instead. The issue is that for every 
>> matrix-vector and vector-vector operation that happens inside GMRES, you 
>> have to think about whether one or the other operand needs to be 
>> complex-conjugated. I'm certain that it is possible, but it would require 
>> an audit of a few hundred lines. It would probably be simpler to just use 
>> PETSc's (or Trilinos') GMRES implementation.
>>
>> Best
>> W.
>>
>> -- 
>> ------------------------------------------------------------------------
>> Wolfgang Bangerth email: bang...@colostate.edu
>> www: http://www.math.colostate.edu/~bangerth/
>>
>>
