Dear Wolfgang,

Thank you for your guidance; I did what you suggested. Basically all calls in
the system-matrix assembly have been removed, except for the integration of a
unit constant over each element, and the right-hand side is zero. To make the
system solvable, (non-zero, but presumably irrelevant) Dirichlet boundary
conditions are imposed on the whole boundary. I checked the memory consumption
through top (the RES column) without solving the linear system, and no issue
arose.
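
For concreteness, this is roughly all that is left of the assembly (a minimal
sketch in the step-17/step-18 style; the class name ElasticProblem and members
such as fe, quadrature_formula, dof_handler, hanging_node_constraints and the
boundary function are placeholders standing in for my actual code, not exactly
what I run):

#include <deal.II/base/function.h>
#include <deal.II/fe/fe_values.h>
#include <deal.II/lac/full_matrix.h>
#include <deal.II/numerics/matrix_tools.h>
#include <deal.II/numerics/vector_tools.h>

template <int dim>
void ElasticProblem<dim>::assemble_reduced_system ()
{
  system_matrix = 0;
  system_rhs    = 0;   // zero right-hand side

  FEValues<dim> fe_values (fe, quadrature_formula, update_JxW_values);
  const unsigned int dofs_per_cell = fe.dofs_per_cell;
  FullMatrix<double> cell_matrix (dofs_per_cell, dofs_per_cell);
  std::vector<types::global_dof_index> local_dof_indices (dofs_per_cell);

  for (const auto &cell : dof_handler.active_cell_iterators())
    if (cell->is_locally_owned())
      {
        cell_matrix = 0;
        fe_values.reinit (cell);

        // every local entry is just the integral of 1 over the cell
        for (unsigned int q = 0; q < quadrature_formula.size(); ++q)
          for (unsigned int i = 0; i < dofs_per_cell; ++i)
            for (unsigned int j = 0; j < dofs_per_cell; ++j)
              cell_matrix(i, j) += fe_values.JxW(q);

        cell->get_dof_indices (local_dof_indices);
        hanging_node_constraints.distribute_local_to_global
          (cell_matrix, local_dof_indices, system_matrix);
      }

  system_matrix.compress (VectorOperation::add);

  // non-zero (but otherwise irrelevant) Dirichlet data; boundary id 0 covers
  // the whole boundary in my mesh, and the field is assumed dim-component
  std::map<types::global_dof_index, double> boundary_values;
  VectorTools::interpolate_boundary_values
    (dof_handler, 0, Functions::ConstantFunction<dim> (1., dim),
     boundary_values);
  MatrixTools::apply_boundary_values
    (boundary_values, system_matrix, distributed_incremental_displacement,
     system_rhs, false);
}

As soon as I invoke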

BiCG.solve (this->system_matrix, distributed_incremental_displacement,
            this->system_rhs, preconditioner);

the RES value reported by top starts growing again. Guided by Richard's
remark, I implemented the Trilinos solver as well (the swap is sketched
below); in that case there are no leaks at all, even when building the system
matrix in its complete form.
My guess is that there is either some conflict or an installation-related
issue with PETSc. Note that the system solution was always correct; the
concern is only about the memory loss.
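
The two variants live inside my solve() member and differ only in the wrapper
classes (a minimal sketch; the tolerance and the choice of preconditioner are
placeholders rather than exactly what I use, and switching to Trilinos of
course also means switching the matrix and vector types from
PETScWrappers::MPI::* to TrilinosWrappers::*):

#include <deal.II/lac/solver_control.h>
#include <deal.II/lac/petsc_solver.h>
#include <deal.II/lac/petsc_precondition.h>
#include <deal.II/lac/trilinos_solver.h>
#include <deal.II/lac/trilinos_precondition.h>

// PETSc variant -- the one whose RES keeps growing for mpirun -np > 1:
{
  SolverControl solver_control (dof_handler.n_dofs(), 1e-10);
  PETScWrappers::SolverBicgstab BiCG (solver_control, mpi_communicator);
  PETScWrappers::PreconditionBlockJacobi preconditioner (system_matrix);
  BiCG.solve (system_matrix, distributed_incremental_displacement,
              system_rhs, preconditioner);
}

// Trilinos variant -- no leak at all, even with the full assembly:
{
  SolverControl solver_control (dof_handler.n_dofs(), 1e-10);
  TrilinosWrappers::SolverBicgstab BiCG (solver_control);
  TrilinosWrappers::PreconditionJacobi preconditioner;
  preconditioner.initialize (system_matrix);
  BiCG.solve (system_matrix, distributed_incremental_displacement,
              system_rhs, preconditioner);
}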

I also checked that no issues arise when running step-18 from the library
examples.

In conclusion, the problem must lie there; I am sorry I was not able to pin it
down more precisely. Maybe Richard can be more specific, and of course I am at
your disposal.

Hope this helps. 

Alberto


On Saturday, July 25, 2020 at 05:46:08 UTC+2, Wolfgang Bangerth wrote:

> On 7/24/20 3:32 AM, Alberto Salvadori wrote:
> >
> > It turns out that this code produces a memory loss, quite significant
> > since I am solving my system thousands of times, eventually inducing
> > the run to fail. I am not sure what is causing this issue and how to
> > solve it, maybe more experienced users than myself can catch the
> > problem with a snap of fingers.
> >
> > I have verified the issue on my mac (Catalina) as well as on linux
> > ubuntu (4.15.0), using deal.ii 9.1.1.
> > Apparently the issue reveals only when mpi is invoked with more than
> > one processor, whereas it does not emerge when running in serial or by
> > mpirun -np 1.
>
> Alberto -- I've taken a look at the SolverBicgstab class and don't see
> anything glaringly obvious that would suggest where the memory is lost.
> It's also funny that that would only happen with more than one processor
> because the memory handling of PETSc vectors shouldn't be any different
> for one or more processors.
>
> Do you think you could come up with a simple test case that illustrates
> the problem? In your case, I'd start with the code you have and remove
> basically everything you do: replace the assembly by a function that just
> fills the matrix with the identity matrix (or something similarly simple),
> remove everything that does anything useful with the solution, remove
> graphical output, etc. The only thing that should remain is the loop that
> repeatedly solves a linear system and illustrates the memory leak, but the
> program no longer has to do anything useful (in fact, it probably
> shouldn't -- it should only exercise the one part you suspect of causing
> the memory leak).
>
> I think that would make finding the root cause substantially simpler!
>
> Best
> W.
>
> -- 
> ------------------------------------------------------------------------
> Wolfgang Bangerth email: bang...@colostate.edu
> www: http://www.math.colostate.edu/~bangerth/
>
>