On Sun, Mar 09, 2025 at 11:44:17AM +0000, Gavin Smith wrote:
> > Also, maybe more 'philosophically', not freeing this memory is a bit
> > like accepting that memory leaks are not an issue if the speed gain is
> > important when the memory is only released at exit.
> 
> I don't accept the term "memory leak" to refer to still allocated memory at
> the end of the program.

My point here is that, even if it is not a memory leak in C, it is
unreachable memory: if a tool could analyse the Perl and C parts of the
program together, it would report the memory as unreachable once the
Perl scope where $document exists is left.  It is not strictly a memory
leak, but conceptually it looks like one.

> In the past, I found that valgrind would not report a leak if the program
> ended with "exit (0)" at the end of 'main' rather than "return", as 
> local variables in 'main' would still reference the memory.  Memory referenced
> by global (or static) variables would also not be viewed as lost.
> 
> More recently I found this could be affected by optimization settings.
> That is the reason for my comment in README-hacking regarding the info
> tests: "Note that a small number of tests may report leaks unless info
> is compiled with CFLAGS='-O0'."
> 
> There's no point in doing expensive clean-up operations immediately
> before exiting the program.  The only point would be if valgrind could not
> cope with it and real memory leaks were being obscured.

Ok.  I think that memory leaks can appear or disappear depending on what
we free, though, or, as you say above, on compiler optimization details,
or on whether we use return or exit...
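
To make this concrete, here is a toy C program (nothing to do with
texi2any) that, as far as I understand, shows the difference, at least
when built with -O0:

#include <stdlib.h>

int
main (void)
{
  char *buffer = malloc (4096);   /* never freed, on purpose */
  (void) buffer;

  exit (0);   /* reported as "still reachable" by valgrind --leak-check=full */
  /* With "return 0;" instead, the same block tends to be reported as
     "definitely lost", as the local pointer in main's frame is gone.  */
}

With higher optimization the pointer may not be kept in the stack frame
at all, which matches your remark about CFLAGS='-O0'.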

Another point is that if texi2any processes two files and the documents
(except for the last) are not freed explicitly, the total memory use
includes all of them.
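
What I have in mind is roughly the following loop.  The names are made
up for the example (they are not the actual texi2any functions), but the
idea is to release each document before parsing the next, so that the
total memory holds only one document at a time:

#include <stddef.h>

/* Illustrative declarations only, stand-ins for "parse", "convert"
   and "free".  */
struct document;
struct document *parse_texi_file (const char *filename);
void convert_document (struct document *document);
void remove_document (struct document *document);

void
process_files (char **input_files, size_t input_files_nr)
{
  size_t i;
  for (i = 0; i < input_files_nr; i++)
    {
      struct document *document = parse_texi_file (input_files[i]);
      convert_document (document);
      /* Without this, every parsed document stays allocated until exit,
         and the total memory grows with the number of input files.  */
      remove_document (document);
    }
}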

I still think that, with TEST, we should always free, in order to find
issues.

> > Indeed, the
> > unreleased memory is a memory leak from the whole Perl+C program point
> > of view, as the Perl object that could be used to retrieve the C data
> > to be freed is lost (it is not fully lost, all the documents may still
> > be released, but the handle to this specific document is lost).  (As a
> > side note, the document holds Perl references, such that if it is not
> > freed, Perl will not be able to collect this memory.  Since we are at
> > the end of the program, not allowing Perl to release memory probably
> > also speeds up the program).
> 
> As far as I know Perl does not clean up all of its allocations unless
> the "destruct level" is set with e.g. Perl::Destruct::Level.

Not all, but Perl frees some allocations when it notices that a
reference count has reached 0, independently of Perl::Destruct::Level.
So, if the C code keeps references, that memory cannot be collected.  I
am not saying that it is a big deal, though; it is just a remark.
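
As a sketch of what I mean, with the perlapi reference counting macros
(the function names here are made up for the example): as long as the C
side holds its own reference, the count cannot drop to 0 and Perl cannot
collect the SV.

#include "EXTERN.h"
#include "perl.h"

static SV *held_document_sv;

/* The C side keeps its own reference to the Perl object.  */
void
hold_document (SV *document_sv)
{
  held_document_sv = SvREFCNT_inc (document_sv);
}

/* Only after this can Perl free the SV (and whatever it references)
   when its reference count reaches 0.  */
void
release_document (void)
{
  SvREFCNT_dec (held_document_sv);
  held_document_sv = NULL;
}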

-- 
Pat
