https://gcc.gnu.org/bugzilla/show_bug.cgi?id=113476

--- Comment #18 from Aldy Hernandez <aldyh at gcc dot gnu.org> ---
(In reply to rguent...@suse.de from comment #17)
> On Wed, 21 Feb 2024, aldyh at gcc dot gnu.org wrote:
> 
> > https://gcc.gnu.org/bugzilla/show_bug.cgi?id=113476
> > 
> > --- Comment #8 from Aldy Hernandez <aldyh at gcc dot gnu.org> ---
> > (In reply to Richard Biener from comment #5)
> > > (In reply to Martin Jambor from comment #4)
> > > > The right place to free stuff in the lattices post-IPA would be in
> > > > ipa_node_params::~ipa_node_params(), where we should iterate over the
> > > > lattices and deinitialize them, or perhaps destruct the array, because
> > > > ipcp_vr_lattice directly contains Value_Range, which AFAIU directly
> > > > contains int_range_max, which has a virtual destructor... it does not
> > > > look like a POD anymore.  This escaped me when I was looking at the
> > > > IPA-VR changes, but hopefully it should not be too difficult to deal
> > > > with.
> > > 
> > > OK, that might work for the IPA side.
> > > 
> > > It's quite unusual to introduce a virtual DTOR in the middle of the
> > > class hierarchy, though.  Grepping, I do see quite a few direct uses of
> > > 'irange' and also 'vrange' which do not have the DTOR visible, but
> > > 'irange' already exposes and uses 'maybe_resize'.  I think those should
> > > only be introduced in the class exposing the virtual DTOR (but why
> > > virtual?!).
> > > 
> > > Would be nice to have a picture of the range class hierarchies with
> > > pointers on which types to use in which circumstances ...
> > 
> > For reference, you should always use the least derived class you can.
> > If you can get the work done with the vrange API, use that.  If you're
> > dealing with an integer, use irange.  The int_range<> derived class
> > should only be used for defining a variable, so:
> > 
> > int_range<123> foobar; // Defined with storage.
> > 
> > if (is_a <irange> (foobar))
> > {
> >   irange &p = as_a <irange> (foobar);
> >   ...
> > }
> > 
> > // Use an irange reference here, not an int_range reference.
> > void somefunc (const irange &r)
> > {
> > }
> > 
> > Also, the reason irange does not have a destructor is that it has no
> > storage.  Only int_range<> has storage associated with it, so it is the
> > only one able to see if the range grew:
> 
> But when I do
> 
>  irange *r = new int_range<>;
>  delete r;
> 
> then it will fail to release memory?  Are virtual DTORs not exactly
> invented for this, when you delete on a reference/pointer to a base
> class but the object is really a derived one?

There is no memory to release above, as int_range<> does not grow by default. 
Only int_range_max can grow, and it's meant to live on the stack by design. 
That was the whole point:

typedef int_range<3, /*RESIZABLE=*/true> int_range_max;
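
To make the intended usage concrete, a quick sketch (not code from any
patch, just the pattern):

// Both objects carry their own storage and live on the stack.
int_range<2> small;   // fixed at 2 sub-ranges, never grows
int_range_max big;    // starts at 3 sub-ranges, can grow via maybe_resize

// When "big" goes out of scope its own destructor runs and releases
// anything maybe_resize allocated -- no virtual dispatch involved, since
// it is destroyed as the type it was declared with.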

I suppose if you specifically wanted to shoot yourself in the foot, you could
do:

irange *r = new int_range<5, true>;

...and that would fail to release memory, but why would you do that?  If you're
allocating a lot of ranges, we have the vrange_allocator written specifically
for that, which is a lot less memory intensive.  It's what we use in the cache.
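
Spelled out, the foot-gun would look something like this (a sketch, not a
pattern we actually use):

irange *r = new int_range<5, true>;  // resizable range on the heap, held
                                     // through a base-class pointer
// ... enough union_/intersect work that maybe_resize kicks in ...
delete r;  // formally undefined without a virtual DTOR in the base; in
           // practice the int_range<> destructor never runs, so whatever
           // maybe_resize allocated is leaked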

If ranges live anywhere but the stack, they should go through the allocator,
which squishes them down appropriately, as iranges are very big.  The ipa_vr
class also uses the allocator in order to keep its memory footprint down.
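
Roughly the pattern, from memory, so treat the exact names and signatures as
approximations of what's in value-range-storage.h rather than gospel:

// Squish a computed range into compact storage...
vrange_allocator alloc;
vrange_storage *store = alloc.clone (r);   // r is some irange/vrange

// ...and inflate it back into a full-size stack range when needed.
int_range_max tmp;
store->get_vrange (tmp, type);             // type = the tree type of r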

I guess it wouldn't hurt to have a virtual destructor in irange or even
vrange for future-proofing, but it's not needed right now.  I don't have any
strong opinions either way, unless there's a performance penalty.
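
If we ever do add one, the effect is just the usual C++ one; a stand-alone
toy (not GCC code) showing what a virtual DTOR at the base buys when
deleting through a base pointer:

#include <cstdio>

struct base                       // stands in for vrange/irange
{
  virtual ~base () { }            // the future-proofing in question
};

struct derived : base             // stands in for a resizable int_range<>
{
  int *buf = new int[16];         // models storage grown by maybe_resize
  ~derived () { delete[] buf; puts ("derived dtor ran"); }
};

int main ()
{
  base *p = new derived;
  delete p;   // with the virtual DTOR this runs ~derived and frees buf;
              // without it, it's undefined and the buffer would leak
  return 0;
}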
