Dan Sugalski wrote:
> At 12:06 PM 2/9/2001 -0500, Ken Fox wrote:
> >  2. Work proportional to live data, not total data. This is hard to
> >     believe for a C programmer, but good garbage collectors don't have
> >     to "free" every allocation -- they just have to preserve the live,
> >     or reachable, data. Some researchers have estimated that 90% or
> >     more of all allocated data dies (becomes unreachable) before the
> >     next collection. A ref count system has to work on every object,
> >     but smarter collectors only work on 10% of the objects.
>
> As is this. (Perl can generate a lot of garbage if you're messing around
> with strings and arrays a lot)
>

Let me see if I've got this right. If I change the way some objects are used so
that I tend to create new objects instead of reusing the old ones, I'm not
actually degrading GC performance, since its work is proportional to live data.
Right? It does increase memory usage, though, right? Could that cause thrashing
if the extra memory pushes the process into swap? (I guess not, since the live
data would probably still be accessed regularly, and the dead data would
probably be reclaimed before it ever got paged out, right?)

What, then, are the actual consequences of generating more or less garbage by
reusing (or not reusing) structures under this advanced GC model?
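
To make the question concrete, here's a rough sketch of the two patterns I
have in mind (invented just for illustration, not taken from any real code):

    # Pattern A: reuse one structure.  The array stays live the whole
    # time, so the collector keeps seeing it, but each pass produces
    # very little new garbage.
    my @records = (1 .. 1000);
    my @buffer;
    for my $rec (@records) {
        @buffer = ();
        push @buffer, $rec * 2;
    }

    # Pattern B: build a fresh structure on every pass.  The previous
    # array becomes unreachable almost immediately -- exactly the kind
    # of short-lived garbage a good collector never touches one by one.
    for my $rec (@records) {
        my $buffer = [ $rec * 2 ];
        # ... use $buffer, then just let it go ...
    }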

> Finally, all you really need to do is read the last day or so of p5p where
> Alan's trying to plug a batch of perl memory leaks to see how well the
> refcount scheme seems to be working now...

Yeah, I know that... But I actually think this is because Perl 5's
implementation of refcounting is quite messy, especially when weakrefs come
into play.
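
For example, even the textbook parent/child cycle needs special handling: when
the last strong reference to the parent goes away, the weak slot has to be
turned into undef behind the scenes. A minimal sketch, assuming
Scalar::Util::weaken is available:

    use Scalar::Util qw(weaken);

    # A parent/child cycle: each hash keeps the other's refcount above
    # zero, so pure refcounting alone would never free either of them.
    my $parent = { name => 'parent' };
    my $child  = { name => 'child' };
    $parent->{child} = $child;
    $child->{parent} = $parent;

    # Weakening the back-reference breaks the cycle, but now freeing the
    # parent also has to reset this slot to undef -- that extra
    # bookkeeping is the messy part.
    weaken($child->{parent});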

- Branden
