>> On Sun, 30 Aug 2009 08:34:47 +0200, Stefan Folkerts 
>> <stefan.folke...@itaa.nl> said:


> Interesting ideas, and a simulator would be fun for this purpose.
> You could be right, and your example does make sense in a way, but
> still ... I do wonder if it works out in the real world.

> Let's say you have normal data that expires (user files etc) and
> large databases, some you keep for many months and sometimes even
> years.

I understand the case you're making, and I agree that the size of your
files has an impact.  I'm suggesting that the impact isn't huge, and
that it evens out in a reasonably short timeframe.

Eventually, whatever the volume size, you wind up with a library full
of volumes more or less randomly distributed between 0% and 50%
reclaimable.  If you're keeping up with reclamation, that means you're
_in_ a steady state, so you're _doing_ the same amount of work per
unit time.
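
That steady-state claim is easy to check with a toy simulator (the kind
suggested above). The sketch below is my own illustration, not anything
from TSM itself: unit-size files arrive at a fixed daily rate with
exponentially distributed lifetimes, fill volumes of a chosen size, and a
volume is reclaimed (its surviving files rewritten) once its dead
fraction reaches a threshold. All parameter names and values here are
made up for the demonstration; the point is that the rewrite work per
day settles to roughly the same figure regardless of volume size.

```python
import random

def simulate(volume_size, daily_files=200, days=300,
             mean_lifetime=40.0, reclaim_at=0.5, seed=0):
    """Average files rewritten per day by reclamation, measured over
    the second half of the run (after the library reaches steady state).
    Toy model only: unit-size files, exponential lifetimes."""
    rng = random.Random(seed)
    volumes = []   # closed volumes: lists of absolute expiry days
    open_vol = []  # the volume currently being filled

    def write(expiries):
        # Append files to the open volume, closing it when full.
        nonlocal open_vol
        for e in expiries:
            open_vol.append(e)
            if len(open_vol) == volume_size:
                volumes.append(open_vol)
                open_vol = []

    moved_per_day = []
    for day in range(days):
        # New backup data: fixed daily ingest of unit-size files.
        write(day + rng.expovariate(1.0 / mean_lifetime)
              for _ in range(daily_files))
        # Reclamation: any closed volume whose dead fraction has hit
        # the threshold gets its live files rewritten; that rewrite
        # is the "work per unit time" being measured.
        moved, keep, survivors = 0, [], []
        for vol in volumes:
            live = [e for e in vol if e > day]
            if (len(vol) - len(live)) / len(vol) >= reclaim_at:
                moved += len(live)
                survivors.extend(live)
            else:
                keep.append(vol)
        volumes = keep
        write(survivors)
        moved_per_day.append(moved)
    half = days // 2
    return sum(moved_per_day[half:]) / half

# Work per day lands in the same ballpark across volume sizes:
for size in (500, 2000, 10000):
    print(f"{size:>6}-file volumes: ~{simulate(size):.0f} files moved/day")
```

With memoryless (exponential) lifetimes and a 50% threshold, each file
survives a given reclamation pass with probability about one half, so the
expected number of rewrites per file is near one and daily reclamation
work tracks daily ingest, whatever the volume size. Larger volumes just
make the work lumpier, which is the "first approximation" caveat below.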


So when I say "To a first approximation, it's irrelevant", focus on
the "first approximation" bit: yes, there are variations here, but
don't sweat them too much.

It's certainly possible to back yourself into corners with very large
or very small volumes.



- Allen S. Rout
