Raymond Hettinger added the comment:

> Limiting the cache size is also not a solution in the 
> practical example with request that I linked to in the
> previous comment, because we can't know in advance how
> many times per request the function is going to be called, 
> picking an arbitrary number feels wrong and may lead to 
> unexpected behaviors

This suggests that you don't really want an LRU cache, which is specifically 
designed to limit the cache size by expelling the least recently used entry.
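For illustration, a minimal sketch of that eviction behavior with functools.lru_cache (the maxsize of 2 and the square function are arbitrary choices for the demo):

```python
from functools import lru_cache

calls = []

@lru_cache(maxsize=2)
def square(x):
    calls.append(x)  # record actual computations, i.e. cache misses
    return x * x

square(1)   # miss -> computed
square(2)   # miss -> computed
square(1)   # hit  -> 1 becomes the most recently used entry
square(3)   # miss -> cache is full, so 2 (least recently used) is expelled
square(2)   # miss again: 2 was expelled and must be recomputed

print(calls)   # [1, 2, 3, 2]
```

If the number of distinct inputs per request is unknown, any fixed maxsize can trigger this kind of recomputation.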

At its heart, the cache decorator is all about mapping fixed inputs to fixed 
outputs.  The memory conservation comes from the replacement strategy and an 
option to clear the cache entirely.
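Those two memory-conservation mechanisms look like this in practice (the Fibonacci function here is just a stand-in for any pure function of its arguments):

```python
from functools import lru_cache

@lru_cache(maxsize=None)   # unbounded: every distinct input stays cached
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(20)
info = fib.cache_info()    # e.g. CacheInfo(..., maxsize=None, currsize=21)
print(info.currsize)       # 21 entries cached (n = 0 through 20)

fib.cache_clear()          # the option to clear the cache entirely
print(fib.cache_info().currsize)   # 0
```

With maxsize=None there is no replacement strategy at all, so cache_clear() is the only way to release memory.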

The reason that my answer and Serhiy's answer don't fit your needs is that it 
isn't clear what you really want to do.  I think you should move this 
discussion to StackOverflow so others can help you tease out your actual needs 
and suggest appropriate solutions.  Ideally, you should start with real use 
cases rather than focusing on hacking up the LRU cache implementation.

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue19859>
_______________________________________