Paul Rubin wrote:
> Anyone got a favorite LRU cache implementation?  I see a few in google
> but none look all that good.  I just want a dictionary indexed by
> strings, that remembers the last few thousand entries I put in it.
I don't know of a module for that (although one might exist), but I can imagine how to implement it.

Generally, an LRU cache can be implemented as a (linked) list with a maximum size of N. Whenever an object is used, it is moved to the top of the list (and removed from its old position). If it is not yet stored in the cache, you (fetch it from the source and) add it at the top, removing the last entry if the list now exceeds N entries.

So you need to do two things:

1) Maintain the usage order: use a (linked) list.
2) Provide random access to that list: use a dict (hash) that also has to be
   updated whenever objects are added or removed.

A rough, untested sketch of this is appended below my signature.

-- 
Thomas Wittek
Web: http://gedankenkonstrukt.de/
Jabber: [EMAIL PROTECTED]
GPG: 0xF534E231
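Just to illustrate the idea (all names below are my own invention, not from any existing module, and I haven't tested this beyond a quick run):

class _Node(object):
    # One entry in the doubly linked list.
    __slots__ = ("key", "value", "prev", "next")
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache(object):
    # Holds at most `capacity` entries; the least recently used one is evicted.
    def __init__(self, capacity):
        self.capacity = capacity
        self._nodes = {}       # key -> _Node, for O(1) random access
        self._head = None      # most recently used end of the list
        self._tail = None      # least recently used end of the list

    def _unlink(self, node):
        # Detach a node from the list, fixing up its neighbours.
        if node.prev is not None:
            node.prev.next = node.next
        else:
            self._head = node.next
        if node.next is not None:
            node.next.prev = node.prev
        else:
            self._tail = node.prev
        node.prev = node.next = None

    def _push_front(self, node):
        # Insert a node at the "most recently used" end.
        node.next = self._head
        node.prev = None
        if self._head is not None:
            self._head.prev = node
        self._head = node
        if self._tail is None:
            self._tail = node

    def __getitem__(self, key):
        node = self._nodes[key]     # raises KeyError if not cached
        self._unlink(node)          # every access moves the entry to the front
        self._push_front(node)
        return node.value

    def __setitem__(self, key, value):
        if key in self._nodes:
            node = self._nodes[key]
            node.value = value
            self._unlink(node)
            self._push_front(node)
            return
        node = _Node(key, value)
        self._nodes[key] = node
        self._push_front(node)
        if len(self._nodes) > self.capacity:
            # Evict the least recently used entry from both structures.
            old = self._tail
            self._unlink(old)
            del self._nodes[old.key]

    def __contains__(self, key):
        return key in self._nodes

    def __len__(self):
        return len(self._nodes)

Example use:

    cache = LRUCache(3)
    for word in "a b c d".split():
        cache[word] = word.upper()
    print("a" in cache)   # False: "a" was least recently used and got evicted
    print(cache["d"])     # "D"

A real implementation would probably also want a way to delete entries, and locking if it is shared between threads.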