I've also put the source for my layered cache here:
http://code.google.com/p/stick-cache/
There is basically one interface and three implementations: memcache,
datastore, and in-memory (an LRU guarded by read/write locks).
One quite handy thing it does is handle blocking for expensive items.
You can pass a Callable like this:
cache.value("myKey", callable);
If the value exists, it is returned. If not, it is built, and other
threads asking for the same key block rather than building it at the
same time. Once the build is done, all levels of the cache are updated.
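The stick-cache source linked above has the real implementation. As a rough sketch of the blocking behaviour described (the class and method names here are my own, not necessarily what stick-cache uses), one common approach is to pair ConcurrentMap.putIfAbsent with a FutureTask so exactly one thread runs the Callable and everyone else waits on it:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.FutureTask;

// Hypothetical sketch: one builder thread per key, other threads block
// on the same FutureTask until the value is ready.
public class BlockingCache<K, V> {
    private final ConcurrentMap<K, FutureTask<V>> tasks = new ConcurrentHashMap<>();

    public V value(K key, Callable<V> builder) {
        FutureTask<V> task = tasks.get(key);
        if (task == null) {
            FutureTask<V> created = new FutureTask<>(builder);
            task = tasks.putIfAbsent(key, created);
            if (task == null) {
                // This thread won the race: it alone builds the value.
                task = created;
                task.run();
            }
        }
        try {
            // Losing threads block here until the build finishes.
            return task.get();
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        }
    }
}
```

In a layered cache the winning thread would also push the built value out to memcache and the datastore before completing, which is presumably what "update all levels" refers to.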
Hope this helps
On 11 Mar 2010, at 03:52, Don Schwarz wrote:
To pick one at random:
http://commons.apache.org/collections/apidocs/org/apache/commons/collections/map/LRUMap.html
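LRUMap holds a fixed number of entries and evicts the least recently used one when it fills up. If you would rather avoid the Commons Collections dependency, the same behaviour can be sketched on the JDK's own LinkedHashMap in access-order mode (this is a minimal illustration, not the LRUMap API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache: LinkedHashMap with accessOrder=true reorders entries
// on get(), and removeEldestEntry evicts once the cap is exceeded.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        super(16, 0.75f, true); // true = access order, not insertion order
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }
}
```

Note this class is not thread-safe on its own; wrap it with Collections.synchronizedMap or a read/write lock if it will be shared across request threads.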
On Wed, Mar 10, 2010 at 11:08 AM, Prashant Gupta <[email protected]> wrote:
Thanks, Don, for the clarification.
100 MB seems sufficient for me. Could you please suggest a library for
an LRU caching implementation?
--
You received this message because you are subscribed to the Google Groups "Google App Engine for Java" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to [email protected].
For more options, visit this group at http://groups.google.com/group/google-appengine-java?hl=en.