You can make a factory that, given your condition, returns a caching decorator, which you then apply to your function:
def conditional_caching(condition):
    def decorator(f):
        cache = {}
        def cached_f(n):
            if n in cache:
                return cache[n]
            else:
                value = f(n)
                # Only store the result when the condition holds.
                if condition(n):
                    cache[n] = value
                return value
        return cached_f
    return decorator

(Just a proof of concept; I haven't bothered with functools.wraps or argument hashability.)

After this you simply do

@conditional_caching(condition)
def my_function(n):
    ...whatever

Obviously this would be quite inefficient if implemented in pure Python, but it shouldn't be hard to write a similar factory along the same lines as @cached_function. One could even use a default condition=None and in that case return the usual cached_function decorator. Since the presence of a condition is checked only once, when the decorator is created, this should not make any existing code slower.

Cheers,
J

On Wednesday, May 14, 2014 6:04:51 PM UTC+1, Simon King wrote:
>
> Hi!
>
> Nathann's original question/proposal was about ways to cache the return
> value of a cached function only under specific conditions. This would
> probably boil down to a decorator that takes more than one function as
> argument. This is not unprecedented: think of Python's @property.
>
> So, I could very well imagine that we could have a syntax like this
> (which is totally analogous to @property!):
>
> class MyClass:
>     @cached_method
>     def foo(self, *args, **kwds):
>         "doc"
>         <compute value, given *preprocessed* arguments>
>         return value
>     @foo.mangle_arguments
>     def foo(self, *args, **kwds):
>         "doc"
>         <preprocess the given arguments>
>         return preprocessed arguments
>     @foo.create_key
>     def foo(self, *args, **kwds):
>         "doc"
>         <use the preprocessed arguments to create the cache key>
>         return key
>     @foo.cache_condition
>     def foo(self, *args, **kwds):
>         "doc"
>         <test whether the value should be cached or not>
>         return True resp. False
>
> Of course, the next question is whether this could be done without
> slowing down existing stuff (that does not use the new feature). It is
> probably possible, by creating new sub-classes of CachedMethodCaller.
>
> Note that in the scenario above, I wrote "doc" a couple of times. In
> some post, Nathann complained that he would need to write several
> docstrings and tests when he would separate the task of computing the
> value, computing some cache condition and so on. Well, Nathann, I think
> here you are arguing *pro* separation of tasks! Namely, separate tasks
> have separate docs, which makes it easier to read.
>
> Note also that the above syntax would help us to use
> UniqueRepresentation and other stuff. One shortcoming of
> UniqueRepresentation is that *currently* one has to override
> __classcall__ by some static method, and all the separate tasks
> (argument mangling, creation of the key, creation of the instance,
> caching by calling super(...).__classcall__) are done in one single
> method.
>
> In this point, UniqueFactory currently has its only advantage over
> UniqueRepresentation, from my point of view. But with the syntax above,
> one could probably override just __classcall__.mangle_arguments in a
> subclass of UniqueRepresentation (hmmm, thinking a few minutes, I am
> not sure how this could be easily done, technically...), which would
> be nicer than overriding the whole __classcall__.
>
> Anyway, I am not sure how this could be implemented, but I think the
> above extension of @cached_method (or similarly @cached_function) would
> be nice to have.
>
> Best regards,
> Simon
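
P.S. For concreteness, here is a rough pure-Python sketch of how the @property-style registration Simon describes could be wired together with a cache condition. This is only an illustration of the mechanism: the cached_method class below is a simplified stand-in, not Sage's actual CachedMethod/CachedMethodCaller, the name cache_condition is just taken from Simon's mail, and the per-instance cache attribute name is an arbitrary choice of mine.

import functools

class cached_method(object):
    """Simplified stand-in for @cached_method that also accepts a
    @foo.cache_condition sub-decorator (proof of concept only; no
    argument mangling and no custom key creation)."""

    def __init__(self, f):
        self._f = f
        self._condition = None
        functools.update_wrapper(self, f)

    def cache_condition(self, cond):
        # Analogous to @property's .setter: remember the condition and
        # return the same descriptor, so the class attribute is simply
        # re-bound to it.
        self._condition = cond
        return self

    def __get__(self, instance, owner):
        if instance is None:
            return self
        # One cache dict per instance and per method name.
        cache = instance.__dict__.setdefault('_cache_' + self._f.__name__, {})
        def caller(*args):
            if args in cache:
                return cache[args]
            value = self._f(instance, *args)
            # Methods without a registered condition behave as usual.
            if self._condition is None or self._condition(instance, *args):
                cache[args] = value
            return value
        return caller

class MyClass(object):
    @cached_method
    def foo(self, n):
        "Compute something expensive."
        return n**2

    @foo.cache_condition
    def foo(self, n):
        "Only cache the value for small inputs."
        return n < 100

m = MyClass()
m.foo(5)     # computed once, then served from the cache
m.foo(500)   # recomputed on every call, since the condition rejects it

Methods that never register a condition only pay for a single "is None" test per call, so this fits with the requirement of not slowing down existing code.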