[issue13177] Avoid chained exceptions in lru_cache

2011-10-16 Thread Raymond Hettinger
Raymond Hettinger added the comment: Thanks for the bug report and for the test case.

[issue13177] Avoid chained exceptions in lru_cache

2011-10-16 Thread Ezio Melotti
Ezio Melotti added the comment: My comment was referring to the double try/except suggested by Eric. Indeed, the if/else might lead to a race condition, and that's a good reason to avoid LBYL here, even if on average the if/else form might be faster despite calling hash() twice. I'm happy with the fix you committed, thanks!

[issue13177] Avoid chained exceptions in lru_cache

2011-10-16 Thread Raymond Hettinger
Changes by Raymond Hettinger: resolution: -> fixed; status: open -> closed

[issue13177] Avoid chained exceptions in lru_cache

2011-10-16 Thread Roundup Robot
Roundup Robot added the comment: New changeset 8365f82f8a13 by Raymond Hettinger in branch '3.2': Issue 13177: Make tracebacks more readable by avoiding chained exceptions in the lru_cache. http://hg.python.org/cpython/rev/8365f82f8a13 -- nosy: +python-dev
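The diff itself isn't quoted in the thread, but based on the commit message and Raymond's proposal below, the heart of the change is moving the user_function() call out of the except block (a rough before/after sketch, not the literal patch):

    # Before: user_function() runs while the KeyError is being handled,
    # so any exception it raises is chained to the KeyError.
    try:
        result = cache[key]
        hits += 1
    except KeyError:
        result = user_function(*args, **kwds)
        misses += 1

    # After: the KeyError is caught and discarded first, then the user
    # function runs outside the handler, so nothing chains.
    try:
        result = cache[key]
        hits += 1
        return result
    except KeyError:
        pass
    result = user_function(*args, **kwds)
    misses += 1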

[issue13177] Avoid chained exceptions in lru_cache

2011-10-15 Thread Raymond Hettinger
Raymond Hettinger added the comment: Sorry, I'm not going to introduce a possible race condition just because you find try/except less elegant than an if-contains test. Also, I want to keep the current behavior of calling hash() only once. Your original complaint is valid, though, so I'll go ahead and fix it.

[issue13177] Avoid chained exceptions in lru_cache

2011-10-14 Thread Ezio Melotti
Ezio Melotti added the comment: Canceling the chained exception might work as a workaround, but it requires yet another try/except and it's not very elegant in my opinion. Raymond, about __missing__: it shouldn't be a problem here, because we are using "in" to look in the cache, and the "in" operator never invokes __missing__.
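To see why: __missing__ is only invoked by a failing [] lookup on a dict subclass, never by the "in" operator (DefaultingDict is a toy subclass made up for this demo):

    class DefaultingDict(dict):
        def __missing__(self, key):
            return "default"          # only called on a failed [] lookup

    d = DefaultingDict()
    print(3 in d)   # False -- "in" bypasses __missing__
    print(d[3])     # 'default' -- [] triggers __missing__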

[issue13177] Avoid chained exceptions in lru_cache

2011-10-14 Thread Eric Snow
Eric Snow added the comment: Could you just cancel the chained exception?

    >>> try:
    ...     {}["asdf"]
    ... except KeyError:
    ...     try:
    ...         raise Exception()
    ...     except Exception as x:
    ...         x.__cause__ = None
    ...         x.__context__ = None
    ...         x.__traceback__ = None
    ...         raise

[issue13177] Avoid chained exceptions in lru_cache

2011-10-14 Thread Raymond Hettinger
Raymond Hettinger added the comment: One possibility is to move the call to user_function() outside of the KeyError exception handler so that user exceptions won't be chained:

    def wrapper(*args, **kwds):
        nonlocal hits, misses
        key = args
        if kwds:
            key += kwd_mark + tuple(sorted(kwds.items()))
        ...
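Fleshed out into a self-contained sketch (simplified from the functools source of the era; the helper names and the OrderedDict-based cache are assumptions, not the committed code):

    from collections import OrderedDict
    from threading import Lock

    def lru_cache(maxsize=100):
        def decorating_function(user_function):
            cache = OrderedDict()      # ordered least-recent to most-recent
            kwd_mark = (object(),)     # separates positional from keyword args
            lock = Lock()
            hits = misses = 0

            def wrapper(*args, **kwds):
                nonlocal hits, misses
                key = args
                if kwds:
                    key += kwd_mark + tuple(sorted(kwds.items()))
                with lock:
                    try:
                        result = cache[key]       # hash(key) computed once
                        cache.move_to_end(key)    # record recent use
                        hits += 1
                        return result
                    except KeyError:
                        pass
                # The user function runs outside the except block, so an
                # exception raised here is not chained to the KeyError.
                result = user_function(*args, **kwds)
                with lock:
                    cache[key] = result
                    misses += 1
                    if len(cache) > maxsize:
                        cache.popitem(last=False)  # evict least recently used
                return result
            return wrapper
        return decorating_function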

[issue13177] Avoid chained exceptions in lru_cache

2011-10-14 Thread Raymond Hettinger
Raymond Hettinger added the comment: Another issue is that I want to keep the related accesses to the OrderedDict inside the "with lock" in order to avoid a race condition between the testing-for-membership step and the retrieve-the-cached-value step.
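The race he has in mind looks like this when the two steps are not under a single lock (the interleaving shown is illustrative):

    # BROKEN without the lock: membership test and retrieval are two steps.
    if key in cache:            # thread A: key is present...
        # ...thread B evicts it here, e.g. cache.popitem(last=False)...
        result = cache[key]     # thread A: KeyError after all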

[issue13177] Avoid chained exceptions in lru_cache

2011-10-14 Thread Raymond Hettinger
Raymond Hettinger added the comment: This changes behavior so that hash() gets called twice for the key tuple, resulting in decreased performance. In an earlier version of the lru_cache, before the "with lock" was introduced, the try/except form was more atomic. It also worked well with dict subclasses that define __missing__().
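The double hashing is easy to demonstrate with a key type that counts its own __hash__ calls (HashCounter is made up for this demo):

    class HashCounter:
        calls = 0
        def __hash__(self):
            HashCounter.calls += 1
            return 42

    cache = {}
    key = HashCounter()
    cache[key] = "value"

    HashCounter.calls = 0
    if key in cache:            # hash #1
        result = cache[key]     # hash #2
    print(HashCounter.calls)    # 2

    HashCounter.calls = 0
    try:
        result = cache[key]     # hash #1 only
    except KeyError:
        pass
    print(HashCounter.calls)    # 1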

[issue13177] Avoid chained exceptions in lru_cache

2011-10-14 Thread Ezio Melotti
Ezio Melotti added the comment: Here's an example (copied from msg142063) of what the traceback is without the patch:

    >>> from functools import lru_cache
    >>> @lru_cache()
    ... def func(arg):
    ...     raise ValueError()
    ...
    >>> func(3)
    Traceback (most recent call last):
      File "/home/wolf/dev/py/3.2/Lib/
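The same chaining can be reproduced without lru_cache: any exception raised while a KeyError is being handled is chained to it (frames abbreviated here):

    >>> def func(arg):
    ...     raise ValueError()
    ...
    >>> cache = {}
    >>> try:
    ...     cache[3]
    ... except KeyError:
    ...     func(3)             # raises inside the handler, so it chains
    ...
    Traceback (most recent call last):
      ...
    KeyError: 3

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      ...
    ValueError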

[issue13177] Avoid chained exceptions in lru_cache

2011-10-14 Thread Ezio Melotti
New submission from Ezio Melotti: The attached patch changes lru_cache to use if/else instead of try/except. This has two effects: 1) it avoids chained exceptions and makes the error messages clearer; 2) it probably makes lru_cache a bit faster, since building and catching exceptions is expensive.
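A sketch of the shape the patch proposes (simplified; the attached patch itself isn't reproduced in the thread, and the helper names are assumptions):

    def wrapper(*args, **kwds):
        nonlocal hits, misses
        key = args
        if kwds:
            key += kwd_mark + tuple(sorted(kwds.items()))
        with lock:
            if key in cache:               # LBYL: no KeyError to chain from
                result = cache[key]
                cache.move_to_end(key)     # record recent use
                hits += 1
            else:
                result = user_function(*args, **kwds)
                cache[key] = result
                misses += 1
                if len(cache) > maxsize:
                    cache.popitem(last=False)
            return result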