Hi everyone! I just saw a bug (?) in bson.dbref: DBRef.__getattr__. Here's their code:

    def __getattr__(self, key):
        return self.__kwargs[key]
When you do copy.deepcopy on such an object, it raises a KeyError. So here's a small piece of code that reproduces the problem:

    >>> class A(object):
    ...     def __init__(self):
    ...         self.d = {}
    ...     def __getattr__(self, key):
    ...         return self.d[key]
    ...
    >>> a = A()
    >>> copy.deepcopy(a)
    Traceback (most recent call last):
      File "<pyshell#17>", line 7, in <module>
        copy.deepcopy(a)
      File "/usr/lib/python2.6/copy.py", line 171, in deepcopy
        copier = getattr(x, "__deepcopy__", None)
      File "<pyshell#17>", line 5, in __getattr__
        return self.d[key]
    KeyError: '__deepcopy__'

So I thought the right thing for now would be to do just:

    class A(object):
        def __init__(self):
            self.d = {}
        def __getattr__(self, key):
            if key.startswith('__'):
                raise AttributeError
            return self.d[key]

and it works, but... isn't that wrong? I mean, shouldn't deepcopy somehow work in this situation, or maybe something else should differ? And why does this code:

    class A(object):
        def __init__(self):
            self.d = {}
        def __getattr__(self, key):
            if key in dir(self.d):
                return self.d[key]
            raise AttributeError

    a = A()
    deepcopy(a)

get "maximum recursion depth exceeded"?

Thank you.

--
jabber: kost-be...@ya.ru
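For what it's worth, here is the pattern I've converged on so far. It's only my sketch of what I think a deepcopy-safe __getattr__ should look like (not necessarily what bson.dbref should do): translate KeyError into AttributeError so deepcopy's getattr probes work, and read `d` through self.__dict__ so the method can't call itself while the copy is being reconstructed with an empty instance dict:

```python
import copy

class A(object):
    def __init__(self):
        self.d = {}

    def __getattr__(self, key):
        # __getattr__ only runs when normal lookup fails, but during
        # deepcopy/unpickling it can be invoked on a half-built instance
        # before __init__ has set self.d. Going through self.__dict__
        # avoids re-entering __getattr__ (the recursion I was seeing).
        try:
            return self.__dict__['d'][key]
        except KeyError:
            # deepcopy probes for __deepcopy__, __setstate__, etc. and
            # expects AttributeError (not KeyError) for missing names.
            raise AttributeError(key)

a = A()
a.d['x'] = 1
b = copy.deepcopy(a)
print(b.x)         # prints 1
print(b.d is a.d)  # prints False: the dict itself was copied
```

With this version deepcopy works without the startswith('__') special case, and it also answers my recursion question: the dir(self.d) variant blows up because `self.d` itself fails on the fresh copy and re-enters __getattr__.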