Getting a callable for any value?
Is there anything like this in the standard library?

    class AnyFactory(object):
        def __init__(self, anything):
            self.product = anything
        def __call__(self):
            return self.product
        def __repr__(self):
            return "%s.%s(%r)" % (self.__class__.__module__,
                                  self.__class__.__name__, self.product)

My use case is:

    collections.defaultdict(AnyFactory(collections.defaultdict(AnyFactory(None))))

and I think lambda expressions are not preferable. I found itertools.repeat(anything).next and functools.partial(copy.copy, anything), but neither of those reprs well, and both are confusing. I think AnyFactory is the most readable, but it is confusing if the reader doesn't know what it is. Am I missing a standard implementation of this?
-- 
http://mail.python.org/mailman/listinfo/python-list
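For what it's worth, there is no such class in the standard library; the poster's approach works as-is. A minimal runnable sketch (using the hypothetical name ConstantFactory, mirroring AnyFactory above) showing it nested inside defaultdicts:

```python
import collections

class ConstantFactory:
    """A zero-argument callable that always returns the same value.
    Hypothetical name mirroring the poster's AnyFactory; not a stdlib class."""
    def __init__(self, value):
        self.value = value
    def __call__(self):
        return self.value
    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.value)

# Nested defaultdicts whose missing keys ultimately default to None:
inner = collections.defaultdict(ConstantFactory(None))
d = collections.defaultdict(ConstantFactory(inner))
print(d["missing"]["also missing"])   # None
print(ConstantFactory(None))          # ConstantFactory(None)
```

One caveat that applies equally to the original AnyFactory: the outer factory returns the *same* inner dict object for every missing key, so all outer keys share one inner mapping; a `lambda: collections.defaultdict(...)` would be needed to get a fresh dict per key.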
Style question (Poll)
Which is preferred:

    for value in list:
        if not value is another_value:
            value.do_something()
        break

--or--

    if list and not list[0] is another_value:
        list[0].do_something()

Comments are welcome.
Thanks
-- 
http://mail.python.org/mailman/listinfo/python-list
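Assuming the `break` sits at the loop-body level (so both forms act on at most the first element), the two variants are behaviorally equivalent; here is a small runnable sketch with hypothetical wrapper names demonstrating that. Note PEP 8 prefers `x is not y` over `not x is y` in either variant:

```python
def first_if_not(items, sentinel, action):
    # Option A: loop-and-break, touching at most the first element
    for value in items:
        if value is not sentinel:
            action(value)
        break

def first_if_not_indexed(items, sentinel, action):
    # Option B: explicit truthiness-and-index test
    if items and items[0] is not sentinel:
        action(items[0])

calls = []
first_if_not([1, 2, 3], None, calls.append)
first_if_not_indexed([1, 2, 3], None, calls.append)
first_if_not([], None, calls.append)          # empty: neither form calls action
first_if_not_indexed([None], None, calls.append)  # sentinel first: no call
print(calls)  # [1, 1]
```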
Object Diffs
Hello Python list:

I am doing research into network-based propagation of Python objects. In order to maintain network efficiency, I want to send just the differences between Python objects, and I was wondering whether there is (or was) any other research or development in this area? I was thinking that I could look at how pickle works and implement a diff system there, or I could actually create a pickle of the object and then use difflib to compare the flattened text.

Ideas or comments welcome.

Thanks,
Dave Butler
-- 
http://mail.python.org/mailman/listinfo/python-list
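The second idea can be sketched in a few lines: pickle both objects, disassemble the pickles into a line-oriented listing with pickletools, and feed the lines to difflib. This is only an illustration of the approach (pickle_diff is a hypothetical name, and pickle streams are not canonical, so spurious diffs are possible), not a wire format:

```python
import difflib
import io
import pickle
import pickletools

def pickle_diff(old, new):
    """Diff the pickletools disassembly of two objects' pickles.
    Hypothetical helper sketching the poster's pickle+difflib idea."""
    def dis_lines(obj):
        buf = io.StringIO()
        pickletools.dis(pickle.dumps(obj), out=buf)
        return buf.getvalue().splitlines(keepends=True)
    return list(difflib.unified_diff(dis_lines(old), dis_lines(new),
                                     fromfile="old", tofile="new"))

diff = pickle_diff({"a": 1, "b": 2}, {"a": 1, "b": 3})
print("".join(diff))
```

A diff of disassembly text is human-readable but not directly applicable to reconstruct the new object on the far side; a practical protocol would diff the raw pickle bytes (or the object graph itself) and ship a patch.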
"High water" Memory fragmentation still a thing?
Question: Does Python, in general terms (apart from extensions or gc manipulation), exhibit a "high water" type leak of allocated memory in recent Python versions (2.7+)?

Background:

From the post: http://chase-seibert.github.io/blog/2013/08/03/diagnosing-memory-leaks-python.html

Begin quote:

Long running Python jobs that consume a lot of memory while running may not return that memory to the operating system until the process actually terminates, even if everything is garbage collected properly. That was news to me, but it's true. What this means is that processes that do need to use a lot of memory will exhibit a "high water" behavior, where they remain forever at the level of memory usage that they required at their peak.

Note: this behavior may be Linux specific; there are anecdotal reports that Python on Windows does not have this problem.

This problem arises from the fact that the Python VM does its own internal memory management. It's commonly known as memory fragmentation. Unfortunately, there doesn't seem to be any fool-proof method of avoiding it.

End quote.

However, this paper seems to indicate that it is not a modern problem: http://www.evanjones.ca/memoryallocator/

--Thanks
Dave Butler
-- 
https://mail.python.org/mailman/listinfo/python-list
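The behavior can be observed directly by watching the process's resident set size around a large allocation. A Linux-specific sketch (it reads /proc/self/statm, so it won't run elsewhere); whether `after` falls back toward `before` depends on the interpreter's allocator, and the linked evanjones.ca writeup describes the obmalloc change that lets modern CPython return empty arenas to the OS:

```python
import gc
import os

def rss_kib():
    # Linux-specific: resident set size from /proc/self/statm,
    # field 1 is resident pages; convert pages -> KiB.
    with open("/proc/self/statm") as f:
        resident_pages = int(f.read().split()[1])
    return resident_pages * os.sysconf("SC_PAGE_SIZE") // 1024

before = rss_kib()
data = [str(i) for i in range(1_000_000)]   # ~60 MiB of small objects
during = rss_kib()
del data
gc.collect()
after = rss_kib()
print(before, during, after)  # "high water" behavior shows as after staying near during
```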