On Thu, Apr 7, 2011 at 6:38 AM, Martin v. Loewis <mar...@v.loewis.de> wrote:
> You can adjust the implementations of PyMem_Malloc and PyObject_Malloc.
> This would catch many allocations, but not all of them. If you adjust
> PyMem_MALLOC instead of PyMem_Malloc, you catch even more allocations -
> but extensions modules which directly call malloc() still would bypass
> this accounting.
I'm not too concerned about extensions here; in any case, I lock most of
them off. I just want to prevent stupid stuff like this:

    a = 'a'
    while True:
        a += a

from bringing the entire node to its knees. Obviously that will
eventually bomb with MemoryError, but I'd rather it be some time
*before* the poor computer starts thrashing virtual memory.

(Hmm. I tried the above code in Python 2.6.6 on my scratch box, with
3GB of memory, and it actually died with "OverflowError: strings are
too large to concat" at 1GB. Must be the 32-bit Python on there, heh.
But repeating the exercise in the same Python with a second variable
produces the expected MemoryError.)

If it's too difficult, I'll probably just tell my boss that we need
8GB of physical memory in these things, and then disable virtual
memory. That'll ensure that MemoryError happens before the hard disk
starts grinding performance into dust :)

Chris Angelico
--
http://mail.python.org/mailman/listinfo/python-list
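For the archive: one way to get that MemoryError *before* the box starts
swapping, without touching PyMem_Malloc at all, is the stdlib `resource`
module, which can cap the process's address space on Linux (the 1 GiB
cap below is an arbitrary illustration, not a value from this thread;
RLIMIT_AS is not reliably enforced on all platforms, e.g. macOS):

```python
import resource

def cap_address_space(max_bytes):
    """Lower this process's soft address-space limit (POSIX/Linux).

    Allocations past the cap fail, so Python raises MemoryError
    instead of the OS paging the machine into the ground. Returns
    the previous (soft, hard) limits so the caller can restore them;
    only the soft limit is lowered, which an unprivileged process
    may raise again later.
    """
    soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    resource.setrlimit(resource.RLIMIT_AS, (max_bytes, hard))
    return soft, hard

if __name__ == "__main__":
    old = cap_address_space(1024 * 1024 * 1024)  # 1 GiB cap
    try:
        a = b'a'
        while True:
            a += a  # doubles each pass; soon trips the cap
    except MemoryError:
        print("MemoryError raised well before the disk starts grinding")
    finally:
        a = None
        resource.setrlimit(resource.RLIMIT_AS, old)  # restore soft limit
```

The cap applies per process, so it has to be set in (or inherited by)
each worker; it does nothing for C extensions that ignore errno, but
plain malloc() failures do propagate back to Python as MemoryError.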