velotron wrote:
> On Nov 9, 8:38 pm, "Klaas" <[EMAIL PROTECTED]> wrote:
>
> > I was referring specifically to abominations like range(1000000)
>
> However, there are plenty of valid reasons to allocate huge lists of
> integers.

I'm sure there are some; I doubt there are plenty. Care to name a few?
> This issue has been worked on:
> http://evanjones.ca/python-memory.html
> http://evanjones.ca/python-memory-part3.html
>
> My understanding is that the patch allows most objects to be released
> back to the OS, but can't help the problem for integers. I could be
> mistaken.

Integers use their own allocator and as such aren't affected by Evan's
patch.

> But on a clean Python 2.5:
>
> x=range(10000000)
> x=None
>
> The problem exists for floats too, so for a less contrived example:
>
> x=[random.weibullvariate(7.0,2.0) for i in xrange(10000000)]
> x=None
>
> Both leave the Python process bloated in my environment. Is this
> problem a good candidate for the FAQ?

I think floats use obmalloc, so I'm slightly surprised you don't see a
difference between the two cases. I know that Evan's patch imposes
conditions on freeing obmalloc arenas, so you could be seeing effects
of that.

-Mike
--
http://mail.python.org/mailman/listinfo/python-list
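
A quick way to see what is being described above is to watch the
process RSS around each allocation and release. The sketch below is
Linux-specific (it reads VmRSS from /proc/self/status; the rss_kb
helper is just illustrative) and assumes the Python 2.x setup the
thread is discussing:

import random

def rss_kb():
    # Resident set size in kB, read from /proc (Linux-only).
    f = open('/proc/self/status')
    try:
        for line in f:
            if line.startswith('VmRSS:'):
                return int(line.split()[1])
    finally:
        f.close()
    return -1

print 'baseline:      ', rss_kb(), 'kB'

x = range(10000000)              # ints come from the int free list
print 'ints held:     ', rss_kb(), 'kB'
x = None
print 'ints dropped:  ', rss_kb(), 'kB'   # typically stays high

x = [random.weibullvariate(7.0, 2.0) for i in xrange(10000000)]
print 'floats held:   ', rss_kb(), 'kB'
x = None
print 'floats dropped:', rss_kb(), 'kB'   # depends on obmalloc arena freeing

On a typical 2.5 build you would expect the int case to stay bloated
after x = None, while what happens in the float case depends on
whether the obmalloc arenas actually get released back to the OS.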