rfv-370 <[EMAIL PROTECTED]> wrote:

> have made the following small test:
>
> Before starting my test my UsedPhysicalMemory(PF): 555Mb
>
> >>> tf=range(0,10000000)    PF: 710Mb (so 155Mb for my list)
> >>> tf=[0,1,2,3,4,5]        PF: 672Mb (Why? Why is the remaining 117Mb not freed?)
> >>> del tf                  PF: 672Mb (unused memory not freed)
Integer objects, once generated, are kept around in a "free list" against the probability that they might be needed again in the future (a few other types of objects similarly keep a per-type free list, but I think int is the only one that keeps an unbounded amount of memory there). Like any other kind of cache, this free list normally hoards a bit more memory than needed but yields better runtime performance; anomalous cases like your example, however, can easily "bust" this too-simple heuristic.

> So how can I force Python to clean the memory and free the memory that
> is not used?

On Windows, with Python 2.5, I don't know of a good approach. On Linux and other Unix-like systems I've used a strategy based on forking: do the bit that needs a bazillion ints in a child process, then end the child process, which returns all of its memory to the OS. That won't work on Windows -- no fork.

I suggest you enter a feature request to let gc grow a way to ask every type object to prune its cache, on explicit request from the Python program. That won't solve the problem in Python 2.5, but work on 3.0 is underway and this is just the right time for such requests.

Alex
--
http://mail.python.org/mailman/listinfo/python-list
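[Editor's note: the fork-based strategy Alex describes can be sketched roughly as below. This is not from the original post; it uses modern Python syntax, is Unix-only, and `run_in_child` is a made-up helper name. The idea is simply that memory allocated in a forked child is returned to the OS when the child exits, so the parent's footprint stays small.]

```python
import os


def run_in_child(work):
    """Run a memory-hungry callable in a forked child process.

    All memory the child allocates (including int free-list growth)
    is released back to the OS when the child exits; only a small
    text result travels back to the parent over a pipe.
    Unix-only: os.fork is not available on Windows.
    """
    read_fd, write_fd = os.pipe()
    pid = os.fork()
    if pid == 0:
        # Child: do the int-heavy work, send the result back, exit hard
        # (os._exit avoids running the parent's cleanup handlers twice).
        os.close(read_fd)
        result = work()
        with os.fdopen(write_fd, "w") as f:
            f.write(str(result))
        os._exit(0)
    # Parent: read the child's result, then reap the child.
    os.close(write_fd)
    with os.fdopen(read_fd) as f:
        result = int(f.read())
    os.waitpid(pid, 0)
    return result


# Example: summing ten million ints creates a huge number of int
# objects; doing it in the child keeps the parent's memory flat.
total = run_in_child(lambda: sum(range(10000000)))
```

Only the (small) return value crosses the pipe; anything large should stay in the child. For richer results, one could serialize with `pickle` instead of `str`/`int`, at the cost of a little more plumbing.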