I have an application with one function called "compute", which, given a filename, goes through that file and performs various statistical analyses. It uses arrays extensively and loops a lot, and it prints the results of its statistical significance tests to standard out. Since compute returns and, as far as I can tell, no global variables are involved, I would expect all of its memory to be returned to the operating system when it does.
Instead, what I see is that every iteration uses several more megabytes. For example, Python uses 52 MB when starting out; after several iterations I'm suddenly using more than 500 MB of RAM. Does anyone have any pointers on how to figure out what I'm doing wrong? Thanks, mohan -- http://mail.python.org/mailman/listinfo/python-list
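One way to narrow this down (a minimal sketch for Python 3, using a stand-in `compute` since the real one isn't shown) is the standard-library `tracemalloc` module, which reports which source lines are accumulating Python objects between two snapshots. This distinguishes a genuine reference leak from the separate effect that CPython often keeps freed memory in its own allocator pools instead of returning it to the OS, so process size alone can be misleading:

```python
import tracemalloc

def compute(filename):
    # Placeholder for the real analysis: the filename is ignored here;
    # this just allocates a large list each call, like a stats pass might.
    data = [float(i) for i in range(100_000)]
    return sum(data) / len(data)

tracemalloc.start()
before = tracemalloc.take_snapshot()
for _ in range(3):
    compute("stats.txt")  # hypothetical input file
after = tracemalloc.take_snapshot()

# Show the top source lines whose allocated size grew across the loop.
for stat in after.compare_to(before, "lineno")[:5]:
    print(stat)
```

If the diff shows a line inside `compute` (or a module-level cache it feeds) growing on every iteration, something is still holding references to the arrays; if nothing grows, the objects are being freed and the resident size is just CPython retaining pages.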