[EMAIL PROTECTED] wrote:
> MKoool wrote:
> > I have an application with one function called "compute", which,
> > given a filename, goes through that file and performs various
> > statistical analyses. It uses arrays extensively and loops a lot.
> > It prints the results of its statistical significance tests to
> > standard out. Since the compute function returns and I think no
> > variables of global scope are being used, I would think that when
> > it does, all memory returns back to the operating system.
> >
> > Instead, what I see is that every iteration uses several megs more.
> > For example, python uses 52 megs when starting out; it goes through
> > several iterations and I'm suddenly using more than 500 megs of ram.
> >
> > Does anyone have any pointers on how to figure out what I'm doing
> > wrong?
>
> Are you importing any third party modules? It's not unheard of that
> someone else's code has a memory leak.
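- one quick way to see what's piling up: count live objects by type
with the gc module before and after a call to compute, and look for
types whose counts keep climbing. Untested sketch; compute is your
function, the filename is made up:

import gc
# from yourmodule import compute   # however your script gets at it

def type_counts():
    # tally the objects the collector tracks, grouped by type name,
    # so growth between calls stands out
    counts = {}
    for obj in gc.get_objects():
        name = type(obj).__name__
        counts[name] = counts.get(name, 0) + 1
    return counts

before = type_counts()
compute("somefile.dat")    # your function; hypothetical filename
after = type_counts()
for name in sorted(after):
    delta = after.get(name, 0) - before.get(name, 0)
    if delta > 0:
        print("%s grew by %d" % (name, delta))

gc.get_objects() only sees container objects the collector tracks, but
a leak big enough to add hundreds of megs almost always involves
containers, so this is usually enough to finger the culprit.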
- sounds like you're working with very large, very sparse matrices,
running LSI/SVD or a PCA/covariance analysis, something like that. So
it's a specialized problem; you need to specify what libs you're using,
what your platform / OS is, your Python release, how you installed it,
and details about C extensions, pyrex/psyco/swig. The more info you
supply, the more you get back.

- be aware there are wrong ways to measure memory, e.g. this long
thread:
http://mail.python.org/pipermail/python-list/2005-November/310121.html
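- if you just want a number you can trust from inside the process, the
resource module works on most Unix-like platforms. Another untested
sketch (again, compute is your function and the filename is made up);
note that ru_maxrss is a high-water mark, so it never goes back down
even when memory really is returned to the OS:

import resource
# from yourmodule import compute   # however your script gets at it

def peak_rss():
    # peak resident set size of this process, as reported by the kernel.
    # units vary: kilobytes on Linux, bytes on some BSDs / Mac OS X.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

print("peak before: %d" % peak_rss())
compute("somefile.dat")
print("peak after:  %d" % peak_rss())

because it's a peak, a number that stays flat across many iterations
means memory is being reused; steady growth across iterations is what
points at a real leak.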