On 05/10/2017 22:06, Fetchinson . wrote:
> Hi folks,
>
> I have a rather simple program which cycles through a bunch of files,
> does some operation on them, and then quits. There are 500 files
> involved and each operation takes about 5-10 MB of memory. As you'll
> see I tried to make every attempt at removing everything at the end of
> each cycle so that memory consumption doesn't grow as the for loop
> progresses, but it still does.
>
> import os
>
> for f in os.listdir( '.' ):
>     x = [ ]
>     for ( i, line ) in enumerate( open( f ) ):
>         import mystuff
>         x.append( mystuff.expensive_stuff( line ) )
What if you change this line to:
        mystuff.expensive_stuff( line )
If memory still grows (perhaps a little less, since the results are no longer
added to x), then something inside mystuff is holding on to data about each
line. You might also try removing the call completely (temporarily of course).
Other replies have pointed out that deleting the imported name doesn't free
any of the data the module holds. So you might as well move the import to the
top of the file where it belongs.
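For what it's worth, it's easy to see that del only removes the name, not the
module or its data; a tiny sketch using json as a stand-in (the same applies
to mystuff):

import sys

import json                      # module object created and cached in sys.modules
first = json

del json                         # removes only the name 'json' from this namespace
print( 'json' in sys.modules )   # True -- the module object and all its data still exist

import json                      # re-importing just re-binds the same cached object
print( json is first )           # True -- nothing was reloaded or freed in between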
>         del mystuff
>
>     import mystuff
>     mystuff.some_more_expensive_stuff( x )
You'll have to comment these lines out first, I think. A rough sketch of the
rearranged loop is below.
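Putting those two suggestions together, the loop would look roughly like this.
It's only a sketch of the shape, not a tested fix: mystuff is still the
placeholder module from the original post, the unused enumerate index is
dropped, and each file is closed with a with block (the original never closed
them):

import os
import mystuff                        # imported once, at the top where it belongs

for f in os.listdir( '.' ):
    x = [ ]
    with open( f ) as fh:             # close each file when done with it
        for line in fh:
            x.append( mystuff.expensive_stuff( line ) )
    mystuff.some_more_expensive_stuff( x )
    # x is rebound to a fresh list on the next pass, so the old list can be freed;
    # if memory still grows here, the data is being kept alive inside mystuff itself.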
--
bartc