Fetchinson wrote:
Hi folks,

I have a rather simple program which cycles through a bunch of files,
does some operation on them, and then quits. There are 500 files
involved, and each operation takes about 5-10 MB of memory. As you'll
see, I made every attempt to remove everything at the end of each
cycle so that memory consumption wouldn't grow as the for loop
progresses, but it still does.

import os

for f in os.listdir( '.' ):

     x = [ ]

     for ( i, line ) in enumerate( open( f ) ):

         import mystuff
         x.append( mystuff.expensive_stuff( line ) )
         del mystuff

     import mystuff
     mystuff.some_more_expensive_stuff( x )
     del mystuff
     del x


What could be the reason? I understand that mystuff might be leaky,
but if I delete it, doesn't that mean that whatever memory was
allocated is freed? Similarly, x is deleted, so that can't possibly
make the memory consumption go up.

Any hint would be much appreciated,
Daniel


Try calling the garbage collector explicitly. Note also that
"del mystuff" only removes the name binding from your namespace; the
module object itself stays cached in sys.modules, so deleting and
re-importing it frees nothing. Likewise, "del x" just drops a
reference: CPython frees the list once its reference count reaches
zero, but objects caught in reference cycles are only reclaimed when
the cyclic collector runs, and even then the process may not hand
the freed memory back to the OS.
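
Something along these lines, as a rough sketch (keeping the mystuff
names from your post; the module itself is assumed to exist):

import gc
import os

import mystuff  # one import is enough; del never unloads a module

for f in os.listdir('.'):
    x = []
    with open(f) as fh:  # close each file deterministically
        for line in fh:
            x.append(mystuff.expensive_stuff(line))
    mystuff.some_more_expensive_stuff(x)
    del x         # drop the only reference to the list
    gc.collect()  # sweep up any reference cycles left behind

If memory still grows after this, the leak is most likely inside
mystuff itself (e.g. a C extension that never releases its buffers).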
