I have this rather simple batch import program. The problem is that it keeps eating memory. I am processing a 30+ MB file. To be able to handle it, I split it into files of 10,000 lines each (a rough sketch of the splitting step is below). With each new file it processes, the import gets slower and slower.
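For reference, the splitting step looks roughly like this (a simplified sketch; the source path, chunk names, and helper name are illustrative, not my exact code):

import os

def split_file(source, outdir, lines_per_file=10000):
    # Write the source file out as numbered chunks of at most
    # lines_per_file lines each.
    out = None
    src = open(source, 'rb')
    for i, line in enumerate(src):
        if i % lines_per_file == 0:
            if out:
                out.close()
            name = 'chunk_%04d.csv' % (i // lines_per_file)
            out = open(os.path.join(outdir, name), 'wb')
        out.write(line)
    if out:
        out.close()
    src.close()

split_file('/home/berry/temp/addresses.csv', '/home/berry/temp/csvtemp/')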
When it has processed about 50 files, my computer with 1 GB of RAM starts swapping and the program virtually grinds to a halt. Why is this program eating memory?

Thanks very much for your help.

Berry

-----------------------

import os
import os.path
import glob
import csv

os.environ['DJANGO_SETTINGS_MODULE'] = 'MyDjango.settings'
from MyDjango.apps.address.models import Address

CSVDIRECTORY = '/home/berry/temp/csvtemp/'
DELETEPROCESSEDFILES = True

# Column numbers for the input file; use the names, not the numbers,
# in the rest of the program.
C_STREETNAME = 0
C_POSTCODE = 1
C_PLACE = 2

if CSVDIRECTORY.find('.csv') != -1:
    # A single .csv file was given instead of a directory.
    path = CSVDIRECTORY
else:
    path = os.path.join(CSVDIRECTORY, '*')

for f in glob.glob(path):
    print 'Working on file: %s' % f
    count = 0
    print 'Number of items processed:'
    csvfile = open(f, 'rb')
    reader = csv.reader(csvfile, delimiter='\t', quoting=csv.QUOTE_NONE)
    for row in reader:
        # Only import data rows: skip header lines that mention
        # 'Straat' or 'street' in the first column.
        if row[0].find('Straat') == -1 and row[0].find('street') == -1:
            a = Address.objects.create(
                streetname_housenumber=row[C_STREETNAME],
                postcode=row[C_POSTCODE],
                placename=row[C_PLACE],
                countrycode='NL',
            )
            a = None  # drop the reference right away, trying to free memory
            count += 1
            # Print the running count in place, backspacing over the
            # previous value.
            s = '%s%s' % (count, '\b' * (len(str(count)) + 1))
            print s,
    csvfile.close()
    if DELETEPROCESSEDFILES:
        os.remove(f)
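One thing I am not sure about, in case it matters: settings.py is not shown above, and I have read that while settings.DEBUG is True, Django appends every executed query to django.db.connection.queries, so a long import accumulates one log entry per INSERT. I don't know whether that is what is happening here, but if it is, a small helper like this (the name and the batch size of 1000 are made up for the sketch) could be called from the row loop to keep that log from growing without bound:

from django import db

def maybe_flush_query_log(count, every=1000):
    # Clear the SQL log that Django keeps while settings.DEBUG is True,
    # once per 'every' processed rows.
    if count % every == 0:
        db.reset_queries()

Maybe something else is going on, though; any pointers appreciated.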