If you have DEBUG = True, then Django collects the SQL for every query it runs (in django.db.connection.queries) for debugging purposes. That can consume a lot of memory, especially over 800k+ records. Could that be the problem?
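A minimal sketch of the periodic-reset pattern this suggests: clear the debug query log every N rows so it cannot grow without bound. The `reset` callable stands in for django.db.reset_queries (and `fake_save` for Zipcode.save) so the shape of the fix can be run outside a Django project; the batch size of 1000 is an arbitrary choice.

```python
# Sketch: cap the growth of Django's debug query log during a bulk load.
# `save` and `reset` are stubs here; in a real project they would be the
# model's save() and django.db.reset_queries.

def load_rows(lines, save, reset, batch_size=1000):
    """Parse tab-separated lines, save each row, and reset the query
    log every `batch_size` rows."""
    for count, line in enumerate(lines, start=1):
        tokens = line.rstrip("\n").split("\t")
        save(tokens)
        if count % batch_size == 0:
            reset()  # in Django: django.db.reset_queries()

# Demo with stubs: DEBUG=True appends one log entry per query.
query_log = []
saved = []

def fake_save(tokens):
    saved.append(tokens)
    query_log.append("INSERT ...")

load_rows(["a\tb\n"] * 2500, fake_save, query_log.clear, batch_size=1000)
print(len(saved), len(query_log))  # 2500 rows saved, log capped at 500
```

Without the reset, the log would hold 2500 entries by the end; with it, the log never exceeds the batch size. (Setting DEBUG = False avoids the logging entirely.)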
--Ned. http://nedbatchelder.com

bcrem wrote:
> Hi all,
>
> I've got a memory issue with a script I'm running to load my
> database. It leaks, slowly but surely, until I exceed the memory
> limits set by my service provider. It's a very simple script; all it
> does is
>
> 1. Read in a data file, one line at a time
> 2. Parse the line to initialize a model object
> 3. Save the object
> 4. Lather. Rinse. Repeat.
>
> The code looks like this:
>
>     file = open('data.txt', 'r')
>
>     for line in file:
>         tokens = line.split('\t')
>         zipcode = Zipcode(zip_code=tokens[1],
>                           latitude=tokens[2],
>                           longitude=tokens[3],
>                           city=tokens[4],
>                           state_or_province=tokens[5],
>                           county_or_prefect=tokens[6])
>         zipcode.save()
>
> As it's a verrrrry long data file (800,000+ records), this little
> memory leak eventually gobbles all my memory.
>
> I know garbage collection is supposed to be automatic... but is there
> any way I can force it every once in a while? And can anyone see some
> obvious no-no I'm doing below that would cause this, any way to delete
> something somehow & fix the problem?
>
> Thanks in advance for any advice,
> bill

--
Ned Batchelder, http://nedbatchelder.com

You received this message because you are subscribed to the Google Groups "Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at http://groups.google.com/group/django-users?hl=en
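On the quoted question about forcing garbage collection: Python does expose gc.collect() for that, but a collection only reclaims objects that are unreachable. Anything still referenced, such as entries held in a debug query log, survives every collection, so forcing GC would not fix a leak of this kind; dropping the references is what frees the memory. A small illustration (the list here is just a stand-in for any log that keeps references alive):

```python
import gc

# A stand-in for a log that keeps references to objects alive.
log = [object() for _ in range(3)]

gc.collect()          # forcing a collection is allowed at any time...
assert len(log) == 3  # ...but referenced objects are NOT freed

log.clear()           # dropping the references is what frees memory
gc.collect()
print(len(log))
```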