On Sat, 2006-06-17 at 04:52 +0000, [EMAIL PROTECTED] wrote:
> Hi all,
>
> I've been writing some code that populates our mysql db's through
> django. I have about 55 MB of pickled data (translating to about 80k
> new table entries and 78k new users).
>
> The problem is that python keeps allocating more and more memory ---
> eventually eating up over 1.1 GB at which point the whole process
> starts paging and crawls to a halt.
[...]
> The interesting thing is that if i run the view a second time, python's
> allocation doesn't grow again, which makes me think that django is
> caching the objects somewhere.

If you have DEBUG = True in your settings file (which you probably do),
Django saves a copy of every SQL query in db.connection.queries. That is
not the case when DEBUG = False.

The list is reset when each new request starts, although the memory won't
be returned to the system immediately: Python does some memory caching of
its own, and even free() typically doesn't produce an immediately
observable decrease in a process's memory usage.

If you want to clear the list manually at any point inside your
functions, you can just do connection.queries = [] or call
db.reset_queries().

Regards,
Malcolm
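To make the behaviour concrete, here is a minimal, self-contained sketch of what the query log does in debug mode. The Connection class below is a hypothetical stand-in for django.db.connection, not Django's real implementation; it only mimics the part that matters here, namely that every executed query is appended to a list while debug is on:

```python
class Connection:
    """Toy stand-in for django.db.connection (illustration only).

    When debug is True, every executed query is recorded in
    self.queries, so a long-running bulk import accumulates one
    dict per INSERT and memory grows with the number of queries.
    """

    def __init__(self, debug):
        self.debug = debug
        self.queries = []

    def execute(self, sql):
        if self.debug:
            # Django stores each query as a dict; the real one also
            # records timing information.
            self.queries.append({"sql": sql})
        # ... the real connection would run the SQL here ...


conn = Connection(debug=True)
for i in range(80000):  # roughly the 80k inserts from the report
    conn.execute("INSERT INTO app_entry (id) VALUES (%s)" % i)
print(len(conn.queries))  # -> 80000: one log entry per insert

# The manual reset suggested above drops all of those entries at once:
conn.queries = []
print(len(conn.queries))  # -> 0
```

With DEBUG = False the queries list simply never grows, which is why the problem disappears in production-style settings.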