Hi all, I've been writing some code that populates our MySQL databases through Django. I have about 55 MB of pickled data (translating to roughly 80k new table entries and 78k new users).
The problem is that Python keeps allocating more and more memory, eventually eating up over 1.1 GB, at which point the whole process starts paging and crawls to a halt. This is Python 2.4 and Django 0.95 (magic-removal) with MySQL 5.0.18 standard on OS X.

I ran a test to see what was going on. Here's my simple model:

    class Foo(models.Model):
        subject = models.CharField(maxlength=256)
        timestamp = models.DateTimeField()
        category = models.CharField(maxlength=32)
        owners = models.ManyToManyField(Owner)

        def __str__(self):
            return self.subject

and here's the simple view that runs the test:

    def leak(request):
        print "adding 10000 models"
        for i in range(10000):
            d = Foo(subject="test",
                    timestamp=datetime.datetime.now(),
                    category="test")
            d.save()
        return HttpResponse("leaked a bunch of mem")

In this test, Python grows by about 4 MB, which it never releases. The same thing happens even if I explicitly delete each object after saving:

    for i in range(10000):
        d = Foo(subject="test",
                timestamp=datetime.datetime.now(),
                category="test")
        d.save()
        del d   # <---------------- makes no difference

The interesting thing is that if I run the view a second time, Python's allocation doesn't grow again, which makes me think Django is caching the objects somewhere. If that's the case, is there any way to turn it off? Or does someone see a workaround?

Many thanks!

Andrew
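One thing I'm going to try next, in case the growth comes from Django's per-connection query log rather than the model instances themselves: with DEBUG = True, I believe the debug cursor appends a record of every executed SQL statement to django.db.connection.queries, and that list never gets cleared inside a long-running loop. This is a guess, not something I've confirmed, so treat the sketch below accordingly:

    from django.db import connection

    def leak(request):
        for i in range(10000):
            d = Foo(subject="test",
                    timestamp=datetime.datetime.now(),
                    category="test")
            d.save()
            # Guess: with DEBUG = True, every query is logged in
            # connection.queries; empty the list in place each
            # iteration so it can't grow without bound.
            connection.queries[:] = []
        return HttpResponse("hopefully leaked less mem")

If that list really is the culprit, simply setting DEBUG = False in settings.py ought to have the same effect without touching the loop at all.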