Or, if the issue is at least partly due to buffering for efficiency in
communication between Django and the database engine, is there a way
to choose smaller buffers?
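
For what it's worth, here is a minimal sketch of the behaviour I'm
hoping for, assuming QuerySet.iterator() works as documented (it
returns results without populating the queryset's internal cache,
though the database adapter may still buffer the full result set on
the client side, which is the part I'm asking about above):

sumValues = 0
# iterator() asks Django not to cache the model instances in the
# queryset, so each one can be garbage-collected after use.
for someModel in SomeModel.objects.all().iterator():
    sumValues += someModel.value
print sumValues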


On Dec 24, 12:42 pm, garyrob <gary...@mac.com> wrote:
> I am getting the impression that when I do a Django database query
> that iterates through all the rows of the table, Django stores every
> model instance in memory.
>
> For instance, just doing
>
> sumValues = 0
> for someModel in SomeModel.objects.all():
>     sumValues += someModel.value
> print sumValues
>
> can eat up gigabytes of memory if the table is large.
>
> But in this example, I don't have any need for the SomeModel instances
> that have already been processed once the value has been retrieved. So
> this is a huge amount of wasted memory.
>
> Is there any way to make Django not behave this way? I.e., when
> iterating through a query, I'd like to be able to tell Django
> to just retrieve a row at a time from the database, and release the
> memory that was used for previously-retrieved rows.
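
Coming back to the example above: if the only thing needed is the
sum, another option might be to push the whole calculation into the
database, assuming a Django version with aggregation support
(QuerySet.aggregate and django.db.models.Sum). Then only a single
number ever comes back to Python:

from django.db.models import Sum

# The database computes the sum itself; no model instances are
# created on the Python side at all.
total = SomeModel.objects.aggregate(total=Sum('value'))['total']
print total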