There isn't much difference.  The actual database query occurs once you
enumerate, slice, or iterate the queryset, and it will pull in all
matching objects in order to save you additional queries.

Why not iterate in batches by slicing the queryset?  That way, if you
set your step to, say, 100, you'll have at most 100 records in memory
at a time.  It may not work so well if records are added or deleted
while you're processing, though.

count = SomeModel.objects.all().count()

step = 100
for i in range(0, count, step):
    for o in SomeModel.objects.all()[i:i + step]:
        # do your stuff
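The slice arithmetic above can be sketched in plain Python, independent
of the ORM — the list here is a stand-in for the queryset, and slicing
it mimics queryset slicing (which Django turns into LIMIT/OFFSET):

```python
# Plain-Python sketch of the batching loop; the list stands in for
# the queryset. Names (records, step, batches) are illustrative.
records = list(range(250))  # pretend these are 250 model instances
step = 100

batches = []
for i in range(0, len(records), step):
    batch = records[i:i + step]  # at most `step` records held at once
    batches.append(batch)

# 250 records with step 100 -> batch sizes 100, 100, 50
print([len(b) for b in batches])
```

Note the last batch is simply shorter — slicing past the end is safe,
so there's no special case needed for the final partial batch.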


You could also run a query that selects all ids using .values(), then
either iterate over it and use .get() to fetch each object
individually, or filter with or'd Q objects to fetch batches.

value_fetch = SomeModel.objects.all().values("id")
for row in value_fetch:
    o = SomeModel.objects.get(pk=row['id'])
    # do your stuff
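For the or'd-Q-objects variant, the id list from .values() can be
chunked, with each chunk turned into a single batch query.  Here's a
plain-Python sketch of the chunking (the Q-object part is shown as
comments since it needs a real Django model; `chunked` is a helper name
I've made up):

```python
def chunked(ids, size):
    """Split a list of ids into chunks of at most `size` items."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]

ids = [1, 2, 3, 4, 5, 6, 7]
for chunk in chunked(ids, 3):
    # In Django you'd build one OR'd filter per chunk, roughly:
    #   from django.db.models import Q
    #   from functools import reduce
    #   import operator
    #   q = reduce(operator.or_, (Q(pk=pk) for pk in chunk))
    #   for o in SomeModel.objects.filter(q):
    #       # do your stuff
    print(chunk)
```

That gives you one query per chunk instead of one .get() per id, at
the cost of building the Q filter by hand.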

