Has anyone else ever been surprised by the number of duplicate database queries when looking at the Django Debug Toolbar output?
In my application, we have models for Users, Profiles and Companies. Many of our methods take a user as a parameter and then look up the related profile and/or company, so it's very common for a single page request to end up performing the same profile or company query a dozen times. Of course, the code could be refactored to pass these objects around as needed instead of looking them up again; in fact, I have already started using a middleware that populates them as attributes on the request object (a rough sketch is at the end of this message).

But it got me thinking: is this a common pattern among Django apps? If so, I wanted to propose a solution. What if, by default, querysets were cached for the duration of the request? For example, you could perform User.objects.get(id=123) twice inside a single request/response cycle, and the second time the object would be retrieved from a per-request in-memory cache. Obviously this could break existing code, so it would need to be optional, but I would think that in almost all cases web applications would behave consistently with this enabled. In the cases where you really do want a fresh query, there could be a new chained queryset method like .refresh() or .live(). A toy illustration of the behaviour follows the middleware sketch below.

I'm interested in other people's estimates of how many database queries this would save them on their real-world projects. I'm estimating 20% for mine.
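For reference, the middleware I mentioned looks roughly like the sketch below. The class name and the accessor names ("profile" on the user, "company" on the profile) are just placeholders for this example; adjust them for your own models. I wrap the lookups in SimpleLazyObject so each query runs at most once per request, and only if a view actually touches the attribute.

    # Old-style process_request middleware; on Django 1.10+ you would wrap it
    # with django.utils.deprecation.MiddlewareMixin or use the new callable style.
    from django.utils.functional import SimpleLazyObject


    def _get_profile(request):
        # Assumes a one-to-one relation from the user, accessible as user.profile.
        if request.user.is_authenticated():   # a property, not a method, on newer Django
            return request.user.profile
        return None


    def _get_company(request):
        # Assumes the company hangs off the profile as profile.company.
        profile = request.profile
        return profile.company if profile else None


    class RelatedObjectsMiddleware(object):
        """Attach the current user's profile and company to the request lazily,
        so the related queries run at most once and only when needed."""

        def process_request(self, request):
            request.profile = SimpleLazyObject(lambda: _get_profile(request))
            request.company = SimpleLazyObject(lambda: _get_company(request))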
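And here is a toy illustration of the per-request caching behaviour I'm proposing, done as a custom manager rather than a change to the ORM. The names (RequestCacheMiddleware, CachedGetManager, get_cached, get_fresh) are made up for the example; in the actual proposal this would just be the default behaviour of .get(), with .refresh()/.live() as the opt-out that get_fresh() stands in for here. It assumes the lookup values are hashable (ids, slugs, etc.).

    import threading

    from django.db import models

    _request_cache = threading.local()


    class RequestCacheMiddleware(object):
        """Start each request with an empty per-request cache."""

        def process_request(self, request):
            _request_cache.store = {}


    class CachedGetManager(models.Manager):
        """Memoises .get() lookups for the lifetime of the current request."""

        def get_cached(self, **kwargs):
            store = getattr(_request_cache, 'store', None)
            if store is None:
                # Outside a request (shell, management command): no caching.
                return self.get(**kwargs)
            key = (self.model, tuple(sorted(kwargs.items())))
            if key not in store:
                store[key] = self.get(**kwargs)
            return store[key]

        def get_fresh(self, **kwargs):
            # The ".refresh()/.live()" escape hatch: always hit the database.
            return self.get(**kwargs)

With objects = CachedGetManager() on a model, the second SomeModel.objects.get_cached(pk=123) inside the same request comes straight back from memory, which is exactly the saving I'm estimating at around 20% for us.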