On Thu, Sep 17, 2015 at 2:07 AM, Erik Cederstrand <erik+li...@cederstrand.dk> wrote:
>> On 16/09/2015 at 16.45, Mike Dewhirst <mi...@dewhirst.com.au> wrote:
>>
>> On 16/09/2015 9:53 AM, Erik Cederstrand wrote:
>>> issues because the prefetch query does something along the lines of
>>> "SELECT ... FROM lesson_subjects WHERE lesson_id IN
>>> [insane_list_of_lesson_ids]".
>>
>> I'm no expert so I'm wondering if len([insane_list_of_lesson_ids]) ==
>> "hundreds of thousands"?
>
> In my case, yes. I think the backend might process the query in chunks if
> the length of the SQL exceeds the max SQL query size.
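Splitting the id list into chunks, as Erik guesses the backend might do, is straightforward to do by hand as well. A minimal sketch using Python's sqlite3 module — the table and column names are taken from the quoted query, and the chunk size is a hypothetical stand-in for whatever the backend's real limit is:

```python
import sqlite3

CHUNK_SIZE = 999  # hypothetical; pick something under the backend's parameter limit

def fetch_in_chunks(conn, ids, chunk_size=CHUNK_SIZE):
    """Run SELECT ... WHERE lesson_id IN (...) in several smaller
    batches instead of one enormous IN (...) clause."""
    rows = []
    ids = list(ids)
    for start in range(0, len(ids), chunk_size):
        chunk = ids[start:start + chunk_size]
        placeholders = ",".join("?" * len(chunk))
        sql = ("SELECT lesson_id, subject FROM lesson_subjects "
               "WHERE lesson_id IN (%s)" % placeholders)
        rows.extend(conn.execute(sql, chunk).fetchall())
    return rows
```

Each batch stays well under any SQL-size or parameter-count limit, at the cost of several round trips instead of one.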
Just curious, which RDBMS are you using?

I remember that on MySQL the advice used to be to populate a temporary table and do a JOIN instead of a very big `xxx IN (yyyy)` clause. I'm not sure whether there's a hard limit, but anything over a few hundred items tends to do better with a JOIN than with IN (...).

With Oracle, there _is_ a hard limit, and not a very high one. I've hit it frequently when doing exploratory SQL in SQL Developer. For anything over a thousand items, it's safer to go the temporary-table route.

The only one where I haven't been able to hit any limit is PostgreSQL. There I routinely do many thousands of items and it works beautifully. I haven't personally tried "hundreds of thousands", but I'd guess that as long as the statement fits within the maximum textual representation, it will handle it.

-- 
Javier

-- 
You received this message because you are subscribed to the Google Groups "Django users" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/django-users/CAFkDaoSAwPF%2BuajdhGLLROSW8p%2BkfrOnKLN91DUujxH7BRFEQw%40mail.gmail.com.
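The temporary-table-plus-JOIN approach Javier describes can be sketched as follows. This again uses sqlite3 and the hypothetical lesson_subjects table purely for illustration; real MySQL or Oracle code would differ in the temporary-table DDL:

```python
import sqlite3

def fetch_via_temp_table(conn, ids):
    """Load the wanted ids into a temporary table, then JOIN against it
    instead of building one huge IN (...) clause."""
    conn.execute(
        "CREATE TEMPORARY TABLE IF NOT EXISTS wanted_ids (id INTEGER PRIMARY KEY)"
    )
    conn.execute("DELETE FROM wanted_ids")  # reuse the table across calls
    conn.executemany("INSERT INTO wanted_ids (id) VALUES (?)",
                     [(i,) for i in ids])
    sql = ("SELECT ls.lesson_id, ls.subject "
           "FROM lesson_subjects ls "
           "JOIN wanted_ids w ON ls.lesson_id = w.id")
    return conn.execute(sql).fetchall()
```

The INSERTs can themselves be batched, but no single statement ever has to carry the whole id list, which is what sidesteps both Oracle's 1000-expression IN-list limit and any maximum-query-size ceiling.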