primarily because writes to db are expensive, and reads are less so but do 
add load to an existing db.

besides, why not use the best tool for the job?  redis is much better at 
read/write IO than dbs and functions perfectly well as a key-value store.  
Rabbit is the gold standard for a high-availability queue.
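to make that concrete: if the task system here is Celery (an assumption on my part, and names like "myproject" are placeholders), pointing it at redis instead of the db is a one-line broker setting:

```python
# hypothetical Celery config -- "myproject" and the redis URLs are made up
from celery import Celery

app = Celery(
    "myproject",
    broker="redis://localhost:6379/0",   # redis absorbs the fast read/write IO
    backend="redis://localhost:6379/1",  # results stored as plain key-value pairs
)

# or, for the high-availability case, swap in Rabbit as the broker:
# broker="amqp://guest:guest@localhost:5672//"
```

either way the db never sees the queue traffic at all.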

not saying you shouldn't use the db, but when there are better options 
that don't add extra load to an existing db infra that's already under 
somewhat serious load, why not use them?

for example, my company regularly does full reindexes of all its content 
in a solr search engine.  that can encompass somewhere in the neighborhood 
of 250k tasks and take 2.5-3 days to execute.  Do you _really_ want 
to dump 250k records into your db in one go?  And another place I worked at 
indexed somewhere around 25M mirrored web pages, one page per task.  I 
definitely would not have wanted to dump all that into a db....
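and you don't even have to enqueue them one at a time.  a rough sketch of how I'd dispatch a reindex like that with Celery (again assuming Celery -- the task and variable names here are invented for illustration):

```python
# hypothetical sketch: fanning out a full reindex without creating one
# broker message (or db row) per document
from celery import Celery

app = Celery("search", broker="redis://localhost:6379/0")

@app.task
def reindex_doc(doc_id):
    """Push one document into solr (details elided)."""
    ...

def full_reindex(doc_ids):
    # chunks() groups the calls into batches: ~250k ids become
    # 250 messages of 1000 ids each, all held in redis, not the db
    reindex_doc.chunks(zip(doc_ids), 1000).apply_async()
```

the db stays out of the hot path entirely, which is the whole point.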
