Hi, so far I have been using one Celery instance per project and it seems
to work OK. I have Fabric setup scripts to start and reload each project's
Celery worker (via manage.py celeryd); a rough sketch of that is below,
followed by some notes on backends.
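Something along these lines (a minimal sketch only, not my exact script:
the host, paths and celeryd flags are placeholders and vary a bit between
Celery versions):

    # fabfile.py -- sketch of per-project start/reload tasks (Fabric 1.x)
    from fabric.api import cd, env, run

    env.hosts = ['app.example.com']   # placeholder host
    PROJECT_DIR = '/srv/myproject'    # placeholder project path

    START = ('python manage.py celeryd --detach '
             '--pidfile=celeryd.pid --logfile=celeryd.log')

    def start_celeryd():
        """Start the project's celery worker in the background."""
        with cd(PROJECT_DIR):
            run(START)

    def reload_celeryd():
        """Kill the running worker via its pidfile, then start it again."""
        with cd(PROJECT_DIR):
            run('test -f celeryd.pid && kill $(cat celeryd.pid) || true')
            run(START)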

* A shared Redis instance does not seem to work well unless you use
different db numbers per project (see the settings sketch after this
list).
* I have not tested multiple Celery instances against a single RabbitMQ,
but it should be possible with some routing configuration (e.g. a
separate vhost or queue per project).
* The database backend works OK and is easy to set up per project (no
extra setup or daemon to run apart from celeryd), but it is the least
performant.
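
For the shared Redis case, the per-project settings look roughly like this
(a sketch only; setting names differ a bit between Celery/django-celery
versions, and localhost is a placeholder):

    # settings.py for project A -- uses Redis db 0
    BROKER_URL = 'redis://localhost:6379/0'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

    # settings.py for project B -- same Redis server, different db number,
    # so the two projects' queues and results stay separate.
    BROKER_URL = 'redis://localhost:6379/1'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'

    # The per-project database result backend (django-celery) is just:
    # CELERY_RESULT_BACKEND = 'database'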

If all your projects share tasks, you may even run celeryd on its own and
configure all the apps to send tasks to it. That works OK: I have one
celeryd instance accepting shared tasks from many apps using the Redis
backend (RabbitMQ should work as well), and it seems to scale fine. A
sketch of such a shared task module is below.
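
Roughly like this (a minimal sketch; the module name and task body are
made up, and it assumes the old celery.task decorator from Celery 2.x/3.x):

    # shared_tasks.py -- task module imported by all the apps and by the
    # dedicated celeryd instance; everything must point at the same broker
    # and import the task under the same module path.
    from celery.task import task

    @task
    def notify(user_id, message):
        # Real work happens on the dedicated celeryd instance.
        print('notifying user %s: %s' % (user_id, message))

Each app then just calls notify.delay(user_id, message); as long as all the
settings point at the same Redis (or RabbitMQ) broker, the single worker
picks the tasks up.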

If you are managing multiple Celery instances you may want to take a look
at supervisord:

http://supervisord.org/
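
A supervisord config for two per-project workers could look like this
(a sketch; program names, paths and the virtualenv layout are placeholders):

    ; supervisord.conf
    [program:celeryd-projecta]
    command=/srv/projecta/env/bin/python /srv/projecta/manage.py celeryd --loglevel=INFO
    directory=/srv/projecta
    autostart=true
    autorestart=true
    stopwaitsecs=30

    [program:celeryd-projectb]
    command=/srv/projectb/env/bin/python /srv/projectb/manage.py celeryd --loglevel=INFO
    directory=/srv/projectb
    autostart=true
    autorestart=true
    stopwaitsecs=30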

Regards,
Carlos Daniel Ruvalcaba Valenzuela
