On Jan 4, 2007, at 9:55 AM, Robin Becker wrote:
> Is there a better way to handle this sort of thing using fastcgi or
> scgi? Can we get all requests from a particular host to be handled
> in only one group of processes? Our back end process doesn't
> respond well to being threaded, and it can take a long time to
> complete, so we seem to need a worker pool for each virtual host.
I'm running nginx + fcgi for Django, so the config below is in nginx's
format, but I'd assume lighttpd can do something similar.
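I haven't tested it, but from memory the lighttpd 1.4 equivalent of the
per-host pools shown below would look roughly like this (host name and
socket paths are just placeholders):

    server.modules += ( "mod_fastcgi" )

    $HTTP["host"] == "blah.com" {
        # two FastCGI backends for this host; lighttpd round-robins between them
        fastcgi.server = ( "/" => (
            ( "socket" => "/tmp/blah.com_1.sock", "check-local" => "disable" ),
            ( "socket" => "/tmp/blah.com_2.sock", "check-local" => "disable" )
        ))
    }

You'd repeat that conditional block for each virtual host with its own
set of sockets.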
Basically, in nginx, you can specify various upstream pools. For
example:
upstream blah {
    server unix:/tmp/blah.com_1.sock;
    server unix:/tmp/blah.com_2.sock;
}

upstream foo {
    server unix:/tmp/foo.com_1.sock;
    server unix:/tmp/foo.com_2.sock;
}
Then, in the site definitions:
server {
    listen 80;
    server_name blah.com;

    location / {
        #insert FCGI params stuff here
        ...
        ##
        fastcgi_pass blah;
    }
}

server {
    listen 80;
    server_name foo.com;

    location / {
        #insert FCGI params stuff here
        ...
        ##
        fastcgi_pass foo;
    }
}
The above tells nginx to use the pool defined in "upstream blah" for
requests to blah.com, load balancing between its entries; likewise,
foo.com uses "upstream foo".
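For reference, the "insert FCGI params stuff here" placeholder is
usually a block of standard fastcgi_param directives along these lines
(the exact set depends on your setup, so treat this as a sketch):

    # request metadata passed through to the flup/Django backend
    fastcgi_param PATH_INFO $fastcgi_script_name;
    fastcgi_param REQUEST_METHOD $request_method;
    fastcgi_param QUERY_STRING $query_string;
    fastcgi_param CONTENT_TYPE $content_type;
    fastcgi_param CONTENT_LENGTH $content_length;
    fastcgi_param SERVER_PROTOCOL $server_protocol;
    fastcgi_param SERVER_NAME $server_name;
    fastcgi_param SERVER_PORT $server_port;
    fastcgi_param REMOTE_ADDR $remote_addr;
    fastcgi_pass_header Authorization;
    fastcgi_intercept_errors off;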
I'm using the django + flup FastCGI setup described on the official
site, but in theory you could use Apache as well, I suppose.
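In case it helps, the flup backends behind those sockets can be started
with Django's runfcgi management command, something like the following
(socket and pidfile paths are placeholders; tune the worker options to
taste):

    # one prefork worker pool per socket listed in the upstream block
    ./manage.py runfcgi method=prefork socket=/tmp/blah.com_1.sock pidfile=/var/run/blah_1.pid
    ./manage.py runfcgi method=prefork socket=/tmp/blah.com_2.sock pidfile=/var/run/blah_2.pid

Since each virtual host gets its own pools this way, one host's slow
requests can't tie up another host's workers, which was the original
concern.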
---
David Zhou
[EMAIL PROTECTED]