On Nov 19, 8:53 am, Alessandro Ronchi <alessandro.ron...@soasi.com>
wrote:
> On Wed, Nov 18, 2009 at 10:29 PM, Graham Dumpleton <
> graham.dumple...@gmail.com> wrote:
>
> > Yes, don't be so concerned in the first instance of trying to squeeze
> > the most out of mod_wsgi configuration. You will get much more
> > dramatic gains by improving your database queries/performance and page/
> > data caching techniques.
>
> > Placing a nginx front end proxy in front of Apache/mod_wsgi to handle
> > static files also has benefits beyond just static file sharing.
>
> > So, what have you done in those areas to improve performance?
>
> > In other words, you should perhaps be asking about how to improve your
> > Django application performance rather than worrying about the hosting
> > system, as the hosting system is nearly always not going to be the
> > bottle neck in all of this.
>
> I'm using memcached and I think it's quite well optimized.
> nginx is good advice, but if possible I don't want to add another
> webserver. So I want to get the most out of Apache where I can.

Theoretically nginx will make Apache/mod_wsgi work better for you.

In addition to offloading static file serving, the nginx front end
will also isolate you from slow browser clients. Provided the request
content size is under nginx's default limit of 1MB (I think), nginx
will buffer up the request headers and request content and only
forward the request through to Apache/mod_wsgi once it has everything.
This means the request is only handed off to Apache/mod_wsgi when all
the information is available, so it can be processed immediately.
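As a rough sketch of what such a front end looks like (server names,
paths and the backend port are hypothetical; the 1MB figure matches
nginx's default `client_max_body_size` of `1m`):

```nginx
# nginx front end proxying dynamic requests to Apache/mod_wsgi.
# Server name, paths and backend port are illustrative only.
server {
    listen 80;
    server_name example.com;

    # Serve static files directly, bypassing Apache entirely.
    location /static/ {
        alias /srv/mysite/static/;
    }

    # Everything else is buffered by nginx and then forwarded
    # to the Apache/mod_wsgi backend in one go.
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        # Default request body limit referred to above; raise it
        # if uploads larger than 1MB are expected.
        client_max_body_size 1m;
    }
}
```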

Doing this means that Apache will not be tied up dealing with slow
clients, so it can get away with fewer processes/threads to handle
the same number of requests.

One gets similar benefits on the response side as well: the buffering
implicit in the socket pipeline and within nginx means that
Apache/mod_wsgi can offload the response quickly and close its
connection. By finishing sooner, it can use that thread immediately
for another request, rather than having to wait for a slow client to
read the prior response.

Further, the nginx proxy module only speaks HTTP/1.0 to the backend
and so doesn't use keep-alive. This means the connection between nginx
and Apache is dropped as soon as Apache has offloaded the response
content. Thus Apache also isn't left waiting around to see whether
another request might come down the same socket.

Using nginx as a front end therefore potentially enables Apache/
mod_wsgi to handle a higher load of concurrent requests, as Apache/
mod_wsgi will spend less time processing any one request. This should
factor into how you configure Apache and mod_wsgi.
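With nginx absorbing the slow clients, Apache can often be tuned with
a more modest process/thread count. A hedged example of mod_wsgi
daemon mode settings (the process group name, numbers and paths are
illustrative, not recommendations):

```apache
# Apache/mod_wsgi daemon mode behind an nginx front end.
# Values and paths are illustrative only; tune to your own load.
Listen 8080

WSGIDaemonProcess mysite processes=2 threads=15 display-name=%{GROUP}
WSGIProcessGroup mysite
WSGIScriptAlias / /srv/mysite/django.wsgi
```

Daemon mode keeps the application in a fixed pool of processes
separate from the Apache worker processes, which makes the
process/thread count easier to reason about when tuning.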

That all said, it is still important to get the request times handled
by your application down as much as possible. This is where caching
and database optimisation are important. The pages with the most hits
deserve extra special attention, perhaps even with caching external to
the application, so that the application doesn't get involved at all
unless the page content has expired.
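To illustrate the expiry idea, here is a generic sketch of a TTL
cache sitting in front of expensive page generation (this is not
Django's or memcached's actual API; all names are made up for the
example):

```python
import time

# Generic TTL cache sketch: serve a stored copy of a page until it
# expires, so the expensive generation code only runs on a miss.
_cache = {}  # key -> (expiry_timestamp, content)

def cached(key, ttl, generate):
    """Return cached content for key, regenerating it after ttl seconds."""
    now = time.time()
    entry = _cache.get(key)
    if entry is not None and entry[0] > now:
        return entry[1]                 # cache hit: skip generation
    content = generate()                # cache miss: do the expensive work
    _cache[key] = (now + ttl, content)
    return content

calls = []

def render_front_page():
    calls.append(1)                     # count how often we really render
    return "<html>front page</html>"

page1 = cached("front", 60, render_front_page)
page2 = cached("front", 60, render_front_page)  # served from cache
```

The same principle applies one level further out: a front-end cache
keyed on the URL can answer the request before it ever reaches the
application.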

Anyway, we were having a discussion about optimising mod_wsgi settings
on the mod_wsgi list a while back. That has gone off the boil at the
moment, but it will be resurrected later, after we have come up with
some better middleware to help monitor process/thread utilisation for
the purpose of tuning settings.

Graham

--

You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-us...@googlegroups.com.
To unsubscribe from this group, send email to 
django-users+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=.

