Hello all,
I have a small project: distributing a single 30 MB file to 10,000 users. I
have built a small PHP site that gathers user registration information and
generates a link to that file; users then download the file from that link.
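The download page itself is nothing fancy; it does roughly the following
(simplified sketch, the real path and filename are different):

<?php
// Registration details have already been validated and stored at this point.
// (Sketch only -- the actual path and filename differ.)
$file = '/downloads/bigfile.zip';   // the 30 MB file, served directly by Apache
echo '<p>Thanks for registering!</p>';
echo '<p><a href="' . htmlspecialchars($file) . '">Download the file</a></p>';
?>

So PHP only prints the link; Apache itself serves the actual file.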
I was wondering if some of you could help me identify the bottlenecks I
might be facing. Are there particular points I should take care of? At the
moment I'm running Apache 2.0.53 with the prefork MPM, configured as follows:
StartServers 20
MinSpareServers 25
MaxSpareServers 50
MaxClients 256
MaxRequestsPerChild 0
Should I also increase ServerLimit?
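As far as I understand, MaxClients can't go above 256 unless ServerLimit is
raised as well, so is something along these lines the right approach? (Just a
sketch of what I had in mind, values picked more or less arbitrarily.)

# ServerLimit set first so MaxClients may exceed the default of 256
ServerLimit          512
MaxClients           512
StartServers          20
MinSpareServers       25
MaxSpareServers       50
MaxRequestsPerChild    0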
Should I tune HTTP/1.1 keep-alives, or disable them altogether? (The site has only one image.)
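That is, would it be better to keep persistent connections but with a short
timeout, something like this (again just a guess at reasonable values):

KeepAlive            On
MaxKeepAliveRequests 100
KeepAliveTimeout     2

or should I simply set "KeepAlive Off" and let every request open its own
connection?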
When I launch hundreds of connections against the site and watch top, I see
that roughly 98% of the Apache processes are sleeping, with only 2-3 running,
yet iptraf clearly shows that the whole bandwidth is being consumed. Am I
really only serving 2-3 clients? CPU and I/O usage are practically zero. Am I
missing something?
Any advice would be highly appreciated :)
Thanks
LoPo