On 01/23/2013 04:26 PM, Chris Adams wrote:

> To expand: when clients browse to a directory, the webserver daemon has
> to generate a directory listing (usually sorted, which means the daemon
> has to retrieve the whole directory into memory, sort it, and then
> generate the HTML to send to the client).  With large directories, that
> blocks that webserver process/thread for a noticeable time.  A relatively
> small number of requests can slow down and/or block other access.

Only if the filesystem really sucks.
Sorting and generation are done (or rather: must be done) in an instant;
we are talking about 25,000 stupid strings.
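
As a rough sanity check (a sketch, not a proper benchmark; the filename
length and count are made up to match the scale discussed), sorting 25,000
synthetic names is easy to time:

```python
import random
import string
import time

# Build 25,000 random 40-character "filenames", roughly the size of the
# directory in question.
names = [
    "".join(random.choices(string.ascii_lowercase + string.digits, k=40))
    for _ in range(25_000)
]

start = time.perf_counter()
listing = sorted(names)
elapsed = time.perf_counter() - start

print(f"sorted {len(listing)} names in {elapsed * 1000:.1f} ms")
```

On any vaguely modern machine this finishes in a few milliseconds, which is
the point: the sort itself cannot be the bottleneck.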

If I test

wget 
http://fedora.inode.at/fedora/linux/releases/16/Everything/x86_64/os/Packages/

which is a unified package dir, I'm not able to see any delay between
the "connect" and the "saving" phase.
And if 100 users are hitting the big dir, you have filesystem caching
helping you a lot.
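
The same experiment can be reproduced locally without hammering a public
mirror. The sketch below (assuming Python 3.7+ for the `directory` argument
to `SimpleHTTPRequestHandler`, and using made-up `pkg-*.rpm` filenames)
serves a throwaway directory of 25,000 empty files with the standard-library
HTTP server, which generates a sorted HTML listing, and times one request:

```python
import http.server
import os
import tempfile
import threading
import time
import urllib.request
from functools import partial

# A throwaway directory with 25,000 entries, standing in for a package dir.
tmp = tempfile.mkdtemp()
for i in range(25_000):
    open(os.path.join(tmp, f"pkg-{i:05d}.rpm"), "w").close()

# Serve that directory on an ephemeral port; the handler sorts the entries
# and renders the HTML listing on each request.
handler = partial(http.server.SimpleHTTPRequestHandler, directory=tmp)
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
start = time.perf_counter()
body = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read()
elapsed = time.perf_counter() - start

print(f"listing of 25,000 entries served in {elapsed * 1000:.0f} ms, "
      f"{len(body)} bytes")
server.shutdown()
```

The listing comes back in well under a second, and repeated requests get
faster still once the directory entries sit in the page cache.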

Best regards.
-- 
   Roberto Ragusa    mail at robertoragusa.it