----- Original Message -----
Sent: Tuesday, April 24, 2001 11:09 AM
Subject: Re: Speed problem in scanning a directory with 500,000 files
Hello,
We have a directory on Red Hat 6.2 with 500,000 files. In our code we open and read the directory, and for each entry we use lstat() to check for some information. The whole scan takes more than eight hours, which is terribly long.
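(A minimal sketch of that pattern, assuming the usual opendir()/readdir()/lstat() loop; what is actually done with the stat information is omitted here:)

#include <stdio.h>
#include <string.h>
#include <dirent.h>
#include <sys/stat.h>

/* Scan a directory, lstat()ing every entry.  Each lstat() triggers
 * a name lookup in the directory on the kernel side. */
int scan_dir(const char *dir)
{
    char path[4096];
    struct dirent *de;
    struct stat st;
    DIR *d = opendir(dir);

    if (d == NULL)
        return -1;

    while ((de = readdir(d)) != NULL) {
        if (strcmp(de->d_name, ".") == 0 || strcmp(de->d_name, "..") == 0)
            continue;
        snprintf(path, sizeof path, "%s/%s", dir, de->d_name);
        if (lstat(path, &st) == 0) {
            /* ... check whatever information is needed, e.g. st.st_size ... */
        }
    }
    closedir(d);
    return 0;
}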
Is there any way we could reduce this time? If the answer is no, are there any official documents about this limitation, and where can we find them?
Yes. Stop putting so many files into a single directory.
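(The usual way to do that is to fan the files out into a tree of small subdirectories keyed on a hash of the filename, so that no single directory grows beyond a few thousand entries. A sketch; the 64x64 fan-out and the hash function are illustrative choices, not anything from this thread:)

#include <stdio.h>

/* Illustrative only: spread files over a two-level tree of
 * subdirectories so each directory stays small. */
static unsigned hash_name(const char *name)
{
    unsigned h = 5381;
    while (*name)
        h = h * 33 + (unsigned char)*name++;
    return h;
}

/* Map "foo.dat" to something like "base/13/47/foo.dat". */
void hashed_path(char *buf, size_t len, const char *base, const char *name)
{
    unsigned h = hash_name(name);
    snprintf(buf, len, "%s/%02u/%02u/%s", base, (h >> 6) % 64, h % 64, name);
}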
Besides this, is there no other way? That is not the right solution for us, because we are developing an application which needs to handle large directories on client sites, and these 500,000 files have to be put in a single directory.
Directory search time becomes linear in the number of entries once the size exceeds any directory name cache capacity. Repeated directory searches then become quadratic in the number of entries. 500,000 ^ 2 isn't a small number ...
-- Julie.
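(To put numbers on that: each of the 500,000 lstat() calls scans on average ~250,000 directory entries, roughly 1.25e11 comparisons in total. If the single-directory layout really cannot change, one partial mitigation, not suggested in the thread and unable to remove that linear per-lookup scan, is to read all the names first and lstat() them sorted by inode number, which keeps the inode-table reads roughly sequential on ext2. A sketch under those assumptions:)

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <dirent.h>
#include <sys/types.h>
#include <sys/stat.h>

struct ent { ino_t ino; char name[256]; };

static int by_ino(const void *a, const void *b)
{
    ino_t x = ((const struct ent *)a)->ino;
    ino_t y = ((const struct ent *)b)->ino;
    return (x > y) - (x < y);
}

/* Illustrative only: read every entry first, sort by inode number,
 * then lstat() in that order so inode-table accesses are roughly
 * sequential on disk. */
int scan_sorted(const char *dir)
{
    DIR *d = opendir(dir);
    struct ent *v = NULL;
    size_t n = 0, cap = 0;
    struct dirent *de;
    char path[4096];
    struct stat st;

    if (d == NULL)
        return -1;
    while ((de = readdir(d)) != NULL) {
        if (strcmp(de->d_name, ".") == 0 || strcmp(de->d_name, "..") == 0)
            continue;
        if (n == cap) {
            void *nv = realloc(v, (cap = cap ? cap * 2 : 1024) * sizeof *v);
            if (nv == NULL) { free(v); closedir(d); return -1; }
            v = nv;
        }
        v[n].ino = de->d_ino;
        snprintf(v[n].name, sizeof v[n].name, "%s", de->d_name);
        n++;
    }
    closedir(d);

    qsort(v, n, sizeof *v, by_ino);
    for (size_t i = 0; i < n; i++) {
        snprintf(path, sizeof path, "%s/%s", dir, v[i].name);
        if (lstat(path, &st) == 0) {
            /* ... use the stat information ... */
        }
    }
    free(v);
    return 0;
}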