> Min Yuan wrote:
>
> Hello,
>
> We have a directory on Red Hat 6.2 with 500,000 files. In our code we open and
> read the directory, and for each entry we use lstat() to check for some
> information. The whole scan takes more than eight hours, which is terribly
> long.
>
> Is there any way we could reduce this time? If the answer is no, is there
> any official documentation on this, and where can we find it?
I ran into a similar problem 10 years ago, under SVR3, in a system that saved
incoming network messages as files.
The solution, as somebody else already pointed out, is to create your own
directory-scanning tools. In my case, I wrote my own version of "dirent",
the tool SVR3 shipped for directory scanning and sorting. The key was to use
qsort() to order the files instead of the default O(n^2)-or-worse algorithm.
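Something along these lines (a minimal sketch, not my original code: it
assumes POSIX opendir()/readdir(), error handling is abbreviated, and the
buffer sizes are illustrative):

/* Read all entries into an array, sort once with qsort() (O(n log n)),
 * then lstat() each entry. */
#include <dirent.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/stat.h>

static int cmpname(const void *a, const void *b)
{
    return strcmp(*(char *const *)a, *(char *const *)b);
}

int scan_dir(const char *path)
{
    DIR *d = opendir(path);
    if (!d)
        return -1;

    char **names = NULL;
    size_t n = 0, cap = 0;
    struct dirent *e;

    while ((e = readdir(d)) != NULL) {
        if (strcmp(e->d_name, ".") == 0 || strcmp(e->d_name, "..") == 0)
            continue;                      /* skip . and .. */
        if (n == cap) {                    /* grow the name array */
            cap = cap ? cap * 2 : 1024;
            names = realloc(names, cap * sizeof *names);
        }
        names[n++] = strdup(e->d_name);
    }
    closedir(d);

    qsort(names, n, sizeof *names, cmpname);

    for (size_t i = 0; i < n; i++) {
        char full[4096];
        struct stat st;
        snprintf(full, sizeof full, "%s/%s", path, names[i]);
        if (lstat(full, &st) == 0) {
            /* ... inspect st here ... */
        }
        free(names[i]);
    }
    free(names);
    return 0;
}

The point is to read the directory exactly once, sort the names in memory,
and only then stat each file.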
Another approach, dirtier but easier, would be to split the directory into
about 1000 subdirectories, using some kind of hash of the file name to
rename each file into its subdirectory.
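For example (an illustrative sketch; the hash function, the file name, and
the 1000-bucket count are arbitrary, any function that spreads the names
evenly will do):

/* Pick one of 1000 subdirectories ("000" .. "999") from a simple
 * string hash of the file name. */
#include <stdio.h>

static unsigned bucket(const char *name)
{
    unsigned h = 5381;                     /* djb2-style string hash */
    while (*name)
        h = h * 33 + (unsigned char)*name++;
    return h % 1000;                       /* 1000 buckets */
}

int main(void)
{
    /* "msg-000123.dat" is a hypothetical file name */
    printf("store in %03u/\n", bucket("msg-000123.dat"));
    return 0;
}

With 500,000 files spread over 1000 subdirectories you end up with roughly
500 entries per directory, which keeps each individual scan cheap.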
--
Diego Pons
Pharos Consulting LLC
[EMAIL PROTECTED]
Los Angeles, CA