Hello,
We have a directory on Red Hat 6.2 containing 500,000 files. In our code we
open and read the directory, and for each entry we call lstat() to check some
information. The whole scan takes more than eight hours, which is terribly
long.
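Roughly, the scan loop looks like the sketch below (simplified; the actual
checks on the stat fields are omitted):

    #include <dirent.h>
    #include <limits.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/stat.h>

    int main(int argc, char **argv)
    {
        const char *dirpath = (argc > 1) ? argv[1] : ".";
        char path[PATH_MAX];
        struct dirent *ent;
        struct stat st;
        DIR *dir;

        dir = opendir(dirpath);
        if (dir == NULL) {
            perror("opendir");
            return 1;
        }

        while ((ent = readdir(dir)) != NULL) {
            /* skip the "." and ".." entries */
            if (strcmp(ent->d_name, ".") == 0 ||
                strcmp(ent->d_name, "..") == 0)
                continue;

            /* build the full path and lstat() the entry */
            snprintf(path, sizeof(path), "%s/%s", dirpath, ent->d_name);
            if (lstat(path, &st) == -1) {
                perror("lstat");
                continue;
            }

            /* ... examine the fields we need here (st_size, st_mtime, ...) */
        }

        closedir(dir);
        return 0;
    }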
Is there any way to reduce this time? If not, is there any official
documentation about this limitation, and where can we find it?
Thank you!
Min Yuan
VytalNet, Inc. (905) 844-4453 Ext. 241