Create your own hash table that indexes the directory for faster searches.
Paul Anderson

Min Yuan wrote:
> Hello,
>
> We have a directory on Red Hat 6.2 with 500,000 files. In our code we
> open and read the directory, and for each entry we use lstat() to check
> for some information. The whole scan takes more than eight hours, which
> is terribly long. Is there any way we could reduce this time? If the
> answer is no, is there any official documentation about this, and where
> can we find it?
>
> Thank you!
>
> Min Yuan
- Speed problem in scanning a directory with 500,000 files Min Yuan
- Re: Speed problem in scanning a directory with 500,00... Paul Anderson
- Re: Speed problem in scanning a directory with 500,00... Werner Puschitz
- Re: Speed problem in scanning a directory with 500,00... Thilo Mezger
- Re: Speed problem in scanning a directory with 500,00... Vilius Puidokas
- Re: Speed problem in scanning a directory with 500,00... Diego Pons
- Re: Speed problem in scanning a directory with 500,00... Julie
- Re: Speed problem in scanning a directory with 500,00... Min Yuan
- Re: Speed problem in scanning a directory with 500,00... Dan Kegel
- Re: Speed problem in scanning a directory with 500,00... Robert Soros