On Saturday, 03.02.2007, 16:36 +0100, Marc Santhoff wrote:
> On Friday, 02.02.2007, 08:52 -0800, Cox, Stuart TRAN:EX wrote:
> ...
> > Can anyone recommend a method to search a whole drive, of arbitrary
> > size, without running out of memory.
>
> From reading this thread I think you must have another problem, likely
> in TurboPower's code or in your own implementation.
>
> I've been doing the same (listing deep file systems) and never had any
> problems with memory. My classes are made mainly for indexing storage
> and backup media (CD, DVD, ...), and I tested them five minutes ago on
> an amount of:
>
> $ wc -l storage.txt
> 152811 storage.txt
>
> lines, each naming a file or directory. The list class in use is a
> derivation of "TFPList" (I think it's from "Classes") with a new
> sorting routine (qsort) attached.
>
> I had some problems with the "FindXxxx" implementation on *nix-like
> OSes, but that was about symlinks.
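For reference, the "FindXxxx" calls above are SysUtils.FindFirst/FindNext/FindClose. A minimal recursive walk along those lines (only a sketch, not my indexing classes) just counts and prints the entries, so it needs next to no memory no matter how deep the tree is:

{ Sketch: recursively list a directory tree with FindFirst/FindNext.
  Nothing is kept in memory apart from the recursion itself. }
program listtree;

{$mode objfpc}{$H+}

uses
  SysUtils;

procedure ListDir(const Path: string; var Count: LongInt);
var
  SR: TSearchRec;
begin
  if FindFirst(IncludeTrailingPathDelimiter(Path) + '*', faAnyFile, SR) = 0 then
  begin
    repeat
      if (SR.Name = '.') or (SR.Name = '..') then
        Continue;
      Inc(Count);
      WriteLn(IncludeTrailingPathDelimiter(Path) + SR.Name);
      if (SR.Attr and faDirectory) <> 0 then
        ListDir(IncludeTrailingPathDelimiter(Path) + SR.Name, Count);
    until FindNext(SR) <> 0;
    FindClose(SR);
  end;
end;

var
  Total: LongInt = 0;
begin
  ListDir(ParamStr(1), Total);
  WriteLn(Total, ' entries');
end.

On *nix you would additionally want to skip symlinked directories (e.g. check with fpLstat from BaseUnix), otherwise a link pointing back up the tree makes the recursion loop forever.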
Since you asked for a method, not the class type to use, I had a deeper
look at it: my implementation does not stick everything into one list
but uses a tree of nested lists, one TFPList derivative for each
directory at each level. For every single file handled, an item-class
object holding the file info (name, size, ...) is put into that
directory's container list.

Works nicely and fast ... only the recursion scheme is somewhat more
complex.

HTH,
Marc
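P.S.: A rough sketch of that scheme (class names like TFileItem and TDirList are made up for the example, not the original classes): one TFPList-derived container per directory, holding TFileItem objects for plain files and nested TDirList containers for the subdirectories:

program dirtree;

{$mode objfpc}{$H+}

uses
  Classes, SysUtils;

type
  TFileItem = class            { info object for a single file }
  public
    Name: string;
    Size: Int64;
    constructor Create(const AName: string; ASize: Int64);
  end;

  TDirList = class(TFPList)    { one container list per directory }
  public
    Path: string;
    destructor Destroy; override;
    procedure Scan(const APath: string);
  end;

constructor TFileItem.Create(const AName: string; ASize: Int64);
begin
  Name := AName;
  Size := ASize;
end;

destructor TDirList.Destroy;
var
  i: Integer;
begin
  { items are either TFileItem or nested TDirList objects }
  for i := 0 to Count - 1 do
    TObject(Items[i]).Free;
  inherited Destroy;
end;

procedure TDirList.Scan(const APath: string);
var
  SR: TSearchRec;
  Sub: TDirList;
begin
  Path := APath;
  if FindFirst(IncludeTrailingPathDelimiter(APath) + '*', faAnyFile, SR) = 0 then
  begin
    repeat
      if (SR.Name = '.') or (SR.Name = '..') then
        Continue;
      if (SR.Attr and faDirectory) <> 0 then
      begin
        Sub := TDirList.Create;   { nested container for the subdirectory }
        Add(Sub);
        Sub.Scan(IncludeTrailingPathDelimiter(APath) + SR.Name);
      end
      else
        Add(TFileItem.Create(SR.Name, SR.Size));
    until FindNext(SR) <> 0;
    FindClose(SR);
  end;
end;

var
  Root: TDirList;
begin
  Root := TDirList.Create;
  Root.Scan(ParamStr(1));
  WriteLn('top level entries: ', Root.Count);
  Root.Free;
end.

Each directory's entries stay in their own small list, so sorting or freeing one container never has to touch the rest of the tree.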