Bryan R Harris wrote:
> I have a large directory tree that I'd like to build index files for,
> essentially an
>
> ls > index.txt
>
> in each directory in the tree. Obviously I'm having trouble figuring it
> out. =)
>
> I've tried the following:
>
> use File::Find;
> sub process_file {
>     if (-d) {
>         $tmp = `ls $_`;
>         open(OFILE, "> index.txt") || die("Couldn't open index.txt: $!\n");
You are opening the file in write mode every time, which clobbers the
previous contents: only the listing of the last subdirectory processed
survives. Note also that find() chdir()s into the directory containing
$_, so index.txt lands in the parent of each subdirectory; leaf
directories have no subdirectory entries of their own, which is why
the deepest level gets missed.
>
>         print OFILE $tmp;
>         close(OFILE);
>     }
> }
> find(\&process_file, @ARGV);
> print "\n";
>
> But it misses the deepest level of directories. Is there an established
> way of doing this kind of thing?
$File::Find::dir contains the directory currently being processed, and
$_ the current file. This should work for you:
#!/usr/local/bin/perl -w
use strict;
use File::Find;

find(\&process_file, @ARGV);

sub process_file {
    # Don't let the index files list themselves.
    return if $_ eq 'index.txt';
    open(INDEXFILE, ">> $File::Find::dir/index.txt")
        or die "Cannot open $File::Find::dir/index.txt : $!\n";
    print INDEXFILE "$_\n";    # one name per line, like ls
    close(INDEXFILE);
}
But that is a lot of opens and closes for a huge directory tree. You
can instead build a hash of arrays, with the directory name as the key
and an array of filenames as the value, and after find() is finished
dump each list into its respective index.txt in one pass (see the
sketch below).
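Something like this (an untested sketch): it skips dotfiles to mimic
plain ls, which also drops the '.' entry find reports for each start
directory, and skips index.txt itself so reruns don't index the index:

#!/usr/local/bin/perl -w
use strict;
use File::Find;

my %index;    # directory name => list of filenames found there

# One pass over the tree to collect the names.
find(sub {
    push @{ $index{$File::Find::dir} }, $_
        unless /^\./ or $_ eq 'index.txt';
}, @ARGV);

# One open per directory instead of one per file.
for my $dir (keys %index) {
    open(INDEXFILE, "> $dir/index.txt")
        or die "Cannot open $dir/index.txt : $!\n";
    print INDEXFILE "$_\n" for @{ $index{$dir} };
    close(INDEXFILE);
}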