Try something like this. I just tried it on 60+ files and it seems to
work. (If your workbooks contain multiple worksheets, you may need to
modify the cell_handler to exclude the ones you don't want.)
use strict;
use warnings;
use Spreadsheet::ParseExcel;
use File::Basename;

my $resultMessage = '';
my $maxrow        = 0;
my $oBook;

opendir( DIR, "./files" ) or die "could not open ./files: $!";
my @names = readdir(DIR);
closedir(DIR);

foreach my $file (@names) {
    next unless $file =~ /\.xls$/;    # escape the dot: match a literal ".xls"
    print "Trying file.. $file\n";
    $maxrow = 0;

    # NotSetCell => 1 tells the parser not to keep cells in memory;
    # cell_handler is called once per cell instead.
    my $excel = Spreadsheet::ParseExcel->new(
        CellHandler => \&cell_handler,
        NotSetCell  => 1,
    );
    $oBook = $excel->Parse("./files/$file") or die "$file: $!";
    $resultMessage .= sprintf( "%s,%s\n", basename($file), $maxrow );
}
print $resultMessage . "\n";

sub cell_handler {
    my $row = $_[2];    # args are ($workbook, $sheet_index, $row, $col, $cell)
    $row++;             # rows are zero-based
    $maxrow = $row if $row > $maxrow;
}
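For the multiple-worksheet case mentioned above, the handler can filter on the sheet index it is given. A minimal sketch, assuming you only want the first sheet (the index test, and the choice of sheet 0, are assumptions -- adjust for your files):

```perl
use strict;
use warnings;

my $maxrow = 0;

# CellHandler receives ($workbook, $sheet_index, $row, $col, $cell).
sub cell_handler {
    my ( $workbook, $sheet_index, $row, $col, $cell ) = @_;
    return unless $sheet_index == 0;    # skip every sheet but the first
    my $rows = $row + 1;                # rows are zero-based
    $maxrow = $rows if $rows > $maxrow;
}
```

Pass it to the parser exactly as above (CellHandler => \&cell_handler, NotSetCell => 1); cells from the other sheets are then simply ignored.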
On 4/27/05, Craig Moynes <[EMAIL PROTECTED]> wrote:
> Hi Gents,
> I tried both suggestions:
>
> #
> # For each log in the array
> # - grab the directory and name of the file.
> # - send via sendmail
> #
> foreach $hashEntry ( @LOGS )
> {
>
> my ( $localfile) = $hashEntry->{name};
> my ( $err_msg ) = ""; # error message variable
>
> my $cmd = "";
>
> #
> # Get row count of each file, to generate results file
> #
> my $oBook = new Spreadsheet::ParseExcel::Workbook->Parse($localfile);
> my $oWkS = ${$oBook->{Worksheet}}[0];
>
> print "------ SHEET: ".$oWkS->{Name}. "\n";
> print "Row: ".$oWkS->{MinRow}." v ".$oWkS->{MaxRow}."\n";
>
> $resultMessage.=sprintf("%s,%s\n",basename($localfile),$oWkS->{MaxRow});
> }
>
> But I still get an out of memory error on the 10th file opened.
>
> Any additional suggestions?
>
> On 4/27/05, Bakken, Luke <[EMAIL PROTECTED]> wrote:
> > > my $oBook;
> > > my $oWks;
> > > foreach $hashEntry ( @LOGS )
> > > {
> > >
> > > my ( $localfile) = $hashEntry->{name};
> > > my ( $err_msg ) = ""; # error message variable
> > >
> > >
> > > my $cmd = "";
> > >
> > > #
> > > # Get row count of each file, to generate results file
> > > #
> > > $oBook = new Spreadsheet::ParseExcel::Workbook->Parse($localfile);
> > > my ($iR, $iC, $oWkS, $oWkC);
> > > $oWkS = ${$oBook->{Worksheet}}[0];
> > >
> > > print "------ SHEET: ".$oWkS->{Name}. "\n";
> > > print "Row: ".$oWkS->{MinRow}." v ".$oWkS->{MaxRow}."\n";
> > >
> > > $resultMessage .= basename($localfile).",".$oWkS->{MaxRow}."\n";
> > > }
> > >
> > > The problem I am running into is after 10 files (in testing all 31
> > > files are the same source file with different names), and then I get
> > > an out of memory error. Anyone have any idea how I can clean out the
> > > memory. I have a feeling it might be some autocaching or something
> > > not getting cleaned up within ParseExcel.
> >
> > If you move the 'my $oBook' inside the for loop, the object should be
> > destroyed on each iteration:
> >
> > for my $hashEntry ( @LOGS )
> > {
> >
> > my ( $localfile) = $hashEntry->{name};
> > my ( $err_msg ) = ""; # error message variable
> >
> > my $cmd = "";
> >
> > #
> > # Get row count of each file, to generate results file
> > #
> > my $oBook = new Spreadsheet::ParseExcel::Workbook->Parse($localfile);
> > my ($iR, $iC, $oWkC);
> > my $oWkS = ${$oBook->{Worksheet}}[0];
> >
> > print "------ SHEET: ".$oWkS->{Name}. "\n";
> > print "Row: ".$oWkS->{MinRow}." v ".$oWkS->{MaxRow}."\n";
> >
> > $resultMessage.=basename($localfile).",".$oWkS->{MaxRow}."\n";
> > }
> >
> > --
> > To unsubscribe, e-mail: [EMAIL PROTECTED]
> > For additional commands, e-mail: [EMAIL PROTECTED]
> > <http://learn.perl.org/> <http://learn.perl.org/first-response>
> >
> >
>