On 5/7/07, James. L <[EMAIL PROTECTED]> wrote:
The files I need to parse are usually 2M - 10M in size. Will the
mod_perl server (2G of memory) use up its memory pretty quickly
after a few hundred requests on different files?

You're misunderstanding a little bit.  It's not that the memory used
in parsing a file gets lost permanently.  Instead, the variable you
loaded the data into holds onto the memory from the largest size it
reached.

sub parse {
  my ($class,$file) = @_;
  my @data;
  open my $F, $file or die $!;
  while ( my $line = <$F> ) {
    my @fields = split /=/, $line;
    push @data, \@fields;   # keep a reference to each parsed row
  }
  close $F;
  return \@data;            # return a reference to the whole data set
}

If you read enough data into @data to use up 20MB, it will stay that
size.  That's a good thing if you intend to read another file of
similar size on the next request.  It's only a problem if you read in
a very large amount of data but only do so now and then.
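
To put rough numbers on it (mine, not from your setup): if you run 50
mod_perl children and each one peaks at a 20MB parse, that's around
1GB of your 2GB held by parsed data alone, and it stays held until
those children are recycled.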

The best way to avoid this kind of problem is to not read the whole
thing into RAM.  You can pass an iterator object to TT instead of
loading all the data at once.
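
Something like this, very roughly (untested, and the MyApp::LineIterator
name is just made up for illustration): wrap the filehandle in a small
class with a next_row method and let TT pull one row at a time.

package MyApp::LineIterator;
use strict;
use warnings;

sub new {
  my ($class, $file) = @_;
  open my $fh, '<', $file or die "Can't open $file: $!";
  return bless { fh => $fh }, $class;
}

# Return the next parsed row as an array ref, or undef at end of file.
sub next_row {
  my ($self) = @_;
  defined( my $line = readline $self->{fh} ) or return;
  chomp $line;
  return [ split /=/, $line ];
}

1;

Then hand the iterator to TT instead of the full array:

use Template;

my $tt = Template->new;
$tt->process('report.tt', { rows => MyApp::LineIterator->new($file) })
  or die $tt->error;

and in the template pull rows as you need them:

[% WHILE (row = rows.next_row) %]
  [% row.0 %] = [% row.1 %]
[% END %]

That way only one row is in memory at a time.  (You may need to raise
TT's runaway-loop guard, $Template::Directive::WHILE_MAX, for big
files.)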

- Perrin
