On Thu, 7 Feb 2002, Brian Hayes wrote:
It appears the problem was using the foreach statement instead of while.
I have not tested this extensively, but with foreach the whole text
file (or the output of a pipe) is read into memory before processing
continues, whereas with while (and probably for) each line is
processed as it is read.
Thanks for all your help.
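For reference, a minimal sketch of the difference described above (the
filename and loop bodies below are placeholders, not code from the
thread): in the foreach form, <FILE> is evaluated in list context, so
every line is pulled into memory before the first pass through the
loop; in the while form it is evaluated in scalar context, one line per
iteration, so memory use stays flat.

    my $file = 'big_file.txt';   # placeholder path

    # foreach: <FILE> is read in list context -- the whole file is
    # loaded into a temporary list before the loop body ever runs.
    open(FILE, $file) or die "$!\n";
    foreach my $line (<FILE>) {
        ## do something with $line
    }
    close FILE;

    # while: <FILE> is read in scalar context -- one line per
    # iteration, so the file is streamed rather than slurped.
    open(FILE, $file) or die "$!\n";
    while (my $line = <FILE>) {
        ## do something with $line
    }
    close FILE;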
> You should be using something like
>
> open(FILE, $file) or die "$!\n";
> while (<FILE>) {
>     ## do something with the current line in $_
> }
> close FILE;
> __END__
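As a side note, the same loop can also be written with a lexical
filehandle and three-argument open (the filename and variable names
here are illustrative, not from the original post):

    use strict;
    use warnings;

    my $file = 'big_file.txt';   # placeholder path

    open(my $fh, '<', $file) or die "Can't open $file: $!\n";
    while (my $line = <$fh>) {
        chomp $line;
        ## do something with $line
    }
    close $fh;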
This is what I am doing, but before any of the file is processed, the
whole text file is moved into memory. The only solution I can think of
is to break the file up into smaller pieces before processing it.
-Original Message-
From: Brett W. McCoy [mailto:[EMAIL PROTECTED]]
Sent: Thursday, February 07, 2002 3:49 PM
To: Brian Hayes
Cc: [EMAIL PROTECTED]
Subject: Re: memory issues reading large files
On Thu, 7 Feb 2002, Brian Hayes wrote:
> Hello all. I need to read through a large (150 MB) text file line by
> line. Does anyone know how to do this without my process swelling to
> 300 megs?
As long as you aren't reading that file into an array (which would be a
foolish thing to do, IMHO), I don't see why you should run into memory
problems reading it line by line.
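For illustration, the slurp-into-an-array pattern being warned against
here looks like the sketch below (the filename is a placeholder). With
a 150 MB file the @lines array can easily push the process well past
that, since each line carries its own per-scalar overhead on top of
the text itself.

    my $file = 'big_file.txt';   # placeholder path

    open(FILE, $file) or die "$!\n";
    my @lines = <FILE>;          # the entire file lands in memory here
    close FILE;

    foreach my $line (@lines) {
        ## works, but only after every line has been loaded
    }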