It's a matter of memory management.  If you suck in an entire file, a
large file can exhaust memory, or at least slow your script down as the
system starts swapping.  You also have to do all of the reading before
you can start processing any of the file.  Reading line by line avoids
both problems:

while (<FH>) {
    if (/something/) {
        print;    # $_ still ends in a newline, so no extra "\n" needed
    }
}

or

while (my $i = <FH>) {
    if ($i =~ /something/) {
        print $i;    # $i keeps its trailing newline
    }
}
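For what it's worth, the same loop written with a lexical filehandle and
the three-argument form of open (the style generally recommended in
newer Perl) might look like this -- "text.fil" and /something/ are just
the example names from the original post:

open(my $fh, '<', 'text.fil') or die "Can't open text.fil: $!\n";
while (my $line = <$fh>) {
    chomp $line;                          # strip the trailing newline
    print "$line\n" if $line =~ /something/;
}
close($fh);

Including $! in the die message tells you *why* the open failed
(permissions, missing file, etc.), which plain "Can't open" does not.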

-----Original Message-----
From: Teresa Raymond [mailto:[EMAIL PROTECTED]]
Sent: Friday, May 03, 2002 9:18 AM
To: Perl Beginners List
Subject: @array=<FH>


Someone mentioned that sucking a file into an array is not a good idea,
and I read the Perl FAQ entry on it, but I am still not sure why this is
not a good idea, especially because a lot of the code posted here uses
this method.

In addition, if you have the file in an array then you can do foreach:

open(FH, "text.fil") || die "Can't open text.fil\n";
my @array = <FH>;
close(FH);

foreach my $i (@array) {
    if ($i =~ /something/) {
        print "$i\n";
    }
}

Would one use while instead and what would the code look like?

open(FH, "text.fil") || die "Can't open text.fil\n";
while (<FH>) {
    if (????what goes here to emulate the above foreach????) {
        print "????ditto????\n";
    }
}
close(FH);
--
-------------------------------
-  Teresa Raymond             -
-  Mariposa Net               -
-  http://www.mariposanet.com -
-------------------------------

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]