Hello,
Reading a file as large as 900MB into an array will consume memory far too quickly. Could you instead open the file in your subroutine, obtain the file handle there, and return the handle to the caller? For example:

sub your_sub {
   ....
   open(my $fh, '<', $somefile) or die "Can't open $somefile: $!";
   return $fh;
}
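To show the whole idea end to end, here is a minimal sketch of the caller side: the subroutine hands back a lexical filehandle, the caller pulls the first five lines into variables, and then walks the rest one line at a time so memory use stays flat no matter how big the file is. (The temporary sample file and the `open_data` name are just for illustration, not from the original post.)

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Build a small sample file so the sketch is runnable end to end.
my ($tmp, $tmpname) = tempfile();
print {$tmp} "line $_\n" for 1 .. 8;
close $tmp;

# Open the file in a subroutine and return a lexical filehandle
# instead of slurping the whole file into an array.
sub open_data {
    my ($somefile) = @_;
    open(my $fh, '<', $somefile) or die "Can't open $somefile: $!";
    return $fh;   # the lexical handle stays open after the sub returns
}

my $fh = open_data($tmpname);

# Grab the first five lines individually ...
my @head;
while (@head < 5 and defined(my $line = <$fh>)) {
    chomp $line;
    push @head, $line;
}

# ... then process the remaining lines one at a time, so only one
# line is ever held in memory.
my $rest = 0;
while (my $line = <$fh>) {
    chomp $line;
    $rest++;    # process $line here
}
close $fh;

print "head=@head rest=$rest\n";
```

The key difference from the slurping version is that `@remaining_file_lines` never exists as an in-memory array; the handle itself stands in for "the rest of the file".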


From: "Omega -1911" <[EMAIL PROTECTED]>
To: "Beginners Perl" <beginners@perl.org>
Subject: Re: reading a line at a time inefficient?
Date: Fri, 23 Jun 2006 02:14:52 -0400

Hello list!

I am attempting to lower the memory load that the following lines of code
create on the server. Is there any way to speed up this process and lower
memory usage? I read through the File::Slurp documentation, but I'm not sure
whether it would help, as I need to keep the array @remaining_file_lines.

NOTE: Each file has a different size (ranging from 2 KB up to 900 MB)

open FILE, "$file.txt"; # $file is untainted by code before we open the file
my ($data, $data1, $data2, $data3, $data4, @remaining_file_lines) = <FILE>;
close FILE;
chomp($data, $data1, $data2, $data3, $data4, @remaining_file_lines);
return ($data, $data1, $data2, $data3, $data4, @remaining_file_lines);

TIA !!!!
-David



--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>

