> 
> Hi all,
> I have a question about memory utilization in Perl. I am reading a huge
> file, close to 786432 lines of hex values, and storing it in an array. I
> reformat the data in this array and, in the course of processing, generate
> a number of other arrays, which I eventually write to a file at the end of
> the subroutine. The problem is that in the middle of the subroutine's
> execution I get an "Out of memory" error, after using close to 2 GB of
> memory. To avoid this, what I did was: after I copy the elements to a
> different array, I set every element of the original array to an empty
> string. For example, this is what I do with one of the arrays:
>  
> for ($init_cnt = 0; $init_cnt <= $#out_array_bin; $init_cnt++) {
>     $out_array_bin[$init_cnt] = "";
> }
>  
> I followed the same approach with the other arrays in my subroutine.
> I thought this would solve my "Out of memory" problem, but it did not.
> Can someone tell me what an alternative solution to this problem might be,
> or kindly point out anything I should correct in my existing approach?
>  
> Thanks for the help in advance,
> Hari
> 

Do you need to read the whole file up front?  Slurping an entire file into
an array is a pretty common beginner mistake; I don't know your level of
expertise, so I'll suggest this first.  If possible, read and process only
the part of the file you are acting on at any given moment.
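For example, here is a minimal sketch of line-at-a-time processing.  The
file names and the reformatting step are placeholders, since we haven't
seen your real code:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical file names -- substitute your own.
    open my $in,  '<', 'hex_values.txt'  or die "Cannot open input: $!";
    open my $out, '>', 'reformatted.txt' or die "Cannot open output: $!";

    while (my $line = <$in>) {
        chomp $line;
        # Placeholder for whatever reformatting you do on each hex value.
        my $reformatted = sprintf "%b", hex $line;
        print {$out} "$reformatted\n";
    }

    close $out or die "Error writing output: $!";
    close $in;

This way only one line is held in memory at a time instead of all 786432.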

Are you doing,

use strict;
use warnings; 

in your script?  Using strict forces proper (or at least better) variable
scoping, which lets Perl reclaim the memory for a variable as soon as it
goes out of scope so that it can be reused.
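On the scoping point: a lexical (my) array is freed when the block that
declares it ends, and emptying an array with @array = () (or undef @array)
actually releases its elements, whereas assigning "" to every slot, as in
your loop, still leaves all 786432 scalars allocated.  A rough sketch, with
made-up names, in case it helps:

    use strict;
    use warnings;

    sub reformat_file {
        my ($infile) = @_;         # hypothetical argument

        my @converted;             # lexical: released when the sub returns
        {
            # Keep the big temporary array in its own block ...
            open my $fh, '<', $infile or die "Cannot open $infile: $!";
            my @raw = <$fh>;       # (only if you really must slurp)
            close $fh;
            chomp @raw;
            @converted = map { sprintf "%b", hex } @raw;
        }   # ... so @raw is freed here, as soon as the block ends.

        # To clear a still-in-scope array early, empty it outright:
        #     @converted = ();     # or: undef @converted;

        return \@converted;
    }

(Note that Perl reuses this memory internally; it may not hand it back to
the operating system, but it should stop the process from growing further.)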

This sounds more like a design problem, and you have shown us only a small
piece of code, so our help is somewhat constrained.  If this doesn't
help, show us some more code...

http://danconia.org

