I've been using <FILEHANDLE> to read an entire text file into an array.

@buffer = <MY_BIG_FILE>;

I string-manipulate the individual array elements and then, sometime
later, do a

$buffer = join "", @buffer;

...and this worked OK for an 80M text file. I couldn't resist and tried
it out on a gigabyte monster.

The script aborted with "Out of memory during request for 26 bytes
during sbrk()".

26 bytes, coincidentally enough :), is the record size.
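
Just so the whole picture is clear, here's the shape of the script; the
filename and the s/// are stand-ins for my real file and manipulation:

open(MY_BIG_FILE, "records.txt") or die "can't open: $!";
@buffer = <MY_BIG_FILE>;          # one array element per 26-byte record
close(MY_BIG_FILE);

s/foo/bar/ for @buffer;           # placeholder for the real string work

$buffer = join "", @buffer;       # builds a second full-size copy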

I realize this is a pretty extreme test: a gigabyte of 26-byte records
is roughly 40 million array elements, each carrying Perl's per-scalar
overhead on top of the data itself, and the join wants a second
full-size copy on top of that. Now, my question is: is this an
operating system (SCO Unix, in this case) virtual memory problem? There
is enough physical disk space to double the size of the input file and
still have a gig left over. Our sysadmin thinks the box was configured
with 500M of swap space (but is not 100% sure).

Any way to persuade Perl to use less memory if this is the case? Thanks!
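
One thing I've wondered about, in case slurping is simply the wrong
approach at this size: process the file one record at a time and write
the results out as I go, so only a single line ever lives in memory.
Something like this sketch (filenames made up):

open(IN, "records.txt")   or die "can't open input: $!";
open(OUT, ">records.out") or die "can't open output: $!";
while ($line = <IN>) {            # reads one 26-byte record at a time
    # ... same per-record manipulation here ...
    print OUT $line;              # write it out instead of keeping it
}
close(IN);
close(OUT);

But I'd still like to understand where the slurping version actually
hit the wall.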




