I have a program that reads through a log file (from a firewall
actually) and generates a few statistics from it.  Problem is that it uses a
ton of memory.  I had to add a lot of swap (500 MB) to a Linux box that only
has 128 MB of RAM to begin with.  BTW - nothing else is running on it.

I commented out everything and added lines back one at a time until I got to
where things went bad.  It was the line that does the split.  Here's the
code left uncommented that causes me to run out of memory - nothing fancy, just a
loop that reads each input line and does a split on it:


#!/usr/bin/perl
#
# Program to count which firewall rules were applied how many times
#
#


open ( FWLOG, "./logfile.010626") or die "Can't open logfile: $!";

while ( $LINE = <FWLOG> ) {

        chomp ($LINE);
        @fw_log = split (/;/, $LINE);

}

close (FWLOG);

The file, logfile.010626, is big - about 2.3 million records - but the records
are not that long.  Is new memory allocated for each instance of $LINE and
@fw_log?  If so, is there a way I can make Perl reuse the same memory?
Or is this just the way Perl does I/O?
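
In case it matters, here is a rough sketch of the direction I was planning to
take the loop, with the loop variables declared lexically (so they go out of
scope on each pass) and the counts kept in a hash.  The field index (4) for
the rule number is only a placeholder - I haven't pinned down the field
layout of the log yet:

#!/usr/bin/perl
#
# Sketch: count which firewall rules were applied how many times
#
use strict;
use warnings;

my %rule_count;

open ( FWLOG, "./logfile.010626") or die "Can't open logfile: $!";

while ( my $line = <FWLOG> ) {

        chomp ($line);
        my @fields = split (/;/, $line);   # lexical, freed each pass
        my $rule   = $fields[4];           # placeholder index for the rule field
        next unless defined $rule;
        $rule_count{$rule}++;

}

close (FWLOG);

# Report how many times each rule was applied
foreach my $rule (sort keys %rule_count) {
        print "$rule: $rule_count{$rule}\n";
}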

Thanks in advance!


Rob Blader
Naval Surface Warfare Center
(540)653-7270
[EMAIL PROTECTED]
