On Sun, 2008-04-20 at 13:49 -0400, Richard Lee wrote:
> can this be optimized in any way?
> open (my $source, '-|', "tail -100000 /server/server.log")
> 
> is this the best way to get a large portion (the file itself is over
> 20 times that size) into a file handle?
> 

This will not optimize processing the file.  The tail command still has
to produce the last 100_000 lines (GNU tail seeks backwards from the
end of a regular file rather than scanning the whole thing), and your
script still has to read every one of them.  It might save some RAM,
but with today's multi-gigabyte machines, the file would have to be
gigabyte-sized to overflow it.  2_000_000 lines at 80 characters each
is 160_000_000 bytes (well under 1 GB).

There may be two problems with this:

1.  If you are running a memory hog like Vista, you may run out of
memory reading a measly 160 MB file.

2.  If you are running on a server, it may be so busy that you run out
of memory.
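Either way, the memory risk comes from slurping all 100_000 lines into
an array at once; reading the pipe one line at a time keeps usage near
a single line no matter how big the file is.  Here's a sketch wrapped
in a helper sub (stream_lines is my name for it, not anything
standard), using the list form of open to skip the shell:

```perl
use strict;
use warnings;

# Run a command and hand each line of its output to a callback,
# so only one line is held in memory at a time.
sub stream_lines {
    my ($callback, @cmd) = @_;
    open my $fh, '-|', @cmd or die "can't run @cmd: $!";
    while (my $line = <$fh>) {
        $callback->($line);
    }
    close $fh;
}

# For the original question, that would look like:
# stream_lines(sub { my ($line) = @_; ... },
#              'tail', '-100000', '/server/server.log');
```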

All I can say is TITS (Try It To See).


-- 
Just my 0.00000002 million dollars worth,
    Shawn

When I was in college, my Karate instructor said,
"The first hundred are easy."
Since then, I discovered he'd lied.
Life's too big to describe in hundreds.


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
http://learn.perl.org/
