friend...@gmail.com wrote:
> 
> I am analyzing some network log files. There are around 200-300 files, and
> each file has more than 2 million entries in it.
> 
> Currently my script reads each file line by line, so it takes a
> lot of time to process all the files.
> 
> Is there a more efficient way to do it?
> 
> Maybe multiprocessing or multitasking?

Reading about 40GB of data line by line is going to take a fair amount of time,
I'm afraid, but what are you doing with the lines as you read them? You may be
able to speed things up a little using techniques appropriate to your application.

I suggest you start by writing a benchmark program that just reads through all
of the files without processing the data, to see what your baseline speed is.
Then measure how much overhead the processing adds so that you know which code
to optimise.
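
Something like this (untested, and assuming your logs match a glob pattern
such as 'logs/*.log', which you would adjust to suit) would give you a
baseline figure:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Time::HiRes qw(time);

    # Hypothetical file pattern; point this at your own log directory
    my @files = glob('logs/*.log');

    my $start = time();
    my $lines = 0;

    for my $file (@files) {
        open my $fh, '<', $file or die "Cannot open $file: $!";
        while (<$fh>) {
            $lines++;    # count lines only; no parsing yet
        }
        close $fh;
    }

    printf "Read %d lines from %d files in %.1f seconds\n",
        $lines, scalar(@files), time() - $start;

Run that, note the time, then run your real script over the same files; the
difference between the two tells you how much your processing costs and
whether the bottleneck is disk I/O or your Perl code.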

Rob

