Hi,

I am analyzing some network log files. There are around 200-300 files, and
each file has more than 2 million entries in it.

Currently my script reads each file line by line, so it takes a long
time to process all the files.

Is there a more efficient way to do this?

Maybe multiprocessing or multitasking?
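For context, something along these lines is what I had in mind: fork one child per file, with a cap on concurrent workers, using only core Perl. The `logs/*.log` pattern and the line-counting body are placeholders for my real parsing.

```perl
#!/usr/bin/env perl
use strict;
use warnings;

my $max_workers = 8;                     # cap on concurrent children
my @files = glob('logs/*.log');          # placeholder for the real file list

# Placeholder per-file work: just count lines for now.
sub count_lines {
    my ($file) = @_;
    open my $fh, '<', $file or die "Cannot open $file: $!";
    my $n = 0;
    $n++ while <$fh>;
    close $fh;
    return $n;
}

my %kids;
for my $file (@files) {
    # Throttle: block until a child exits once we hit the cap.
    if (keys %kids >= $max_workers) {
        my $pid = wait();
        delete $kids{$pid};
    }
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {                     # child: process one file, then exit
        my $n = count_lines($file);
        print "$file: $n lines\n";
        exit 0;
    }
    $kids{$pid} = 1;                     # parent: remember the child
}
1 while wait() != -1;                    # reap any remaining children
```

Would a CPAN module like Parallel::ForkManager be a better way to manage the worker pool than doing fork/wait by hand like this?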


Thanks.


-- 
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
http://learn.perl.org/

