On Thu, 2008-12-11 at 12:28 -0800, friend...@gmail.com wrote:
> Hi,
> 
> I'm analyzing some network log files. There are around 200-300 files,
> and each file has more than 2 million entries in it.
> 
> Currently my script reads each file line by line, so it takes a
> lot of time to process all the files.
> 
> Is there any efficient way to do it?
> 
> Maybe multiprocessing or multitasking?

Are all these files on the same disk?  Are they in the same partition?
What's going to slow down processing the most is disk I/O, and if the
files all sit on one spindle, several parallel readers mostly add seek
contention rather than throughput.  Speeding up the I/O will give
quicker results than multiprocessing, multitasking or threading.
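
FWIW, if the per-line read loop itself turns out to be part of the
cost, one common trick is to pull the file in large blocks and split
the lines yourself.  A minimal sketch, not from the original script:
the 1 MB buffer size is an arbitrary choice, and parse_line() is a
hypothetical placeholder for whatever the real parsing does.

#!/usr/bin/perl
use strict;
use warnings;

my $file = shift @ARGV or die "Usage: $0 logfile\n";
open my $fh, '<:raw', $file or die "Cannot open $file: $!\n";

my $tail = '';
while (sysread($fh, my $chunk, 1_048_576)) {   # ~1 MB per read
    $tail .= $chunk;
    my @lines = split /\n/, $tail, -1;
    $tail = pop @lines;          # carry a partial last line over
    parse_line($_) for @lines;
}
parse_line($tail) if length $tail;   # final line with no newline
close $fh;

sub parse_line {
    my ($line) = @_;
    # hypothetical placeholder for the real log parsing
}

That said, perl's normal <$fh> reads are already buffered by PerlIO,
so the gain from this is often modest; the bigger wins usually come
from faster disks or spreading the files across spindles, as above.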


-- 
Just my 0.00000002 million dollars worth,
  Shawn

The key to success is being too stupid to realize you can fail.

