On Thu, 2008-12-11 at 12:28 -0800, friend...@gmail.com wrote:
Hi,
I am analyzing some network log files. There are around 200-300 files, and
each file has more than 2 million entries in it.
Currently my script reads each file line by line, so it takes a
long time to process all the files.
Is there any efficient way to do it?
Maybe multiprocessing?
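Since the question mentions multiprocessing, here is a minimal sketch of that approach: give each worker process a whole file, so files are processed in parallel across CPU cores. The per-file work shown (counting entries) is a hypothetical placeholder for whatever the real script does with each line, and the demo file names are made up.

```python
import multiprocessing
import os
import tempfile

def count_entries(path):
    # Placeholder per-file work: read line by line and count entries.
    # Buffered line iteration is already fast; the speedup here comes
    # from running several files on several cores at once.
    count = 0
    with open(path) as f:
        for line in f:
            count += 1
    return path, count

def process_logs(paths, workers=None):
    # One worker per CPU by default. Handing out whole files avoids
    # the bookkeeping of splitting a single file across processes.
    with multiprocessing.Pool(processes=workers) as pool:
        return dict(pool.map(count_entries, paths))

if __name__ == "__main__":
    # Demo with two small temporary "log" files.
    tmpdir = tempfile.mkdtemp()
    paths = []
    for i, n in enumerate((3, 5)):
        p = os.path.join(tmpdir, "log%d.txt" % i)
        with open(p, "w") as f:
            f.write("entry\n" * n)
        paths.append(p)
    print(process_logs(paths))
```

Note this only helps if the work is CPU-bound; if the script is limited by disk throughput, reading 200-300 large files from several processes at once can even be slower than reading them sequentially.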