Abhishek Pratap wrote:
Hi Guys
My experience with Python is 2 days, and I am looking for a slick way
to use multi-threading to process a file. Here is what I would like to
do, which is conceptually similar to MapReduce:
# test case
1. My input file is 10 GB.
2. I want to open 10 file handles, each handling 1 GB of the file.
3. Each file handle is processed by an individual thread running the
same function (so 10 cores are assumed to be available on the
machine).
4. There will be 10 different output files.
5. Once the 10 jobs are complete, a reduce-style function will
combine the output.
Could you give me some ideas?
So, given a file, I would like to read it in N chunks through N file
handles and process each chunk separately.
Best,
-Abhi
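The chunking in steps 2 and 3 boils down to computing newline-aligned
byte ranges and giving each worker its own file handle. Here is a
minimal sketch of that splitting step, assuming a line-oriented text
file; chunk_boundaries, read_chunk, and n_chunks are illustrative
names, not anything from the thread:

import os

def chunk_boundaries(path, n_chunks):
    """Compute n_chunks (start, end) byte ranges covering the file,
    nudging every split forward to the next newline so no line is
    cut in half."""
    size = os.path.getsize(path)
    offsets = [0]
    with open(path, "rb") as f:
        for i in range(1, n_chunks):
            f.seek(i * size // n_chunks)  # jump near the ideal split point
            f.readline()                  # skip ahead to the next newline
            offsets.append(f.tell())
    offsets.append(size)
    return list(zip(offsets[:-1], offsets[1:]))

def read_chunk(path, start, end):
    """Open a private handle and yield only the lines in one range."""
    with open(path, "rb") as f:
        f.seek(start)
        while f.tell() < end:
            yield f.readline()

Because each split lands on a line boundary, every record ends up in
exactly one chunk, so the workers never need to coordinate.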
You should probably forget threads and simply run the work as 10
separate processes, all launched by a single parent. Since the workers
don't share any state, there's no need to pay the cost of threads; in
CPython the GIL keeps CPU-bound threads from running in parallel
anyway.
DaveA
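A minimal sketch of that process-per-chunk layout, using the standard
library's multiprocessing module; the worker body just copies lines
through as a stand-in for the real processing, and input.txt,
out_%d.txt, and combined.txt are made-up names:

import multiprocessing as mp
import os

def chunk_boundaries(path, n):
    """Newline-aligned splitter, same idea as the sketch above."""
    size = os.path.getsize(path)
    offsets = [0]
    with open(path, "rb") as f:
        for i in range(1, n):
            f.seek(i * size // n)
            f.readline()
            offsets.append(f.tell())
    offsets.append(size)
    return list(zip(offsets[:-1], offsets[1:]))

def process_chunk(path, start, end, out_path):
    """Worker: read one byte range line by line, write its own output."""
    with open(path, "rb") as f, open(out_path, "wb") as out:
        f.seek(start)
        while f.tell() < end:
            line = f.readline()
            out.write(line)  # stand-in for the real per-line work

if __name__ == "__main__":
    path, n = "input.txt", 10
    workers = []
    for i, (start, end) in enumerate(chunk_boundaries(path, n)):
        p = mp.Process(target=process_chunk,
                       args=(path, start, end, "out_%d.txt" % i))
        p.start()
        workers.append(p)
    for p in workers:
        p.join()  # wait for all ten jobs to finish
    # "reduce" step: concatenate the partial outputs
    with open("combined.txt", "wb") as out:
        for i in range(n):
            with open("out_%d.txt" % i, "rb") as part:
                out.write(part.read())

Each process gets its own input handle and its own output file, so
there is no shared state to lock; the parent's only job is to launch,
join, and combine.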