I am not sure how many of you have had a chance to use generators to implement "cooperative" aka "micro" threads in Python.
Dr. David Mertz gives a good introduction to the topic in his Charming Python article here: http://www.ibm.com/developerworks/library/l-pythrd.html

The other day I had the problem of creating a text file containing all the paths of C/C++ source code inside a directory tree with a rather large number of source files (around 10 million). The normal approach I take in these cases is to write an os.walk loop which keeps appending to a list and finally dumps the list to a file. However, since there could be a million or more entries in the list, I thought I would whip up a solution using micro-threads, with a "writer" and a "traverser" lightweight thread. The traverser will traverse a folder, append its contents to the list, and yield, at which point the scheduler will switch to the writer. The writer flushes the list to the file and yields, at which point the scheduler switches back to the traverser.

The code is here; it essentially uses the same skeleton as the code sample presented by David in his article: http://dpaste.com/hold/260673/

My questions are aimed at those who have some experience working with lightweight threads/generators in Python or with Stackless:

1. Can this be done better, say in Python 3.0?
2. How does one do this in Stackless?
3. Have you used any of the other Python concurrency libraries to solve problems like this?
4. How would you do this using the "multiprocessing" module?

I hope this makes for an interesting discussion.

Thanks,

--
--Anand

_______________________________________________
BangPypers mailing list
BangPypers@python.org
http://mail.python.org/mailman/listinfo/bangpypers
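[Editor's note: since dpaste links expire, here is a minimal sketch of the traverser/writer scheme described in the post, in the round-robin spirit of the scheduler from David Mertz's article. The extension filter, function names, and the simple two-generator scheduler are illustrative assumptions, not the author's original code.]

```python
import os

# Hypothetical filter; the original post does not specify which extensions.
SOURCE_EXTS = ('.c', '.cpp', '.h', '.hpp')

def traverser(paths, root):
    # Walk the tree, appending C/C++ source paths to the shared list.
    # Yield after each directory so the scheduler can run the writer.
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(SOURCE_EXTS):
                paths.append(os.path.join(dirpath, name))
        yield

def writer(paths, out):
    # Flush the collected paths to the output file, then yield back.
    while True:
        for p in paths:
            out.write(p + '\n')
        del paths[:]  # empty the shared list in place
        yield

def schedule(producer, consumer):
    # Simple cooperative scheduler: alternate between the two
    # generators until the producer is exhausted, then flush once more.
    while True:
        try:
            next(producer)
        except StopIteration:
            break
        next(consumer)
    next(consumer)  # final flush of any remaining entries

if __name__ == '__main__':
    paths = []
    with open('sources.txt', 'w') as out:
        schedule(traverser(paths, '.'), writer(paths, out))
```

Because the list is flushed after every directory, it never grows beyond one directory's worth of entries, which is the memory win the post is after.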