On Jan 14, 2:02 am, Catherine Moroney <catherine.m.moro...@jpl.nasa.gov> wrote:
> Hello everybody,
>
> I know how to spawn a sub-process and then wait until it
> completes.  I'm wondering if I can do the same thing with
> a Python function.
>
> I would like to spawn off multiple instances of a function
> and run them simultaneously and then wait until they all complete.
> Currently I'm doing this by calling them as sub-processes
> executable from the command-line.  Is there a way of accomplishing
> the same thing without having to make command-line executables
> of the function call?
>
> I'm primarily concerned about code readability and ease of
> programming.  The code would look a lot prettier and be shorter
> to boot if I could spawn off function calls rather than
> subprocesses.
>
> Thanks for any advice,
>
> Catherine
There is an example explaining how to implement exactly this use case
in the documentation of my decorator module:

  http://pypi.python.org/pypi/decorator/3.0.0#async

The Async decorator works both with threads and with multiprocessing.
Here is an example of printing from multiple processes (it assumes you
have downloaded the tarball of the decorator module; documentation.py is
the file containing the documentation and the Async decorator; it also
assumes you have the multiprocessing module):

$ cat example.py
import os, multiprocessing
from documentation import Async

# decorate with Async so each call runs in its own process
async = Async(multiprocessing.Process)

@async
def print_msg():
    print 'hello from process %d' % os.getpid()

for i in range(3):
    print_msg()

$ python example.py
hello from process 5903
hello from process 5904
hello from process 5905
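The example above only fires off the processes; since the original question also asks how to wait until all the calls complete, here is a minimal sketch using just the standard multiprocessing module (Python 2.6+), without the decorator. It reuses the print_msg function purely for illustration:

import os, multiprocessing

def print_msg():
    print 'hello from process %d' % os.getpid()

if __name__ == '__main__':
    # spawn three worker processes, one per function call
    procs = [multiprocessing.Process(target=print_msg) for _ in range(3)]
    for p in procs:
        p.start()
    # block until every worker has finished
    for p in procs:
        p.join()

The same start/join pattern works with threading.Thread if you want threads instead of processes.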