On 24/02/2012 17:00, Eric Frederich wrote:
> I can still get it to freeze, and nothing is printed out from the
> other except block. Does it look like I'm doing anything wrong here?
[snip]

I don't normally use multiprocessing, so I forgot about a critical
detail. :-(

When the multiprocessing module starts a process (on Windows, where
processes are spawned rather than forked), that process _imports_ the
module containing the function to be run. So what's happening is: your
script is run and it creates and starts workers; the multiprocessing
module makes a new process for each worker; each of those processes
then imports the script, which creates and starts more workers; and so
on, leading to an ever-increasing number of processes.

The solution is to ensure that the script/module distinguishes between
being run as the main script and being imported as a module:

#!/usr/bin/env python
import sys
import Queue
import multiprocessing
import time

def FOO(a, b, c):
    print 'foo', a, b, c
    return (a + b) * c

class MyWorker(multiprocessing.Process):
    def __init__(self, inbox, outbox):
        super(MyWorker, self).__init__()
        self.inbox = inbox
        self.outbox = outbox
        print >> sys.stderr, '1' * 80; sys.stderr.flush()

    def run(self):
        print >> sys.stderr, '2' * 80; sys.stderr.flush()
        while True:
            try:
                args = self.inbox.get_nowait()
            except Queue.Empty:
                break
            self.outbox.put(FOO(*args))

if __name__ == '__main__':
    # This file is being run as the main script. This part won't be
    # run if the file is imported.
    todo = multiprocessing.Queue()
    for i in xrange(100):
        todo.put((i, i + 1, i + 2))
    print >> sys.stderr, 'a' * 80; sys.stderr.flush()

    result_queue = multiprocessing.Queue()
    print >> sys.stderr, 'b' * 80; sys.stderr.flush()

    w1 = MyWorker(todo, result_queue)
    print >> sys.stderr, 'c' * 80; sys.stderr.flush()

    w2 = MyWorker(todo, result_queue)
    print >> sys.stderr, 'd' * 80; sys.stderr.flush()

    w1.start()
    print >> sys.stderr, 'e' * 80; sys.stderr.flush()

    w2.start()
    print >> sys.stderr, 'f' * 80; sys.stderr.flush()

    for i in xrange(100):
        print result_queue.get()
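To make the failure mode concrete, here is a minimal sketch of a script
*without* the guard (the file name and the print text are hypothetical,
and the runaway behaviour assumes Windows-style process spawning):

import multiprocessing

def work():
    print 'hello from the worker'

# Module level, no __name__ guard: when multiprocessing spawns the
# child process (as it does on Windows), the child imports this module,
# reaches this line again, and spawns a grandchild, and so on without
# end.
p = multiprocessing.Process(target=work)
p.start()
p.join()

With the guard in place, the child's import stops at the function and
class definitions, which is exactly what the child needs and nothing
more.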
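One caveat about the get_nowait() pattern above, separate from the
import issue: a multiprocessing.Queue hands items to the underlying
pipe via a background feeder thread, so get_nowait() can, at least in
principle, raise Queue.Empty while items are still in flight. A worker
that hits that window exits early, and the final loop of 100 gets then
blocks forever. A sentinel-per-worker variant sidesteps the race; this
is only a sketch, and foo and worker are illustrative stand-ins for the
code above:

import multiprocessing

SENTINEL = None

def foo(a, b, c):
    return (a + b) * c

def worker(inbox, outbox):
    # Block in get(); stop only when a sentinel arrives, so a
    # momentarily empty pipe can't make the worker quit early.
    while True:
        args = inbox.get()
        if args is SENTINEL:
            break
        outbox.put(foo(*args))

if __name__ == '__main__':
    todo = multiprocessing.Queue()
    results = multiprocessing.Queue()
    for i in xrange(100):
        todo.put((i, i + 1, i + 2))
    workers = [multiprocessing.Process(target=worker,
                                       args=(todo, results))
               for _ in xrange(2)]
    for _ in workers:
        todo.put(SENTINEL)  # one sentinel per worker, after the work
    for w in workers:
        w.start()
    for i in xrange(100):
        print results.get()  # drain results before joining workers
    for w in workers:
        w.join()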