On 15 Apr 2007 23:12:34 -0700, Paul Rubin <"http://phr.cx"@nospam.invalid> wrote:
>I'd like to suggest adding a new operation
>
>    Queue.finish()
>
>This puts a special sentinel object on the queue. The sentinel
>travels through the queue like any other object, however, when
>q.get() encounters the sentinel, it raises StopIteration instead
>of returning the sentinel. It does not remove the sentinel from
>the queue, so further calls to q.get also raise StopIteration.
>That permits writing the typical "worker thread" as
>
>    for item in iter(q.get): ...
>
>without having to mess with the task-counting stuff that recently got
>added to the Queue module. The writing end of the queue simply
>calls .finish() when it's done adding items.
>
>Someone in an earlier thread suggested
>
>    # writing side
>    sentinel = object()
>    q.put(sentinel)
>
>    ...
>    # reading side
>    for item in iter(q.get, sentinel): ...
>
>however that actually pops the sentinel, so if there are a lot of
>readers then the writing side has to push a separate sentinel for
>each reader. I found my code cluttered with
>
>    for i in xrange(number_of_worker_threads):
>        q.put(sentinel)
>
>which certainly seems like a code smell to me.
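For what it's worth, you can get roughly those semantics today with a small Queue subclass. This is only a sketch; the class and attribute names are made up and nothing like it is in the stdlib:

    import Queue    # Python 2 name; renamed to "queue" in Python 3

    class FinishableQueue(Queue.Queue):
        """Sketch of the proposed .finish() behaviour (illustrative only)."""

        _sentinel = object()    # private end-of-stream marker

        def finish(self):
            # Mark the end of the stream; the marker stays in the queue
            # because get() always puts it back.
            self.put(self._sentinel)

        def get(self, block=True, timeout=None):
            item = Queue.Queue.get(self, block, timeout)
            if item is self._sentinel:
                # Put the marker back so every other consumer sees it
                # too, then signal end-of-iteration.
                self.put(self._sentinel)
                raise StopIteration
            return item

    def worker(q):
        # Consume until the queue reports end-of-stream.
        while True:
            try:
                item = q.get()
            except StopIteration:
                break
            print 'processing', item    # stand-in for real work

The trick is simply that get() puts the marker back each time it sees it, so a single .finish() call serves every consumer. That said, you don't need a new method (or a subclass) at all.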
Instead of putting multiple sentinels, just pre-construct the iterator object:

    work = iter(q.get, sentinel)

Re-use the same iterator in each thread, and you'll get the behavior you're after.

Jean-Paul
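P.S. Here is roughly what that wiring looks like (just a sketch; the numbers and the print are stand-ins for real items and real work):

    import Queue    # Python 2 name; renamed to "queue" in Python 3

    q = Queue.Queue()
    sentinel = object()

    # Build the iterator once, up front; every worker thread shares
    # this one object (e.g. pass it to each threading.Thread target).
    work = iter(q.get, sentinel)

    def worker(work):
        # Loops until the shared iterator is exhausted, which happens
        # once the sentinel comes off the queue.
        for item in work:
            print 'processing', item    # stand-in for real work

    # Writing side: the items, then a single sentinel.
    for item in xrange(5):
        q.put(item)
    q.put(sentinel)

    # Drained here in one thread for illustration; with real workers
    # each thread would run worker(work) on the same iterator.
    worker(work)

Once any consumer pulls the sentinel, the iterator is exhausted, so later next() calls on it raise StopIteration immediately.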