"Antoon Pardon" <[EMAIL PROTECTED]> wrote:
> The problem is this doesn't work well if you have multiple producers.
> One producer can be finished while the other is still putting values
> on the queue.
>
> The solution I have been thinking of is the following.
>
> Add an open and close operation. Only threads that have the queue
> open can access it. The open call should specify whether you
> want to read or write to the queue, or both. When all writers
> have closed the queue and the queue is empty, a q.get will
> raise an exception. This may be done by putting a sentinel
> on the queue when the last writer closed the queue.

This is beginning to look like a named pipe to me.

The nice thing about queues is that there is currently so little BS
about them - you just import the module, create one by binding a name
to it, and you are in business, and anyone can read and/or write to it.

If I were faced with the sort of thing addressed by this thread, I
would probably use some sort of timeout to decide when the end has
happened. After all - if the task is long running, it never stops
(hopefully), and if it's a batch-type job, it runs out of input and
stops putting stuff on the queue.

It means you have to use non-blocking gets and try - except, though.
But then - to use any of the methods put forward in this thread, you
have to use try - except anyway...

Why does this remind me of COBOL:

read input_file at end go to close_down ?

: - )

- Hendrik
--
http://mail.python.org/mailman/listinfo/python-list
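For what it's worth, Antoon's open/close idea can be sketched in a few lines on top of the standard Queue. This is only my own illustration of the quoted proposal, not an existing API - the class and method names (CloseableQueue, open_writer, close_writer) are invented, and I've arbitrarily chosen EOFError as the "closed and empty" exception:

```python
import queue
import threading

_SENTINEL = object()  # unique marker no producer would ever put on the queue

class CloseableQueue:
    """Sketch of a queue where the last writer to close wakes the readers."""

    def __init__(self):
        self._q = queue.Queue()
        self._lock = threading.Lock()
        self._writers = 0

    def open_writer(self):
        with self._lock:
            self._writers += 1

    def close_writer(self):
        with self._lock:
            self._writers -= 1
            if self._writers == 0:
                # last writer gone: put the sentinel so blocked readers wake up
                self._q.put(_SENTINEL)

    def put(self, item):
        self._q.put(item)

    def get(self):
        item = self._q.get()
        if item is _SENTINEL:
            # re-insert it so any other blocked readers also see end-of-stream
            self._q.put(_SENTINEL)
            raise EOFError("all writers closed and queue is empty")
        return item
```

A real version would also want to refuse put() after close, but this shows the core trick: the sentinel is only planted once the writer count drops to zero, so readers block normally while any producer is still alive.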
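The timeout approach I mean looks roughly like this - the consumer treats "nothing arrived for a while" as end-of-input. The 0.5 second grace period here is an arbitrary assumption, and for a long-running task you'd loop forever instead of returning:

```python
import queue

def drain(q, timeout=0.5):
    """Consume items until no producer has put anything for `timeout` seconds."""
    results = []
    while True:
        try:
            # a get with a timeout rather than a plain blocking get
            results.append(q.get(timeout=timeout))
        except queue.Empty:
            # silence for `timeout` seconds: assume the producers are finished
            break
    return results
```

As noted above, you end up with try - except either way; the only question is whether the exception you catch is queue.Empty on a timeout or some "queue closed" signal.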