On 2017-09-09 22:23, iurly wrote:
Hi,
I'm writing a multiprocessing program whose behavior I don't understand.
Essentially, the main process collects data and then passes it to a consumer
process.
For performance reasons I'm using a "static" circular buffer created through
array.array(), and then passing it "as-is" by pushing it onto a queue.
According to:
https://docs.python.org/3/library/multiprocessing.html#pipes-and-queues
I would expect the array to be pickled by the sending process and then
unpickled at the other end (i.e. no memory would be shared among the two
processes).
Thus, overwriting data on the buffer should be safe in my understanding.
What happens instead is that the consumer process may indeed receive a
corrupted array, in that some elements have already been overwritten by the
producer.
I worked around this by passing a copy.copy() of the buffer instead, but I
really don't understand why that should be necessary at all.
Could someone please shed some light on this?
Thank you!
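A minimal illustration of the copy.copy() workaround mentioned above (the
variable names are illustrative, not from the original code): copying an
array.array gives the queue an object with its own independent storage, so
the producer can keep writing into the original ring buffer.

```python
import array
import copy

# "Static" circular buffer owned by the producer.
buf = array.array('d', [1.0, 2.0, 3.0])

# Snapshot to hand to the consumer; shares no storage with buf.
safe = copy.copy(buf)

# Producer keeps overwriting the ring buffer afterwards...
buf[0] = 99.0

# ...but the copy is unaffected.
print(safe[0])  # 1.0
print(buf[0])   # 99.0
```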
[snip]
I suspect it's down to timing.
What you're putting into the queue is only a reference to the array; the
array itself is pickled and sent some time later, in the background, by the
queue's feeder thread.
Modifying the array before (or while) it's actually being serialized would
explain the corruption you're seeing.
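The mechanism can be shown deterministically without multiprocessing at all:
the sketch below uses a plain thread and two events to stand in for the
queue's feeder thread, forcing the "producer" to overwrite the buffer in the
window between put() returning and the object being pickled. (The thread and
event names are illustrative; a real multiprocessing.Queue gives you no such
hooks, which is exactly why the race is intermittent.)

```python
import array
import pickle
import threading

buf = array.array('d', [0.0, 1.0, 2.0, 3.0])

pickled = {}
started = threading.Event()
may_pickle = threading.Event()

def feeder():
    # Stands in for multiprocessing.Queue's background feeder thread,
    # which serializes the object some time *after* put() has returned.
    started.set()
    may_pickle.wait()                  # simulated scheduling delay
    pickled['data'] = pickle.dumps(buf)

t = threading.Thread(target=feeder)
t.start()
started.wait()

# Producer overwrites the buffer before the "feeder" gets to pickle it...
buf[0] = 999.0
may_pickle.set()
t.join()

# ...so the overwrite leaks into what the consumer would receive.
received = pickle.loads(pickled['data'])
print(received[0])  # 999.0
```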
--
https://mail.python.org/mailman/listinfo/python-list