wabu added the comment:

Sorry for the confusion: yes, I do use yield from. The stdout stream for the 
process is actually producing data as it should. The subprocess (pbzip2) 
produces a large amount of data, but it is only consumed slowly. 

Normally, when the buffer limit is reached for a stream reader, it calls 
pause_reading on the transport inside the feed_data method (see 
https://code.google.com/p/tulip/source/browse/asyncio/streams.py#365),
but here this is not happening, because the returned reader has no transport 
set (p.stdout._transport == None). So it fills up all the memory.
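
To illustrate the mechanism (a minimal sketch, not the real asyncio code; the class and attribute names below are simplified stand-ins): feed_data can only apply back-pressure if a transport is attached, so a reader with _transport == None grows its buffer without bound:

```python
class FakeTransport:
    """Stand-in transport that just records whether it was paused."""
    def __init__(self):
        self.paused = False

    def pause_reading(self):
        self.paused = True


class SketchReader:
    """Simplified model of StreamReader's flow-control path."""
    def __init__(self, limit=4):
        self._limit = limit
        self._buffer = bytearray()
        self._transport = None  # p.stdout._transport is None in the bug
        self._paused = False

    def set_transport(self, transport):
        self._transport = transport

    def feed_data(self, data):
        self._buffer.extend(data)
        # Back-pressure is only possible when a transport is attached.
        if (self._transport is not None and not self._paused
                and len(self._buffer) > self._limit):
            self._transport.pause_reading()
            self._paused = True


# With a transport, feeding past the limit pauses the producer:
r = SketchReader(limit=4)
t = FakeTransport()
r.set_transport(t)
r.feed_data(b"xxxxxx")
assert t.paused

# Without a transport (the reported situation), the buffer just keeps growing:
r2 = SketchReader(limit=4)
r2.feed_data(b"x" * 100)
assert r2._transport is None and len(r2._buffer) == 100
```

The fix would presumably be to attach the subprocess pipe's transport to the reader so the pause_reading path can fire.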

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue22685>
_______________________________________