Eryk Sun <eryk...@gmail.com> added the comment:

> In your code, huge data passed to .write(huge) may 
> remain in the internal buffer.

If you mean the buffered writer, then I don't see the problem. A large bytes 
object in pending_bytes gets temporarily referenced by 
_textiowrapper_writeflush(), and self->pending_bytes is cleared. If the 
buffer's write() method fails, then the bytes object is simply deallocated.

If you mean pending_bytes in the text wrapper, then I also don't see the 
problem. It always gets flushed once pending_bytes_count exceeds the chunk 
size. If pending_bytes is a list, its total size never exceeds the chunk size, 
because the list is pre-flushed before it could grow that large. If 
pending_bytes isn't a list, then it can only exceed the chunk size if it's a 
bytes object -- never an ASCII str object -- and 
_textiowrapper_writeflush() does not call PyBytes_AsStringAndSize() for that 
case.
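For illustration, here is a simplified Python model of the flush logic 
described above. This is a sketch, not the real implementation (which is C 
code in Modules/_io/textio.c); the names PendingWriter and CHUNK_SIZE are 
invented for this example:

```python
# Hypothetical, simplified Python model of the pending_bytes logic
# discussed above. The real implementation is C code in
# Modules/_io/textio.c; PendingWriter and CHUNK_SIZE are invented names.

CHUNK_SIZE = 8192  # stand-in for the text wrapper's write chunk size

class PendingWriter:
    def __init__(self, buffer):
        self.buffer = buffer        # underlying buffered writer
        self.pending = None         # None, a single object, or a list
        self.pending_count = 0      # total size of pending data

    def write(self, b):
        if self.pending is None:
            # A single object is kept as-is, even if it is huge; the
            # size check below flushes it immediately in that case.
            self.pending = b
            self.pending_count = len(b)
        else:
            if not isinstance(self.pending, list):
                self.pending = [self.pending]
            if self.pending_count + len(b) > CHUNK_SIZE:
                # Pre-flush so a list never grows beyond the chunk size.
                self.writeflush()
                self.pending = b
                self.pending_count = len(b)
            else:
                self.pending.append(b)
                self.pending_count += len(b)
        if self.pending_count >= CHUNK_SIZE:
            self.writeflush()

    def writeflush(self):
        if self.pending is None:
            return
        # Take a local reference and clear self.pending first, mirroring
        # _textiowrapper_writeflush(): if buffer.write() fails, the data
        # simply goes out of scope and is deallocated.
        pending, self.pending = self.pending, None
        self.pending_count = 0
        data = b"".join(pending) if isinstance(pending, list) else pending
        self.buffer.write(data)
```

Note how writeflush() clears the pending state before calling the buffer's 
write(), so a failed write leaves nothing behind to be retried or leaked.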

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue43260>
_______________________________________