Martin Panter added the comment:

For what it’s worth, it would be better if compressed streams did limit the 
amount of data they decompressed, so that they are not susceptible to 
decompression bombs; see Issue 15955. But having a flexible-sized buffer could 
be useful in other cases.
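
As a rough illustration of the kind of limit I mean, zlib already lets you cap the output of a single decompress call via its max_length argument; the helper name and the error handling here are just mine, not anything from the patch:

    import zlib

    def bounded_decompress(data, limit):
        # Produce at most `limit` bytes of output, so a tiny compressed
        # payload cannot expand into an enormous in-memory buffer.
        d = zlib.decompressobj()
        out = d.decompress(data, limit)
        if d.unconsumed_tail:
            raise ValueError("output would exceed %d bytes" % limit)
        return out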

I haven’t looked closely at the code, but I wonder whether there is much 
difference from the existing BufferedReader. Perhaps the only difference is 
that the underlying raw stream in this case can deliver data in 
arbitrary-sized chunks, whereas BufferedReader expects its raw stream to 
deliver data in limited-sized chunks?
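
To spell out what I mean by that contract: readinto() on the raw stream must not return more bytes than the caller asked for, so something like this hypothetical adapter would be needed to put an arbitrary-chunk source (a decompressor, say) behind the existing BufferedReader. The class name and the source.read() interface are assumptions for the sketch:

    import io

    class ChunkSourceAdapter(io.RawIOBase):
        # Hypothetical adapter: holds back excess bytes so that a source
        # returning arbitrary-sized chunks satisfies BufferedReader's
        # limited-size readinto() contract.

        def __init__(self, source):
            self._source = source      # assumed to have a read() method
            self._leftover = b""

        def readable(self):
            return True

        def readinto(self, b):
            if not self._leftover:
                self._leftover = self._source.read()  # any size; b"" at EOF
            n = min(len(b), len(self._leftover))
            b[:n] = self._leftover[:n]
            self._leftover = self._leftover[n:]
            return n

A flexible-sized buffer would avoid this kind of shim, since the buffer itself could absorb whatever chunk size the source produces.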

If you exposed the buffer, it could be used to do many things more efficiently 
(see the sketch after this list):

* readline() with custom newline or end-of-record codes, solving Issue 1152248, 
Issue 17083
* scan the buffer using string operations, regular expressions, etc., e.g. to 
skip whitespace or read a run of unescaped symbols
* tentatively read data to see if a keyword is present, but roll back if the 
data doesn’t match the keyword
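
As a rough illustration of the last two points, peek() on the current BufferedReader already allows a limited version of this; the helper names and the NUL delimiter below are just placeholders:

    import io

    def read_record(buffered, delim=b"\x00"):
        # Hypothetical helper: consume up to and including a custom record
        # delimiter, assuming the record fits in one buffered chunk.
        chunk = buffered.peek()
        end = chunk.find(delim)
        if end >= 0:
            return buffered.read(end + len(delim))
        return buffered.read(len(chunk))   # no delimiter seen yet

    def starts_with(buffered, keyword):
        # Tentative read: peek() does not advance the stream, so "rolling
        # back" is simply a matter of not calling read().  Caveat: peek()
        # may return fewer bytes than asked if little is buffered.
        return buffered.peek(len(keyword))[:len(keyword)] == keyword

    reader = io.BufferedReader(io.BytesIO(b"abc\x00def\x00"))
    print(read_record(reader))             # b'abc\x00'
    print(starts_with(reader, b"def"))     # True

With direct access to the buffer these could avoid the copy that peek() implies and would not be limited to what happens to be buffered.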

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue19051>
_______________________________________