Xiang Zhang added the comment:

Quick and careless scanning at night led me to a wrong result; sorry.

> You would need to compress just under 4 GiB of data that requires 5 MB or 
> more when compressed (i.e. not all the same bytes, or maybe try level=0).

With enough memory, compressing with level 0 does raise an error, while the
default level does not.
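
A minimal sketch of that check, assuming enough free memory; the all-zero
input is just a convenient way to allocate the data, and the exact exception
depends on the interpreter build:

    import zlib

    # With level=0 (stored blocks, no compression) the output is slightly
    # larger than the input, so it crosses the 4 GiB boundary that the
    # default level stays well under for such repetitive input.
    data = b'\x00' * (2**32 - 1)   # just under 4 GiB; needs ~4 GiB of RAM
    zlib.compress(data, 0)         # raises an error; zlib.compress(data) does not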

Apart from the overflow fix, does zlib have to support large data in a single
operation? For example, it seems acceptable that zlib.compress does not support
data beyond 4 GiB, since the application can split the data, compress each part,
and finally concatenate the results (see the sketch below).
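
A minimal sketch of such application-side splitting; the compress_large helper
and the 1 GiB chunk size are hypothetical, and zlib.compressobj is used instead
of repeated zlib.compress calls so the chunks form a single decompressible
stream rather than several concatenated ones:

    import zlib

    CHUNK = 1 << 30  # 1 GiB per call -- arbitrary, comfortably below 4 GiB

    def compress_large(data, level=-1):
        """Compress data of any size by feeding zlib sub-4GiB chunks."""
        co = zlib.compressobj(level)
        parts = [co.compress(data[i:i + CHUNK])
                 for i in range(0, len(data), CHUNK)]
        parts.append(co.flush())
        return b''.join(parts)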

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue27130>
_______________________________________