New submission from Denis Dmitriev:
There's a bug in Python's zlibmodule.c that makes it raise SystemError whenever
it tries to decompress a chunk larger than 1GB in size. Here's an example of
this in action:
dmitr...@...:~/moo/zlib_bug> cat zlib_bug.py
import zlib

def test_zlib(size_mb):
    print "testing zlib with a %dMB object" % size_mb
    c = zlib.compressobj(1)
    sm = c.compress(' ' * (size_mb*1024*1024)) + c.flush()
    d = zlib.decompressobj()
    dm = d.decompress(sm) + d.flush()

test_zlib(1024)
test_zlib(1025)
dmitr...@...:~/moo/zlib_bug> python2.6 zlib_bug.py
testing zlib with a 1024MB object
testing zlib with a 1025MB object
Traceback (most recent call last):
File "zlib_bug.py", line 11, in
test_zlib(1025)
File "zlib_bug.py", line 8, in test_zlib
dm = d.decompress(sm) + d.flush()
SystemError: Objects/stringobject.c:4271: bad argument to internal function
dmitr...@...:~/moo/zlib_bug>
A similar issue was reported in issue1372; however, either this one is
different, or the issue still exists in all versions of Python 2.6 that I
tested, on both Solaris and Mac OS X. These are all 64-bit builds and have no
problem manipulating multi-GB structures, so it's not an out-of-memory
condition:
dmitr...@...:~/moo/zlib_bug> python2.6
Python 2.6.1 (r261:67515, Nov 18 2009, 12:21:47)
[GCC 4.3.3] on sunos5
Type "help", "copyright", "credits" or "license" for more information.
>>> len(' ' * (6000*1024*1024))
6291456000
>>>
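My guess (only a guess) is that zlibmodule.c ends up passing a bad, possibly
overflowed, size to the string resize in Objects/stringobject.c. Until that's
fixed, one way to sidestep the problem may be to cap each decompress step with
the optional max_length argument of Decompress.decompress(), so no single
output string has to grow past 1GB. A minimal sketch, following the documented
max_length/unconsumed_tail pattern; decompress_chunked and the 64MB cap are my
own names and choices, not anything from the stdlib:

import zlib

CHUNK = 64 * 1024 * 1024  # arbitrary 64MB cap on each output buffer

def decompress_chunked(data):
    # Workaround sketch, assuming the bug only bites when a single
    # output buffer approaches 1GB: ask for at most CHUNK bytes of
    # output per call; input that wasn't consumed yet is carried over
    # in d.unconsumed_tail.
    d = zlib.decompressobj()
    parts = []
    while data:
        parts.append(d.decompress(data, CHUNK))
        data = d.unconsumed_tail
    parts.append(d.flush())
    return ''.join(parts)

Each intermediate string stays bounded by CHUNK, at the cost of one final
join over the collected pieces.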
--
components: Library (Lib)
messages: 104522
nosy: ddmitriev
priority: normal
severity: normal
status: open
title: zlib causes a SystemError when decompressing a chunk >1GB
type: crash
versions: Python 2.6
___
Python tracker <http://bugs.python.org/issue8571>
___