[issue8571] zlib causes a SystemError when decompressing a chunk >1GB

2010-04-29 Thread Denis Dmitriev

New submission from Denis Dmitriev :

There's a bug in Python's zlibmodule.c that makes it raise SystemError whenever 
it tries to decompress a chunk larger than 1GB in size. Here's an example of 
this in action:

dmitr...@...:~/moo/zlib_bug> cat zlib_bug.py 
import zlib

def test_zlib(size_mb):
    print "testing zlib with a %dMB object" % size_mb
    c = zlib.compressobj(1)
    sm = c.compress(' ' * (size_mb*1024*1024)) + c.flush()
    d = zlib.decompressobj()
    dm = d.decompress(sm) + d.flush()

test_zlib(1024)
test_zlib(1025)

dmitr...@...:~/moo/zlib_bug> python2.6 zlib_bug.py 
testing zlib with a 1024MB object
testing zlib with a 1025MB object
Traceback (most recent call last):
  File "zlib_bug.py", line 11, in <module>
    test_zlib(1025)
  File "zlib_bug.py", line 8, in test_zlib
    dm = d.decompress(sm) + d.flush()
SystemError: Objects/stringobject.c:4271: bad argument to internal function

dmitr...@...:~/moo/zlib_bug>

A similar issue was reported in issue1372; however, either this one is 
different, or the issue still exists in all versions of Python 2.6 that I 
tested, on both Solaris and Mac OS X. These are all 64-bit builds and have no 
problem manipulating multi-GB structures, so it's not an out-of-memory 
condition:

dmitr...@...:~/moo/zlib_bug> python2.6   
Python 2.6.1 (r261:67515, Nov 18 2009, 12:21:47) 
[GCC 4.3.3] on sunos5
Type "help", "copyright", "credits" or "license" for more information.
>>> len(' ' * (6000*1024*1024))
6291456000
>>>

--
components: Library (Lib)
messages: 104522
nosy: ddmitriev
priority: normal
severity: normal
status: open
title: zlib causes a SystemError when decompressing a chunk >1GB
type: crash
versions: Python 2.6

___
Python tracker 
<http://bugs.python.org/issue8571>
___
___
Python-bugs-list mailing list
Unsubscribe: 
http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue8571] zlib causes a SystemError when decompressing a chunk >1GB

2010-04-29 Thread Denis Dmitriev

Denis Dmitriev  added the comment:

Alright, I think this is caused by the following:

static PyObject *
PyZlib_objdecompress(compobject *self, PyObject *args)
{
    int err, inplen, old_length, length = DEFAULTALLOC;
    int max_length = 0;

The problem is that inplen, length, old_length, and max_length are all declared 
as int, whereas they should be Py_ssize_t. I think changing them to Py_ssize_t 
should make the bug go away. (I can't test it right now, though, because I'm 
having trouble compiling zlibmodule on my current machine.)
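For what it's worth, the 1GB threshold is consistent with the output buffer's 
doubling growth overflowing a 32-bit signed int. A minimal sketch of the 
wraparound (the actual DEFAULTALLOC value is an assumption here; any small 
power of two shows the same behavior):

```python
import ctypes

# Simulate the C-level "length <<= 1" buffer growth using 32-bit
# signed arithmetic.  DEFAULTALLOC is assumed to be 16 KB for the
# sake of illustration.
length = 16 * 1024
needed = 1025 * 1024 * 1024          # the failing test case above
while 0 < length < needed:
    # doubling past 2**31 - 1 wraps negative in a 32-bit int
    length = ctypes.c_int(length << 1).value

print(length)  # the doubled length has wrapped negative
```

Once length goes negative, the subsequent string resize is handed a negative 
size, which would match the "bad argument to internal function" SystemError 
in the traceback.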

--




[issue8571] zlib causes a SystemError when decompressing a chunk >1GB

2010-05-07 Thread Denis Dmitriev

Denis Dmitriev  added the comment:

Is there a reason to keep inplen and max_length ints instead of making them 
Py_ssize_t too? I'm a little worried that keeping them ints will cause a 
similar problem further down the line.
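To illustrate the worry: if inplen stays an int, an input buffer longer than 
2**32 bytes would have its length silently truncated rather than rejected. A 
quick demonstration of that truncation, using the 6GB length from the 
interpreter session above (ctypes stands in for the C-level cast):

```python
import ctypes

# A buffer length over 4 GB, as seen through a 32-bit signed int:
# the value is silently truncated, not rejected.
real_len = 6000 * 1024 * 1024        # 6291456000 bytes
as_int = ctypes.c_int(real_len).value

print(real_len, as_int)  # 6291456000 1996488704
```

So a too-long input wouldn't necessarily fail loudly; it could quietly process 
the wrong number of bytes instead.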

--
