New submission from igor voltaic <ibol...@gmail.com>:

MemoryError: null
...
  File "....", line 13, in repack__file
    shutil.unpack_archive(local_file_path, local_dir)
  File "python3.6/shutil.py", line 983, in unpack_archive
    func(filename, extract_dir, **kwargs)
  File "python3.6/shutil.py", line 901, in _unpack_zipfile
    data = zip.read(info.filename)
  File "python3.6/zipfile.py", line 1338, in read
    return fp.read()
  File "python3.6/zipfile.py", line 858, in read
    buf += self._read1(self.MAX_N)
  File "python3.6/zipfile.py", line 948, in _read1
    data = self._decompressor.decompress(data, n)
shutil.unpack_archive tries to read the whole file into memory, without using any buffering at all. Python crashes for really large files. In my case the archive is ~1.7G and unpacks to ~10G. Interestingly, zipfile.ZipFile.extractall handles this case more effectively.

----------
components: Library (Lib)
messages: 389652
nosy: igorvoltaic
priority: normal
severity: normal
status: open
title: MemoryError on zip.read in shutil._unpack_zipfile
type: crash
versions: Python 3.6, Python 3.7, Python 3.8, Python 3.9

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue43650>
_______________________________________
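
A minimal sketch of a possible workaround until shutil itself is fixed, assuming the archive is a zip file: call zipfile.ZipFile.extractall directly, which extracts member by member to disk instead of reading each member fully into memory, and fall back to shutil.unpack_archive for other formats. The function name unpack_archive_low_memory is made up for illustration; local_file_path and local_dir are the variable names from the traceback above.

    # Workaround sketch: prefer zipfile's streaming extraction for zip archives.
    import shutil
    import zipfile

    def unpack_archive_low_memory(local_file_path, local_dir):
        # is_zipfile() checks the file signature, not just the extension.
        if zipfile.is_zipfile(local_file_path):
            with zipfile.ZipFile(local_file_path) as zf:
                # extractall() writes each member to disk in chunks,
                # so peak memory stays small even for ~10G of content.
                zf.extractall(local_dir)
        else:
            # Non-zip formats keep going through shutil.
            shutil.unpack_archive(local_file_path, local_dir)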