Changes by Klamann:
--
nosy: +Klamann
___
Python tracker
<http://bugs.python.org/issue29842>
___
Klamann added the comment:
Thanks for pointing this out.
*closed*
--
resolution: -> duplicate
stage: -> resolved
status: open -> closed
___
Klamann added the comment:
Yes, I was wrong in my assumption that simply replacing the list comprehension
with a generator expression would fix the issue.
Nevertheless, there is no need to load the *entire* generator into memory by
converting it to a list. All we have to read are the first n items.
New submission from Klamann:
The Executor's map() function accepts a function and an iterable that holds
the arguments for each call that should be made. This iterable could be a
generator, and as such it could reference data that won't fit into memory.
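The concern above is that map() submits one task per element up front, which materializes the whole iterable. A minimal sketch of a bounded-prefetch alternative; the helper name `lazy_map` and the `prefetch` default are my own illustration, not part of `concurrent.futures`:

```python
import collections
import itertools
from concurrent.futures import ThreadPoolExecutor

def lazy_map(executor, fn, iterable, prefetch=4):
    # Hypothetical helper: keep at most `prefetch` tasks in flight
    # instead of submitting one task per element immediately, so a
    # huge generator is never fully materialized in memory.
    iterator = iter(iterable)
    pending = collections.deque(
        executor.submit(fn, arg)
        for arg in itertools.islice(iterator, prefetch)
    )
    while pending:
        # Yield results in submission order, refilling one task
        # from the iterator for each result consumed.
        yield pending.popleft().result()
        for arg in itertools.islice(iterator, 1):
            pending.append(executor.submit(fn, arg))

with ThreadPoolExecutor(max_workers=2) as ex:
    results = list(lazy_map(ex, lambda x: x * x, range(10)))
```

Only `prefetch` futures ever exist at once, so memory use is bounded regardless of how long the input generator is.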
Klamann added the comment:
Thanks Xiang and Martin for solving this, you guys are awesome :)
--
___
Python tracker
<http://bugs.python.org/issue27130>
___
___
Klamann added the comment:
> You should be able to use a compression (or decompression) object as a
> workaround.
OK, let's see
>>> import zlib
>>> zc = zlib.compressobj()
>>> c1 = zc.compress(b'a' * 2**31)
>>> c2 = zc.compress(b'a' * 2**31)
>>> tail = zc.flush()
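Fleshing out the quoted workaround as a reusable sketch: feed the data to a compression object in pieces smaller than the 32-bit limit, then flush. The helper name and chunk size here are illustrative, not from the thread:

```python
import zlib

def compress_large(data, chunk_size=2**30):
    # Work around the 32-bit length limit in zlib.compress() by
    # feeding the input through a compression object in chunks,
    # so no single zlib call sees more than chunk_size bytes.
    zc = zlib.compressobj()
    parts = [zc.compress(data[i:i + chunk_size])
             for i in range(0, len(data), chunk_size)]
    parts.append(zc.flush())  # emit any buffered output
    return b''.join(parts)

# Round-trip check on a small buffer (a multi-gigabyte test would
# need matching amounts of RAM):
payload = b'a' * 100_000
blob = compress_large(payload, chunk_size=4096)
assert zlib.decompress(blob) == payload
```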
Klamann added the comment:
> But you can only get that feature with Python3.5+.
Well, I have Python 3.5.1 installed and the problem still persists. I'm not
sure that issue 25626 is the same problem - in the comments they say this was
not an issue in Python 3.4 or 2.x, but this is clearly
New submission from Klamann:
zlib fails to compress inputs larger than 4 GB due to 32-bit length limits.
I've tested this in Python 3.4.3 and 3.5.1:
> python3 -c "import zlib; zlib.compress(b'a' * (2**32 - 1))"
> python3 -c "import zlib; zlib.compress(b'a
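The same chunking idea suggested later in the thread also works on the decompression side via `zlib.decompressobj()`; a sketch, with the helper name and chunk size as my own illustration:

```python
import zlib

def decompress_stream(blob, chunk_size=2**20):
    # Feed the compressed bytes to a decompression object in chunks
    # so no single zlib call receives more than chunk_size bytes.
    zd = zlib.decompressobj()
    parts = [zd.decompress(blob[i:i + chunk_size])
             for i in range(0, len(blob), chunk_size)]
    parts.append(zd.flush())  # return any remaining buffered output
    return b''.join(parts)

# Round-trip with a small payload to keep the example runnable:
payload = b'abc' * 50_000
assert decompress_stream(zlib.compress(payload), chunk_size=4096) == payload
```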