…the GNU project.
-See http://www.cdrom.com/pub/infozip/ for more about zip and unzip.
+See http://www.info-zip.org/ for more about zip and unzip.
========
Thanks,
--
Greg Roelofs n...@pobox.com http://pobox.com/~newt/
Newtware, PNG Group, AlphaWorld Map, Yahoo! Grid/Hadoop, ...
Hi Paul,
>> Are there any plans to address this?
> Not until you mentioned it, but I just now installed a patch for this;
> please see the end of this message.
Awesome! Many thanks.
> Can you please help out by supplying some test cases?
I can certainly provide one, currently part of a not-qu
Paul Eggert wrote:
>> http://gregroelofs.com/test/testCompressThenConcat.txt.gz
> Thanks, I've verified that the new code works with that example.
> It's a bit much to turn that into a test case. Perhaps if I
> find time I'll write a smaller one.
I just attached to https://issues.apache.or
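For readers without access to that file: its name suggests a .gz built by compressing pieces separately and then concatenating the results, i.e. a multi-member gzip file. Below is a minimal sketch of constructing and checking such a file; the real contents of testCompressThenConcat.txt.gz aren't reproduced here, and the file name used is made up.

    # Build a .gz containing two independently compressed members
    # (compress, then concatenate), and check that decompression
    # returns the concatenation of both plaintexts.
    import gzip

    part1 = b"first member\n" * 1000
    part2 = b"second member\n" * 1000

    # Compress each piece independently, then concatenate the results.
    with open("concat-test.gz", "wb") as out:
        out.write(gzip.compress(part1))
        out.write(gzip.compress(part2))

    # A conforming decompressor (like "gzip -d") yields both pieces.
    with open("concat-test.gz", "rb") as f:
        assert gzip.decompress(f.read()) == part1 + part2

Per the gzip documentation, the concatenation of valid .gz files is itself a valid .gz file whose decompressed output is the concatenation of the members, which is what the round trip above checks.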
Paul Eggert wrote:
>> - algorithm.doc needs updating:
> Thanks, I updated that too (see 2nd patch below).
Thanks, Paul. You might want to rework the multi-part section a bit more,
though; with the loss of the flag bit, gzip is now entirely dependent on
external split/combine utilities (not th
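For context, the original gzip format reserved a header flag bit for multi-part (continuation) members; with that gone, splitting a large .gz across volumes and reassembling it is a plain byte-level job for outside tools. A small sketch of that split/recombine round trip, with made-up file names and chunk size:

    # Byte-level split/recombine of a .gz file, the kind of job the
    # thread says now falls to external utilities rather than to a
    # multi-part flag inside gzip itself.
    import gzip

    original = b"some payload\n" * 100000
    blob = gzip.compress(original)

    chunk = 64 * 1024                       # illustrative piece size
    pieces = [blob[i:i + chunk] for i in range(0, len(blob), chunk)]
    for n, piece in enumerate(pieces):
        with open(f"archive.gz.{n:03d}", "wb") as f:
            f.write(piece)

    # Recombining is plain concatenation; the result is the original .gz.
    rejoined = b""
    for n in range(len(pieces)):
        with open(f"archive.gz.{n:03d}", "rb") as f:
            rejoined += f.read()

    assert rejoined == blob
    assert gzip.decompress(rejoined) == original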
[Cc'ing bug-gzip, as requested.]
Mark Adler wrote:
> On Aug 15, 2010, at 5:25 PM, Paul Eggert wrote:
> > * If the file size is 2**32 or larger, gzip should emit an extra field
> > that records the size divided by 2**32 (discarding fractions). gzip
> > -l should read this field when reporting t
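That is only a proposal in this thread, not a defined part of the gzip format; the subfield ID and layout below are invented purely to illustrate how an RFC 1952 FEXTRA subfield could carry the high-order bits that the 32-bit ISIZE trailer cannot.

    # Hypothetical "high 32 bits of the uncompressed size" extra field.
    # RFC 1952 FEXTRA subfields are: SI1 SI2, 2-byte little-endian LEN,
    # then LEN bytes of data.  The "SZ" ID is made up for this sketch.
    import struct

    def make_size_subfield(uncompressed_size):
        hi = uncompressed_size >> 32              # size divided by 2**32
        return b"SZ" + struct.pack("<H", 4) + struct.pack("<I", hi)

    def full_size(isize, hi_payload):
        # ISIZE in the trailer is the size modulo 2**32; the subfield
        # restores the missing high-order bits.
        (hi,) = struct.unpack("<I", hi_payload)
        return (hi << 32) | isize

    size = 5 * 2**32 + 12345                      # a file larger than 4 GiB
    subfield = make_size_subfield(size)
    assert full_size(size % 2**32, subfield[4:]) == size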
Mark wrote:
> Eventually new versions of gzip that don't issue that warning in
> that case would replace the old ones. We could stick in a small
> extra field that simply indicates that those extra four bytes
> are there. (We'd have to always accept that hit in the streaming
> case, since you wo
> On Aug 19, 2010, at 2:54 PM, Paul Eggert wrote:
>> In that case I'm afraid that we need to give up on the goal of always
>> providing a correct uncompressed length. At this point the gzip
>> format is so widely used that an incompatible change to it would cause
>> far more trouble than the rela
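The constraint behind this whole exchange is that the trailer's ISIZE field is only 32 bits, so it records the uncompressed length modulo 2**32; that is why gzip -l can misreport files of 4 GiB or more, and why any fix has to be layered on top of the existing format rather than changing it incompatibly. A small sketch showing where that modulo value lives (single-member file; the payload size here is just a stand-in):

    # The trailer's ISIZE field (last 4 bytes of a single-member .gz)
    # holds the uncompressed length modulo 2**32, per RFC 1952.
    import gzip, struct

    payload = b"x" * 10_000_000        # stand-in; imagine >= 4 GiB here
    blob = gzip.compress(payload)

    (isize,) = struct.unpack("<I", blob[-4:])
    assert isize == len(payload) % 2**32
    # For a 5 GiB input, ISIZE would come out as 1 GiB (5*2**30 mod 2**32),
    # so a length report based on ISIZE alone would be off by 4 GiB.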