Also, while there were other opportunities for making it more effective (3-byte
offsets, even shorter repeat codes), I chose to keep it as close to
Snappy as feasible so the same decoder works for both formats.
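For the curious, "close to Snappy" is concrete: a block is just a stream of
tagged elements, and a decoder dispatches on the low two bits of each tag
byte. A rough sketch of that walk (structure only - copying, bounds checks
and error handling omitted; this is not the actual s2 decoder):

package main

import (
	"bytes"
	"encoding/binary"
	"fmt"

	"github.com/golang/snappy"
)

// countElements walks the elements of a Snappy-format block (after the
// uvarint decoded-length preamble), counting literal and copy elements.
func countElements(src []byte) (lits, copies int) {
	for i := 0; i < len(src); {
		tag := src[i]
		switch tag & 0x03 {
		case 0x00: // literal: upper 6 bits hold length-1; 60..63 mean 1..4 extra length bytes
			n, extra := int(tag>>2), 0
			if n >= 60 {
				extra = n - 59
				n = 0
				for j := 0; j < extra; j++ {
					n |= int(src[i+1+j]) << (8 * j)
				}
			}
			i += 1 + extra + n + 1 // tag + extra length bytes + the literal bytes
			lits++
		case 0x01: // copy with 1-byte offset: 4..11 byte length, 11-bit offset
			i += 2
			copies++
		case 0x02: // copy with 2-byte offset
			i += 3
			copies++
		default: // 0x03: copy with 4-byte offset
			i += 5
			copies++
		}
	}
	return
}

func main() {
	block := snappy.Encode(nil, bytes.Repeat([]byte("hello snappy "), 200))
	_, n := binary.Uvarint(block) // skip the decoded-length preamble
	fmt.Println(countElements(block[n:]))
}

S2's additions (like the repeat codes mentioned above) fit into this same
element structure, which is what lets one decoder serve both formats.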
Re "better (compression)": Yeah, it does require to read the docs. Oh well.
/Klau
On Wed, Aug 28, 2019 at 7:11 PM Klaus Post wrote:
> TLDR; LZ4 is typically between the default and "better" mode of s2.
Nice!
Just a suggestion: rename "better" to either "betterSize / smaller"
(i.e. better compression ratio, worse throughput) or "betterSpeed /
faster", otherwise it's not immedi
On Mon, Aug 26, 2019 at 8:29 PM Klaus Post wrote:
> This package is aimed at replacing Snappy as a high speed compression
> package. If you are mainly looking for better compression zstandard gives
> better compression, but typically at speeds slightly below "better" mode in
> this package.
Do you have a comparison with LZ4?
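Even a crude harness would show the trade-off; something like this sketch
(it assumes s2.EncodeBetter and the zstd Encoder's EncodeAll from
github.com/klauspost/compress; an LZ4 column would need a third-party
package, so it's left out here):

package main

import (
	"fmt"
	"os"
	"time"

	"github.com/klauspost/compress/s2"
	"github.com/klauspost/compress/zstd"
)

func measure(name string, f func([]byte) []byte, src []byte) {
	start := time.Now()
	dst := f(src)
	fmt.Printf("%-10s %9d -> %9d bytes in %v\n", name, len(src), len(dst), time.Since(start))
}

func main() {
	src, err := os.ReadFile(os.Args[1])
	if err != nil {
		panic(err)
	}
	zenc, _ := zstd.NewWriter(nil) // nil writer: EncodeAll-only use
	measure("s2", func(b []byte) []byte { return s2.Encode(nil, b) }, src)
	measure("s2-better", func(b []byte) []byte { return s2.EncodeBetter(nil, b) }, src)
	measure("zstd", func(b []byte) []byte { return zenc.EncodeAll(b, nil) }, src)
}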
On Tue, Aug 27, 2019 at 7:25 PM Klaus Post wrote:
> Can the concurrent decompression run on a pure stream, or does it need to
> read the index first?
It cannot run on a pure stream. As the
https://github.com/google/wuffs/blob/master/doc/spec/rac-spec.md
Overview section says, an explicit non-goal is streaming: the decoder
needs to read the index before it can decode chunks independently.
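To sketch the shape of the idea (this is not RAC's actual wire format -
the spec above defines that - just an invented index to show why
concurrency needs one):

package sketch

import (
	"io"
	"sync"

	"github.com/klauspost/compress/s2"
)

// chunk describes one independently compressed block, as recovered from
// an index. The offsets/lengths here are illustrative, not RAC's.
type chunk struct {
	srcOff, srcLen int64 // compressed bytes in the file
	dstOff, dstLen int64 // where, and how big, the decoded bytes are
}

// decodeAll decompresses every chunk concurrently. This only works
// because the index is known up front; on a pure stream you would not
// know where chunk N+1 starts until chunk N had been parsed.
func decodeAll(ra io.ReaderAt, chunks []chunk, dstSize int64) ([]byte, error) {
	dst := make([]byte, dstSize)
	var wg sync.WaitGroup
	errc := make(chan error, len(chunks))
	for _, c := range chunks {
		wg.Add(1)
		go func(c chunk) {
			defer wg.Done()
			buf := make([]byte, c.srcLen)
			if _, err := ra.ReadAt(buf, c.srcOff); err != nil {
				errc <- err
				return
			}
			// Chunks share no state, so the goroutines never contend.
			if _, err := s2.Decode(dst[c.dstOff:c.dstOff+c.dstLen], buf); err != nil {
				errc <- err
			}
		}(c)
	}
	wg.Wait()
	close(errc)
	for err := range errc {
		return nil, err
	}
	return dst, nil
}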
Nice work, Klaus!
On Mon, Aug 26, 2019 at 8:29 PM Klaus Post wrote:
> Concurrent stream compression - several GB/sec.
> Faster decompression
A number of modern compression formats and implementations allow for
concurrent *compression*.
Coincidentally, I've been working on a format that allows concurrent
*decompression* as well.
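For the *compression* side the trick is only that blocks are
self-contained; roughly (a sketch of the general approach, not the
s2.Writer's actual code):

package sketch

import (
	"sync"

	"github.com/klauspost/compress/s2"
)

// compressBlocks compresses fixed-size blocks of src in parallel. Each
// block is self-contained, so throughput scales with cores; the cost is
// that matches cannot cross block boundaries. A real stream writer
// would add framing and checksums around each block.
func compressBlocks(src []byte, blockSize int) [][]byte {
	n := (len(src) + blockSize - 1) / blockSize
	out := make([][]byte, n)
	var wg sync.WaitGroup
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			end := (i + 1) * blockSize
			if end > len(src) {
				end = len(src)
			}
			out[i] = s2.Encode(nil, src[i*blockSize:end])
		}(i)
	}
	wg.Wait()
	return out // still in order, ready to be framed and written
}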
Hi!
S2 is an extension of Snappy. It can decode Snappy content, but S2 content
cannot be decoded by Snappy. S2 is aimed at high throughput, so it
features concurrent compression for bigger payloads (streams).
Benefits over Snappy:
- Better compression
- Concurrent stream compression - several GB/sec
- Faster decompression
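A quick taste of the stream API (a sketch - assuming the
NewWriter/NewReader interface from the README; options omitted):

package main

import (
	"bytes"
	"fmt"
	"io"
	"strings"

	"github.com/klauspost/compress/s2"
)

func main() {
	// Compress a stream. The writer splits larger payloads into
	// blocks and compresses them concurrently.
	var buf bytes.Buffer
	w := s2.NewWriter(&buf)
	if _, err := io.Copy(w, strings.NewReader(strings.Repeat("some payload ", 1000))); err != nil {
		panic(err)
	}
	if err := w.Close(); err != nil { // flush any remaining blocks
		panic(err)
	}

	// Decompress. The same reader also accepts Snappy content.
	dec, err := io.ReadAll(s2.NewReader(&buf))
	if err != nil {
		panic(err)
	}
	fmt.Println("decoded", len(dec), "bytes")
}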