Ludovic Courtès <l...@gnu.org> writes:

> Hi Guix!
>
Hi Ludo

> Quick decompression bench:

I guess this benchmark follows the distri talk, doesn't it? :)

File size with zstd vs zstd -9 vs current lzip:
- 71M uc.nar.lz
- 87M uc.nar.zst-9
- 97M uc.nar.zst-default
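To put the sizes above in relative terms, here is a quick sketch computing how much smaller the lzip archive is than each zstd variant (the dictionary keys are just labels for the files listed above):

```python
# Sizes in MB as reported above for the ungoogled-chromium nar.
sizes = {"lzip": 71, "zstd-9": 87, "zstd-default": 97}

def saving(base, other):
    """Fraction by which `other` is smaller than `base`."""
    return (sizes[base] - sizes[other]) / sizes[base]

print(f"lzip vs zstd -9:      {saving('zstd-9', 'lzip'):.1%}")
print(f"lzip vs default zstd: {saving('zstd-default', 'lzip'):.1%}")
```

So lzip is roughly 18% smaller than `zstd -9` and roughly 27% smaller than default zstd for this nar.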

> Where to go from here?  Several options:

>   1. Since ci.guix.gnu.org still provides both gzip and lzip archives,
>      ‘guix substitute’ could automatically pick one or the other
>      depending on the CPU and bandwidth.  Perhaps a simple trick would
>      be to check the user/wall-clock time ratio and switch to gzip for
>      subsequent downloads if that ratio is close to one.  How well would
>      that work?

I'm not sure using heuristics (i.e., guessing what should work better,
as in 1.) is the way to go: a temporary network or CPU slowdown during
the first download would skew the decision for all subsequent ones.
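For concreteness, the ratio trick from option 1 could look something like this. This is only a sketch; the function name and threshold are my assumptions, not anything in Guix:

```python
# Hypothetical sketch of option 1: prefer gzip when decompression is
# CPU-bound (user time close to wall-clock time), lzip otherwise.
# The 0.9 threshold is an arbitrary assumption.
def pick_compression(user_time, wall_time, threshold=0.9):
    """Return the format to prefer for subsequent downloads."""
    ratio = user_time / wall_time
    # A ratio near 1 means the CPU, not the network, was the
    # bottleneck, so the cheaper-to-decompress gzip should win.
    return "gzip" if ratio >= threshold else "lzip"
```

Note how fragile this is: any momentary network stall inflates the wall-clock time, lowers the ratio, and flips the choice, which is exactly the concern above.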

>   2. Use Zstd like all the cool kids since it seems to have a much
>      higher decompression speed: <https://facebook.github.io/zstd/>.
>      630 MB/s on ungoogled-chromium on my laptop.  Woow.

I know this means more work, but it seems to be the best alternative.
However, if we go that way, will we keep lzip substitutes?  The ~20%
size difference between lzip and zstd would matter a lot on slow
(mobile) network connections.

Nicolò
