On 04 Nov 2021 13:15, Paul Eggert wrote:
> On 11/3/21 22:13, Mike Frysinger wrote:
> > with the rise of commodity multicore computing, tar feels a bit antiquated in
> > that it still defaults to single (de)compression.  it feels like the defaults
> > could & should be more intelligent.  has any thought been given to supporting
> > parallel (de)compression by default ?
>
> Not really. Some thought would be required, I assume. For example,
> parallelizing decompression might hurt performance, as 'tar' is
> typically I/O bound in that case. It'd be nice if someone could think
> this through and do some performance measures.
how are you defining "performance" ?  to me, the only metric that matters is
	$ time tar xf <archive>

we use parallel (de)compressors heavily in CrOS w/tar, and we show significant
improvements (especially with XZ) every time.  the only thing we debate is how
we can smooth out the CPU spike at the beginning as it overtakes tar's ability
(I/O wise) to keep up.
-mike
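[For readers following the thread: a minimal sketch of the approach being described, i.e. plugging a parallel (de)compressor into tar via its --use-compress-program (-I) option rather than relying on tar's single-threaded default. The paths and archive name here are made up for illustration; xz -T0 uses all available cores (threaded decompression additionally requires a new-enough xz), and pigz/pbzip2/zstd can be substituted.]

```shell
# Create some sample input (hypothetical paths, for demonstration only).
mkdir -p /tmp/ptar-demo /tmp/ptar-out
echo "hello" > /tmp/ptar-demo/file.txt

# Compress with a parallel compressor: tar pipes the archive stream
# through the command given to -I.  'xz -T0' spawns one worker per core.
tar -C /tmp -I 'xz -T0' -cf /tmp/demo.tar.xz ptar-demo

# Extract: when reading, GNU tar invokes the -I command with -d appended,
# so the same option works for decompression.
tar -C /tmp/ptar-out -I 'xz -T0' -xf /tmp/demo.tar.xz
```

The same pattern works with `pigz` for .tar.gz or `zstd -T0` for .tar.zst; tar itself stays unchanged and only the filter process is parallelized.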