$ cd <path_to_mounted_backup_partition>
$ for tar_archive in *.tar; do pixz "${tar_archive}"; done
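If you want a dry run first: as far as I remember, pixz removes the original .tar after a successful compression (like xz does), while -k keeps it and -p caps the number of threads. I haven't tested these exact flags on your setup, so check `man pixz' before letting it loose on all 350 archives:

$ # -k keeps the original .tar, -p 4 limits pixz to 4 threads (example values)
$ for tar_archive in *.tar; do pixz -k -p 4 "${tar_archive}"; done

Keeping the originals of course needs extra room on the backup drive, so you would probably only use -k for a test on one or two archives.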
-Ramon

[1] * https://www.zlib.net/pigz/
[2] * https://github.com/vasi/pixz
[3] * https://launchpad.net/pbzip2
    * http://compression.ca/pbzip2/
[4] * https://facebook.github.io/zstd/

On 26/09/2021 13:36, Simon Thelen wrote:
> [2021-09-26 11:57] Peter Humphrey <pe...@prh.myzen.co.uk>:
> > Hello list,
> Hi,
>
> > I have an external USB-3 drive with various system backups. There are
> > 350 .tar files (not .tar.gz etc.), amounting to 2.5TB. I was sure I
> > wouldn't need to compress them, so I didn't, but now I think I'm going
> > to have to. Is there a reasonably efficient way to do this? I have
> > 500GB spare space on /dev/sda, and the machine runs constantly.
>
> Pick your favorite of gzip, bzip2, xz or lzip (I recommend lzip) and
> then:
>
>   mount USB-3 /mnt; cd /mnt; lzip *
>
> The archiver you chose will compress the file and add the appropriate
> extension all on its own and tar will use that (and the file magic) to
> find the appropriate decompresser when you want to extract files later
> (you can use `tar tf' to test if you want).
>
> --
> Simon Thelen
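To illustrate Simon's `tar tf' test with a made-up archive name (backup.tar is just a placeholder; .lz is the suffix lzip adds):

$ lzip -k backup.tar      # writes backup.tar.lz; -k keeps the original
$ tar tf backup.tar.lz    # tar spots the lzip magic and lists the contents

If the listing looks right, the compressed copy is readable and the uncompressed one can go.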
--
GPG public key: 5983 98DA 5F4D A464 38FD CF87 155B E264 13E6 99BF