You are assuming the savings are substantial, and that's not clear. Once
files are compressed, binary diffs between them won't consistently be
much smaller than the plain new files. It also isn't clear what the
impact on repo disk usage would be.
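To make that concrete, here is a toy sketch (Python; the payload is made
up, and zlib stands in for whatever compressor the package format uses).
A one-byte change in the input makes the compressed streams diverge from
roughly that point onward, so a byte-level delta between the two
compressed files ends up nearly as large as the file itself:

    import zlib

    # Two "versions" of a payload: v2 differs from v1 by one byte near the start.
    v1 = b"".join(b"line %05d: some repetitive package payload\n" % i
                  for i in range(5000))
    v2 = bytearray(v1)
    v2[100] = ord("X")
    v2 = bytes(v2)

    c1, c2 = zlib.compress(v1), zlib.compress(v2)

    def common_prefix(a, b):
        # Length of the shared run of leading bytes.
        n = 0
        for x, y in zip(a, b):
            if x != y:
                break
            n += 1
        return n

    print("raw: 1 of %d bytes differs" % len(v1))
    print("compressed: streams share only %d of ~%d bytes"
          % (common_prefix(c1, c2), len(c1)))

IIRC this is why tools like debdelta diff the uncompressed contents and
recompress on the client, trading CPU for bandwidth.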
May 28 11:28 xserver-xorg-video-nvidia_375.66-1_amd64.deb
-rw-r--r-- 1 root root 3101944 Jul 16 17:14 xserver-xorg-video-nvidia_375.66-2~deb9u1_amd64.deb
blacklab%
On Sun, Aug 13, 2017 at 12:43 PM, Julian Andres Klode wrote:
> On Sun, Aug 13, 2017 at 10:53:16AM -0400, Peter Silva wrote:
>>
Christian Seiler wrote:
> On 08/13/2017 07:11 PM, Peter Silva wrote:
>>> apt by default automatically deletes package files after a successful
>>> install,
>>
>> I don't think it does that.
>
> The "apt" command line tool doesn't, but traditi
Isn't there a more general issue here, that tar and compression happen
in the wrong order? Wouldn't it make more sense to build .gz.tar files
(i.e. compress the files individually, then index them via tar)? Then
tar works perfectly well for extracting individual files, as in the
sketch below.
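As a sketch of that layout (Python stdlib only; the paths and contents
are invented): gzip each member first, then store the results in a
plain, uncompressed tar. A reader can then pull out one member and
decompress just that one:

    import gzip, io, tarfile

    files = {
        "usr/bin/tool": b"pretend ELF contents",
        "usr/share/doc/tool/README": b"pretend documentation",
    }

    # Build the ".gz.tar": compress files individually, then tar them.
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name, data in files.items():
            gz = gzip.compress(data)
            info = tarfile.TarInfo(name + ".gz")
            info.size = len(gz)
            tar.addfile(info, io.BytesIO(gz))

    # Extract a single member without decompressing anything else.
    buf.seek(0)
    with tarfile.open(fileobj=buf, mode="r:") as tar:
        member = tar.extractfile("usr/share/doc/tool/README.gz")
        print(gzip.decompress(member.read()))

Two caveats, though: tar has no central index (a reader still walks the
member headers, it just gets to skip over the data), and per-file
compression gives up cross-file redundancy, so the archive comes out
larger than a .tar.gz of the same tree.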