Hi,

>>"Peter" == Peter Galbraith <[EMAIL PROTECTED]> writes:

 >> #!/usr/bin/perl -pi
 >>
 >> # href2gz : Replace HREF tags to point to compressed HTML files
 >>
 >> # This script runs on all original .html files.

 Peter> s/((HREF|SRC)=\"[^\"]+)\.htm[l]?/$1.html.gz/gi;

	Unfortunately, this too is not good enough, since it also
 munges any non-local links the docs may have (like a link to the
 canonical home page).

	The reason we shot down compressing HTML before was precisely
 this: it is not as easy as it seems, and most of the proposed methods
 reduce the capability of the system.

	We should ask why we need to do this at all. The obvious
 reason is that HTML docs may be large, and there may be disk space
 issues. A solution would be to require that documentation be split
 into a separate package if it is too big (I think policy already
 mentions this?), and I think we can decide for ourselves what "too
 big" is, so that people do not have to install documentation when
 they are short of space.

	manoj
-- 
 "Die? I should say not, dear fellow. No Barrymore would allow such a
 conventional thing to happen to him."  John Barrymore's dying words
Manoj Srivastava  <[EMAIL PROTECTED]> <http://www.debian.org/%7Esrivasta/>
Key C7261095 fingerprint = CB D9 F4 12 68 07 E4 05 CC 2D 27 12 1D F5 E8 6E
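For illustration, a sketch of a less destructive rewrite (the file name and markup below are invented, not from the thread): a negative lookahead skips any link that begins with a URL scheme (http:, ftp:, mailto:, ...), so only relative, local links get pointed at the compressed files. This is only a sketch, not a complete fix; it still does not handle fragments or protocol-relative links.

```shell
# Hypothetical test page: two local links, one non-local link.
cat > page.html <<'EOF'
<A HREF="manual.html">local link</A>
<A HREF="other.htm">another local link</A>
<A HREF="http://www.debian.org/index.html">canonical home page</A>
EOF

# Like Peter's one-liner, but (?![A-Za-z][A-Za-z0-9+.-]*:) refuses to
# rewrite any HREF/SRC value that starts with a URL scheme.
perl -i -pe 's{((?:HREF|SRC)=")(?![A-Za-z][A-Za-z0-9+.-]*:)([^"]+)\.html?(")}{$1$2.html.gz$3}gi' page.html

cat page.html
```

The local links become manual.html.gz and other.html.gz, while the www.debian.org link is left untouched.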