Let me add my two cents' worth. I don't know what algorithm lzma uses, but I think there are factors other than CPU speed and compressed size that matter, namely memory.

As an example, I can tell you that in the past we have run into problems with the rather serious memory requirements of bunzip2. In several instances we saw bunzip2 fail for apparently mysterious reasons, and in the end the only way to solve the problem was to replace the memory sticks on the motherboard. Even though memtest86+ did not reveal any problems with the RAM, bunzip2 seems to be extremely sensitive to (I think) how well the particular type of memory is supported by the motherboard. It is possibly some kind of timing issue.

I want to emphasize that we never had problems with gunzip decompression, even on the systems affected by the bunzip2 issue. As I said, I don't know the lzma algorithm at all, but I fear that with such an aggressive compression scheme there is a risk that similar problems could appear. Needless to say, failure to decompress packages properly could completely brick a system.

The gzip algorithm may not be the most efficient of all, but it is extremely reliable, fast, and memory-efficient.
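(For anyone who wants to check the memory claim themselves: a quick and admittedly rough way is to stream the same archive through the gzip, bz2 and lzma modules of the Python standard library and look at the process's peak RSS afterwards. The sketch below is only an illustration under my own assumptions; the script name and structure are mine, it is not part of any of the tools discussed, and for comparable numbers you would want to run each format in a separate process.)

#!/usr/bin/env python3
# memcheck.py -- hypothetical helper, not part of gzip/bzip2/lzma itself.
# Streams a .gz/.bz2/.xz file through the matching stdlib module and prints
# the process's peak resident memory afterwards.
import bz2
import gzip
import lzma
import resource
import sys

OPENERS = {".gz": gzip.open, ".bz2": bz2.open, ".xz": lzma.open}

def decompress(path):
    # Pick the opener from the file suffix and read the stream in 1 MiB
    # chunks, discarding the output; we only care about memory use here.
    for suffix, opener in OPENERS.items():
        if path.endswith(suffix):
            with opener(path, "rb") as stream:
                while stream.read(1 << 20):
                    pass
            return
    raise SystemExit("unsupported suffix: %s" % path)

if __name__ == "__main__":
    decompress(sys.argv[1])
    # ru_maxrss is reported in kilobytes on Linux
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print("peak RSS while decompressing %s: %d kB" % (sys.argv[1], peak))

Run it as, for example, "python3 memcheck.py foo.tar.gz" and again on the .bz2 and .xz versions of the same tarball. As I understand it, gzip only needs a small sliding window to decompress, whereas bzip2 and lzma keep much larger working buffers, and that is where I would expect the difference to show up.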

IMHO, a 10% gain in the size of an install CD is quickly eaten up by new or expanded packages, and the same problem and discussion will soon return. I think the effort is better spent on setting rock-hard priorities for what goes on the CD and what remains available from the archives.

And perhaps a special "try-me-out" CD edition could be designed, with samples of some of the latest and greatest software, but without the server tools and other packages one would normally only select for a running system.

Cheers,
Morten


