Changing dpkg-deb default compression from gzip to lzma for Hardy
Phillip Susi
psusi at cfl.rr.com
Thu Dec 20 20:09:26 UTC 2007
Morten Kjeldgaard wrote:
> Let me add my 2 cents' worth. I don't know what algorithm is used by
> lzma, but I think there are factors other than CPU speed and size that
> matter. Namely, memory.
lzma IS the name of the algorithm (Lempel-Ziv-Markov chain algorithm),
and it is the method used by the 7-Zip program.
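
For anyone who wants to see the size difference for themselves, here is
a quick sketch using Python's standard gzip/bz2/lzma modules (a modern
convenience for illustration; dpkg itself uses the real command-line
tools, and the input is whatever file you point it at, e.g. the
data.tar from an unpacked .deb):

    import bz2, gzip, lzma, sys

    # Compare the three compressors at maximum level on one input file.
    data = open(sys.argv[1], "rb").read()

    for name, blob in (("gzip -9", gzip.compress(data, compresslevel=9)),
                       ("bzip2 -9", bz2.compress(data, compresslevel=9)),
                       ("lzma -9", lzma.compress(data, preset=9))):
        print("%-8s %10d bytes (%.1f%% of original)"
              % (name, len(blob), 100.0 * len(blob) / len(data)))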
> As an example, I can tell you that in the past we have experienced
> problems with the quite serious memory requirements of bunzip2. In
> several instances, we have seen bunzip2 fail for apparently mysterious
> reasons. Eventually, it turned out, the only way to solve the problem
> was to change the memory sticks on the motherboard. Even though
> memtest86+ did not reveal any problems with the RAM, bunzip2 seems to be
> extremely sensitive to (I think) how well the particular type of memory
> is supported by the motherboard. It is possibly some kind of timing issue.
Then the hardware was bad... nothing we can do about that. It is
unlikely, but possible, for a hardware defect to escape a lengthy
memtest86 run and still crop up under different memory loads.
> I want to emphasize that we never had problems running gunzip
> decompression even on the systems affected by the bunzip2 issue. As I
> said, I don't know the lzma algorithm at all, but I fear that in such an
> efficient compression procedure, there is a risk that similar problems
> could appear. Needless to say, failure to decompress packages properly
> could completely brick the system.
>
> The gzip algorithm may not be the most efficient of all, but it is
> extremely reliable, fast, and memory-efficient.
lzma does use more memory to decompress (and GOBS to compress), but I
don't think we should hold back progress to keep supporting ancient
machines with only 8 MB of RAM.
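
If decompression memory is the worry, note that the dictionary size
chosen at compression time is what sets the decompressor's footprint,
so whoever builds the packages can cap the RAM end-user machines will
need. A rough sketch with Python's lzma module (the dictionary sizes
here are illustrative, not what dpkg would actually pick):

    import lzma

    # Any large-ish compressible buffer shows the trade-off; real
    # package data shows it more dramatically than this toy payload.
    data = b"".join(("line %d of some config file\n" % i).encode()
                    for i in range(100000))

    for dict_size in (1 << 16, 1 << 20, 1 << 24):  # 64 KiB, 1 MiB, 16 MiB
        filters = [{"id": lzma.FILTER_LZMA2, "dict_size": dict_size}]
        out = lzma.compress(data, format=lzma.FORMAT_XZ, filters=filters)
        # Decompressing this stream needs roughly dict_size bytes of
        # RAM; compressing it needs several times more.
        print("dict %8d B -> %8d B compressed" % (dict_size, len(out)))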
> IMHO, the 10% gain on the size of an install CD is quickly eaten by
> new/expanded packages, and soon, the same problem/discussion will
> return. I think the effort is better spent in making bone-hard
> priorities on what goes on the CD and what remains available from the
> archives.
I think the number was more like 20%, but even if it were only 10%,
fitting 10% more on a CD is _extremely_ useful. You can prioritize all
you like, but at the end of the day, being able to fit x more packages
on the CD than you otherwise could makes a real difference, unless you
already have so many on there that the additional ones are so far down
the priority list as to be useless to most people.
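
To put rough numbers on it (purely back of the envelope; the average
package size here is an assumption, not a measurement):

    CD_MB = 700        # capacity of the install CD
    AVG_PKG_MB = 0.5   # assumed average compressed package size

    for savings in (0.10, 0.20):
        freed = CD_MB * savings
        print("%2.0f%% smaller debs frees ~%3.0f MB, roughly %d extra "
              "packages" % (savings * 100, freed, freed / AVG_PKG_MB))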
> And, perhaps, a special "try-me-out" CD edition could be designed, with
> samples of some of the latest and greatest software, but without some of
> the server tools and other stuff one would normally select for a running
> system.
The desktop CD doesn't come with any server tools.