Compression on CDs -- an amusing topic for a Saturday
pvanderdoes at gmail.com
Sun Oct 28 01:58:58 UTC 2007
On Sat, 27 Oct 2007 15:29:07 -0500
Loye Young <loye.young at iycc.net> wrote:
> > Your analogy with meteorites is not correct.
> > --
> > Peter van der Does
> You are so close to the trees that you are missing the forest. The
> comparison is between a full CD that has low or no compression versus
> a full CD that has very high compression being written using the same
> physical device. My original observation was that there is a tradeoff
> between getting more on the CD via higher compression on the one
> hand, versus getting lower failure rates on CDs with lower
> compression. Such is in accord with observations and with theoretical
> expectations.
> When speaking of compression, we aren't actually speaking about disk
> surface that doesn't have data written to it. Instead we are really
> referring to whole sections of disk real estate that are filled with
> zeros or other repetitive and unimportant data. If you need to make
> better use of areas that are literally unpopulated, you employ
> defragmentation, which is a related but different technique needed
> for antiquated file systems.
> On a typical CD, the entire CD is populated, as Soren rightly
> mentioned, with zeros and ones. The physical device makes the same
> number of read/writes whether the data is compressed or not, and the
> error rate is the same either way. On the surface, it would appear
> that compression doesn't introduce significantly more errors.
> The difference is that on an uncompressed CD, much of what is written
> is not important. Text files, for instance, are mostly a bunch of
> zeros at the physical layer. Compression uses algorithms to
> represent all those zeros in a shorthand way, so that the device
> doesn't actually have to write each one of them. This frees up disk
> real estate for more information. The consequence is that the
> compressed disk has a higher density of important bits and bytes on
> the same disk.
> Assuming that the device has a constant error rate and assuming that
> the CD is filled to the same capacity, it is more likely that the
> errors on compressed disks will affect something important and cause
> a failure, simply because there is more important data on the CD.
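The argument above can be put in numbers. A back-of-envelope sketch (every figure here is an assumption chosen for illustration, not a measured rate):

```python
# Assumed, illustrative numbers: a fixed per-bit error rate and a
# ~700 MB CD. The expected number of raw errors is the same for a
# compressed and an uncompressed disc; what differs is how much of
# the disc holds meaningful (non-padding) data.
error_rate = 1e-12             # assumed errors per bit written
disc_bits = 700 * 8 * 10**6    # ~700 MB disc

expected_errors = error_rate * disc_bits   # identical for both discs

# Say only 40% of an uncompressed disc carries unique information,
# versus nearly 100% of a compressed one (made-up fractions).
for label, meaningful in [("uncompressed", 0.4), ("compressed", 1.0)]:
    hits = expected_errors * meaningful
    print(f"{label}: ~{hits:.5f} expected errors in meaningful data")
```

Same raw error count per disc; the compressed disc just concentrates more meaning per bit.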
This is relative: what's important and what's not depends, as I said
earlier, on what you need. Let's say Ubuntu ships a version with Gnome
and KDE combined and, for the sake of this example, everything fits on
one disc. If there are errors in the KDE files but not in the Gnome
files, it doesn't really matter to the person who wants to install
Gnome.
Besides, in order to supply the same amount of data uncompressed you'll
need more CDs, resulting in more chances of a read/write error. Typical
compression achieves a ratio of about 2.5 nowadays, which means you
need 2.5 times as many CDs to hold the same amount of data as a
compressed CD would.
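As rough arithmetic (the capacity and ratio are assumptions): to archive, say, 1750 MB onto 700 MB discs, a 2.5:1 compressor fills one disc where the uncompressed copy needs three, so the uncompressed archive exposes roughly three times as many written bits to the drive's error rate:

```python
import math

# Illustrative figures: 2.5:1 compression, 700 MB per disc.
data_mb = 1750
compression_ratio = 2.5
cd_capacity_mb = 700

discs_compressed = math.ceil(data_mb / compression_ratio / cd_capacity_mb)
discs_uncompressed = math.ceil(data_mb / cd_capacity_mb)

# Expected raw errors scale with discs written, so the uncompressed
# archive accumulates roughly discs_uncompressed / discs_compressed
# times as many chances of a read/write error.
print(discs_compressed, "disc vs", discs_uncompressed, "discs")
```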
> I actually remember when "floppy disks" were flexible 8 inch disks
> and how amazed everyone was to get so much information on 5 1/4 inch
> disks. Engineers have made remarkable progress over the last 30
> years. Much of the heavy lifting to make that possible was the
> improvements in error prevention, detection, and correction
> necessitated by the compression.
The fact that more data could be written on 5 1/4 inch, and later
3 1/2 inch, floppies had to do with miniaturization of the components,
not with compression. DLT tapes do offer hardware compression: the
drive compresses the data before writing it to tape. And back in the
day you could buy a massive 30MB hard drive, which usually was a 20MB
drive whose firmware did RLE.
On CDs there is no hardware compression.
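The RLE trick those old drive firmwares used fits in a few lines; a toy sketch, not any particular firmware's actual format:

```python
def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Collapse runs of identical bytes into (value, count) pairs."""
    runs: list[tuple[int, int]] = []
    for b in data:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            runs.append((b, 1))
    return runs

def rle_decode(runs: list[tuple[int, int]]) -> bytes:
    """Expand (value, count) pairs back into the original bytes."""
    return b"".join(bytes([value]) * count for value, count in runs)

# A file that is "mostly zeros" shrinks dramatically:
sample = b"\x00" * 1000 + b"hello" + b"\x00" * 1000
print(len(sample), "bytes ->", len(rle_encode(sample)), "runs")  # 2005 bytes -> 6 runs
```

Of course, data with no long runs gains nothing from this, which is why such firmware ratings were optimistic.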
Peter van der Does
GPG key: E77E8E98
IRC: Ganseki on irc.freenode.net
Jabber ID: pvanderdoes at gmail.com
GetDeb Package Builder
http://www.getdeb.net - Software you want for Ubuntu