Yet another incremental package download proposal
mehmet.kose at gmail.com
Fri Feb 11 04:27:47 UTC 2011
If I'm seeing this correctly, we don't need deb deltas. We can save a
lot of bandwidth by making 100% backward-compatible changes to the .deb
format.

The data.tar would be compressed in 100 KB chunks. Gzip and xz both
permit this, and the result extracts without problems. For gzip, there
is an avoidably small increase in archive size; for xz, about 10%. With
xz and 1 MB chunks, there is no noticeable increase.
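The gzip half of this claim is easy to check: concatenated gzip members
form a valid gzip stream, so each chunk can be compressed independently
and a standard decompressor still reads the whole thing. A minimal
Python sketch (the 100 KB chunk size is the one proposed above; the
function name is mine):

```python
import gzip

CHUNK = 100_000  # 100 KB uncompressed chunk size, as proposed above


def compress_chunked(data: bytes) -> bytes:
    """Compress data as independent gzip members, one per chunk.

    Because the gzip format allows concatenated members, the joined
    output is itself a valid gzip stream.
    """
    return b"".join(
        gzip.compress(data[i:i + CHUNK])
        for i in range(0, len(data), CHUNK)
    )


# A standard decompressor reads the concatenated members transparently:
payload = bytes(range(256)) * 1000  # ~256 KB of sample data
assert gzip.decompress(compress_chunked(payload)) == payload
```

The per-member overhead (header plus CRC trailer) is on the order of
18 bytes per chunk, which is where the "small increase" for gzip comes
from.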
A few new files in the control part would contain:
- the compressed sizes of the tar chunks,
- md5sums of 'conffiles',
- the sizes and permissions of all files (just numbers, so they can be
  aligned to a fixed width),
- the names of symlinks.
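To make the first item concrete, here is one possible shape for such a
chunk-size index, sketched in Python. The one-size-per-line format and
the function names are my assumptions, not part of the proposal; the
point is that cumulative sums of the listed sizes give exactly the byte
ranges a client would request:

```python
def parse_chunk_index(text: str) -> list[int]:
    """Parse a hypothetical index file: one compressed chunk size
    per line."""
    return [int(line) for line in text.split() if line]


def chunk_byte_range(sizes: list[int], index: int) -> tuple[int, int]:
    """Byte range [start, end) of compressed chunk `index` inside
    data.tar.gz, computed from the cumulative chunk sizes - which is
    what an HTTP Range request would need."""
    start = sum(sizes[:index])
    return start, start + sizes[index]


# Example: with chunks of 10, 20 and 30 compressed bytes, chunk 1
# occupies bytes 10..30 of the compressed stream.
assert chunk_byte_range([10, 20, 30], 1) == (10, 30)
```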
So, how it works:
Apt starts downloading the package. If the package format is 2.1 and
there is an older version of the .deb in local storage, it stops after
getting the control part.

It extracts the older .deb it already has, checks md5sums and symlink
names, determines the new and changed files, and calculates where these
files are in the tarball. (Files are sorted alphabetically, there are
no empty directories, and symlinks are always at the end.)

It then downloads only the necessary parts of the package and builds
the new .deb.
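The "calculates where these files are" step follows directly from the
ustar layout: each entry occupies a 512-byte header plus its data
padded to a 512-byte boundary, so alphabetically sorted (name, size)
pairs fully determine every offset. A sketch of mapping changed files
to the uncompressed chunks that must be fetched (function names and
the entry format are mine):

```python
def tar_offsets(entries):
    """entries: alphabetically sorted (name, size) pairs, as laid out
    in data.tar. Returns name -> (offset, length), where length is a
    512-byte ustar header plus data padded to a 512-byte boundary."""
    layout, pos = {}, 0
    for name, size in entries:
        length = 512 + ((size + 511) // 512) * 512
        layout[name] = (pos, length)
        pos += length
    return layout


def chunks_needed(entries, changed, chunk_size=100_000):
    """Indices of the uncompressed chunks that cover the changed
    files; only these chunks need to be downloaded."""
    layout = tar_offsets(entries)
    needed = set()
    for name in changed:
        start, length = layout[name]
        needed.update(range(start // chunk_size,
                            (start + length - 1) // chunk_size + 1))
    return sorted(needed)


# Example: a small unchanged file followed by a large changed one.
entries = [("usr/bin/a", 100), ("usr/bin/b", 200_000)]
assert chunks_needed(entries, {"usr/bin/b"}) == [0, 1, 2]
```

With the chunk-size index from the control part, those chunk indices
translate into byte ranges of the compressed archive, which an
HTTP Range request can fetch.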
Does this look feasible? Should I work on it?
If it sounds crazy, another option is simply using 'dar'.
More information about the ubuntu-devel mailing list