Using zsync for .deb downloads: initial benchmark results

Lars Wirzenius lars at ubuntu.com
Fri Jul 17 15:22:45 BST 2009


On Wed, 2009-07-15 at 01:37 +0100, Paul Sladen wrote:
> On Tue, 14 Jul 2009, Martin Pitt wrote:
> > Lars Wirzenius [2009-07-14 19:19 +0300]:
> > > is a 25% reduction in download sizes worthwhile to pursue?
> 
> A daily fetch of Packages.gz for main/restricted/universe/multiverse/* is
> ~10MB (IIRC); times 30 days per month and X million users, this is where
> the low-hanging fruit is likely to be.

That is probably true. However, for a release, only -security and
-updates are interesting, and those mostly don't change every day; when
nothing has changed, finding that out is a very quick check.
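
To illustrate what such a quick check can look like: a conditional HTTP
GET asks the mirror to send the index only if it changed since the local
copy. This is just a sketch of the idea, not code from the benchmark; the
mirror URL and cache filename are made-up examples.

#!/usr/bin/env python3
# Sketch: re-download a Packages.gz index only if it changed, using a
# conditional HTTP GET (If-Modified-Since). The URL and cache path are
# hypothetical examples.
import email.utils
import os
import urllib.error
import urllib.request

URL = ("http://archive.ubuntu.com/ubuntu/dists/"
       "jaunty-security/main/binary-i386/Packages.gz")
CACHE = "Packages.gz"  # local copy left by the previous fetch

request = urllib.request.Request(URL)
if os.path.exists(CACHE):
    # Send the cached copy's mtime as an HTTP date; the server answers
    # 304 Not Modified if the file has not changed since then.
    stamp = email.utils.formatdate(os.path.getmtime(CACHE), usegmt=True)
    request.add_header("If-Modified-Since", stamp)

try:
    with urllib.request.urlopen(request) as response:
        with open(CACHE, "wb") as f:
            f.write(response.read())
    print("index changed; fetched a fresh copy")
except urllib.error.HTTPError as e:
    if e.code == 304:
        print("index unchanged; nothing to download")
    else:
        raise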

> > I had expected something like a 90% saving
> 
> To do it properly requires fixing the compiler and linker to make the same
> address/load/optimisation choices as in the previous run.  (So that a
> two-line patch only creates a ten-byte binary churn).

Aren't we using the same toolchain for -security as for the release?

> The thing that is holding back the possibility of *all* of these fancy delta
> methods is that our secure distribution is based on signing hashes of
> *encoded data*, not of the content within it.[2] Until that is changed,
> deploying most of these optimisations is over-complicated (and held back)
> because of the requirement to have to recompress the exact same data.tar.gz,
> rather than just produce the _equivalent_ uncompressed data.tar.

That is true, but fixing that is going to take a lot of changes in a lot
of places, and require some careful thinking about transitions (since
this is fundamental to the security of all Debian, Ubuntu, and derived
systems). That makes it impractical to achieve for this cycle, but it'd
be good to start the discussion now.
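
For anyone who hasn't run into this in practice, here is a small sketch
(mine, not from the thread) of why signing the encoded bytes is the
sticking point: compressing identical input with different gzip settings
yields different archives, so a hash over data.tar.gz matches only one
particular encoding of the content.

#!/usr/bin/env python3
# Sketch: identical uncompressed content, two different gzip streams.
# A signature over the compressed file pins one encoding; a signature
# over the uncompressed content would accept any equivalent encoding.
import gzip
import hashlib

content = b"identical data.tar contents\n" * 1000

fast = gzip.compress(content, compresslevel=1)
best = gzip.compress(content, compresslevel=9)

# The compressed streams (and hence their hashes) differ...
print(hashlib.sha256(fast).hexdigest())
print(hashlib.sha256(best).hexdigest())

# ...but both decompress to byte-identical content, whose hash is
# stable no matter how the archive was encoded.
assert gzip.decompress(fast) == gzip.decompress(best) == content
print(hashlib.sha256(content).hexdigest())

If the signed hashes covered the uncompressed data.tar instead, a client
could reconstruct it by any method (zsync included) and still verify it,
which is exactly the change Paul is describing.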
