increase apt flexibility (for third party apps)

Mike Hearn mike at plan99.net
Sat Oct 8 15:15:16 CDT 2005


On Sat, 08 Oct 2005 15:34:23 +0200, MadMan2k wrote:
> Another system for displaying the packages would be necessary as well,
> since the naming scheme firefox-* would not be clear any more if there
> were hundreds of plugins for firefox. Therefore the system of displaying
> the packages as a tree, as is already used to display the
> categories, should be extended (i.e. deepen the tree).

I question any user interface that has deeply nested trees (really any
trees at all are questionable). Usability testing has shown that many
people have difficulty with trees, which is why GNOME upstream has been
gradually eliminating them over time. See spatial Nautilus, the new GTK+
file picker and so on.

Search/navigational interfaces seem to be easier for people,
which implies that finding and installing software should have a web-like
user interface. Why not use the web itself, then? This is the user
interface we settled on with autopackage, and it seems to work nicely
(though others - like an apt-get style UI - are certainly possible!)

> My point is: if we don't want to support autopackage or any other system
> which requires ABI-compatibility, we need to offer something else, since
> there *is* a need for decentralization and third party integration.

It's not exactly correct that autopackage "requires" ABI compatibility
while Ubuntu native packaging does not. Every operating system, regardless
of what package management it uses, requires some level of backwards
compatibility - usually, the more the better. The less you have, the less
efficient your system is, and the more prone it is to mistakes and
unreliability.

The autopackage project provides programs like relaytool and apbuild to
work around less-than-stellar binary compatibility on Linux, but this is
not significantly more complex than the Ubuntu approach of centrally
compiling everything on build servers.
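To illustrate the idea relaytool automates: rather than letting the dynamic linker fail hard at startup when an optional library (or a symbol added in a newer release of it) is absent, a program can probe at runtime and degrade gracefully. Here is a minimal sketch of that technique using Python's ctypes for brevity - relaytool itself generates C stubs at the link level, and the library and symbol names below are made up for illustration:

```python
import ctypes

def load_optional(sonames):
    """Try each candidate soname in turn; return a handle or None.

    This mirrors the relaytool idea: treat the library as optional
    instead of a hard link-time dependency.
    """
    for name in sonames:
        try:
            return ctypes.CDLL(name)
        except OSError:
            continue
    return None

# "libdoesnotexist.so.1" is a deliberately fake name for illustration.
lib = load_optional(["libdoesnotexist.so.1"])
if lib is None:
    print("optional library missing; feature disabled")
else:
    # A symbol added in a newer release may be absent in older copies
    # of the same library, so look it up defensively too.
    # "some_new_symbol" is hypothetical.
    fn = getattr(lib, "some_new_symbol", None)
    print("feature enabled" if fn else "library present, symbol missing")
```

The same fallback structure in C is just dlopen()/dlsym() with NULL checks; the point is that the application keeps working on systems where the dependency is missing or older, instead of refusing to start.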

Consider this. It takes time to ensure binary compatibility, whether it's
done using apbuild/relaytool style addons or guaranteeing it upstream.

However, the Debian approach takes time too. Look at the massive timesink
that upgrading GCC can be - with every release it refuses to compile more
and more code. That code must either be modified to compile with the new
compiler, or be left behind. And where source code breaks, the binaries
sometimes continue to work correctly.

It also makes security updates rather inefficient. Look at how
security.debian.org has been utterly flattened by a simple X server
update. Now imagine that every time some low-level library like libpng or
glibc was updated, every other package had to be updated too, due to the
lack of backwards compatibility. Where are these CPU/bandwidth resources
going to come from?

This isn't a black-and-white issue, regardless of what a few free
software promoters may imply. Source, binary and semantic backwards
compatibility are vital for *any* platform regardless of how it's managed,
and today's sloppy and careless approach to them makes Linux look
decreasingly credible in the desktop operating system space.

thanks -mike

More information about the ubuntu-devel mailing list