Distributed development toolset (Re: ArchiveReorganisation and sponsoring)

Emmet Hikory persia at ubuntu.com
Thu Sep 4 04:27:04 BST 2008

Steve Langasek wrote:
> If one doesn't think that fine-grained revision history is necessary, then
> they might not agree with my conclusions; but I can't see someone taking
> that position if they work on a package that sees any significant amount of
> development between releases (either upstream or Ubuntu).

    I believe this is precisely the point.  For packages that receive
0-1 uploads to Ubuntu per cycle, and for which the Debian version is
essentially clean but needs some change (e.g. creating /var/run/foo
at runtime rather than at unpack time) that hasn't been accepted back
into Debian, the overhead of a VCS on top of processing the merge and
upload seems excessive.  This is especially true for projects where
a new upstream bugfix release may happen every 18 months or so, or for
which Debian may not maintain a visible VCS, so that one needs to
review the changes as a patch in any case, and even more so in those
cases where MoM performs the correct merge reliably anyway (yes, one
still needs to review, but sometimes one gets lucky and the merge is
already completed correctly).
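As a concrete illustration of the kind of change meant above, here is a
hedged sketch (not the actual foo package's script) of the usual fix:
have the init script ensure the runtime directory exists when the daemon
starts, instead of creating it in the package's postinst at unpack time.
The function name and the /var/run/foo path are assumptions for the
example.

```shell
# Hypothetical helper for an init script: create the runtime
# directory at service start rather than at package unpack time.
ensure_rundir() {
    rundir="$1"
    if [ ! -d "$rundir" ]; then
        # -p tolerates the directory already existing or appearing
        # concurrently; permissions are a typical default.
        mkdir -p "$rundir"
        chmod 0755 "$rundir"
    fi
}

# In the init script's "start" action one would call, e.g.:
# ensure_rundir /var/run/foo
```

Because /var/run may be a tmpfs cleared on every boot, creating the
directory at start time survives reboots, whereas a directory shipped in
the package or created in postinst does not.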

    Once the new tools are complete, and there already exists a branch
for each package, and there is some mechanism that allows one to
commit and upload without it feeling like two entirely separate
activities (yes, there are many use cases where one wishes to commit
without uploading, but I think few where one wishes to upload without
committing), I suspect most of the feeling of additional effort will
dissipate.  Until that time, there's a marked difference in the use
cases for people looking after specific packages with significant
activity, and people looking over arbitrary packages to see if
something ought to be touched and perhaps doing an update for a package
they expect never to touch again.

    One example where the overhead can be particularly painful is NBS
processing.  It's fairly trivial to rebuild-test and upload a number
of packages to use a new (API-compatible) version of a library.  Some
of us have uploaded over 100 packages in a single day for this
purpose.  Creating packaging branches for each of these packages to
store the additional changelog entry, when one expects an automated
sync to render the branch obsolete at the start of the next cycle,
feels like wasted overhead, and may in fact complicate the task of
someone wanting to work on the package a couple of cycles later: the
last posted branch will be fairly out of date, and it may take a while
to determine that all posted commits are meaningless in the context
of newly introduced Ubuntu variation.  With the completion of the
mooted tools, this is less likely to be "useless overhead", and more
likely to be a seamless part of updating a package (and if it's not,
either the tools still aren't complete or I've misunderstood the many
descriptions of what they will be).
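For a sense of how small the per-package delta in an NBS run is, here is
a hedged sketch of the only artifact each upload really leaves behind: a
no-change-rebuild changelog stanza.  In practice developers would use
dch, debuild, and dput from devscripts; the helper name, maintainer
identity, version string, and target release below are all hypothetical.

```shell
# Hypothetical sketch: prepend a minimal no-change-rebuild stanza
# to a debian/changelog file.  Real workflows use dch/debchange.
add_rebuild_entry() {
    pkg="$1"; version="$2"; changelog="$3"
    tmpfile="$changelog.new"
    {
        printf '%s (%s) intrepid; urgency=low\n\n' "$pkg" "$version"
        printf '  * No-change rebuild against the new library version.\n\n'
        printf ' -- A. Developer <dev@example.com>  %s\n' "$(date -R)"
        # Keep any existing history below the new stanza.
        if [ -f "$changelog" ]; then
            printf '\n'
            cat "$changelog"
        fi
    } > "$tmpfile"
    mv "$tmpfile" "$changelog"
}
```

Run across a hundred sources, each package gains only a stanza like this
plus a rebuild, which is why maintaining a full packaging branch per
package for such uploads can feel disproportionate.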

    I'm not arguing against VCS for packaging: I've seen it work very
well for collaboration on some projects, and help ensure that the
quality of what is added to the archive is maintained.  I'm just not
sure the tools are currently in a state where a focus on driving all
packaging to use some specific VCS, with published branches, is a
useful conversation: it's far too easy to get stuck arguing about
issues with the current tools, or talking past each other based on
differing experiences of something that doesn't yet exist, and so
lose the point of the discussion.  That said, in cases where the use
of bzr improves the experience of a developer, and that developer
expects they will be at least monitoring the state of the package in
the future, I don't see any reason they shouldn't choose a toolset
that matches their understanding of how to improve their workflow.

    Oh, and I'm still interested in more input on the original thread :)

