Activity scores for REVU

Emmet Hikory
Sat Aug 30 13:19:53 BST 2008

Siegfried-Angel wrote:
> Basically it is about letting REVU calculate two scores for all
> packages, "uploader activity" and "reviewer activity", which would
> provide a fast way to see how responsive the uploader of a given
> package is and how much attention reviewers have given it.
> I'm sending this mail here to ask for ideas on how such scores should
> be calculated.

    It may be best to track two separate activity scores: one for
uploaders and one for reviewers.  A number of individuals fit both
categories, but it may be misleading to show them a single high
activity score if they earned it through only one of the two kinds of
activity while the person reading the score expects the other.

    For any type of score, it may make sense to model the algorithm
after that used for karma in launchpad: specifically that some actions
earn points, and that points decay over time (six months is probably a
good timeframe).  On the other hand, I don't think it needs any
weighting. One oughtn't lose points simply because others are more
active: the transparency is improved when one can understand what
actions gain points, and those are the actions we want to encourage.
Given the relatively small volume of activity on REVU, and that
everything is widely visible, those gaming the system ought be easily
detected and appropriately counselled: while having a scoring system
is fun, it ought not detract from the essential work involved.
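
    The karma-style model above could be sketched roughly as follows.
This is only an illustration: the linear decay, the six-month window,
and the (timestamp, points) event-log shape are all assumptions for
the sake of example, not anything REVU actually implements.

```python
from datetime import datetime, timedelta

# Assumed decay window: roughly six months, per the Launchpad-style
# suggestion above.
DECAY_PERIOD = timedelta(days=182)

def current_score(events, now=None):
    """Sum points from (timestamp, points) events, decaying each
    event's points linearly to zero over DECAY_PERIOD.

    This is a hypothetical sketch; a real implementation would read
    the event log from REVU's database."""
    now = now or datetime.utcnow()
    score = 0.0
    for when, points in events:
        age = now - when
        if age < DECAY_PERIOD:
            # Fresh events count in full; old ones fade toward zero.
            score += points * (1 - age / DECAY_PERIOD)
    return score
```

With this shape, a comment made today is worth its full points, one
made three months ago about half, and anything older than the window
contributes nothing, without ever weighting one reviewer's score
against another's.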

    In terms of activity scores for reviewers, there are two types of
reviewers: those who can advocate/reject and those who can only
comment.  It may not be worth distinguishing these two classes in
terms of activity, although there may be value in giving additional
credit for advocation or rejection actions: this would naturally buoy
the scores of those permitted to perform them, and thereby give a
better picture of who is not only active, but active in an
appropriate role.  While the balance likely needs tuning, perhaps
these scores could start with 4 points for a comment on a package in
the "Needs Review" state, 2 points for an advocation, and 3 points for
a rejection.  This encourages rejections (although those who routinely
reject without reason will likely raise the ire of uploaders: not a
desirable state), and encourages activity.  It also discourages
comments to previously rejected packages.  While there are arguments
for weighting the score based on the time between an upload and a
comment, or the age of the upload (so that all uploads are likely to
get at least one comment), the quality of review is likely best
improved by not encouraging reviewers to follow-up on previously
reviewed packages (it takes many eyes to make bugs shallow) and there
are enough special cases that looking always for the oldest package
may not be the ideal solution.
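
    The point values proposed above amount to a small lookup table;
a minimal sketch follows.  The action names and the table itself are
illustrative assumptions, not REVU's actual data model.

```python
# Illustrative point table from the proposal above: 4 for a comment
# on a package in "Needs Review", 2 for an advocation, 3 for a
# rejection.  Action names are made up for this sketch.
REVIEW_POINTS = {
    "comment": 4,
    "advocate": 2,
    "reject": 3,
}

def reviewer_points(actions):
    """Raw (pre-decay) points for a sequence of action names.
    Actions not in the table, such as a comment on a previously
    rejected package, earn nothing."""
    return sum(REVIEW_POINTS.get(action, 0) for action in actions)
```

Leaving unlisted actions at zero is what discourages commenting on
already-rejected packages without needing any negative scores.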

    In terms of activity scores for uploaders, there is a much more
complex set of issues.  Some uploaders are prolific, and some only
upload one or two packages per cycle.  While the most sensible thing
might be to assign points based on the time between a rejection and a
new upload, indicating that the uploader cares for the package and is
paying attention, this may not encourage behaviour that we wish to
reward, especially if there is a decay cycle for any points earned.
Granting maximum points to those who upload many packages and who
upload immediately upon receiving comments discourages two behaviour
patterns I believe to have great value: 1) working with upstream to
ensure that the packaging and the software packaged are well aligned,
and 2) working with Debian to have the software included there in
preference to being in Ubuntu alone.

    I believe the first of these to be important because we have a
number of cases of confusion with upstream, and if someone wants
something to be in Ubuntu sufficiently, and expects that it will
closely follow upstream, it is essential that a communication path be
opened with upstream to better coordinate release timing,
miscellaneous licensing issues, differences with other distributions,
bug and patch management, etc.  This communication and coordination
takes time, but without it we end up with many essentially orphaned
packages that must later be dropped.  While specific package
maintainers are discouraged in Ubuntu, it is expected that those with
an interest in specific packages will be watching them and ensuring
they are in good condition.  This is especially true for packages for
which there is no Debian maintainer, and so we cannot rely on Debian
to assist with upstream coordination.

    I believe the second of these to be important for almost the same
set of reasons.  If someone wants a package, but doesn't want to
maintain the package, I wonder at the value of having the package.
While some Debian processes may be opaque, and some people may have
difficulty in finding sponsors, it is always best practice to file an
RFP or ITP in Debian when packaging something in Ubuntu that is
expected to also be appropriate for Debian.  The Debian Maintainer
(who may be the initial Ubuntu packager) will then provide most of the
coordination with upstream, and with luck the package can remain in
sync, reducing the merge work necessary each cycle.  Note that I
specifically don't discourage those who are following such a path of
working with Debian from using REVU, as I believe REVU is a useful
source of package review and comments, and further believe there exist
some subset of packages which may be interesting in Ubuntu but for
which no Debian Developer has an interest in helping to become part
of Debian.

    So, if we don't want to encourage either lots of uploads or rapid
responsiveness (as this may indicate a failure to properly coordinate
with others), and a decay model unfairly punishes those uploaders who
don't publish many packages (although the packages they publish may be
both essential and perfect, and they may be immediately responsive to
comments), I'm not sure how to effectively assign points to uploaders.
Further, of the models I can imagine (e.g. percentage of packages
advocated, average number of reviews required before upload, time from
rejection to upload, percentage of packages archived without response,
etc.), none actually rewards the specific behaviours I would prefer to
encourage, so I think awarding points only to reviewers makes more
sense.

    That conclusion reached, I still think there is value in providing
reviewer points for those who cannot advocate or reject, as I do think
that the behaviour of reviewing others' packages ought be rewarded.
