Analysis of 10 years of

Christopher Armstrong radix at
Tue Jan 26 20:54:02 GMT 2010

On Tue, Jan 26, 2010 at 2:47 PM, Bryce Harrington <bryce at> wrote:
> On Tue, Jan 26, 2010 at 11:36:15AM -0500, Karl Fogel wrote:
>> Matt Zimmerman <mdz at> writes:
>> >I haven't watched the video, but scanned over the slides and found them
>> >interesting:
>> >
>> >
>> I agree -- that slide deck is very comprehensible.  Actually, I found it
>> easier to digest than the video presentation.  Here are the slides:
>> Most interesting graph: correlation between the level of experience of
>> bug submitter and tendency of that bug to get fixed.
> I found the "Time to verify software defect" graph fascinating.  It
> shows that, unless the developer *really* groks the codebase inside
> and out, having more than 30 people commenting on a bug adds little to
> getting it verified, and can actually lengthen the verification time.
> And having more than 75 commenters on a bug always makes the bug
> harder to verify.
> These findings correlate with my own experience.  Having a few people
> participate on a bug report can be quite helpful, but past a certain
> threshold it turns into noise.

Question every graph. Correlation is not causation. Maybe bugs getting
more than 75 comments is caused by something *else*, and that *other*
thing is also causing the bug to be hard to verify. For example, a race
condition bug would be hard to verify, *and* random changes to
environments tend to "fix" it, so you'd get lots of confused
commenters claiming they've found fixes and also claiming that
previous fixes don't work. So in this case, I would not say the
commenters made the bug hard to verify, but the race-condition nature
of the bug did.

Christopher Armstrong

More information about the ubuntu-devel mailing list