brainstorming for UDS-N - Performance

Kees Cook kees.cook at canonical.com
Fri Oct 1 22:19:19 BST 2010


On Fri, Oct 01, 2010 at 10:25:07AM +0100, Matthew Paul Thomas wrote:
> Kees Cook wrote on 30/09/10 23:10:
> >
> > On Wed, Sep 29, 2010 at 07:43:49PM +0100, Matthew Paul Thomas wrote:
> >...
> >> Measurement. Where can I go to see the equivalent of Firefox's
> >> <http://arewefastyet.com/> for Ubuntu startup speed? Where's the
> >> equivalent graph for Ubiquity? For Unity? For Ubuntu Software Center?
> >> How much better or worse is yesterday's Natty nightly compared with
> >> Ubuntu 10.10? With Ubuntu 10.04 LTS?
> >...
> > Changes to the compiler toolchain, the kernel, etc, all have an impact
> > on everyone's workloads, but most teams haven't actually stepped
> > forward and said "THIS workload is important to us, here's how to
> > reproduce the measurement, and here's where we're tracking the daily
> > changes to that measurement."
> >...
> 
> Have you asked them why that is? Maybe they don't know how to automate
> the measurement, where to host it, or who to tell about it.

In discussions at the last UDS, it seemed that most teams could not agree
on what would be valuable to measure. For the teams that did have things
they wanted to measure (e.g. a specific Firefox rendering speed test),
no one stepped up to automate it.

In a test-driven development style, it really seems that these measurements
must be defined and automated before performance work can begin. The
trouble is that the performance work is rarely done by the same team
that will feel its impact, so it's non-trivial to understand the effect on
another team's performance numbers.

Oddly, I want these things not to measure how awesome upcoming performance
improvements are, but to justify security/performance trade-offs. :P
"It's only 10% slower, but you'll never have this class of high-risk
security vulnerability again!" :)

-Kees

-- 
Kees Cook
Ubuntu Security Team
