Releasing Alphas and Betas without "freezing"
Gema Gomez
gema.gomez-solano at canonical.com
Fri Jun 22 10:05:26 UTC 2012
Hi,
I am a bit lost with the discussion and so many arguments and
counter-arguments, so I am going to focus on the things that the
Canonical Platform QA team is doing that I believe are not widely
understood. We discussed all this at UDS, but of course not everyone
can be in all the sessions.
Needless to say (but I will say it anyway), our testing is nowhere
near where I'd like it to be, and we are working hard as a team, and
imho as a community, to change that. There are no magic wands, just
effort to be put into the right work items to get there as fast and
reliably as we can.
On 21/06/12 19:09, Michael Casadevall wrote:
> On 06/21/2012 09:43 AM, Jono Bacon wrote:
>> On Thu, Jun 21, 2012 at 9:02 AM, Michael Casadevall
(...)
>>> How does ARM factor into this?
>>>
>>> I have already cited the failure by Canonical QA to test the
>>> armhf+omap4 image for precise. Were it not for manual
>>> intervention, and the normal Ubuntu QA testing by both myself
>>> and other members of the community, I'm certain that we would
>>> have shipped an image that was not only substandard, but
>>> completely broken.
What failure? Can you say exactly what went wrong, which bugs we
missed and what the consequences were, so that I can carry out a root
cause analysis to prevent this from happening again? I am very
interested in the details so that we can add new tests for that kind
of problem.
The Canonical Platform QA team is putting a lot of effort into
sourcing ARM hardware for our lab and getting the automated testing
we currently run on other platforms running on ARM. Every member of
the team has a board, and we did the due diligence during precise,
i.e. running the test that exists for ARM. That test is not very
good, so the coverage needs to be increased, and we are working on
that. Having ARM images tested by the community is tricky, since
people don't tend to have reference hardware sitting at home to try
Ubuntu on; we are working on that too.
>>> It should be cited for completeness that this was caught at
>>> *Beta 2*, and only because there were voiced concerns on image
>>> quality that I sat and tested it extensively.
What do you mean by tested it extensively? What tests did you run?
Could you send them to me so that we can include them in the official
test list?
>>> Where were the QA efforts by Canonical QA for all the previous
>>> milestones? As it stands, I'm only aware of one person that
>>> was responsible for all ARM images that were supported during
>>> precise.
The whole Canonical Platform QA team has ARM boards and was doing
testing on ARM during precise. How good was that testing? Well, here
is my viewpoint:
According to the official documentation, the only test required to
sign off an ARM image is this:
http://testcases.qa.ubuntu.com/Install/ARM/PreinstalledImage
This is not just substandard; to me, it is unacceptable. The only
thing it tests is the installation process, and by no means
exhaustively; then it points to a bunch of tests that are not
relevant to ARM in particular, and there is no way to report those
results and associate them with the ARM image (no traceability). I am
fully aware of how bad the testing for ARM is at the moment. All the
testing that is going on and is not documented needs to be
documented, so that we can run those important tests that everybody
talks about but I haven't managed to find yet. As for ad hoc testing,
we did that as well during the last cycle; it was unmeasurable and
untraceable, but it was done.
This is why we are focusing on automating ARM testing in the lab, so
that we, at the very least, run the same smoke tests on ARM that we
are running on other images. That will catapult us way beyond
anywhere we've ever been on ARM testing, in my opinion: repeatable,
measurable, auditable daily testing.
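To make concrete what I mean by repeatable and auditable, here is a
minimal sketch of the kind of daily smoke run I have in mind. This is
not our actual lab tooling, just a hypothetical illustration; the
check commands and the results file name are invented:

#!/usr/bin/env python3
# Hypothetical smoke-run sketch, not our real lab harness: run a fixed
# list of checks on a booted image and write a timestamped JSON results
# file, so every daily run is repeatable, measurable and auditable.
import json
import subprocess
import time

# Invented example checks; a real suite would cover much more.
CHECKS = {
    "kernel-booted": ["uname", "-r"],
    "network-up": ["ping", "-c", "1", "archive.ubuntu.com"],
    "apt-simulate": ["apt-get", "--simulate", "install", "hello"],
}

def run_checks():
    results = {"timestamp": time.strftime("%Y%m%dT%H%M%SZ", time.gmtime()),
               "checks": {}}
    for name, cmd in CHECKS.items():
        proc = subprocess.run(cmd, capture_output=True, text=True)
        results["checks"][name] = {"command": " ".join(cmd),
                                   "passed": proc.returncode == 0}
    return results

if __name__ == "__main__":
    results = run_checks()
    # One results file per run is the traceability we are missing today.
    with open("smoke-%s.json" % results["timestamp"], "w") as f:
        json.dump(results, f, indent=2)
    failed = [n for n, r in results["checks"].items() if not r["passed"]]
    print("FAIL: " + ", ".join(failed) if failed else "PASS")

Hooking something like this into the lab, run against every daily ARM
image, would give us exactly the kind of data we can point to when we
discuss image quality.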
>>> Furthermore, speaking as the ARM Server Tech Lead in Canonical,
>>> the PES team has to manually QA our own images due to the
>>> fact that in general, we are the only teams with access to the
>>> hardware. We do NOT have the bandwidth required for this
>>> additional testing initiative.
What manual testing did you do, and how many test cases were run? Do
you have any pass rates? Bugs you reported? Were they reported on the
tracker or somewhere else? This is important information that the
Canonical QA team is missing.
(...)
>
> The confusion on the term milestone was already cleared up. Just
> for the record, when I say milestone, I consider it milestone
> images (mostly because that's when we confirm during QA testing
> that everything slated for that milestone went in, and wasn't
> accidentally mismarked as completed or such).
What worries me about losing milestones is the attention those bugs
get. Does losing the milestones mean that the bugs found during
testing are going to get that same attention throughout the cycle, or
does it mean we lose those bug-fixing-focused weeks altogether? I
hope it means all the bugs get the right attention during the whole
cycle.
(...)
>>> Precise was considerably less broken during development due to
>>> the simple fact we were syncing from Debian testing vs.
>>> unstable, which ensured a minimum of package breaking due to
>>> Debian's strict policies on how packages enter the testing
>>> archive.
>
>> That is one of the reasons why Precise was less broken, but there
>> was also automated testing, acceptance criteria, increased
>> manual testing, and a stronger focus on QA.
>
> I'm awaiting an explanation on how a Canonical supported image
> then was nearly released in a broken state at Beta 2.
Which image is this? Again, a root cause analysis would help us avoid
this kind of problem in the future. We discussed doing this at UDS,
but I doubt any of the people in that release meeting will have the
time required to dig into all the respins we did and analyze them one
by one. Can you give us the details of that image that was nearly
released but in the end wasn't?
It would be useful to have this feedback as soon as it happens, not
months later. If you don't know who to complain to about this, just
tell me and I will make sure we investigate the issue with the
release team and figure out which test cases we need to add to our
automated QA to avoid these problems going forward.
(...)
> Scott has made it clear that if the current system of alpha/betas
> is abolished, they will not be able to manage to maintain their
> current level of quality. Kubuntu (and by extension, the other
> flavors) depend on the current system as is.
I am not sure I fully understand how flavors do their milestones, so
excuse the five-year-old logic of the following statement. If the
flavors are based on Ubuntu, and Ubuntu gets better and better in
terms of quality, stability and reliability, won't that considerably
reduce the effort it takes to release a flavor?
> As ARM and PowerPC are not actively part of these new QA efforts, it
> is clear that we are also dependent on the current system to keep
> quality.
ARM is part of the Canonical Platform QA efforts, quite high on the
list. Hopefully we'll be there by the end of the cycle; until then,
we'll need to keep doing manual validation of builds. We are planning
to set up automated testing on ARM by early August.
Thanks,
Gema
--
Gema Gomez-Solano <gema.gomez-solano at canonical.com>
Ubuntu QA Team https://launchpad.net/~gema.gomez
Canonical Ltd. http://www.canonical.com