QA tasks available
Gema Gomez
gema.gomez-solano at canonical.com
Tue Dec 6 20:17:24 UTC 2011
After reading through all the emails, I have put together a little table
that shows who is interested in what; so far, only task 1 has been started.
https://wiki.ubuntu.com/QATeam/TasksPrecise
Thanks,
Gema
On 06/12/11 09:29, Brendan Donegan wrote:
> On 05/12/11 16:24, Gema Gomez wrote:
>> Dear QA Team,
>>
>> as promised, here is a list of tasks that need to be done, that are in
>> progress, and that you could own if you have the time:
>>
>> - ISO testing tasks
>> (https://blueprints.launchpad.net/ubuntu/+spec/other-p-builds-smoke-testing):
>>
>>
>> 1) Compile a list of applications that are installed by default by the
>> ISO installers (one for Desktop, one for Server) and propose two or
>> three basic test cases that could be run post-install, giving us basic
>> confidence that the ISO is good for further testing (i.e. compile a list
>> of post-install smoke tests that we could run with Jenkins).
>> - This task is not about generating code, but about deciding which of
>> the installed packages are important and worth testing in a daily test
>> suite. We could split it into separate tasks for different people if we
>> first generate a list of apps to use for test case generation.
> I'd like to start by encouraging some debate here. As a developer of
> Ubuntu, I consider certain tools critical, such as Python itself.
> However, to what extent is it worthwhile to test that Python works, for
> example? I guess it can't do any harm.
>
> One thing I would like to see tested is apt - the ability to install
> packages is critical.
>
> Personally I think the most important thing is to test as many different
> configurations as possible (which we may already be doing), such as
> encrypted home directories and choosing whether or not to install
> non-free software/updates during installation.
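For illustration, a post-install smoke check along these lines could start as a small script that verifies the expected executables are present; the app names below are placeholders, not the real default-app list (which task 1 would produce from the ISO manifests):

```python
import shutil

# Placeholder list: the real one would come from the ISO install manifest
# compiled in task 1; these names are only illustrative.
DEFAULT_APPS = ["sh", "ls", "python3"]

def missing_apps(apps):
    """Return the apps whose executables cannot be found on PATH."""
    return [app for app in apps if shutil.which(app) is None]

if __name__ == "__main__":
    missing = missing_apps(DEFAULT_APPS)
    if missing:
        raise SystemExit("smoke check failed, missing: " + ", ".join(missing))
    print("smoke check passed")
```

A runner such as Jenkins could execute this right after install and fail the build on a non-zero exit status.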
>> 2) We need to fix the existing test cases in the tracker and convert
>> them to a better, more understandable format. Basically, we need to
>> make them unambiguous and meaningful. Some of them are redundant, some
>> are too long to be a single test case, and others no longer make
>> sense. This is a tidy-up task that needs to be done.
> Is this to be done before putting them into Litmus? I can gladly help
> either way.
>>
>> - Metrics
>> (https://blueprints.launchpad.net/ubuntu/+spec/other-p-qa-metrics):
>> 3) I have some tasks here that could use some help. We need to look at
>> the codebase of Ubuntu main and see how to instrument the code so that
>> we can start generating code coverage metrics. This is about compiling
>> the Ubuntu code with gcov and generating binaries that can be used for
>> this purpose (how to install them is still to be determined).
>> - This task requires in-depth knowledge of the code and familiarity
>> with how things are built and how they can be changed to build in a
>> different way. We should decide where to start instrumenting and why.
> The software-center developers have a very good implementation of code
> coverage reports, so it's worth looking at that package (in the 'tests'
> directory) at least for how to do this with a Python application. This
> is the task I'd be most interested in helping with.
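As a rough illustration of the line-coverage idea (not the software-center implementation), Python's standard-library trace module can record which lines a test run actually executes:

```python
import trace

def triangle_number(n):
    """Sum the integers from 1 to n."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

# Count executed lines without printing a live trace.
tracer = trace.Trace(count=True, trace=False)
result = tracer.runfunc(triangle_number, 5)
print(result)  # 15

# counts maps (filename, line_number) -> execution count; a coverage
# report is essentially these counts compared against all source lines.
counts = tracer.results().counts
```

gcov plays the analogous role for C code built with the --coverage flag, which is presumably what task 3 needs for compiled packages.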
>>
>> 4) Look into how to do test escape analysis (TEA) with Launchpad. TEA
>> will tell us, after Precise, whether we missed problems that were found
>> by others after we did our testing; it should help us understand
>> whether we should be adding new test cases in those "missed" areas.
> Are we tagging bugs that 'we' found (I'm assuming the 'we' here means
> the routine testing such as smoke testing and ISO testing rather than
> regular use by end-users) in some way?
>>
>> 5) Gather test cases from defects. This is about making a list of
>> defects that have been fixed for Oneiric and that have steps to
>> reproduce the problem, which need to be gathered and written up as
>> proper test cases.
> Does someone already have this list? Release managers for example?
>>
>> - Test Case Management System
>> (https://blueprints.launchpad.net/ubuntu/+spec/other-p-qa-test-case-management-tool)
>>
>> 6) Still not available, but when it is, review and give feedback
>> about Litmus and its usability. Also, help decide how to configure it
>> to make it more suitable for Ubuntu community testing.
> Just let us know when it's ready ;)
>>
>>
>> - QA Backlog tasks
>> (https://blueprints.launchpad.net/ubuntu/+spec/other-p-qa-backlog)
>> 7) Review and change the wiki to reflect the new approach to QA.
>>
>>
>> Please bear in mind that, since we don't have the test case management
>> tool up and running yet, we need to keep our test cases in text files or
>> OpenOffice documents (preferably spreadsheets) for now. As soon as we
>> have chosen a tool to handle them, we will use it.
>>
>> I have added a template at the bottom of the test cases page, feel free
>> to use it for your newly generated test cases:
>> https://wiki.ubuntu.com/QATeam/TestCase
>>
>> You can also modify it to contain a link to the old test case whenever
>> you are improving an existing test case.
>>
>>
>> Let us know which tasks you are interested in and I will map the
>> tasks in the blueprints to people, so that we keep track of what
>> everyone is doing and do not duplicate work. I have numbered the tasks
>> to make them easier to discuss.
>>
>> You don't need to take an entire task; if you feel you can only work
>> on two or three test cases, just say so and we will make sure nobody
>> else is working on the same ones as you.
>>
>> Looking forward to your answers!
>> Gema
>>
>>
>
>
--
Gema Gomez-Solano <gema.gomez-solano at canonical.com>
QA Team https://launchpad.net/~gema.gomez
Canonical Ltd. http://www.canonical.com