OT How to Write a Software Review

newcr newcr at mysoul.com.au
Mon Nov 19 10:58:00 GMT 2007


I am not sure if I should reply as it is not Ubuntu related ...  but 
here goes .... *I hope it is helpful* ...


If the IT team did their job, you should have been given testing 
templates/documents to work with. Testing seems to be a massively 
under-utilised area and ends up being an (expensive) black hole of 
service requests down the track.

For each problem you find, a standard template (for consistency's sake) 
would be good. eg

*For an end user test plan* ..... (maybe landscape page layout) ... The 
fill-in fields could be something like the following (a rough sketch of 
the same template as code follows the list):

    * Document Header
          o Software product/project
          o Date period tested or some referential date
          o Feature tested, category of test, or aspect of the product
            being tested
          o Type of test eg requirements testing, stress testing,
            correctness testing, etc (see notes below)
          o Name of person testing
          o Some reference identifier for this test document (for
            communication purposes)
    * Problems found (one or more), maybe in tabular format on a landscape
      page layout.
          o Date test started
          o Date test ended
          o Description of test
          o Test inputs
          o Expected result
          o Actual result
          o Priority
          o Sign off date (date when problem was corrected)
          o Reference link/s to follow-up testing / related tests or
            problems.
          o Comments
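
A rough, purely illustrative sketch (my own, not from any official 
template) of the record above as a Python dataclass; the field names 
simply mirror the bullet list, so rename them to whatever your IT team 
actually uses.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class TestRecord:
        # Document header fields
        product: str              # Software product/project
        test_period: str          # Date period tested or referential date
        feature: str              # Feature / category / aspect tested
        test_type: str            # eg "requirements", "stress", "correctness"
        tester: str               # Name of person testing
        reference_id: str         # Identifier for this test document

        # Per-problem fields
        date_started: str = ""
        date_ended: str = ""
        description: str = ""
        test_inputs: str = ""
        expected_result: str = ""
        actual_result: str = ""
        priority: str = "medium"              # eg low / medium / high
        sign_off_date: Optional[str] = None   # date the problem was corrected
        related: List[str] = field(default_factory=list)  # related tests/problems
        comments: str = ""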


Some Notes:

    * *Programmers need to try to replicate your problem*. Screen shots
      are good, and if you have the software, screen videos can be very
      helpful. Note the test conditions too: was the network slow at the
      time of testing? Did you have other programs running? etc ...
    * Testing should have started from the very beginning of the
      project, but late is better than not at all. Some examples of
      types of tests ....
          o Requirements testing - Does the software actually do what
            the end users need it to do? Are there features missing that
            you need? Are there bells and whistles included that you
            don't need? etc...
          o Correctness testing - compare expected results with actual
            results for known inputs (a small worked example follows
            these notes) ...
          o Stress testing - Does the software perform under pressure?
            eg lots of data, network/Internet load issues (including
            bottlenecks), more than one person accessing the same data
            at once, etc ...
          o Error handling - When an error happens, does the software
            give a friendly error message, or are you and the
            programmers left guessing what happened?
          o Robustness - eg it doesn't fall over every 3 days.
          o Timeliness - eg you are not waiting half an hour for a
            report to be produced (probably part of stress testing)
          o Usability - Is it easy to use? Is it intuitive? Is the
            layout as simple as possible? Is it easy to navigate? User
            happiness factor? Does the software actually make it easier
            to get work done than the last system? (Some exceptions to
            this might be where the new software was bought in to handle
            new elements, eg longer customer codes, etc...)
          o Security issues/risks
          o Migration issues - If you have to migrate data from the old
            system to the new system, all the issues involved ... eg
            data integrity, roll-back procedures, etc... (the
            programmers' job, but it affects you if it goes wrong)
    * Other notes
          o Priority and risk assessment work together. The consequence
            of the risk affects the priority.
          o Fixing problems can (and does) cause follow-on problems or
            show up other problems. When re-testing, consider more than
            just the isolated incident.
          o If it is an internal product, priority is important. With
            time, finances and resources as a reality check, some issues
            may get delayed (possibly forever). Be honest with priority.
          o Internal programming versus outsourced software.
                + If the programming team is internal to your company,
                  then usually the programmers are fairly helpful and
                  communication can be a bit more relaxed.
                + If the work is outsourced, then make sure the problems
                  are written down, clear and understandable. Keep a
                  written record of anything important (even a follow-up
                  email to a phone conversation). Do not be pressured
                  into signing off on anything, otherwise you face being
                  stuck with it. Once they have your money, it is "Catch
                  me if you can".
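
To make the "correctness testing" point above concrete (expected versus 
actual results), here is a minimal sketch. calculate_invoice_total is a 
made-up stand-in for whatever function or report your software actually 
provides; the test just checks that the actual result matches the 
expected result for a known input.

    import unittest

    def calculate_invoice_total(items, gst_rate=0.10):
        """Hypothetical stand-in: sum (qty, price) line items and add GST."""
        subtotal = sum(qty * price for qty, price in items)
        return round(subtotal * (1 + gst_rate), 2)

    class CorrectnessTest(unittest.TestCase):
        def test_invoice_total_matches_expected(self):
            items = [(2, 10.00), (1, 5.50)]   # test inputs
            expected = 28.05                  # expected result: 25.50 + 10% GST
            actual = calculate_invoice_total(items)
            self.assertEqual(expected, actual)  # actual must match expected

    if __name__ == "__main__":
        unittest.main()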

The review

    * Would summarise your findings and include the filled-out testing
      plan documents.
    * Would group issues together, eg by feature and priority (a small
      sketch of this follows the list).
    * Make recommendations
    * Possibly include a priority schedule
    * Possibly include risk assessment from a user's viewpoint. eg Maybe
      your product is getting rolled out in increments, and if these
      problems don't get fixed by this date, or these features are not
      ready by this date, how does that impact us? etc...
    * If there are a number of end users testing the software product
      then a review template should be set up.
    * etc ...
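
As a rough sketch of the "group issues together" point above (an assumed 
workflow, not anything prescribed), the filled-out records could be 
summarised for the review by grouping problems by feature and sorting 
each group by priority; the data below is made up.

    from collections import defaultdict

    PRIORITY_ORDER = {"high": 0, "medium": 1, "low": 2}

    problems = [
        {"feature": "Reports", "description": "Monthly report takes 30 minutes", "priority": "high"},
        {"feature": "Login",   "description": "Unfriendly error on a bad password", "priority": "low"},
        {"feature": "Reports", "description": "Totals column rounds incorrectly", "priority": "high"},
    ]

    # Group problems by the feature they affect.
    grouped = defaultdict(list)
    for p in problems:
        grouped[p["feature"]].append(p)

    # Print each feature's problems, highest priority first.
    for feature, issues in sorted(grouped.items()):
        print(f"== {feature} ==")
        for issue in sorted(issues, key=lambda i: PRIORITY_ORDER[i["priority"]]):
            print(f"  [{issue['priority']}] {issue['description']}")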



    http://en.wikipedia.org/wiki/Software_testing


Regards
Chris



Sebastian wrote:
> Hi all,
> I know this might be slightly OT, but I thought that OS people are in
> frequent contact with software and reviews of it.
>
> So here the thing, at work we got a software tailored to our needs.
> Written for us only :-) I am in the process of working with the
> software and took lots of notes during my testing.
> The notes contain errors, interface corrections, feature requests and
> changes, and so on.
> Things like price, system requirements, beauty of the interface are
> not really what I need to concentrate on.
>
> As I am not a programmer I was wondering if there is a HowTo about
> writing such a review, what else do I have to test, check. By now I
> have the feeling I have to write two, one for my boss and one for the
> programmer... :-|
> How should I structure my review, errors first?
> Which is the best way to document and/or report such errors or wanted changes?
>
> I had a look but could not come up with many useful sites (e.g.
> http://www.ehow.com/how_2091728_write-educational-software-review.html)
>
> Any thoughts are appreciated!
>
> Cheers,
>
> Sebastian
>
>   
