Rockbox mail archive

Subject: Re: Pre-release testing framework

Re: Pre-release testing framework

From: Lorenzo Miori <memorys60_at_gmail.com>
Date: Tue, 29 May 2012 20:28:46 +0200

I just read the email about the test cases: it is a very good idea.
I completely agree with this, and running the project with a
methodology can help quite a lot.
I don't know if you are aware of the "agile" methodologies; if not,
look up "user stories" and "acceptance tests" (from XP/Scrum and
other agile methods).

Basically, a user story is a card with a title, a description (with
points / risk and priority) and the name of its acceptance test;
every card is identified by a name.
An acceptance test defines input(s), precondition(s) and expected
result(s).

Example:

User story # 1: playAudio
Priority: 1
Points: 3
Description: the player must be able to output some sound. [Avoid
implementation-specific terminology: whether the audio comes from a
wifi stream or from a file does not matter here.]
Test: playAudioTest

playAudioTest
Input: the user chooses a file and playback starts.
Precondition: Rockbox is running, no other audio is playing, the
volume is set to a reasonable level...
Output: audio is output to the user [whether through headphones or
another device does not matter; we only need to state what the user
shall be able to do / receive]
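
Just to make the idea concrete, here is a rough sketch of how a user
story and its acceptance test could be written down as data; the
class and field names are only illustrative, not an existing Rockbox
format:

from dataclasses import dataclass, field
from typing import List

@dataclass
class AcceptanceTest:
    name: str              # e.g. "playAudioTest"
    inputs: str            # what the user does to trigger the behaviour
    preconditions: List[str]
    expected_result: str   # what the user must be able to do / receive

@dataclass
class UserStory:
    title: str             # e.g. "playAudio"
    priority: int          # lower number = more important
    points: int            # rough effort / risk estimate
    description: str       # implementation-neutral wording
    tests: List[AcceptanceTest] = field(default_factory=list)

play_audio = UserStory(
    title="playAudio",
    priority=1,
    points=3,
    description="The player must be able to output some sound.",
    tests=[AcceptanceTest(
        name="playAudioTest",
        inputs="The user chooses a file and starts playback.",
        preconditions=["Rockbox is running",
                       "no other audio is playing",
                       "volume is set to a reasonable level"],
        expected_result="Audio is output to the user.",
    )],
)

Whether this ends up as Python, a wiki table or a web form does not
matter; the point is that every story carries its acceptance test
with it.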

Of course this is a bit tedious, but why not? It can help: once we
define some common steps, identifying a problem will be easier for
sure! (Bertrik's test case format could be written down in the same
way; a rough sketch is appended below his quoted mail.)

My 2 cents :)

Some reading on this:

http://www.slideshare.net/ruthenry/agile-acceptance-tests

2012/5/29 Bertrik Sikken <bertrik_at_sikken.nl>:
> Hi all,
>
> Last week at devcon euro 2012 we talked about pre-release testing and
> I'm volunteering to help things forward w.r.t. testing.
>
> One of the problems we've seen with the last release was that really
> basic functionality like audio file playback and radio playback did
> not work on some targets and we were not aware of it until many days
> later. Back in 2010 Björn already proposed a framework to recruit
> the help of users to test release candidates in a systematic way:
> http://www.rockbox.org/mail/archive/rockbox-dev-archive-2010-10/0114.shtml
> This helps to at least be *aware* of any problems (what to do with
> that information is another discussion).
>
> I probably can't help with setting up a web-based framework, but
> I can help with thinking up test cases.
>
> The test case format I'm familiar with defines the following
> properties:
> * some unique test case id, like
>  "basics.playback.001"
> * a descriptive title, like
>  "Verify basic audio playback for all supported audio formats"
> * (a reference to a requirement or spec, not really relevant for
>  rockbox I think)
> * a pre-condition, like:
>  "The audio format test file set 1 has been copied to the player"
> * action to perform the test, like:
>  "Using the file browser, browse to the test file set and select the
>  first audio file. Verify that all files play correctly."
> * expected result, like:
>  "All files play and sound correct.
>  Failure criteria: * Audio playback shows audio artifacts (like
>  crackling, pitch or volume errors). * Audio playback is skipped."
> (perhaps this is already an advanced test)
>
> Stuff the tester can fill in:
> * test result: PASS, FAIL, SKIPPED
> * test remarks: free text
>
> With respect to test strategy: In my opinion, we can start with
> a set of rather basic tests to verify high-level behaviour, rather
> than to go for test completeness. A test suite taking about (say)
> one hour should keep the barrier low for people who want to
> participate in testing.
>
> What do you think?
>
> Kind regards,
> Bertrik
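
To make the format above concrete, here is a rough sketch of how such
a test case, including the fields a tester fills in, could be stored;
the PASS/FAIL/SKIPPED values and the field names come from Bertrik's
mail, the Python representation itself is only an illustration:

from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Result(Enum):
    PASS = "PASS"
    FAIL = "FAIL"
    SKIPPED = "SKIPPED"

@dataclass
class TestCase:
    # fixed by whoever writes the test case
    case_id: str           # e.g. "basics.playback.001"
    title: str
    precondition: str
    action: str
    expected_result: str
    # filled in by the tester
    result: Optional[Result] = None
    remarks: str = ""

playback_001 = TestCase(
    case_id="basics.playback.001",
    title="Verify basic audio playback for all supported audio formats",
    precondition="The audio format test file set 1 has been copied "
                 "to the player",
    action="Using the file browser, browse to the test file set and "
           "select the first audio file. Verify that all files play "
           "correctly.",
    expected_result="All files play and sound correct.",
)

# what a tester would record afterwards
playback_001.result = Result.PASS
playback_001.remarks = "free text, e.g. which player and build were used"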
Received on 2012-05-29

