Demonstrate quality improvement

Top management wants each group to show year-over-year improvement (i.e. demonstrating gains with data, not just opinion). How have you shown improvement in quality? What metrics did you use?

This is not about ranking one tester against another. It is about showing the growth of the department, while also giving individual testers the opportunity to highlight personal improvements.

+2




4 answers


It is important to know exactly what your QA department is doing. This will differ slightly from company to company, but ultimately QA is a data collection operation. The number of bugs filed per person / project is easy to measure, but has little to do with how hard the QA team works or how effective they are.



Better to look at the percentage of major bugs found by customers after release, compared with those found by QA before release. As testing improves, this number should decrease. Also measure the number of test cases executed per release. As the QA process matures, you should see testers become more productive (through familiarity or automation).
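As a minimal sketch of the first metric, the "escape rate" is the share of major bugs that reached customers out of all major bugs found in a release cycle (the counts below are hypothetical):

```python
def escape_rate(found_by_customers, found_by_qa):
    """Percentage of major bugs that escaped to the field.

    A maturing QA process should drive this percentage down
    from release to release.
    """
    total = found_by_customers + found_by_qa
    if total == 0:
        return 0.0
    return 100.0 * found_by_customers / total

# Hypothetical counts for two consecutive releases:
print(escape_rate(12, 48))  # release 1.0 -> 20.0
print(escape_rate(5, 45))   # release 1.1 -> 10.0
```

Comparing the rate across releases, rather than the raw customer bug count, keeps the metric fair when the total number of bugs changes.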

+3




There are a number of misleading QA metrics, "bugs found" among them. Counting bugs is not bad in itself, but if the software has changed little, the number of bugs found will tend toward zero over time regardless of how well QA is working.

Measuring individual testers by the number of bugs they raise appeals to competitive types, but can encourage filing many trivial issues (which can be good or bad).

Some possible useful metrics:



  • the number of new bugs found in the field (i.e. bugs QA missed) - this should go down
  • time to retest and close fixed issues
  • the number of bug reports sent back for clarification (should be decreasing)
  • the number of bug reports closed as invalid - shows understanding of the product; should be decreasing
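The metrics above can be pulled from a bug tracker. A minimal sketch, assuming hypothetical record fields (`found_in`, `status`, `fixed`, `verified`):

```python
from datetime import date

# Hypothetical bug records exported from a tracker.
bugs = [
    {"found_in": "field", "status": "closed",
     "fixed": date(2023, 3, 1), "verified": date(2023, 3, 4)},
    {"found_in": "qa", "status": "closed",
     "fixed": date(2023, 3, 2), "verified": date(2023, 3, 3)},
    {"found_in": "qa", "status": "invalid",
     "fixed": None, "verified": None},
]

# Bugs found in the field (i.e. bugs QA missed) - should go down.
field_bugs = sum(1 for b in bugs if b["found_in"] == "field")

# Average days to retest and close fixed issues.
retest_days = [(b["verified"] - b["fixed"]).days
               for b in bugs if b["fixed"] and b["verified"]]
avg_retest = sum(retest_days) / len(retest_days)

# Reports closed as invalid - should be decreasing.
invalid = sum(1 for b in bugs if b["status"] == "invalid")

print(field_bugs, avg_retest, invalid)  # 1 2.0 1
```

Tracking these per release, rather than per tester, keeps the focus on departmental growth.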

If you have stated goals - for example, moving to an automated testing system - progress toward them can itself be a metric. So if you have 10,000 test cases, your metric might be how many of them have been automated, and how many of those pass/fail.
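That automation metric is a pair of simple ratios; a sketch with hypothetical suite numbers:

```python
def automation_metrics(total_cases, automated, passed):
    """Return (coverage, pass_rate) as percentages.

    coverage:  how much of the test suite is automated - should rise.
    pass_rate: how many of the automated tests currently pass.
    """
    coverage = 100.0 * automated / total_cases
    pass_rate = 100.0 * passed / automated
    return coverage, pass_rate

# Hypothetical snapshot: 10,000 cases, 2,500 automated, 2,400 passing.
coverage, pass_rate = automation_metrics(10_000, 2_500, 2_400)
print(coverage, pass_rate)  # 25.0 96.0
```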

There's a really good article discussing this at: http://www.claudefenner.com/content/detail/QAMetricsPage.htm

+3




The sophistication of the bugs found could be one interesting metric: is it a trivial failure (just load the web page and it breaks), or does it take multiple steps to reproduce? Watching how this changes over time can be revealing, although it depends somewhat on how well the developers built the software in the first place.

How often bugs are sent back for clarification can also be helpful. If developers spend many hours paired with QA just to understand bug reports, that is not the most productive use of anyone's time.

Finally, it might be worthwhile for someone to create an internal QA 101 guide, so that practices and knowledge are written down and revised over time. That record itself demonstrates growth in understanding different testing methods and in choosing the right one for each situation. These are my suggestions.

+1




I think the best way to measure the QA team's performance with respect to bug reports is the fix rate. If developers fix most of the bugs you report, it shows you are finding quality bugs that deserve attention. The number of invalid bugs should count as a negative, because such bugs waste developer time.
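A minimal sketch of this idea, assuming hypothetical tracker counts and a 1:1 weighting of the invalid-bug penalty (the weighting is an assumption, not part of the answer):

```python
def qa_score(fixed, wont_fix, invalid):
    """Fix rate minus an invalid-bug penalty, both as percentages
    of all bugs reported in the cycle."""
    reported = fixed + wont_fix + invalid
    fix_rate = 100.0 * fixed / reported       # bugs worth fixing
    invalid_rate = 100.0 * invalid / reported  # wasted developer time
    return fix_rate - invalid_rate

# Hypothetical cycle: 80 fixed, 15 won't-fix, 5 invalid.
print(qa_score(fixed=80, wont_fix=15, invalid=5))  # 75.0
```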

0








