Question: What metric or easily understood information can my test team provide users to show our contribution to the software we release?

I just got back from vacation and am looking at a beautiful pie chart that shows the following per iteration:

  • # of features delivered
  • # of bugs found in test vs. prod
  • # of bugs fixed
  • # of test cases executed

After a series of buggy production releases, my team (or at least the BAs) has decided to provide users with colorful charts depicting how hard we’ve been working each iteration. My main gripe is providing my BAs with a # representing executed test cases.
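As an aside, the "bugs found in test vs. prod" slice is the one chart ingredient above that maps onto a commonly cited ratio: defect detection percentage (DDP), the share of all known bugs that testing caught before release. A minimal sketch, with made-up counts for illustration (the post doesn't give real figures):

```python
def defect_detection_percentage(found_in_test: int, found_in_prod: int) -> float:
    """Of all bugs eventually found, what percentage did testing catch before release?"""
    total = found_in_test + found_in_prod
    if total == 0:
        return 0.0  # no known bugs yet; avoid dividing by zero
    return 100.0 * found_in_test / total

# Illustrative numbers only: 45 bugs caught in test, 5 escaped to production.
print(defect_detection_percentage(45, 5))  # 90.0
```

Whether a single percentage like this "speaks to users about quality" is exactly the question the post is wrestling with.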

First, I feel uncomfortable measuring tester value based on test case count, for obvious reasons.

Second, the pie chart makes it look like all we do is test. One slice lists 400 tests. Another lists 13 features... a strange juxtaposition.

Third, I’m not even sure how to produce said count. I certainly don’t encourage my test team to exhaustively document their manual test cases, nor do I care how many artifacts they spread distinct tests across. Do I include 900+ automated UI test executions? Do I include the thousands of unit test executions beyond that? Does the final # tell users anything about quality? Does it represent how effective testers are? Not to me. Maybe it does to users...

PR is important, especially when your reputation takes a dive. I, too, want to show users how hard my QA team works, and I want to show it in the easiest possible way. I could provide a long list of tests, but they don't want to read that. What am I missing? What metric or easily understood information can my test team provide users to show our contribution to the software we release?


  1. wildcrayons said...

    Show them the video you did.

  2. Michael Bolton said...

    My reply was too long for Blogger, so I answered here.


    ---Michael B.

  3. Danil said...

    Looks to me as though the question is backwards.

    What do your users care about?
    What response do you want back from your users?

  4. Eric Jacobson said...

    Good question. My answer: the users care about working software.

    The scenario described in my post is more of a public relations plan. It is a development team trying to reassure its users that its programmers, BAs, and testers are all working hard, doing their best to deliver a top-notch product.

  5. Marlena said...

    2 things:

    1. I've been thinking about this recently because I'm working at getting some testing done earlier in the software process.

    This has meant being much more diligent about reading through specs and communicating with designers.

    Consequently, I've noticed that there's never a record if I have a quick conversation with a PM or a designer that results in a better spec or a better design.

    2. This is one reason why I'm very glad I can use a wiki for recording my testing. It sounds like your BAs want more visibility into the testing process. I found that the BAs I worked with did not like looking at my TCM system, but once I put tests in the wiki, they were much more visible and the business was much more clued in.

    It can be challenging to get good information about tests in an aggregated form out of the tools we have today.
