Confidence in quality
"Confidence in quality" is a concept I've been thinking about quite a bit lately and I thought it'd be an interesting topic to cover here. I'm curious what other people think about this idea and what you do to increase confidence in your quality assessments.
It's one thing to report a certain pass rate for a test pass; it's another thing to get yourself and your product group to believe it. For example, you might have spent three weeks testing and can report that your test pass ended with 97% of all tests passing (meaning 3% of the product scenarios still have bugs), but that doesn't really say what you covered or how you covered it. Those are important things to understand and to convey when making a statement about product quality.
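To make that concrete, here's a minimal sketch in Python of the kind of breakdown that makes a pass rate meaningful. The area names and results are made up for illustration: the overall number is the same 97%, but the breakdown shows where the failures actually cluster.

```python
from collections import defaultdict

def pass_rate_by_area(results):
    """Group (area, passed) results and report a pass rate per area.

    `results` is an iterable of (area_name, passed_bool) tuples --
    a stand-in for however your test harness reports outcomes.
    """
    totals = defaultdict(lambda: [0, 0])  # area -> [passed, total]
    for area, passed in results:
        totals[area][1] += 1
        if passed:
            totals[area][0] += 1
    for area, (passed, total) in sorted(totals.items()):
        print(f"{area:20s} {passed}/{total}  ({100.0 * passed / total:.1f}%)")

# Hypothetical results: 97 of 100 tests pass overall, but every
# failure is in the import/export area (which is only at 90%).
results = (
    [("setup", True)] * 40
    + [("import/export", True)] * 27 + [("import/export", False)] * 3
    + [("reporting", True)] * 30
)
pass_rate_by_area(results)
```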
So how can you increase confidence in your quality assessments? Here are a few suggestions:
1. Test Often. Ideally, you're doing some level of testing before your developers even check in their changes, though certain types of tests (e.g. setup validation) must be run on "real" builds; run those as soon as possible after each build. All too often (though less so today, I suspect), products are developed in-house and then rolled up and shipped off to test vendors after a major milestone. With the exception of post-deployment, this is the most expensive time to find a bug. By the time the bug reports come in, the product has already evolved further because the developers have been working on the next milestone in parallel; some of the bugs may already be fixed, and some of the features may have been cut or completely redesigned in the interim! If you need to spend time on other testing activities (e.g. test planning, hotfix testing for an earlier release, etc.) while your dev team marches on, rely on daily automation runs to hold down the fort until you can switch focus back to the current project (see the nightly-run sketch after this list).
2. Understand your code coverage. Note that I didn't say "drive code coverage to 80%" or some other number. If you understand what code your tests hit and what code they don't, you have some insight into how well you've covered the product. As I alluded to in my earlier post on test automation, driving toward a code coverage goal does have significant value, but it's not the only thing you should drive toward. You can't stand up in front of your product's management team and say "I got a 100% pass rate with 80% code coverage, so we're ready to ship." Who knows whether you actually validated anything while touching all that code? Surely you wouldn't say "I ran through 80% of the rooms in this new house, so I'm ready to buy it" - while blindfolded? You might have covered 80% of the total blocks in your application but missed an entire set of smaller but critically important files/assemblies/modules/DLLs/binaries. If you can drill down on your coverage and both understand and convey what is covered and what isn't (see the coverage sketch after this list), you'll go a long way toward establishing confidence in whether you've done the right level of testing.
3. Review test plans with the team. Get input from as many stakeholders as possible on the test plan. This process will not only help you find testing holes but will also help inspire confidence in your team members that you're doing the right level of testing. If the product plan or spec changes in any way, re-review the affected sections of the test plan with your team.
4. Keep your test automation running smoothly! The more "test failures" your daily automation runs report, the less valuable your automation suite is as a day-to-day quality measurement. If you have tests that are failing for reasons other than product bugs, I suggest taking them out of your daily test runs (see the quarantine sketch below). It's especially embarrassing to report a 75% pass rate due to "test issues" - it basically means you have no idea what the quality of the product is, because you can't rely on your automation to make an accurate assessment.
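For the daily automation runs mentioned in item 1, here's a minimal nightly-run sketch in Python. It assumes a pytest suite and its JUnit-style XML report (--junitxml is a real pytest option); the tests/ path, the results.xml filename, and the attribute names handled here are just the common conventions, so adjust for whatever harness your team actually uses.

```python
import subprocess
import xml.etree.ElementTree as ET

def nightly_run(report_path="results.xml"):
    """Run the suite once (e.g. from a nightly scheduled task) and
    summarize the pass rate from a JUnit-style XML report."""
    # --junitxml is a standard pytest option; "tests/" is a placeholder path.
    subprocess.run(
        ["python", "-m", "pytest", "tests/", f"--junitxml={report_path}"],
        check=False,  # failing tests should not abort the summary step
    )

    root = ET.parse(report_path).getroot()
    tests = failures = errors = skipped = 0
    # Handles either a single <testsuite> root or a <testsuites> wrapper.
    for suite in root.iter("testsuite"):
        tests += int(suite.get("tests", 0))
        failures += int(suite.get("failures", 0))
        errors += int(suite.get("errors", 0))
        skipped += int(suite.get("skipped", 0))

    run = tests - skipped
    passed = run - failures - errors
    rate = 100.0 * passed / run if run else 0.0
    print(f"{passed}/{run} passed ({rate:.1f}%), {skipped} skipped")

if __name__ == "__main__":
    nightly_run()
```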
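For the coverage drill-down in item 2, here's a sketch that reads a Cobertura-style coverage.xml (the format emitted by coverage.py's `coverage xml` command and by many other tools) and reports per-module line coverage instead of one overall number. The 60% floor and the file path are illustrative assumptions, not recommendations.

```python
import xml.etree.ElementTree as ET

def coverage_by_module(xml_path="coverage.xml", floor=0.60):
    """Print per-module line coverage from a Cobertura-style report,
    flagging modules below `floor` regardless of the overall number."""
    root = ET.parse(xml_path).getroot()
    overall = float(root.get("line-rate", 0)) * 100
    print(f"overall line coverage: {overall:.1f}%")

    for cls in root.iter("class"):  # one <class> element per source file
        name = cls.get("filename", cls.get("name", "?"))
        rate = float(cls.get("line-rate", 0))
        flag = "  <-- look here" if rate < floor else ""
        print(f"  {name:40s} {rate * 100:5.1f}%{flag}")

coverage_by_module()
```

The point of the flag isn't to enforce a number; it's to make the uncovered modules visible so you can explain what they are and why (or whether) that's acceptable.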
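And for item 4, one way to keep non-product failures out of the daily numbers is to quarantine them with a custom test marker. This sketch uses pytest's custom-marker mechanism, which is a real feature; the marker name "quarantined" and the example tests are our own invention.

```python
# pytest.ini (registers the custom marker so pytest doesn't warn):
#   [pytest]
#   markers =
#       quarantined: failing for test-infrastructure reasons; excluded from daily runs

import pytest

@pytest.mark.quarantined
def test_export_over_slow_network():
    # Placeholder for a test that fails for environment/harness reasons
    # (hypothetical example); keep it out of the daily run until fixed.
    ...

def test_export_basic():
    assert True  # placeholder for a real product check

# Daily run excludes quarantined tests so the pass rate reflects product
# quality, not harness noise:
#   pytest -m "not quarantined"
```

Quarantined tests still need an owner and a bug tracking their repair; the marker keeps them out of the daily signal, not out of sight.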
I can think of lots more, but I want to hear what you have to suggest too.
Happy testing!
-Jason