If you read my previous post, you know I help test a product that is on its 80th iteration. This product has few automated regression tests (we are working to change that). Currently, my test team faces regression testing armed with nothing more than their brains, their hands, and a big ticking clock.
Each four-week iteration leaves us a little less than a week for regression testing. If you can relate to my situation, you may be interested in two painless practices we have adopted to better handle the ever-increasing challenge of regression testing.
- Group Regression Test Planning – We get the devs, BAs, and testers into a room and spend an hour brainstorming to determine the top-priority regression test areas of focus. We scan through the iteration's changes and draw on each other's knowledge to draft what will become the priority "1" tests or areas of focus, captured in a shared MS Excel regression test list. The prior iteration's regression tests drop a level in priority, and so on down the list.
- Group Regression Test Sessions – We fill a large bowl with chocolate and invite the BAs and devs to join the testers in our classroom for two 3-hour sessions of group regression testing. All participants track their progress on the shared regression test list, and we tackle it as a team. The team approach also occasionally shakes out multi-user-scenario bugs, since our product depends heavily on user locking.
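The priority-aging idea behind our planning meeting is simple enough to sketch. This is a hypothetical illustration, not our actual spreadsheet logic; the test names and the one-level-per-iteration demotion rule are assumptions for the example:

```python
def age_priorities(test_list, new_focus_areas):
    """Demote every existing test one priority level, then add
    the new iteration's focus areas at priority 1.

    `test_list` maps a test area name to its priority (1 = highest).
    """
    aged = {name: priority + 1 for name, priority in test_list.items()}
    for area in new_focus_areas:
        aged[area] = 1  # this iteration's top-priority focus areas
    return aged

# Last iteration's list ages down as the new focus areas come in:
tests = {"login": 1, "reporting": 2}
tests = age_priorities(tests, ["record locking", "export"])
# "login" drops to 2, "reporting" to 3; the new areas enter at 1
```

In practice we do this by hand in the shared spreadsheet during the hour-long meeting, but the mechanics are the same: yesterday's hot spots slide down as today's changes take the top rows.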
Our approach is not to complete a set number of regression tests before we ship. Instead, we complete as many tests as we can within the time provided. The prioritized test list makes this possible and shields us from things outside our control.
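The time-boxed approach above can be sketched as a simple selection: work through the list highest priority first and stop when the session's budget runs out. The test names and time estimates here are invented for illustration:

```python
def plan_session(tests, budget_minutes):
    """Return the tests that fit the time budget, highest priority first.

    `tests` is a list of (name, priority, estimated_minutes) tuples,
    where priority 1 is the most important.
    """
    planned, remaining = [], budget_minutes
    for name, priority, minutes in sorted(tests, key=lambda t: t[1]):
        if minutes <= remaining:
            planned.append(name)
            remaining -= minutes
    return planned

session = plan_session(
    [("record locking", 1, 90), ("login", 2, 60),
     ("reporting", 3, 120), ("export", 1, 45)],
    budget_minutes=180,
)
# Both priority-1 areas fit; lower priorities wait for the next session.
```

Whatever is left unrun when the clock stops was, by construction, the least important work, which is exactly the shield the prioritized list provides.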
Is anyone else doing regression testing with mostly manual testing? How do you pull it off?