Successful test automation is the elephant in the room for many testers. We all want to do it because manual testing is hard, our manager and devs would think we were bad-ass, and…oh yeah, some of us believe it would improve our AUT quality. We fantasize about triggering our automated test stack and going home, while the manual testers toil away. We would even let them kiss the tips of our fingers as we walked out the door.

…sounds good.

So we (testers) make an attempt at automation, exaggerate the success, then eventually feel like losers. We spend more time trying to get the darn thing to run unattended and stop flagging false bugs, while the quality of our tests takes a back seat and our available test time shrinks.

We were testing one product. Now we are testing two.

The two obvious problems are: 1) most of us are not developers, and 2) writing a program to test another program is more difficult than writing the original program. …Ah yes, a match made in heaven!

I watched an automated testing webinar last week. It was more honest than I expected. The claim was, to be successful at test automation the team should not expect existing testers to start automating tests. Instead, a new team of developers should be added to automate tests that testers write. This new team would have their own requirement reviews, manage their own code base, and have their own testers to test their test automation stack. This does not sound cheap!

While watching this webinar, something occurred to me. Maybe we don’t need test automation. Why do I think this? Simple. Because somehow my team is managing to release successful software to the company without it. There is no test automation team on our payroll. Regression testing is spotty at best, yet somehow our team is considered a model of success within the company. How is this possible when every other test tool spam email or blog post I read makes some reference to test automation?

In my case, I believe a few things have made this possible:

  • The devs are talented and organized enough to minimize the amount of stuff they break with new builds. This makes regression testing less important for us testers.
  • The BAs are talented enough to understand how new features impact existing features.
  • The testers are talented enough to know where to look. And they work closely with devs and BAs to determine how stuff should work.
  • The user support team is highly accessible to users, knowledgeable about the AUT and the business, and works closely with the BAs/devs/testers to get the right prod bugs patched quickly. The entire team is committed to serving the users.
  • The users are sophisticated enough to communicate bug details and use workarounds when waiting on fixes. The users like us because we make their jobs easier. The users want us to succeed so we can keep making their jobs easier.
  • The possibility of prod bugs resulting in death, loss of customers, or other massive financial loss is slim to none.

I suspect a great many software teams are similar to mine. I'm interested in hearing from other software teams that do not depend on tester-driven test automation.

I do use test automation to help me with one of my simple AUTs, which happens to lend itself to automated testing (see JART). However, in my experience, there are few apps that are easy to automate with simple checks.

2 comments:

  1. Eric Jacobson said...

    Paul,

    Thanks for the flattery, I think.

    It's all stuff you can say on network TV but I appreciate your candor. I changed one word to "darn" and removed the word for the place said to be the opposite of heaven. But I just couldn't bring myself to change "kick-ass" to "Kick-butt". It just sounded wimpy. In fact, I decided "bad-ass" fits much better.

    I'll try to keep the profanity out of my posts if you try to keep reading them. Thank you!

  2. Unknown said...

Hi Eric, I'd be interested to hear about the release cycles for the applications you refer to. The largest automation suite I worked on recently was for an online engine that was high-revenue and mission critical, unlike yours, but that also had releases every one to two months and a likelihood that code changes were widely spread. As such, regression testing is a big job, and automation seems to have produced a return in terms of cost but, more importantly, cycle time and confidence. cheers, John www.webtest.co.nz
