Per one of my favorite podcasts, WNYC’s On the Media, journalists are finding it increasingly difficult to check facts at a pace that keeps up with modern news coverage. To be successful, they need dedicated fact checkers. Seem familiar yet?
Journalists depend on these fact checkers to keep them out of trouble. And fact checkers need their own skill set, one that lets them focus on fact checking. They have to be creative and use various tricks, like following only trustworthy people on Twitter and speaking multiple languages to understand the broader picture. How about now, seem familiar?
Okay, try this: Craig Silverman, founder of Regret the Error, a media error reporting blog, said “typically people only notice fact checkers if some terrible mistake has been made”. Now it seems familiar, right?
The audience of fact checkers, like the audience of software testers, has no idea how many errors were caught before release. They only see the errors that weren’t.
Sometimes I have a revenge fantasy that goes something like this:
If a user finds a bug and says, “that’s so obvious, why didn’t they catch this?”, their software will immediately revert back to the untested version.
…Maybe some tester love will start to flow then.
There’s no incentive to look the other way when we notice bugs at the last minute.
We are planning to release a HUGE feature to production tomorrow. Oops! Wouldn’t you know it…we found more bugs.
Back in the dark ages, with Scrum, it’s possible we may have talked ourselves into justifying the release without the bug fixes: “these aren’t terrible…maybe users won’t notice…we can always patch production later”.
But with Kanban, it went something like this:
“…hey, let’s not release tomorrow. Let’s give ourselves an extra day.”
- Nobody has to work late.
- No iteration planning needs to be rejiggered.
- There’s no set, established maintenance window restricting our flexibility.
- Quality did not fall victim to an iteration schedule.
- We don’t need to publish any known bugs (i.e., there won’t be any).
I just came from an Escape Review Meeting. Or as some like to call it, a “Blame Review Meeting”. I can’t help but feel empathy for one of the testers who felt a bit…blamed.
With each production bug, we ask, “Could we have done something to catch bugs of this nature?” The System 1 response (the fast, intuitive one) is “no, it’s way too difficult to expect a test to have caught it”. But after five minutes of discussion, the System 2 response (the slower, deliberate one) emerges: “yes, I can imagine a suite of tests thorough enough to have caught it; we should have tests for all that”. Ouch, this can really start to weigh on the poor tester.
So what’s a tester to do?
- First, consider meekness. As counterintuitive as it seems, I believe defending your test approach is not going to win respect. IMO, there is always room for improvement. People respect those who are open to criticism and new ideas.
- Second, entertain the advice but don’t promise the world. Tell them about the Orange Juice Test (see below).
The Orange Juice Test is from Jerry Weinberg’s book, The Secrets of Consulting. I’ll paraphrase it:
A client asked three different hotels to supply 700 glasses of fresh-squeezed orange juice the next morning, all served at the same time. Hotel #1 said “there’s no way”. Hotel #2 said “no problem”. Hotel #3 said “we can do that, but here’s what it’s going to cost you”. The client didn’t really want orange juice; the request was a test. They picked Hotel #3.
If the team wants you to take on new test responsibilities or coverage areas, there is probably a cost. What are you going to give up? Speed? Other test coverage? Your kids? Make the costs clear, let the team decide, and there should be no additional pain on your part.
Remember, you’re a tester, relax.