We’ve been interviewing to fill a couple of QA positions on our team. My favorite part of each interview is my “test this light switch” exercise. It reveals a lot about each candidate’s testing skills.

I point to the light switch in the room and say “test this light switch”. Here is a sampling of how candidates have responded:

  • some asked if there were any requirements (this is a great way to start!)
  • some just started testing, making lots of assumptions (not so great)
  • one candidate smiled and thought I was kidding. Even after I asked lots of questions to prime him, he stared uncomfortably at the light switch and offered me close to nothing (embarrassing for both of us)
  • one candidate walked up to the light switch and began testing it as she walked me through her thought process. After some solid high-level tests, she wanted to see electrical schematics for the building and asked me all kinds of questions about emergency backup power, how many amps the room’s lights draw, and what else was on that circuit. She wanted to remove the trim plate to check the wiring against electrical code standards. She asked if the room’s lights could be controlled by a master switch somewhere else or by an energy-saver timer for off-hours. (these types of questions/tests make her a good fit for my team because the weakest test area of my AUT, the application under test, is its integration with other systems)
  • one candidate was good at coming up with cosmetic and usability tests (e.g., Is the switch labeled well? Can I reach it from the doorway when the room is dark? Does the trim plate match the room’s trim in color and style?)…not so important for my AUT but good tests for others perhaps.
  • one candidate went right for stress tests. He flipped the lights on and off as quickly as he could, and he tried to force the switch to stay halfway between on and off to see if it sparked or made the lights flicker.

The exercise also revealed each candidate’s confidence, creativity, technical depth, quickness of mind, persistence, and finally how interested they were in determining their mission and learning what I thought was important to know.

Most bug reports include Severity and Priority. On my team, everyone is interested in Priority (because it affects their work load). Severity is all but ignored. I propose that testers stop assigning Priority and only assign Severity.

Priority is not up to the tester; it is usually a business decision. Having testers assign Priority wastes their time, takes an important decision away from someone better suited to make it, and can misdirect the team’s workflow.

Bugs without Priority have to be read and understood by the customer team (so they can assign priority themselves). This is a good thing.

What about blocking bugs, you ask?

Some bugs are important to fix because they block testing. These bugs are best identified as blocking bugs by testers. They can be flagged as “Blocking Bugs” using an attribute independent of the Priority field. Think about it…if BlockingBugA is blocking testing that is less important than other, non-blocked testing, perhaps BlockingBugA only deserves a low priority.
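To make the proposal concrete, here is a minimal sketch in Python of what such a bug record might look like. The field and type names are hypothetical (this is not modeled on any particular bug tracker): the tester sets Severity and an independent blocking flag, and Priority stays empty until the customer team triages the bug.

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class Severity(Enum):
        LOW = 1
        MEDIUM = 2
        HIGH = 3
        CRITICAL = 4

    @dataclass
    class BugReport:
        title: str
        severity: Severity               # the tester assigns this at filing time
        blocks_testing: bool = False     # independent flag, not tied to Priority
        priority: Optional[int] = None   # left unset; the customer team triages it

    # The tester files the bug with Severity and the blocking flag only.
    bug = BugReport(
        title="Save button does nothing on the settings page",
        severity=Severity.HIGH,
        blocks_testing=True,
    )

    # Later, the customer team reads the report and assigns Priority themselves.
    # A blocking bug can still land a low priority if the testing it blocks
    # matters less than other, non-blocked testing.
    bug.priority = 3

With a record like this, a simple query on blocks_testing tells the team which bugs are holding up testing, without pretending those bugs are automatically the highest-priority fixes.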

Tell me where I’m wrong.

I recently read about 15 resumes for tester positions on my team. None of them told us anything about how well the candidate can test.

Here is what I saw:

  • All candidates list a ton of “technologies” they are familiar with (e.g., .Net, Unix, SQL, XML, MS Office)
  • They also list a bunch of off-the-shelf testing tools (e.g., TestDirector, LoadRunner, QuickTest Pro, SilkTest, BugZilla)
…So far I don’t know anything about how well they can test.
  • All candidates string together a bunch of test buzz words…something like, “I know white box testing, gray box testing, black box testing, stress testing, load testing, functional testing, integration testing, sanity testing, smoke testing, regression testing, manual testing, automated testing, user acceptance testing, etc.”
…as if I would be thinking, “yes, but do you know Glass Box Testing? That’s really what we’re looking for.”
  • Some candidates say something like “I wrote a 50-page test plan” or “I’m responsible for testing an enterprise application used by 1000 users”
…okay, so how well can you test? To be fair, this is a difficult skill to convey in a resume, and perhaps I am just not good at reading between the lines to determine which candidates would thrive as testers on my team. However, a candidate would probably have gotten an instant interview by including any of these:
  • My approach to testing is as follows…
  • See my software testing blog for my opinions on how to test.
  • My favorite testing books and blogs are…
  • I enjoy testing because…
Sigh. I guess the modern resume has not advanced far enough to reflect candidate traits like these. That’s why the interview questions will be so important. I get to participate in my first interview tomorrow, and I have come up with a list of fun questions and activities to help me see how well the candidate tests. One of them will be “Test that light switch over there on the wall completely.” (If any candidates are cool enough to read my blog, they’ll have a head start.)

What are your favorite tester interview questions?


