We’ve been interviewing to fill a couple of QA positions on our team. My favorite part of each interview is my “test this light switch” exercise. It reveals a lot about each candidate’s testing skills.
I point to the light switch in the room and say “test this light switch”. Here is a sampling of how candidates have responded:
- some asked if there are any requirements (this is a great way to start!)
- some just start testing with lots of assumptions (not so great)
- one candidate smiled and thought I was kidding. Even after I asked lots of questions to prime him, he stared uncomfortably at the light switch and offered me close to nothing (embarrassing for both of us)
- one candidate walked up to the light switch and began testing it as she walked me through her thought process. After some solid high level tests, she wanted to see electrical schematics for the building and asked me all kinds of questions about emergency backup power, how many amps the room’s lights draw, and what else was on that circuit. She wanted to remove the trim plate to check the wiring for electrical code standards. She asked if the room’s lights could be controlled by a master switch somewhere else or an energy saver timer for off-hours. (these types of questions/tests make her a good fit for my team because my AUT’s weakest test area is its integration with other systems)
- one candidate was good at coming up with cosmetic and usability tests (e.g., Is the switch labeled well? Can I reach it from the doorway when the room is dark? Does the trim plate match the room’s trim in color and style?)…not so important for my AUT but good tests for others perhaps.
- one candidate went right for stress tests. He flipped the lights on/off as quickly as he could. He tried to force the switch to stay in the halfway-off-halfway-on position to see if it sparked or flickered the lights.
Labels: software testing career
Most bug reports include Severity and Priority. On my team, everyone is interested in Priority (because it affects their work load). Severity is all but ignored. I propose that testers stop assigning Priority and only assign Severity.
Priority is not up to the tester; it is usually a business decision. Having testers assign it wastes their time, takes an important decision away from someone better suited to make it, and may misguide the workflow.
Bugs without Priority have to be read and understood by the customer team (so they can assign priority themselves). This is a good thing.
What about blocking bugs, you ask?
Some bugs are important to fix because they block testing. These bugs are best identified as blocking bugs by testers. They can be flagged as “Blocking Bugs” using an attribute independent of the Priority field. Think about it…if BlockingBugA is blocking testing that is less important than other, non-blocked testing, perhaps BlockingBugA only deserves a low priority.
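To make the proposal concrete, here is a hypothetical sketch of what a tester-filed bug record could look like under this scheme. The `BugReport` class, `Severity` scale, and `triage_order` helper are all illustrative names of my own invention, not any real bug tracker’s schema; the point is simply that the tester sets Severity and a blocking flag, while Priority stays empty until the customer team fills it in.

```python
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional

class Severity(IntEnum):
    """A simple tester-assigned severity scale (illustrative)."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    CRITICAL = 4

@dataclass
class BugReport:
    """A tester-filed bug: Severity plus a blocking flag, but no Priority.

    Priority is deliberately left unset by the tester -- the customer
    team assigns it later, after reading the report."""
    summary: str
    severity: Severity
    blocks_testing: bool = False      # flagged independently of Priority
    priority: Optional[int] = None    # set by the customer team, not the tester

def triage_order(bugs):
    """Surface blocking bugs first, then sort by severity, so the
    customer team can do its priority-assignment pass efficiently."""
    return sorted(bugs, key=lambda b: (not b.blocks_testing, -b.severity))

bugs = [
    BugReport("Typo on splash screen", Severity.LOW),
    BugReport("Login form hangs, blocks all account tests",
              Severity.MEDIUM, blocks_testing=True),
    BugReport("Data loss on save", Severity.CRITICAL),
]
for bug in triage_order(bugs):
    print(bug.summary)
```

Note that the blocking bug floats to the top of the triage list even though its Severity is only MEDIUM, which is exactly the distinction the Priority field tends to blur.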
Tell me where I’m wrong.
I recently read about 15 resumes for tester positions on my team. None of them told us anything about how well the candidate can test.
Here is what I saw:
- All candidates list a ton of “technologies” they are familiar with (e.g., .Net, Unix, SQL, XML, MS Office)
- They also list a bunch of off-the-shelf testing tools (e.g., TestDirector, LoadRunner, QuickTest Pro, SilkTest, BugZilla)
- All candidates string together a bunch of test buzz words…something like, “I know white box testing, gray box testing, black box testing, stress testing, load testing, functional testing, integration testing, sanity testing, smoke testing, regression testing, manual testing, automated testing, user acceptance testing, etc.”
- Some candidates say something like “I wrote a 50-page test plan”, or “I’m responsible for testing an enterprise application used by 1000 users”
Here is what I would have liked to see instead:
- My approach to testing is as follows…
- See my software testing blog for my opinions on how to test.
- My favorite testing books and blogs are…
- I enjoy testing because…
What are your favorite tester interview questions?
Labels: software testing career