Deeds, not Words: Usability Testing at Perforce Software
At a tech-writing seminar I attended years ago, the instructor, who was very experienced, very practical and highly opinionated, said, “I don’t care whether my readers like my documents. I do care whether they succeed.” Of course we want our customers to like our products, but tastes vary wildly. Success can be measured. The intent, then, of our usability tests has been to produce measurable results that enable us to immediately detect and fix problems.
It’s essential that the development team be involved in usability testing from the start. There’s a saying: A consultant is a guy who borrows your watch, then tells you what time it is. We ask the team what they want to know and who their target users are. After all, if we ran tests with no involvement from the team, why should they believe the results? No one wants to hear the question “You know what’s wrong with you?”
We draft a test and create test data, and the team reviews and approves both. We recruit users according to a profile that the team specifies, and the team approves each tester. When the software is ready, we run the tests, either on-site or remotely using Web-based meeting software (a godsend for usability testers). The team watches all the tests, draws its conclusions, and changes the product accordingly. This approach guarantees that relevant information is generated and delivered to the experts who can best use it.
My favorite part? Meeting the customers. They are an enthusiastic and wonderfully bright bunch, very tolerant of the frailties of work-in-progress software. I’ll never forget an early P4V test with a user who, upon encountering a rather basic and egregious bug, said calmly, “Fascinating: I drag it to the left…and it moves to the right.” And they’re always very pleased when we conclude the test and I hand them a nifty Amazon gift card and a Perforce t-shirt. Users are fond of schwag.
We’ve taken many positive steps to improve our GUI products: we’ve hired user interface professionals, taken seminars, and tested usability. In fact, things have improved to the point where I’ve had to abandon my second-favorite part of the process: the goodie reel. Time was, I could compile a six-minute video composed of shots of software misbehaving (I call them "magic moments"). No longer; the tests are still productive, but the goofy things that used to happen simply don’t happen anymore. In fact, it’s been a long time since I’ve seen any such excitement, and I’m O.K. with that.