LOOK I MADE A USABILITY TEST

If you ever run into a boss or a startup founder who tells you they don't have time or resources for "research", gently point out that the essence of research is testing hypotheses against data, and you can do that in many ways (as long as the testing is well-formed and not leading – see also "focus groups").

I spent a few hours testing a few basic usability questions and (spoilers) it was genuinely helpful: it highlighted where some of our information-organizing assumptions weren't obvious, where the tone of the copy we'd written fell flat, and a few other things that weren't dealbreakers, but were barriers to offering a simple, powerful, and delightful experience to users.

Caveat: There's a lot of training and learning that goes into making a good usability test, stuff I am not covering here. Having a moderator who doesn't understand the difference between open-ended and closed-ended questions, or the foundations of social-cue behavior, or cognition theory, can sink the value of any test, even one based on a rock-solid protocol. And learning how to build a testing protocol is itself a necessary (though not sufficient) condition for gathering good data. My point in this article is not that anyone can be an expert at this right off the bat, but that once someone has invested in the learning and training, the cost of running value-increasing tests is small.

What I did was simple: I thought about the specific research questions we needed answers to, adapted a testing protocol I already had so users could attack tasks related to those questions, shared screens via Skype, and recorded the video and audio with QuickTime Player (all free).

Of course, how you moderate/run a test is critical, and depends on professional training and experience (I'm still learning how to do it better), but it's not rocket science, or surgery. Could it have been done better, with more time and resources, gathering other types of data? Sure. But the goal of this round was like when you ask someone to read something you've written: "I've looked at this for too long – can you tell me if it makes sense?"

The format of the resulting report was simple. List the objectives (the questions you want to explore), sum up the learnings (break out what was working and what was problematic), think about it, then list some next steps for iterating the design and further research. Bob's your uncle. Not all your problems are solved, of course, but you probably have more reliable guidance towards further development of something someone will actually use, and actually like.
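
For the curious, here's roughly what that report skeleton looks like on the page. The labels are my own generic phrasing, not a standard template; adapt them to whatever your team will actually read:

    Objectives
      The specific questions this round set out to explore.

    Learnings
      What worked.
      What was problematic.

    Next steps
      Design iterations to try.
      Further research questions to pursue.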