Getting started – What You Need to Interview

View from Google SF
In part one we talked about getting business owners on board for user interviews. You’ll find this next piece helpful if you’re testing the usability of a product (new or existing), gathering feedback for market research or a survey, or trying to understand how a business owner or user fits your tool into their daily life.
There are many, many different ways to conduct a user interview. Below is just one combination of tools and methods used.
Tools
In person:
- Video Camera and/or Voice recorder
- Two people (One to interview, one to take notes on the spot)
- Props – if you’re testing a beta or closed pilot, bring along a personal (or company!) mobile device so the user doesn’t struggle setting theirs up with TestFlight
- Script – very important. It’s easy to get distracted or forget how to run the interview like you had planned in your head. Take a script with you, practice beforehand, and you’ll be fine. Running through the same script with different participants is key to a standard approach that will let you identify issues across multiple users.
Remote:
- Google Hangouts / GoToMeeting for video calls and recording – for Google Hangouts, try Hangouts On Air to record to a private YouTube channel. You can also use QuickTime to record your screen while using Hangouts.
- QuickTime – lets a user share their iPhone screen with a Mac, which can then be shared through the video call. The drawback is you can’t see their face and reactions. (Lookback.io is a possible solution to this – it’s been a bit buggy for me on Mac, but the value is huge.)
- Script – still important. For quality reasons, and because you can still get distracted during a remote interview.
Methods
There are countless articles out there describing in great detail which method to use and when. For your convenience, the articles below carry the most authority:
User Research Methods (Usability.gov)
When to use which user research methods (NN Group)
Certain things are difficult to describe and are best learned by doing. Knowing how to appropriately format survey questions – both how the question is asked and how the answer is collected – depends on many variables. It helps to list in advance what answers you hope to have collected by the end of the survey, and how you’ll present your results. If you prefer to use direct quotes, include plenty of open-ended questions. Alternatively, if you plan on showcasing numbers and graphs, make sure your answer formats are easy to quantify.
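As a concrete sketch of the “easy to quantify” advice, one option is to map a closed-ended Likert scale to integers up front, so averages and charts fall out of the data for free at analysis time. The scale labels and sample answers below are illustrative assumptions, not from any real survey:

```python
# Hypothetical sketch: decide the numeric mapping for a Likert
# question before fielding the survey, so the answers can be
# averaged and charted without manual re-coding later.
LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

# Made-up responses to a single survey question.
answers = ["Agree", "Strongly agree", "Neutral", "Agree"]

scores = [LIKERT[a] for a in answers]
print(sum(scores) / len(scores))  # mean rating → 4.0
```

The same idea applies to multiple-choice or yes/no questions: fix the coding scheme while writing the survey, not after the responses arrive.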
One of the main issues we ran into while conducting research within the Customer Experience Lab was quantifying our qualitative data. Parsing through paragraphs of text for each question was overly time-consuming – but the data was invaluable. If you find yourself in a similar situation (with many open-ended questions), consider the following technique to save hours of effort.
Create themes and buckets ahead of time
The more survey responses you gather, the quicker you’ll see themes or consistent problems emerge. Much like NN/g’s finding that you only need to test with 5 users, for open-ended surveys dealing with product usage you’ll have most of the key issues identified after roughly 10 participants. By 15 you’ll feel quite confident, and by 25 it will be glaringly obvious. Of course, there are more statistically rigorous ways to analyze this data as well; this approach is better suited to RITE (Rapid Iterative Testing and Evaluation) scenarios where quick feedback is required.
Jeff Sauro of MeasuringU (one of the best quantitative research consultants around) published an article in 2012 that goes into more detail on this. He also suggests collecting the NPS (Net Promoter Score) and going further, combining it with the general comments each user provides. We saw great success applying this method: you gain deeper insight into why a user is struggling or why they enjoy using the product.
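For reference, the NPS itself is straightforward to compute from the standard 0–10 “how likely are you to recommend” question: respondents scoring 9–10 are promoters, 0–6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch (the sample ratings are made up):

```python
# Sketch of the standard NPS calculation: % promoters (9-10)
# minus % detractors (0-6); passives (7-8) count only in the total.
def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical ratings from ten participants.
sample = [10, 9, 8, 7, 6, 10, 3, 9, 8, 10]
print(net_promoter_score(sample))  # → 30
```

Pairing each rating with the participant’s open-ended comment is what makes this useful: the number tells you where you stand, the comments tell you why.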
As you, the researcher, read through the individual responses, you will become familiar with what needs to change – but how can you easily share that with the development team? From the beginning, start a side document where you list recurring themes, patterns, suggestions, and complaints. As more and more responses come in, move these into a spreadsheet where you can begin tallying them up:
- Can’t edit photos – 8
- Upload is slow – 12
- Love the tutorial – 15
- How do I change pw? – 18
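If you’d rather tally in code than in a spreadsheet, the same bucketing can be sketched in a few lines of Python. Here each response is assumed to have already been tagged with the themes it mentions; the theme labels and data are illustrative, not from the actual study:

```python
from collections import Counter

# Hypothetical coded responses: each inner list holds the recurring
# themes a single participant's open-ended answer was tagged with.
coded_responses = [
    ["upload slow", "love tutorial"],
    ["change password", "upload slow"],
    ["cant edit photos", "change password"],
]

# Counter totals each theme across all participants.
tally = Counter(theme for tags in coded_responses for theme in tags)

for theme, count in tally.most_common():
    print(f"{theme} – {count}")
```

`most_common()` sorts the buckets by frequency, which is exactly the prioritized list you would hand to the development team.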
In the examples above, you can see these sample participants generally agree that the upload is slow, the tutorial is good, and changing a password is hard to find. Fewer have trouble editing photos, though the number isn’t insignificant. Videos of a user struggling to edit may convince any PM or lead engineer to prioritize a fix if the experience is particularly painful.
One aspect of this type of research worth keeping in mind is that while you can derive numbers and percentages from the feedback, they aren’t necessarily statistically significant. Rather, this is a different way of visualizing qualitative data to surface patterns or opportunities you might otherwise miss. Where statistically significant numbers can explain how or where ‘X’ is or isn’t being adopted, quantifying qualitative data can explain the why.