Preparing for User Interviews
One of the primary ways you learn from others when designing is by talking with them about their opinions and experiences. You do this through user interviews—a technique that allows you to have a conversation with users, learn their goals and frustrations, and hear stories about their experiences. You then use the data generated this way to move forward in the design process, creating design artifacts that help you empathize with users as you start to think about solutions.
So much of design thinking involves designing for other people. It’s tempting to imagine yourself in a scenario and think about what you would do, but you must remember that you aren’t the user. Although you may have opinions about a situation, you can’t assume that your wants and needs are the same as those of the people you are designing for. The best way to understand your target audience is to learn from them directly.
Let’s take a closer look at what a user interview is and how to prepare for one.
What Is a User Interview?
A user interview is a research method in which a researcher asks a participant questions with the intent to uncover information regarding that participant’s behavior and preferences. User interviews are a great way to collect insights directly from users, one on one. They can be done at multiple stages of the design process, depending on where your team is at and what you’re trying to learn. Most commonly, they are done at the beginning of the design thinking process, when you are trying to empathize and understand the users.
Generally speaking, a user interview consists of the following elements:
A facilitator to conduct the interview
A participant to provide information during the interview
A note-taker or note-taking device to capture information from the interview
A script that the facilitator uses to conduct the interview
Each one of these elements helps ensure you are prepared for the conversation, able to pay full attention to the user, and can capture the data you need to move forward in the design thinking process.
How Do You Start Preparing?
To have an efficient, successful interview, it’s important to do as much preparation as you can. One way to prepare is by coming up with a list of questions you have about your users’ experiences. By knowing what questions you want to ask in advance of the interview, you can create a well thought-out, structured interview that gets the most information possible from the people you speak with. To do this, you can create something called a user interview script.
A user interview script is a list of questions you want to ask research participants, structured around the topic you wish to understand better. It’s a roadmap, or a set of guidelines, to conduct research and navigate a conversation with real users to obtain the information you need to empathize with them and design for them.
To build one, align as a team around the questions you want to ask participants. Discuss the pieces of information you need to learn during the conversation so that you are well prepared to ask the right questions without wasting users’ time or biasing the interview in any way. By taking the time to build a script, you can enter an interview with clear, unbiased, non-leading questions.
Let’s look at the anatomy of a user interview script.
Introduction
At the beginning of each user interview, take time to introduce yourself. Explain who you are, why you’re here today, and the context around what you’re working on. Set the expectations for how much time it will take and the types of questions you’ll ask. Provide time for your participant to ask any questions as well.
Here’s a sample introduction I’ve used before on projects:
Hi there! I’m a designer working at [company] on [product vertical]. I’m looking to learn more about your thoughts and opinions on the [industry I work in]. I’m going to ask you some questions today about how you interact with [product]. Overall, this should take around [number of minutes]. How does that sound? Do you have any questions before we begin?
The goal at this stage is to make sure the user is comfortable, that expectations are clear, and that everything is ready to go for the interview.
I also like to add one extra part at the beginning that helps segue into the interview and lets the user get comfortable with the conversation being recorded:
Great. One last thing—before we begin, would you mind if I record this session? It’ll be used only to share internally with my team later.
This piece is important so that the user is aware they are being recorded and can give you their permission to do so. You’re capturing data about your users, and it’s respectful to inform them and gain their permission before you start recording.
You might think users won’t give you permission if you ask explicitly, but it’s the right thing to do, to respect their wishes and treat their information as sensitive. Additionally, in some states, it is illegal to record someone without their permission. Thankfully, just about every user I’ve asked this question has given me permission—out of all the user interviews I’ve done, I’ve had someone request not to be recorded only once.
Opening Questions
These questions get the interview started. They should be easy to answer and easy to follow. They aren’t typically demographic in nature, as that information comes from something like a screener survey. Rather, they’re used to get the user comfortable and ready to think about the topics you want to discuss.
What do you do for a living?
Are you familiar with [product]?
How often do you use [product]?
The goal here is to get users warmed up for the interview. These are meant to be simple, introductory questions that establish rapport with the user and get them comfortable providing answers. A few easy questions that invite the user to participate, and show that you’re listening to their responses, open the interview up for deeper questions (and responses) later.
These questions also frame the user interview, as they get the user to start thinking about the product and how they use it.
Specific Questions
After the user is comfortable and thinking about the main topic, ask questions that allow you to better empathize and design. These are the questions that help inform the features, the product you want to make, or how you can better solve the user’s problems.
Imagine you’re working on a payments product and trying to understand more about users’ behaviors around making donations. You want to understand how people find and donate to charitable causes. To do so, you need to learn more about the specifics of these topics: How do people find causes? How often do they donate to them? What motivates them to donate? You could ask several different questions, like so:
How do you usually discover causes to donate toward?
When was the last time you donated to a cause?
What motivates you to donate to a cause?
Additionally, you may have research goals around creating campaigns for causes, not just discovering and donating to them. You can probe into this behavior, with the goal of creating a marketplace for people to create causes and for others to find and donate to them:
Have you ever created a campaign for people to donate toward?
How did you promote your campaign?
What would encourage you to create a campaign?
Closing Remarks
At the end of the interview, it’s important to set aside a few minutes to wrap things up and leave space for any information you didn’t plan to uncover. To do so, invite the user to share anything they haven’t yet. End the interview with the question:
Is there anything else you’d like to share with us?
This will allow you to learn about new pieces of information you may want to explore in your research. If a user responds with:
Yes, I’d love to tell you about this product I use all the time…
Then you have a new source of inspiration to draw on for your ideation, a step that occurs later in the design thinking process. Give users the space to volunteer information you didn’t ask for. Great questions to ask at the end of an interview include:
Is there anything else you’d like to share with us?
Is there anything we didn’t talk about today?
What’s one thing I didn’t mention yet that I should know about?
Once the interview is over, thank the participant for their time, and take some time on your end to review your notes or write down any observations that stood out to you during the interview. Ideally, you’ll have another person taking notes, a recording of the session, or both, so that you can review it later.
Question Quality
To make your script as effective as possible, consider the quality of your questions and how you structure them, so that you get the best information possible from users and can help them.
Open-ended
Open-ended questions are crucial for getting the most information out of users. They invite elaboration and can’t be answered with a simple yes or no the way a binary question can. Look at the following question:
Do you like our product?
What are the possible responses here? A user will answer the question—yes or no. Did that give you enough information? Do you know more about their preferences or attitudes? Unfortunately, these types of questions don’t give you enough context on their own. Usually, you’ll have to follow up with “why” to get more information.
Let’s improve it:
What do you like about our product?
Now it’s more open-ended. It requires the user to think about and articulate what they like. This question will give you more information.
Unbiased
Questions should avoid biasing users’ responses. Instead of assuming a certain quality or attribute of a situation or attitude, remain neutral—and your questions should as well. Consider the previous question:
What do you like about our product?
Sounds pretty innocent—it is an open-ended question about the user’s attitude toward the product. However, this question already assumes part of the answer—that our product is liked.
Let’s improve it:
What do you think about our product?
Now, that bias is gone—the user may not like it. Maybe they don’t like anything about the product. That’s information you need to know! This gives the user the opportunity to share whatever is top of mind, rather than staying within the confines of what they like about the product.
Past Experiences
By nature, user interviews are attitudinal. You are asking questions about how users feel, what they perceive, and what they think about situations and experiences. You get data about their current state of thinking.
However, it can be advantageous to ask users to recall how they acted, to gain context around their use cases. Asking users about the last time they took an action gets them thinking about the situation, and the context around that situation, which improves the quality of the information you get from them.
Consider the question from before:
What do you think about our product?
This is a good interview question, and one you should ask, but let’s toss another question in front of it:
Could you tell me about the last time you used our product?
Here, you are asking users to recall a scenario in which they used the product—in this case, the last time they used it. They’ll think back through that experience, trying to recall the goal they had, the steps they took, and the outcome. By asking this question first, then asking what they think about the product, they’ll be able to ground their answer in their most recent experience with it, which is exactly the information you want to know.
How Can You Structure the Script?
People always want to know how many questions to ask in an interview. Too few and you waste the opportunity to learn the answers you’re looking for. Too many and you run out of time or fail to go deep enough into behaviors and desires to get great data.
I find it’s good to work backward from the questions I have and how long the interview will be. I like to approach interviewing flexibly, with enough content to make the interview worthwhile but not so much that I fail to get answers to my most critical questions.
Generally, I structure interviews like so:
Have a core set of your most important questions. These are your biggest, deepest unanswered curiosities, which will represent the bulk of your interview time.
Have a set of follow-up questions to your most important questions. These allow you to probe more deeply in case your core questions don’t reveal enough data.
Have a set of backup questions. These are things you’re curious about but are secondary to the most important unanswered areas for your research.
I then structure my interview script around my core questions, with follow-up questions for the core questions should I need them, and backup questions to help probe for ancillary wants and needs. It would look something like this:
Greetings
Permission to record
Icebreaker/easy-to-answer question
Core question 1
Follow-up
Core question 2
Follow-up
Core question 3
Follow-up
Core question 4
Follow-up
Core question 5
Follow-up
Backup question 1
Backup question 2
Backup question 3
Final thoughts
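If you prefer to keep your script somewhere more structured than a plain document, here is a minimal sketch, in Python, of the same layout expressed as data. It is purely illustrative (the question text is hypothetical, adapted from the donations example earlier), but it makes the priority order explicit:

```python
# A minimal, illustrative sketch of an interview script kept as structured data.
# The questions below are hypothetical, adapted from the donations scenario.

script = {
    "intro": [
        "Greetings",
        "Permission to record",
        "Icebreaker: How often do you use [product]?",
    ],
    "core": [
        {
            "question": "How do you usually discover causes to donate toward?",
            "follow_up": "What was the most recent cause you found that way?",
        },
        {
            "question": "What motivates you to donate to a cause?",
            "follow_up": "Could you tell me about the last time you donated?",
        },
    ],
    "backup": [
        "Have you ever created a campaign for people to donate toward?",
        "What would encourage you to create a campaign?",
    ],
    "closing": ["Is there anything else you'd like to share with us?"],
}

# Walk the script in priority order: intro, then each core question with its
# follow-up, then the backups (the part that gets skipped if time runs short).
for line in script["intro"]:
    print(line)
for core in script["core"]:
    print(core["question"])
    print("  Follow-up: " + core["follow_up"])
for line in script["backup"] + script["closing"]:
    print(line)
```

Any format works, whether a document, a spreadsheet, or something like this, as long as the core questions and their follow-ups come before the backups.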
If I make it through my whole script, great! If not, that’s OK too—the core questions still got answered because I put them first.
Structuring the script in this way allows for multiple interviewee participation styles:
If the participant talks too little, I have enough content to fill the interview and follow-up questions to promote dialogue.
If the participant talks too much, I prioritize the main content, and if I don’t get to the extra content, that’s not a big deal.
I like having a short, simple warmup question to start most interviews. This helps ease the participant into the interview, breaks the ice so the participant feels more comfortable, and sets the tone for the conversation. Having an easy-to-answer question at the start gives the participant confidence in their answers and loosens them up to be more candid for later questions.
I also like to structure my scripts so the participant isn’t jumping around mentally. I’ll group related ideas and make sure the script flows naturally from one to the next, rather than scattering similar concepts throughout the conversation. Asking a participant to recall an answer from two questions ago adds cognitive load and confuses them. Keep the script flowing between related concepts rather than asking about one thing, asking about a new thing, and then going back to the old thing.
It looks something like this:
Greetings
Permission to record
Concept 1
Concept 2 (builds off 1)
Concept 3 (builds off 2)
Concept 4 (unrelated to any previous concepts)
Concept 5 (builds off 4)
…
This way, each question flows into the next (and sometimes builds off the previous) so that the participant logically proceeds from one step of the interview to the next—just like a product experience I would design for them.
As for the number of questions to ask, that depends on how much time you have and how complicated your questions are. I can’t give a general answer that will cover all use cases, but a good rule of thumb for a 30-minute interview is six to eight core questions and as many backups as you’d like. Setup and final thoughts take a few minutes, and you don’t want the interview to feel rushed, so expect to get good answers to six to eight of the questions you have.
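As a rough back-of-the-envelope check, here is a small sketch of that math with assumed numbers (they are illustrative; adjust them to your own sessions):

```python
# Rough time budget for a 30-minute interview. All numbers here are
# assumptions for illustration, not rules from any particular study.

interview_minutes = 30
intro_and_wrapup_minutes = 5   # greetings, permission to record, final thoughts
core_questions = 7             # somewhere in the six-to-eight range

question_minutes = interview_minutes - intro_and_wrapup_minutes
minutes_per_core_question = question_minutes / core_questions

print(f"{question_minutes} minutes for questions")
print(f"about {minutes_per_core_question:.1f} minutes per core question, "
      "including its follow-ups")
# -> 25 minutes for questions
# -> about 3.6 minutes per core question, including its follow-ups
```

If a topic needs longer answers than that, it is usually better to drop a core question than to rush through all of them.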
Prepare Beforehand for a Smooth, Well-Structured Interview
By preparing and putting thought into your script—the way you ask questions, the language in those questions, and the order of those questions—you can construct a well-thought-out script that flows smoothly when you interview users.
Let’s Do It!
Now that you’ve seen how to create a user interview script, let’s make one for our project. Remember that the problem space you want to apply design thinking to is solo travel—what can you do to encourage or otherwise support solo travel? You want to enrich and improve the lives of solo travelers.
To do so, you need to understand more about the solo traveler. What are their wants, their needs, their frustrations? Why do they travel alone? What stops them from doing so?
To understand more about their experiences, you need to talk with solo travelers. With your screener survey, you were able to find good candidates to speak with. Now, you want to ask them all your questions.
Take some time to think about what questions you have for solo travelers. Do you want to know why they travel alone? Perhaps how they travel alone? What about the difference between traveling alone versus in a group? These are all valid questions to have in mind for conversations with solo travelers.
Your task is to create a user interview script that you will use to talk with the solo travelers you found via your screener survey.
Remember to keep the tenets of good interview scripts in mind:
Ask permission to record, and remind participants that the recording will be shared only internally with your team.
Keep questions open-ended and unbiased. Don’t assume answers or bake those assumptions into your questions.
Make sure your questions have a flow to them—try to think of a logical progression to your questions, and make sure you group themes into similar sections in your script.