User interview analysis – turning raw data into insights effectively
Analyzing user interviews is the most exciting, but also the messiest step when conducting user research. It’s exciting because it can reveal eye-opening insights that help create game-changing products and services. At the same time, it’s messy because there is no standard procedure to follow, no objective measure of progress or success, and the sheer amount of unstructured data can be paralyzing.
This article shows you how to get started and provides concrete tips on how to analyze user interviews to uncover valuable insights.
We start off by introducing the goals of interview analysis, shed light on what good results look like, and show when analysis can take place. The core part is a step-by-step approach describing how to analyze user interviews for the best outcomes. The article ends with a list of common mistakes that you can learn from.
- The goal of analyzing user interviews
- What makes good insights
- When analysis takes place
- Step by step approach to user interview analysis
- Common mistakes
Note: The user interviews we – the founders of Condens – conducted as a basis for our user research tool will serve as an example throughout this article. These interviews were with UX researchers, so it’s going to be a bit meta, but also fun :-)

The goal of analyzing user interviews
The question about the goal of analyzing user interviews may seem trivial, but it turns out it’s not. In fact, there are two goals, both of which are important. One is obvious, the other not so much.
Turn raw data into insights
This is what most people think of when it comes to interview analysis: starting with a bunch of notes and recordings and extracting valuable learnings from them. And indeed, the transformation of raw data into insights is a central objective of the analysis phase.
While analysis makes raw interview data more actionable, it is important to note that this step doesn't generate absolute truths. At best, we have increased certainty about a hypothesis or approach.
Get buy-in from stakeholders
The second and non-obvious goal of the analysis phase is to achieve buy-in among stakeholders and get them behind the findings. Insights are only valuable if they are subsequently used as a basis for decisions. This requires that colleagues or clients truly understand, acknowledge and retain these insights.
A proven way to achieve that is to let stakeholders actively participate in the analysis instead of merely presenting results to them. The activity of analyzing interviews as a group is just as important as the resulting insights.

At Condens, we consciously decided that all three founders (from UX, engineering and business) would take part full time. In hindsight we would definitely do this again, as it increased trust in the findings and created a shared understanding in the team which was the basis for efficient decision making later on. An additional positive effect was that the diverse backgrounds resulted in a great variety of perspectives and new insights. So whenever possible invite designers, product managers, engineers or whoever has a stake in the project to take part.

What constitutes a good insight
How do you know you have done a good job and the analysis was successful? While there is no objectively measurable way to determine a good insight, there are indicators that show the right direction.
1. Trustworthy - grounded in data
The process of interview analysis is the abstraction of raw observations into more general insights. For those to be reliable, it’s important that they are based on evidence from the interviews.
A common problem to watch out for here is cognitive bias, e.g. the tendency to confirm what we already believe. Such biases can distort the analysis process and misguide decisions. Read this article about common biases in user research and tips to avoid them.
While it’s important to ground insights in evidence, we shouldn’t forget the limitations of qualitative data. For instance, user interviews will not yield statistically significant results. Rather focus on the strengths of qualitative data in revealing causal relationships, emotional states of users and thus far unnoticed perspectives.
2. Relevant - fitting to the research goal
Interview analysis will likely take several hours over the span of multiple days. It’s easy to get distracted by details and lose sight of the bigger picture. This may result in spectacular findings that have nothing to do with the initial research questions (hint: still save those findings as they may become relevant in the future). To avoid unintended deviations, continuously remind yourself of the main questions to be answered and keep them visible throughout the process.
3. Novel - uncovering what was hidden
To be clear: it’s totally fine if insights confirm previous beliefs. Analysis shouldn’t bend evidence just to produce new and exciting findings. At the same time, looking a bit deeper into the data instead of only scratching the surface can reveal unexpected connections or entirely new topics. These unexpected findings multiply the value of user interviews.
When analysis takes place
User interview analysis is a distinct step in the course of a research project. But this doesn’t mean analysis only happens during this designated time. The brain immediately starts to process new data by trying to make sense of it. Ideas can spark and patterns can emerge while you are still conducting the interviews. A joint recap with the team after each interview is also a great way to identify early ideas while memory is fresh. Be sure to capture all these ideas immediately when they arise.
There are two possible approaches to scheduling the main analysis within the course of the project.
Analysis in one go
With this approach, analysis starts after all interviews are completed. All data is available from the start, which might make it easier to recognize patterns since there is more related evidence. One longer block of analysis makes it easier to get into a state of flow as there are fewer interruptions. Of course, you have to be aware of fatigue and keep an eye on the team’s energy level.

Batch-wise analysis
The idea of this approach is to divide interviews into batches and to conduct shorter analysis sessions after each batch. One advantage is that you can still adjust the questions of upcoming interviews, for instance to focus on an underrepresented topic. It also allows you to provide preliminary results to stakeholders and is useful when there is not enough time for analysis after the last interview, e.g. due to a tight deadline. From a practical standpoint, it is easier to find several shorter blocks of time in the calendars of busy stakeholders than one large block.
A disadvantage of this approach is the higher switching cost: mentally preparing for analysis takes time, in particular to get the evidence into short-term memory, and with this approach that happens multiple times.

A related question at this point is how much time should be allocated for analysis. You may have guessed that the answer is: it depends. The tendency is to underestimate how long it takes and not to reserve enough time. If faced with the decision, we’d rather recommend decreasing the scope and focusing on the most important topics first instead of rushing through. On the other hand, you could always do more analysis, so be sure to timebox the sessions.
As a reference, we spent a full week to analyze the 18 hours of interview data we collected for Condens.
Step by step approach to user interview analysis
Now let's start digging into the data. The basis for successful analysis – or synthesis, as it is also referred to – is good note taking, and we assume you have documented all interviews amply and consistently. When working collaboratively with stakeholders, block sufficient time in their calendars and inform them up front about what to expect.
We will tackle the analysis in three steps:
- Familiarize with the data
- Synthesize
- Convert findings into output
Step 1: Familiarize with the data
The goal of this first step is to prepare the brain to forge connections by getting the data into the short term memory. It’s like loading information into a computer to be able to work with it. In practical terms, this usually means reading the interview notes carefully. This is easier if team members were involved in the interview phase, for example as note takers.
To turn this familiarization step into a group activity, assign each stakeholder to a participant, let them read through the respective notes, and have them introduce themselves to the team from their assigned participant’s perspective. Then take some time to discuss each participant in the group. As there usually are more interviewees than team members, you can repeat this multiple times.
Step 2: Synthesize
This part doesn’t follow a very strict process. Here we’ll show you four techniques which can serve as starting points. Use them flexibly and adapt them to your needs if necessary.

Structure data into themes
As qualitative data is inherently unstructured and thus difficult to analyze, the initial task is to make it comparable across participants. For that, we assign individual responses to more generic themes.
The topics that you asked about during the interviews make good starting points for these themes. For instance, when we interviewed UX researchers before starting to work on Condens we addressed topics like participant recruiting, data privacy and sharing research findings.
When working with digital tools, a practical way to assign notes to themes is using tags. A tag is a label indicating which theme a note belongs to. Regular text editing software doesn’t support this sort of tagging easily, so use a spreadsheet or a dedicated user research tool instead.

In the screenshot above you can see interview notes in a spreadsheet. Each note has its own row and the columns on the right show the respective tags.
In a dedicated user research tool like Condens you have a text document format for notes and can assign multiple tags directly to the relevant words and sentences.

A common question is whether to come up with the tag names before starting with the tagging or to create them as you go. The short answer: both are possible. Usually you need to iterate over these tags while working through the data, as new themes come up or two themes merge into one.
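If your notes already live in a digital format, even a few lines of scripting can help you keep an overview of the tags while you iterate. Here is a minimal Python sketch – the notes and tag names are made up for illustration – that counts how much evidence each theme has collected so far, which makes underrepresented themes easy to spot:

```python
from collections import Counter

# Hypothetical, simplified notes: in practice these would come from your
# spreadsheet export or research tool rather than being typed in by hand.
notes = [
    {"participant": "P1", "text": "Finding enough participants takes weeks.", "tags": ["recruiting"]},
    {"participant": "P2", "text": "Legal needs to approve every recording.", "tags": ["data privacy"]},
    {"participant": "P3", "text": "I share short video clips with the product team.", "tags": ["sharing findings"]},
    {"participant": "P1", "text": "Stakeholders rarely read the full report.", "tags": ["sharing findings"]},
]

# Count how much evidence each theme has collected so far.
theme_counts = Counter(tag for note in notes for tag in note["tags"])
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} notes")
```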
A tip for collaborative synthesis in a digital tool: book a meeting room and use a large screen to do the tagging and further steps of the analysis together. This is an approach that Freeletics likes to use to facilitate collaboration in user research. This also works well remotely during a video call.

After you have tagged the first few interviews together as a team and built a common understanding, you may want to split up and do the remaining tagging in smaller groups or individually to progress more quickly.
The analogue alternative to tagging data is to use post-its. For that, write one response per post-it and place similar responses close together on the wall. Make sure to note the participant’s name or a short identifier so you keep a link to the raw data and can retrieve the context again. Colors are great for segmentation, e.g. to indicate a certain user group. In the image below we used yellow for in-house researchers, pink for freelancers and blue for researchers in agencies.

Look for cross-participant connections and cluster related evidence
With the notes organized into themes, you can now dig into each of these themes separately. In a digital tool, use filters to focus on one tag in particular and look for commonalities or contradictions among the responses.

Encourage team members to share their thoughts with the group, as this helps to form new concepts and understanding.
Next you can pull together related observations into clusters. This method is also called affinity mapping or affinity diagramming and enables you to connect pieces of evidence to build up a broader understanding. For instance, we identified the sharing of video clips to be a recurring pattern within the theme of sharing research findings.
When working with post-its, just rearrange them to fit your observed clusters.

Admittedly, affinity diagramming is a bit difficult in a spreadsheet. One way to do it is to put the cluster a note belongs to in an additional column next to the tags, but it’s not really an elegant solution.
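To illustrate the extra-column idea, here is a small Python sketch with hypothetical rows: it filters the notes down to one tag and then groups them by the cluster label that was entered while affinity mapping. The row contents and cluster names are assumptions for the example, not data from our study.

```python
from collections import defaultdict

# Hypothetical spreadsheet rows: each note carries its tags plus an
# extra "cluster" column that gets filled in during affinity mapping.
rows = [
    {"participant": "P3", "text": "I share short video clips with the product team.",
     "tags": ["sharing findings"], "cluster": "video clips"},
    {"participant": "P5", "text": "Clips of user quotes get watched, reports don't.",
     "tags": ["sharing findings"], "cluster": "video clips"},
    {"participant": "P2", "text": "Legal needs to approve every recording.",
     "tags": ["data privacy"], "cluster": None},
]

def cluster_notes(rows, tag):
    """Filter notes by one tag and group them by their cluster label."""
    clusters = defaultdict(list)
    for row in rows:
        if tag in row["tags"]:
            clusters[row["cluster"] or "unclustered"].append(row["text"])
    return clusters

for cluster, texts in cluster_notes(rows, "sharing findings").items():
    print(cluster, "->", texts)
```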
Depending on the amount of data, it can take an hour or more to work through each topic. Pay attention to newly emerging themes and be prepared to split up or unify existing ones. Remember that it’s not a rigid process and you may need to start over with a newly discovered theme. That’s how you iterate your way towards the insights.
Use segmentation to reveal underlying patterns
It helps to look at the research data from different perspectives to get a deeper understanding of a topic. It’s like applying different lenses that help you see connections more clearly. Depending on where you look from, the world can seem very different.
The metadata about the participants can be a key to discovering hidden patterns. In a B2B context, that could be the participants’ job title, the size of the company they work at, or the industry. In a consumer context, demographic data or the level of experience with a certain product could be relevant criteria.
At Condens, after our interviews with UX researchers we recognized that some participants lamented the time pressure to deliver research results while others didn’t mention this issue. We looked a bit deeper and found that this pain point mostly came from freelancers and researchers in agencies. In-house researchers rarely complained about tight deadlines. In hindsight this makes sense, as external researchers work on a contractually defined project and the next client might already be waiting.

Of course, not all phenomena can be explained with the available data, and it takes critical thinking and potentially some more research to avoid presuming causal relationships that aren’t actually there.
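If you keep participant metadata alongside the tagged notes, a short script can surface such segment differences quickly. The sketch below uses made-up participants, segments and tags and simply counts how many distinct participants per segment mentioned a given theme:

```python
from collections import defaultdict

# Hypothetical participant metadata and tagged notes, only to illustrate the idea.
participants = {
    "P1": "in-house", "P2": "freelancer", "P3": "agency",
    "P4": "freelancer", "P5": "in-house",
}
notes = [
    {"participant": "P2", "tags": ["time pressure"]},
    {"participant": "P3", "tags": ["time pressure"]},
    {"participant": "P4", "tags": ["time pressure", "recruiting"]},
    {"participant": "P1", "tags": ["recruiting"]},
]

def mentions_by_segment(tag):
    """How many distinct participants per segment mentioned a given theme?"""
    segments = defaultdict(set)
    for note in notes:
        if tag in note["tags"]:
            segments[participants[note["participant"]]].add(note["participant"])
    return {segment: len(people) for segment, people in segments.items()}

print(mentions_by_segment("time pressure"))  # e.g. {'freelancer': 2, 'agency': 1}
```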
Analyze across themes
Besides changing the perspective, changing the resolution is another method to get a clearer understanding of the data. After we identified themes and did a deep dive into each of these in the previous steps, now we zoom out and look at the bigger picture.
Identify how the themes relate to each other and try to understand their relative importance, chronological order or causal relationships. We used a purple post-it to indicate a theme and put the respective name on it.

Step 3: Convert findings into output
With the extensive analysis phase finished, there comes the point when you ask yourself what to do with all of these insights. The final step is to turn what you learned into a tangible output.
There are two purposes for this:
- It makes it easier to convey the insights to stakeholders who were not directly involved in the project and also helps them to retain what you found. So think of the output as a tool to share findings.
- It initiates the transition towards putting the insights into action and thereby helps to move from learning mode to doing mode.
The best form of output depends on your initial research questions. Examples of commonly used outputs are:
- A prioritized list of pain points and opportunity areas
- A user journey including highlights and lowlights
- Jobs to be done
- User personas

Before the team parts, make sure to have concrete next steps planned to bring findings to action. This could be a decision workshop, prototyping session or a design sprint. Also think about how you want to store your data and findings to have them accessible for the future. It should be easy for stakeholders to go back and look up certain aspects of the research at any time.
Common mistakes
The best way to improve your user interview analysis skills is to practice and learn from your experience. Here are some mistakes others have already made, so you can improve even faster:
Quantifying data
Be careful when formulating quantitative statements based on qualitative data, for instance: “75% of the participants mentioned data security to be a problem.” Percentage values in particular easily lead to wrong generalizations about market sizes.
User interviews aren't designed to yield statistically significant results. At best you can formulate hypotheses about the general market that can then be validated with quantitative data.
If you use figures, stick to the “6 out of 8” format to remind your audience that total numbers are low. In general, using numbers is not bad, as it helps to spot outliers, e.g. an opinion that only one participant had.
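If you generate such figures from tagged notes, it only takes a few lines to keep the “6 out of 8” phrasing consistent. A tiny sketch with made-up counts:

```python
# Hypothetical counts; the point is the phrasing, not the numbers.
total_participants = 8
mentions = {"data security": 6, "time pressure": 3, "pricing": 1}

for topic, count in mentions.items():
    # "6 out of 8" keeps the small sample size visible, unlike "75%".
    print(f"{count} out of {total_participants} participants mentioned {topic}")
```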
Expecting perfect clarity
Qualitative data is messy, and it will still be (although hopefully a lot less) after the analysis. Remember that the goal is to reduce uncertainty; you won’t get perfectly clear answers. You want just enough certainty to make a decision about the next steps.
In fact, if there is very little uncertainty left you have probably overanalyzed and it would have been more effective to move on and try things out.
Do you have additional tips for interview analysis that we should include in this article? Or have you tried the techniques above and want to share your feedback? Write us at hello@condens.io. We look forward to hearing from you.