The best questions for UX interviews: how to run user interview UX sessions that reveal user experience insights

Adam Sabla · Sep 11, 2025


Getting valuable insights from user interview UX sessions depends entirely on asking the right questions—and knowing how to follow up when users share something interesting.

In this article, I walk through the 25 best questions for UX interviews, grouped by research goal and paired with actionable AI-powered follow-up prompts. These are especially powerful in conversational surveys that dig deeper for authentic, actionable answers.

Questions about user goals and motivations

Understanding user goals and motivations is the foundation of impactful UX research. When you know what drives your users, you can design experiences that actually help them get what they want. Plus, 73% of UX professionals believe AI improves their workflow efficiency—making it even more crucial to use smart, probing survey tools for this work. [1]

Here are essential questions and AI follow-ups to use in your interviews or when creating questions with an AI survey generator:

  • Question: "What are you trying to accomplish with [product/feature]?"
    Why it matters: Centers the conversation on the user's real objectives, not assumptions.
    AI follow-up prompt:

    Ask why this goal is important to them and what happens if they can't achieve it. Probe for emotional impact and business consequences.

  • Question: "What made you start looking for a solution like this?"
    Why it matters: Reveals the catalyst behind user engagement and prior pain points.
    AI follow-up prompt:

    Explore the main trigger that made them search for a solution. Ask about alternatives they considered and why those weren't enough.

  • Question: "What's the most important outcome you hope to see after using this?"
    Why it matters: Surfaces the user's personal or business success metrics.
    AI follow-up prompt:

    Probe for how they measure success—time saved, cost, personal satisfaction, or something else? Ask for examples.

  • Question: "How would you describe the ideal experience when using [product/feature]?"
    Why it matters: Shows what 'great' looks like from the user's point of view.
    AI follow-up prompt:

    Ask what specifically would make that experience ideal and whether they’ve seen it done elsewhere.

  • Question: "If this product disappeared tomorrow, what would you miss most?"
    Why it matters: Identifies core value as perceived by users.
    AI follow-up prompt:

    Explore which tasks or results would become harder and how they would try to replace the missing value.

  • Question: "What other tools do you turn to for similar needs?"
    Why it matters: Pinpoints competing products or real-world workarounds.
    AI follow-up prompt:

    Ask what those other tools do better or worse, and why they sometimes choose those alternatives over your product.

High-quality, well-targeted questions at this stage set up all your later research for success.
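Conceptually, each "AI follow-up prompt" above gets combined with the original question and the respondent's actual answer before being handed to a language model. The sketch below shows one way that composition could work; the function name and prompt template are purely illustrative, not Specific's actual API.

```python
def build_followup_prompt(question: str, answer: str, probe_instruction: str) -> str:
    """Compose an LLM prompt asking for one probing follow-up question.

    Illustrative template only: a conversational-survey tool would combine
    the original question, the respondent's answer, and the researcher's
    probe instruction in some similar way.
    """
    return (
        "You are a UX researcher conducting an interview.\n"
        f"Original question: {question}\n"
        f"Respondent's answer: {answer}\n"
        f"Researcher's instruction: {probe_instruction}\n"
        "Write one short, open-ended follow-up question that digs deeper. "
        "Do not repeat the original question."
    )

# Example: wiring up the first goals question from this section.
prompt = build_followup_prompt(
    question="What are you trying to accomplish with the dashboard?",
    answer="I mostly use it to check weekly signups.",
    probe_instruction=(
        "Ask why this goal is important to them and what happens "
        "if they can't achieve it."
    ),
)
print(prompt)
```

The key design point is that the probe instruction stays in the researcher's voice while the model generates the actual question, so the same instruction adapts to whatever each respondent says.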

Questions to uncover pain points and frustrations

Understanding pain points is where conversational surveys show their power—users often downplay or skip details of frustration in traditional forms, but open up with the right probing. AI follow-ups, like those built into Specific's automatic AI follow-up questions feature, help unearth these hidden details. This translates into more opportunities for UX improvement.

  • Question: "What's the most frustrating part of your current process?"
    Context: Zeroes in on what's not working, often revealing ideas for improvement.
    AI follow-up configuration:

    When they mention a frustration, ask for a specific example of when this happened last. Then explore what they tried to do instead and how much time/money it cost them.

  • Question: "Have you ever gotten stuck or confused using [product/feature]? What happened?"
    Context: Confirms real struggles with usability or logic.
    AI follow-up configuration:

    Ask what they did next: did they look for help, give up, or try something else? Probe for feelings or thoughts at that moment.

  • Question: "Can you recall a situation where the product didn’t meet your expectations?"
    Context: Opens the door to specific stories of disappointment.
    AI follow-up configuration:

    Explore what specific expectation wasn’t met. Ask how important that was to their overall satisfaction.

  • Question: "Are there parts of the experience that take longer or require more steps than you’d like?"
    Context: Surfaces friction that leads to indifference or churn.
    AI follow-up configuration:

    Ask which step is the worst offender and how they imagine it could be faster or simpler.

  • Question: "When do you feel most frustrated or annoyed when interacting with [product/feature]?"
    Context: Finds the emotional low points in the user journey.
    AI follow-up configuration:

    Ask what triggers that feeling and what, if anything, helps them recover from it.

  • Question: "What problems did you try to solve before this one?"
    Context: Reveals longstanding or recurring pain.
    AI follow-up configuration:

    Ask whether those problems still happen and what attempts failed to solve them.

  • Question: "Is there anything about the product that makes you hesitant to recommend it to a friend?"
    Context: Captures sources of skepticism or perceived risk.
    AI follow-up configuration:

    Probe for what would need to change for them to feel confident recommending it—and why.

  • Question: "What’s the one thing you wish you could change instantly?"
    Context: Cuts through the noise to the single biggest ask.
    AI follow-up configuration:

    Push for why this is the top priority over other issues and how it would change their experience.

AI-driven conversational surveys don't just ask for problems—they dig for context and alternatives, helping you prioritize fixes that make a real difference.
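One way to picture when a follow-up should fire: flag answers that are short or hedged, since those are exactly the ones where respondents downplay frustration. The heuristic below is a made-up illustration; a real conversational-survey tool would more likely have an LLM judge the answer rather than count words.

```python
# Phrases that often signal a respondent is downplaying a frustration.
# This list is illustrative, not taken from any real product.
HEDGING_PHRASES = ("it's fine", "it's okay", "not sure", "no problems", "i guess")

def needs_followup(answer: str, min_words: int = 8) -> bool:
    """Rough heuristic: flag short or hedged answers for an AI follow-up."""
    text = answer.strip().lower()
    if len(text.split()) < min_words:
        return True  # too brief to contain a concrete story
    return any(phrase in text for phrase in HEDGING_PHRASES)

print(needs_followup("It's fine I guess."))  # → True (short and hedged)
print(needs_followup(
    "Exporting reports takes five manual steps and breaks whenever "
    "a column name contains a comma, so I rebuild the file by hand."
))  # → False (already a specific, detailed story)
```

A detailed answer with a concrete example passes through untouched; a vague one triggers a probe like the configurations listed above.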

Jobs-to-be-done questions for deeper understanding

The jobs-to-be-done (JTBD) framework is all about uncovering the 'job' users are hiring your product to do—focusing not on features, but on the progress or outcome users are seeking. These questions are gold for finding both expected and totally new use cases.

  • Question: "When you started using [product/feature], what was happening in your life or work?"
    Why it matters: Clarifies context and triggers for adoption.
    AI follow-up prompt:

    Ask about alternative ways they tackled these challenges before using your product. Probe for what changed that led them to switch.

  • Question: "What progress were you hoping to make with this tool?"
    Why it matters: Frames the user's deeper intention beyond just using a tool.
    AI follow-up prompt:

    Explore how using your product helped (or didn't help) them make that progress. Ask about moments when they felt real advancement.

  • Question: "What would you have done if this product wasn’t available?"
    Why it matters: Surfaces true competition and substitutes, not just competitors.
    AI follow-up prompt:

    Ask for details about that alternative—how effective it was and whether they'd go back to it now.

  • Question: "Have you recommended [product/feature] to anyone? Why or why not?"
    Why it matters: Measures advocacy and can hint at unmet jobs or pain points.
    AI follow-up prompt:

    Probe for what would make them more likely to recommend, or reasons they hesitated to do so.

  • Question: "What surprised you about using this product compared to what you expected?"
    Why it matters: Reveals hidden value or unexpected pain.
    AI follow-up prompt:

    Push further: Was the surprise positive or negative, and how did it shape their overall experience?

  • Question: "Has your use of the product changed over time? In what ways?"
    Why it matters: Indicates a broader or evolving job that could inform future design.
    AI follow-up prompt:

    Ask for examples of how their workflow changed before and after using your solution.

Jobs-to-be-done questions focus on the progress users crave. When you leverage smart AI probing, you’ll often surface unexpected jobs—sometimes the most valuable insight of all.

Questions about behavior and decision-making

What users say and what they actually do aren’t always the same. That’s why focusing on behavior—not just opinions—matters for real UX insight. Behavioral questions, paired with AI-powered probing and analysis like AI survey response analysis, help you understand what is actually happening, not just what people wish would happen.

| Type | What You Learn | Example |
|------|----------------|---------|
| Opinion Questions | Aspirations, beliefs, or perceptions | "Would you recommend this to a friend?" |
| Behavioral Questions | Concrete actions and frequency | "When was the last time you used the feature?" |

  • Question: "Can you walk me through how you typically use [product/feature]?"
    AI follow-up prompt:

    Ask for the last time they went through this process step-by-step, including any shortcuts or workarounds.

  • Question: "When was the last time you used [feature]? What did you do?"
    AI follow-up prompt:

    Have them recall what led up to using it and if the outcome matched their expectation.

  • Question: "How do you decide which tool to use for [task]?"
    AI follow-up prompt:

    Probe for criteria or triggers that make them pick one tool over another. Get examples of recent decisions.

  • Question: "Which features do you use the most, and which do you ignore?"
    AI follow-up prompt:

    Explore why they skip certain features—are they hard to find, confusing, or just not useful?

  • Question: "Have you ever stopped mid-way while using the product? What happened?"
    AI follow-up prompt:

    Ask what made them pause and how often this pattern repeats.

  • Question: "How frequently do you find yourself reaching for help or documentation?"
    AI follow-up prompt:

    Explore what makes them ask for help instead of continuing, and how helpful those resources are.

  • Question: "Tell me about a time you discovered a new feature. How did it affect your usage?"
    AI follow-up prompt:

    Probe for how they learned about it, why they decided to try it, and whether it changed their regular usage.

Create your survey

Try it out. It's fun!

Sources

Adam Sabla

Adam Sabla is an entrepreneur with experience building startups that serve over 1M customers, including Disney, Netflix, and BBC, with a strong passion for automation.
