
User interview questions: the best questions for usability testing that drive actionable insights


Adam Sabla · Sep 9, 2025


The right user interview questions can transform your usability testing from surface-level feedback into actionable product insights. This article helps you craft effective prompts for in-product interviews that genuinely move the needle.

With in-product conversational surveys, you ask these questions at the perfect moment: right when users are interacting with your features. Conversation-driven UX surveys dig deeper than boring forms, surfacing the context, pain points, and motivations behind what people actually experience in your product.

Core questions every usability test needs

Kicking off any in-product interview, you want a handful of questions that are simple yet revealing. Using the best questions for usability testing sets the tone for honest, detailed feedback. Here are the essentials I always include:

  • “What were you hoping to accomplish when you opened the product today?”
    This starter gets at real intent: for new users it uncovers initial needs, while returning users often reveal deeper jobs-to-be-done.

  • “Was anything confusing or frustrating just now?”
    Uncovers friction in the moment, instead of relying on distant, less honest recollections. For first-time users, swap “just now” for “during your first session.”

  • “Did you find what you needed? If not, where did you get stuck?”
    Gets straight at discoverability. New users tend to surface navigation pain, while power users flag subtler blockers.

  • “If you could change one thing about this experience, what would it be?”
    A classic way to prompt for actionable ideas. Swap in “What’s missing?” if you want to focus on features rather than flaws.

  • “What did you like most about using this feature?”
    Balances the critical with the positive, shining light on strengths you should double down on.

  • “How long did it take to finish what you set out to do?”
    Puts time and effort into focus. In fact, usability testing can reduce task completion time by up to 40%, a reminder that efficiency is worth tracking [5].

  • “Would you use this feature again? Why or why not?”
    Surfaces stickiness and real intent. For returning users, “What keeps you coming back?” works well as a variation.

| Good Practice | Bad Practice |
| --- | --- |
| “Was anything confusing or frustrating just now?” | “Rate your satisfaction on a scale of 1–5.” |
| “If you could change one thing, what would it be?” | “Is the feature acceptable?” |

AI-driven follow-up questions can take a vague answer (“It was okay…”) and instantly probe deeper, surfacing actionable detail—especially if you’re using automatic AI follow-up questions to make your survey more adaptive.

Scenario starters that uncover real user behavior

Scenario-based questions beat hypotheticals every time. They help users replay real moments, leading to sharper, more specific insights. Here are my go-to scenario starters for usability interviews:

  • “Walk me through the last time you tried to [complete core task] using our product.”

  • “Think back to when you first used [feature X]. What was the hardest part of getting started?”

  • “Imagine you’re helping a friend do [goal]. How would you explain our product to them?”

  • “Tell me about a time you almost gave up using [feature]. What happened?”

  • “Describe the steps you took from signing in to finishing your main task. Where did you hesitate?”

Example prompt: “Describe the last time you used our export tool. What steps did you take, and where did you slow down or get stuck?”

Real experiences beat hypotheticals: People anchor their answers on actual frustrations and triumphs, producing insight you can trust and act on. That honesty is why in-app surveys see an average 13% response rate—13x higher than cold email forms [2].

When answers are ambiguous, I’ll use natural clarifiers like:

  • “Can you give an example?”

  • “What made that step difficult?”

  • “Was there anything unexpected?”

  • “How would you have changed that step to make it easier?”

With conversational surveys, these clarifiers flow naturally—AI can adapt to user tone, almost like a real teammate, making the whole interview feel less like a test and more like a helpful chat.

Follow-up rules that get to the ‘why’ behind user actions

Dynamic follow-ups are where the magic happens in usability testing. You don’t just collect responses—you dig for the ‘why’. Here’s how I structure follow-up logic that works, with examples:

  • Clarification: If a user gives a short or ambiguous answer, follow up with “Can you tell me a bit more about what you mean?”

  • Motivation: After positive or negative responses, probe with “What made you feel that way?”

  • Alternatives: If a user reports giving up, ask “Did you try any workarounds or different products?”

  • Specifics: When someone mentions a pain point, follow up with “Where exactly did you get stuck?”

  • Stop rules: If a respondent gives three consecutive ‘no issues’ answers, the AI shouldn’t nag further; end politely instead.

  • Depth limit: For sensitive topics, set the AI to ask a maximum of two follow-ups to avoid survey fatigue.

For example, configuring this behavior in Specific can look like:

Example prompt: “If the response mentions any difficulty, politely ask for a specific example, then stop after one clarification. If the user seems frustrated, acknowledge their pain point before moving on.”

Follow-ups make the difference between a static form and a true conversational survey. You can fully tailor this behavior in the AI survey editor—describe your rules in plain language, and the AI adapts live.

Example prompt: “For any answer mentioning speed or performance, immediately ask which part felt slowest. Otherwise, don’t follow up.”
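In Specific you describe these rules in plain language, but if you’re curious how the same logic might be wired by hand, here’s a minimal sketch. The keyword checks and the `next_follow_up` function are hypothetical stand-ins for the AI’s actual intent detection:

```python
# Hypothetical sketch of the follow-up rules above; simple keyword matching
# stands in for the AI's intent detection.

MAX_FOLLOW_UPS = 2          # depth limit: at most two follow-ups per topic
NO_ISSUE_STOP_STREAK = 3    # stop rule: three consecutive "no issues" answers

def next_follow_up(answer: str, no_issue_streak: int, follow_ups_asked: int) -> str | None:
    """Return the next follow-up question, or None to end politely."""
    text = answer.lower().strip()

    # Stop rules: respect the no-issues streak and the depth limit.
    if no_issue_streak >= NO_ISSUE_STOP_STREAK or follow_ups_asked >= MAX_FOLLOW_UPS:
        return None

    # Clarification: short or vague answers get a gentle probe.
    if len(text.split()) < 4:
        return "Can you tell me a bit more about what you mean?"

    # Alternatives: the user gave up, so ask about workarounds.
    if "gave up" in text or "stopped trying" in text:
        return "Did you try any workarounds or different products?"

    # Specifics: a pain point was mentioned, so pin down where.
    if any(w in text for w in ("stuck", "confusing", "frustrating", "slow")):
        return "Where exactly did you get stuck?"

    # Motivation: probe the 'why' behind a strong positive or negative.
    if any(w in text for w in ("love", "hate", "great", "terrible")):
        return "What made you feel that way?"

    return None
```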

When and where to trigger usability questions

The best user interview questions work because they’re asked when the user has real, fresh context. That’s why timing and placement matter so much for in-product interviews:

  • After a user completes a key task or workflow (e.g., finishes checkout or publishes content)

  • When users explore a new feature for the first time

  • On exit or logout, to catch feedback before memory fades

  • After multiple failed attempts or error triggers (like 404s)

  • During onboarding, after a major milestone (not in the first 30 seconds!)

Smart targeting prevents survey fatigue: You can target by user behavior, so only relevant users see the questions—and add frequency controls, like:

  • Prompt no more than once per user, per week

  • Set a minimum gap (e.g. 14 days) between survey triggers for the same user

  • Exclude users already surveyed this month

Timing matters—a question delivered right after a completed task gets instant, actionable feedback. In-product surveys can increase usability test response rates 4x compared to email, because context is everything [3].
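Specific handles this targeting without code, but as a rough sketch of the gating logic under the hood (the event names and fields here are hypothetical), the trigger and frequency rules combine roughly like this:

```python
from datetime import datetime, timedelta

# Hypothetical event names for moments with fresh context.
TRIGGER_EVENTS = {"task_completed", "feature_first_use", "logout", "repeated_error"}

MIN_GAP = timedelta(days=14)  # minimum gap between surveys for the same user

def should_show_survey(event: str, last_surveyed: datetime | None, now: datetime | None = None) -> bool:
    """Gate a survey prompt on event type plus frequency controls."""
    now = now or datetime.utcnow()

    # Only fire on moments where the user has real, fresh context.
    if event not in TRIGGER_EVENTS:
        return False

    # Never-surveyed users are always eligible.
    if last_surveyed is None:
        return True

    # Frequency controls: enforce the 14-day minimum gap...
    if now - last_surveyed < MIN_GAP:
        return False
    # ...and exclude users already surveyed this calendar month.
    if (last_surveyed.year, last_surveyed.month) == (now.year, now.month):
        return False

    return True
```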

| Optimal Timing | Poor Timing |
| --- | --- |
| Post-task completion (“Congrats! What went well?”) | Random pop-up after login |
| On feature discovery (“What attracted you to this?”) | Unrelated feedback form in the middle of a workflow |

Making your usability interviews actionable

Once answers roll in, you need to extract real insights, not just anecdotes. That’s where AI-powered survey response analysis shines. I rely on AI to:

  • Cluster open-ended responses into actionable themes

  • Summarize each conversation thread so nothing gets lost

  • Highlight urgent patterns you might otherwise miss

  • Segment findings by user type (e.g., new vs repeat customers, early adopters vs strugglers)

AI summaries surface patterns humans miss: chat with the AI about your responses, digging into questions like “Why do users drop off after step 3?” or “Which features do power users love most?”

It’s easy to create multiple analysis threads in Specific—one for UX pain points, one for feature requests, another for language confusion. This lets you attack problems from every angle and feed real insights back into your product roadmap.
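If you’re curious what the clustering step looks like mechanically, here’s a deliberately simplified stand-in: TF-IDF plus k-means (via scikit-learn) in place of the language-model pipeline Specific actually uses, grouping open-ended responses and printing the top terms that define each theme:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy sample of open-ended responses; real input would come from your survey.
responses = [
    "The export tool kept freezing on step 3.",
    "I couldn't find the export button at all.",
    "Loved the templates, saved me an hour.",
    "Templates are great, but search is slow.",
]

# Turn free text into term-weight vectors.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(responses)

# Cluster responses into two candidate themes.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
terms = vectorizer.get_feature_names_out()

# Print the highest-weighted terms per cluster so each theme is readable.
for i, center in enumerate(kmeans.cluster_centers_):
    top_terms = [terms[j] for j in center.argsort()[::-1][:3]]
    print(f"Theme {i}: {', '.join(top_terms)}")
```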

If you’re not running these in-product interviews, you’re missing out on understanding why users struggle with your product. You’ll never spot the invisible friction points, silent frustrations, or hidden delights that make or break the user experience—and your business.

Start capturing deeper usability insights today

Get richer insights than boring forms ever could: start having real conversations with users through natural-feeling, AI-powered usability interviews. Create your own survey in minutes and turn feedback from noise into decision-ready insights. Conversations drive action; don’t settle for anything less.

Create your survey

Try it out. It's fun!

Sources

  1. Gitnux. Surveys conducted with a conversational tone have a response rate of 35-40%.

  2. Alchemer. In-app survey response rate benchmarks.

  3. Userpilot. Case study showing a 4x increase in usability test response rates from in-product surveys.

  4. wpwax. Conversational surveys can increase survey response rates by up to 27%.

  5. Moldstud. Usability testing can reduce task completion time by up to 40% and increase revenue growth.

  6. VWO. User experience and usability statistics.

Adam Sabla

Adam Sabla is an entrepreneur with experience building startups that serve over 1M customers, including Disney, Netflix, and BBC, with a strong passion for automation.