This article offers practical tips for analyzing survey responses from conference participants about accessibility and inclusion, using AI-driven approaches and survey analysis tools.
Choosing the right tools for survey data analysis
The approach and tools you use to analyze your conference accessibility and inclusion survey responses will depend on the structure of your data. Let’s break down the options to help you work as efficiently as possible.
Quantitative data: If your survey asks how many people needed accessible seating, or what percentage felt the venue signage was clear, these responses are numbers and counts. You can easily analyze these in Excel, Google Sheets, or similar spreadsheet tools. Quick calculations and summaries, like tracking how many respondents reported issues, are straightforward.
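The same quick calculations are easy to script as well. Here is a minimal sketch in Python, using an illustrative list of responses (the field names are hypothetical, not taken from any particular tool's export format):

```python
# Hypothetical sample of structured (quantitative) survey responses.
responses = [
    {"needs_accessible_seating": True,  "signage_clear": True},
    {"needs_accessible_seating": False, "signage_clear": False},
    {"needs_accessible_seating": True,  "signage_clear": True},
    {"needs_accessible_seating": False, "signage_clear": True},
]

# How many respondents needed accessible seating.
seating_count = sum(r["needs_accessible_seating"] for r in responses)

# Percentage who felt the venue signage was clear.
signage_pct = 100 * sum(r["signage_clear"] for r in responses) / len(responses)

print(seating_count)  # 2
print(signage_pct)    # 75.0
```

The same counts come out of a spreadsheet with `COUNTIF` and a division; scripting only pays off once you want to repeat the analysis across multiple surveys.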
Qualitative data: If you asked open-ended questions, such as “What barriers did you face as an attendee with a disability?” or followed up with “Tell me more about your experience,” you’re dealing with rich, nuanced responses. Reading dozens (or hundreds) of these carefully by hand is impractical. Here’s where AI comes to the rescue: specialized AI tools can summarize, classify, and extract core themes from these long text answers.
When you’re tackling qualitative responses, there are two common approaches for tooling:
ChatGPT or a similar general-purpose AI tool
Copy-paste data, chat, and analyze: You can export your open-ended survey responses and paste the data directly into ChatGPT, Gemini, or similar AI platforms. Then, you’ll prompt the AI to find recurring problems, summarize experiences, or extract key quotes.
Convenience has limits: Transferring large blocks of survey text is tedious, and running out of context space is common. You may find yourself slicing the data into chunks, tracking which answers you’ve already analyzed, and re-prompting the AI to get what you want.
Risk of error and lost insights: For complex projects, especially where accessibility and inclusion are involved, missing patterns or piecing insights together from fragments can weaken your findings.
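If you do go the copy-paste route, the chunking step can at least be automated. Below is a minimal sketch (the character budget is a rough stand-in for a model's context limit; real limits are measured in tokens, so treat the threshold as an assumption to tune):

```python
def chunk_responses(responses, max_chars=8000):
    """Group free-text answers into batches that fit a rough
    character budget, so each batch can be pasted or sent
    to the AI in one go."""
    batches, current, size = [], [], 0
    for text in responses:
        # Start a new batch once adding this answer would overflow.
        if current and size + len(text) > max_chars:
            batches.append(current)
            current, size = [], 0
        current.append(text)
        size += len(text)
    if current:
        batches.append(current)
    return batches

# Illustrative data: three long answers and one short one.
answers = ["a" * 3000, "b" * 3000, "c" * 3000, "d" * 1000]
batches = chunk_responses(answers, max_chars=8000)
print(len(batches))  # 2
```

Batching this way keeps each answer intact (no response is split mid-text), which matters when you later ask the AI to count how many people mentioned a theme.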
All-in-one tool like Specific
Purpose-built for AI survey analysis: Platforms like Specific are made for this exact job. They let you both collect survey data via conversational, AI-driven surveys and then analyze that data right in the same place.
Improved response quality: While gathering your data, Specific’s AI asks intelligent follow-up questions, uncovering detail that would be lost with forms. You collect richer, more reliable inputs—which matter deeply in accessibility and inclusion conversations.
Instant AI-powered insights: After data collection, Specific instantly summarizes open-text and follow-up answers, finds recurring themes, and turns everything into plain, actionable takeaways in seconds. No spreadsheets or manual data cleaning required. You can chat about your data exactly like you do in ChatGPT, but with all your survey logic and context kept safe.
Additional features for advanced analysis: With Specific, you can control what data the AI sees, manage multiple analysis sessions, and compare themes across subsets of participants—all in one place. For a hands-on overview, see the introduction to AI survey response analysis or explore how automatic AI follow-up questions work.
If you’re planning a survey from scratch or want a template tailored to inclusivity and accessibility, the AI survey generator for conference accessibility can jump-start your project.
Consider this: Only 35% of surveyed destinations report having resources in place to make meetings accessible[1]. The need for practical, data-driven analysis tools is pressing—with so many barriers still reported by conference-goers, being able to quickly turn survey data into action is a hallmark of successful inclusion programs.
Useful prompts that you can use to analyze survey data from conference participants on accessibility and inclusion
When analyzing qualitative responses, what you ask the AI matters just as much as how you collect the data. With the right prompts, you can surface core issues, validate hunches, and even get actionable advice for future accessibility improvements.
Prompt for core ideas: This is a universal prompt for surfacing main themes from a big pile of qualitative survey responses. We use (and recommend) this at Specific, but it’ll work for ChatGPT and similar AIs.
Your task is to extract core ideas in bold (4-5 words per core idea) + up to 2 sentence long explainer.
Output requirements:
- Avoid unnecessary details
- Specify how many people mentioned each core idea (use numbers, not words), most mentioned on top
- no suggestions
- no indications
Example output:
1. **Core idea text:** explainer text
2. **Core idea text:** explainer text
3. **Core idea text:** explainer text
Add situation/context for better results: AI works much better if you quickly explain who your audience is and what your goals are. For example:
Analyze feedback from attendees of a recent business conference. The survey asks about accessibility and inclusion—focus on barriers and successfully implemented supports. We want to prioritize changes for future events. Use the core ideas format.
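If you're sending these prompts programmatically rather than pasting them into a chat window, a minimal sketch of stitching the context, the instructions, and one batch of responses into a single prompt string might look like this (all names and the exact wording are illustrative assumptions):

```python
# Context and task instructions, adapted from the prompts above.
CONTEXT = (
    "Analyze feedback from attendees of a recent business conference. "
    "The survey asks about accessibility and inclusion; focus on barriers "
    "and successfully implemented supports."
)
INSTRUCTIONS = (
    "Your task is to extract core ideas in bold (4-5 words per core idea) "
    "+ up to 2 sentence long explainer."
)

def build_prompt(responses):
    """Combine context, instructions, and a batch of survey
    answers into one prompt ready to send to an AI model."""
    body = "\n".join(f"- {r}" for r in responses)
    return f"{CONTEXT}\n\n{INSTRUCTIONS}\n\nResponses:\n{body}"

prompt = build_prompt([
    "Elevator signage was hard to find.",
    "Live captions were a huge help.",
])
print(prompt)
```

Keeping the context and instructions as constants makes it easy to rerun the same analysis over every batch produced by your chunking step, so each batch is judged against identical criteria.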
Dive deeper: After you spot a theme (e.g. “elevator signage hard to find”), prompt the AI with:
Tell me more about elevator signage comments.
Prompt for specific topics: If you want to check for a mention or idea in responses, use:
Did anyone talk about venue navigation? Include quotes.
Prompt for personas: To group responses by type of attendee (e.g., wheelchair users, people with hearing loss):
Based on the survey responses, identify and describe a list of distinct personas—similar to how "personas" are used in product management. For each persona, summarize their key characteristics, motivations, goals, and any relevant quotes or patterns observed in the conversations.
Prompt for pain points and challenges: Perfect for finding what’s not working:
Analyze the survey responses and list the most common pain points, frustrations, or challenges mentioned. Summarize each, and note any patterns or frequency of occurrence.
Prompt for suggestions & ideas: To get feedback on what to fix or add:
Identify and list all suggestions, ideas, or requests provided by survey participants. Organize them by topic or frequency, and include direct quotes where relevant.
Prompt for unmet needs & opportunities: To discover gaps that matter most for accessibility:
Examine the survey responses to uncover any unmet needs, gaps, or opportunities for improvement as highlighted by respondents.
These prompts work no matter what GPT-based tool you choose, but platforms like Specific have them built in for convenience. For question inspiration, there’s a handy guide to