
Survey question examples

Copy-ready questions by goal, plus guidance on choosing the right format.

Key Takeaways

  1. Start with the decision: Write questions that map to what you will do differently based on the results.
  2. Match format to analysis: Use rating/Likert items for trend tracking, multiple choice for categorizing drivers, and open-ended prompts for "what did we miss?" detail.
  3. Reduce bias with neutral wording: Avoid leading, double-barreled, and vague terms; specify the timeframe and context.
  4. Design the flow: Begin with easy, relevant questions; place sensitive and demographic items later; add follow-ups only where they change actions.
  5. Copy, then customize: Use the examples below as templates and replace bracketed text to fit your product, event, team, or customer journey.

Quick-start: pick your goal, then copy questions

A survey question is any prompt you use to collect a respondent's answer in a structured way (multiple choice, rating, Likert, open-ended, ranking, and more). If you want an overview of how the different formats work, see the question type templates later in this guide.

[Chart: recommended question types across goals. Open-ended and multiple choice appear in most survey goals.]
Fast mapping from survey goal to question formats and copy-ready examples
Each goal below lists its best question types, followed by copy-ready examples (edit the bracketed text).

Customer satisfaction (0-10 rating, multiple choice, open-ended)
  • Overall, how satisfied are you with [Company/Product]? (0-10)
  • What was the main reason for your score? (Open-ended)
  • Which part of your experience needs the most improvement? (Choose one)

Product feedback (multiple choice, ranking, open-ended)
  • Which features do you use weekly? (Select all that apply)
  • What problem were you trying to solve with [Product]? (Open-ended)
  • Rank the following improvements by priority. (Ranking)

Employee engagement (Likert, rating, open-ended)
  • I can do my best work with the tools provided. (Agree-disagree)
  • How supported do you feel by your manager? (1-5)
  • What is one thing we should start/stop/continue? (Open-ended)

Website/UX (task difficulty rating, multiple choice, open-ended)
  • How easy was it to complete [task] today? (Very easy to Very difficult)
  • What prevented you from completing your task? (Choose one)
  • What were you expecting to find but did not? (Open-ended)

Post-event feedback (rating, multiple choice, open-ended)
  • How would you rate the overall event? (1-5)
  • Which session was most valuable? (Choose one)
  • What should we change next time? (Open-ended)

Training evaluation (Likert, rating, open-ended)
  • The training objectives were clear. (Agree-disagree)
  • How confident are you in your ability to apply what you learned? (0-10)
  • What topics should we add or go deeper on? (Open-ended)

Market/pricing research (multiple choice, ranking, open-ended)
  • Which alternatives did you consider? (Select all)
  • Which factor mattered most in your choice? (Choose one)
  • What price would feel "too expensive" for [offer]? (Open-ended numeric)

Demographics (multiple choice, dropdown, "prefer not")
  • Which age range are you in? (Ranges)
  • What is your role? (Choose one)
  • Which region do you live in? (Choose one)
Want to improve question quality?

After you copy questions, scan the wording and options against the wording best practices and bias checks later in this guide. Small edits (timeframes, neutral phrasing, single-idea questions) usually improve data quality fast.

Customer satisfaction survey question examples

Use these when you need a clear read on satisfaction, effort, and what to fix first. If you want a ready-made set, see customer satisfaction survey questions and related templates.

[Chart: customer satisfaction examples lean on multiple choice, with ratings and comments.]

Overall satisfaction (CSAT-style)

  • Overall, how satisfied are you with your experience today? (0-10)
  • How satisfied are you with [Product/Service]? (Very satisfied, Satisfied, Neither, Dissatisfied, Very dissatisfied)
  • How well did we meet your expectations today? (Much better than expected to Much worse than expected)
  • What was the main reason for your rating? (Open-ended)

Support and service interactions

  • How easy was it to get the help you needed? (0-10)
  • Which channel did you use? (Phone, Email, Chat, In-app, In person, Other)
  • Did our team resolve your issue? (Yes fully, Yes partially, No, Not applicable)
  • What should we improve about our support experience? (Open-ended)

Drivers and priorities

  • Which part of your experience needs the most improvement? (Choose one: speed, accuracy, friendliness, product quality, pricing, communication, other)
  • Which of these issues did you experience today? (Select all that apply)
  • If we could improve just one thing, what should it be? (Open-ended)
  • How likely are you to continue using [Company/Product] over the next 3 months? (0-10)

For numeric satisfaction and effort items, use a consistent rating scale so you can compare results over time.
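As an illustration of why a consistent scale matters, here is a minimal sketch (the data, the field names, and the common "9-10 counts as satisfied" cutoff are all illustrative assumptions, not part of this guide):

```python
# Sketch: trend a 0-10 satisfaction item across survey periods.
# The 9-10 "top box" cutoff is one common convention, not a requirement.

def csat_summary(scores):
    """Return the mean score and the share of responses rated 9 or 10."""
    if not scores:
        return None
    mean = sum(scores) / len(scores)
    top_box = sum(1 for s in scores if s >= 9) / len(scores)
    return {"mean": round(mean, 2), "top_box_pct": round(top_box * 100, 1)}

# Hypothetical responses from two periods, on the same 0-10 scale:
q1 = [8, 9, 10, 7, 9, 6, 10]
q2 = [9, 9, 10, 8, 10, 9, 7]
print(csat_summary(q1))
print(csat_summary(q2))
```

Because both periods use the same scale and the same summary, the two numbers are directly comparable; if one period had used 1-5 and the other 0-10, no such comparison would be valid.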

Product feedback survey question examples

These are useful for feature prioritization, usability pain points, and understanding the job-to-be-done (the real problem users hired your product to solve).

[Chart: product feedback examples shift from usage questions to usability ratings and prioritization.]

If you are building a full questionnaire, start with these product feedback survey questions and customize.

Usage and context

  • How often do you use [Product]? (Daily, Weekly, Monthly, Less often, This was my first time)
  • Which features have you used in the last 30 days? (Select all that apply)
  • What are you mainly using [Product] for? (Choose one)
  • What tool(s) did you use before [Product]? (Open-ended)

Value and outcomes

  • How valuable is [Feature] to you? (Not at all valuable to Extremely valuable)
  • What outcome matters most to you when using [Product]? (Choose one)
  • How disappointed would you be if you could no longer use [Product]? (Not disappointed, Slightly, Moderately, Very, Extremely)
  • What is one way [Product] has helped you in the last month? (Open-ended)

Usability and friction

  • How easy was it to complete [key task] in [Product]? (Very easy to Very difficult)
  • Where did you get stuck, if anywhere? (Choose one, plus optional comment)
  • How clear are the labels and navigation in [Product]? (0-10)
  • If you could change one thing about the user experience, what would it be? (Open-ended)

Prioritization

  • Rank these improvements in order of priority. (Ranking)
  • Which improvement would have the biggest impact for you? (Choose one)
  • Which of the following should we stop doing or remove? (Select all that apply)
  • What feature are we missing that would make [Product] a better fit? (Open-ended)

Employee feedback and engagement question examples

Employee surveys are often under-served by generic question lists. Use the examples below for engagement pulses, manager feedback, team effectiveness, and retention risk. For a full set, see employee engagement survey questions.

Engagement and enablement (agree-disagree)

These work well as Likert scale questions (for example: Strongly disagree to Strongly agree).

  • I understand what is expected of me in my role.
  • I have the tools and resources I need to do my job well.
  • In the last 2 weeks, I was able to focus on my most important work.
  • I see a clear connection between my work and our team's goals.

Manager and team effectiveness

  • My manager gives me useful feedback that helps me improve. (Agree-disagree)
  • How comfortable are you raising concerns with your manager? (0-10)
  • My team communicates effectively about priorities and deadlines. (Agree-disagree)
  • When priorities change, we are informed in time to adjust. (Agree-disagree)

Workload, wellbeing, and retention signals

  • How manageable is your workload right now? (Very manageable to Not manageable)
  • How often do you feel stressed at work? (Never, Rarely, Sometimes, Often, Always)
  • How likely are you to look for a new job in the next 6 months? (0-10)
  • What is the biggest obstacle to doing your best work? (Open-ended)

Growth and recognition

  • I have opportunities to learn and grow in my role. (Agree-disagree)
  • In the last 7 days, I received recognition for good work. (Agree-disagree)
  • How confident are you about your career path here? (0-10)
  • What is one skill you want to develop this quarter? (Open-ended)
Employee survey tip: protect candor

If results will be reported by team, avoid collecting identifiers you do not need. Put optional demographic items at the end, and include "Prefer not to answer" where appropriate.

Website and UX survey question examples

UX surveys work best when tied to a specific moment (right after a task, purchase attempt, or support search). Keep the wording concrete and time-bound.

Task success and friction

  • What were you trying to do on our website today? (Choose one)
  • Were you able to complete your task today? (Yes, Partly, No)
  • How easy was it to complete your task today? (Very easy to Very difficult)
  • What prevented you from completing your task? (Choose one, plus optional comment)

Content and navigation

  • How easy was it to find the information you needed? (0-10)
  • Which page or section were you looking for? (Open-ended)
  • What information was missing or unclear? (Open-ended)

Experience and trust

  • How confident are you that you found the right information? (0-10)
  • What, if anything, made you hesitate before taking the next step? (Open-ended)
  • Which words best describe your experience today? (Select up to 3)

Post-event survey question examples

Use these after conferences, workshops, webinars, or internal events. If you want a ready-to-send set, see post-event survey questions.

Overall rating and objectives

  • Overall, how would you rate the event? (1-5)
  • How well did the event meet your expectations? (Much better than expected to Much worse than expected)
  • How relevant was the content to your role/goals? (Not relevant to Extremely relevant)
  • What was your primary reason for attending? (Choose one, plus Other)

Sessions, speakers, and logistics

  • Which session was most valuable? (Choose one)
  • Which session was least valuable? (Choose one)
  • How would you rate the speakers/presenters? (1-5)
  • How would you rate the venue/platform and logistics (registration, timing, access)? (1-5)

Outcomes and improvements

  • What is one thing you will do differently because of this event? (Open-ended)
  • What should we change for next time? (Open-ended)
  • What topics should we include in future events? (Open-ended)
  • How likely are you to attend a future event from us? (0-10)

Demographic and profile question examples (optional)

Demographic items help you compare results across groups, but only ask what you will actually use. See more demographic questions guidance and examples.

  • Which age range are you in? (Under 18, 18-24, 25-34, 35-44, 45-54, 55-64, 65+, Prefer not to answer)
  • Which best describes your role? (Choose one)
  • Which industry do you work in? (Choose one)
  • What is the size of your organization? (1-10, 11-50, 51-200, 201-1,000, 1,001+)
  • Which country/region do you live in? (Choose one)
  • How long have you been a customer/employee? (Less than 3 months, 3-12 months, 1-3 years, 3+ years)
  • Which of the following best describes your work arrangement? (On-site, Hybrid, Remote)
  • What is your highest level of education completed? (Choose one, Prefer not to answer)
  • Which language do you primarily use at work/home? (Choose one, Prefer not to answer)
  • Is there anything else you would like us to know about your situation that affects your answers? (Optional open-ended)

Question type templates (with answer options you can reuse)

Choosing the right format upfront prevents rework later. Question format affects what analysis is possible (for example, how easily you can summarize results or compare groups), so decide based on how you plan to use the data. For a practical overview, see the University of Florida guidance on how question type impacts future analysis.
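To make the format-drives-analysis point concrete, here is a small sketch with hypothetical response data: single-select answers tally into categories, while rating answers average into a trendable metric.

```python
# Sketch: the same decision ("what matters most?") analyzed two ways,
# depending on question format. All data here is illustrative.
from collections import Counter

# Single-select answers -> categorical tally (find the top driver):
choices = ["price", "features", "price", "reviews", "price"]
top_driver = Counter(choices).most_common(1)[0]
print(top_driver)  # ('price', 3)

# 0-10 rating answers -> a numeric metric you can trend:
ratings = [7, 9, 8, 10, 6]
print(sum(ratings) / len(ratings))  # 8.0
```

An open-ended version of the same question would need manual or automated coding before either summary is possible, which is why format should be chosen before the survey ships.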

Multiple choice (single select)

Use when you need a clear category for reporting (top driver, main barrier, primary channel). For more patterns, see multiple choice questions.

  • What is the primary reason you chose [Product/Service]? (Choose one: price, features, recommendation, reviews, brand trust, availability, other)
  • Which best describes your experience today? (Choose one: completed successfully, completed with difficulty, could not complete, just browsing)
  • Where did you first hear about us? (Choose one)

Multiple choice (multi-select)

  • Which of the following did you use in the last [timeframe]? (Select all that apply)
  • Which issues did you experience? (Select all that apply)
  • What would you like us to improve? (Select all that apply, plus Other)

Rating scales (numeric or labeled)

Use when you need a metric you can trend. See more rating scale question examples.

  • How satisfied are you with [topic]? (0-10)
  • How easy was it to [do task]? (0-10)
  • How likely are you to [renew/recommend/attend again]? (0-10)

Likert scales (agree-disagree statements)

Use for attitudes, perceptions, and workplace items. For patterns and pitfalls (like vague statements), see Likert scale questions.

  • [Product/Process] makes my work easier. (Strongly disagree to Strongly agree)
  • Communication about changes is timely. (Strongly disagree to Strongly agree)
  • I trust leadership to make decisions in the organization's best interest. (Strongly disagree to Strongly agree)
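Likert items are usually scored numerically for reporting. A minimal sketch, assuming a standard 5-point agree-disagree scale and the common "favorable = agree or strongly agree" convention (both assumptions, not requirements):

```python
# Sketch: score agree-disagree (Likert) responses so they can be
# summarized. The labels and the "favorable >= 4" rule are conventions.

LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def favorable_share(responses):
    """Share of responses that agree or strongly agree."""
    scores = [LIKERT[r] for r in responses]
    return sum(1 for s in scores if s >= 4) / len(scores)

answers = ["Agree", "Strongly agree", "Disagree", "Agree",
           "Neither agree nor disagree"]
print(f"{favorable_share(answers):.0%}")  # 3 of 5 favorable -> 60%
```

Keeping the same label-to-number mapping across surveys is what makes "favorable %" comparable from one pulse to the next.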

Open-ended prompts (qualitative follow-up)

Use open-ended items when you need nuance, examples, or ideas you did not anticipate. If you want more patterns, see open-ended questions examples. Research also shows that adding examples to an open-ended prompt can influence the kinds of responses people give, so include examples carefully and only when you truly need that direction.

  • What is the main reason for your score?
  • What is one thing we should improve, and why?
  • What nearly stopped you from [buying/signing up/completing the task]?
  • What did we miss that would have made your experience better?

Ranking

  • Rank the following factors from most important to least important: [list].
  • Rank these improvements by the impact they would have for you: [list].

Matrix (use sparingly)

  • Please rate the following aspects of [experience] (1-5): speed, clarity, friendliness, outcome, value for money.
  • How satisfied are you with each of the following? (Very satisfied to Very dissatisfied): onboarding, daily use, support, billing.
Answer option rule of thumb

Make options mutually exclusive (no overlap) and collectively exhaustive (include an "Other" or "Not applicable" when needed). This is a common source of avoidable measurement error (see Purdue OWL: creating good interview and survey questions).
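The mutually-exclusive/collectively-exhaustive check can even be automated for numeric options. A minimal sketch, assuming integer ranges expressed as inclusive (low, high) pairs (an illustrative encoding, not a standard):

```python
# Sketch: flag overlaps and gaps between adjacent numeric answer options.
# Ranges are inclusive integer (low, high) pairs; examples are illustrative.

def check_ranges(ranges):
    """Return a list of problems found between adjacent sorted ranges."""
    problems = []
    ordered = sorted(ranges)
    for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
        if lo2 <= hi1:
            problems.append(f"overlap: {lo1}-{hi1} and {lo2}-{hi2}")
        elif lo2 > hi1 + 1:
            problems.append(f"gap: {hi1} to {lo2}")
    return problems

# 25 appears in two options, and 45-49 is covered by none:
print(check_ranges([(18, 25), (25, 34), (35, 44), (50, 64)]))
# Clean, adjacent options:
print(check_ranges([(18, 24), (25, 34), (35, 44), (45, 64)]))  # []
```

The first call flags both problems; respondents aged 25 would not know which option to pick, and those aged 45-49 would have none.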

Wording best practices (neutral, specific, and answerable)

Good examples fail when the wording is vague or biased. A few checks catch most problems.

  • Specify the timeframe: "in the last 30 days" beats "recently".
  • Use concrete nouns and verbs: "complete checkout" beats "use the site".
  • Avoid leading phrasing: remove adjectives like "great" or "easy" from the question.
  • One idea per question: do not combine two topics with "and".
  • Offer a legitimate "out": "Not applicable" and "Prefer not to answer" reduce forced guesses.

Bias can also come from context: question order, social desirability, and answer option framing. If you want a practical overview, see response bias and how to reduce it.

For open-ended items, keep the prompt focused and avoid stacking multiple asks into one text box. The University of Florida's Savvy Survey guidance provides practical patterns for constructing open-ended items (see Constructing open-ended items for a questionnaire).
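The checks above are mechanical enough to sketch as a rough lint pass over draft questions. The keyword lists below are illustrative assumptions; they catch obvious cases, not every form of bias:

```python
# Sketch: flag common wording problems in a draft question.
# Word lists are illustrative, not exhaustive.

VAGUE = {"recently", "often", "regularly", "sometimes"}
LEADING = {"great", "amazing", "easy", "excellent"}

def lint_question(text):
    """Return a list of warnings for one draft question."""
    warnings = []
    words = {w.strip("?.,").lower() for w in text.split()}
    if words & VAGUE:
        warnings.append("vague term: add a timeframe or frequency options")
    if words & LEADING:
        warnings.append("leading adjective: remove it from the question")
    if " and " in text.lower():
        warnings.append("possible double-barreled question: split it")
    return warnings

print(lint_question("How great was our pricing and support recently?"))
```

The example question trips all three checks; a human review is still needed, since a pattern match cannot tell a double-barreled "pricing and quality" from a harmless "terms and conditions".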

Survey flow: ordering, sensitive questions, and follow-ups

Even with great wording, poor flow can reduce completion and distort answers. The basics are consistent across survey guidance (for example, Community Tool Box's overview of survey conduct and planning: Conducting surveys).

  1. Start easy and relevant

    Open with a simple, non-threatening item (role, use case, or "what were you trying to do?") so respondents feel oriented.

  2. Group by topic, not by format

    Keep related items together (support, product, billing). Switching topics too often increases drop-off and "satisficing" (rushing).

  3. Ask for ratings before explanations

    Example: ask the 0-10 satisfaction first, then "What is the main reason for your score?" This preserves the metric while still capturing context.

  4. Put sensitive items later

    Demographics, compensation, and identity items go near the end and should be optional when possible.

  5. Add follow-ups only where you will act

    Use targeted follow-ups ("Which part was unclear?") instead of a general "Any other comments?" on every page.
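Some of these flow rules can be checked automatically against a survey definition. A minimal sketch, assuming a hypothetical list-of-dicts survey format with illustrative "id" and "sensitive" fields:

```python
# Sketch: flag sensitive items placed in the first half of a survey.
# The survey data structure and field names are illustrative assumptions.

def check_flow(questions):
    """Return warnings for sensitive items that appear too early."""
    problems = []
    n = len(questions)
    for i, q in enumerate(questions):
        if q.get("sensitive") and i < n // 2:
            problems.append(f"sensitive item too early: {q['id']}")
    return problems

survey = [
    {"id": "age", "sensitive": True},
    {"id": "task", "sensitive": False},
    {"id": "csat", "sensitive": False},
    {"id": "reason", "sensitive": False},
]
print(check_flow(survey))  # ['sensitive item too early: age']
```

Moving the age item to the end of the list clears the warning, matching the "put sensitive items later" rule above.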

A simple follow-up pattern that works

When you ask a rating or multiple choice question, add one optional open-ended follow-up: "What is the main reason for your answer?" This often gives enough context without turning the survey into an interview.

Before-and-after rewrites (fix common weak questions)

Use these rewrites as a template for improving your own drafts. Guidance on clarity and neutrality aligns with standard survey-writing recommendations from Purdue OWL and university survey resources (see Kansas State University: survey questions).

Weak vs improved survey question wording
  • Weak: How great was our customer service?
    Problem: Leading ("great") pressures positive answers.
    Improved: How would you rate the customer service you received today? (Very good to Very poor)
  • Weak: How satisfied are you with our pricing and product quality?
    Problem: Double-barreled (pricing and quality are different topics).
    Improved: How satisfied are you with our pricing? (0-10) / How satisfied are you with product quality? (0-10)
  • Weak: Do you use our app often?
    Problem: Vague term ("often") means different things to different people.
    Improved: How often have you used the app in the last 30 days? (0, 1-2, 3-5, 6-10, 11+)
  • Weak: Why did you choose us? (Price, Quality, Other)
    Problem: Options are too broad and not exhaustive; hard to act on.
    Improved: What was the primary reason you chose us? (Choose one: specific feature, lower total cost, faster delivery, recommendation, contract requirement, previous experience, other)
  • Weak: Did our training help you?
    Problem: Yes/no hides degree and what to improve.
    Improved: How helpful was the training for your day-to-day work? (Not at all helpful to Extremely helpful) / What should we change to make it more helpful? (Open-ended)
  • Weak: How satisfied are you with the website?
    Problem: Too general; unclear what "website" means.
    Improved: How easy was it to find the information you needed today? (0-10) / What were you trying to find? (Open-ended)
