Use the checklist below to improve your survey design. A good survey is easy to take and results in accurate and useful information. The strategies are based on best practices in the research literature. The appropriate strategies may vary depending on your target population and administration method.
Please see the “Accessible Survey Design Checklist” for additional strategies to help people with disabilities. Please see the “Demographic Data Collection Checklist” for strategies to collect the required demographic information.
General Survey Design
- Keep It Short: Keep the survey as short as possible. People will be more likely to complete it and to take future surveys.
- Explain Your Purpose: People are more likely to take the survey if they understand its importance. Create a call to action. Introduce the survey topic. Highlight the benefits of taking the survey (e.g., identify your impact, improve future events).
- Promise Anonymity or Confidentiality: People are more honest when they are anonymous. Tell people in the survey introduction that their feedback is anonymous. If you need identifying information, tell them their information will be confidential. Enable anonymity in your survey software.
- Use Clear and Familiar Language: Use simple and consistent terms that are familiar to your target population. Aim for a grade 7 to 8 reading level for the general population. Check the Flesch-Kincaid Grade Level score in Microsoft Word’s readability statistics or on the datayze website (a sketch for estimating the score yourself appears after this list). Use Google Books Ngram Viewer to compare the popularity of words over time.
- Limit Required Questions: Make nonessential questions optional to answer. Each additional required question lowers the completion rate.
- Use Headers: Group questions by topic. Use topical headers if your survey is more than one page. A logical flow makes the survey easier to follow.
- Pre-test: Ask people from your target population to test your survey. Ask whether they had trouble accessing the survey, understanding the instructions and questions, or answering the questions.
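The Flesch-Kincaid Grade Level itself is a simple published formula: 0.39 × (words ÷ sentences) + 11.8 × (syllables ÷ words) − 15.59. If you want a quick estimate outside of Word or datayze, the minimal Python sketch below applies that formula; the vowel-group syllable counter and the sample question are illustrative assumptions, not part of this checklist.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count contiguous vowel groups, dropping a trailing silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid Grade Level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    """
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

# Hypothetical survey question used only to demonstrate the check.
question = "During the past year, how often did you attend a group meeting?"
print(f"Estimated grade level: {flesch_kincaid_grade(question):.1f}")
```

A simple syllable heuristic like this can be off by a syllable or two per word, so treat the result as a ballpark figure and confirm borderline scores with Word or datayze.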
Question Design
- Limit Open-Ended Questions: Closed-ended questions with pre-identified responses are easier to answer. They are also easier to analyze. Use open-ended questions to get explanations for a previous answer, if needed, and to give people an opportunity to share additional feedback at the end of the survey.
- Avoid Matrix Questions: Matrix questions can increase response time, reduce completion rates, and skew responses. Present each question individually instead.
- Avoid Agree/Disagree Frameworks: People are biased toward agreement. Instead, ask about the magnitude (e.g., How clear was the information?).
- Align Question and Response Options: The question should imply the type of response options given. For example, the question should not imply a yes/no response (e.g., Were the materials clear?) if the response options ask about magnitude (e.g., Not clear, Somewhat clear, Very clear).
- Don’t Ask Compound Questions: Don’t ask double-barreled questions that bundle multiple questions into one (e.g., Were the speakers and materials helpful?). People might have different answers to each part of the question.
- Be Objective: Don’t introduce bias by making assumptions or using leading words (e.g., “How much did you enjoy the event?” assumes the person enjoyed it).
- Provide a Reference Period: At the beginning of the question, state the timeframe you want people to report on (if relevant). For example, “During the past year…” or “In a typical day…”
- Start with Less Difficult Questions: Ease respondents into the survey by opening with questions that are easy to answer but still engaging.
- Place Key Questions Near Beginning or Middle: Place your most important questions, including questions that collect federal performance measures, early in case people exit the survey before finishing. One exception is demographic questions, which may be sensitive (see the “Demographic Data Collection Checklist”).
- Place Overall Assessment Questions at End: Ask about overall impressions (e.g., general satisfaction) after asking about specific aspects of the overall topic. This order limits the agreement bias.
Response Option Design
- Use One-Sided Scales: One-sided scales address the presence or absence of one attribute (e.g., satisfaction). Two-sided scales that assess two opposite attributes (e.g., satisfaction and dissatisfaction) can be confusing and skew responses.
- Use 3 to 5 Response Options: Do not use two response options, like yes and no, unless those are the only possible responses (e.g., Did you have an annual checkup this year?) or people with severe cognitive impairments are taking your survey. Use 3 to 5 response options to capture people who are in the middle.
- Use Clear Labels for Each Response Option: Every response option should have a label, not just the endpoints. Labels should be words, not numbers.
- Have Comprehensive Response Options: Include all key possible responses. People answer questions more quickly, provide more accurate information, and are less frustrated if they see a response option that matches their answer.
- Use Mutually Exclusive Response Options: Eliminate overlap between your response options. People answer questions more quickly, provide more accurate information, and are less likely to become frustrated if they can distinguish the response options.
- Don’t Offer a “Neutral” Response Option: Neutral options can be interpreted differently (e.g., don’t care, don’t know, not applicable, none of the above). Neutral options also let people avoid forming and/or sharing an opinion.
- Order Response Options from Negative to Positive and Low to High: These orders are intuitive. They also reduce the bias to choose the first acceptable response or the response that is most socially desirable.
- Display Response Options Vertically: Vertical layouts reduce response time. They may also increase the tendency of people with cognitive impairments to select the first response option. However, this bias can be mitigated by ordering response options from negative to positive and low to high, per the previous strategy.
Funding for this product was supported, in part, by the Virginia Board for People with Disabilities, under grant number 2101VASCDD-00, from the U.S. Administration for Community Living (ACL), Department of Health and Human Services, Washington, D.C. 20201. Grantees undertaking projects with government sponsorship are encouraged to express freely their findings and conclusions. Points of view or opinions do not, therefore, necessarily represent official ACL policy.