If you do not ask the right questions, you do not get the right answers.
– Edward Hodnett, 20th century poet and writer
At Evaluation 2014, the American Evaluation Association’s annual conference, the incredible Kim Firth Leonard (find her over at the actionable data blog) and I facilitated a 90-minute skill-building workshop on survey question design. Kim and I have co-authored several posts on our shared passion for survey design. You can find these posts here, here, and here. We were thrilled to geek out with such a great group, ready to humor us by taking our pop quiz, listening intently as we shared the science behind answering questions about behavior, nodding as we reviewed fundamental principles of survey design, and genuinely engaging with us in exploring better ways to approach surveys.
As we shared during the workshop, we have both been in the business of collecting data long enough to be pretty annoyed by poorly written surveys, and we see a real need for better, more specific advice on question design than we’ve been able to find to date. Our main goal has always been to learn together (and with anyone else interested in joining in the fun) to improve our own survey design, and to encourage others to take survey design more seriously. And we’re so grateful for the many attendees who were willing to learn with us in Denver in October – thank you! For those who did attend, or who didn’t but wish they were there (we wish you were there too), here are our slides:
Slides: It’s All in How You Ask: The Nuances of Survey Question Design – presentation at Evaluation 2014, Denver, CO
Favorite tips generated during the session:
Our participants offered a wealth of tips and advice for survey researchers. We don’t necessarily agree with all the tips that were shared, but we think most of them offer food for thought.
- We received quite a few tips related to planning surveys:
- Be sure to identify and clearly outline your needs as well as how you intend to use the results
- Start with your data quality objectives (what you want to report on); then you know you have what you need without extras.
- Identify what you “need to know” and what would be “nice to know” in order to prioritize what questions to include, and limit burden on respondents.
- Test the survey instrument with the client (e.g., a donor): walk through each question and ask, if the results of this question are positive, what will you do with that information? If negative, what then? What will be helpful?
- Some tips related to how to proofread or even roughly validate your survey questions:
- Even if you’re using a mail/web survey, read it out loud – you may hear things you don’t notice when reading.
- Work closely with subject matter experts to ensure the language (terminology, tone, etc.) is correct.
- Cognitive interviews/ think aloud strategies are especially helpful when surveying communities of which you are not a member (e.g. kids think about things in different ways, and your questions might not mean the same thing to them as to you).
- In some situations it may make sense to start with open-ended questions with a small group of future respondents (or those in like positions) and use their responses to generate the response options, rather than starting from your own assumptions.
- And many were even more specific!
- For questions with scalar response options, between 4 and 7 is usually an appropriate total number of response options.
- Remember that neutral is different from no opinion. Both options could even be used together in the same question.
- Limit questions asking what the respondent thinks about OTHERS’ motivations (e.g., why did they do x) to those situations where this is truly appropriate.
- Be mindful of the AGE of respondents. Seniors will interpret words/tone very differently from those in their 20s.
- Beware of “yes/no” questions: they won’t be as useful if you intend to look for change, and they’re often too binary; life is often more complex than yes/no.
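For readers who build surveys programmatically, two of the tips above (keep scalar scales to 4–7 points; treat “no opinion” as distinct from the neutral midpoint) can be sketched in code. This is just an illustrative sketch; all the names here are hypothetical, not from the workshop or any survey library:

```python
# A minimal sketch of a scalar survey question that enforces a 4-7 point
# scale and offers "No opinion" as a separate option outside the scale,
# so it isn't confused with the neutral midpoint.
from dataclasses import dataclass


@dataclass
class ScalarQuestion:
    text: str
    scale: list  # ordered response options, e.g. a Likert scale
    allow_no_opinion: bool = False  # offered in addition to the scale

    def __post_init__(self):
        # Tip from the session: 4-7 scale points is usually appropriate.
        if not 4 <= len(self.scale) <= 7:
            raise ValueError("scalar questions usually work best with 4-7 options")

    def options(self):
        # "No opinion" is appended after the scale, not substituted for
        # the neutral midpoint: the two can coexist in one question.
        return self.scale + (["No opinion"] if self.allow_no_opinion else [])


q = ScalarQuestion(
    text="The workshop met my expectations.",
    scale=["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"],
    allow_no_opinion=True,
)
print(q.options())
```

Here a respondent can pick the neutral midpoint or opt out entirely with “No opinion”, and a two-option yes/no scale would be rejected by the length check.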
Judging by the session turn-out and follow-up emails we’ve received, the conversation we started is well worth continuing! One way we hope to do so is through continued crowdsourcing of survey design tips and challenges. To that end, we’ve started a GoogleDoc with some of our favorite tips and many of those gathered through our workshop. We’ve made this available to everyone and anyone to add to (and hopefully learn from).
Our intention is to continue to build on this, and to develop a checklist or similar tool to help support better survey design. We look forward to continuing to learn with you! Stay in touch if you’re interested in continuing to geek out with us about this.
Please add tips and include your name (and sources as appropriate). Use THIS LINK to add to the GoogleDoc (and see the live doc below)!