Have you ever heard that all of your survey questions should use the same rating scale to make it easier for people to respond? 

Or that if you include demographic questions, you need to include all possible demographics, and always at the end (or beginning) of the survey?

Or that you can only measure one thing in a given survey? 

Or that you should avoid open-ended questions entirely?

We have too. Hey, it’s Sheila here, writing with co-author, co-facilitator, and best survey design thought partner Kim Leonard.

If any of these sound familiar, or are pieces of advice you’ve taken to heart in the past, buckle up. Let’s break down some of the worst advice we’ve heard, and talk about alternatives.

Should all survey items use the same rating scale? Ugh, no.

This advice was given in good faith, even though it can do more harm than good. In an effort to ease the burden on respondents, survey designers figured that seeing the same scale over and over again would make it easier to work through the survey.

Consider this example from the US Post Office. These questions all use the same scale, but some are almost impossible to wrap your brain around. Let’s consider their information needs and what they want to measure: 

  • Whether the package was delivered to the correct address, whether it was on time, and whether it was delivered in good condition. 
  • Whether information was timely and clearly communicated through notifications.
  • Whether knowing the time of delivery is helpful to the customer.

While an agree-disagree scale might be fine for some of these, the question of whether a package was delivered to the correct address is a simple YES/NO (the fancy term is dichotomous) question! 

[Image: Survey items in a matrix from the US Post Office. The first item asks whether the package was delivered to the correct address and provides a 6-point strongly agree to strongly disagree scale.]

Isn’t it better to have response options that actually match the question, even if that means separating questions and providing different sets of response options? The research says yes! Every time a respondent has to work harder, or reread a question to understand it, you introduce the potential for survey fatigue and inaccuracies in the data.

Now, are there circumstances where a survey with a single rating scale works or makes sense? Sure. Probably. With the right framing, and a very similar set of questions or stems that match the rating scale used. But far too often, we see survey questions contorted so wildly to make them fit that the survey becomes a much greater headache for respondents. Want your survey to be easy to complete? Make sure your questions are clear and understandable, with question stems and response options that match.

Should you include all possible demographic questions, and always at the end (or beginning) of the survey? Please don’t. 

There are many considerations when it comes to demographic questions. First, and absolutely most important, is WHY you need demographic data. This will help you figure out WHAT demographic information you need. Then you have the substantial task of determining how best to ask demographic questions. This is going to depend greatly on WHO your respondents are and WHAT information you need. You also need to ask yourself (and answer honestly), how will the information be used? Will it validate what you already know about inequality and injustice? Or do you need it so that you can better tailor your program to meet a particular group’s needs? Can you commit to using it to inform program improvement? Is the burden, and potential discomfort to respondents, worthwhile?

We like to separate demographic questions (with at least a page or section break), and often (but not always) place them at the end of the survey so that they can be skipped without loss of earlier responses. A few demographic questions can be placed at the beginning of a survey to “warm up” respondents before asking more challenging or sensitive questions (e.g., a survey about substance abuse, mental health, sexual behaviors). We also explain to respondents WHY we’re asking for the information, how it will be used, and how we will ensure confidentiality. We then reiterate our gratitude to them for sharing whatever information they are able and willing to provide.

We acknowledge that there is no perfect way to ask some demographic questions and advise consulting the most recent literature on asking about variables such as gender and sex, and also checking with members of the respondent population. 

Oh, and one more thing: while it’s not within the scope of this article to share the best ways to ask those demographic questions, one thing we’re sure of: PLEASE don’t use the term “other” as a response option (that is literally “othering” people who don’t identify with the named options). “Prefer to describe” is often one viable option.

Are you limited to measuring only one thing in a given survey? Not at all.

Depending on the length of the survey and the nature of the items, you can certainly attempt to measure more than one concept or construct. For example, a survey could ask questions about job satisfaction while also attempting to capture professional development needs. 

Can you measure everything you’ve ever thought of in a single survey? Also no. But some of the surveys we’ve run across seem to be trying!

Must you avoid open-ended questions entirely? Nope.

Should you limit use of open-ended questions? Yes, for several reasons. First, if you find yourself crafting a survey that includes more open-ended questions than closed-ended, you are likely trying to use a hammer when a screwdriver would work better. In other words, a survey may not be the right tool when what you really need is rich, descriptive responses to questions. Interviews, focus groups, observations, or other data collection strategies would be far better suited for this.

Second, how do YOU feel when you’re taking a survey and encounter one open-ended question after another? Tired? Frustrated? Wishing there were just buttons to tick? Our respondents feel this way too.

A few open-ended questions are not going to sink your response rate, and if they’re well-crafted, can result in detailed, useful insights. Did you catch the important part there? Open-ended questions require careful design just like closed-ended questions do. See the difference in this example: 

Not so great open-ended question: 

Tell us about your experiences with the teachers and staff at your child’s school.

Much better open-ended question: 

You indicated that what you like best about your child’s school is the friendly teachers and staff. Please share more about why this is what you like best about your child’s school.

This is a question Kim recently answered in a survey from her daughter’s elementary school. She was easily able to write a couple of sentences about how the staff are kind and understanding when they call in sick, and how she likes that the principal is outside greeting everyone as they arrive each morning. The survey also included a follow-up question about what she thinks could be improved, though it was set up so that she could easily skip it (or note that she couldn’t think of anything!).

But Sheila and Kim, is the answer to most of my survey design questions really “it depends”? Again? 

Yep. Context matters, and your respondents matter. Whatever makes the most sense to the folks you want to complete your survey, and nets you the best information, is going to be the right approach, and that approach will inevitably vary from survey to survey. That’s precisely why our book, workshop, and online course are called “Designing Quality Survey Questions” and not “Designing the Perfect Questions that Will Work in Any Survey.” 🙂

Find this useful and want to learn more? 

Check out our live, online course where we share loads of research-based and practical survey design advice. Participants leave with a rich understanding of how to craft survey questions that minimize respondent burden and maximize the usefulness of the resulting data. 

Pick up our book – Designing Quality Survey Questions.

Check out my other articles on survey design.

What other bad survey advice have you heard, or used?

Interested in a talk or workshop on any of the topics I offer? I’d love to chat with you.