Do you put out an annual survey at your organization? I work with associations and other organizations that do annual surveys (or biennial, quarterly, or other regularly scheduled surveys) that collect data on:

  • salary
  • working conditions
  • state of the industry
  • membership
  • job satisfaction
  • professional learning needs
  • organizational climate / culture

…and other topics.

Some organizations continue using the same survey year after year even though some survey items (questions) aren’t performing well – that is, the questions aren’t producing meaningful, relevant, useful data OR the questions are skipped so frequently by respondents that they produce NO data. Why do they do this?

It’s time to update your annual survey! 


It’s the old “sharpen the saw” story (HT to Stephen Covey and The 7 Habits of Highly Effective People). Do you want to keep using a dull saw knowing it will take you longer to get the job done, or are you willing to interrupt your work to sharpen the saw even if you “lose” some time in doing so? The point is that an investment of time now will save you time in the long run AND produce a better result. The same is true of annual surveys.

Let’s take a look at five things to consider when reviewing (and potentially updating) your annual survey.  

1.) Determine which questions are performing well.

As you review each survey item, reflect on these questions: Was the data this question produced informative? Did you use the results to inform any decisions, or to update or create any systems, policies, or practices? If the answer to these questions is YES, then consider keeping the question. If you use this question to look at multi-year trends, don’t change a thing. Once you change a word, phrase, or response option, it can influence responses, making it impossible to compare results with earlier surveys.
 

2.) Determine which questions are NOT performing well. 

In addition to the above questions, consider these: Did a substantial number of respondents skip this question? Did a substantial number appear to interpret the question differently from what you had intended? If the answer is YES, consider revising or eliminating these questions. Sure, it means starting fresh in measuring what you set out to measure, but there’s no point in continuing to use a question that doesn’t perform well just for the historical trends (which won’t be helpful to you!).
 

Given the question “How often do you read a magazine?” respondents may wonder if the researcher is looking for occasions when they have glanced at a magazine, read an article or two, or read a magazine cover to cover. Any of these could reasonably fall within an individual’s interpretation of “read a magazine.” Well-designed questions are clear, leave little room for interpretation, and do not put respondents in a position of having to guess the researcher’s intentions (Robinson & Leonard, 2019, p. 42).
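One quick way to spot under-performing items is to compute the skip rate for each question. Here’s a minimal sketch in Python using entirely hypothetical data (the question IDs, answers, and threshold are illustrative, not from any real survey); what counts as a “substantial” skip rate is still your call.

```python
# Hypothetical survey data: each response maps question IDs to answers,
# with None standing in for a skipped question.
responses = [
    {"q1": "Agree",    "q2": None,      "q3": "Yes"},
    {"q1": "Disagree", "q2": None,      "q3": "No"},
    {"q1": "Agree",    "q2": "Neutral", "q3": None},
    {"q1": None,       "q2": None,      "q3": "Yes"},
]

def skip_rates(responses):
    """Return the fraction of respondents who skipped each question."""
    questions = responses[0].keys()
    return {
        q: sum(r[q] is None for r in responses) / len(responses)
        for q in questions
    }

# Flag items whose skip rate crosses whatever threshold you judge "substantial".
THRESHOLD = 0.5
flagged = [q for q, rate in skip_rates(responses).items() if rate >= THRESHOLD]
print(flagged)  # here, q2 was skipped by 3 of 4 respondents
```

Flagged items are candidates for revision or removal, per the guidance above.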

3.) Explore questions with “other” as a response option.

If a substantial number of respondents checked “other” on a survey item in which you offered that option, it’s worth considering whether an update is needed. What’s a “substantial number”? That’s up to you to determine, based on your number of respondents and your information needs. If you have a write-in option for the “other” category, did a substantial number write in the same response? If so, you may want to add this response to the list of options for this item. But use caution here. Too many response options increase the level of cognitive effort, and too many survey items requiring a great deal of cognitive effort can result in respondents quitting the survey. Balance the need for an exhaustive list of response options with the level of granularity you need for the survey. There are numerous makes and models of cars out there, but if you really just need to know who’s driving a Honda, Toyota, or Ford vehicle, then it’s OK to have a substantial number of “other” responses.
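Tallying the write-ins makes this check straightforward. A minimal sketch, again with hypothetical data – the write-in values and cutoff are made up for illustration:

```python
from collections import Counter

# Hypothetical write-in answers collected under an "other" option.
other_writeins = [
    "Subaru", "subaru ", "Kia", "Subaru", "Tesla", "SUBARU", "Kia",
]

# Normalize case and whitespace so variants of the same answer group together.
counts = Counter(w.strip().lower() for w in other_writeins)

# Candidates to promote to named options, given a cutoff you judge "substantial".
CUTOFF = 3
candidates = [resp for resp, n in counts.most_common() if n >= CUTOFF]
print(candidates)  # only "subaru" recurs often enough once normalized
```

The normalization step matters: without it, “Subaru”, “subaru ”, and “SUBARU” would be counted as three different answers.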

4.) Explore questions with an odd number of scale points.

Do you have questions with a 5- or 7-point scale? Most surveys do! Check the responses to see if a substantial number (yes, again, you’ll need to determine what is “substantial”) of them cluster at the midpoint. Consider whether to update these items with an even number of scale points – a “forced choice” situation. If you’re asking about attitudes, or asking people to rate something, is it likely that nearly everyone will feel some way – positive or negative – about the topic? If so, eliminate that midpoint option and make respondents commit to a more positive or more negative stance. Remember to offer an even balance of positive and negative response options. Here are a couple of examples of balanced response options with an even number:

Strongly Disagree – Disagree – Agree – Strongly Agree

Terrible – Poor – Fair – Good – Very Good – Excellent

Now, we can argue all day long about whether “fair” is more negative or more positive. The bottom line is that the only way to be sure is to test the survey with a sample of potential respondents. In the meantime, if you want an interesting take on how people in one survey assessed words such as “bad,” “good,” and other adjectives, check out YouGov’s How Good is Good?
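The midpoint-clustering check is easy to automate. A minimal sketch with hypothetical ratings on a 5-point scale (the data and the “two-thirds” result are illustrative only):

```python
# Hypothetical responses on a 5-point scale:
# 1 = Strongly Disagree ... 3 = midpoint ... 5 = Strongly Agree
ratings = [3, 3, 2, 3, 4, 3, 3, 1, 3, 5, 3, 3]

MIDPOINT = 3
midpoint_share = ratings.count(MIDPOINT) / len(ratings)
print(round(midpoint_share, 2))  # 0.67 -- two-thirds of respondents chose the midpoint
```

If the share is high (by whatever standard you set as “substantial”), that item is a candidate for an even-numbered, forced-choice scale.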

5.) Revisit your survey purpose to ensure all questions align with it.

You simply MUST articulate a clear survey purpose before designing or administering any survey.

…identifying and articulating the purpose for conducting a given survey may be the single most important piece of design advice we can offer (Robinson & Leonard, 2019, p. 21). 

BONUS: While you’re at it, make sure you have other provisions in place to mitigate survey fatigue. You’re welcome. 😀
 
Why let another year go by before giving your annual survey a make-over? 
 

Check out my other articles on survey design and, of course, the book, Designing Quality Survey Questions.
