I’ve been working with two wonderful professional associations – The Presentation Guild* and the Data Visualization Society** – on their annual (or biennial) State of the Industry surveys, and I have 10 essential tips to share.
To start, I spoke with each group about the survey design process I developed with my book co-author Kim Leonard. The 10 tips that follow are organized into three broad phases.
1. PLANNING:
TIP #1: A team approach captures a wealth of perspectives.
Our committees comprised people who work in vastly different segments of the two industries.
- We each had different experiences to share, different understandings of our potential respondent populations, and different familiarity with the language and terminology in our fields. These differences led to deep discussions about how potential survey items might or might not work well, given the plan to administer each survey to a global audience.
- A leader/facilitator kept the group moving forward and aligned with purpose, and organized meeting times and documents.
TIP #2: Long runways make for great takeoffs.
High-quality surveys that yield rich, nuanced, useful data take time to plan.
- Our committees started the process early, held regular meetings over several months, and got these meetings on everyone’s calendar early on.
TIP #3: Organize, organize, organize!
With multiple people and perspectives, and long lists of potential survey items, things can get messy fast.
- Our committees used Google Docs for easy collaboration and access to version history.
- Headers, colors, bookmarks, and a table of contents proved invaluable as the documents grew.
- One group used a Mural Board for initial brainstorming.
2. DEVELOPING QUESTIONS:
TIP #4: Use data from previous surveys to inform revisions.
Revise questions, question stems and response options as needed based on previous data.
- Our committees examined results from previous surveys to see which questions worked well or did not work well. With this in mind, we revisited the survey purpose and discussed any new information needs.
TIP #5: Specific tensions need to be balanced.
Acknowledge, understand, and discuss how to balance the various tensions that arise such as:
The desire to establish trend data vs. the desire to strengthen or change questions. Showing trends over time is one of the most common data stories we tell. But the story won’t be rich and meaningful if the question isn’t working well or doesn’t match respondent experiences. It’s worth postponing trend data to get the question right.
For example, we used data from previous surveys – such as what was written into “other” options, or categories chosen less often – to revise questions. (See Tip #7 for an example of this.)
Identifying necessary vs. nice-to-have information, AND survey length vs. respondent burden. This is illustrated by the balance between the desire for precision in our survey language and the burden on respondents.
For example, “How much do you like our product?” is straightforward, plain language, and the response options do include “not at all.” BUT, “how much do you like…” presupposes that the respondent does like the product.
“To what degree do you like or dislike the product?” is more precise, but a bit more verbose.
In a long survey, the word count is substantial, and every word adds to the mental effort respondents need to exert (i.e., cognitive load) and contributes to survey fatigue.
The desire for granularity vs ease of analysis. Do we want respondents to answer in intervals (i.e., buckets) or ask them for a specific number?
For example, when asking for years of experience or age, you may get the most accurate data by asking for a number, BUT that makes for more complex analysis. If you use intervals, however, you could miss out on key understandings. Say you have a number of respondents with 3–10 years’ experience: people at the endpoints of that interval are at very different places in their career paths. Another consideration is comfort level in giving one’s age. Many may be comfortable identifying themselves within a range, but less so with giving an exact number.
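To make this tradeoff concrete, here is a minimal Python sketch (the bucket boundaries are hypothetical, chosen to echo the 3–10 example above): exact answers can always be rolled up into intervals later, but interval answers can never be split back apart.

```python
# Minimal sketch: exact numeric answers can be bucketed after collection,
# but bucketed answers cannot be disaggregated. Boundaries are hypothetical.
def to_interval(years):
    """Map an exact years-of-experience answer to a reporting bucket."""
    if years < 3:
        return "0-2"
    if years <= 10:
        return "3-10"
    return "11+"

exact_answers = [1, 4, 9, 10, 15]  # collected as exact numbers
buckets = [to_interval(y) for y in exact_answers]
print(buckets)  # ['0-2', '3-10', '3-10', '3-10', '11+']
# Note: the respondents with 4 and 10 years collapse into the same "3-10"
# bucket, even though they are at very different career stages.
```

Collecting the number and bucketing at analysis time preserves both options – at the cost of the comfort and burden concerns noted above.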
TIP #6: Understand exactly what you are measuring AND what you may be missing.
For example, if you ask people to identify their top three challenges working in their field, you’re missing the degree to which each is a challenge. You’re missing a scenario in which their top two may be intense challenges, while the third is much less of a challenge. You’re missing why something is a challenge, and the respondent’s particular relationship to that challenge.
Knowing what you’re capturing and what you’re missing can highlight the need for a different question, for additional questions, or even the need to add interviews or focus groups to your data collection strategy.
TIP #7: Document decision rules as you go.
We made a number of decisions about how to revise existing survey items to more closely match respondents’ experiences.
For example, for an item with a long list of response options, we decided to drop the three options with the fewest responses, and add in any options written in the “other” category that exceeded the number of responses the lowest option got. We also could have dropped just the lowest one, or made sure we dropped the same number as we added. The point is to be able to remember the decision rules for future iterations of the survey. Imagine this survey item and results:
Q14. What are your favorite ice cream flavors? (Check all that apply)
- Chocolate (n=274)
- Vanilla (n=262)
- Strawberry (n=171)
- Pistachio (n=128)
- Rocky Road (n=126)
- Peanut Butter (n=92)
- Butter Pecan (n=87)
- Banana (n=41)
- Mint Chocolate Chip (n=39)
Written into the “Other” category:
- Coffee (n=59)
- Cookie dough (n=17)
- Butterscotch (n=12)
We would drop Mint Chocolate Chip, Banana, and Butter Pecan, and add in Coffee because it garnered more responses than our bottom one – Mint Chocolate Chip.
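A decision rule like this is mechanical enough to capture in code, which is itself a form of documentation. Here is an illustrative Python sketch using the example counts above (the function name and data structures are mine, and the split between listed options and “other” write-ins follows the conclusion in the text):

```python
# Response counts from the example item. "listed" options appeared on the
# survey; "write_ins" came from the "other" category (hypothetical split,
# inferred from the worked example above).
listed = {
    "Chocolate": 274, "Vanilla": 262, "Strawberry": 171, "Pistachio": 128,
    "Rocky Road": 126, "Peanut Butter": 92, "Butter Pecan": 87,
    "Banana": 41, "Mint Chocolate Chip": 39,
}
write_ins = {"Coffee": 59, "Cookie dough": 17, "Butterscotch": 12}

def revise_options(listed, write_ins, n_drop=3):
    """Drop the n_drop least-chosen listed options; promote any write-in
    that exceeded the count of the least-chosen listed option."""
    ranked = sorted(listed, key=listed.get, reverse=True)
    dropped = ranked[-n_drop:]          # the n_drop least-chosen options
    cutoff = listed[ranked[-1]]         # count of the least-chosen option
    kept = ranked[:-n_drop]
    promoted = [o for o, n in write_ins.items() if n > cutoff]
    return kept + promoted, dropped

new_options, dropped = revise_options(listed, write_ins)
print(dropped)      # ['Butter Pecan', 'Banana', 'Mint Chocolate Chip']
print(new_options)  # Coffee (59 > 39) is promoted; the other write-ins are not
```

Whether you encode the rule or just write it in a memo, the point stands: the next committee should not have to reverse-engineer why Coffee made the cut and Cookie dough did not.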
TIP #8: Pretesting is a must do.
Get feedback before launching the survey.
- Our committees shared a survey draft among our members for a careful proofread, and asked a few additional members of the association to take the survey and offer feedback.
3. LAUNCHING AND ANALYZING:
TIP #9: Create a marketing and communication plan for reaching respondents.
You want a good response rate for your survey. You want to have enough data for findings to be relevant and useful. Reaching respondents, engaging them, and encouraging them to complete the survey is key.
- Our committees created detailed plans for reaching potential respondents through social media channels, emails, and newsletters, and coordinated with Marketing and Communications team members to design and schedule content.
- We shared results or reports from previous surveys as a nudge for respondents to complete the current survey.
TIP #10: Plan with analysis and use in mind.
A seemingly straightforward survey item can become a nightmare for a data analyst. Question formatting and the way data is exported from a platform can influence how much reformatting an analyst might have to do.
- Our committees each had a strong data analyst on the team who could articulate what kind of analysis would be needed for each item. This person was familiar with how data is exported from the platform we used and could explain how specific question formatting would influence their approach to analysis. They were also able to share non-response analyses from previous surveys that helped us decide whether to delete or revise certain items.
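As one illustration of the reformatting this tip anticipates: many survey platforms export a “check all that apply” item as a single delimited cell per respondent, which the analyst must split apart before anything can be counted. The delimiter, column names, and data below are hypothetical, not from any specific platform:

```python
import csv
from io import StringIO

# Hypothetical export: one semicolon-delimited cell per respondent for a
# check-all-that-apply item. Real platforms vary; check your own export.
raw_export = StringIO(
    "respondent,Q14\n"
    "r1,Chocolate;Vanilla\n"
    "r2,Chocolate\n"
    "r3,Vanilla;Strawberry\n"
)

# Split each cell and tally one count per selected option.
counts = {}
for row in csv.DictReader(raw_export):
    for choice in row["Q14"].split(";"):
        counts[choice] = counts.get(choice, 0) + 1

print(counts)  # {'Chocolate': 2, 'Vanilla': 2, 'Strawberry': 1}
```

Knowing in advance whether the platform exports this item as one delimited column or as one indicator column per option changes how much of this glue code the analyst has to write – which is exactly why that person belongs in the planning meetings.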
If you work in any aspect of presentations or data visualization and would be willing to share your perspectives by completing a State of the Industry survey, please watch for information and social media outreach from the Presentation Guild or the Data Visualization Society about when their surveys are open.
*I am currently serving as Director of Education, Training and Advocacy at the Presentation Guild.
**I am currently serving as a member of the Survey Committee at the Data Visualization Society.