Asked and Answered: Demographic Survey Questions

This blog post is part of a series called Asked and Answered, about writing great survey questions and visualizing the results with high impact graphs. Dr. Sheila B. Robinson is authoring the Asked series, on writing great questions. Dr. Stephanie Evergreen is authoring the Answered series, on data visualization. View the Answered counterpart to this post on Dr. Evergreen’s website.


When you think of demographic survey questions, does your mind automatically turn to age, gender, or race? Of course! These are some of the most common demographics we collect with survey questions. But there are many, many more demographic variables.

In Designing Quality Survey Questions, my co-author Kimberly F. Leonard and I share a pretty extensive list of 23 demographic variables we found in another text:

Alreck and Settle (2004) offer a list of what can potentially be measured with demographic questions.

(more…)

Asked and Answered: Ranking Survey Questions



Ranking survey questions are a subset of multiple choice questions that allow multiple responses, but with the twist of requiring respondents to place their choices in a defined order. There are many varieties of ranking questions, some of which include:

(more…)

A Design Thinking Perspective on Surveys: My interview with Dr. James Pann

 

Designing Quality Survey Questions (the book, the workshop, and the approach) intentionally starts with the word "design." We can craft surveys, develop surveys, write surveys, etc., but when we take a design approach to them, the mindset is a little bit different. My co-author, Kimberly F. Leonard, and I devote an entire chapter of our book to sharing how a human-centered design thinking process underpins our approach to survey design.

Recently, James Pann, Ph.D., evaluation consultant and Associate Professor at Nova Southeastern University, acquired a copy of the book and told me in an email, “Really like it, very practical and different from other survey books available. For example, the incorporation of the design thinking approach.”

James wanted to chat with me about that aspect of survey design (along with a few others) to share with his students in a program evaluation course.

Here is some of what we cover in the interview:

  • how specific phases of a human-centered design thinking approach (we often refer to it as a respondent-centered approach) inform survey development and can aid in yielding useful, meaningful data.
  • how committing to your research or evaluation questions AND a written purpose statement for your survey can help keep you on track.
  • the importance of prototyping or pretesting a survey with respondents before administering it to the whole population.
  • a cautionary tale about developing surveys too quickly and without enough thought about who the respondents are.
  • how easy it is to make simple mistakes that introduce bias into your surveys.
  • what is important to include in a survey invitation to connect with and appeal to respondents.
  • how to know whether you should develop your own survey or find one already out there.
  • when a survey is not the right tool for your research or evaluation question.
  • one of my favorite survey stories about how empathy resulted in a 100% response rate for a survey.
  • a few tips on choosing survey software.
  • a few words on demographics… a HUGE topic in survey design.

 

Enjoy!

Asked and Answered: Check All That Apply Survey Questions



Let’s talk about check all that apply survey questions. I’ll admit up front, they’re not my favorite.

I love ice cream. As a respondent, the survey question below would be really easy for me to answer. I’d simply check them all. (It’s not the same with vegetables. I still can’t stand asparagus.)

(more…)

Asked and Answered: Rating Scale Survey Questions



Rating scale questions are ubiquitous. I can hardly imagine a survey without at least one. They are typically posed as multiple choice (one answer only) questions composed of a question stem and a set of response options. Like this:

(more…)

New Year, New Newsletter: What is Professional Learning?

Happy New Year! Each year, just like many of you, I make… and usually break… the same resolutions, with the exception of one: I learn.

In 2018, I learned how to create and launch my new website. That year, I also learned more about educational equity and culturally responsive education, communication, and leadership. In 2019, I studied negotiation skills, learned more about the science of learning, and added to my Excel, PowerPoint, and data visualization skills. All of this “professional learning” informs my work on various projects and helps improve my professional practice. 

To learn all of this, here’s what I did (along with a few example favorites):

(more…)

Hindsight is 20/20, even with surveys (cross post with actionable data blog)

Yep, it’s another great co-post with the splendid Kim Firth Leonard, of the actionable data blog.

Almost everyone (probably everyone, actually) who has written a survey has discovered something they wish they had done differently after the survey had already launched, or closed, with data already in hand. This is one of the many ways in which surveys are just like any written work: the moment you’ve submitted it, you inevitably spot a typo, a missing word, or some other mistake, no matter how many editing rounds you undertook. Often it’s a small but important error: forgetting a bit of the instructions or an important but not obvious answer option. Sometimes it’s something you know you should have anticipated (e.g. jargon you could have easily avoided using), and sometimes it’s not (e.g. an interpretation issue that wasn’t caught in piloting – you DID pilot the survey, didn’t you?). (more…)

The CASM that Bridged a Chasm: When Cognitive Science Met Survey Methodology and Fell in Love! (cross post with actionable data blog)

When Kim Firth Leonard of the actionable data blog and I write together, we usually refer to each other with a superlative – fabulous, magnificent, wonderful, etc. (all totally accurate, of course) – but now, I’m even prouder to call her co-author! Yes, we are in the throes of writing a book on survey design!*

After a very successful presentation to a packed room at Evaluation 2014 in Denver, CO (if you were there, thanks!), we met with an editor at Sage Publications to pitch our idea, and now we’re busy fleshing out chapters and excited to share bits with our readers along the way.

The foundation of our collaborative work lies here: “how evaluators ask a question can dramatically influence the answers they receive” (Schwarz & Oyserman, 2001, p. 128).  (more…)

Designing Effective Surveys Begins with the Questions BEFORE the Questions! (cross post with actionable data blog)

The art and science of asking questions is the source of all knowledge. – Thomas Berger

Hey readers, Sheila here, writing once again with the marvelous Kim Firth Leonard, of the actionable data blog.

It’s survey design season, so get ready to flex those question design muscles! Well, to be truthful, it’s always survey design season in our data-saturated, evidence-hungry society. As surveys have become ubiquitous, it is incumbent upon survey researchers to cut through all the noise by developing the most effective instruments we can. And what’s the best way to get ready for any endeavor that requires flexing? A warm-up! Just as failure to warm up for physical activity can invite injury, diving into survey question design without a preparation process can introduce the possibility of gathering bad data. Remember the principle of GIGO. (more…)

It’s All in How You Ask: The Nuances of Survey Question Design (cross post with actionable data blog)

If you do not ask the right questions, you do not get the right answers. – Edward Hodnett, 20th-century poet and writer

At Evaluation 2014, the American Evaluation Association’s annual conference, the incredible Kim Firth Leonard (find her over at actionable data blog) and I facilitated a 90-minute skill building workshop on survey question design. Kim and I have co-authored several posts on our shared passion for survey design. You can find these posts here, here, and here. We were thrilled to geek out with such a great group ready to humor us by taking our pop quiz, listening intently as we shared the science behind answering questions about behavior, nodding as we reviewed fundamental principles of survey design, and genuinely engaging with us in exploring better ways to approach surveys.  (more…)

When a Direct Question is NOT the Right Question

Who hasn’t answered the question, “What did you learn?” after attending a professional development session? As a PD facilitator and evaluator, I’ve certainly used feedback forms with this very question. After all, measuring participant learning is fundamental to PD evaluation.

In this post, I’ll share examples of actual data from PD evaluation in which we asked the direct question, “What did you learn?” I’ll then explain why this is a difficult question for PD participants to answer, resulting in unhelpful data. Next, I’ll offer a potential solution in the form of a different set of questions for PD evaluators to use in exploring the construct of participant learning. Finally, I’ll show where participant learning fits into the bigger picture of PD evaluation.  (more…)

Where have all the (qualitative) data gone?

As evidenced by recent posts co-authored with fellow blogger Kim Firth Leonard of actionable data (read them here and here), I’m fascinated with surveys and survey research. Just last week another fellow blogger, Brian Hoessler, of Strong Roots Consulting offered this post on open-ended questions.

I shared with Brian that I recently saw a needs assessment instrument composed of all open-ended questions – maybe a dozen or so questions in all. I always wonder when I encounter surveys with open-ended questions whether the qualitative data collected are indeed systematically analyzed and not just scanned or read through, especially in the case of very brief responses to open-ended questions. If data are analyzed, I wonder what kinds of coding strategies evaluators use – inductive or deductive? Constant comparison? Grounded theory?  (more…)

A Roundup of Survey Design Resources (cross-post with actionable data)

Sheila here, writing with the magnificent Kim Firth Leonard of the actionable data blog.

Since agreeing that we would co-author a series of blog posts on surveys with a focus on composing good questions, we have discovered countless other blog posts, websites, journal articles, and books on survey research from a variety of fields and perspectives, many of which feature discussions of and advice on question construction. Of course, we have a few personal favorites and well dog-eared texts:  (more…)