Yep, it’s another great co-post with the splendid Kim Firth Leonard, of the actionable data blog.
Almost everyone (probably everyone, actually) who has written a survey has discovered something they wish they had done differently after the survey has already launched, or even closed, with data already in hand. This is one of the many ways in which surveys are just like any written work: the moment you’ve submitted it, you inevitably spot a typo, a missing word, or some other mistake, no matter how many editing rounds you undertook. Often it’s a small but important error: forgetting a bit of the instructions or an important but not obvious answer option. Sometimes it’s something you know you should have anticipated (e.g., jargon you could easily have avoided), and sometimes it’s not (e.g., an interpretation issue that wasn’t caught in piloting – you DID pilot the survey, didn’t you?).
When Kim Firth Leonard of the actionable data blog and I write together, we usually refer to each other with a superlative – fabulous, magnificent, wonderful, etc. (all totally accurate, of course) – but now, I’m even prouder to call her co-author! Yes, we are in the throes of writing a book on survey design!*
After a very successful presentation to a packed room at Evaluation 2014 in Denver, CO (if you were there, thanks!), we met with an editor at Sage Publications to pitch our idea. Now we’re busy fleshing out chapters and excited to share bits with our readers along the way.
The foundation of our collaborative work lies here: “how evaluators ask a question can dramatically influence the answers they receive” (Schwarz & Oyserman, 2001, p. 128).
The art and science of asking questions is the source of all knowledge. – Thomas Berger
Hey readers, Sheila here, writing once again with the marvelous Kim Firth Leonard, of the actionable data blog.
It’s survey design season, so get ready to flex those question design muscles! Well, to be truthful, it’s always survey design season in our data-saturated, evidence-hungry society. As surveys have become ubiquitous, it is incumbent upon survey researchers to cut through all the noise by developing the most effective instruments we can. And what’s the best way to get ready for any endeavor that requires flexing? A warm-up! Just as failure to warm up for physical activity can invite injury, diving into survey question design without a preparation process can introduce the possibility of gathering bad data. Remember the principle of GIGO (garbage in, garbage out)?
If you do not ask the right questions, you do not get the right answers.
– Edward Hodnett, 20th century poet and writer
At Evaluation 2014, the American Evaluation Association’s annual conference, the incredible Kim Firth Leonard (find her over at actionable data blog) and I facilitated a 90-minute skill building workshop on survey question design. Kim and I have co-authored several posts on our shared passion for survey design. You can find these posts here, here, and here. We were thrilled to geek out with such a great group ready to humor us by taking our pop quiz, listening intently as we shared the science behind answering questions about behavior, nodding as we reviewed fundamental principles of survey design, and genuinely engaging with us in exploring better ways to approach surveys.
Who hasn’t answered the question, “What did you learn?” after attending a professional development session? As a PD facilitator and evaluator, I’ve certainly used feedback forms with this very question. After all, measuring participant learning is fundamental to PD evaluation.
In this post, I’ll share examples of actual data from PD evaluation in which we asked the direct question, “What did you learn?” I’ll then explain why this is a difficult question for PD participants to answer, resulting in unhelpful data. Next, I’ll offer a potential solution in the form of a different set of questions for PD evaluators to use in exploring the construct of participant learning. Finally, I’ll show where participant learning fits into the bigger picture of PD evaluation.
Sheila here, writing with the wonderful Kim Firth Leonard of the actionable data blog.
This post highlights some favorite recommendations from our collective experiences in crafting survey questions. It is also a continuation of our earlier co-authored posts (here and here).
As evidenced by recent posts co-authored with fellow blogger Kim Firth Leonard of actionable data (read them here and here), I’m fascinated with surveys and survey research. Just last week another fellow blogger, Brian Hoessler of Strong Roots Consulting, offered this post on open-ended questions.
I shared with Brian that I recently saw a needs assessment instrument composed entirely of open-ended questions – maybe a dozen or so questions in all. Whenever I encounter surveys built on open-ended questions, I wonder whether the qualitative data collected are indeed systematically analyzed and not just scanned or read through, especially in the case of very brief responses. If the data are analyzed, I wonder what kinds of coding strategies evaluators use – inductive or deductive? Constant comparison? Grounded theory?
Sheila here, writing with the magnificent Kim Firth Leonard of the actionable data blog.
Since agreeing that we would co-author a series of blog posts on surveys with a focus on composing good questions, we have discovered countless other blog posts, websites, journal articles, and books on survey research from a variety of fields and perspectives, many of which feature discussions of and advice on question construction. Of course, we have a few personal favorites and well dog-eared texts:
Now, THAT’S a very good question!
Sheila here, writing with the fabulous Kim Firth Leonard of the actionable data blog.
We have been tweeting and emailing about challenges related to survey design, and as a result, this is the first in a series of posts we have co-authored about our discussions.