Yep, it’s another great co-post with the splendid Kim Firth Leonard, of the actionable data blog.
Almost everyone (probably everyone, actually) who has written a survey has discovered something they wish they had done differently after the survey launched, or closed, with data already in hand. This is one of the many ways in which surveys are just like any written work: the moment you’ve submitted it, you inevitably spot a typo, a missing word, or some other mistake, no matter how many rounds of editing you undertook. Often it’s a small but important error: omitting part of the instructions or an important but non-obvious answer option. Sometimes it’s something you know you should have anticipated (e.g., jargon you could easily have avoided), and sometimes it’s not (e.g., an interpretation issue that wasn’t caught in piloting – you DID pilot the survey, didn’t you?).
The art and science of asking questions is the source of all knowledge. – Thomas Berger
Hey readers, Sheila here, writing once again with the marvelous Kim Firth Leonard, of the actionable data blog.
It’s survey design season, so get ready to flex those question design muscles! Well, to be truthful, it’s always survey design season in our data-saturated, evidence-hungry society. As surveys have become ubiquitous, it is incumbent upon survey researchers to cut through all the noise by developing the most effective instruments we can. And what’s the best way to get ready for any endeavor that requires flexing? A warm-up! Just as failing to warm up for physical activity can invite injury, diving into survey question design without a preparation process can lead to gathering bad data. Remember the principle of GIGO (garbage in, garbage out)?
If you do not ask the right questions, you do not get the right answers.
– Edward Hodnett, 20th century poet and writer
At Evaluation 2014, the American Evaluation Association’s annual conference, the incredible Kim Firth Leonard (find her over at actionable data blog) and I facilitated a 90-minute skill building workshop on survey question design. Kim and I have co-authored several posts on our shared passion for survey design. You can find these posts here, here, and here. We were thrilled to geek out with such a great group ready to humor us by taking our pop quiz, listening intently as we shared the science behind answering questions about behavior, nodding as we reviewed fundamental principles of survey design, and genuinely engaging with us in exploring better ways to approach surveys.
Who hasn’t answered the question, “What did you learn?” after attending a professional development session? As a PD facilitator and evaluator, I’ve certainly used feedback forms with this very question. After all, measuring participant learning is fundamental to PD evaluation.
In this post, I’ll share examples of actual data from PD evaluation in which we asked the direct question, “What did you learn?” I’ll then explain why this is a difficult question for PD participants to answer, resulting in unhelpful data. Next, I’ll offer a potential solution in the form of a different set of questions for PD evaluators to use in exploring the construct of participant learning. Finally, I’ll show where participant learning fits into the bigger picture of PD evaluation.
Sheila here, writing with the wonderful Kim Firth Leonard of the actionable data blog.
This post highlights some favorite recommendations from our collective experiences in crafting survey questions. It is also a continuation of our earlier co-authored posts (here and here).
As is evidenced in recent posts co-authored with fellow blogger Kim Firth Leonard of actionable data (read them here and here), I’m fascinated with surveys and survey research. Just last week another fellow blogger, Brian Hoessler, of Strong Roots Consulting offered this post on open-ended questions.
I shared with Brian that I recently saw a needs assessment instrument composed of all open-ended questions – maybe a dozen or so questions in all. I always wonder when I encounter surveys with open-ended questions whether the qualitative data collected is indeed systematically analyzed and not just scanned or read through, especially in the case of very brief responses to open-ended questions. If data are analyzed, I wonder what kinds of coding strategies evaluators use – inductive or deductive? Constant comparison? Grounded theory?
Sheila here, writing with the magnificent Kim Firth Leonard of the actionable data blog.
Since agreeing that we would co-author a series of blog posts on surveys with a focus on composing good questions, we have discovered countless other blog posts, websites, journal articles, and books on survey research from a variety of fields and perspectives, many of which feature discussions of and advice on question construction. Of course, we have a few personal favorites and well-worn, dog-eared texts:
Now, THAT’S a very good question!
Sheila here, writing with the fabulous Kim Firth Leonard of the actionable data blog.
We have been tweeting and emailing about challenges related to survey design and, as a result, this is the first in a series of posts we have co-authored about our discussions.
I got to thinking about what I love about evaluation and it occurred to me that at least one aspect of it is particularly appealing due to its remarkable resemblance to shopping. Yup, that’s what it is. What I love about evaluation is collecting data!
The same rush of excitement I would get during a day at the mall I now get checking SurveyMonkey for incoming responses. It’s that “thrill of the hunt”, the acquisition, the addition to the collection that ignites the passion. As I become less materialistic and more fiscally responsible (as my golden years approach), I find myself engaging more in the latter (data collection) than the former (shopping).
It happens even prior to data collection. Have you been to the Mall of America? Just as I plan out my day there – thinking about what I am shopping for, how much I really need vs. how much I want, and which stores I will visit – I love designing the surveys. Who do I need to reach? How many of them? What do I really need to know from them vs. what I want to know? How will I get the biggest “bang for my buck”?