Net Promoter Survey Questions: I’m very likely to recommend you read this.

Have you ever encountered a survey question that asks, “How likely are you to recommend {name of business} to a friend?” You probably found an 11-point (0-10) scale accompanying the question, with the endpoints labeled “not at all” and “extremely likely.”

Net promoter question asking “How likely are you to recommend…” with an 11-point scale

This question is associated with a concept called “net promoter,” and many large corporations are using this as a key metric in lieu of asking a series of more traditional customer satisfaction survey questions. Net promoter (NP) survey questions are designed to measure customer experience and even predict business growth.


Respondents who rate an NP survey question highly — with a 9 or 10 — are called “promoters,” while those who rate from 0-6 are called “detractors.” Those who rate a 7 or 8 are considered “passives.” Some companies will work very hard to try to change detractors to promoters while others focus efforts on moving passives to promoters.


Promoters are loyal customers likely to return and speak highly of the product, service, or business. Passives aren’t as enthusiastic and may easily be lured away by competitors. Detractors (as I’m sure you can guess) can potentially damage business by sharing their negative experiences and perceptions of the product or service.
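If you like to see the arithmetic behind these labels, here’s a minimal sketch in Python of how responses are typically bucketed and how the familiar Net Promoter Score (the percentage of promoters minus the percentage of detractors) is computed. The function name and sample ratings are mine, purely for illustration:

```python
def net_promoter_score(ratings):
    """Compute an NPS from a list of 0-10 ratings.

    Promoters rate 9-10, detractors 0-6, passives 7-8;
    NPS = % promoters - % detractors, so it ranges from -100 to +100.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical example: 4 promoters, 3 passives, 3 detractors
print(net_promoter_score([10, 9, 9, 10, 8, 7, 7, 6, 5, 3]))  # 10.0
```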

NP example from a United Airlines survey

In what seems to be the zenith of social media, the idea of identifying those who might promote your brand through social sharing channels, and those who might damage it through the same channels, makes a ton of sense. There’s no shortage of reasons to employ this “likely to recommend” question in surveys — after all, a large number of Fortune 1000 companies use it — and one of the strongest is that it identifies customer loyalty with a very simple question. The originators of the concept found it worked especially well for mature, competitive industries. But there are also good reasons to consider whether this is the best option for your survey, or whether differently worded questions might yield better data, especially with certain respondent populations.


My book, Designing Quality Survey Questions, features a number of Real World Questions culled from actual surveys my co-author and I have encountered over the years, and Stories from the Field collected from other researchers and evaluators who shared lessons learned from their survey design experiences. Along with continuing to explore current research on survey design, I keep my eyes and ears open for survey experiences people share with me because these too shape my thinking about quality survey design.

NP example from HP Connected

Here are a couple of recent examples — real stories, from real people — on this idea of net promoter survey questions:


Story #1: Using NP with children

Prior to our podcast interview, I was chatting with Rebecca from Glass Frog, who asked my opinion on net promoter questions and told me a story of her nine-year-old daughter encountering this “likely to recommend” question on a survey about her Girl Scout troop. Rebecca’s daughter provided very favorable responses about her experience with the Girl Scouts. However, when asked whether she would recommend this troop to others, she responded with a 0 on the net promoter question and told her mom, “No. I really like our troop the way it is. I wouldn’t recommend it to anyone else, because I don’t want anyone else to join.” The young girl took the question quite literally and answered honestly. But hey, kids do that, and we know that surveying children has its unique challenges, right?

Story #2: Using NP with seniors

Given my interest in survey design and knowing I was working on the book, my dad, an octogenarian, would occasionally talk with me about survey questions he came across. One day he mentioned the “would you recommend” question he saw on a survey from his insurance company. He said he was troubled by the question, not because he was dissatisfied with the company, but because he didn’t see himself ever having a conversation with anyone about insurance companies. To him, insurance is a private matter, not to be discussed with others, and thus he felt he simply wouldn’t be in a situation to recommend or not recommend the company. Taking the question quite literally, he didn’t quite know how to answer it. He understood the intent, but was troubled by the particular wording, “would you recommend.” He’s quite a literal guy, and wanted to answer honestly. He asked me, “Why don’t they just ask me if I’m satisfied with their service? That one I can answer easily.”

NP example from a local spa

Lesson Learned:

So here we have it: from eight to eighty, this question is clearly not for everyone! Now, I’m not saying it should never be used. After all, there’s plenty of evidence that the net promoter question has been valuable to many, many successful companies. My advice is this:

  • Carefully consider who your respondents are and how they’re likely to interpret and answer this question;
  • Revisit the purpose of your survey (you did articulate a clear purpose in your research or evaluation plan before designing your survey, right?);
  • Have a crystal clear understanding of what it is you want to measure — is it customer satisfaction? Is it brand loyalty? Something else?;
  • Based on the above, determine what question or set of questions is likely to yield rich, actionable data that will answer your specific research or evaluation question(s);
  • Pretest the questions! Use a small-scale pilot test, cognitive interviewing strategies, or other pretesting techniques (all conveniently explained in Ch. 7 of DQSQ!).
  • Carefully consider the wording of the main NP question with your respondents in mind; people will interpret these variants in different ways:

    How likely are you to recommend [organization name] to a friend, family member, or colleague?

    How willing would you be to recommend [organization name] to a friend, family member, or colleague?

  • You can also vary the end of the question – “friend, family member, or colleague” could simply become “others.” More concise question wording means less cognitive load for respondents and a lower likelihood of survey fatigue.
  • If you use an NP question, follow it up with an open-ended “Why?” question. There’s always a risk that respondents will skip the question or provide non-substantive responses (e.g., “because that’s the way I feel”), but in sifting through these during analysis, you may come across some true gems in the form of highly insightful answers.


NP in the NP world: No easy answers

Non-profits have been experimenting with net promoter survey questions and learning how best to use them to inform their work. Feedback Labs offers good advice for those wanting to experiment with NP questions, including how often to ask and how to use the data in ways that differ from corporate practice. Others are encouraged by early success with this approach in the social sector. That said, not all non-profits are in favor of the NP question approach, and this perspective should be taken into account as well.


Tinker, tinker, tinker…

If you’ve read my book or previous articles on survey design, you know I’m a huge fan of experimenting with question wording, and a staunch believer that words matter, word choice matters, word order matters, and what you do and don’t ask matters. This is true whether we’re talking about surveys or conversations. After all, how many times have you responded to someone angry or hurt with “But I didn’t mean it like that!”? If you’re going to try an NP approach, go for it! If you have the opportunity to pre-test with a sample of potential respondents, try some A/B testing, using different versions of the question with random samples of your pre-test respondents (a quick sketch of one way to do this follows). And let me know what you learn!
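If it helps to see the mechanics, here’s a minimal sketch in Python of randomly splitting pre-test respondents between the two question wordings discussed above. The respondent IDs, function name, and seed are made up for illustration:

```python
import random

# The two wordings being tested (from the examples earlier in this post)
VARIANTS = {
    "A": "How likely are you to recommend [organization name] to a friend, family member, or colleague?",
    "B": "How willing would you be to recommend [organization name] to a friend, family member, or colleague?",
}

def assign_variants(respondent_ids, seed=2024):
    """Randomly split respondents into two roughly equal groups, one per wording."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    ids = list(respondent_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"A": ids[:half], "B": ids[half:]}

groups = assign_variants(["r01", "r02", "r03", "r04", "r05", "r06"])
for version, members in groups.items():
    print(version, "->", members)
```

With each group answering only its assigned wording, you can then compare the response distributions to see whether the wording itself shifts answers.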


Many thanks to Chelsea BaileyShea of Compass Evaluation + Consulting, LLC for her generous and insightful feedback on an early draft of this article, and to Rebecca Casciano of Glass Frog for contributing her story.

New Year, New Newsletter: What is Professional Learning?

Happy New Year! Each year, just like many of you, I make… and usually break… the same resolutions, with the exception of one: I learn.

In 2018, I learned how to create and launch my new website. That year, I also learned more about educational equity and culturally responsive education, communication, and leadership. In 2019, I studied negotiation skills, learned more about the science of learning, and added to my Excel, PowerPoint, and data visualization skills. All of this “professional learning” informs my work on various projects and helps improve my professional practice. 

To learn all of this, here’s what I did (along with a few example favorites):

But that’s not all. I also went hiking, rode my bike, ran road races, attended yoga classes, and cooked meals for myself and my family. Wait, what? Was there professional learning to be had from these activities? Let’s return to that notion in a bit…

Why is professional learning important?

Thinking of professional practice as professional learning positions us to think of everything we do as contributing to making us better at what we do. It’s mindset work. What do I mean by that? Mindset work is about attitudes and dispositions and understanding how principles guide our actions. It’s about how and what we learn from successes and failures, and about focusing efforts on incorporating what we learn into how we practice our craft.

What is professional learning?

As a young public school teacher, my professional learning (in those days we called it “in-service” or “staff development”) meant attending workshops on various topics, some directly related to what and who I was teaching, and others seemingly less so. Fortunately, my earliest experiences were positive and influential, thanks to skilled presenters and compelling presentations. What I learned from them struck me as reasonable, relevant, and doable. In fact, some* resulted in career-long changes in my teaching practice and approach to students.

Thus began a career-long fascination with professional learning. 

I once surveyed colleagues for a grad school project asking them to list any activities (including hobbies, sports, volunteer work, etc.) they felt impacted or informed their teaching practice. To my surprise, many of them identified activities not usually associated with professional learning – watching movies, scrapbooking, teaching swim lessons, cooking, and playing sports. They were making connections I wasn’t. They had figured out that the things they did for themselves and for others could also inform their work.

You’re reading this because we share an interest in some of the same professional topics: learning and teaching, communication and presentations, evaluation, data visualization, survey research, and others. I’ve grappled with finding a thread that ties these seemingly disparate topics together. What I’ve landed on thus far is professional learning.

We read, we listen, and we learn to enhance, refine, or otherwise improve our professional practice, as is done in any field. We’re here because we are dedicated to improving our professional practice. But what if we also considered professional practice itself as a powerful form of professional learning? Let me show you what I mean and share why this is so important.

Evaluation as professional learning

Are you an evaluator? You’re engaging in professional learning all the time. After all, evaluation is conducted for the purpose of learning about programs or policies. As we collect data—from surveys, interviews, focus groups, site visits, observations, record reviews, etc.—we are in a constant state of learning that we then translate (through data analysis, of course) into findings, conclusions, and recommendations. Need info on evaluation? Check out my collection of resources.

Education as professional learning

Are you an educator? As teachers, we’re in a constant state of professional learning not only to keep up with educational innovations or research, but also as we learn each day from our students. Whether we teach kindergarten or college, we learn what our students are capable of, where they struggle to grasp concepts, where they can and can’t apply their understanding, and most importantly, we learn about their interests and special gifts—who they are as people. Effective educators analyze, synthesize, and use all of this learning in practice.

And what about lesson planning? Here’s what I know from my ongoing work in classrooms supporting teachers, teaching graduate courses, and giving workshops: Whether I’m helping a science teacher teach combustion, a math teacher teach circumference and perimeter, or I’m getting ready for one of my survey design or audience engagement strategies workshops, I’m cracking open books, journals, or websites to relearn, refresh, or catch up on the latest research to ensure my teaching is thorough and up-to-date. That’s professional learning. In fact, check out the quote on my home page about the intersection of teaching and learning.

Presentations as professional learning

Have you ever given a presentation? Presentations have many purposes—to sell, to persuade, to inform, to educate, etc.—but what they all have in common is learning. As presenters, we work in service to the audience—our learners. Our goal is for them to walk away with new learning about the topic. Every presentation is a lesson plan. Whether I’m giving a report to the Board of Education, sharing data with stakeholders, keynoting at a conference, or facilitating a workshop, I approach it the same way I do a classroom session.

Survey research as professional learning

Have you ever used a survey for research or to understand something about your colleagues or customers? That’s professional learning, too! From survey questions we learn about our respondents. We learn about their behaviors and attitudes. We learn how programs and policies are operating, how goods and services are being purchased and used, and how people feel about all of these. We use all of this learning for continuous improvement in our organizations, often communicating it to others (through presentations and education) so that they can improve programs, policies, and practices.

Everything is learning, and we are all learners. 

We pursue learning to enhance our professional practice doing the expected, the usual – reading books, blogs, and journal articles, engaging in listserv discussions, or attending conferences. We learn from both mistakes and successes. To form a deeper understanding of what facilitates success and failure, think of professional practice as learning – the acquisition of experiential knowledge arising from the daily scenarios, vignettes, and case studies that comprise our work.

*Discipline with Dignity, for example, taught me to stay calm in the face of challenging behaviors, not to vilify students when they acted out, and to work collaboratively and privately with those who struggled in my classroom.

Many thanks to my friend Chelsea BaileyShea, of Compass Evaluation + Consulting, LLC, for her thoughtful and valuable feedback on an early draft of this article.

New Newsletter!

Sure, you can read this blog here and check back for updates every now and then, but why not just subscribe to my newsletter The Learning Curve? You’ll get a link to any new blogs right in your inbox, along with a bunch of other cool content on a variety of topics! Easy peasy. Click here. 


Hindsight is 20/20, even with surveys (cross-post with actionable data blog)

Yep, it’s another great co-post with the splendid Kim Firth Leonard, of the actionable data blog.

Almost everyone (probably everyone, actually) who has written a survey has discovered something they wish they had done differently after the survey had already launched, or closed, with data already in hand. This is one of the many ways in which surveys are just like any written work: the moment you’ve submitted it, you inevitably spot a typo, a missing word, or some other mistake, no matter how many editing rounds you undertook. Often it’s a small but important error: forgetting a bit of the instructions or an important but not obvious answer option. Sometimes it’s something you know you should have anticipated (e.g., jargon you could have easily avoided using), and sometimes it’s not (e.g., an interpretation issue that wasn’t caught in piloting – you DID pilot the survey, didn’t you?). (more…)

The CASM that Bridged a Chasm: When Cognitive Science Met Survey Methodology and Fell in Love! (cross-post with actionable data blog)

When Kim Firth Leonard of the actionable data blog and I write together, we usually refer to each other with a superlative – fabulous, magnificent, wonderful, etc. (all totally accurate, of course) – but now, I’m even prouder to call her co-author! Yes, we are in the throes of writing a book on survey design!*

After a very successful presentation to a packed room at Evaluation 2014 in Denver, CO (if you were there, thanks!), we met with an editor at Sage Publications to pitch our idea, and now we’re busy fleshing out chapters and excited to share bits with our readers along the way.

The foundation of our collaborative work lies here: “how evaluators ask a question can dramatically influence the answers they receive” (Schwarz & Oyserman, 2001, p. 128).  (more…)

Designing Effective Surveys Begins with the Questions BEFORE the Questions! (cross-post with actionable data blog)

The art and science of asking questions is the source of all knowledge.      – Thomas Berger

Hey readers, Sheila here, writing once again with the marvelous Kim Firth Leonard, of the actionable data blog.

It’s survey design season, so get ready to flex those question design muscles! Well, to be truthful, it’s always survey design season in our data-saturated, evidence-hungry society. As surveys have become ubiquitous, it is incumbent upon survey researchers to cut through all the noise by developing the most effective instruments we can. And what’s the best way to get ready for any endeavor that requires flexing? A warm-up! Just as failure to warm up for physical activity can invite injury, diving into survey question design without a preparation process can introduce the possibility of gathering bad data. Remember the principle of GIGO (garbage in, garbage out)? (more…)

It’s All in How You Ask: The Nuances of Survey Question Design (cross-post with actionable data blog)

If you do not ask the right questions, you do not get the right answers.

– Edward Hodnett, 20th-century poet and writer

At Evaluation 2014, the American Evaluation Association’s annual conference, the incredible Kim Firth Leonard (find her over at actionable data blog) and I facilitated a 90-minute skill building workshop on survey question design. Kim and I have co-authored several posts on our shared passion for survey design. You can find these posts here, here, and here. We were thrilled to geek out with such a great group ready to humor us by taking our pop quiz, listening intently as we shared the science behind answering questions about behavior, nodding as we reviewed fundamental principles of survey design, and genuinely engaging with us in exploring better ways to approach surveys.  (more…)

When a Direct Question is NOT the Right Question

Who hasn’t answered the question, “What did you learn?” after attending a professional development session? As a PD facilitator and evaluator, I’ve certainly used feedback forms with this very question. After all, measuring participant learning is fundamental to PD evaluation.

In this post, I’ll share examples of actual data from PD evaluation in which we asked the direct question, “What did you learn?” I’ll then explain why this is a difficult question for PD participants to answer, resulting in unhelpful data. Next, I’ll offer a potential solution in the form of a different set of questions for PD evaluators to use in exploring the construct of participant learning. Finally, I’ll show where participant learning fits into the bigger picture of PD evaluation.  (more…)

Where have all the (qualitative) data gone?

As evidenced in recent posts co-authored with fellow blogger Kim Firth Leonard of actionable data (read them here and here), I’m fascinated with surveys and survey research. Just last week, another fellow blogger, Brian Hoessler of Strong Roots Consulting, offered this post on open-ended questions.

I shared with Brian that I recently saw a needs assessment instrument composed entirely of open-ended questions – maybe a dozen or so in all. Whenever I encounter surveys with open-ended questions, I wonder whether the qualitative data collected are indeed systematically analyzed, and not just scanned or read through, especially in the case of very brief responses. If the data are analyzed, I wonder what kinds of coding strategies evaluators use – inductive or deductive? Constant comparison? Grounded theory?  (more…)

A Roundup of Survey Design Resources (cross-post with actionable data)

Sheila here, writing with the magnificent Kim Firth Leonard of the actionable data blog.

Since agreeing that we would co-author a series of blog posts on surveys with a focus on composing good questions, we have discovered countless other blog posts, websites, journal articles, and books on survey research from a variety of fields and perspectives, many of which feature discussions of and advice on question construction. Of course, we have a few personal favorites and well dog-eared texts:  (more…)