A Tale of Two Workshops: How Retrieval Practice Saves Time and Money

Retrieval practice – the act of testing yourself on what you know – is a simple but powerful tool for making learning stick. It’s a well-researched strategy that everyone is capable of using, yet few of us use intentionally. Why? It’s like many other good habits – getting enough exercise, avoiding junk food, making those phone calls we need to make. They’re a bit challenging and require effort that may not feel pleasant, and thus are all too easy to relegate to tomorrow, or that mythical “some other time.”

The other day I sat on my office floor sorting old papers and stumbled on a folder from a 2013 conference. In it were handout packets from two half-day workshops I took – one on project management, and one on data visualization. At the time, I had little or no background in either area and was interested in learning the basics. As I flipped through the packets of information from the presenters (both riddled with my handwritten annotations), I realized I was having two very different experiences.

Workshop 1: Easy peasy lemon squeezy

First, I reviewed the packet from the data visualization course. I laughed, realizing that all those years ago I needed to take notes on concepts and strategies I now feel as if I’ve known forever. I was almost hoping to pick up some interesting nugget I had long since forgotten, but that wasn’t the case. There wasn’t one word or phrase in that packet that I don’t know inside out and backwards today. I remember everything! Though I’m not one to toss things away, I realized I no longer had any need for these notes. I’ve been regularly using and building on these skills for years. Now THAT’s the mark of a good investment of time and money, right?

Workshop 2: Stressed depressed lemon zest

Next, I reviewed the packet from the project management course. This time though, I wasn’t laughing. Nothing was familiar. In fact, not only did I not remember any of the principles or concepts, I didn’t even understand the notes I took on them. Page after page after page of unfamiliar content and unintelligible notes. {SIGH} What a waste of time and money that course was. 

WHY the difference?

Was it the instructors? The content? The learning environments? The course activities? I can give you a firm NO to all of these. As a career teacher and professional learner, I can say that both courses were high-quality professional learning. Both instructors were excellent. Both created environments conducive to learning. Both courses featured interesting content and relevant interactive activities to support learning.

So, if it wasn’t the course, did it have something to do with… ME? 


The difference lies in what I did with what I learned after the course. 

Retrieval practice… by accident?

After that 2013 conference, I returned to work and immediately put the data visualization strategies I learned in workshop #1 to the test. I was a program evaluator and had reports and presentations to develop based on data I had collected. Applying newly learned skills is an excellent way to engage in retrieval practice. I didn’t need to study my notes from the workshop and THEN go about making graphs and charts. I only needed to look back at them IF and WHEN I stumbled or got stuck. Each new graph I made and inserted into a report – keeping in mind fundamental principles of graphic design and visual communication, the content of the workshop – was a self-assessment of what I remembered. And each time I did look something up in my notes or other sources, then immediately applied it to my work, it cemented the learning even more. I didn’t need to set aside time to study. I knew and could use the material.

At the time, however, there were no new projects for me to manage, so there was no such opportunity to apply my newly acquired project management knowledge and test out how much I remembered from workshop #2. Needless to say, I did not set aside time to study that material. As a consequence of having NO interaction with the concepts and principles of project management, forgetting naturally took over and what I learned that day in that workshop faded away. With no opportunity to apply the knowledge, and making no effort to test myself in another way (e.g., while reviewing my notes), I not only lost what I thought I had learned that day, but with it went my precious time and the money spent on that workshop. 

Stop the waste!

Professional learning, when well chosen, often comes with built-in retrieval practice opportunities. When work calls for the knowledge and skills you’re learning, practice comes naturally. Try out what you think you learned on a current work project and see if you can apply it. If not, look up the parts you forgot, and you’re good to go. Continue to use those skills and concepts and the learning sticks.

But what happens when we want to learn something we’re NOT doing at our jobs? In my case, it was project management. What should I have done to maintain that learning for a later time? THAT’S when the hard work comes in. Retrieval practice is powerful, but like any exercise, it has to be done to work. I should have sat down with a blank sheet of paper (or screen) and asked myself questions. What do I remember from the course? What IS project management? What are the main components, steps, or elements? What key concepts or vocabulary did I learn? How does what I learned relate to other things I already know and understand?

Each question I found difficult or impossible to answer would have been my cue to return to my notes and fill in the blanks. AND I should have repeated that same test every few days until I could answer all the questions. AND I should have retested myself a few weeks, then a few months later to ensure I still retained what I thought I had learned. AND I should have continued to do this until an opportunity to use those skills presented itself and I could start applying and stop studying. Hard work? Yes, but the cost is mostly mental effort, not time. Retrieval practice doesn’t need to be time-consuming.

Take a retrieval practice challenge!

Think about the last workshop, webinar, or course you took. I’ll bet it wasn’t even that long ago. Take out a blank sheet of paper (or blank screen) and try to answer these questions:

  • What did I learn?
  • What are the main components, elements, or steps?
  • What key vocabulary or terminology did I learn? Can I explain what each means?
  • How does what I learned relate to other things I know? 

Let me know how you do.

Every Day is Thanksgiving Day!

Americans are celebrating Thanksgiving today, and while my personal practice is to give thanks every day, today certainly feels like the right day to share thanks to all who subscribe, follow, read, and comment on this blog.

In addition to my wonderful family and friends, good health, and other gifts, I have been blessed with the opportunity to enjoy my work. It hasn’t always been this way, but for many years now, I have truly enjoyed my work. I generally get up in the morning, look forward to going in, and take great pleasure and pride in the work that I do. In fact, most days I consider it fun.

A fine quotation is a diamond in the hand of a man of wit…

…and a pebble in the hand of a fool. These are not my words, but those of Joseph Roux, a French Catholic parish priest, poet, and philologist.

In this post, I’ll share a collection of quotations related to evaluation. Now, I’m hardly the first blogger to post a list of favorite quotes. Admittedly, though, I spend more time collecting writing than producing it, although I do love engaging in the latter. And in thinking back to an early post in which I studied reasons other evaluation bloggers blog, I remembered that several do so to create a repository of their work and ideas. As I’m a collector/curator at heart, this blog will soon become my repository.

Where have all the (qualitative) data gone?

As is evidenced in recent posts co-authored with fellow blogger Kim Firth Leonard of actionable data (read them here and here), I’m fascinated with surveys and survey research. Just last week another fellow blogger, Brian Hoessler, of Strong Roots Consulting offered this post on open-ended questions.

I shared with Brian that I recently saw a needs assessment instrument composed of all open-ended questions – maybe a dozen or so questions in all. I always wonder when I encounter surveys with open-ended questions whether the qualitative data collected are indeed systematically analyzed and not just scanned or read through, especially in the case of very brief responses to open-ended questions. If data are analyzed, I wonder what kinds of coding strategies evaluators use – inductive or deductive? Constant comparison? Grounded theory?

Paradox Redux: A Pleasant Surprise

A few weeks ago, I wrote Exploring the Public Education Paradox – Evaluation and Public Education (response to Jamie Clearfield). Soon after bemoaning the apparent lack of understanding of evaluation and its role in public education, I was delighted to find a chapter devoted to program evaluation in an education book I’m reading with my colleagues. I was even more excited to discover a section on theory of change and logic models. Seldom (if ever) have I seen these concepts addressed outside of an evaluation text.

The book is Coaching Matters*, a text on PK-12 teacher leadership, and is described by its authors as addressing “…whether coaching matters. In other words, does it work?” My point here is not to offer a book review, but rather to revel in the fact that a book written for educational practitioners is framed by evaluative thinking!

Thinking About Evaluative Thinking…in my own backyard!

Confucius said: “Learning without thought is labor lost. Thought without learning is perilous.” I have been thinking about evaluative thinking. And exploring. And reading. And of course, learning. And I’m finding the same old story: as with most other evaluation-related terminology, there’s no one accepted definition of evaluative thinking. But I did find two amazing resources.

Exploring the Public Education Paradox – Evaluation and Public Education (response to Jamie Clearfield)

Just last week, one of my favorite evaluation blogs, Emery Evaluation, featured a guest post that got me thinking. Exploring the Non-Profit Paradox – Evaluation and Non-Profits [Guest post by Jamie Clearfield] reminded me that I’ve long thought there exists a dearth of program evaluation in public schools. As Jamie indicates for the world of non-profits and community-based organizations (CBOs), I too believe there is a lack of understanding of evaluation and its role in public education. How do I know this?

Ode to AEA365: A “Meta-post”

This is a blog post about blog posts…a meta-post, if you will (and even if you won’t). 🙂

I call myself AEA365’s biggest fan. It’s true. I’ve been a daily reader since its inception.* For the uninitiated, AEA365 Tip-A-Day by and for Evaluators is the official blog of the American Evaluation Association. It’s well-designed, reader-friendly, and very searchable. I strongly encourage everyone to spend time exploring posts using keyword searches, or posts tagged for Topical Interest Groups (TIGs). Or, simply click on Archive and see every title and author of the more than 1000 posts.

13 Ways to Express Your Resoluteness in ’13

I’m at a crossroads at the close of twenty-twelve – ready to welcome the new year with a pronouncement of intentions for making good in 2013 but, ever the evaluator, pressured also to assess promises past. As I consider options for reflection or resolution, it occurs to me that we have an array of alternatives for framing the latter.

Are you ready to make your ’13 proclamations? Here we go: You can… make resolutions, set goals, objectives, or aims, express intentions or aspirations, announce expected outcomes or impacts, identify key performance indicators or key result areas, espouse your visions, or begin with the end in mind.

Evaluators are humans, TWO!

As part of his launch team, I introduced Daniel H. Pink’s new book, To Sell is Human, in a recent post (see Evaluators are humans, too!). Pink’s premise is that regardless of our chosen fields, we’re all in sales – even those of us in what he calls “non-sales selling.” As a matter of course, we must all move others.

Eagles Mere, PA

©2006 Photo by SheilaBRobinson

Especially engaging for an evaluator is Pink’s chapter on Clarity, one of the “new ABCs of selling” – Attunement, Buoyancy, and Clarity. Pink sees clarity as “the capacity to help others see their situations in fresh and more revealing ways and to identify problems they didn’t realize they had.” What resonates with me is the notion of the value of problem-finding over problem-solving. The Information Age has given us access to all manner of solutions to our problems, but not necessarily to their identification. “The services of others are far more valuable,” claims Pink, “when I’m mistaken, confused, or completely clueless about my true problem.”

What is the “evalusphere” anyway?

I’m proud to say I’m the first to coin the term “evalusphere.” Of course, “blogosphere” has been around for more than a decade. “The blogosphere is made up of all blogs and their interconnections.”  The same can be said of the evalusphere. While it’s made up of all evaluators, evaluations, and their interconnectedness, a growing variety of concepts, disciplines and fields of inquiry are now part and parcel of the evalusphere.

In “The Future of Evaluation,” a panel at Evaluation 2012, Michael Scriven called evaluation “the alpha discipline.” In the same panel, Beverly Parsons spoke of “the growing understanding of complex systems thinking” in the evaluation community, and Susan Kistler opined that using cultural competence is the only way to do quality evaluation. In the opening plenary, Rodney Hopson spoke about the emergence of evidence-based discourse, addressing methodological challenges, and understanding the principles of adaptive management. Numerous presenters held sessions on new technologies and data visualization.

Who are you as an evaluator?

At Evaluation 2012, I attended “Advice for novice evaluators,” an engaging panel composed of the following evaluators, whose experience ranged from 13 to 47 years: Arthur Hernandez, Jeanne Hubelbank, Michael Morris, Katye Perry, Robert Stake, and Sue Lin Yee. Each panelist was given the opportunity to offer wisdoms of practice, and all gave substantive advice that I look forward to sharing with my future evaluation students.

I wish I could properly credit the panelist who proffered this astute aphorism:

Developing a professional identity doesn’t always come from what you do, but from how you think of yourself.

It reminds me of one of my favorite AEA365 posts: John LaVelle on Personal Statements About Evaluation – published March 25, 2010. Read the original post here and you’ll even see a comment from me!

Among other things, LaVelle tasks evaluators to “develop a personal statement of what evaluation means to you and how it can and should be practiced in dynamic, fluid, and political organizational and community environments.” Given AEA12’s insightful conference theme, I would now consider asking evaluators to describe how evaluation can and should be practiced in complex ecologies comprised of relationships, responsibilities, and relevance.

I ask evaluation students to complete an evaluation personal statement as a required course assignment. I couldn’t possibly ask blog readers to do the same, but I will ask you this:  if you were to write such a statement, what key words or phrases would most certainly be included in yours?

Hello and Welcome!

Welcome to my first post of my first blog! I suppose I should explain what I’m doing here, but believe it or not, this is one of the hardest things to do! Why am I here? Well, I love to read and I love to write, though the former comes much easier than the latter.

I’ve been enjoying other evaluators’ blogs and have a few ideas of my own to share as well. I enjoy professional conversation and the exchange of ideas, and my hope is that this blog will be one way to capitalize on the collective wisdoms of practice and experience of my evaluation colleagues by posing questions and encouraging dialogue. Plus, I’ve received encouragement from fellow evaluators, and I’ve never been one to shy away from peer pressure! So, perhaps this is why I blog. But, in the interest of learning more about why others blog, I did a little evaluating (what else?). Look for my next post on what I’ve learned about why others blog.

If you’re a blogger, please leave a comment and tell me why you blog. Oh, and thanks for visiting. Hope to see you again soon!