In Mea Culpa and Rethink on Pre-tests, Clark Quinn makes a confession:
Well, it turns out I was wrong. I like to believe it doesn’t happen very often, but I do have to acknowledge it when I am. Let me start from the worst, and then qualify it all over the place ;). In the latest Scientific American Mind, there is an article on The Pluses of Getting It Wrong (first couple paragraphs available here). In short, people remember better if they first try to access knowledge that they don’t have, before they are presented with the to-be-learned knowledge. That argues that pre-tests, which I previously claimed are learner-abusive, may have real learning benefits. This result is new, but apparently real. You empirically have better recall for knowledge if you tried to access it, even though you know you don’t have it.
If the research that Clark quotes is true (and after recent debunkings my default state is to trust nothing and no-one), then there are implications beyond the argument for pre-tests. Pre-tests are sterile and mechanistic; they serve a purpose, but they are hardly an engaging way in which to commence a course of study. Luckily, there are other, much more exciting ways to build on learners' 'mistakes':
The simplest way is by asking inductive questions. Rather than presenting the learner with the information and then testing it, you ask challenging questions which force the learner to draw on prior knowledge and skills to try to figure things out for themselves. Your feedback to these questions becomes critical, because these are not tests which you can respond to with "Well done, that’s correct" or "Sorry, you’ve got that wrong". The feedback must address each and every possible response in a constructive, non-patronising and conversational tone, much like a highly-skilled facilitator would respond in a live environment. Mistakes are welcome because they provide the learner with motivation to explore further and provide you with an opportunity to share what you know. Inductive questioning is absolutely key to writing engaging e-learning tutorials, yet the technique is tragically under-used. What’s more, many authoring tools don’t help, because they don’t allow you to display feedback to every possible selection in a multi-choice question (see my post from two years ago, Whatever happened to inductive learning?).
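If you're building your own tooling rather than fighting an authoring tool, the idea is simple to model: every option carries its own conversational feedback, not just a right/wrong flag. Here's a minimal sketch in Python; the question, option labels and feedback wording are all invented for illustration, not taken from any real course or tool.

```python
# Illustrative model of an inductive question: every option maps to its own
# conversational feedback, so there is no bare "correct"/"incorrect" response.
question = {
    "prompt": "A customer threatens to cancel. What do you try first?",
    "options": {
        "a": ("Offer an immediate discount",
              "Tempting, but discounts teach customers to threaten. "
              "What might you learn by asking why they want to leave?"),
        "b": ("Ask what prompted the decision",
              "Good instinct: understanding the cause often reveals a fix "
              "that costs far less than a discount."),
        "c": ("Escalate to your manager",
              "Sometimes necessary, but you lose the chance to build rapport. "
              "Consider what you could uncover yourself first."),
    },
}

def feedback(question, choice):
    """Return the conversational feedback for the selected option."""
    _, response = question["options"][choice]
    return response
```

Note that no option is marked as "the answer": each response acknowledges the thinking behind the choice and nudges the learner onward, which is the facilitator-like tone described above.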
The really fun way is to engage in some serious discovery learning, which makes a lot of sense when you’re trying to get learners to explore cause and effect relationships and to make considered judgements. Branching scenarios are probably the simplest to implement, with full-scale simulations at the other end of the scale. Informational material becomes something that the learner accesses as and when they need it. One learner may perform so poorly in the activity that they jump out early to find out more before trying again; another may learn from their mistakes and figure it all out for themselves, using the informational material to confirm their understanding and fill any gaps; yet another may find the whole activity so straightforward that they bypass all the informational material and get back to their work.
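A branching scenario of the kind described above is, structurally, just a small graph: each scene offers choices, and each choice leads to another scene. The sketch below is a hypothetical example in Python (the scenario, scene names and wording are invented, not drawn from the post), showing how a learner who chooses badly takes a longer path through the same material.

```python
# Illustrative branching-scenario graph: each node has some narrative text
# and a set of choices, each leading to the next node. A node with no
# choices ends the scenario.
scenario = {
    "start": {
        "text": "The server is down and the client is on the phone.",
        "choices": {"restart it": "worse", "check the logs": "diagnose"},
    },
    "worse": {
        "text": "The restart wiped the error state; the fault returns an hour later.",
        "choices": {"read the troubleshooting guide": "diagnose"},
    },
    "diagnose": {
        "text": "The logs point to a full disk. You free space and service resumes.",
        "choices": {},
    },
}

def play(scenario, decisions, node="start"):
    """Walk the scenario graph for a given sequence of decisions,
    returning the list of scenes visited."""
    path = [node]
    for decision in decisions:
        node = scenario[node]["choices"][decision]
        path.append(node)
    return path
```

The informational material sits outside this graph entirely: a learner who ends up in a dead end like "worse" is the one motivated to go and read it, while a learner who goes straight to "diagnose" never needs to.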
Effective learning materials both build on your prior knowledge and allow you to make mistakes without hurting anyone. This simply can’t be accomplished with ‘tell and test’.