Stop asking poor multiple choice questions. In this article, Steve Penfold explains how to write better multiple choice questions that can make your assessments more meaningful.
Your course is written, the activities are working beautifully and all of the graphics have been polished. Just the 10-question assessment left to do…
This is an all-too-common situation, and not necessarily a good one. The questions can be the trickiest, most under-considered and, arguably, most important part of your course.
Multiple choice questions, or variations on them, are a common way to assess learners, but there are good ways and bad ways of constructing them.
Consider these points when writing multiple choice questions and make your assessments more meaningful and effective.
Write the questions before the content
This may seem counter-intuitive, but when you think about it, it makes sense.
There’s often far too much unnecessary content in an elearning course; the important material gets lost in a let’s-tell-the-learner-everything avalanche. By writing solid questions that test the learning objectives and then developing content to support those questions, your content will be leaner and tighter: it will include only what’s needed to support the questions and learning objectives.
If you develop the course content before writing the assessment, you may find you’ve spent time developing content that isn’t aligned with the learning objectives. Worse, you’ll then write questions on that unnecessary content and waste both your time and your learners’.
Test what you want people to do, not what you think they should know
Let’s assume that a course has the learning objective of demonstrating the use of the Active Voice in business writing.
An assessment question for such a course might be:
Well, the learner has a 50/50 chance of getting this right and, even if they do, it doesn’t mean that they’ll be able to apply it.
A better question would be:
At least this shows that the learner knows why they should use Active Voice. But can they apply it? Do they know what it looks like? We still don’t know, and that’s what we really want from them.
An even better question would be:
This tests whether the learner knows what the Active Voice looks like, and therefore begins to test the learning objective. If they can identify the Active Voice, they’re well on their way to being able to write it.
And always try to put the question into a workplace context. For example, give the learner a scenario and ask which of the answer options would resolve it, or ask, “Here are the opinions of four colleagues. Whose would you follow to resolve this situation?”
True/False, Multiple-choice and Multiple-select
I never use True/False or two-option questions. Blindfolded, the learner has a 50% chance of guessing the correct answer. It’s often a lazy option on the question writer’s part. There’s always a more meaningful way to ask this type of question.
A standard Multiple-choice question (i.e. the learner can choose one option from several presented) is a better tool. But even then, if there are four options, the learner has a 25% chance of guessing the right answer.
A Multiple-select question (i.e. like a Multiple-choice, but the learner can choose several options) is a much better tool. In a four-option question where the learner doesn’t know whether 1, 2, 3 or all 4 options are correct, the right answer is one of 15 possible combinations, so a blind guess has only around a 7% chance of success. And I mix them up: some questions have 2 correct options, others 3, some just 1, and sometimes all 4 are correct.
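To put numbers on that comparison, here’s a quick sketch (the function name is mine, not from any tool mentioned here) that computes the odds of a blind guess when the learner doesn’t know how many options are correct — the right answer is one subset out of all the non-empty subsets of the options:

```python
def multiple_select_guess_odds(n_options: int) -> float:
    """Probability of blindly guessing a multiple-select answer when the
    learner doesn't know how many of the options are correct: the right
    answer is one subset out of all non-empty subsets of the options."""
    non_empty_subsets = 2 ** n_options - 1
    return 1 / non_empty_subsets

# Single-answer multiple choice with 4 options: 1 in 4
print(1 / 4)                                    # 0.25
# Four-option multiple-select: 1 in 15, roughly 6.7%
print(round(multiple_select_guess_odds(4), 3))  # 0.067
```

Either way, the learner’s odds of guessing their way through a multiple-select question are far worse than the 25% a standard four-option question gives them.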
Writing good distractors
I like to allow 30 minutes to write a question, which often surprises people. They think it should be much quicker, especially if they’re paying by the hour!
Creating a good question and the right answer is relatively easy. It’s producing realistic distractors (the incorrect answers in a multiple-choice question) that takes the time. And realistic distractors are really important: each weak distractor that the learner can discount gives them a better chance of deducing or guessing the correct answer. For example, in a four-option Multiple-choice question, if two of the distractors are weak, the learner suddenly has a 50% chance of guessing correctly instead of a 25% chance.
A good distractor should seem like a plausible option to someone who doesn’t know the learning content, but should be clearly wrong to someone who does. It shouldn’t, however, be incorrect on a vague technicality – especially if that technicality isn’t specifically covered in the learning content.
Avoid being predictable
Savvy learners will assume that if the answer to question 1 is A and the answer to question 3 is D, then the answer to question 2 is unlikely to be A or D; consecutive questions are rarely given the same correct option. To avoid answers being easy to guess, make a systematic decision to randomize your answers. Most elearning authoring tools, like Elucidat, offer the option of randomizing answers. Use it! If your authoring package doesn’t randomize answers automatically, roll a die or toss a coin to decide the order in which your correct answers will be displayed.
Randomizing answers also means that “All of the above” and “None of the above” options won’t work, since they’ll no longer reliably appear last. Good! These options are often just a sign of lazy question writing. One study showed that “All of the above” and “None of the above” answers were correct over 50% of the time.
Another point to bear in mind is that within a single question your correct answer(s) and distractors should be about the same length. It is all too easy for a correct answer to give itself away by being significantly longer or shorter than the distractors.
You’ll want your learners to pass the assessment – but because your elearning content is powerful, not because the assessment is weak.
And when your learners do pass, you want it to be based on the learning objectives, not on “knowledge” that doesn’t ensure that they can perform the tasks in the workplace.
Writing good questions takes a little thought, but it can add a lot of strength to your elearning and its outcomes.