Developing Multiple Choice Tests

If you are reading this, I can say with near certainty that you have taken a multiple-choice test at some point in your life. You've probably taken more than you can count.

Unfortunately, there is also a very strong probability that you've encountered test items so poorly designed that you needed no related knowledge to guess the correct answer, or could at least narrow it down to two choices. Chances are also good that you've fumed over poorly worded questions and answer options that left you, the test-taker, confused.

Even with those limitations, test-givers strongly prefer multiple-choice tests because they are easy and efficient to administer. The familiarity of the format also makes these tests a comfortable experience for test-takers.

While multiple-choice tests provide a traditional format for test-takers and a quick record of training for test-givers, they are often ineffective in measuring knowledge due to poor test item writing.  Most of us think that writing a test question is easy because we’ve read thousands (maybe tens of thousands) of test questions and responses during our lifetimes.  But contrary to what many people think, multiple-choice items are not easy to write if your tests are to be valid (testing the thing you want to test) and reliable (offering dependable results every time).  If you are an employer who relies on multiple-choice testing to assess knowledge, properly constructed test items are essential.

Multiple-choice test items include: 

- A stem – the question asked or incomplete statement provided

- A key – the correct response

- A distractor – an incorrect response (Each test item usually has multiple distractors but could have only one if appropriate.)
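The three components above can be sketched as a simple data structure. This is a minimal illustration, not code from any testing tool; the names (`Item`, `stem`, `key`, `distractors`) are my own.

```python
from dataclasses import dataclass

@dataclass
class Item:
    """A multiple-choice test item: one stem, one key, one or more distractors."""
    stem: str                # the question asked or incomplete statement
    key: str                 # the correct response
    distractors: list[str]   # the incorrect responses (usually more than one)

    def responses(self) -> list[str]:
        """All answer choices: the key plus every distractor."""
        return [self.key] + self.distractors

# Hypothetical example item (form names are invented for illustration)
item = Item(
    stem="Which form is used to request annual leave?",
    key="Form HR-12",
    distractors=["Form HR-7", "Form HR-21"],
)
print(len(item.responses()))  # 3 choices: one key plus two distractors
```

Modeling items this way makes it easy to shuffle responses or run simple quality checks before a test goes out.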

Good test item writing provides:

- Stems that ask only one direct question

- Stems that avoid absolutes, vague references, or negatives that can confuse the test-taker (Examples: none, all, never, always, usually, or "Which of the following does not include…")

- Stems that provide enough wording to be clear but not so much that unnecessary information is included

- Stems that grammatically match all of their responses

- Responses that grammatically follow their stems

- Responses that are similar in length and structure

- Stems that carry most of the wording, paired with responses that are limited in length

- Distractors that are plausible to anyone with limited knowledge but obviously incorrect to someone with the appropriate knowledge (For example, don't offer implausible answers like "Mickey Mouse" to questions about people. Every distractor should sound logical to a layperson.)

- Distractors that avoid absolutes designed to trick the test-taker or provide a clue (Examples: always, never, completely)

- Distractors that avoid vague options (Answers like "typically" and "possibly" are confusing and imprecise, and therefore unfair to the test-taker.)

- An appropriate number of distractors (Nowhere is it written that multiple-choice items must provide four responses. In fact, in most cases three responses [one key plus two distractors] provide the best test item validity*.)

- Content that incorporates common procedural mistakes (For example, if employees frequently use the wrong form to complete a task, be sure to include the erroneously selected form as a legitimate distractor in your responses.)
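A few of the tips above are mechanical enough to check automatically; judgment calls like plausibility still need a human reviewer. Here is a rough sketch of such a check, assuming items are passed in as plain strings; the function name and word list are illustrative, not from any existing library.

```python
# Words that act as absolutes, which the tips above say to avoid in responses
ABSOLUTES = {"always", "never", "all", "none", "completely"}

def lint_item(stem: str, key: str, distractors: list[str]) -> list[str]:
    """Return warnings for the mechanically checkable item-writing tips."""
    warnings = []
    responses = [key] + distractors
    # Tip: distractors should avoid absolutes that trick or give a clue
    for r in responses:
        if ABSOLUTES & set(r.lower().split()):
            warnings.append(f"absolute word in response: {r!r}")
    # Tip: responses should be similar in length and structure
    lengths = [len(r) for r in responses]
    if max(lengths) > 2 * min(lengths):
        warnings.append("response lengths vary widely")
    # Tip: three responses (one key plus two distractors) are usually enough
    if len(distractors) > 3:
        warnings.append("consider using fewer distractors")
    return warnings

# Hypothetical item: the first distractor should trigger an absolutes warning
print(lint_item(
    stem="An employee requesting leave should:",
    key="submit Form HR-12",
    distractors=["always ask a manager first", "submit Form HR-7"],
))
```

A check like this won't catch a grammatically mismatched stem or an implausible distractor, but it can flag the most common slip-ups before a reviewer ever sees the draft.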

How do your test items stack up to these item-writing tips?  Properly worded test items help to ensure validity and reliability from your tests, so it’s worth the effort to examine your testing tools and refine them as needed. 

Next month we will examine some sample test questions and how we might improve them.

* Rodriguez, Michael C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24(2), 3-13.
