The Anatomy Of Assessment: 5 Elements Of A Quality Multiple Choice Question
by Nanda Krish, CEO of Wisewire
Computer-based assessments have given teachers and administrators powerful tools for enhancing the testing process, all while allowing high volumes of responses to be scored efficiently.
However, those benefits are lost if the content itself falls short. The key is crafting assessment questions that zero in on the most important element: Does the item test the student’s mastery of the topic?
That seems like a simple black-and-white question, but the details show how difficult it can be to answer. Mastery becomes unclear if the question is confusing or if the answer choices are poorly constructed. For the latter, there are many ways an answer set can lead the student, either inviting process of elimination rather than deep understanding or giving away so much information that the answer becomes obvious.
See also: The Most Important Question Every Assessment Should Answer
This all comes down to five key components of a strong multiple-choice assessment. If you are wondering if your assessment — be it computer-based or traditional — is effective, break your content down into these elements and examine them one by one.
Along the way, we’ll look at examples of how particular assessment items fulfill these elements.
5 Elements Of A Quality Multiple Choice Question
Strong assessment items are made up of five elements:
- Standard
- Stimulus
- Stem
- Key
- Distractors
Standard
The standard establishes the purpose and topic of the assessment item. The student should be able to read the standard and know immediately what to expect in terms of topic and desired action. This is usually at the head of a section, with problem sets below it. For example, a standard for a geometry section might be “Use the Pythagorean Theorem to identify the hypotenuse for the following.”
Stimulus
The stimulus provides the background information necessary to understand the problem. For math, it is the original equation or problem. For word-based problems, it is the background material that sets up the challenge, whether that’s a sentence or a paragraph. In the Pythagorean Theorem example above, the stimulus would be the description: “A right triangle has one leg of length 4 and another leg of length 6.”
Stem
The stem poses the actual problem to solve. It is critical because it provides the exact direction that brings the prior elements to a conclusion. If the stem does not make sense in the context of the standard, then there is an issue with the way the assessment is written.
Confusing stems do a disservice to students, wasting valuable test time and losing clarity in the process. Using the above example, a proper stem would be “What is the hypotenuse of this triangle?” However, a misleading stem would ask a question that is tangentially related (or even unrelated) to the topic; for the triangle example, it might be “What are the other angles of the right triangle?” While this may be a valid question in terms of the overall subject, such a stem changes the expectation of the standard and confuses the student.
One additional note on stems: it may be tempting to use negatives (not, except, etc.) in the stem. However, this type of language tends to create confusion about the actual purpose of the assessment item. It is considered best practice to avoid negative terminology in the stem and instead keep it as straightforward and clear as possible.
Key
The key is the correct answer. It should be completely unambiguous, leaving no doubt for someone who has mastered the skill.
Distractors
Distractors are the other answer options for the assessment item. There is an art to crafting proper distractors. For someone who has mastered the skill, a distractor should be obviously wrong. However, distractors should be plausible enough that they could be mistaken for the correct answer if the test-taker has missed a step or applied a common misconception.
In the geometry example above, a strong set of distractors might be “7; 8; 8.5,” with the key being “7.21.” Weak distractors, by contrast, would be obviously incorrect, such as “4,000; -30; 0.” Those answers let the student rely on process of elimination rather than a deep understanding of the process.
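As a quick check, the key follows directly from the Pythagorean Theorem applied to the legs given in the stimulus:

c = √(a² + b²) = √(4² + 6²) = √(16 + 36) = √52 ≈ 7.21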
Let’s look at one more example from top to bottom. Rather than math, students here are asked to analyze a food chain for a life-science assessment.
Standard: Analyze food chains and food webs for an ecosystem.
Stimulus: The diagram below shows a forest food chain.
Grass > Grasshopper > Frog > ?
Stem: Which organism is missing from this food chain?
Key: Fox
Distractors: Fly, Mushroom, Rabbit
Examining this assessment item as a whole, it is clear that it follows best practices. The standard, stimulus, and stem all work within the context of one another. The key is clear for someone who grasps the directional flow of a food chain, yet the distractors are all plausible enough to catch someone who is relying on a common misconception (e.g., “frogs eat flies” or “rabbits are mammals and bigger than frogs”).
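For teams delivering items in a computer-based module, the same five elements map naturally onto a simple item record. Here is a minimal sketch of that idea, using the food-chain item above; the field names and shuffling step are illustrative only, not a standard schema or any particular platform’s format.

```python
import random

# One assessment item, broken down into the five elements described above.
# Field names are illustrative, not a standard schema.
food_chain_item = {
    "standard": "Analyze food chains and food webs for an ecosystem.",
    "stimulus": "The diagram below shows a forest food chain.\n"
                "Grass > Grasshopper > Frog > ?",
    "stem": "Which organism is missing from this food chain?",
    "key": "Fox",
    "distractors": ["Fly", "Mushroom", "Rabbit"],
}

# Mixing the key in with the distractors produces the answer choices
# a student would actually see on screen.
choices = [food_chain_item["key"], *food_chain_item["distractors"]]
random.shuffle(choices)
print(food_chain_item["stem"])
print(choices)
```

Keeping the elements separate like this also makes it easy to review each one on its own, exactly as recommended above.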
Are You Writing Strong Assessments?
For both standardized exams and in-class tests, the quality of an assessment’s content can shape the future of each individual student, both in their test preparation and in the accurate calibration of their learning and understanding. To that end, whether you are building the assessment in a computer-based module or in a traditional pencil-and-paper format, it is imperative that your assessment items follow the model above.
To learn more, attend Share My Lesson’s free annual Ideas & Innovations Virtual Conference on March 15, 2017 at 6pm EDT for Wisewire’s webinar on crafting effective classroom assessments.