Writing Better Objective Tests - University of Wisconsin

Writing Better Objective Tests
Bill Cerbin, UW-La Crosse, Center for Advancing Teaching & Learning

Prioritize the subject matter

Decide which subject matter (topics, ideas, concepts) is
1. Essential
2. Important
3. Worth being familiar with

Use items that measure the course/unit learning objectives.

What are the major objectives of the unit or course that should be measured by the test?

Create a Test Plan (Table of Specifications) that classifies each item according to what topic or concept it tests AND what objective it addresses. The table can help you write a test that has content validity: a match between what was taught and what is tested. The table helps ensure that you
1. emphasize the same content you emphasized in day-to-day instruction (e.g., more items about topic X and fewer about topic Y because you consider X more important and you spent more time on X)
2. align test items with learning objectives (e.g., important topics might include items that test interpretation, application, and prediction, while unimportant topics might be tested only with simpler recognition items)
3. do not overlook or underemphasize an area of content

Test Plan (Table of Specifications) Format

Subject Matter                      Learning Objectives
(concepts, topics, ideas)   Knowledge (recall)   Analyze   Apply   Interpret   Etc.   Total
Topic A
Topic B
Topic C
Topic D

Write matching pairs of items after each class.

While the class period is still fresh, write 2-3 items that focus on the most important material from the class. Ideally, write matching pairs of items, i.e., two items that test the same thing. Use one for formative evaluation (e.g., a non-credit quiz or review) and the other for summative evaluation (i.e., the test).

The question stem

1. There are two acceptable forms of stems: a) one that poses a complete, direct question, and b) one that is a statement that needs to be completed.
   – Direct question: What color results from the mixture of equal parts of yellow and blue paint?
   – Completion: The actual election of the U.S. president to office is done by ____.
2. Put all relevant information in the stem and make it sufficiently specific to stand on its own without qualification.
3. Keep the stem short and sharply focused.
4. For questions that ask for recall of information, do not introduce information into the stem that has not been part of the material covered in class.
5. Avoid negative wording because it can confuse students.
   – Bad: Which of the following is not a characteristic of Brutalism?
   – Better: Which of the following best distinguishes Brutalism from other architectural movements?

Correct/best answer

1. Write the best answer first, then the response alternatives.
2. Be sure there is only one "best" answer.

Response alternatives (AKA response options or distractors)

1. Should be brief and grammatically parallel. Make response alternatives grammatically consistent with the stem. An example of a grammatically inconsistent item:
   The functions of the Federal Reserve are to provide the nation with an elastic money supply and to
   A. help stabilize the economy
   B. correction of national income statistics
   C. correction of tax laws
   D. help levy property taxes
2. Make responses roughly equal in length. FYI: instructors tend to make the best answer longer.
3. Eliminate repetitive information from response alternatives.
   Bad: Between 1950 and 1965
   A. Interest rates increased
   B. Interest rates decreased
   C. Interest rates fluctuated greatly
   D. Interest rates did not change
   Better: What was the trend in interest rates between 1950 and 1965?
   A. Increased only
   B. Decreased only
   C. Increased, then decreased
   D. Remained unchanged
4. Take special care with, or avoid, no-exception words such as "never," "all," "none," and "always." These words tend to give clues about the correct answer. Few statements are absolute or universally true.
5. Avoid using "all of the above" as an alternative.
   If the student can identify one alternative that is not true, that eliminates both that alternative and "all of the above," making it easier to guess.
6. Vary the position of the best answer. Instructors tend to put the best answer in the B or C position.
7. Vary the number of response options as appropriate.
   A. It is acceptable practice to vary the number of response alternatives on a test.
   B. There is no single best number of response alternatives. Research indicates that 3 alternatives are just as effective as 4.
   C. If appropriate, vary the number of response alternatives within the same test.
8. Avoid using "all of the above" or "none of the above" just to fill space. An alternative to the all-of-the-above format is to ask students to find all correct answers to an item. For example: "Acceptable practices for response alternatives on multiple choice tests include"
   A. There is no single best number of response alternatives for multiple choice items
   B. Avoid using all of the above and none of the above as response alternatives
   C. Response alternatives should be plausible answers
   D. Response alternatives for a test item should be roughly equal in length
   Caveat: Scantron can't handle multiple correct responses.
9. If you have difficulty creating enough plausible incorrect answers for an item, put it on the test as a fill-in item; students will produce a range of incorrect responses that you can use next time.

CATL Colloquium Handout, Writing Better Objective Tests, November 25, 2009. Available online at http://catl.typepad.com/

Overall test construction

1. Test instructions.
   • Instruct students to select the "best answer" and acknowledge that sometimes the response options have elements of accuracy, but there is a best answer for each item. Using "correct answer" invites arguments from students that their answers are correct as well.
   • State whether there are rewards or penalties for guessing.
   – Types of rewards: students might get additional credit; the instructor allows students to justify their guesses and gives partial credit.
   – Type of penalty: the instructor deducts additional points for incorrect answers.
   – Consider using "I don't know" as a response alternative.
2. Include a few easy items at the beginning of the test. You can help reduce test anxiety by including a few easy items to start the test.
3. Group test items related to the same topics. Items related to a specific topic should be grouped together on the test. This allows students to think about each topic and section of the material rather than jump from topic to topic. Consider using labels or headings to indicate topics and then grouping related items under the headings.
4. Test length. If you want to test what students know, as opposed to how fast they can read and respond, be sure to give sufficient time to complete the test. Otherwise, you penalize poor readers, slow readers, and deliberate, reflective responders. Students can complete 1-2 multiple choice items per minute.
5. Time limits. Students with certain forms of learning disabilities need longer to complete a test. Students whose first language is not English will also need longer. While you can likely accommodate such students individually, consider carefully what educational goal would lead you to impose a time limit on a test at all. Another class following yours into the room is not a good educational reason to refuse to allow a student to finish a test. Numerous short tests or quizzes work better for most students than big, high-stakes tests.
6. Test-taking accommodations. Students with disabilities must register with Disability Resource Services (DRS) to qualify for accommodations. DRS will let you know what kinds of accommodations a student needs for test-taking.
   In addition to more time, another common need is for a separate, quiet room. Hallways are poor choices because of distractions.

Improving items using feedback and item analysis

1. Feedback from fellow instructors.
2. Feedback from students. Review test results with students; this is an opportunity to give feedback to students and to get feedback from them about test items.
3. Item analysis. If you have tests scored by IT, you can request an item analysis that provides item difficulty and item discrimination. IT has a handout that helps you interpret the analysis; ask for Revised Test Scoring Display and Item Analysis.
   A. Item difficulty: the percentage of students who correctly answer the item.

   Desirable difficulty levels for different types of items

   Question Type         Item Difficulty (% correct)
   5 response options    60%
   4 response options    62%
   3 response options    66%
   True/False            75%

   Review items at the extremes: very easy (>90% correct) or very difficult (<20% correct).

   B. Item discrimination: the relationship between how well students do on the item and how well they do overall on the test. High discrimination is good; it means that students who do well on the test tend to get the item correct, and those who do poorly on the test tend to get it incorrect.

   How Well Item Discriminates    Discrimination Value
   Very good                      >.40
   Good                           .30-.39
   Fairly good                    .20-.29
   Poor                           <.20

   High discrimination (>.40): students with high test scores responded correctly and students with low scores responded incorrectly. Very low discrimination (<.20): students with high test scores do poorly on the item, and low-scoring students may do better.

Examples of Different Types of Multiple Choice Items

Multiple choice questions ask students to discriminate among different plausible options and select the best answer. Typically, these items are written at a basic level, and students can answer the question based on recognition or recall memory. There is nothing wrong with items that test for basic knowledge or familiarity. But what if you want to test more complex objectives such as understanding, problem solving, or analytical reasoning? To test complex learning objectives you can either 1) use different forms of items (such as essay, short answer, or performance tasks) or 2) develop multiple choice items that address the objectives.

Multiple choice questions that measure complex learning objectives

It is challenging to write multiple choice questions that test complex learning. Typically these items present the student with a scenario, passage, or graphic representation and then pose several questions related to the material. These can be used to assess complex thinking abilities such as prediction, interpretation, application, or evaluation. Here are some alternative types of multiple choice items (Davis, 2009, pp. 391-393).

Prediction question. In the stem of the question, present a problem or situation and ask students to predict an outcome.

Identify-a-principle-or-theory question. In the stem of the question, present an example of some phenomenon and ask students to identify a principle or theory that it illustrates.
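Before turning to further item formats, note that the difficulty and discrimination statistics from the item-analysis section above are also easy to compute yourself when a test is not machine-scored. Below is a minimal sketch, using hypothetical 0/1 scored responses; discrimination is approximated here as a point-biserial correlation between each item and the rest-of-test score, which is one common definition, though a scoring service may use a different formula.

```python
from statistics import mean, pstdev

def item_analysis(responses):
    """For each item, return (difficulty, discrimination).

    Difficulty: proportion of students answering correctly.
    Discrimination: point-biserial correlation between the item score
    and the rest-of-test score (total minus the item itself).
    """
    totals = [sum(row) for row in responses]
    stats = []
    for i in range(len(responses[0])):
        item = [row[i] for row in responses]
        difficulty = mean(item)
        rest = [t - x for t, x in zip(totals, item)]
        rest_mean = mean(rest)
        cov = mean((x - difficulty) * (r - rest_mean)
                   for x, r in zip(item, rest))
        denom = pstdev(item) * pstdev(rest)
        stats.append((difficulty, cov / denom if denom else 0.0))
    return stats

# Hypothetical data: rows = students, columns = items (1 = correct).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

for i, (p, d) in enumerate(item_analysis(responses), start=1):
    print(f"item {i}: difficulty={p:.2f}, discrimination={d:.2f}")
```

Correlating each item against the rest-of-test score, rather than the full total, avoids inflating the correlation on short tests, since the item would otherwise be correlated partly with itself.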
Assertion-reason questions (ARQ). The stem of these questions includes "two statements—an assertion and a reason—linked by because" (Davis, 2009, p. 393).

You-are-the-teacher question. The stem of the question asks the student to assume the role of teacher and then evaluate short written passages (i.e., flawed short test answers).

Multiple choice + short written justification. If you want to know the reasoning behind students' answers, ask them to write a short justification of why they selected their specific choice.

ConcepTests. Harvard physicist Eric Mazur uses ConcepTests to evaluate students' conceptual understanding. Typically, the stem of the item includes a diagram, picture, or description of a situation that illustrates a physics concept and asks the student to predict an outcome. Examples appear in Mazur, E. (1997). Peer instruction: A user's manual. Upper Saddle River, NJ: Prentice Hall.

Handout References

Davis, B. G. (2009). Tools for teaching (2nd ed.). San Francisco: Jossey-Bass.
Slavin, R. E. (2000). Educational psychology: Theory and practice (6th ed.). Boston, MA: Allyn and Bacon.
Stiggins, R. J. (1994). Student-centered classroom assessment. New York: Macmillan.
Wiggins, G. (1998). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.

Available from IT: Revised Test Scoring Display and Item Analysis.

Online resource related to writing objective tests

http://testing.byu.edu/info/handbooks/betteritems.pdf
Steven J. Burton, et al.
"How to Prepare Better Multiple-Choice Test Items: Guidelines for University Faculty," Brigham Young University Testing Services and The Department of Instructional Science, 1991. All you need in one place: it explains the advantages and limitations of multiple-choice tests, how to decide whether you should use such tests, and how to write questions that measure more complex types of learning than recall of facts, and it provides a variety of formats for multiple-choice questions. It even has a checklist at the end.