
Presentation Transcript


  1. Michigan Assessment Consortium Common Assessment Development Series Module 9B – Editing the Draft Test Items

  2. Developed and Narrated by Edward Roeber, Professor, MQM, Michigan State University

  3. Support The Michigan Assessment Consortium professional development series in common assessment development is funded in part by the Michigan Association of Intermediate School Administrators in cooperation with …

  4. In this module, you will learn: • Why item editing is needed • What common errors item writers make • How to edit the draft items submitted • How to prepare the assessments for field testing/pilot testing

  5. Why is Item Editing Needed? • Most item writers are not highly proficient at writing items • They make a variety of errors that need to be corrected if an item is to be used • Even good item writers can make mistakes or create poor items, which they can’t see because they are so “wrapped up” in item development

  6. Common Errors Made by Writers • There are a number of errors you may see: • The correct answer is the longest response • The stem is unnecessarily wordy • The stem may be poorly worded or confusing • Some incorrect answers are not plausible • Written-response items may not yield scorable responses

  7. Common Errors Made by Writers • Other errors you may see: • Unnecessary or confusing stimulus materials • Lack of alignment between the item and the intended standard or expectation • Unclear scoring rubrics for written-response items • One item may cue the correct response to another

  8. How to Edit the Items • Editing typically involves three steps: • Assessment edits • Content review • Bias and sensitivity review • These might be done by one person or by different individuals • An advisory or review group might also help out

  9. Assessment Edits • Someone with experience in developing and editing items should do these edits • This may be one person or more than one • The goal is to correct the issues raised in the earlier slides • Shorten, simplify, and straighten out the items, one at a time • Discard items that appear to be unfixable (e.g., when a complete rewrite would be needed to salvage the item)

  10. Content Review • Content review is needed to make sure that the final item pool does not contain any items whose keyed answers are incorrect or inaccurate, according to content experts • This may be a review panel of content-area experts, or just one specialist • This review should occur after the assessment specialist’s edits are complete but before the items are field tested

  11. Bias and Sensitivity Reviews • This review is essential to ensure that the items contain no systematic bias and no material that might be deemed offensive to a subgroup of test takers • The challenge is to come up with interesting topics for the questions without using content that might be offensive

  12. Bias and Sensitivity Reviews • If items that appear to be biased are found, they need to be revised or dropped • This is especially true of items that delve into sensitive areas, such as politics, religion, and so forth • Use reviewers who know how to spot bias and sensitive topics, not just members of minority groups

  13. How to Prepare the Items for Field Testing/Pilot Testing • Before the items are used operationally, they need to be tried out: • Pilot tested with small samples of students • Field tested with representative groups of students • Tryouts are essential for any set of items that will be used in high-stakes decisions about students, such as promotion or graduation (for ethical, policy, and legal reasons)

  14. How to Prepare the Items for Field Testing/Pilot Testing • Once the assessment edits, content review, and bias/sensitivity reviews have been completed, the item pool should be updated so that all changes to the items are captured • Ideally, the items would be entered into an item bank by this time (or earlier) • The pilot or field test design will dictate how many forms are needed, how many items each form contains, and how many different skills those items measure • The item bank should be used to assemble the needed forms
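(Illustrative note, not part of the module: slide 14 describes assembling pilot/field-test forms from an item bank but names no particular tool. The minimal sketch below, in plain Python, shows one way to draw a fixed number of items per skill into each form without reusing an item across forms. The field names item_id and skill, the example skills, and the assemble_forms function are hypothetical and shown only to make the assembly step concrete.)

```python
import random
from collections import defaultdict

# Hypothetical item bank: each record carries an item ID and the skill
# (standard/expectation) the item is written to measure. Field names and
# skills are illustrative only; the module does not prescribe a format.
item_bank = [
    {"item_id": f"ITEM-{n:03d}", "skill": skill}
    for n, skill in enumerate(["fractions", "decimals", "ratios"] * 4, start=1)
]


def assemble_forms(bank, num_forms, items_per_skill, seed=0):
    """Assemble pilot/field-test forms by drawing items_per_skill items
    for each skill into every form, without reusing an item across forms."""
    rng = random.Random(seed)

    # Group the bank by skill and shuffle within each group.
    by_skill = defaultdict(list)
    for item in bank:
        by_skill[item["skill"]].append(item)
    for items in by_skill.values():
        rng.shuffle(items)

    forms = []
    for f in range(num_forms):
        form = []
        for skill, items in by_skill.items():
            if len(items) < items_per_skill:
                raise ValueError(f"Not enough '{skill}' items left for form {f + 1}")
            # pop() removes each drawn item so it cannot appear on another form
            form.extend(items.pop() for _ in range(items_per_skill))
        forms.append(form)
    return forms


if __name__ == "__main__":
    # Example design: 2 forms, 2 items per skill on each form.
    for i, form in enumerate(assemble_forms(item_bank, num_forms=2, items_per_skill=2), start=1):
        print(f"Form {i}:", [item["item_id"] for item in form])
```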
