
Frameworks for Considering Item Response Demands and Item Difficulty
Kristen Huff, College Board
Steve Ferrara, CTB/McGraw-Hill

NCSA Session: Theory and Research on Item Response Demands: What Makes Items Difficult? Construct-Relevant?
June 20, 2010, Detroit





Presentation Transcript


  1. NCSA Session: Theory and Research on Item Response Demands: What Makes Items Difficult? Construct-Relevant? June 20, 2010, Detroit. Frameworks for Considering Item Response Demands and Item Difficulty. Kristen Huff, College Board; Steve Ferrara, CTB/McGraw-Hill

  2. Thesis: A coherent and comprehensive understanding of the interaction between items and examinees, and of the controllable item features that elicit predictable interactions, is needed to:
  • Design items that better measure what we intend to measure
  • Design tests better suited to supporting valid inferences about student performance
  • More generally, bridge the gap between large-scale assessment and teaching and learning
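The item-examinee interaction referenced in the thesis is commonly formalized in item response theory. The slides do not name a particular model, but a minimal sketch of the interaction, assuming the standard Rasch (one-parameter logistic) model, looks like this:

```python
import math

def rasch_p_correct(theta, b):
    """Probability that an examinee with ability theta answers an item
    of difficulty b correctly, under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability matches difficulty, the chance of a correct response is 50%.
print(rasch_p_correct(0.0, 0.0))   # 0.5

# An able examinee facing an easy item has a much higher probability.
print(rasch_p_correct(1.0, -1.0))
```

The point of the presentation's framework is to explain the difficulty parameter b in terms of controllable item features, rather than treating it as a black box estimated from response data alone.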

  3. Objective: To conduct research (and review past research) that informs the development of a framework, or conceptual structure, of item response demands. Research questions:
  • To what extent do existing item response demand schemas cover current achievement constructs?
  • What item features influence item difficulty?
  • What student responses are triggered by different item features, and how do those responses influence item difficulty?

  4. [Diagram: item features]

  5. Example item features by content area:
  • Reading comprehension: number of words, reading level, overlap between key and text
  • Math: number of variables, graphics, fractions vs. whole numbers
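The link from coded features like these to item difficulty is often studied by regressing estimated difficulties on the feature codes (in the spirit of the linear logistic test model). A minimal sketch, using made-up feature codings and difficulty values purely for illustration:

```python
import numpy as np

# Hypothetical feature codings for five reading items
# (columns: word count / 100, reading level, key-text overlap).
# Real codings and difficulties would come from an operational program.
X = np.array([
    [1.2, 3.0, 0.8],
    [2.5, 5.0, 0.4],
    [0.9, 2.0, 0.9],
    [3.1, 7.0, 0.2],
    [1.8, 4.0, 0.6],
])
b = np.array([-0.5, 0.7, -1.0, 1.4, 0.1])  # estimated item difficulties

# Ordinary least squares: how much does each feature contribute to difficulty?
X1 = np.column_stack([np.ones(len(X)), X])       # add an intercept column
weights, *_ = np.linalg.lstsq(X1, b, rcond=None)
predicted = X1 @ weights

print(weights)    # intercept plus one weight per feature
print(predicted)  # model-implied difficulties for the five items
```

Large, stable feature weights would identify the "controllable features of items that elicit predictable interactions" the thesis calls for; features with negligible weights are candidates for construct-irrelevant variation.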

  6. Framework [diagram]

  7. Framework [diagram]

  8. Conclusions and Points of Discussion:
  • Mapping the landscape of response demands, item features, and the interaction between the two is messy, difficult work
  • Achieving "coherent and comprehensive" frameworks needs to be a higher priority in research
  • Start using draft frameworks in operational testing programs now
  • We need more effective and easier ways to gauge opportunity to learn, for many reasons, but also to inform the work we are recommending
