
Presentation Transcript


  1. A Multistage Adaptive and Accessible Reading Assessment for Accountability Cara Cahalan Laitusis ETS

  2. ETS Contributors • Branden Hart • Teresa King* • Skip Livingston • Pavan Pillarisetti • Kitty Sheehan • Elizabeth Stone* • Klaus Zechner • Linda Cook* • Kelly Bruce • Jennifer Dean • Dan Eignor • Lois Frankel • Gena Gourley • Eric Hansen

  3. DARA Goal 4 • Field test a multi-stage, component-based reading assessment • Reduce the number of students performing at “chance level” • Allow students to show what they know • Push instruction to include both comprehension and reading fluency for students with reading-based LD

  4. DARA Test Design

  5. Accessibility Elements • Students with disabilities included in the pilot test • Higher-interest passages selected based on student ratings • Single-column question format (increased white space and reduced wrapping of text) • Included a “context” sentence • Panel of disability experts reviewed items and suggested revisions (e.g., simplified language)

  6. Data Collection Design

  7. Primary Research Questions • For accountability purposes, is it possible to combine scores from the two different routes on the Component test (i.e., average scores from Test 1 and Test 2)? • Is the Component test more accessible than the state assessment? • Do RLD students do better on the Component test than on the state assessment, while students without disabilities (NLD) perform similarly on both assessments?

  8. Other Research Questions • Can we reduce the number of students scoring at chance level? • Can we use automated scoring technology (SpeechRater) to score the oral reading fluency measure? • Can we accurately route students based on 7, 14, 21, and 28 items? • What is the best measure of oral reading fluency? • How do we combine fluency and comprehension test scores (50/50, 25/75, 75/25)?

  9. Sample • 8th-grade students • 26 middle schools • 294 RLD (final sample = 275) • 194 LP (not included in this presentation) • 500 Non-Disabled (final sample = 486)

  10. Description of Sample by NLD/RLD: race, gender, and cut score impact

  11. Test Score Summaries: Route 1

  12. Test Score Summaries: Route 2

  13. Primary Research Questions • For accountability purposes, is it possible to combine scores from the two different routes on the Component test (i.e., average scores from Test 1 and Test 2)? YES • Is the Component test more accessible than the state assessment? • Do RLD students do better on the Component test than on the state assessment, while students without disabilities (NLD) perform similarly on both assessments? YES, for Route 1

  14. Can we reduce the number of students scoring at chance level?
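
A student scores “at chance level” when the score is no better than random guessing on multiple-choice items. Below is a minimal sketch of how such a cut might be derived under a binomial guessing model; the item count, option count, and the binomial criterion itself are illustrative assumptions, not the study’s definition.

```python
from math import comb

def chance_level_cut(n_items: int, n_options: int, alpha: float = 0.05) -> int:
    """Smallest number-correct score that a purely guessing student would
    reach or exceed with probability less than alpha (binomial model).

    The binomial criterion and alpha value are illustrative assumptions.
    """
    p = 1.0 / n_options

    def sf(k):  # P(X >= k) for X ~ Binomial(n_items, p)
        return sum(comb(n_items, i) * p**i * (1 - p)**(n_items - i)
                   for i in range(k, n_items + 1))

    for cut in range(n_items + 1):
        if sf(cut) < alpha:
            return cut
    return n_items + 1

# Hypothetical example: 40 four-option items; the expected score under pure
# guessing is 10, and the cut below is a score unlikely (<5%) by guessing alone.
print(chance_level_cut(n_items=40, n_options=4))
```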

  15. Can we accurately route students based on 7, 14, 21, and 28 items? (Routing decision)
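
The routing analysis asks whether the decision made after only the first 7, 14, 21, or 28 items agrees with the decision based on the full routing test. A minimal sketch of a number-correct routing rule is shown below, assuming a simple proportion-correct cut; the 0.5 threshold, route labels, and sample counts are hypothetical, not the DARA operational rule.

```python
def route(number_correct: int, items_administered: int, cut_fraction: float = 0.5) -> str:
    """Assign a student to Route 1 or Route 2 based on the proportion correct
    after a given number of routing items.

    The 0.5 cut and the route labels are illustrative assumptions.
    """
    proportion = number_correct / items_administered
    return "Route 1" if proportion >= cut_fraction else "Route 2"

# Compare the decision after 7, 14, 21, and 28 items with the decision based
# on the full routing test to estimate early-routing accuracy (counts are made up).
full_decision = route(number_correct=17, items_administered=28)
for n_items, n_correct in [(7, 4), (14, 9), (21, 13), (28, 17)]:
    early_decision = route(n_correct, n_items)
    print(n_items, early_decision, early_decision == full_decision)
```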

  16. Can we use automated scoring technology (SpeechRater) to score the oral reading fluency measure? Fluency Test: Human vs. Automated Scoring
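
Whether the fluency test is scored by human raters or by automated technology such as SpeechRater, the reported measures reduce to a few rates computed from word counts and elapsed time. The sketch below computes the candidate measures named on the next slide from raw counts; the function name and example numbers are illustrative and do not reflect SpeechRater’s actual interface or the study’s data.

```python
def fluency_measures(words_read: int, errors: int, seconds: float) -> dict:
    """Compute common oral reading fluency measures from raw counts.

    words_read: total words attempted in the timed reading
    errors:     words read incorrectly (per the scoring rubric)
    seconds:    elapsed reading time
    """
    minutes = seconds / 60.0
    words_correct = words_read - errors
    return {
        "words_per_minute": words_read / minutes,
        "corrected_words_per_minute": words_correct / minutes,
        "percent_correct": 100.0 * words_correct / words_read,
    }

# Hypothetical example: 142 words attempted with 6 errors in the first 60 seconds.
print(fluency_measures(words_read=142, errors=6, seconds=60))
```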

  17. Future Questions for Study and Policy • Q: What is the best measure of oral reading fluency? • Corrected words per minute in the 1st minute • Words per minute, corrected words per minute, percent correct, rating • Q: How do we combine comprehension and fluency scores? • 25% fluency + 75% comprehension • 50/50, 75/25
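
The weighting question amounts to a convex combination of the two scores once they are placed on a common scale. A minimal sketch, assuming already-scaled scores; the weights match the options on the slide, and the example score values are hypothetical.

```python
def composite_score(fluency: float, comprehension: float, fluency_weight: float = 0.25) -> float:
    """Combine fluency and comprehension scores that are already on a common scale.

    fluency_weight = 0.25 corresponds to the 25% fluency / 75% comprehension
    option; 0.50 and 0.75 give the other two weightings from the slide.
    """
    return fluency_weight * fluency + (1.0 - fluency_weight) * comprehension

# Hypothetical scaled scores for one student under each proposed weighting.
for w in (0.25, 0.50, 0.75):
    print(w, composite_score(fluency=48.0, comprehension=62.0, fluency_weight=w))
```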

  18. Contact Information • Cara Cahalan Laitusis, Senior Research Scientist, Educational Testing Service, Mailstop 09R, Princeton, NJ 08541, CLaitusis@ETS.org

  19. Extra Slides

  20. Test Score Correlations: Route 1 (NLD and RLD)

  21. Test Score Correlations: Route 2 (NLD and RLD)
