
What Item Level Data Tell Us About Universal Design: Fantasy, Foolishness, or Fuel for Fire? in Large-Scale Assessments

Presenters: Martha Thurlow, NCEO; Liru Zhang, Department of Education, Delaware; Karen Barton, Research Triangle Institute, International


Presentation Transcript


  1. What Item Level Data Tell Us About Universal Design: Fantasy, Foolishness, or Fuel for Fire? in Large-Scale Assessments National Center on Educational Outcomes

  2. Presenters: Martha Thurlow, NCEO; Liru Zhang, Department of Education, Delaware; Karen Barton, Research Triangle Institute, International; Brent Garrett, Alliance for Systems Change, MSRRC

  3. Title I Regulations introduced the need for universally designed assessments – [Assessments must be] designed to be accessible and valid with respect to the widest possible range of students, including students with disabilities and students with limited English proficiency. Sec. 200.2(b)(2)

  4. Proposed Reauthorization of IDEA includes universal design concepts as well – UNIVERSAL DESIGN – The State educational agency (or, in the case of a districtwide assessment, the local educational agency) shall, to the extent possible, use universal design principles in developing and administering any assessments…. Sec. 612, Senate Bill 1248

  5. A Quick Definition: Universally designed assessments are built from the beginning to be accessible and valid for the widest range of students.

  6. Five categories for consideration of item bias grew out of an international effort (SENDD subgroup of OECD):
  • Bias due to item layout and design
  • Bias due to differential acculturation/lack of experience directly related to disability
  • Bias due to unnecessary or extraneous complexity of item language or length, including bias due to memory load and bias due to verbal load

  7. Five categories for consideration of item bias grew out of an international effort (SENDD subgroup of OECD):
  • Bias due to presentation or response design that differentially affects users of assistive technology or alternate formats
  • Bias due to use of manipulatives that differentially affect students, irrelevant to the construct being measured

  8. Another effort involved reviewing and piecing together information from other fields:
  • Vision
  • Ergonomics
  • Graphic Design
  • Architecture

  9. From this review, NCEO identified a set of elements of universally designed assessments
  • They relate to different points in the item development process – some of these are the responsibility of item writers, some are the responsibility of test contractors, and some are the responsibility of both
  • Some have more research behind them than others – so far

  10. Elements of Universally Designed Assessments
  • Inclusive assessment population
  • Precisely defined constructs
  • Items developed and reviewed for bias and accessibility
  • Amenable to accommodations

  11. Elements of Universally Designed Assessments
  • Simple, clear, and intuitive instructions and procedures
  • Maximum readability/comprehensibility
  • Maximum legibility: text, graphs, tables, illustrations, and response formats

  12. “Four holds on one of the rock climbing walls are labeled on the diagram below. Matthew first climbs vertically 10 feet from Hold A to Hold B, horizontally 25 feet from Hold B to Hold C, and then vertically 15 feet from Hold C to Hold D. How many fewer feet would Matthew have climbed if he had climbed directly from Hold A to Hold D?”
  Is the use of “hold” as a noun familiar to students? Is the concept of a “rock climbing wall” familiar to most students? Will students be distracted by the odd shapes on the diagram?

  13. Amenable to Accommodations: Could this item be presented in an alternate format? Braille? Is the high number of items on the map and long list of cities necessary to respond to this item?
  “According to this weather page, which place is the warmest on December 28?”
  “If you were flying to Chicago the day this weather page was printed, what information could you learn for your trip from this page?”

  14. Legible graphs, tables, illustrations: What is that big black rectangle?

  15. Is the border distracting?

  16. Element #5: Simple, clear, and intuitive instructions and procedures. Are the swimmers at the bottom of the page distracting?

  17. Abedi’s research suggested that the linguistic complexity of test items was a significant source of measurement error for ELL students (and students with disabilities). Examples of linguistic modifications:
  • Familiarity or frequency of non-math vocabulary (unfamiliar or infrequent words changed): “census” → “video game”
  • Length of nominals (long nominals shortened): “Last year’s class vice president” → “The vice president”
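A vocabulary screen like the one described on this slide can be partially automated. The sketch below flags words in an item stem that fall outside a familiar-word list; the function name, regular expression, and toy word list are illustrative assumptions, not part of Abedi's method (a real screen would draw on a graded word-frequency corpus).

```python
import re

def flag_unfamiliar_words(item_text, familiar_words):
    """Return the words in item_text that are absent from familiar_words."""
    # Lowercase and split on letters/apostrophes so "year's" stays one token
    words = re.findall(r"[a-z']+", item_text.lower())
    return [w for w in words if w not in familiar_words]

# Toy familiar-word list covering the slide's "long nominal" example
FAMILIAR = {"last", "year's", "class", "vice", "president", "the"}

flag_unfamiliar_words("Last year's class vice president", FAMILIAR)  # → []
flag_unfamiliar_words("the census", FAMILIAR)  # → ['census']
```

A length-of-nominals check could be layered on the same tokenization by counting consecutive noun-phrase words, but that requires a part-of-speech tagger and is omitted here.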

  18. Many of the Universal Design elements directly address the characteristics of test items.
  • Is there a way to tell which items may be a problem without looking at all of the items?
  • We have procedures that we use to identify potential bias for other subgroups – will these work for students with disabilities?

  19. Universal Design Project at NCEO has suggested three ways to flag items:
  • Statistical analysis of test results (or field test results)
  • Expert review using universal design considerations
  • Think aloud (cognitive labs) with students

  20. Focus Today – Statistical Analysis of Test or Field Test Results
  • Do these approaches work for identifying problematic items? What are the pros and cons of each approach?
  • Should we instead do as one TAC said – just look at all items?
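One common statistical approach to flagging items from test or field-test results (the slides do not commit to a specific method) is differential item functioning (DIF) analysis. The sketch below computes the Mantel-Haenszel common odds ratio for one studied item, stratifying students by total test score; the function name, group labels, and thresholds are illustrative assumptions.

```python
from collections import defaultdict
from math import log

def mantel_haenszel_dif(total_scores, groups, item_correct):
    """Mantel-Haenszel common odds ratio (and ETS delta) for one item.

    total_scores: total test score per student (the stratifying variable)
    groups:       "ref" or "focal" per student
    item_correct: 1 (correct) or 0 (incorrect) on the studied item
    """
    # Build one 2x2 table (group x correct/incorrect) per score stratum
    tables = defaultdict(lambda: [[0, 0], [0, 0]])
    for s, g, c in zip(total_scores, groups, item_correct):
        row = 0 if g == "ref" else 1
        col = 0 if c == 1 else 1
        tables[s][row][col] += 1

    num = den = 0.0
    for (a, b), (c, d) in tables.values():
        n = a + b + c + d
        if n:
            num += a * d / n  # ref-correct * focal-incorrect
            den += b * c / n  # ref-incorrect * focal-correct
    alpha = num / den           # common odds ratio; 1.0 means no DIF
    delta = -2.35 * log(alpha)  # ETS delta scale; |delta| >= 1.5 is often flagged
    return alpha, delta
```

With a single stratum in which the reference group answers correctly half the time and the focal group a quarter of the time, the odds ratio is 3.0, which would flag the item for expert review rather than prove it biased; DIF statistics identify candidates, not causes.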
