Dynamic Learning Maps Alternate Assessment Consortium


Presentation Transcript


  1. Dynamic Learning Maps Alternate Assessment Consortium • Laura Kramer • Center for Educational Testing and Evaluation • University of Kansas • March 3, 2011

  2. Overview • The Race to the Top Assessment competition focused on the general assessment. • OSEP released a grant competition in mid-June for an alternate (“1%”) assessment. • KSDE approached CETE in late June about applying for the grant. • Of the 5 applications submitted, only 2 received awards.

  3. Consortia Collaboration • The other grantee under this competition is the National Center and State Collaborative. • To maximize resources, the two consortia are exploring ways to work together. • To maintain coherence in state assessment systems, they will also explore working with the Race to the Top Assessment consortia (SBAC and PARCC).

  4. State Participants Iowa Kansas Michigan Mississippi Missouri New Jersey North Carolina Oklahoma Utah Virginia* West Virginia Washington* Wisconsin

  5. Other Participants • University of Kansas • Beach Center on Disability • Center for Educational Testing and Evaluation • Center for Research Methods and Data Analysis • Center for Research on Learning • Faculty in several departments • AbleLink Technologies • The Arc • The Center for Literacy and Disability Studies at the University of North Carolina at Chapel Hill • Edvantia

  6. Major Tasks • Creating Common Core Essential Elements and achievement level descriptors (ALDs) • Development and validation of learning maps • Creation of instructionally relevant item types • Technology development • Item and assessment development • Standard setting • Professional development • Instructional consequences • Family engagement and dissemination

  7. Essential Elements and Learning Maps • Use Common Core State Standards as the starting point. • Derive key concepts in iterative fashion. • Break down skills in CCSS and identify multiple (somewhat) hierarchical pathways. • Review nodes with educators and content experts to ensure that links to CCSS are maintained.
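Concretely, a learning map of this kind can be thought of as a directed graph: skill nodes linked by precursor edges, with more than one pathway into a given skill. A minimal sketch in Python, using invented node names (not actual DLM map content) to show the structure:

```python
from collections import defaultdict

# Toy skill nodes, invented for illustration; real maps contain far
# more fine-grained nodes derived from the CCSS. Each edge points
# from a precursor skill to the skill it supports.
edges = [
    ("recognize_symbols", "identify_letters"),
    ("recognize_symbols", "match_picture_to_word"),
    ("identify_letters", "decode_words"),
    ("match_picture_to_word", "decode_words"),  # alternate pathway
    ("decode_words", "read_sentences"),
]

precursors = defaultdict(set)   # skill -> skills that feed into it
successors = defaultdict(set)   # skill -> skills it supports
for src, dst in edges:
    precursors[dst].add(src)
    successors[src].add(dst)

# Multiple (somewhat) hierarchical pathways: decoding can be reached
# through letter identification OR picture-word matching.
print(sorted(precursors["decode_words"]))
# ['identify_letters', 'match_picture_to_word']
```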

  8. Purposes • Establish consistency in expectations • Emphasize skill similarities within diverse ways of performing • Provide instructional guidance • Connect instruction and assessment • Accommodate diverse disabilities

  9. Dynamic Learning Maps • The goal – proficiency on the CCSS • Like driving from DC to San Francisco… • Won’t always take the shortest or most direct route • Scenic routes, through small towns, occasional detours, might spend some extra time in one place or another (or go back and revisit a town) • But we keep going!

  10. Dynamic Learning Maps • Assessment integrated with instruction • Multiple waypoints with detailed feedback • Between St. Louis and Denver vs. between Lawrence and Topeka • Identifying specific aspects of student mastery to pinpoint what has been achieved, or what still needs work • Different paths • Driving conditions differ, which may lead you to choose a different route • Students don’t all travel the same roads, so DLM provides many routes for students to demonstrate mastery

  11. Dynamic Learning Maps • Focus on what students CAN do • Identify gaps: what students cannot do (yet) • Provide standardized scaffolds to unpack which precursor skills are missing • Offer logical next steps for instruction / skill-building
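Building on the same toy graph idea, here is an illustrative traversal for the "logical next steps" notion: find unmastered skills on the way to a goal whose own precursors are already in place. This is a sketch only; DLM's real diagnostic routing is probabilistic, and real maps distinguish required from alternate precursors, while this sketch treats every precursor as required.

```python
# Invented toy map (skill -> set of precursor skills).
precursors = {
    "read_sentences": {"decode_words"},
    "decode_words": {"identify_letters", "match_picture_to_word"},
    "identify_letters": {"recognize_symbols"},
    "match_picture_to_word": {"recognize_symbols"},
    "recognize_symbols": set(),
}

def next_targets(goal, mastered):
    """Skills on the path to `goal` that the student has not mastered
    but whose own precursors are all in place -- logical next steps."""
    ready, stack, seen = set(), [goal], set()
    while stack:
        node = stack.pop()
        if node in seen or node in mastered:
            continue
        seen.add(node)
        gaps = [p for p in precursors[node] if p not in mastered]
        if gaps:
            stack.extend(gaps)   # look further back in the map
        else:
            ready.add(node)      # all precursor skills are in place
    return ready

mastered = {"recognize_symbols", "identify_letters"}
print(next_targets("read_sentences", mastered))
# {'match_picture_to_word'}
```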

  12. So Far… • Developing Learning Maps and Essential Elements • Developing new technology to deliver instructional tasks • Identifying instructionally relevant and sensitive item types • Preparing first professional development modules

  13. Coming Soon… • From Essential Elements, develop assessment achievement level descriptors, instructional achievement descriptors, and examples • Develop assessment tasks based on nodes of the Learning Maps, followed by reviews including cognitive labs, pilot testing, and field testing • Enhance assessment task delivery through the technology platform, including “built-in” accommodations and accessibility

  14. Item Development • Development of instructionally relevant item types • Focus groups with master educators • Review by special education, assessment, and technology experts • Use of evidence centered design • Student model (What skills in the learning maps should be assessed?) • Evidence models (What behaviors will provide necessary evidence?) • Task models (What tasks will elicit the behaviors required for evidence gathering?)
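The three evidence-centered design models named above can be made concrete as simple record types. The field names and example content below are assumptions for illustration, not DLM's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class StudentModel:
    """Which learning-map skills (nodes) an item is meant to assess."""
    target_nodes: list

@dataclass
class EvidenceModel:
    """What observable behaviors count as evidence, and how they are
    scored back onto the student model."""
    observables: list
    scoring_rule: str

@dataclass
class TaskModel:
    """What a task must look like to elicit those behaviors."""
    stimulus: str
    response_mode: str          # e.g. touch, eye gaze, switch
    supports: list = field(default_factory=list)

# A hypothetical early-reading item described with all three models.
reading_item = (
    StudentModel(target_nodes=["match_picture_to_word"]),
    EvidenceModel(observables=["selects the picture matching a spoken word"],
                  scoring_rule="1 if the target picture is selected, else 0"),
    TaskModel(stimulus="spoken word plus three picture choices",
              response_mode="touch",
              supports=["audio replay"]),
)
print(reading_item[2].response_mode)   # touch
```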

  15. Item Development • Creation of item and item pool specifications • Align items to learning maps (not just a 1-to-1 correspondence!) • Develop items and standardized scaffolds so that incorrect responses lead to further diagnostic inquiry • Cognitive labs with students, teachers, and parents • Review by educators with content expertise and special education expertise • Other internal and external reviews (bias, sensitivity, editorial, etc.) • Pilot testing • Census field testing (last year of grant)
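The "not just a 1-to-1 correspondence" alignment point is easy to picture as a many-to-many mapping: one item can bear on several map nodes, and one node can be covered by many items. A hypothetical sketch (item IDs and node names invented):

```python
from collections import defaultdict

# Hypothetical item-to-node alignment (many-to-many).
item_alignment = {
    "item_001": ["decode_words", "identify_letters"],
    "item_002": ["decode_words"],
    "item_003": ["match_picture_to_word", "decode_words"],
}

# Invert the alignment to inspect pool coverage per node -- useful
# for checking item pool specifications against the map.
node_coverage = defaultdict(list)
for item, nodes in item_alignment.items():
    for node in nodes:
        node_coverage[node].append(item)

print(node_coverage["decode_words"])
# ['item_001', 'item_002', 'item_003']
```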

  16. Next Generation Assessment System Planned New Assessment Delivery Features • Probabilistic model (Bayesian network-based) for item selection added to existing linear, item-level-adaptive, testlet-adaptive, and multi-dimensional IRT adaptive test models • Constructed response and other new item types • Item Scoring • Keyword lists • Numerical responses • Hot spots • Drag and drop • Others to be determined
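To give a feel for Bayesian-network-based item selection, here is a deliberately tiny sketch for a single binary skill node: maintain a mastery belief, update it with Bayes' rule after each response, and pick the item expected to reduce uncertainty the most. The slip/guess parameters and their values are illustrative assumptions; a real Bayesian network would link many skill nodes.

```python
import math

def entropy(p):
    # Uncertainty (in bits) of a binary mastery belief.
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def p_correct(p_mastery, slip, guess):
    # Chance of a correct response given the current mastery belief.
    return p_mastery * (1 - slip) + (1 - p_mastery) * guess

def posterior(p_mastery, correct, slip, guess):
    # Bayes update of the mastery belief after one observed response.
    like_m = (1 - slip) if correct else slip
    like_u = guess if correct else (1 - guess)
    num = like_m * p_mastery
    return num / (num + like_u * (1 - p_mastery))

def pick_item(p_mastery, items):
    # Choose the item whose response is expected to shrink our
    # uncertainty the most (maximum expected information gain).
    def expected_entropy(item):
        s, g = item["slip"], item["guess"]
        pc = p_correct(p_mastery, s, g)
        return (pc * entropy(posterior(p_mastery, True, s, g))
                + (1 - pc) * entropy(posterior(p_mastery, False, s, g)))
    return min(items, key=expected_entropy)

items = [
    {"id": "easy",   "slip": 0.05, "guess": 0.40},
    {"id": "medium", "slip": 0.10, "guess": 0.25},
    {"id": "hard",   "slip": 0.20, "guess": 0.10},
]
print(pick_item(0.5, items)["id"])   # 'hard' is most informative here
```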

  17. Next Generation Assessment System Planned New Assessment Delivery Features • Enhanced accommodation/universal design capabilities including (but not limited to) • Audio via sound files • American Sign Language video • Pop-up context-dependent dictionaries/glossaries • Text and image magnification • On-screen note taking • Color overlays • IntelliKeys™ keyboard accessibility • Masking • Section 508, QTI, APIP, and SCORM compliant • Proctoring and real-time results monitoring capabilities
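One way to picture the "built-in" accommodation capabilities is as a per-student profile the delivery platform consults at test time, in the spirit of an APIP personal needs profile. The sketch below is purely hypothetical (APIP itself defines an XML schema, not these field names):

```python
# Hypothetical per-student accessibility profile; field names are
# invented for illustration and do not reflect the APIP spec.
profile = {
    "student_id": "demo-001",
    "audio": {"read_aloud": True, "format": "sound_file"},
    "asl_video": False,
    "magnification": {"text": 2.0, "images": 1.5},
    "color_overlay": "yellow",
    "masking": True,
    "on_screen_notes": True,
    "keyboard": "IntelliKeys",
}

def active_supports(profile):
    """List the supports the delivery platform should switch on."""
    return [k for k, v in profile.items()
            if k != "student_id" and v not in (False, None)]

print(active_supports(profile))
```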

  18. Professional Development Topics • Emphasis on instruction • UDL principles as they relate to students with significant intellectual disabilities • Integration of (extended) standards, maps, and the assessment process • Relationship with goal setting and IEP development

  19. THANK YOU! For more information, please contact: Neal Kingston (nkingsto@ku.edu) or Alan Sheinker (alans@ku.edu)