AT, Accessibility, and High-stakes Assessments

  1. AT, Accessibility, and High-stakes Assessments Dave Edyburn, Ph.D. University of Wisconsin - Milwaukee

  2. Purpose • The purpose of today’s session is to update participants on the latest developments concerning the design and implementation of accessible high-stakes assessments.

  3. Learning Outcomes • 1. Participants will be able to identify the consortia to which their state belongs and demonstrate how to access relevant policy and implementation documents. • 2. Participants will be able to facilitate local conversations concerning roles and responsibilities for assistive technology interventions during computer-based assessments.

  4. Overview of the Consortia

  5. Two Groups of Consortia • In 2010, the U.S. Department of Education funded four consortia to create new high-stakes assessments aligned with the Common Core State Standards (CCSS). • Two consortia were selected to design the general assessments. • Two consortia were selected to design the alternate (1%) assessments for students with the most significant cognitive disabilities.

  6. DLM Dynamic Learning Maps (DLM) http://dynamiclearningmaps.org/ 18 Member States: Alaska, Colorado, Illinois, Iowa, Kansas, Michigan, Mississippi, Missouri, New Jersey, North Carolina, North Dakota, Oklahoma, Utah, Vermont, Virginia, Washington, West Virginia, and Wisconsin.

  7. DLM • Focus • The DLM system is accessible to students with significant cognitive disabilities, including those who also have hearing or visual disabilities and/or neuromuscular, orthopedic, or other motor disabilities. DLM assessments are flexible: they allow for the use of common assistive technologies in addition to keyboard, mouse, and touch-screen input. • Testing Platform • KITE is the computer application used to deliver DLM assessments to students. Educator Portal is the dashboard where educators can manage student data, access professional development resources, receive test information, and view reports. KITE Client is the web-based interface used by students to take tests. • http://dynamiclearningmaps.org/content/kite

  8. NCSC The National Center and State Collaborative Partnership (NCSC) http://www.ncscpartners.org/ 13 Member States: Arizona, Connecticut, District of Columbia, Florida, Indiana, Louisiana, Pacific Assessment Consortium (PAC‐6), Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, and Wyoming. Tier II affiliated states: Arkansas, California, Delaware, Idaho, Maine, Maryland, Montana, New Mexico, New York, Oregon, and the US Virgin Islands.

  9. NCSC • Focus • The NCSC GSEG project is led by five (5) national centers and twenty-six (26) states (fifteen core and eleven Tier II states) to build an alternate assessment based on alternate achievement standards (AA-AAS) for students with the most significant cognitive disabilities in grades 3 through 8 and once in high school (grade 11). The goal of the NCSC GSEG project is to ensure that students with the most significant cognitive disabilities achieve increasingly higher academic outcomes and leave high school ready for post-secondary options. • Testing Platform • The National Center and State Collaborative (“NCSC”) General Supervision Enhancement Grant (“GSEG”) Project has announced that CTB/McGraw-Hill was awarded a contract to provide a comprehensive technology system to support the summative assessment for students with significant cognitive disabilities.

  10. PARCC Partnership for Assessment of Readiness for College and Careers (PARCC) http://www.parcconline.org/ Member States: Arkansas, Colorado, District of Columbia, Illinois, Louisiana, Maryland, Massachusetts, Mississippi, New Jersey, New Mexico, New York, Ohio, Rhode Island

  11. PARCC • Focus • PARCC is based on the core belief that assessment should work as a tool for enhancing teaching and learning. Because the assessments are aligned with the new, more rigorous Common Core State Standards (CCSS), they ensure that every child is on a path to college and career readiness by measuring what students should know at each grade level. • Blueprints are a series of documents that together describe the content and structure of an assessment. • Evidence statement tables and evidence statements describe the knowledge and skills that an assessment item or task elicits from students. • Testing Platform • The PARCC assessment will be administered using the Pearson TestNav system.

  12. SBAC Smarter Balanced Assessment Consortium (SBAC) http://www.smarterbalanced.org/ 9 Member States: California, Hawaii, Idaho, Missouri, Montana, Oregon, South Dakota, Washington, and the U.S. Virgin Islands Pending: Connecticut, Delaware, New Hampshire, Maine, Nevada, North Dakota, Vermont, West Virginia, Wisconsin Waiting: Iowa, Michigan, North Carolina, Wyoming

  13. SBAC • Focus • Smarter Balanced is a state-led consortium working collaboratively to develop assessments aligned to the Common Core State Standards (CCSS) that accurately measure student progress toward college and career readiness. The Consortium involves educators, researchers, policymakers, and community groups in a transparent and consensus-driven process to help all students thrive in a knowledge-driven global economy. • Utilizes a computer-adaptive testing model (a simplified illustration follows below). • Testing Platform • Created an open-source platform (http://www.smarterapp.org/).
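To make the phrase “computer-adaptive testing model” concrete, here is a minimal sketch of the item-selection loop that adaptive tests use: each item is chosen to match the current ability estimate, and the estimate is revised after each response. The item pool, difficulty scale, and fixed-step update rule below are illustrative assumptions only; they are not the Smarter Balanced adaptive engine, which uses a far more sophisticated IRT-based algorithm.

```python
# Illustrative sketch of a computer-adaptive testing (CAT) loop.
# Item names, the difficulty scale, and the step-size update are
# assumptions for demonstration -- not the Smarter Balanced engine.
from dataclasses import dataclass


@dataclass
class Item:
    item_id: str
    difficulty: float  # arbitrary -3..+3 scale (assumption)


def run_adaptive_test(item_pool, answer_fn, num_items=10):
    """Administer up to num_items items, each chosen to match the
    current ability estimate; return the final estimate."""
    ability = 0.0              # start in the middle of the scale
    step = 1.0                 # how far the estimate moves per response
    remaining = list(item_pool)

    for _ in range(min(num_items, len(remaining))):
        # Adaptive step: pick the unused item whose difficulty is
        # closest to the current ability estimate.
        item = min(remaining, key=lambda it: abs(it.difficulty - ability))
        remaining.remove(item)

        correct = answer_fn(item)          # deliver item, score response
        ability += step if correct else -step
        step *= 0.8                        # smaller adjustments as evidence accrues

    return ability


if __name__ == "__main__":
    pool = [Item(f"item-{i}", d) for i, d in
            enumerate([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5])]
    # Simulated student who answers items easier than 0.7 correctly.
    print(round(run_adaptive_test(pool, lambda item: item.difficulty < 0.7), 2))
```

The practical point for AT/IEP teams is that, unlike a fixed-form test, students taking an adaptive test will not all see the same items, so accessibility supports must work across the full item pool rather than for a known, fixed set of items.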

  14. Q1 What’s new?

  15. Q1 – What’s new? General • Field testing • Implementation 2014-2015 • Political landscape PARCC • Test time announcement • Reduced total number of items based on field testing SBAC • More than 4.2 million students in grades 3-8 and 11 participated in the field tests • Anticipates rolling out the formative, interim, and summative components on schedule in 2014-2015

  16. Q2 Where can I learn more about the technology platform each consortium has adopted?

  17. Q2 – Where can I learn more about the technology platform each consortium has adopted? • Dynamic Learning Maps (DLM) • http://dynamiclearningmaps.org/requirements • The National Center and State Collaborative Partnership (NCSC) • http://www.ctb.com/ctb.com/control/ctbLandingPageViewAction?landngPageId=58275 • Partnership for Assessment of Readiness for College and Careers (PARCC) • http://www.parcconline.org/technology • Smarter Balanced Assessment Consortium (SBAC) • http://www.smarterbalanced.org/smarter-balanced-assessments/technology/ • SETDA – Implementing Online Assessments • http://assessmentstudies.setda.org/

  18. Q3 What should AT/IEP teams know about test accommodation policies and procedures?

  19. Q3 – What should AT/IEP teams know about test accommodation policies and procedures? • DLM • https://education.alaska.gov/tls/assessment/accommodations/DLM/AccessiblityManual2014-15.pdf • NCSC • http://doe.sd.gov/oess/documents/0314TAMfl.pdf • PARCC • http://www.parcconline.org/parcc-accessibility-features-and-accommodations-manual • SBAC • http://www.smarterbalanced.org/parents-students/support-for-under-represented-students/

  20. Q4 Why does each consortium have a different accommodation policy?

  21. Q4 – Why does each consortium have a different accommodation policy? • While each pair of consortia works together on many issues, each consortium is directed by its own governance board and technical advisors. As a result, while there are many similarities in the accessibility work of PARCC and SBAC, there are some differences. The same is true of DLM and NCSC. • Each state and school district will need to become familiar with the accessibility guidelines of the particular consortium to which it belongs.

  22. Q5 What does an IEP/AT team need to do to document each student's need to use AT as part of the assessment?

  23. Q5 – What does an IEP/AT team need to do to document each student's need to use AT as part of the assessment? • PARCC • PARCC Accessibility Features and Accommodation Documentation Form • http://www.parcconline.org/sites/parcc/files/PARCC%20Field%20Test%20Accessibility%20Features%20and%20Accommodation%20Documentation%20Form%20%28Optional%29.pdf • SBAC • The ISAAP tool is designed to facilitate selection of the accessibility resources that match student access needs for the Smarter Balanced assessments. • http://www.smarterbalanced.org/parents-students/support-for-under-represented-students/

  24. Q6 How can an IEP/AT team determine whether or not a specific student's AT configuration will work within the assessment platform?

  25. Q6 – How can an IEP/AT team determine whether or not a specific student's AT configuration will work within the assessment platform? • Review the guidelines provided by each consortium regarding the allowable technologies and assistive technologies. • PARCC Assistive Technology Guidelines • http://parcconline.org/sites/parcc/files/PARCC%20Field%20Test%20Assistive%20Technology%20Guidelines%20March%202014.pdf • http://pearsononlinetesting.com/TestNav/AT/ • SBAC Device Requirements • http://www.smarterbalanced.org/wordpress/wp-content/uploads/2011/12/Tech_Framework_Device_Requirements_11-1-13.pdf

  26. Q7 How can we ensure that students have experience in answering each of the various item types that will appear on the assessment?

  27. Q7 – How can we ensure that students have experience in answering each of the various item types that will appear on the assessment? • PARCC • https://www.parcconline.org/practice-tests • SBAC • http://sbac.portal.airast.org/practice-test/

  28. Q8 What should an AT team plan to do to prepare for supporting students on the actual day of the test?

  29. Q8 – What should an AT team plan to do to prepare for supporting students on the actual day of the test? • At the present time, the consortia have not issued guidelines for AT teams about their role during the actual administration of the assessments. As a result, check with your local or state assessment director about how students will gain access to supports that must be activated during each assessment session.

  30. Q9 What mechanisms will be available for providing feedback about the assessment experience?

  31. Q9 – What mechanisms will be available for providing feedback about the assessment experience? • PARCC • Contact form • http://www.parcconline.org/contact • SBAC • Contact your state agency • http://www.smarterbalanced.org/about/member-states/

  32. Q&A

  33. For more information • Contact: Dave Edyburn (edyburn@uwm.edu) • Learn More • Edyburn, D.L. (2013). The new common core state standards assessments: Building awareness for assistive technology specialists. Closing the Gap, 32(5), 4-8. Download the pdf: http://people.uwm.edu/edyburn/CCSSassessment.pdf
