
California K-12 Assessment Update


Presentation Transcript


  1. California K-12 Assessment Update California Education Research Association, November 30, 2012. Patrick Traynor, Ph.D., Director, Assessment Development and Administration Division; Eric Zilbert, Ph.D., Administrator, ADAD Psychometric Unit

  2. Presentation Overview • Recent and Upcoming SBAC Developments • Scoring Technologies • Alternate Assessment Participation (1% Population) • Transitional Activities

  3. Recent and Upcoming SBAC Developments

  4. SBAC Cognitive Labs Purpose To examine the impact of various item formats, tools, and accommodations on the ability of students to demonstrate in-depth understanding of the content measured. Sampling • 700 students • ≈ 85 students in California • Focus on SWD, ELL, and low-SES students.

  5. SBAC Cognitive Labs Methodology • Research questions related to technology-based assessment • A trained facilitator • administers test items • conducts an interview (≈ 90–120 minutes) • Computer provided by the facilitator • Labs (two approaches) • Think aloud while working through an item • First solve the item, then respond to questions about the approach

  6. SBAC Small Scale Trials Purpose To inform automated and human scoring Sampling • Stratified random sampling • ≈ 900 schools across member states • ≈ 230 schools in California • Goal: to be representative of SBAC students • Each school randomly selects one to two classrooms in an assigned grade (4, 7, or 11)
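The slide names the sampling design but not its mechanics. As a rough illustration only, a minimal stratified random draw might look like the sketch below; the roster, strata (member state), and per-stratum counts are invented for the example, not taken from SBAC's actual procedure.

```python
# Illustrative sketch of stratified random sampling; all data are invented.
import random
from collections import defaultdict

def stratified_sample(schools, stratum_of, n_per_stratum, seed=0):
    """Draw up to n_per_stratum schools at random from each stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for school in schools:
        strata[stratum_of(school)].append(school)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(n_per_stratum, len(members))))
    return sample

# Hypothetical roster: stratifying by member state keeps each state represented.
roster = [{"id": i, "state": state}
          for i, state in enumerate(["CA", "WA", "OR", "NV"] * 250)]
picked = stratified_sample(roster, lambda s: s["state"], n_per_stratum=75)
print(len(picked))  # 300 schools, 75 per state
```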

  7. SBAC Small Scale Trials Methodology • Each assessment • 15–18 selected-response and constructed-response items • ≈ 60–90 minutes of student time • School computers • Each participating school designates a school coordinator who is trained to administer the assessment • Schools are not required to assess all selected students at the same time or on the same day

  8. Smarter Balanced Pilot Testing • Smarter Balanced will conduct a Pilot computer-based administration of its assessment system beginning in February 2013. • Items will be aligned to the Common Core State Standards and will include selected-response items, constructed-response items, and performance tasks. • Participation in the Pilot Test will be open to all schools in the Consortium, and the test will be administered to students in grades 3–8 and grade 11.

  9. Smarter Balanced Pilot Testing • The Pilot Test entails two approaches (or components) in its implementation: 1) a “Volunteer” component that is open to all schools in Smarter Balanced states and will ensure that all schools have the opportunity to experience the basic functionality of the system; 2) a “Scientific” component that targets a representative sample of schools and yields critical data about the items developed to date, as well as how the system is functioning

  10. Pilot Participation Important Dates (the slide presents a table of upcoming key activities and dates related to the Smarter Balanced Pilot Test)

  11. Smarter Balanced Draft Achievement Level Descriptors (ALDs) • Describe four levels of achievement: “deep command,” “sufficient command,” “partial command,” and “minimal command” of knowledge, skills, and processes in both English language arts/literacy and mathematics • First-draft ALDs are open for public comment from November 27, 2012, through January 15, 2013 • A full description of the ALDs and an online survey for providing feedback are available on the Smarter Balanced achievement level descriptors Web page at http://www.smarterbalanced.org/achievement-level-descriptors-and-college-readiness/

  12. Smarter Balanced Sample Items and Performance Tasks

  13. Purpose of Sample Items and Performance Tasks • Demonstrate rigor and complexity of ELA/literacy and mathematics questions • Showcase variety of item types: • Selected response • Constructed response • Technology enhanced • Performance tasks • Help teachers to begin planning for the shifts in instruction

  14. Claims for the ELA/Literacy Summative Assessment • Overall Claim for Grades 3-8: “Students can demonstrate progress toward college and career readiness in English language arts and literacy.” • Overall Claim for Grade 11: “Students can demonstrate college and career readiness in English language arts and literacy.” • Claim #1 - Reading: “Students can read closely and analytically to comprehend a range of increasingly complex literary and informational texts.” • Claim #2 - Writing: “Students can produce effective and well-grounded writing for a range of purposes and audiences.” • Claim #3 - Speaking and Listening: “Students can employ effective speaking and listening skills for a range of purposes and audiences.” • Claim #4 - Research/Inquiry: “Students can engage in research and inquiry to investigate topics, and to analyze, integrate, and present information.”

  15. Claims for the Mathematics Summative Assessment • Overall Claim for Grades 3-8: “Students can demonstrate progress toward college and career readiness in mathematics.” • Overall Claim for Grade 11: “Students can demonstrate college and career readiness in mathematics.” • Claim #1 - Concepts & Procedures: “Students can explain and apply mathematical concepts and interpret and carry out mathematical procedures with precision and fluency.” • Claim #2 - Problem Solving: “Students can solve a range of complex well-posed problems in pure and applied mathematics, making productive use of knowledge and problem solving strategies.” • Claim #3 - Communicating Reasoning: “Students can clearly and precisely construct viable arguments to support their own reasoning and to critique the reasoning of others.” • Claim #4 - Modeling and Data Analysis: “Students can analyze complex, real-world scenarios and can construct and use mathematical models to interpret and solve problems.”

  16. Exploring the Sample Items

  17. Sample Items and Tasks Landing Page to Launch Oct. 9

  18. Sample Items and Tasks Navigation: view mathematics or ELA/literacy items; advance to the next item or go back to the previous one

  19. Sample Items and Tasks Navigation: filter by content, claim, and grade band

  20. Sample Items and Tasks Navigation: filter by item type and themes

  21. Item Metadata: each item includes an “About this item” panel

  22. Exploring the Sample Items: selected-response and technology-enhanced items are machine scorable

  23. Accessibility and Accommodations • Sample items do not include accessibility and accommodations features • As part of the development of accessibility and accommodations policies, the Consortium commissioned research on best practices for assessing English language learners and students with disabilities • When fully operational, Smarter Balanced will provide translation accommodation options for all math items • For the Pilot, Smarter Balanced will provide full Spanish translations and Spanish pop-up glossaries customized at the item level for one specific form at each of three grades (almost 70 items each); each grade band (elementary, middle, and high school) will have one grade with the special form • The Consortium will need ELLs to participate in the Pilot to make sure they are represented in the student sample

  24. Accessibility and Accommodations • Full range of accessibility tools and accommodations options under development, guided by: • Magda Chia, Ph.D., Director of Support for Under-Represented Students • Accessibility and Accommodations Work Group • Students with Disabilities Advisory Committee (Chair: Martha Thurlow, NCEO) • English Language Learners Advisory Committee • These teams of experts will ensure that the assessments provide valid, reliable, and fair measures of achievement and growth for both English learners and students with disabilities • Smarter Balanced will also continue working with educators and experts in the field to design and test the assessment system • Learn more online: http://www.smarterbalanced.org/parents-students/support-for-under-represented-students/

  25. Scoring Technology • Templates • Optical scanning • Scantron • Electronic image-based scoring (e.g., Pearson e-Pen) • Scan to score • Traditional machine scoring • Dichotomous (correct/incorrect) scoring most common • Exact word, number, or grid matches • No partial credit • Automated scoring • Allows scoring of short-answer and essay questions • Requires a set of human-scored papers to develop the scoring model • Can give partial credit or multiple-point scores
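For concreteness, here is a minimal sketch of the dichotomous exact-match scoring the slide describes; the normalization rules (trimming whitespace, case-folding) are assumptions for illustration, not any vendor's specification.

```python
# Toy "traditional machine scoring": all-or-nothing credit on an exact key match.
def score_dichotomous(response, key):
    """Return 1 if the response exactly matches the key after light cleanup, else 0."""
    return int(response.strip().lower() == key.strip().lower())

assert score_dichotomous(" 3/4 ", "3/4") == 1
assert score_dichotomous("0.75", "3/4") == 0  # equivalent value, but no partial credit
```

The second assertion shows the limitation the slide notes: an equivalent but differently formatted answer earns nothing, which is part of what motivates automated scoring with partial credit.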

  26. How Automated Scoring Works • Uses a set of human-scored examples to develop a statistical model that is used to analyze answers • Generally examines overall form and specific combinations of words • Has an extensive library of possible meanings for words
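As a toy version of that training loop, the sketch below featurizes essays with overall form (length) and word-combination counts, then fits a regression to human-assigned scores. The essays, vocabulary, and ridge model are invented stand-ins; production engines model far richer syntax and semantics.

```python
# Toy automated-scoring trainer; all data and features are invented.
import numpy as np

def featurize(essay, vocab):
    words = essay.lower().split()
    counts = [words.count(v) for v in vocab]              # word-combination evidence
    return np.array([len(words)] + counts, dtype=float)   # overall form + content

# Hypothetical human-scored training set: (essay text, expert score on 0-4).
train = [("the water cycle moves water through evaporation and rain", 4),
         ("water goes up and comes down", 2),
         ("i like dogs", 0)]
vocab = ["water", "evaporation", "rain", "cycle"]

X = np.array([featurize(text, vocab) for text, _ in train])
y = np.array([score for _, score in train], dtype=float)

# Closed-form ridge regression stands in for the engine's statistical model.
lam = 0.1
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

new_essay = "rain is part of the water cycle"
print(round(float(featurize(new_essay, vocab) @ w), 2))  # continuous score: partial credit
```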

  27. What can be scored? • Written responses • Prompt Specific Essays • Prompt Independent Essays • Short Answers • Summaries • Spoken language • Correctness • Fluency • Responses to simulations • Diagnosis of a patient’s illness • Landing a plane

  28. How good is automated scoring? According to ETS, Pearson, and the College Board in the recent report “Automated Scoring for the Common Core Standards”: • Consistent with the scores from expert human graders • The way automated scores are produced is understandable and meaningful • Fair • Validated against external measures in the same way as is done with human scoring • The impact of automated scoring on reported scores is understood
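A common first check on the first criterion, agreement with expert human graders, is to score the same papers both ways and compare; the metrics below are standard practice rather than anything specified in the report, and the scores are invented.

```python
# Compare machine scores with human scores on the same (invented) papers.
from statistics import correlation  # available in Python 3.10+

human   = [4, 3, 3, 2, 4, 1, 0, 2, 3, 4]
machine = [4, 3, 2, 2, 4, 1, 1, 2, 3, 3]

exact    = sum(h == m for h, m in zip(human, machine)) / len(human)
adjacent = sum(abs(h - m) <= 1 for h, m in zip(human, machine)) / len(human)

print(f"exact: {exact:.0%}, adjacent: {adjacent:.0%}, r = {correlation(human, machine):.2f}")
```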

  29. Source: Streeter et al., Pearson’s Automated Scoring of Writing, Speaking, and Mathematics, Pearson, May 2011

  30. Example Essay Feedback. Source: Streeter et al., Pearson’s Automated Scoring of Writing, Speaking, and Mathematics, Pearson, May 2011

  31. Data Requirements for Various Types of Automated Scoring. Source: Streeter et al., Pearson’s Automated Scoring of Writing, Speaking, and Mathematics, Pearson, May 2011

  32. Alternate Assessment Participation • California recently joined the National Center and State Collaborative (NCSC) as a Tier II state • As a Tier II state, the California team will: • Dedicate a staff member to coordinate the work • Work directly with members of the Special Education Administrators of County Offices of Education (SEACO) and with directors of special education local plan areas (SELPA) to build a community of practice • Meet directly with the field implementers every other month, with technology-supported meetings in between and as needed • Deliver electronically to California stakeholders the comprehensive curriculum, instruction, and professional development modules on the CCSS available from the NCSC, expected by fall 2012

  33. STAR In-Transition Activities • Considerations • SBAC • CAT • ELA and Math Only • System – Formative, Interim, and Summative • Specific Planned Activities

  34. English Language Arts and Mathematics, Grades 3–8 and High School (assessment system diagram) • Formative assessments throughout the year, supported by a DIGITAL CLEARINGHOUSE of formative tools, processes and exemplars; released items and tasks; model curriculum units; educator training; professional development tools and resources; scorer training modules; and teacher collaboration tools • Optional interim assessments (computer adaptive assessment and performance tasks) between the beginning and end of the year; scope, sequence, number, and timing of interim assessments locally determined • Summative assessment for accountability in the last 12 weeks of the year*: performance tasks (reading, writing, math) plus an end-of-year adaptive assessment, with a re-take option * Time windows may be adjusted based on results from the research agenda and final implementation decisions. Source: http://www.ets.org

  35. Computer Adaptive Testing
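The slide presents CAT graphically. As a rough sketch of the core loop, the toy below repeatedly selects the item whose difficulty is closest to the current ability estimate (the most informative choice under a Rasch model) and nudges the estimate toward the observed response; the item bank, the simulated examinee, and the step-size rule are all invented for illustration.

```python
# Toy computer adaptive test under a Rasch model; all parameters are invented.
import math
import random

def p_correct(ability, difficulty):
    """Rasch model: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(difficulty - ability))

rng = random.Random(7)
bank = [rng.uniform(-3, 3) for _ in range(200)]  # pool of item difficulties
true_ability = 1.2                               # simulated examinee
theta = 0.0                                      # running ability estimate

for step in range(1, 21):
    item = min(bank, key=lambda d: abs(d - theta))  # most informative item at theta
    bank.remove(item)
    u = 1.0 if rng.random() < p_correct(true_ability, item) else 0.0
    # Stochastic-approximation update: move theta toward the observed evidence.
    theta += (u - p_correct(theta, item)) * 4.0 / step

print(f"estimated ability after 20 items: {theta:.2f}")  # drifts toward the simulated 1.2
```

Each response shifts the difficulty of the next item, which is why an adaptive test can reach a stable estimate with far fewer items than a fixed-form test.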

  36. Assessments to Consider (by content area and grade level)

  37. Relationship of Assessments (Source: Perie, Marion, Gong, & Wurtzel, 2007)

  38. STAR In-Transition Activities: Alignment to Common Core State Standards • Planning to provide CCSS-aligned results on STAR Student Reports • Planning to align CST released test questions (RTQs) with California’s Common Core State Standards (CCSS)

  39. Proposed Web Site Redesign: Home Page

  40. Proposed Web Site Redesign: Current Testing Standards—Filtering Feature

  41. Proposed Web Site Redesign: Current Testing Standards—Information About the Standard

  42. Proposed Web Site Redesign: Current Testing Standards—Sample Question

  43. Proposed Web Site Redesign: Current Testing Standards—Related CCSS Information

  44. Proposed Web Site Redesign: Common Core—Filtering Feature

  45. Proposed Web Site Redesign: Common Core—Standards Information

  46. Proposed Web Site Redesign: Common Core—Comparison to Current Testing Standards

  47. Proposed Web Site Redesign: Common Core—Example Related Sample Question

  48. Contact Information • Patrick Traynor, Ph.D., Director, Assessment Development and Administration Division, E-mail: ptraynor@cde.ca.gov • Jessica Valdez, Administrator, Transition Office, Assessment Development and Administration Division, Phone: 916-319-0332, E-mail: jvaldez@cde.ca.gov • Jessica Barr, SBAC and Reauthorization Lead Consultant, Transition Office, Assessment Development and Administration Division, Phone: 916-319-0364, E-mail: jbarr@cde.ca.gov
