
Understanding SBA Scores



Presentation Transcript


  1. Understanding SBA Scores From the Assessment Development Team

  2. Introduction • Purpose: to answer some basic questions about Smarter Balanced Summative and Interim test scores • Audience: DACs, principals, data/instructional coaches, teachers, parents • Share: staff meetings, professional development, PLC meetings

  3. Summative Test Scores • For each content area (ELA/Literacy and Mathematics), Washington reports: • Overall scale score • Achievement level • Claim achievement category

  4. Summative Test Scores Family Report/Individual Student Report

  5. Summative Test Scores—scale scores

  6. Summative Test Scores—scale scores • Scale scores: • Are not determined only by a calculation of “raw-points-earned divided by total-points-possible.” • Are related to the difficulty of the items the student was presented with.
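To make this concrete, here is a toy sketch using a simple Rasch-style item response model. This is not the actual Smarter Balanced scoring model (which is more complex), and all item difficulties are made up; the point is only that two students with the same raw score can receive different ability estimates if they saw items of different difficulty.

```python
import math

def rasch_ability(responses, difficulties, iters=50):
    """Estimate a Rasch-model ability (theta) from 0/1 responses and item
    difficulties via Newton-Raphson. Toy illustration only -- not the
    operational Smarter Balanced scoring model."""
    theta = 0.0
    raw = sum(responses)
    for _ in range(iters):
        probs = [1.0 / (1.0 + math.exp(-(theta - b))) for b in difficulties]
        expected = sum(probs)                   # expected raw score at theta
        info = sum(p * (1 - p) for p in probs)  # test information (slope)
        theta += (raw - expected) / info        # Newton step toward the MLE
    return theta

# Two students, each answering 3 of 6 items correctly (same raw score)...
easy_items = [-1.5, -1.0, -0.5, 0.0, 0.5, 1.0]   # hypothetical easier item pool
hard_items = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]      # hypothetical harder item pool
answers = [1, 1, 1, 0, 0, 0]

print(rasch_ability(answers, easy_items))  # lower ability estimate
print(rasch_ability(answers, hard_items))  # higher ability estimate
```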

  7. Summative Test Scores—scale scores [Figure: a score scale from 2000 to 3000, running from easy items to difficult items, with three example students (Anya, Beatrice, Calli) placed at different scale scores]

  8. Summative Test Scores—achievement levels

  9. Summative Test Scores—achievement levels • Scale scores are divided into four categories of performance called “achievement levels” (Levels 1, 2, 3, and 4). • Achievement levels are defined by Achievement Level Descriptors, the specifications for what knowledge and skills students display at each level. • The detailed Achievement Level Descriptors are available here: • http://www.k12.wa.us/assessment/StateTesting/PLD/default.aspx • The score that separates two adjacent levels is called a “cut score.” • The cut score that divides Level 2 from Level 3 is referred to as the “standard.” Students with scores above this cut have “met the achievement standard.”

  10. Summative Test Scores—achievement levels • Example for grade 7 ELA: https://www.smarterbalanced.org/assessments/scores/

  11. Summative Test Scores—achievement levels • Educators and other stakeholders from Smarter Balanced member states participated in achievement level setting meetings in October 2014 to recommend cut scores. • In December 2014, the state education chiefs voted to approve the cut scores for Smarter Balanced. • Washington’s State Board of Education accepted these cut scores for Washington students in January 2015. • More information about the process and Washington’s involvement can be found at: • https://www.smarterbalanced.org/assessments/scores/ • http://www.k12.wa.us/SMARTER/Achievement.aspx

  12. Summative Test Scores—SEM • Each student scale score is reported along with a Standard Error of Measurement (SEM) value. • This is the ± value displayed next to the 4-digit scale score. • This value can be different for each student. • It depends on which items the student saw and which of those items were answered correctly. • The SEM: • Indicates the range where the student’s score would likely fall if they took the test several times • Could also be called the “margin of error” or “standard error”
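To make the ± value concrete, here is a minimal sketch; the score and SEM below are made-up illustrative values, not taken from an actual report.

```python
def score_band(scale_score, sem):
    """Return the likely score range implied by a reported scale score
    and its standard error of measurement (the "+/-" value on the report)."""
    return scale_score - sem, scale_score + sem

# Hypothetical report line reading "2531 +/- 22"
low, high = score_band(2531, 22)
print(f"Likely range: {low} to {high}")   # Likely range: 2509 to 2553
```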

  13. Summative Test Scores—SEM

  14. Summative Test Scores—SEM • An SEM is also included with state, district, school, and classroom/roster average scores in the Online Reporting System. • The smaller the group of students, the larger the SEM.
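A rough sketch of why smaller groups get larger SEMs on their average scores. It assumes the group SEM behaves like the standard error of a mean (square root of the summed squared individual SEMs, divided by the group size); the actual formula used by the Online Reporting System may differ.

```python
import math

def group_average_sem(individual_sems):
    """Approximate SEM of a group average, assuming independent measurement
    errors: sqrt(sum(sem_i^2)) / n. Illustrative assumption only."""
    n = len(individual_sems)
    return math.sqrt(sum(s * s for s in individual_sems)) / n

classroom = [25.0] * 28      # 28 students, each with an SEM of about 25
district = [25.0] * 2800     # 2,800 students with the same individual SEM
print(round(group_average_sem(classroom), 1))  # ~4.7  (small group, larger SEM)
print(round(group_average_sem(district), 1))   # ~0.5  (large group, smaller SEM)
```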

  15. Summative Test Scores—SEM [Figure: comparison of average scores for state, district, school, and classroom groups of students]

  16. Summative Test Scores—Claims

  17. Summative Test Scores—Claims • Claim reporting category results come from comparing the student’s claim scale score to the “standard” cut score, incorporating the student’s SEM on the items in that claim. • Step 1: Break the items up into their claim categories (item counts in this example): • Reading—15 • Writing—13 • Listening—8 • Research—9

  18. Summative Test Scores—Claims • Step 2: Calculate the scale score for only the items in the claim category. This is called the claim scale score. [Figure: claim scale scores for Reading, Writing, Listening, and Research; the example Reading claim scale score is 2780]

  19. Summative Test Scores—Claims • Step 3: Find the SEM for each of these claim scale scores (in the example, the Reading claim SEM is ±50). • These claim scale scores and claim SEMs can be found in score files downloaded from WAMS and from ORS.

  20. Summative Test Scores—Claims • Step 4: Get the “standard cut score” for the grade level and content area and do some math: • standard cut score + (claim SEM x 1.5) = high • standard cut score - (claim SEM x 1.5) = low • Example (David’s Reading claim scale score = 2780, standard cut score = 2552, claim SEM = ±50): high = 2552 + (50 x 1.5) = 2552 + 75 = 2627; low = 2552 - (50 x 1.5) = 2552 - 75 = 2477
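A minimal sketch of the Step 4 arithmetic, using the example values above:

```python
def claim_band(standard_cut, claim_sem):
    """Step 4: the 'low' and 'high' boundaries around the standard cut score."""
    high = standard_cut + claim_sem * 1.5
    low = standard_cut - claim_sem * 1.5
    return low, high

# Standard cut score 2552, claim SEM 50 (from the example above)
print(claim_band(2552, 50))   # (2477.0, 2627.0)
```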

  21. Summative Test Scores—Claims • Step 5: Compare the answers from step 4 to the claim scale score. [Figure: a number line divided into Below Standard, At/Near Standard, and Above Standard regions by the low (2477) and high (2627) values from step 4; David’s Reading claim scale score is 2780]

  22. Summative Test Scores—Claims • Step 5 (continued): David’s Reading claim scale score of 2780 is greater than the high value of 2627, so his Reading claim is reported as Above Standard. [Figure: the same number line, with 2780 plotted in the Above Standard region]

  23. Summative Test Scores—Claims • Another way to illustrate: [Figure: a number line split into Below Standard, At/Near Standard, and Above Standard regions; the At/Near Standard band extends claim SEM x 1.5 below the standard cut score (to the low value) and claim SEM x 1.5 above it (to the high value)]

  24. Summative Test Scores—Claims • The larger the SEM, the larger the “at/near” range: [Figure: the same number line with a wider At/Near Standard band, since the band extends claim SEM x 1.5 on each side of the cut score]

  25. Summative Test Scores—Claims • Above Standard: the student’s claim scale score is more than 1.5 SEMs above the “standard” cut score • Below Standard: the student’s claim scale score is more than 1.5 SEMs below the “standard” cut score • At/Near Standard: the student’s claim scale score is within 1.5 SEMs of the “standard” cut score
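Putting steps 4 and 5 together, here is a minimal sketch of the classification rule, checked against David's example from the earlier slides (illustrative only; actual reports come from the scoring system):

```python
def claim_category(claim_scale_score, standard_cut, claim_sem):
    """Classify a claim score as Above / At-Near / Below Standard using the
    1.5-SEM band around the standard cut score."""
    high = standard_cut + 1.5 * claim_sem   # step 4
    low = standard_cut - 1.5 * claim_sem
    if claim_scale_score > high:            # step 5
        return "Above Standard"
    if claim_scale_score < low:
        return "Below Standard"
    return "At/Near Standard"

# David's Reading claim: scale score 2780, standard cut 2552, claim SEM 50
print(claim_category(2780, 2552, 50))   # Above Standard
```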

  26. Summative Test Scores • Scores are not determined by a calculation of “raw-points-earned divided by total-points-possible” • Scores are related to: • the particular set of items the student saw, • the difficulty of those items, • which of those items the student answered correctly, • the standard error value (SEM) of the scale score, and • the distance between the scale score and the “standard cut score.” The achievement levels (1, 2, 3 and 4) are defined by Achievement Level Descriptors.

  27. Interim Comprehensive Assessment • The ICAs are reported using the same three types of scores as the Summative: • Overall scale score • Achievement level • Claim achievement category The scores are determined the same way as the Summative scores described on the previous slides.

  28. Interim Assessment Block (IAB) Scores • The IABs are reported using a “performance category”: • Above • At/Near • Below • This is very similar to the “claim achievement category” on the Summative. • The IAB performance categories are determined the same way as described on the previous slides for the summative claim score categories: Raw score/item difficulty → Scale score and SEM → Compare to standard cut → Performance category
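Because the IAB performance category uses the same comparison, the same arithmetic applies. The block score, cut score, and SEM below are hypothetical values for illustration, not actual IAB parameters.

```python
# Hypothetical IAB result (illustrative values only)
block_score, cut, sem = 2490, 2528, 45
low, high = cut - 1.5 * sem, cut + 1.5 * sem        # band: 2460.5 .. 2595.5
category = ("Above" if block_score > high
            else "Below" if block_score < low
            else "At/Near")
print(category)   # At/Near -- the block score is within 1.5 SEM of the cut
```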

  29. Interim Assessment Block (IAB) Scores [Figure: the same flow, Raw score/item difficulty → Scale score and SEM → Compare to standard cut → Performance category]

  30. Interim Assessment Block (IAB) Scores • Percentage of points earned: • Scores are not determined by a calculation of “raw-points-earned divided by total-points-possible”

  31. Interim Assessment Block (IAB) Scores • Item difficulty across the content: • The IABs are groups of items about similar topics. • IABs with harder content likely have more items with higher difficulty values. • Raw points on one IAB topic cannot be compared to raw points on another IAB topic.

  32. Interim Assessment Block (IAB) Scores • Standard Error of Measurement value (SEM): • The smaller the number of items on a test, the greater the SEM. • IABs with a small number of items (5 or 6) have the largest SEM.

  33. Now what? • Look past the colors and labels • Think about: • Were there items that all your students struggled with? • Were there items that all of your students did well on? • Where are the outliers—items that all but a few students did well on? What instruction would benefit those few? • Were there trends in answers based on particular types of items? • What did you notice while hand scoring in THSS?

  34. Now what? Remember: The IABs are one tool that can be used to gather evidence about a student’s understanding. The teacher should also gather additional evidence through formative practices. The IABs are not intended to be all the evidence that is collected, but they can be used to inform an educator’s thoughts about a student’s understanding. Several data points (including the IAB) should be used to reach a conclusion about a student’s understanding.

  35. Interim Test Scores • Interim Comprehensive Assessment scores are calculated the same way as the Summative Assessment scores. • Interim Assessment Block performance categories are calculated the same way as the claim achievement categories. • Focus on the items and patterns of student responses to those items.

  36. Resources and reminders • The FAQ document posted along with this video on the WCAP Portal includes: • Links included in this presentation • Answers to frequently asked questions • Links to documents for further, more technical information • Audience: DACs, principals, data/instructional coaches, teachers, parents • Share: staff meetings, professional development, PLC meetings

  37. Thank you! • Please direct questions and comments about this presentation to: Kara Todd, Content Coordinator for Test Development, kara.todd@k12.wa.us • Please direct content-specific questions to: Anton Jackson, Mathematics Assessment Specialist, anton.jackson@k12.wa.us • Shelley O’Dell, English Language Arts Assessment Specialist, shelley.odell@k12.wa.us
