
2013 STAR Interpreting and Using Results



Presentation Transcript


  1. 2013 STAR Interpreting and Using Results, August 7, 2013 Webcast. Webcast starts at 9 a.m.

  2. Objectives Workshop participants will be able to: • Describe the purposes of STAR reports • Interpret STAR results • Explain key statistics • Compare and contrast types of reports • Identify proper uses of reports Post-Test Workshop

  3. Agenda • What’s New? • Results and Statistical Analysis • Using Results • Summary and Internet Reports • Data CDs • Individual Student Reports • Early Assessment Program Post-Test Workshop

  4. What’s New in 2013 • Standards-based Tests in Spanish (STS) performance levels are now also reported for students in grades 8–11 who took the grade-level STS for RLA, STS for Algebra I, and STS for Geometry. • The score for the California Standards Test (CST) for Writing is no longer doubled; possible scores are 1, 2, 3, and 4. Manual (M) 2 – 3 Post-Test Workshop

  5. What’s New in 2013 • ELA scale scores for the grades 4 and 7 CSTs and California Modified Assessment (CMA) will be provided based on the multiple-choice items only. The reporting cluster “Writing Applications” is no longer part of the CST and CMA ELA cluster groups; instead, the writing score is provided as a standalone score called “Writing Response Score.” M 2 – 3 Post-Test Workshop

  6. What’s New in 2013 • Enrollment and exit code data used to determine which students are counted as “continuously enrolled” for accountability purposes, previously collected for the California Basic Educational Data System (CBEDS), were not collected on STAR answer documents or in Pre-ID. Instead, these data were extracted from the California Longitudinal Pupil Achievement Data System (CALPADS). • Please refer to pages 2 and 3 in the 2013 Post-Test Guide for a list of all changes. M 2 – 3 Post-Test Workshop

  7. Quiz Question 1 Which of these tests had scale scores reported for the first time in 2013? • CST for World History • CAPA for Science • CMA for Algebra I • STS for Algebra I Post-Test Workshop

  8. Quiz Question 1 Which of these tests had scale scores reported for the first time in 2013? • CST for World History • CAPA for Science • CMA for Algebra I • STS for Algebra I (correct answer) Post-Test Workshop

  9. Results: Purposes of STAR Reports • Report progress toward proficiency on the state’s academic content standards • Identify where improvement is needed • To help students’ achievement • To improve educational programs • Provide data for state and federal accountability programs M 4 Post-Test Workshop

  10. Results: Performance Levels • State goal: All students score at proficient or above • Proficient on the CST, CMA, and STS: scale score of 350 or higher • Proficient on the CAPA: scale score of 35 or higher M 8 – 11 Post-Test Workshop

  11. Results: Other Performance Levels • Advanced • Basic (cut score: CST 300, CMA 300, STS 300, CAPA 30) • Below basic • Far below basic • For each testing program, cut points for advanced and below basic vary by subject and grade M 8 – 11; Appendix B Post-Test Workshop

  12. Results: Scale Scores • Scale scores allow the same score to mean the same thing across test versions within grade and content area. • Scale scores account for differences in difficulty. • Scale score ranges by program are: • CST, CMA, STS: 150–600 for each grade and subject • CAPA: 15–60 for each level and subject M 8 – 11 Post-Test Workshop
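
A minimal sketch of how a scale score maps to a performance level, drawing on the slides above. The proficient cut (350) and basic cut (300) for the CST, CMA, and STS are fixed; the advanced and below-basic cuts vary by subject and grade, so the default values below are hypothetical placeholders, not actual 2013 cut scores.

    def performance_level(scale_score, advanced_cut=418, below_basic_cut=262):
        """Map a CST/CMA/STS scale score (150-600) to a performance level.

        advanced_cut and below_basic_cut are illustrative only; the real cut
        points vary by subject and grade (see Appendix B of the Post-Test Guide).
        """
        if scale_score >= advanced_cut:
            return "Advanced"
        if scale_score >= 350:   # proficient cut, fixed for CST, CMA, and STS
            return "Proficient"
        if scale_score >= 300:   # basic cut, fixed for CST, CMA, and STS
            return "Basic"
        if scale_score >= below_basic_cut:
            return "Below Basic"
        return "Far Below Basic"

    print(performance_level(362))   # Proficient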

  13. Results: Equating • Psychometric procedure • Adjusts for test difficulty • Additional information in the technical report on the CDE Web site M 8 Post-Test Workshop

  14. Results: Reporting Clusters (Content Area) • Three to six clusters for each subject • May be useful as indicators of individual or group strengths and weaknesses • But. . . reporting clusters should be interpreted with caution M 9 – 11; Appendix A Post-Test Workshop

  15. Results: Reporting Clusters Cautions • Cluster percent correct available for CSTs, CMA, STS • Based on small numbers of items; therefore, results may not be reliable or generalizable • NOT equated from year to year • Should not compare reporting cluster percent correct from year to year M 9 – 11; Appendix A Post-Test Workshop

  16. Interpreting Reporting Clusters or Content Areas in the Same Year • Compare to percent-correct range of proficient students statewide M 9 – 11; Appendix A Post-Test Workshop
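
A hedged sketch of that comparison: flag a student's cluster percent correct against the statewide percent-correct range of students who scored proficient on the total test. The actual ranges come from Appendix A of the Post-Test Guide; the numbers used here are made up for illustration.

    def compare_to_proficient_range(student_pct_correct, proficient_range):
        """Return where a cluster percent correct falls relative to the
        statewide range for students who scored proficient overall."""
        low, high = proficient_range
        if student_pct_correct < low:
            return "below the proficient range"
        if student_pct_correct > high:
            return "above the proficient range"
        return "within the proficient range"

    # Hypothetical example: a student got 68% correct on a cluster where
    # proficient students statewide scored 60-75% correct.
    print(compare_to_proficient_range(68, (60, 75)))   # within the proficient range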

  17. 2013 CST Reporting Clusters: Number of Questions and Average Percent Correct 2013 Post-Test Guide, Appendixes A and C, as posted on startest.org, will be finalized with complete data on August 15. M Appendix A Post-Test Workshop

  18. Examples—Interpreting Reporting Clusters for the CST for Geometry M 10 Post-Test Workshop

  19. Quiz Question 2 What is a scale score? • Percent correct of all questions • Mean percent correct of all questions • An adjustment of this year’s and last year’s raw scores to show changes • An adjustment of the raw score to account for differences in difficulty Post-Test Workshop

  20. Quiz Question 2 What is a scale score? • Percent correct of all questions • Mean percent correct of all questions • An adjustment of this year’s and last year’s raw scores to show changes • An adjustment of the raw score to account for differences in difficulty (correct answer) Post-Test Workshop

  21. Using Results • For instructional decisions in conjunction with other data • Used in Academic Performance Index (API) calculations, all grades and subjects: CST, CMA, CAPA • Used in adequate yearly progress (AYP) calculations, ELA and mathematics: • CST — grades 2–8 • CMA — grades 3–8 • CAPA — grades 2–8, 10 M 4 Post-Test Workshop

  22. Year-to-Year Comparisons Do Compare CSTs: Same Grade and Same Content Area • Mean scale score • Same content and grade, varying years • Percent in each performance level • Same content by grade across years • e.g., 2012 grade 10 ELA with 2013 grade 10 ELA M 12 – 15 Post-Test Workshop

  23. Year-to-Year Comparisons Do Compare CSTs: Percent Proficient and Advanced • Percentage of students scoring at PROFICIENT and above • For a given grade and subject, e.g., Percent proficient and above for grade 3 math in 2012 and 2013 • For a given subject and aggregated grades, e.g., Percent proficient and above for grades 2–6 mathematics in 2012 and 2013 • Across grades and a subject, e.g., Percent proficient and above in all courses and all grades M 12 – 15 Post-Test Workshop

  24. Year-to-Year Comparisons Don’t Compare • Individual scale scores or statistics based on scale scores for different grades or content areas • Subjects by grade are independently scaled • Different content standards are measured in different grades • Cohorts across grades • Across tests • Scale scores to percent correct scores • CAPA to years prior to 2009, due to new standard setting then M 12 – 15 Post-Test Workshop

  25. Example—Using CST Results to Compare Grade Results from Year to Year M 14 Post-Test Workshop

  26. Quiz Question 3 Which is the best comparison for CST scores of students within a middle school? • 2012 mean scale scores for ELA of a cohort of grade 7 students with 2013 scale scores for ELA of the same students in grade 8 • 2012 mean scale scores for ELA for grade 8 students with 2013 mean scale scores for ELA for grade 8 students • 2012 mean percent correct scores for ELA with 2013 mean percent correct scores for ELA for the same students • 2012 mean percent correct scores for ELA for grade 8 students with 2013 mean percent correct for ELA for grade 8 students Post-Test Workshop

  27. Quiz Question 3 Which is the best comparison for CST scores of students within a middle school? • 2012 mean scale scores for ELA of a cohort of grade 7 students with 2013 scale scores for ELA of the same students in grade 8 • 2012 mean scale scores for ELA for grade 8 students with 2013 mean scale scores for ELA for grade 8 students (correct answer) • 2012 mean percent correct scores for ELA with 2013 mean percent correct scores for ELA for the same students • 2012 mean percent correct scores for ELA for grade 8 students with 2013 mean percent correct for ELA for grade 8 students Post-Test Workshop

  28. Quiz Question 4 Which is the best comparison of cluster scores for a single student? Compare. . . • To proficient students statewide • One cluster to another, same year • The same cluster to the same cluster, different years • To the average percent correct of all students in a class Post-Test Workshop

  29. Quiz Question 4 Which is the best comparison of cluster scores for a single student? Compare. . . • To proficient students statewide (correct answer) • One cluster to another, same year • The same cluster to the same cluster, different years • To the average percent correct of all students in a class Post-Test Workshop

  30. Aggregate (Summary) Reports • What are they? • Student Master List Summary • Student Master List Summary End-of-Course (EOC) • Subgroup Summary • Report emphasis: CSTs • Criterion-referenced tests • Progress is measured in percent of students scoring proficient and advanced • Note: The back of each report provides a guide to abbreviations and score codes M 18 – 22 Post-Test Workshop

  31. Student Master List Summary • By grade • Results by program (CSTs, CMA, CAPA, and STS) and subject • # and % at each performance level • Mean scale score • Reporting cluster: mean percent correct (except CAPA) M 18–19; M 27 – 33 Post-Test Workshop

  32. Student Master List Summary End-of-Course • By program (CSTs, CMA, and STS) and subject • Results for each grade, and for all grades combined • # and % at each performance level • Mean scale score • Reporting cluster: mean percent correct M 19–20; M 34 – 39 Post-Test Workshop

  33. Student Master List Summary Grade 7 Example M 33 Post-Test Workshop

  34. Student Master List Summary Basic Statistics M 28 – 31 Post-Test Workshop

  35. Who Counts? Number Enrolled • Total CST/CMA and CAPA multiple-choice answer documents submitted as scorable • Minus • Documents marked as “Student enrolled after the first day of testing and was given this test” M 29 Post-Test Workshop

  36. Who Counts? Number Tested • All CST, CMA, CAPA, STS answer documents with one or more answers • Plus • Z = Tested but marked no answers • Not included • A = Students absent • E = Not tested due to significant medical emergency • P = Parent/guardian exemptions • T = Enrolled first day, not tested, tested at previous school • Students with inconsistent grades • Non–English learners who took the STS M 29 Post-Test Workshop
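
The counting rule above can be summarized in a short sketch. The record layout and field names are assumptions for illustration, not the actual scoring file format: a document counts as tested if it has one or more answers or is coded Z, and is excluded if it carries code A, E, P, or T. The other exclusions listed above (inconsistent grades, non-English learners who took the STS) are omitted here for brevity.

    EXCLUDED_CODES = {"A", "E", "P", "T"}   # absent, medical emergency, parent exemption, tested at previous school

    def number_tested(answer_documents):
        """Count documents with one or more answers, plus code Z (tested, no answers marked)."""
        count = 0
        for doc in answer_documents:
            if doc.get("code") in EXCLUDED_CODES:
                continue
            if doc.get("answers_marked", 0) > 0 or doc.get("code") == "Z":
                count += 1
        return count

    docs = [
        {"answers_marked": 54},
        {"answers_marked": 0, "code": "Z"},
        {"answers_marked": 0, "code": "A"},
    ]
    print(number_tested(docs))   # 2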

  37. Who Counts? Number and Percent Valid Scores • Number Valid Scores • For the subject, number of students tested at grade level who received a score for the test. • Not included: • Incomplete tests • Modified tests • Non–English learners who took the STS • Unknown EOC mathematics (except grade 7 mathematics) or science tests • Inconsistent grades • Percent Valid Scores • For the subject, number of valid scores divided by the number of students tested. M 29 Post-Test Workshop
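
A worked example of the Percent Valid Scores arithmetic described above (number of valid scores divided by the number of students tested), using hypothetical counts:

    number_valid_scores = 96    # students at grade level who received a score (hypothetical)
    number_tested = 100         # students tested (hypothetical)

    percent_valid_scores = 100 * number_valid_scores / number_tested
    print(percent_valid_scores)   # 96.0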

  38. Who Counts? Number Tested with Scores • All tests taken, including those taken with modifications, that result in a score • Not included: • Incomplete tests • Non–English learners who took the STS • Unknown EOC mathematics or science tests • Inconsistent grades M 29 Post-Test Workshop

  39. Student Master List Summary Performance Levels M 30 Post-Test Workshop

  40. Who Counts? Performance Levels • All CSTs, CAPA, CMA, STS • Advanced, proficient, basic, below basic • All valid scores falling in the performance level • Far below basic • All valid scores falling in the performance level • CSTs taken with modifications (in aggregate reporting [CSTs and STS] and accountability [CSTs] only) M 30 Post-Test Workshop

  41. Who Counts? Mean Scale Scores • Average of valid scale scores • Can be used to compare results for the same content/grade across years M 30 Post-Test Workshop
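
A minimal example of that statistic, using made-up scores: the mean scale score is simply the average of the valid scale scores, which is why it can be compared for the same content area and grade across years.

    valid_scale_scores = [312, 358, 401, 287, 366]   # hypothetical valid scale scores only
    mean_scale_score = sum(valid_scale_scores) / len(valid_scale_scores)
    print(round(mean_scale_score, 1))   # 344.8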

  42. Student Master List Summary: Reporting Clusters Compare to: Average percent correct range for students statewide who scored proficient on the total test (See the Post-Test Guide, Appendix A.) M 30 Post-Test Workshop

  43. Student Master List Summary: Writing • B = Blank • C = Copied prompt • I = Illegible • L = Language other than English • R = Refusal • T = Off topic • W = Wrong prompt (prompt from an earlier administration) M 30 – 31 Post-Test Workshop

  44. Subgroup Summary: CSTs, CMA, CAPA, and STS • Disability status • Based on disability status for CST, CMA, STS • CAPA: each disability type • Economic status • Based on NSLP eligibility or parent education level • Gender • English proficiency • Ethnicity • Ethnicity for Economic Status (only for CSTs, CMA, and CAPA) M 40 – 54 Post-Test Workshop

  45. Subgroup Summary: Ethnicity for Economic Status M 51 – 54 Post-Test Workshop

  46. Subgroup Summary: Ethnicity for Economic Status Example: Economically disadvantaged for each ethnicity M 51 – 54 Post-Test Workshop

  47. Subgroup Summary: Ethnicity for Economic Status M 51 – 54 Post-Test Workshop

  48. Break — 10 minutes

  49. Internet Reports • Summaries based on same data as paper reports: CSTs, CMA, CAPA, STS • Available to the public online for school, district, county, and state • “Students with Scores” = number tested with scores • CST summaries of % advanced and proficient • More subgroups than paper reports • Parent education • Special program participation • Access from http://star.cde.ca.gov/ M 95 – 108 Post-Test Workshop

  50. Internet Demonstration Post-Test Workshop
