
Presentation Transcript


  1. Digging into Statewide Assessment Results (a DATA Project approach to understanding assessment results). COSA Assessment Conference. Mickey Garrison, Tony Alpert, Jon Wiens, Beth LaDuca

  2. Essential DATA Questions
     • Need a question about preliminary results!
     • What data are essential to ensuring continuous improvement? (strand 2)
     • Where can administrators find state assessment and report card data? (strand 1)
     • What tools can be used to help focus analysis, planning, and evaluation? (strand 2)

  3. “Funneling” Data for Improving Instruction
     Funnel: Broadly Explore Successes & Challenges → Winnow Data → Infer Cause/Effect Relationships → SMART Goals → Hypothesize Improvement Strategies
     • Start with OAKS results to narrow the focus for digging deeper at the classroom level.
     • Use student and adult data collected locally to dig deeper and gain clearer insights.

  4. Comparing performance at a global level using public results available at ODE Accountability/Reporting: http://www.ode.state.or.us/search/results/?id=172 (charts: District A vs. District B)

  5. Comparing performance at the sub-group level using public results available at ODE Accountability/Reporting: http://www.ode.state.or.us/search/results/?id=172 (charts: Reading, 2008-09, by gender; Math, 2008-09, by LEP status)
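
A minimal sketch of the kind of side-by-side comparison these public reports support. The district labels come from the slides; the grades and percent-met values below are illustrative placeholders, not actual ODE results:

```python
# Hypothetical percent of students meeting/exceeding the OAKS benchmark.
# Real figures would come from the ODE Accountability/Reporting page above.
district_a = {"Grade 3": 78.2, "Grade 5": 71.5, "Grade 8": 65.0}
district_b = {"Grade 3": 81.0, "Grade 5": 69.4, "Grade 8": 70.3}

print(f"{'Grade':<10}{'District A':>12}{'District B':>12}{'Difference':>12}")
for grade in district_a:
    diff = district_a[grade] - district_b[grade]
    print(f"{grade:<10}{district_a[grade]:>12.1f}{district_b[grade]:>12.1f}{diff:>+12.1f}")
```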

  6. Subtest Scores: Use with Caution!
     • Subtest or strand scores are based on fewer items.
     • Fewer items = greater measurement error = lower reliability of scores.
     • Use with caution.
     • Understand this limitation.
     • Triangulate: look for convergent evidence to develop interventions or instructional plans for students.
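
To make the "fewer items = greater error" bullet concrete, here is a small sketch using two standard psychometric formulas: the Spearman-Brown prophecy formula (how reliability changes with test length) and the standard error of measurement, SEM = SD × √(1 − reliability). The full-test length, reliability, and score standard deviation below are hypothetical values chosen only to show the pattern, not OAKS parameters:

```python
import math

def spearman_brown(reliability: float, length_ratio: float) -> float:
    """Predicted reliability when a test is shortened/lengthened by length_ratio."""
    return (length_ratio * reliability) / (1 + (length_ratio - 1) * reliability)

def sem(sd: float, reliability: float) -> float:
    """Standard error of measurement: SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1 - reliability)

FULL_TEST_ITEMS = 45          # hypothetical full-test length
FULL_TEST_RELIABILITY = 0.92  # hypothetical full-test reliability
SCORE_SD = 10.0               # hypothetical standard deviation of scale scores

for items in (45, 15, 9):     # full test vs. strand-sized subsets of items
    rel = spearman_brown(FULL_TEST_RELIABILITY, items / FULL_TEST_ITEMS)
    print(f"{items:>2} items: reliability ~ {rel:.2f}, SEM ~ {sem(SCORE_SD, rel):.1f} points")
```

With these assumed numbers, cutting a 45-item test down to a 9-item strand roughly doubles the error band around the score, which is why strand scores call for triangulation rather than stand-alone decisions.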

  7. Elbow Partner: Describe a time when someone (or you!) used strand data appropriately. Be prepared to report out. http://www.ascd.org/publications/educational-leadership/feb09/vol66/num05/Unraveling-Reliability.aspx

  8. Why do you care about measures of error?
     • Measures of error for group scores tell you how much confidence you can have that you have pinpointed the group's actual performance (they provide a range, or band, of scores).
     • Group-level error is influenced by the size of the group and by the spread of the scores of students in the group.
     • At the student level, the measure of error provides the range of values within which the student would be likely to score again.
     • Student-level error is influenced by the number of items a student answers for a particular score category: the more items answered, the smaller the measure of error.
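
A minimal sketch of the group-level point: the error band around a group mean is driven by the spread of the scores and the size of the group (standard error ≈ SD / √n), so the same spread yields a much tighter band for a larger group. The scores below are made up for illustration:

```python
import math
import statistics

def mean_with_band(scores, z=1.96):
    """Group mean plus an approximate 95% confidence band (mean +/- z * SD/sqrt(n))."""
    mean = statistics.mean(scores)
    se = statistics.stdev(scores) / math.sqrt(len(scores))
    return mean, mean - z * se, mean + z * se

small_group = [231, 228, 240, 236, 225]   # 5 students, hypothetical scale scores
large_group = small_group * 6             # same spread of scores, 30 students

for label, group in (("n=5", small_group), ("n=30", large_group)):
    mean, low, high = mean_with_band(group)
    print(f"{label:>5}: mean {mean:.1f}, band {low:.1f} to {high:.1f}")
```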

  9. Additional Reports: Class Roster Report. Compare the size of the measurement error for total mathematics vs. each of the five subscores.

  10. Combined Student Report

  11. Growth ISR

  12. Find a computer that may or may not be attached to another human. Use the Public Assessment Reports at http://www.ode.state.or.us/search/results/?id=172 and look at a district that is similar to yours (geography, size, demographics, etc.); compare and contrast its student achievement with yours.

  13. A simple framework: Funneling data
      Funnel: Broadly Explore Successes & Challenges → Winnow Data → Infer Cause/Effect Relationships → SMART Goals → Hypothesize Improvement Strategies
      • Use process tools to winnow data, analyze for meaning, set SMART goals, hypothesize improvement strategies, and facilitate decision making.
      • Use simple or complex data tools to collect and organize data and gain insights from your data.

  14. Data Team Process
      1. Collect and chart data
      2. Analyze strengths and obstacles
      3. Establish goals: set, review, revise
      4. Select instructional strategies
      5. Determine results indicators
      Discussion: What impacts the effectiveness of data teams and the use of data to inform instruction? What roadblocks exist that limit effectiveness?

  15. Strand Means, OAKS Math Results, 2008-2009

  16. Strand Means, Preliminary OAKS Math Results, 2009-2010

  17. Strand Means, OAKS Reading Results, 2008-2009

  18. Strand Means, Preliminary OAKS Reading Results, 2009-2010

  19. Planning for change requires an understanding of factors that impact outcomes.

  20. Gap Analysis: Using Multiple Measures to Analyze the Gap between Hispanic and White Students 2008-2009
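
In the spirit of this slide, a sketch of a simple multi-measure gap table. The student groups are the ones named on the slide; the measures and percentages are illustrative placeholders, not the district's 2008-2009 results:

```python
# Hypothetical percent meeting/exceeding (or meeting a local target) by measure and group.
measures = {
    "OAKS Reading":      {"Hispanic": 58.0, "White": 74.0},
    "OAKS Math":         {"Hispanic": 54.0, "White": 70.0},
    "Attendance >= 90%": {"Hispanic": 82.0, "White": 88.0},
}

print(f"{'Measure':<20}{'Hispanic':>10}{'White':>8}{'Gap':>7}")
for measure, groups in measures.items():
    gap = groups["White"] - groups["Hispanic"]
    print(f"{measure:<20}{groups['Hispanic']:>10.1f}{groups['White']:>8.1f}{gap:>7.1f}")
```

Looking at the gap across several measures at once is what lets a team triangulate causes rather than react to a single test score.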

  21. How do you bring the elements of your Antecedents, Instructional Strategies, Causes and Effects together? Triangulate!

  22. “Getting Powerful Meaning from Data” Triangulation!

  23. L2 Matrix*
      Axes: Achievement of Results × Understanding of the Antecedents of Excellence
      Quadrants: Leading, Lucky, Learning, Losing Ground
      * Source: The Leadership and Learning Center
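
One way to read the matrix is as a classification on two questions: are results high, and are the antecedents (the adult actions behind the results) understood? A small sketch of that reading; the quadrant labels come from the slide, while the yes/no logic below is an illustrative assumption, not the Leadership and Learning Center's scoring method:

```python
def l2_quadrant(high_results: bool, antecedents_understood: bool) -> str:
    """Illustrative mapping of a school/team onto an L2 Matrix quadrant."""
    if high_results and antecedents_understood:
        return "Leading"       # strong results, and the team knows which practices drive them
    if high_results and not antecedents_understood:
        return "Lucky"         # strong results that the team cannot explain or replicate
    if not high_results and antecedents_understood:
        return "Learning"      # results not there yet, but the causes are understood
    return "Losing Ground"     # weak results with no understanding of the antecedents

print(l2_quadrant(high_results=True, antecedents_understood=False))  # -> Lucky
```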

  24. Worksheet (L2 Matrix follow-up): two blank numbered lists (1-5 each), “Evidence that Convinces” and “What’s Working?”, tied to the quadrant pairs Lucky/Leading and Losing/Learning and to Antecedents of Excellence & Replication.

  25. For more information: www.oregondataproject.org. Contact: Mickey Garrison, 541-580-1201, mickey@oregoneesc.org

