
SBE Presentation on Performance Evaluations

April 19, 2012

Presentation Transcript


  1. SBE Presentation on Performance Evaluations April 19, 2012

  2. Context DDOE’s management of districts’ Race to the Top/Success plans comprises three major routines; performance evaluations, the focus of this presentation, are one of them.

  3. Performance Evaluations – Agenda: purpose of evaluations; what we heard

  Purpose of evaluations
  • The purpose of performance evaluations is to assess the impact of district plans and district performance overall, and to identify opportunities to improve performance before final funding decisions are made in June
  • The mid-year performance evaluation conversation focused on:
    • Initial thoughts on what is driving the district’s strengths and challenges
    • How the district will dig deeper to understand what is going on, and
    • How the district will then replicate its strengths and address its challenges

  What we heard
  • Most districts had already begun to drill down into their data at the school and grade level
  • Most districts had clear hypotheses for the drivers of their strengths
  • Across many districts, effective PLCs, RTTT-funded specialists, and extended learning time programs were cited as drivers of district strengths
  • Most districts felt that they would need further analysis to understand the root causes of their challenges
  • Across many districts, “initiative overload” was cited as a potential cause of district challenges

  4. Performance Dashboards – Purpose, Status and Use

  Purpose of dashboards
  • Performance evaluation dashboards provide a picture of LEAs’ performance against their Race to the Top goals, key state performance measures, and LEA-specific performance measures
  • The dashboards were the primary focus of LEAs’ performance evaluations

  Status and use of dashboards
  • For the February 2012 performance evaluations, all data are formative and only mark progress toward LEAs’ RTTT goals (which begin in spring 2012)
  • All 19 districts will have performance evaluations in June; 14 of the districts had an additional mid-year performance evaluation at the end of February (based on grant size and/or performance to date)
  • The dashboards are draft/for internal use only – please see the Performance Evaluation Overview for more information on this classification

  5. Performance Dashboards – Guide to Understanding (State Example)

  Dashboard columns:
  • Where are we in winter 2012?
  • Where are we vs. last winter?
  • What was our fall-to-winter growth?
  • What was our fall-to-winter growth vs. last year?
  • What is our spring 2012 goal?
  • How many more students are needed to meet the goal?
  • What is our spring 2015 goal?

  How to read the dashboard:
  • Colors are based on district performance vs. the state (green = above the state; red = below the state)
  • Arrows are based on district performance this year vs. the previous year (up = performance has improved; neutral = performance has stayed within 1 percentage point; down = performance has declined)
  • Goals are based on reducing non-proficiency by 50% by 2015 – a methodology similar to the one used in the ESEA Flexibility Application
  • The “additional students to meet goal” calculation is based on the number of students who took the winter test, so it may not be exact
  • These rules are illustrated in the sketch that follows
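  The color, arrow, and goal rules above reduce to a few small calculations. The Python sketch below illustrates them; the function names, the tie handling in cell_color, and the example figures are illustrative assumptions, not DDOE’s actual implementation.

    import math

    def cell_color(district_rate, state_rate):
        """Color a dashboard cell by district performance vs. the state:
        green = above the state, red = below (ties shown red here; the
        slide does not specify tie handling)."""
        return "green" if district_rate > state_rate else "red"

    def trend_arrow(this_year, last_year):
        """Arrow for this year vs. the previous year: neutral if the rate
        stayed within 1 percentage point, otherwise up or down."""
        delta = this_year - last_year
        if abs(delta) <= 1:
            return "neutral"
        return "up" if delta > 0 else "down"

    def goal_2015(current_rate):
        """Spring 2015 goal: reduce non-proficiency by 50%, per the
        ESEA-flexibility-style methodology stated on the slide."""
        return current_rate + (100.0 - current_rate) / 2

    def students_to_goal(goal_rate, proficient_count, winter_tested):
        """'Additional students to meet goal', computed against the number
        of winter test-takers (which is why it 'may not be exact')."""
        needed = math.ceil(goal_rate / 100.0 * winter_tested) - proficient_count
        return max(needed, 0)

    # Hypothetical district: 62% proficient this winter (state: 65%),
    # 60% last winter, with 310 of 500 winter test-takers proficient.
    print(cell_color(62, 65))                # red
    print(trend_arrow(62, 60))               # up
    print(goal_2015(62))                     # 81.0 (halve the 38-point gap)
    print(students_to_goal(81.0, 310, 500))  # 95 more proficient students

  The same students_to_goal arithmetic, applied to the spring 2012 goal rather than the 2015 goal, yields the “how many more students are needed” column.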

  6. State Data Please see your handout for an overview of statewide trends based on the data.

  7. District Data
  • Each of the 14 districts with scheduled performance evaluations received an overview with the following components:
    • Plan highlights (from the plan submitted in June 2011)
    • Progress review strengths (from the progress review conducted in October/November 2011)
    • Performance strengths (from the dashboard generated in February 2012)
    • Opportunities to strengthen performance (from the dashboard generated in February 2012)
    • Additional relevant trends/data points (from the dashboard generated in February 2012)
  • All district-specific overviews were shared with the Innovation Action Team

  8. Next Steps

  Stakeholder communications
  • DDOE shared the state and district dashboards with all of the stakeholder groups that comprise the Innovation Action Team
  • DDOE offered each stakeholder group the opportunity to schedule an individual overview of the performance evaluation process and findings

  Public communications
  • DDOE publicly released the state dashboard, state summary, and district-specific strengths
  • DDOE will use existing communication opportunities (e.g., the Governor’s Rotary Club meetings) to highlight the performance evaluation process and findings
  • DDOE will publicly release end-of-year district dashboards in summer 2012
  • If the state’s ESEA flexibility application is approved, DDOE will align and disseminate communications regarding the RTTT performance dashboards and the new accountability changes – the two methodologies are very similar, with some differences

  Further analysis
  • DDOE used the February and March Chiefs’ meetings to further discuss district data and initiatives
  • DDOE is conducting further data analysis to identify district strengths, coupled with on-site visits in April to help understand the connection between district initiatives and performance data
