
DELIVERY BOOTCAMP Driving Delivery April 11, 2014



Presentation Transcript


  1. DELIVERY BOOTCAMP Driving Delivery April 11, 2014

  2. Session Objectives
  At the end of this session, you will:
  • Understand the core components and tools of driving delivery
  • Have heard how routines have been useful for the Oregon School Turnaround and the Hawai’i Department of Education
  Routines should be considered at the start of an engagement. They are the main tool to drive progress once planning is complete.

  3. We will focus on three essential elements that make up how a system drives delivery
  1. Develop a foundation for delivery
  • Define your aspiration
  • Review the current state of delivery
  • Build the delivery unit
  • Establish a “guiding coalition”
  2. Understand the delivery challenge
  • Evaluate past and present performance
  • Understand drivers of performance and relevant activities
  3. Plan for delivery
  • Determine your reform strategy
  • Set targets and establish trajectories
  • Produce delivery plans
  4. Drive delivery
  • Establish routines to drive and monitor performance
  • Solve problems early and rigorously
  • Sustain and continually build momentum
  5. Create an irreversible delivery culture
  A. Build system capacity all the time
  B. Communicate the delivery message
  C. Unleash the “alchemy of relationships”

  4. What are routines? What purpose do routines serve?
  What are routines?
  • Regularly scheduled checkpoints to assess if delivery is on track
  • The engine that drives delivery forward: without routines, delivery will stall or eventually fall off the agenda
  • A source of structure and discipline to create order in complex public sector systems
  What purpose do routines serve?
  • Monitor performance: understand if the system is on track to deliver aspirations, using predetermined assessment frameworks
  • Diagnose problems: surface issues that are inhibiting progress and analyze data to pinpoint causes
  • Address problems: provide a venue to discuss and decide how to overcome challenges
  In fact, routines are so critical to the success of a delivery effort that we like to call them the “engine” of delivery.

  5. Effective routines share similar characteristics

  6. This self-assessment rubric can be used to assess and improve existing routines

  7. To prep for stocktakes or otherwise understand progress (or lack thereof), goal teams may assess the likelihood of delivery
  Assessment Framework: each judgment receives a rating and a rationale summary.
  • Recent performance against trajectory and milestones
  • Degree of challenge (rated L/M/H/VH)
  • Quality of planning, implementation and performance management
  • Capacity to drive progress
  • Stage of delivery (rated 1/2/3/4)
  Together these roll up into an overall judgment: likelihood of delivery.
  Key:
  • Red: Highly problematic, requires urgent and decisive action
  • Amber/Red: Problematic, requires substantial attention, some aspects need urgent attention
  • Amber/Green: Mixed, aspect(s) require substantial attention, some good
  • Green: Good, requires refinement and systematic implementation

  8. To arrive at a likelihood of delivery, we use this Assessment Framework

  9. Judgments can be rolled up for a succinct message to the system leader.
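One way to picture this roll-up is a minimal sketch: the rating names below come from the framework's key, but the "worst rating wins" aggregation logic is an illustrative assumption, not the delivery unit's actual tool.

```python
# Roll up component RAG-style judgments into one summary rating for the
# system leader by taking the most problematic rating among them.
RATINGS = ["Green", "Amber/Green", "Amber/Red", "Red"]  # best -> worst

def roll_up(judgments):
    """Return the most problematic rating among the component judgments."""
    return max(judgments, key=RATINGS.index)

# Hypothetical component judgments for one goal.
goal_judgments = {
    "Recent performance against trajectory": "Amber/Green",
    "Quality of planning": "Green",
    "Capacity to drive progress": "Amber/Red",
}
print(roll_up(goal_judgments.values()))  # -> Amber/Red
```

Under this scheme, the single worst rating is the succinct message that reaches the system leader, with the component judgments kept as supporting detail.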

  10. ODE School Improvement: “How are schools doing?” (HASD) Quarter 1 Review, December 13, 2013

  11. The Turnaround routine (“How are schools doing?”) had these objectives
  • Understand overall goal trajectory and progress
  • Review the overall picture of schools
  • RNCs reflect on school status (region bright spots & challenge areas)
  • Review schools by region: trends, bright spots, problem-solving areas
  • So what do we do about it? Review next steps
  • Reflect on the routine and identify next steps

  12. Roles and responsibilities in the Oregon “How are schools doing?” turnaround routine
  ODE Director of School Improvement
  • Asks tough questions that challenge and support
  • Actively engages in problem-solving
  • Holds others accountable for results
  Coaches
  • Provide direct support to schools
  • Hold day-to-day accountability for the plan’s success
  • Provide current assessment of progress in schools
  Education Northwest Director
  • Asks tough questions
  • Ensures that coaches are providing support and building capacity
  • Provides support to RNCs and coaches
  ODE Staff Support
  • Designs agenda, keeps meeting on track
  • Prepares data and evaluations
  • Works with ODE Director, RNCs, and coaches to prepare
  RNC
  • Holds day-to-day accountability for the plan’s success
  • Manages the coaches to implement supports
  • Works with ODE and provides evidence for current assessment of progress
  ODE Education Specialists
  • Hold day-to-day accountability for the plan’s success
  • Manage a strategy for ODE Turnaround
  • Share project insights into school implementation

  13. The routine started with a review of overall progress How are schools doing?

  14. The routine addresses two major goals for school turnaround and four major strategies to support schools
  ODE Goals…
  Support to schools…
  • Network
  • CAP plans
  • SST pilot
  • Supports and Interventions
  Routine frequency
  • Four times a year: beginning of December, end of February, end of May, and end of August
  • Sessions lasting 2-4 hours

  15. The team looked at overall progress on the top-level metrics… Overall school ratings… Overall growth and subgroup growth…

  16. …And then focused on interim data to understand progress towards improving results
  Data included in routine:
  • School and strategy assessment framework ratings
  • EGMS
  • Indistar
  • CAP Reviews
  • Formative assessment data: 70 schools reported reading data; 37 schools reported math data
  Data missing in routine:
  • 5 coaches did not submit survey data (East Gresham, Glenfair, Ontario, Community, Cascade)
  • 2 schools are new and do not have a coach, so no information was reported (Nixyaawii, Roosevelt)
  • SST data is included for only 2 schools
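A routine like this needs a quick pre-meeting check of which data are in hand and which are missing. A hypothetical sketch (the school names appear on the slide, but "Alder" and the submission records themselves are invented for illustration):

```python
# Classify each school's data submission so the "data included / data
# missing" lists for the routine can be generated automatically.
# None = new school with no coach; [] = coach did not submit.
submissions = {
    "East Gresham": [],                   # coach did not submit survey data
    "Nixyaawii":    None,                 # new school, no coach assigned
    "Alder":        ["reading", "math"],  # hypothetical complete school
}

def data_status(school):
    """Return a short status string for the pre-meeting data check."""
    data = submissions.get(school)
    if data is None:
        return "no coach assigned"
    if not data:
        return "missing survey data"
    return "reported: " + ", ".join(data)

for school in submissions:
    print(f"{school}: {data_status(school)}")
```

Surfacing the gaps explicitly, as the slide does, keeps the discussion honest about what the interim data can and cannot show.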

  17. Coaches rated the likelihood that each strategy would move their school to the overall rating goals. At the strategy level, evidence of progress is the area of greatest need.

  18. Coaches used the assessment framework to assess the likelihood of delivery overall and of each school’s strategies
  Judgment areas:
  • Quality of planning: Does planning include strategies and measures of progress? How, and how much, does each strategy impact the goal?
  • Capacity to drive progress: How well are we engaging with all of those involved, from leader to student? How are we building capacity to drive implementation?
  • Evidence of effectiveness: What do the latest data say about our progress on the goal and its underlying strategies?
  Together these determine the likelihood of delivery.
  Key:
  • Red: Highly problematic, requires urgent and decisive action
  • Orange: Problematic, requires substantial attention, some aspects need urgent attention
  • Yellow: Mixed, aspect(s) require substantial attention, some good
  • Green: Good, requires refinement and systematic implementation

  19. Most schools reported their formative assessment results, but some were not even collecting them. Regions show considerable variation in the % at core.

  20. Fourteen schools reported Q1 and Q2 Reading Formative Assessment data Some schools are testing all students, some are progress monitoring a subset.

  21. These indicators were used to assess the level of rigor in the schools’ planning…

  22. …and this one was used to determine whether schools were spending the funds available to them Schools that claimed <100% in school year 2012-13

  23. The RNCs made some overall reflections… How are schools doing?

  24. …And then reviewed each region to determine next steps How are schools doing?

  25. These questions guided the discussion in each region
  • What is one school that is a bright spot? What is going well?
  • What are the top 2 most challenged schools?
  • What is most critical to improvement?
  • What next steps are needed to support these schools?

  26. Region Strategy Implementation Data

  27. Region: Bright Spot

  28. Region: Challenging School of Focus

  29. Region: Challenging School of Focus

  30. The team discussed how the routine went after discussing each region
  • What worked about this routine?
  • What can we improve or change?

  31. Exercise: Using the assessment framework
  What: We are rating our likelihood of implementing the following outcome: Happy hour will start promptly at 5 pm today.
  How:
  • Individually, use the assessment framework to come to a rating on overall likelihood of success by thinking about the quality of planning, capacity, and evidence related to this goal (5 minutes)
  • As a whole group, share and calibrate the ratings (10 minutes)
  Materials: Assessment framework, flipchart, dots

  32. Hawai’i has seen encouraging results in its Race to the Top work and on NAEP
  Recent Hawai’i progress:
  • Went from High Risk status in Race to the Top to being removed from any risk status in two years
  • The NAEP 4th and 8th grade math average scale score increased 4 points from 2011 to 2013
  • The NAEP 8th grade reading average scale score increased 3 points from 2011 to 2013
  Note: Increases in NAEP scores are statistically significant according to NCES.

  33. Hawai’i’s educators are focused on the same six strategies at all levels (state, complex area, and school)
  Hawai’i’s six priority strategies:
  • Academic Review Teams
  • Common Core State Standards
  • Comprehensive Student Supports
  • Educator Effectiveness Systems
  • Formative Instruction/Data Teams
  • Induction and Mentoring

  34. The state team established routines where they use the implementation data to inform decisions

  35. During routines, school and complex area leaders come to a shared view of progress on each priority strategy Excerpt from strategy assessment rubric

  36. Data are then organized into complex area summaries and statewide reports that show field implementation progress Complex area and statewide field indicator summaries, September 2013

  37. The state team collects judgments from school routines
  Reporting structure for Hawai’i implementation data:
  • School-level routines: come to a shared view of progress; report to the Complex Area ART lead
  • Complex Area ART Lead: collects school-level data; discusses results with complex area leaders; reports to the state ART lead
  • State ART Lead: compiles data; generates report summaries for state-level routines
  • State-level routines: assess school and CA-level progress; determine next steps of support
  • Targeted state support: provide necessary support to CAs and schools
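The school-to-state roll-up can be sketched in a few lines. This is a hypothetical illustration of the reporting structure only: the school names, complex area names, and ratings are invented, and the real process involves human judgment at each level, not just counting.

```python
# Roll school-level strategy ratings up through complex areas into a
# statewide summary of how many schools sit at each rating.
from collections import Counter

school_ratings = {            # school -> rating for one priority strategy
    "School A": "Green",
    "School B": "Yellow",
    "School C": "Green",
    "School D": "Red",
}
complex_areas = {             # complex area -> its schools
    "CA-1": ["School A", "School B"],
    "CA-2": ["School C", "School D"],
}

def area_summary(schools):
    """Count schools at each rating within one complex area."""
    return Counter(school_ratings[s] for s in schools)

# The state ART lead's compilation step: combine area summaries.
statewide = Counter()
for area, schools in complex_areas.items():
    statewide.update(area_summary(schools))

print(dict(statewide))  # -> {'Green': 2, 'Yellow': 1, 'Red': 1}
```

Each complex area summary stays available on its own, so state-level routines can drill from the statewide picture back down to the areas and schools that need targeted support.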
