Presentation Transcript


  1. Making it Stick: Going from Training to Implementation Practice. 2013-2014 Webinar Series, February 26, 2014, 3:30–4:30 p.m. This training is supported by a Statewide Longitudinal Data Systems grant from the U.S. Department of Education.

  2. Webinar Focus • Review implementation areas of the Direct Access to Achievement rubric. • Dig deeper into the descriptors from several implementation areas. • Extend your understanding of the rubric by using the Operationalizing and Optimizing descriptors and process tools to assess teams’ current effectiveness and set improvement goals.

  3. Overview. The Direct Access to Achievement Implementation Rubric covers six implementation areas: • Leadership • Problem-solving through data analysis • Curriculum and instruction • Assessment • Positive school climate • Family and community partnering. Within each area, indicators are grouped as structures, processes and procedures, and professional development.

  4. Overview. Why are the implementation descriptions grouped under these indicators (structures, processes and procedures, professional development)? Synthesis of the research! (Cosner, 2012; DuFour, DuFour, Eaker, & Karhanek, 2004; Hall & Hord, 2011; White, 2005). See also the 2012-2013 Webinar 1: A Leader’s Role in Developing and Enhancing Collaborative Data Practices.

  5. Overview Rubric Descriptors Each phase includes and extends the prior phase • Emerging: Establishing Consensus • Developing: Building Infrastructure • Operationalizing: Gaining Consistency • Optimizing: Innovating and Sustaining

  6. Dig Deeper into the Rubric Leadership—Structures Administrative structures are important because they may help or hinder the systems that support learning conditions for teachers and students.

  7. Dig Deeper into the Rubric. Leadership Structures. Anchor and Guiding Question 3: How are current policies and structures aligned with Direct Access to Achievement?

  8. Dig Deeper into the Rubric Which statement(s) describe(s) an optimizing level of implementation?

  9. Dig Deeper into the Rubric. Problem-solving through data analysis: a six-step process used to solve identified concerns.

  10. Using the Rubric Use the rubric to assess the level of implementation of problem-solving through data analysis. Establishing a baseline.

  11. Using the Rubric—establishing teams’ baselines. How would you classify most of the data team meetings you observe? • Meetings where teachers talk about data and assignment of students to interventions, and more rarely how they will change their core instruction. • Meetings where data are discussed and then blame is attributed to any number of factors, but rarely instruction. • Meetings where teachers mostly discuss workday logistics and other issues, and more rarely reflect on data, interventions, or instruction. • Meetings where teachers regularly confront their prior assumptions about the effectiveness of their teaching as supported by evidence (data), share their prior instructional actions, and seek or offer help (as appropriate) to make modifications to future instruction.

  12. Using the Rubric—establishing teams’ baselines. Leader observation: most meetings are ones where teachers talk about data and the assignment of students to interventions, and only rarely about how they will change their core instruction.

  13. Using the Rubric—establishing teams’ baselines Everyone does the best they can until they know better, and then they do better. If we expect teams to work toward increased effectiveness, then it is critical to identify where they are now and show them what “better” (increased effectiveness) looks like.

  14. Using the Rubric—establishing teams’ baselines. Assess current processes using the rubric. Guiding Question 2: How is the 6-step data/PLC team process used by educators to improve outcomes for students? Teams self-assess using the descriptions in the implementation rubric; the leader or a designated process observer assesses teams using the same descriptions.

  15. Using the Rubric and Observation Tool: Mechanical to Mastery.

  16. Using the Rubric and Observation Tool. The Team Observation Tool covers these team steps: • Agenda and Minutes • Norms and Participation • Data Organization and Analysis • Analysis of Strengths and Obstacles • Goals • Instructional Strategies • Results Indicators • Meeting Self-Reflection. For each step, indicators are provided: descriptions of team actions that indicate Proficient or Exemplary behaviors.

  17. Using the Rubric and Observation Tool. Collect some process data! • Use the observation tool to objectively collect information about the focus of data/PLC team meetings. • Minutes and agendas reveal the focus and content of team discussions. • The presence of SMART goals, strategies, and measures of fidelity of implementation reveals how much and how well teams are connecting data to instruction.

  18. Reflect on the Problem Identified. Connecting what we’ve learned from analysis to changes in the strategies used by the adults. • We’ve identified the problem areas; now how do we make them better?

  19. Reflect on the Problem Identified. Problem-Solving Through Data Analysis. • A team that is operationalizing: • Uses what it has learned from data analysis to guide changes in curriculum, instruction, and assessment. • Collects and analyzes data on the fidelity of curriculum and intervention implementation. • A team that is optimizing: • Routinely uses the 6-step data process to adjust programming. • Evaluates the quality of core instructional strategies, not just interventions. • Evaluates systemic trends.

  20. What about you? What percentage of the teams you observe are actually collecting data on the fidelity of their implementation?

  21. One School’s Example. System of PLCs: Goal. • Create a system of instruction and assessment that serves 80-90% of students (Tier 1). • Create a system of intervention to support students in accessing grade-level instruction (Tier 2 & Tier 3). • Create a system of data collection and analysis used for ongoing reflection on the following: - Student Growth - Student Learning/Effectiveness of Instruction - Effectiveness of Interventions

  22. One School’s Example. Year 1 of Implementation of PLC Teams. • Each grade level determined the data sources to analyze. • Teachers continually contributed to the questions used in the analysis. • Teachers used a variety of formative, interim, and summative assessments to develop a ‘whole’ picture: *State Exam Student Profile *Interim Exams *Classroom artifacts—work samples, writing samples, reading responses, reflections

  23. One School’s Example. PLC Routines—Fall. • Used multiple sources of data to determine the delivery system for each student. • The delivery system guided decision making in the use of time, resources, personnel, and schedule. • Criteria for “at risk” were determined based on triangulation of data. • Students were placed in interventions and their progress was monitored.

  24. One School’s Example. PLC Teams—Ongoing. Weekly PLC meetings used multiple sources of formative data to: • Align curriculum and instruction horizontally and vertically. • Identify what was working and what was not. • Keep reflections and use them to improve the guaranteed and viable curriculum.

  25. One School’s Example Were the interventions working?

  26. One School’s Example. PLC Teams—Mid-Year Progress Check. • Was the system of service working for each student? • Focus on growth. • Goal: a minimum of 1 year of growth in 1 year. • Use quadrant analysis: growth and level of intervention (instructional group).

  27. One School’s Example. Quadrant Analysis procedure: • Choose two related variables and collect data. • Use the data to group students into four quadrants. (The two variables here: Growth and Level of Intervention.)
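To make the grouping step concrete, here is a minimal Python sketch of the quadrant analysis described above. The student records, the growth values, and the 1.0 cut point (chosen to match the stated goal of one year of growth in one year) are hypothetical placeholders, not the school’s actual data or criteria.

```python
# Quadrant analysis sketch: group students by two related variables,
# here growth (years of growth in one year) and level of intervention.
# All names and numbers below are invented for illustration.

# Each record: (student, measured growth, receives intervention?)
students = [
    ("Student A", 1.2, False),
    ("Student B", 0.4, True),
    ("Student C", 1.1, True),
    ("Student D", 0.6, False),
]

GROWTH_TARGET = 1.0  # "a minimum of 1 year of growth in 1 year"

# One bucket per quadrant.
quadrants = {
    ("met growth target", "intervention"): [],
    ("met growth target", "no intervention"): [],
    ("below growth target", "intervention"): [],
    ("below growth target", "no intervention"): [],
}

for name, growth, in_intervention in students:
    growth_band = "met growth target" if growth >= GROWTH_TARGET else "below growth target"
    group = "intervention" if in_intervention else "no intervention"
    quadrants[(growth_band, group)].append(name)

for (growth_band, group), names in quadrants.items():
    print(f"{growth_band} / {group}: {', '.join(names) or 'none'}")
```

In a grouping like this, the “below growth target / intervention” quadrant is where the pattern the school discovered (next slide) would surface.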

  28. One School’s Example What was discovered? Students receiving interventions were not demonstrating growth as expected—little or no acceleration of their learning.

  29. One School’s Example Why were results not aligned with expectations? Looked at fidelity of implementation of system of delivery. Discovered the unexpected!

  30. One School’s Example Lesson Learned: Walk in the shoes of the most “at-risk” students to ensure a cohesive plan vs. a disjointed day.

  31. Use the Rubric to Set Goals. Curriculum and Instruction, Guiding Question 2: How are curriculum and instruction differentiated to meet student needs? In this example, curriculum and instruction were differentiated, but they weren’t meeting student needs!

  32. What do we want it to look like? • Meetings where teachers talk about data and assignment of students to interventions, and more rarely how they will change their core instruction. • Meetings where data are discussed and then blame is attributed to any number of factors, but rarely instruction. • Meetings where teachers mostly discuss workday logistics and other issues, and more rarely reflect on data, interventions, or instruction. • Meetings where teachers regularly confront their prior assumptions about the effectiveness of their teaching as supported by evidence (data), share their prior instructional actions, and seek or offer help (as appropriate) to make modifications to future instruction.

  33. Use a Process Tool to Inform Goal Setting: Force Field Analysis. Procedure: • Define the desired change. • Brainstorm driving and restraining forces. • Prioritize forces. • Identify action steps.

  34. Force Field Analysis
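The force field diagram from this slide isn’t reproduced in the transcript; as a stand-in, here is a minimal Python sketch of how a team might record, prioritize, and tally forces from such an analysis. The desired change, the force descriptions, and the 1–5 weights are invented examples, not content from the webinar.

```python
# Force field analysis sketch: list driving and restraining forces for a
# desired change, weight each force, and prioritize the strongest ones.
# All forces and weights below are hypothetical examples.

desired_change = "Data team meetings focus on changing core instruction"

# (force description, weight 1-5); higher weight = stronger force
driving_forces = [
    ("Leadership models data use in its own meetings", 4),
    ("Observation tool gives teams objective feedback", 3),
]
restraining_forces = [
    ("Meeting time consumed by workday logistics", 4),
    ("Blame attributed to factors outside instruction", 5),
]

def prioritize(forces):
    """Sort forces strongest-first so action planning targets high-impact items."""
    return sorted(forces, key=lambda force: force[1], reverse=True)

print(f"Desired change: {desired_change}")
print("Driving forces (prioritized):")
for description, weight in prioritize(driving_forces):
    print(f"  +{weight}  {description}")
print("Restraining forces (prioritized):")
for description, weight in prioritize(restraining_forces):
    print(f"  -{weight}  {description}")

# A rough net score; negative suggests restraining forces currently dominate,
# so action steps should weaken the top restraining forces first.
net = sum(w for _, w in driving_forces) - sum(w for _, w in restraining_forces)
print(f"Net force score: {net:+d}")
```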

  35. What about you? What have you learned from this school's example that applies to teams with which you are currently working? • Add your comments in the polling window.

  36. Webinar Review • The Direct Access to Achievement Implementation Rubric provides a tool for integrating initiatives and process tools within a system of continuous improvement. • The rubric is useful for assessing team needs and setting goals for improvement of implementation.

  37. Looking Ahead. Webinar 3: “What difference is this making? Evaluating program effectiveness and fidelity of implementation.” April 23, 2014, 3:30–4:30 p.m.
