
The Importance of Coaching in Implementation of Evidence-based Practices




  1. The Importance of Coaching in Implementation of Evidence-based Practices Rob Horner University of Oregon www.pbis.org

  2. Goals • Current assumptions/research about coaching • Define our experience with coaching in PBS implementation • Implications for building district capacity

  3. Coaching Defined • Coaching is the active and iterative delivery of: • (a) prompts that increase successful behavior, and • (b) corrections that decrease unsuccessful behavior. • Coaching is done by someone with credibility and experience with the target skill(s) • Coaching is done on-site, in real time • Coaching is done after initial training • Coaching is done repeatedly (e.g. monthly) • Coaching intensity is adjusted to need

  4. Outcomes of Coaching • Fluency with trained skills • Adaptation of trained concepts/skills to local contexts, current challenges, and new challenges as they arise • Rapid redirection from misapplications • Increased fidelity of overall implementation • Improved sustainability, most often due to the ability to increase coaching intensity at critical points in time.

  5. Impact of training components on outcomes (Joyce & Showers, 2002): • Theory and Discussion: 10% knowledge, 5% skill demonstration, 0% use in classroom • + Demonstration in Training: 30% knowledge, 20% skill demonstration, 0% use in classroom • + Practice in Training: 60% knowledge, 60% skill demonstration, 5% use in classroom • + Coaching in Classroom: 95% knowledge, 95% skill demonstration, 95% use in classroom

  6. Coaching within SWPBS Implementation • Context: • 9000 schools implementing SWPBS nationally • Defining the Role • Internal vs External • Selecting Coaches • Training and support for coaches • Assessing Impact

  7. [Diagram] Leadership Team (Active Coordination), supported by Visibility, Political Support, and Funding; driving Training, Coaching, Behavioral Expertise, and Evaluation; grounded in Local Demonstration Schools

  8. Coaching vs. Training • Coaching involves active collaboration and participation, but not group instruction. • Small group • Build from local competence • Sustainable

  9. Who should be a coach? • Internal vs External • Internal coaches are employed in the school where they provide support • External coaches are employed outside the schools where they provide support (e.g. by district, region, state).

  10. Who should be a coach?

  11. Who should be a coach?

  12. What Coaches Do • Work with team during initial SW-PBS training • Meet with new teams monthly on-site • Telephone/email contact as needed • “Positive” nag • Self-assessment (EBS Survey, Team Checklist) • Action planning • Activity implementation • On-going evaluation • School self-evaluation efforts • State-wide Initiative evaluation efforts (SET) • Guide State-wide initiative • Feedback to Taskforce

  13. What Coaches Do • Dissemination of outcomes and effects • SWIS Facilitation • Implement and support use of data-based decision making.

  14. Commitment of Coaches • Team Support • First Year (1-2 teams) (participate in training and planning) • Second Year (Maintain initial teams, start 3-5 teams) • Future Years (10-15 teams total) • FTE commitment • 20-50% • Roles/Background • Behavior Specialists, Special Education Teachers • Consultants, Administrators • School Psychologists, Counselors, Social Workers

  15. Guiding Principles for Effective Coaching • Build local capacity • Become unnecessary…but remain available • Maximize current competence • Never change things that are working • Always make the smallest change that will have the biggest impact • Focus on valued outcomes • Tie all efforts to the benefits for children • Emphasize Accountability • Measure and report; measure and report; measure and report. • Build credibility through: • (a) consistency, (b) competence with behavioral principles/practices, (c) relationships, (d) time investment. • Precorrect for success

  16. Specific Expectations • Attend and participate in team training • Meet with your team(s) at least monthly • Provide technical assistance as needed • Monitor and report on team efforts • Team Checklist • EBS Survey/ SET/ ISSET • Annual Profile/Summary Data • Present on School-wide PBS at district, state, national forums. • Assist district to build capacity for sustained implementation (re-define your role over time) • Meetings with Coordinator and Taskforce for purposes of state-wide planning

  17. Assist Teams in Using Data for Decision-making • Using Team-Checklist and EBS Survey data for Team Action Planning • Using SET/ TIC data for evaluation • Using ODR/ Academic (ORF) data for assessment, planning and reporting. • Keeping faculty involved through regular data reporting.

  18. Examples • Illinois • North Carolina • Michigan

  19. PBIS in Illinois. Lucille Eber, Ed.D., IL PBIS Network. July 17, 2008. Developing Local Systems of Care for Children and Adolescents with Mental Health Needs and their Families, Training Institutes, Nashville, TN

  20. PBIS Schools Over Ten Years: Trained & Partially or Fully Implementing

  21. # IL PBIS Schools & # External Coaches, June 30, 2008

  22. The Organization of PBIS in Illinois (900 schools implementing SWPBS) • ISBE Coordination • Chicago Coordinators: 46 Coaches (10), 33 Schools • North Coordinators: 495 Coaches (84), 525 Schools • Central Coordinators: 193 Coaches (20), 203 Schools • South Coordinators: 105 Coaches (29), 127 Schools

  23. Elementary and Middle School Results (charts)

  24. Capacity: Schools per Coach per Region
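The capacity metric named on this slide can be illustrated with the regional counts reported on slide 22. A minimal sketch (the region figures are from slide 22; the ratio computation itself is an illustrative addition, not part of the original deck):

```python
# Schools and coaches per Illinois region, as reported on slide 22 (June 2008).
regions = {
    "Chicago": {"schools": 33, "coaches": 46},
    "North": {"schools": 525, "coaches": 495},
    "Central": {"schools": 203, "coaches": 193},
    "South": {"schools": 127, "coaches": 105},
}

def schools_per_coach(schools: int, coaches: int) -> float:
    """Capacity metric: average number of schools each coach supports."""
    return schools / coaches

for name, counts in regions.items():
    ratio = schools_per_coach(counts["schools"], counts["coaches"])
    print(f"{name}: {ratio:.2f} schools per coach")
```

The ratios stay close to one school per coach in every region, which is the point of the capacity slide: coaching load is kept low enough for monthly on-site support.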

  25. Illinois Suspension Rates per 100 Students: PBS slope = -1.15; Non-PBS slope = -0.37

  26. Illinois Suspension Rates per 100 for Black and Hispanic Students: PBS slope = -1.85; Non-PBS slope = -0.34
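The slopes on slides 25 and 26 are annual changes in suspensions per 100 students. A short sketch of what those trend lines imply over several years (slope values are taken from slide 25; the four-year horizon is an illustrative assumption):

```python
# Annual change in suspensions per 100 students, from slide 25 (all students).
PBS_SLOPE = -1.15      # schools implementing PBS
NON_PBS_SLOPE = -0.37  # comparison schools

def projected_change(slope: float, years: int) -> float:
    """Cumulative change in suspensions per 100 students after `years` years,
    assuming the linear trend holds."""
    return slope * years

# Over four years the PBS trend projects roughly 4.6 fewer suspensions
# per 100 students, versus roughly 1.5 for the non-PBS trend.
pbs_drop = projected_change(PBS_SLOPE, 4)
non_pbs_drop = projected_change(NON_PBS_SLOPE, 4)
```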

  27. North Carolina Positive Behavior Support Initiative: Partners' Update, February 2009. Heather R. Reynolds, NC Department of Public Instruction; Bob Algozzine, Behavior and Reading Improvement Center. http://www.dpi.state.nc.us/positivebehavior/

  28. North Carolina Positive Behavior Support Initiative. State PBS Coordinator: Heather R. Reynolds

  29. North Carolina Positive Behavior Support Initiative: Office discipline referral data (majors) from schools implementing PBS in North Carolina [07-08] compare favorably with national averages.

  30. North Carolina Positive Behavior Support Initiative: Levels of behavior risk in schools implementing PBS were comparable to widely accepted expectations and better than those in comparison schools not systematically implementing PBS.

  31. North Carolina Positive Behavior Support Initiative: Does [A]chievement cause [B]ehavior? Does [B]ehavior cause [A]chievement? Does [C]ontext cause both [A]chievement and [B]ehavior?

  32. Steve Goodman sgoodman@oaisd.org www.cenmi.org/miblsi

  33. Goals • Share information about Michigan’s Integrated Behavior and Learning Support Initiative (MiBLSi) • Provide examples of improving the quality and quantity of the data collected • Provide examples of acting upon project data to improve outcomes

  34. Participating Schools • 2000: Model Demonstration Schools (5) • 2004: Schools (21) • 2005: Schools (31) • 2006: Schools (50) • 2007: Schools (165)

  35. Project Data: Outcomes, Process, and System Development

  36. Major Discipline Referrals per 100 Students per Year (Schools implementing > 80% on Team Implementation Checklist)

  37. DIBELS Instructional Recommendations and Major Discipline Referrals per Cohort per Year (chart axes: Major Discipline Referrals; DIBELS Benchmark)

  38. Participating School Example: Fourth Grade Reading MEAP Results (chart marks when MiBLSi implementation began)

  39. Improving the quality and quantity of project data

  40. Percent of Process and System Data Collected by Cohort

  41. Improving the Accuracy and Consistency of Recording Office Discipline Referrals

  42. Developing Fluency with Discipline Referral Categories. Example Exercise 2: Match the example situation below to the correct problem behavior on the discipline categories answer sheet. Write the letter in the column for Exercise 2.

  43. Acting on the Data to Improve Classroom Management

  44. Major Discipline Referrals by Location (began focusing on classroom management support in 2005-2006)

  45. Improving Targeted Student Intervention. Interviews with staff and self-assessment indicate a need to develop targeted support systems.

  46. Checklist for Individual Student Support Systems (CISS) Results from Cohort 4 (n=34 schools)

  47. Improving Targeted Student Intervention Strategies • Building Leadership Teams: • “Quick Sort” process for identifying students and linking them to interventions • Focused training for practitioners: • Using the Behavior Education Program (Check-In/Check-Out)

  48. Supporting Coaches • Conducting self-assessment to identify needs • Providing support based on results • Coach training 2-4 times per year • Coach manual • Coach website • Coach conference (March 13-14)

  49. Coaches Self-Assessment (adapted from Sugai, Todd, & Horner, 2006)
