
Evaluation for School-wide PBIS


Presentation Transcript


  1. Evaluation for School-wide PBIS Bob Algozzine, University of North Carolina at Charlotte; Rob Horner, University of Oregon www.pbis.org December 9, 2011 Acknowledgements: George Sugai, Susan Barrett, Celeste Rossetto Dickey, Lucille Eber, Donald Kincaid, Timothy Lewis, Tary Tobin

  2. Goals/Assumptions for Webinar • Goals: Participants will be able to… • Define the core elements of an evaluation plan • Define a schedule for evaluation data collection • Define the core elements of an evaluation report • Assumptions: Participants already… • Can define core features of SWPBIS • Are implementing SWPBIS • Have reviewed the SWPBIS Evaluation Blueprint

  3. Assumptions • Every evaluation is unique. Any specific evaluation report will be affected by: • Funding and political focus • Number of years implementing PBIS • Focus on Tier II and Tier III • Additional emphasis areas (Wraparound, Bully Prevention, Discipline Disproportionality)

  4. Supporting Materials • Tobin et al., 2011: Fidelity Matrix • Childs, George & Kincaid, 2011 (eval brief): Outline of Evaluation Report • Evaluation Blueprint • Implementation Blueprint • Exemplar state evaluation reports (Illinois, Colorado, Missouri, Maryland, Florida, North Carolina)

  5. Foundations • Evaluation is the process of gathering data to • Answer questions • Make decisions • Define the core questions/decisions before selecting measures. • Provide an adequate context for interpreting the data. • Effective evaluation is cyclical (repeated) • Use the evaluation process for improvement

  6. Evaluation Questions/Cycle [diagram: a Plan, Measure, Compare, Perform cycle spanning the evaluation questions of Context, Input, Fidelity, Impact, Replication, and Sustainability]

  7. Evaluation Cycle • Core Questions → Evaluation Plan → Data Collection → Evaluation Report

  8. Core Evaluation Questions Context • What are/were the goals and objectives for SWPBIS implementation? (state/district capacity; school adoption; student outcomes) • Who delivered the training and technical assistance for SWPBIS implementation? Input • What professional development events were part of SWPBIS implementation support? • Was the projected level of TA capacity provided (training/coaching)? • Who received the professional development (schools/cohorts)?

  9. Core Evaluation Questions Fidelity • To what extent was Tier I SWPBIS implemented with fidelity? • To what extent were Tier II and Tier III SWPBIS implemented with fidelity? • To what extent is the Leadership Team establishing core functions?

  10. [diagram: Implementation Blueprint. A Leadership Team providing Active Coordination is supported by Funding, Visibility, Political Support, and Policy, and delivers Training, Coaching, Behavioral Expertise, and Evaluation to Local School/District Teams/Demonstrations] Sugai et al., www.pbis.org

  11. Core Evaluation Questions Impact • To what extent is SWPBIS associated with changes in student behavioral outcomes? • To what extent is SWPBIS associated with changes in academic performance and dropout rates? • To what extent is district/state capacity established? (local training, coaching, evaluation, behavioral expertise) • To what extent is leadership and policy structure established?

  12. Core Evaluation Questions Replication, Sustainability, and Improvement • To what extent do Fidelity and Impact outcomes sustain across time? • To what extent does initial SWPBIS implementation affect implementation with later cohorts? • To what extent did SWPBIS implementation change educational/behavioral capacity/policy? • To what extent did SWPBIS implementation affect systemic educational practice?

  13. Evaluation Plan • Context • Evaluation questions, stakeholders, purpose(s) • Schedule of reports for stakeholder decision-making • Input • Who will provide what TA/training (and how much) to whom • Fidelity • What data, when collected, by whom to assess: • Leadership Team • School teams (Tier I, Tier II, Tier III) • Impact • What data, when collected, by whom to assess: • District/state capacity (training, coaching, expertise, evaluation) • Student behavior • Student academics • Replication, Sustainability, Policy
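Where a team tracks these plan components programmatically, the outline above can be captured as structured data. The Python sketch below is purely illustrative: every field name and value is a hypothetical placeholder, not something prescribed by the Evaluation Blueprint.

```python
# Hypothetical sketch of the four-part evaluation plan outlined above.
# All names and values are illustrative placeholders.
evaluation_plan = {
    "context": {
        "questions": ["To what extent was Tier I SWPBIS implemented with fidelity?"],
        "stakeholders": ["state leadership team", "district coordinators"],
        "report_schedule": "annual report each June",
    },
    "input": {
        # Who provides what TA/training, how much, and to whom.
        "activities": [
            {"provider": "state trainers", "event": "Tier I team training",
             "amount": "4 days", "recipients": "cohort 1 school teams"},
        ],
    },
    "fidelity": {
        # What data, collected when, and by whom.
        "measures": [
            {"measure": "TIC", "when": "every 3-4 team meetings", "by": "school team"},
            {"measure": "SET", "when": "annually", "by": "external evaluator"},
        ],
    },
    "impact": {
        "measures": [
            {"measure": "ODR per 100 students", "when": "monthly", "by": "school team"},
        ],
    },
}
```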

  14. Evaluation Plan • Context • Define the roles of the evaluation • Stakeholders, Implementers, Adopters, Evaluators • Define the purpose of the evaluation • What decisions are to be affected by the evaluation? • What questions need to be answered to make informed decisions? • Define the basic goals, timeline and constraints associated with the implementation effort

  15. Evaluation Plan • Input • Who will provide what training and technical assistance… on what schedule… to improve the leadership team's capacity to sustain and scale up SWPBIS? • Who will provide what training and technical assistance to school teams to result in Tier I implementation of SWPBIS? • Who will provide what training and technical assistance to school teams to result in Tier II and Tier III implementation of SWPBIS?

  16. Evaluation Plan • Fidelity

  17. Evaluation Plan • Impact

  18. Data Collection • 8 core PBIS measures • (Tobin et al., 2011; Childs et al., 2011) • Basic logic • Research quality measures • Annual self-assessment measures • Progress monitoring measures • (to be used every 3-4 meetings/cycles) • Fidelity measure matrix (Tobin & Vincent)
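Slide 18 sorts the eight core measures into three cadence categories. A minimal sketch of that grouping appears below; the assignment of instruments to categories is an assumption based on how these tools are commonly used (SET and ISSET administered by external evaluators; BoQ, SAS, and BAT completed annually; TIC and MATT used for progress monitoring), not a verbatim copy of the Tobin & Vincent fidelity measure matrix, and it lists only the seven instruments named elsewhere in this deck.

```python
# Assumed grouping of fidelity measures by cadence; the category
# assignments are illustrative, not quoted from Tobin & Vincent.
FIDELITY_MEASURES = {
    "research_quality": ["SET", "ISSET"],             # external evaluation
    "annual_self_assessment": ["BoQ", "SAS", "BAT"],  # once per year
    "progress_monitoring": ["TIC", "MATT"],           # every 3-4 meetings/cycles
}

def progress_measures_due(meeting_number: int, cycle: int = 4) -> list[str]:
    """Return the progress-monitoring tools due at this team meeting,
    assuming (hypothetically) one administration per `cycle` meetings."""
    if meeting_number % cycle == 0:
        return list(FIDELITY_MEASURES["progress_monitoring"])
    return []
```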

  19. Data Collection

  20. Data Collection Schedule

  21. Data Collection Schedule

  22. Evaluation Report • Basic Outline • Context, Input, Fidelity, Student Outcomes, Future Directions • Local Adaptation • Cohorts • Locally important questions • Disproportionality • Bully Prevention • Cost • Change in evaluation focus over time • Increased emphasis on Tier II and Tier III • Examples • At www.pbis.org/evaluation.

  23. Evaluation Report • Context • Define SWPBIS • SWPBIS is a framework for establishing a school-wide social culture, with the individualized supports needed for all students to achieve academic and social success. • Define goals of the specific project • Number of schools per year implementing SWPBIS • How schools were selected • Expectation of 2-3 years for Tier I implementation to reach criterion • Expectation of 2 years of criterion implementation to affect academic outcomes • Development of district/state capacity • Capacity needed for sustained and scaled implementation • Behavioral and academic outcomes for students • Student outcomes linked to fidelity of implementation • Define stakeholders/evaluation questions • Evaluation report is written at the request of: • Evaluation report is focused on the following key questions:

  24. Evaluation Report • Input • Who received what support, and from whom? • Leadership team • Local Capacity Building • Training, Coaching, Behavioral Expertise, Evaluation • School teams

  25. Leadership Team • Planning Dates • Development of Implementation Plan • Dates of SWPBIS Implementation Self-Assessment • Capacity Building • Training, Coaching, Behavioral Expertise, Evaluation • Data collection systems

  26. Schools; Coaches; Trainers

  27. Number of Schools, Use of Fidelity Data, and Access to ODR Data

  28. Evaluation Report • Impact on SWPBIS Fidelity • Leadership Team • SWPBIS Implementation Self-Assessment • Capacity Development • Number of trainers/coaches available to support teams/districts • Behavioral expertise available to support Tier II and Tier III implementation • Evaluation capacity (data collection, data use, information distribution) • School Teams • Tier I Implementation (TIC, BoQ, SET, SAS) • Collectively, and/or by training cohort • Tier II / Tier III Implementation (MATT, BAT, ISSET) • Collectively, and/or by training cohort • Additional measures of fidelity • Phases of Implementation • CICO checklist

  29. Evaluation Report

  30. Evaluation Report • Fidelity of Leadership Team development • SWPBIS Implementation Self-Assessment sub-scales

  31. Evaluation Report • Fidelity (TIC, BoQ, SET, SAS) [chart: Total Score by school]

  32. Evaluation Report • Fidelity (cohort) [chart: Total Score, N = 17]

  33. Evaluation Report • Fidelity (subscales) [chart: TIC subscale scores]

  34. Evaluation Report BoQ Subscale Report

  35. Evaluation Report • Tier II and Tier III Fidelity [chart: ISSET/BAT/MATT Total Scores at Time 1 and Time 2, cohort N = 17]

  36. Evaluation Report • Fidelity (Tier II & III)

  37. [chart: percentage of schools meeting criterion at Tier I, Tier II, and Tier III]

  38. Evaluation Report • Impact: Student outcomes

  39. Steve Goodman sgoodman@oaisd.org www.cenmi.org/miblsi

  40. Participating Schools • 2000 Model Demonstration Schools (5) • 2004 Schools (21) • 2005 Schools (31) • 2006 Schools (50) • 2007 Schools (165) • 2008 Schools (95) • 2009 Schools (150*) • Total of 512 schools in collaboration with 45 of 57 ISDs (79%). The strategies and organization for initial implementation need to change to meet the needs of larger-scale implementation.

  41. Average Major Discipline Referral per 100 Students by Cohort
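As a concrete illustration of the metric behind this chart: referral counts are standardized per 100 enrolled students so that schools of different sizes can be averaged within a cohort. The function and numbers below are hypothetical.

```python
def odr_per_100(referrals: int, enrollment: int) -> float:
    """Major office discipline referrals per 100 enrolled students:
    (referrals / enrollment) * 100."""
    return referrals / enrollment * 100

# Hypothetical school: 540 major referrals, 450 students enrolled.
print(odr_per_100(540, 450))  # 120.0 ODRs per 100 students

# Averaging per-school rates within a training cohort yields the
# cohort values plotted in a chart like this one.
cohort_rates = [120.0, 85.5, 60.2]  # hypothetical schools
print(sum(cohort_rates) / len(cohort_rates))  # 88.56...
```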

  42. Focus on Implementing with Fidelity, using Benchmarks of Quality (BoQ)/ODR, '06-'07 and '07-'08 [chart labels: Decrease 14.6%; Increase 8%]
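This slide pairs fidelity status (meeting the BoQ criterion) with the year-over-year change in ODRs. The sketch below shows one plausible way such a comparison is computed; the school records and grouping logic are hypothetical, not the data behind the slide.

```python
# Hypothetical sketch: percent change in ODR rate from '06-'07 to
# '07-'08, averaged separately for schools at vs. below the BoQ
# fidelity criterion. All records are made up for illustration.
def pct_change(old: float, new: float) -> float:
    return (new - old) / old * 100

schools = [
    # (met BoQ criterion?, ODR/100 in '06-'07, ODR/100 in '07-'08)
    (True, 120.0, 100.0),
    (True, 80.0, 72.0),
    (False, 90.0, 99.0),
]

for at_criterion in (True, False):
    changes = [pct_change(old, new)
               for met, old, new in schools if met == at_criterion]
    label = "at criterion" if at_criterion else "below criterion"
    print(f"{label}: average ODR change {sum(changes) / len(changes):+.1f}%")
```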

  43. Percent of Students Meeting DIBELS Spring Benchmark for Cohorts 1-4 (Combined Grades) • Spring '09: 62,608 students assessed in cohorts 1-4 [per-cohort counts: 5,943; 32,257; 8,330; 16,078 students assessed]

  44. Percent of Students at DIBELS Intensive Level across year by Cohort

  45. North Carolina Positive Behavior Interventions & Support Initiative Heather R. Reynolds, NC Department of Public Instruction; Bob Algozzine, Behavior and Reading Improvement Center http://www.dpi.state.nc.us/positivebehavior/

  46. Suspensions per 100 students

  47. Cedar Creek Middle School, Franklin County, North Carolina

  48. Dr. Bob Algozzine, North Carolina Positive Behavior Support Initiative • Schools with Low ODRs and High Academic Outcomes [scatterplot: Proportion of Students Meeting State Academic Standard vs. Office Discipline Referrals per 100 Students]

  49. Evaluation Report • Implications • Policy • Practice • Technical Assistance • Future Directions
