
Developing a Comprehensive State-wide Evaluation for PBS


Presentation Transcript


  1. Developing a Comprehensive State-wide Evaluation for PBS Heather Peshak George, Ph.D. Donald K. Kincaid, Ed.D.

  2. Objectives • Describe Florida’s evaluation system for state, district, and school levels • Identify the critical questions that Texas needs to answer • Describe a comprehensive model for evaluating Tier 1 PBS • Build a scalable and sustainable system • Review data collection procedures, tools, analysis methods, and training

  3. Purpose of Evaluation • To examine the extent to which teams are accurately selecting and implementing PBS systems and practices • To allow teams to determine the extent to which target student outcomes are being achieved and/or are likely to be achieved • To determine if teams are accurately and consistently implementing activities and practices as specified in their individualized action plan • (PBIS Blueprint, 2005)

  4. Factors to Consider in Developing Comprehensive Evaluation Systems • Systems Preparation • Readiness activities • Service Provision • Training and technical assistance • Evaluation Process • Timelines • Evaluation Data • Implementation Fidelity, Impact on Students, Attrition, Client Satisfaction • Products and Dissemination • Reports, materials, presentations, etc. (Childs, Kincaid & George, in press)

  5. What Questions Does Texas Need to Answer?

  6. (1) Systems Preparation • Readiness activities • District Readiness Checklist • District Action Plan • School Readiness Checklist • New School Profile • Baseline data: ODR, ISS, OSS, academic

  7. (2) Service Provision • Training and ongoing technical assistance

  8. (3) Evaluation Process • Timelines for Evaluation Reports • Mid Year I – due 10/31 • School Profile • PBS Implementation Checklist (PIC) • Mid Year II – due 2/28 • PBS Implementation Checklist (PIC) • End Year – due 6/15 • Benchmarks of Quality (BoQ), Benchmarks for Advanced Tiers (BAT) • Outcome Data Summary • School-wide Implementation Factors (SWIF)

  9. (4) Evaluation Data • Implementation Fidelity: PIC; BoQ; BAT; School Demographic Data; SWIF; Team Process Survey • Attrition: Attrition Survey • Impact on Students: Outcome data (ODR, ISS, OSS); FCAT (state test); school climate surveys; referrals to ESE; screening ID; response to intervention • Client Satisfaction: SWIF

  10. (a) Implementation Fidelity • Are schools trained in Universal PBS implementing with fidelity? Tiers 2 and 3? Across years? Across school types? • BoQ, BAT, School Demographic Data • What factors are related to implementing with fidelity? • SWIF survey, BoQ, BAT • Do teams that work well together implement with greater fidelity? • Team Process Evaluation, BoQ

  11. BoQ Totals by School Type Across Years

  12. School-Wide Implementation Factors (SWIF) • Higher Implementing (70+ on BoQ) vs. Lower Implementing (below 70 on BoQ)
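The 70-point BoQ cutoff used above to split schools into higher and lower implementers can be sketched as follows (a minimal illustration; the function name and example scores are assumptions, not part of Florida's actual tooling):

```python
# Sketch of the 70-point BoQ cutoff used to group schools for the SWIF
# comparison (hypothetical school names and scores for illustration only).

def implementation_group(boq_total: float) -> str:
    """Classify a school as a higher or lower implementer by BoQ total score."""
    return "higher" if boq_total >= 70 else "lower"

# Hypothetical schools with their BoQ totals.
schools = {"School A": 83, "School B": 64, "School C": 71}
groups = {name: implementation_group(score) for name, score in schools.items()}
print(groups)  # {'School A': 'higher', 'School B': 'lower', 'School C': 'higher'}
```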

  13. Descriptive Data: Teams • Team functioning did not effectively differentiate between school teams implementing with high versus low fidelity, or between teams with better versus worse outcomes • Teams implementing Tier 1 PBS with fidelity saw substantially different effects on all four outcome measures

  14. (b) Impact on Student Behavior • Do schools implementing SWPBS decrease ODRs, days of ISS, and days of OSS? • ODRs, ISS, OSS • Do schools implementing SWPBS realize an increase in academic achievement? • FCAT scores • Is there a difference in outcomes across school types? • ODRs, ISS, OSS, FCAT scores, school demographic data • Do schools implementing with high fidelity have greater outcomes than implementers with low fidelity? • BoQ, ODRs, ISS, OSS • Do teams that work well together have greater outcomes than those that don’t work as well together? • Team Process Evaluation, ODRs, ISS, OSS

  15. Percent change in ODR, ISS and OSS rates per 100 students before and after PBS implementation
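The metric behind this slide — percent change in incident rates per 100 students — can be sketched as below. The function names and the example counts are hypothetical, not Florida's actual data:

```python
# Sketch of the percent-change calculation implied above: incident rates
# per 100 students, before vs. after Tier 1 PBS implementation
# (hypothetical numbers for illustration only).

def rate_per_100(incidents: int, enrollment: int) -> float:
    """Incidents per 100 enrolled students."""
    return incidents / enrollment * 100

def percent_change(before: float, after: float) -> float:
    """Percent change from the pre-implementation rate (negative = decrease)."""
    return (after - before) / before * 100

# Example school: 450 ODRs before and 315 after, enrollment of 600 both years.
before = rate_per_100(450, 600)   # 75.0 ODRs per 100 students
after = rate_per_100(315, 600)    # 52.5 ODRs per 100 students
print(f"{percent_change(before, after):.1f}%")  # -30.0%
```

Normalizing by enrollment lets schools of different sizes be compared on the same scale before taking the percent change.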

  16. Academic Outcomes by Implementation Level

  17. Percent decrease in ODR, ISS, OSS rates per 100 students after 1 year of implementation (by school type)

  18. ODRs by implementation level across three years of implementation

  19. (c) Attrition • Why do schools discontinue implementation of SWPBS? • Attrition Survey

  20. (d) Consumer Satisfaction • Are our consumers satisfied with the training, technical assistance, products and support received? • SWIF survey • District Coordinators survey • Training evaluation

  21. Consumer Satisfaction

  22. (5) Products and Dissemination • Annual Reports • Revisions to Training • Revisions to Technical Assistance process • Dissemination activities: • National, state, district, school levels • Revisions to Website • On-line Training Modules

  23. Improvements Made • Increased emphasis on BoQ results for school and district-level action planning • Increased training to District Coordinators and Coaches and T.A. targeted areas of deficiency based upon data • Team Process Evaluation no longer used • Academic data used to increase visibility and political support • Specialized training for high schools • Identifying critical team variables impacted via training and T.A. activities • Revised Tier 1 PBS Training to include classroom strategies, problem-solving process within RtI framework • Enhanced monthly T.A. activities

  24. Florida’s Service Delivery and Evaluation Model (Childs, Kincaid & George, in press) • Systems Preparation: District Readiness Checklist; District Action Plan; School Readiness Checklist; New School Profile (includes ODR, ISS, OSS) • Service Provision: Training; on-going technical assistance (FLPBS → Districts → Coaches → Schools) • Evaluation Process: Mid-Year Reports; End-of-Year Reports • Evaluation Data: Implementation Fidelity (Benchmarks of Quality, BAT, School Demographic Data, School-wide Implementation Factors, Team Process Survey); Impact on Students (Outcome data: ODR, ISS, OSS; Florida Comprehensive Assessment Test; School Demographic Data; Team Process Survey); Attrition (Attrition Survey); Client Satisfaction (School-Wide Implementation Factors) • Products and Dissemination: Annual Reports; revisions to training and technical assistance process; national, state, district, and school dissemination activities; Website; on-line training modules

  25. In Summary… • Know what you want to know • Compare fidelity of implementation with outcomes – this presents a strong case for implementing PBS with fidelity • Additional sources of data can help a state determine not only whether the PBS process (Tiers 1–3) is working, but also why it is or is not working • Address state, district, and school systems issues that may impact implementation success

  26. Resources • Childs, K., Kincaid, D., & George, H.P. (in press). A Model for Statewide Evaluation of a Universal Positive Behavior Support Initiative. Journal of Positive Behavior Interventions. • George, H.P. & Kincaid, D. (2008). Building District-wide Capacity for Positive Behavior Support. Journal of Positive Behavior Interventions, 10(1), 20-32. • Cohen, R., Kincaid, D., & Childs, K. (2007). Measuring School-Wide Positive Behavior Support Implementation: Development and Validation of the Benchmarks of Quality (BoQ). Journal of Positive Behavior Interventions.

  27. Contact Heather Peshak George, Ph.D. • Co-PI, Co-Director & PBIS Research Partner Phone: (813) 974-6440 • Fax: (813) 974-6115 • Email: flpbs@fmhi.usf.edu • Website: http://flpbs.fmhi.usf.edu
