
MiBLSi Schools’ Implementation Process and Student Outcomes




Presentation Transcript


  1. MiBLSi Schools’ Implementation Process and Student Outcomes Anna L. Harms Michigan State University MiBLSi State Conference 2009

  2. Agenda • Reasons for studying implementation and ways to do it • Linking research to our schools’ data • Next steps • Questions and Feedback

  3. The Status of Research • Primary focus has been on developing and identifying practices. . . • National Reading Panel Reports • What Works Clearinghouse • Florida Center for Reading Research Reviews • OJJDP Model Programs • Center for the Study and Prevention of Violence Model Programs

  4. What determines the evidence base for a practice? • Independent randomized controlled trial is the gold standard • Effect size (Cohen, 1988): • Large: .80 • Moderate: .50 • Minimal/Weak: .20
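Cohen's d is the standardized mean difference between two groups; the benchmarks above classify its magnitude. A minimal sketch (not from the presentation, using made-up sample data) that computes d with a pooled standard deviation and labels it against Cohen's (1988) cutoffs:

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Cohen's d: difference in means divided by the pooled SD."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled

def label(d):
    """Classify |d| against Cohen's (1988) benchmarks."""
    d = abs(d)
    if d >= 0.80:
        return "large"
    if d >= 0.50:
        return "moderate"
    if d >= 0.20:
        return "minimal/weak"
    return "negligible"

# Hypothetical outcome scores for two groups of students
print(label(cohens_d([2, 4, 6], [1, 3, 5])))  # d = 0.5 -> "moderate"
```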

  5. Efficacy vs. Effectiveness (Christensen, Carlson, & Valdez, 2003) • Efficacy • Controlled conditions • Conducted by innovation developers • Effectiveness • External to the developers of an innovation • Replication • Under different conditions RESEARCH / IMPLEMENTATION / PRACTICE

  6. Greenberg, Domitrovich, Graczyk, & Zins (2005): PROGRAM AS IMPLEMENTED = PLANNED INTERVENTION → ACTUAL INTERVENTION; PLANNED IMPLEMENTATION SYSTEM → ACTUAL IMPLEMENTATION SUPPORT

  7. NIRN/SISEP • Framework for Implementation • Stages of Implementation • Core Implementation Components • Multi-level Influences on Successful Implementation

  8. Effective Intervention Practices + Effective Implementation Strategies _______________________________ = Positive Outcomes for Students (SISEP, 2009)

  9. Getting into the Habit of Collecting, Analyzing, and Acting Upon Data: Problem Identification → Problem Analysis → Plan Selection → Plan Implementation → Plan Evaluation, with DATA & DOCUMENTATION at the center of the cycle

  10. Response to I________ • Intervention ? • Instruction ? • Implementation of evidence-based practices

  11. Reasons for Studying and Monitoring Implementation • Effort evaluation • Quality improvement • Documentation • Internal validity • Program theory • Process evaluation • Diffusion • Evaluation quality Greenberg, M. T., Domitrovich, C. E., Graczyk, P. A., & Zins, J. E. (2005).

  12. What tools can we use to measure implementation of school-wide systems?

  13. Tier 1 Implementation Tools

  14. Tier 2 & 3 Implementation Tools

  15. MiBLSi Mission Statement “to develop support systems and sustained implementation of a data-driven, problem solving model in schools to help students become better readers with social skills necessary for success”

  16. Our Data • MiBLSi’s existing data • Elementary Schools (any combination of K-6) * Refers to # of elementary schools included in this study.

  17. Purpose of the Study • To systematically examine schools’ process of implementing school-wide positive behavior supports and a school-wide reading model during participation in a statewide RtI project. • To systematically examine the relation between implementation fidelity of an integrated three-tier model and student outcomes.

  18. Conceptual Framework (Chen, 1998; Greenberg et al., 2005) • PLANNED INTERVENTION: School-wide Positive Behavior Supports; Response to Intervention for Reading • ACTUAL IMPLEMENTATION: Submission of Implementation Checklists; Scores on Implementation Checklists • STUDENT OUTCOMES: Office Discipline Referrals; Performance on Curriculum-Based Literacy Measures; Performance on State-Wide Standardized Test in Reading

  19. Measuring Implementation • Effective Behavior Support Self-Assessment Survey (EBS-SAS) • Spring of each school year • Total % implementation by building location • Effective Behavior Support Team Implementation Checklist (EBS-TIC) • 4 x per school year (quarterly) • Total % implementation • Planning and Evaluation Tool for Effective Reading Supports-Revised (PET-R) • Fall of each school year • Total/Overall % implementation
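Each of these tools reduces to a single "total % implementation" score. A minimal sketch of that arithmetic, assuming a per-item scale of 0 (not started), 1 (in progress/partial), or 2 (achieved/in place) — the exact item scales are an assumption here, not taken from the presentation:

```python
def percent_implemented(item_scores, max_per_item=2):
    """Total % implementation: points earned over points possible.

    Assumes each checklist item is scored 0, 1, or 2 (this scale
    is illustrative; check the actual tool's scoring guide).
    """
    possible = max_per_item * len(item_scores)
    return 100.0 * sum(item_scores) / possible

# Hypothetical four-item checklist: two in place, one partial, one not started
print(percent_implemented([2, 2, 1, 0]))  # 62.5
```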

  20. THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES (BEHAVIOR + READING)

  21. Systems Implementation Research • Expect 3-5 years for full implementation (Fixsen, Naoom, Blase, Friedman & Wallace, 2004; OSEP Center on Positive Behavioral Interventions and Supports, 2004; Sprague et al., 2001) • Studies often split up implementation and outcomes (Reading First; U.S. Department of Education, 2006) • View implementation at one point in time (McCurdy, Mannella & Eldridge, 2003; McIntosh, Chard, Boland & Horner, 2006; Mass-Galloway, Panyan, Smith & Wessendorf, 2008) • A need for systematic research

  22. THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES (BEHAVIOR + READING)

  23. Process and Progress • Just as we measure student progress, we should also measure our progress toward full implementation. • What is our current level of implementation? • What is our goal? • How do we get from here to there?

  24. How do scores vary by year of implementation?

  25. [Chart slide]

  26. [Chart slide]

  27. [Chart slide]

  28. THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES (BEHAVIOR + READING)

  29. How long does it take? 2-5 years

  30. At each year of implementation, what % of schools attain criterion levels of implementation?

  31. PET-R: COHORT 3 (N=50), by time in implementation (0-5 mo., 6-11 mo., 1:6-1:11, 2:6-2:11, 3:6-3:11, 4:6-4:11): 1 (2%), 24 (48%); 25 schools (50%) did not attain criterion scores

  32. EBS-SAS: COHORT 3 (N=50), by time in implementation (0-5 mo., 6-11 mo., 1:0-1:5, 2:0-2:5, 3:0-3:5, 4:0-4:5, 5:0-5:5): 2 (4%), 14 (28%), 13 (26%); 21 schools (42%) did not attain criterion scores

  33. EBS-TIC: COHORT 3 (N=50), by time in implementation (0-5 mo., 6-11 mo., 1:0-1:5, 2:0-2:5, 3:0-3:5, 4:0-4:5, 5:0-5:5): 1 (2%), 30 (60%), 6 (12%); 13 schools (26%) did not attain criterion scores
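The tallies on these slides boil down to one question per school: did its score ever reach the criterion? A minimal sketch of that bookkeeping, using made-up schools and an illustrative criterion of 80% (the actual criterion scores for each tool are not stated here):

```python
def attainment_summary(school_scores, criterion=80.0):
    """Count schools whose best yearly score meets the criterion.

    school_scores: {school name: [total % implementation by year]}
    Returns (n attained, n not attained, % not attained).
    """
    attained = sum(1 for scores in school_scores.values()
                   if scores and max(scores) >= criterion)
    n = len(school_scores)
    missed = n - attained
    return attained, missed, 100.0 * missed / n

# Hypothetical cohort of four schools
cohort = {"A": [70, 85], "B": [60, 75], "C": [90], "D": [50, 55]}
print(attainment_summary(cohort))  # (2, 2, 50.0)
```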

  34. THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES (BEHAVIOR + READING)

  35. Sustainability • Think and work • Up • Down • Out

  36. What percent of schools that attain criterion levels of implementation are able to maintain or improve their score in all subsequent years?

  37. PET-R: COHORT 3 (N=50) 6-11 mo. 1:6-1:11 1 (2%) 1

  38. EBS-SAS: COHORT 3 (N=50) 0-5 mo. 6-11 mo. 1:0-1:5 2:0-2:5 3:0-3:5 4:0-4:5 5:0-5:5 2 (4%) 2 2 14 (28%) 12 13 (26%)

  39. EBS-TIC: COHORT 3 (N=50) 0-5 mo. 6-11 mo. 1:0-1:5 2:0-2:5 3:0-3:5 4:0-4:5 5:0-5:5 1 (2%) 1 0 30 (60%) 15 6 (12%)
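Slide 36's sustainability question can be operationalized in a few lines. A sketch under one assumption: "maintain or improve" is read as never dropping back below the criterion once it has been reached (the presentation does not pin down this definition, so treat it as illustrative):

```python
def sustained(scores, criterion=80.0):
    """True if, after first reaching the criterion, every later yearly
    score stays at or above it. Assumes 'maintain or improve' means
    never falling back below the criterion (an interpretation)."""
    for i, s in enumerate(scores):
        if s >= criterion:
            return all(later >= criterion for later in scores[i + 1:])
    return False  # never attained the criterion

def percent_sustaining(school_scores, criterion=80.0):
    """Of schools that ever attain the criterion, the % that sustain it."""
    attainers = [s for s in school_scores.values()
                 if any(x >= criterion for x in s)]
    if not attainers:
        return 0.0
    n_ok = sum(sustained(s, criterion) for s in attainers)
    return 100.0 * n_ok / len(attainers)

# Hypothetical cohort: A sustains, B regresses, C never attains
print(percent_sustaining({"A": [85, 90], "B": [85, 70], "C": [60, 65]}))  # 50.0
```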

  40. Another way of looking at implementation. . .

  41. What % of implementation data do schools submit for each year of implementation?

  42. % of Schools Submitting PET-R Data Each Year

  43. % of Schools Submitting EBS-SAS Data Each Year

  44. % of Schools Submitting EBS-TIC Data Each Year

  45. THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES (BEHAVIOR + READING)

  46. Is the % of behavior checklist data submitted each year related to student behavior outcomes for that year?

  47. Is the % of reading checklist data submitted each year related to student reading outcomes for that year?

  48. Are scores on the behavior implementation checklists related to student behavior outcomes for that year?

  49. Are scores on the reading implementation checklist for each year of implementation related to student reading outcomes for that year?

  50. THE PROCESS • HOW LONG • SUSTAINABILITY • ASSOCIATED STUDENT OUTCOMES (BEHAVIOR + READING)
