1. The Evaluation of a Statewide Initiative Targeting Reading and Behavior
Anna Harms
Margie McGlinchey
DIBELS Summit 2010
2. Session Objectives Overview of Michigan’s Integrated Behavior and Learning Support Initiative (MiBLSi)
Project Outcomes
Future Directions
3. In Your Handouts
4. MiBLSi Mission Statement To develop support systems and sustained implementation of a data-driven, problem-solving model in schools to help students become better readers with the social skills necessary for success.
5. Goals of MiBLSi Increase reading performance.
Reduce behavior problems.
Have accurate knowledge of behavior and reading performance.
Use student performance information to develop and implement interventions.
6. Participating Schools
9. Clarification MiBLSi is: A State Professional Development Grant.
Implemented in schools under typical conditions, with existing staff.
Continuously evolving.
MiBLSi is not: A research study/project.
Necessarily the right fit for every school.
10. Scope and Sequence
11. Training Outcomes
12. For Much More Information:
13. Purpose of the Study Examine outcomes of a statewide, integrated RtI project.
Examine the relation between implementation fidelity and student outcomes in the context of a statewide integrated three tier model.
14. Research Questions To what extent do schools implement 3 tier reading and behavior systems with fidelity across time?
What is the relation between implementation fidelity and student outcomes?
15. What We Know About Implementation Less than 50% of educational research articles provide quantitative data about implementation fidelity. (Hagermoser Sanetti, Gritter & Dobey, 2009; Hagermoser Sanetti & Kratochwill, 2009)
Very little is known about implementation of systemic practices (vs. individual interventions). (Hagermoser Sanetti & Kratochwill, 2009)
Studies often report implementation and outcomes separately. (Reading First--U.S. Department of Education, 2006)
Expect 2-5 years for full implementation. (Fixsen, Naoom, Blase, Friedman & Wallace, 2004; OSEP Center on Positive Behavioral Interventions and Supports, 2004; Sprague et al., 2001)
View implementation at one point in time. (McCurdy, Mannella & Eldridge, 2003; McIntosh, Chard, Boland & Horner, 2006; Mass-Galloway, Panyan, Smith & Wessendorf, 2008)
16. Conceptual Framework
17. Inclusion/Exclusion Criteria
18. Included Schools
19. Unit of Analysis Whole-school building
20. What does it mean to "do" RtI and MiBLSi?
21. Measures of Implementation Fidelity
22. 1. To what extent do schools implement 3 tier reading and behavior systems with fidelity across time?
30. What can we celebrate? Upward trends in the data.
Scores.
Number of schools attaining criterion scores.
31. What do we need to work on? Increasing submission of systems/process data, especially over time.
Supporting schools as they implement individual student behavior systems.
32. What is the relation between implementation fidelity and student outcomes?
33. Students cannot benefit from interventions they do not experience.
34. Measures of Student Performance
35. Relation between Implementation Fidelity of a Schoolwide Reading Model and Student Outcomes in Reading
40. Relation between Implementation Fidelity of Schoolwide PBS and Student Behavior Outcomes
42. Non-significant correlation
43. Non-significant correlation
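A relation like the ones examined in these slides can be checked with a simple Pearson correlation between a school-level fidelity score and a school-level outcome. The sketch below is illustrative only: the six schools, the fidelity scores (modeled loosely on a PET-style percent-of-items-in-place measure), and the percent-at-benchmark values are invented, not project data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data for six schools: implementation fidelity score
# (percent of items in place) and percent of students at benchmark.
fidelity_scores = [62, 70, 75, 81, 88, 93]
pct_at_benchmark = [48, 55, 53, 64, 71, 78]

r = pearson_r(fidelity_scores, pct_at_benchmark)
```

With real project data, a non-significant r (as on the slides above) could reflect a genuinely weak relation, restricted range in the fidelity scores, or simply too few schools submitting data to detect an effect.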
44. What can we celebrate? For reading, we see a positive relation between the PET and Percent of Students at Benchmark.
45. What do we need to work on? Determining why we do not see a strong relation between the TIC and SAS and the discipline referral data.
What is going on with Cohort 2?
46. How is our integrated model working?
49. What can we celebrate? We see a relation between office discipline referrals and DIBELS data (in the direction that we anticipated).
50. What do we need to work on?
Support data submission and use.
Investigate how meeting criterion on the behavior and reading implementation measures affects student outcomes.
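One simple way to start that investigation is to compare an outcome for schools that met criterion on an implementation measure against schools that did not. The sketch below uses invented percent-at-benchmark values and an assumed criterion grouping; it shows only the mean difference, which a fuller analysis would pair with a significance test and effect size.

```python
def mean(xs):
    """Arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)

# Hypothetical data: percent of students at benchmark, grouped by
# whether the school met criterion on the implementation measure.
met_criterion = [64, 71, 78, 69]
did_not_meet = [48, 55, 53, 60]

# Mean difference in percentage points between the two groups.
diff = mean(met_criterion) - mean(did_not_meet)
```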
51. Limitations Implementation fidelity is based on self-report.
We do not know what specific factors have impacted implementation.
Limited amounts of data actually submitted and available for analysis.
This study measures only a slice of what our schools are engaged in.
52. Lessons Learned We need to provide more support to our schools in order to get the process data submitted.
With the right systems set up to look at our project data as a whole, we have the ability to examine our project outcomes in more complex ways than we have done in the past.
53. Possible Next Steps Validation of the systems/process tools.
Systematic evaluation of implementation and student outcomes at all three tiers—a progressive measurement process.
A more complex study that integrates more of the implementation research (drivers, stages, feedback cycles, etc.).
54. Critical Features of Implementation Stages
Drivers
Feedback Cycles
55. Implementation Drivers
56. Implementation Stages Exploration
Installation
Initial Implementation
Full Implementation
Innovation
Sustainability
57. PDSA Cycle
58. Developing Feedback Loops and Continuous Improvement Cycles
59. How did we do? Do you now know more about what Michigan’s Integrated Behavior and Learning Support Initiative is all about?
Do you have a general understanding of our project outcomes?
Do you have ideas about how the information presented might apply to your own work?
60. Appreciations 3 Co-Directors
25 Technical Assistance Partners and other Staff
Over 300 Coaches
486 Schools
Over 100,000 Students
61. Contact Information MiBLSi Website
http://miblsi.cenmi.org
Anna Harms
aharms@oaisd.org
Margie McGlinchey
margiemcglinchey@me.com