
Check In, Check Out



Presentation Transcript


  1. Check In, Check Out An Evidence-based Intervention for Tier II Supports

  2. Educational and Community Supports • Educational and Community Supports (ECS) is a research unit within the College of Education at the University of Oregon. • ECS focuses on the development and implementation of practices that result in positive, durable, and scientifically substantiated change in the lives of individuals. • Federal and state funded projects support research, teaching, dissemination, and technical assistance. • PBIS Applications is a series of educational tools created within ECS and related to the implementation of multi-tiered systems of support (MTSS). • The PBIS Application tools have been utilized in 25,000+ schools both domestically and internationally.

  3. Session Intentions

  4. Essential Components of RTI Response to intervention (RTI) integrates assessment and intervention within a multi-level prevention system to maximize student achievement and reduce behavior problems. --National Center on Response to Intervention The intent of RTI is to improve outcomes for all students while providing immediate supplemental supports for students at risk for poor learning outcomes.

  5. Multi-Tiered Systems of Support • Tertiary: intensive, individualized • Secondary: targeted, small group • Universal: primary prevention

  6. Multi-Tiered Systems of Support (MTSS) • The triangle does not represent the overall RTI or SWPBIS framework; it only represents one component, the multi-tiered system of support and prevention. • This component represents three levels of prevention. • In an effective system, we would expect: • Universal Level = at least 80% • If less than 80%, consider focusing school improvement efforts on improving core instruction and curriculum. • Secondary Level = 10-15% • Tertiary Level = 1-5%
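
The 80% / 10-15% / 1-5% expectation above is essentially an arithmetic check on screening data. Below is a minimal sketch, assuming the tier percentages have already been computed from screening results; the function name and report wording are illustrative, not part of any PBIS application.

```python
def review_tier_distribution(universal_pct: float,
                             secondary_pct: float,
                             tertiary_pct: float) -> str:
    """Compare a school's tier distribution to the expected pattern
    from the slide: universal >= 80%, secondary 10-15%, tertiary 1-5%.
    Percentages are 0-100 and should sum to roughly 100.
    Hypothetical helper for illustration only."""
    if universal_pct < 80:
        return ("Fewer than 80% of students are successful with universal "
                "supports alone; consider focusing school improvement efforts "
                "on core instruction and curriculum.")
    notes = []
    if secondary_pct > 15:
        notes.append("secondary tier is larger than the expected 10-15%")
    if tertiary_pct > 5:
        notes.append("tertiary tier is larger than the expected 1-5%")
    return "; ".join(notes) or "Distribution is within expected ranges."


# Example: 74% universal, 18% secondary, 8% tertiary
print(review_tier_distribution(74, 18, 8))
```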

  7. Multi-Tiered Support & Prevention • Essential Question: Is the student successful at this level of support? • Students themselves do not fit into a tier of supports; rather, their needs are addressed by the tier of support provided. • Intensity is a two-way street: supports can be intensified or faded as data indicate. • Improved student outcomes are the result of continually monitoring and modifying (as needed) instructional programs and methods. • Applies across domains: Reading, Writing, Math, Social-Emotional

  8. Secondary (Tier II) Systems of Support • Secondary Support Level • Focus = students identified through screening as being at risk for poor learning outcomes; students unresponsive to the core curriculum • Instruction = targeted, supplemental instruction delivered to small groups • Setting = general education environment • Assessments = continuous progress monitoring, diagnostic

  9. Secondary (Tier II) Systems of Support • The goal of secondary supports is to provide efficient supports for a large number of students with similar needs. • Efficiency is achieved by using ongoing, generic interventions. • Programming should be applicable to large numbers of students in the same way, with little to no individualization. • Secondary interventions should provide: • Additional instruction/time for student skill development • Additional structure/predictability • Increased opportunity for feedback

  10. RTI and SWPBIS • Improved student outcomes in social competence and academic achievement. • Systems support staff behavior. • Practices support student behavior. • Data support decision making. (Diagram: Outcomes, Data, Practices, and Systems as interlocking elements shared by Response to Intervention and School-wide PBIS)

  11. Systems, Data, and Practices • Outcomes: improved social competence & academic achievement • Systems to support staff behavior: administrative support, team-based leadership, data-based decision making systems • Practices to support student behavior: define & teach procedures, Daily Progress Report for progress monitoring, sharing of progress reports with home, acknowledgement of appropriate behaviors, systematic correction of behavior errors, data-based decision making • Data: data entry, report generation, data-based decision making

  12. Fundamentals of Tier II Support Systems

  13. Team • Secondary supports are often overseen by a team charged with: • Pre-referral consultation • Screening • Assessment • Progress Monitoring • Intervention Implementation • Tier II teams need individuals with specific skill sets (e.g., behavior expertise, administrative authority) and perspectives (e.g., knowledge of school operations) to implement with success.

  14. Universal Screening • Not all students will respond to universal systems. • The purpose of screening is to identify those students who are at risk for poor learning outcomes. • The focus is on all students, not just those students that teachers believe are at risk. • It is a brief, reliable, valid assessment used to identify which students may need additional assessments or additional instructional support.

  15. Universal Screening • Brief assessment to determine students’ current level of performance • Collect information on all students at least twice a year • After the first 6 weeks of the new school year and 6 weeks after the return from winter break • Use data-decision rules for decision making: • Green zone = 0-1 ODRs • Yellow zone = 2-5 ODRs • Red zone = 6+ ODRs • For behavior, common screening measures include office discipline referrals (ODRs; Sugai, Sprague, Horner, & Walker, 2000). • ODRs are not valid indicators of “internalizing” problem behavior, such as anxiety and depression (McIntosh, Campbell, Carter, & Zumbo, 2009).
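
The ODR cut points above translate directly into a screening rule. Below is a minimal sketch, assuming per-student ODR counts have already been tallied; the function and data layout are hypothetical and not part of SWIS or the PBIS Applications tools.

```python
def odr_zone(odr_count: int) -> str:
    """Assign a screening zone from a student's office discipline
    referral (ODR) count, using the cut points from the slide:
    0-1 = green, 2-5 = yellow, 6+ = red."""
    if odr_count <= 1:
        return "green"
    if odr_count <= 5:
        return "yellow"
    return "red"


# Flag students who may need additional assessment or Tier II support.
odr_counts = {"student_a": 0, "student_b": 3, "student_c": 7}
flagged = {s: odr_zone(n) for s, n in odr_counts.items() if odr_zone(n) != "green"}
print(flagged)  # {'student_b': 'yellow', 'student_c': 'red'}
```

Note that, as the slide cautions, ODR counts alone will miss students with internalizing concerns, so a rule like this supplements rather than replaces teacher judgment and other screening measures.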

  16. Cumulative Mean ODRs Per Month for 325+ Elementary Schools, 08-09 (figure: cumulative mean ODRs by month; Jennifer Frank, Kent McIntosh, & Seth May)

  17. Cumulative Mean ODRs Per Month for 325+ Elementary Schools, 08-09 (figure: cumulative mean ODRs by month; Jennifer Frank, Kent McIntosh, & Seth May)

  18. Assessment for Intervention Selection • Additional information is often required to select the appropriate intervention, described as diagnostic testing (Salvia, Ysseldyke, & Bolt, 2009). • Diagnostic testing refers to assessment aimed at problem analysis and identifying the function of behavior, with a focus on variables that can be changed (Christ, 2008; Tilly, 2008). • Function-based Problem Solving vs. Functional Behavior Assessment (FBA)

  19. Assessment for Intervention Selection • Alignment with core curriculum • 3-5 behavioral expectations • Evidence-based Interventions • Interventions for which data from scientific, rigorous research designs have demonstrated (or empirically validated) the efficacy of the intervention. • Big Idea: the intervention has been shown to improve results for students who receive it. • Research-based Curricula • May incorporate design features that have been researched generally; however, the curriculum or program as a whole has not been studied using a rigorous research design. --National Center on Response to Intervention

  20. Big “I” Interventions vs. Little “i” Interventions

  21. Major Features of Secondary Interventions • Intervention is continuously available • Rapid access to intervention (within 3 days) • Very low effort by teachers • Consistent with school-wide expectations • Implemented by all staff/faculty in a school • Home/school linkage • Flexible intervention matched to function of behavior

  22. Progress Monitoring • Allows practitioners to answer critical questions: • Are students making progress at an acceptable rate? • Quantify student rates of improvement or responsiveness to instruction • Are students meeting short-term goals necessary for achieving long-term goals? • Identify students who are not making adequate progress • Does the instruction need to be adjusted or changed? • Evaluate instructional effectiveness.

  23. Progress Monitoring • Continuous progress monitoring to confirm risk status and monitor progress of at-risk students • Collection of data at a monthly, weekly, or daily rate • Use of data for decision making
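
One way to quantify a student's rate of improvement from continuously collected data is to fit a trend line through the daily scores. Below is a minimal sketch using an ordinary least-squares slope over daily Daily Progress Report percentages; the data values and function name are illustrative assumptions.

```python
def rate_of_improvement(daily_pcts: list[float]) -> float:
    """Least-squares slope (percentage points per day) through a
    series of daily progress scores; positive values mean the
    student is improving."""
    n = len(daily_pcts)
    days = range(n)
    mean_x = sum(days) / n
    mean_y = sum(daily_pcts) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, daily_pcts))
    den = sum((x - mean_x) ** 2 for x in days)
    return num / den if den else 0.0


# Example: two weeks of daily DPR percentages for one student.
scores = [55, 60, 58, 65, 62, 70, 68, 72, 75, 74]
print(f"Improving by about {rate_of_improvement(scores):.1f} points per day")
```

Comparing this observed slope to the slope needed to reach the student's goal by a target date is one way for a team to decide whether instruction needs to be adjusted.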

  24. Data-Based Decision Making • Utility and value: • Instruction • Who needs assistance? • What type of instruction or assistance is needed? • Are the duration and intensity sufficient? • Movement within the Multiple Levels • When are students moved to something more/less intensive? • Who is responding and/or not responding? • Disability Identification • When do you refer for special education evaluation? • How does this student compare to his/her peers? • What appropriate instruction has the student received?

  25. Fidelity of Implementation • Without considering fidelity of implementation, it is unknown: • whether students failed to respond to secondary supports, or • whether staff failed to provide adequate supports. • Meeting time devoted to monitoring and improving fidelity of implementation may seem like time better spent discussing student progress, but it is a valuable and critical investment of resources for all students.

  26. Check In, Check Out (CICO) • Evidence-based intervention • Evidence that schools can successfully implement • Evidence of decreased problem behavior • Evidence of effectiveness for 60-75% of students in need of secondary supports (Crone, Horner, & Hawken, 2004)

  27. CICO Research • More effective with students with attention-maintained problem behavior (March & Horner, 2002; McIntosh et al., 2009; Campbell & Anderson, 2008) • Effective across behavioral functions (Hawken, O’Neill, & MacLeod, 2011) • Students who do not respond to CICO may benefit from function-based, individualized interventions (Fairbanks et al., 2007; March & Horner, 2002; MacLeod, Hawken, & O’Neill, 2010)

  28. Check In, Check Out (CICO) • Behavioral Priming/Behavioral Momentum • Start each school day positively • Start each class positively • Student recruitment of contingent adult attention • Predictability • Self-management • Data-based Decision Making • High level of efficiency

  29. CICO Intervention Overview • Increased Structure • Prompts for correct behavior throughout the day • Systematic linking of a student with at least one positive adult • Increased opportunity for feedback • Performance feedback related to student behavior • High rates of adult attention • Inappropriate behavior is less likely to be ignored or reinforced

  30. CICO Intervention Overview • Increased Predictability • Each day begins with a positive contact • Each class/period begins with a positive contact • Student is continuously set up for success • Systematic communication between school and home • Increased time for student skill development • Increased ability to self-monitor progress/performance • Organized to fade into a self-management system

  31. CICO Intervention Overview • Elevated recognition for appropriate behavior • Adult attention delivered at the start and end of the day • Adult attention delivered during each targeted period • Program can be applied in all supervised locations • Classroom and non-classroom settings

  32. CICO Cycle • Student identified for CICO → CICO implemented • Daily cycle: Morning Check In → Regular Teacher Feedback → Afternoon Check Out → Family Feedback • CICO Coordinator summarizes data for decision making • Frequently scheduled meetings to analyze student progress • Decision: Continue Program, Revise Program, or Exit Program

  33. Cycle of Feedback • Morning Check In • Start school day positively • Check student “status” • Check Daily Progress Report (DPR) that was sent home • Provide new DPR for the current day • Regular Teacher Feedback • Start each class positively • Complete DPR • Provide feedback to student at the end of period in relation to CICO goals

  34. Cycle of Feedback • Afternoon Check Out • End school day positively and offer encouragement for tomorrow • Review the completed Daily Progress Report • Record points in CICO-SWIS • Send communication home to family regarding the CICO day • Parent Feedback • Student shares DPR with parent/family • Parent provides positive feedback and encouragement • Parent communicates with school • Example: signed DPR
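
At check-out, the completed DPR is typically summarized as a percentage of possible points before it is recorded and shared with the family. Below is a minimal sketch of that calculation, assuming a 0-2 rating per expectation per period and an 80% daily goal; these values and the function name are assumptions for illustration, not the CICO-SWIS interface.

```python
def dpr_summary(ratings: list[int], max_per_rating: int = 2,
                goal_pct: float = 80.0) -> tuple[float, bool]:
    """Convert a day's Daily Progress Report ratings (e.g., 0-2 per
    expectation per period) into a percentage of possible points and
    report whether the student's daily goal was met."""
    earned = sum(ratings)
    possible = len(ratings) * max_per_rating
    pct = 100.0 * earned / possible if possible else 0.0
    return pct, pct >= goal_pct


# Example: 3 expectations rated across 5 periods (15 ratings, 0-2 each).
ratings = [2, 2, 1, 2, 2, 1, 2, 2, 2, 1, 2, 2, 2, 2, 1]
pct, met_goal = dpr_summary(ratings)
print(f"{pct:.0f}% of points earned; goal met: {met_goal}")
```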

  35. Daily Progress Report

  36. Team Meeting and Progress Monitoring • Team Meeting • Review student progress • Adjust support plan if no improvement within one week • Build self-management steps when appropriate • Exit when appropriate • Report to school-wide team, administration, and whole faculty
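
The continue / revise / exit decision described above can likewise be framed as a simple rule over recent progress data, as sketched below. The thresholds, four-week window, and weekly averaging are assumptions for illustration; actual decision rules should be set by the Tier II team.

```python
def cico_decision(weekly_avgs: list[float], goal_pct: float = 80.0) -> str:
    """Suggest a continue / revise / exit decision from recent weekly
    average DPR percentages (most recent last). Illustrative rule only."""
    recent = weekly_avgs[-4:]  # consider up to the last four weeks
    if len(recent) == 4 and all(week >= goal_pct for week in recent):
        return "consider fading to self-management or exiting the program"
    if recent and recent[-1] < goal_pct:
        return "goal not met in the most recent week; revise the support plan"
    return "continue the program and keep monitoring"


print(cico_decision([70, 76, 74]))      # -> revise the support plan
print(cico_decision([82, 85, 88, 90]))  # -> fade or exit
print(cico_decision([72, 78, 84]))      # -> continue and monitor
```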

  37. CICO Progress Monitoring How is each student doing in relation to the school-wide goal?

  38. CICO Progress Monitoring What is one student’s pattern over time?

  39. CICO Progress Monitoring What does one student’s average day look like?

  40. CICO Progress Monitoring What is one student’s pattern over time in a single period?

  41. For More Information

  42. Linking Academic and Behavior Supports • Effective school-wide and classroom-wide behavior support is linked to increased academic engagement. • Improved academic engagement with effective instruction is linked to improved academic outcomes. • The systems needed to implement effective academic supports and effective behavior supports are very similar: • Clear Goals and Expected Outcomes • Appropriate Instruction • Feedback and Encouragement • Error Correction • Monitoring

  43. Session Intentions

  44. Questions, Answers, Discussion
