
Using Psychosocial and Behavioral Data to Support Multi-tiered Systems of Support


Presentation Transcript


  1. Using Psychosocial and Behavioral Data to Support Multi-tiered Systems of Support John Crocker Director of School Mental Health & Behavioral Services Methuen Public Schools

  2. Agenda Session attendees will discuss how psychosocial and behavioral data can be used to identify students for services, inform program design and evaluation, and monitor student progress in order to move students between tiers of support. • Brief summary of MTSS in Methuen (PBIS and CSMHS) • Overview of data systems • Applications of data across systems • Identification of individual students and groups for intervention • Progress monitoring • Ongoing needs assessment (informing program design and resource allocation) • System evaluation

  3. Multi-tiered System of Support (MTSS) in Methuen • Positive Behavioral Interventions and Supports (PBIS) • Participation in the DESE PBIS Academy • District-wide implementation across five schools • Comprehensive School Mental Health System (CSMHS) • Participation in the Comprehensive School Mental Health System (CSMHS) Collaborative for Improvement and Innovation Network (CoIIN) • Continued development of a district-wide CSMHS in partnership with the University of Maryland: National Center for School Mental Health (NCSMH)

  4. What is School-wide PBIS? “...is the emphasis on schoolwide systems of support that include proactive strategies for defining, teaching, and supporting appropriate student behaviors to create positive school environments.” “...a continuum of positive behavior support for all students within a school is implemented in areas including the classroom and non-classroom settings.” “...improve the link between research-validated practices and the environments in which teaching and learning occurs.” “...focused on creating and sustaining Tier 1 supports (universal), Tier 2 supports (targeted group), and Tier 3 supports (individual).” “...mak[es] targeted behaviors less effective, efficient, and relevant, and desired behavior more functional.” Adapted from: https://www.pbis.org/school

  5. Positive Behavioral Interventions and Supports (PBIS) • Statement of purpose (behavioral purpose statement) • Clearly defined expected behavior (behavioral expectations matrix) • Procedures for teaching the expected behavior • Continuum of procedures for encouraging the expected behavior (recognition and reinforcement) • Continuum of procedures for discouraging problem behavior • Procedures for collecting and using data to drive decision making

  6. Comprehensive School Mental Health System (CSMHS) “Comprehensive School Mental Health System (CSMHS) is defined as school-district-community-family partnerships that provide a continuum of evidence-based mental health services to support students, families and the school community.” • Provides a full array of tiered mental health services • Includes a variety of collaborative partnerships • Uses evidence-based services and supports

  7. Evidence-Based Services and Supports Multi-tiered System of SEL & Mental Health Services and Supports • Tier I - Universal Supports and Interventions; Prevention Practices • Tier II - Targeted/Selected/Group Supports and Interventions • Tier III - Intensive/Individualized Supports and Interventions Examples at each tier: • Universal screening (Tier I) • Expanding and improving SEL and mental health literacy curriculum (Tier I) • Scaling up cognitive behavioral therapy (CBT) groups (Tier II) • Individual therapy (Tier III)

  8. Expanding & Improving Mental Health Services and Supports • Bringing evidence-based therapeutic services directly to students • Ensuring only quality mental health services are provided to students • Focusing on prevention and the promotion of mental wellness • Selecting practices that are proven to be effective for the selected population • Providing ongoing professional development, case consultancy, and supervision to ensure the fidelity of implementation of therapeutic practices

  9. CSMHS Quality and Sustainability Collaborative Improvement and Innovation Network (CoIIN) and Beyond • Grant-funded partnership with the University of Maryland’s Center for School Mental Health (CSMH) • Methuen is 1 of 12 districts selected nationally for participation in the first cohort • Implementation of National Performance Measures to improve the quality and sustainability of school mental health services • Methuen receives ongoing support, resources, training, and assistance with implementation of project initiatives from the CSMH • Communication is frequent, ongoing, and involves the reporting of progress made toward achieving CoIIN goals (PDSA cycles) • School Mental Health Improvement and Innovation Task Force • National Coalition for the State Advancement of School Mental Health (NCSA-SMH)

  10. 25 CoIIN District-Community School Mental Health Systems

  11. Action Planning and PDSA Cycles • Plan • Define the objective, questions, and predictions • Plan for data collection • Do • Carry out the plan • Collect and analyze data • Study • Complete the analysis of the data and compare the results to the predictions • Summarize what was learned • Act • Determine whether the change will be abandoned, adapted, or adopted

  12. Improvement

  13. Mental Health Screening: Questions to Consider Where do we start? Which students should we screen? How do we choose our screening tools? What about consent? What about staff readiness? What will the parent population say? How are we going to pay for this?

  14. Implementing Universal Screening: Starting Small • Rapidly testing at the micro-level allowed the team to: • Identify areas to improve • Establish systems to make screening efficient and sustainable • Build off of successes to ensure sustainability after scaling up • Ad hoc screening with individual students • Allowed the team to assess the utility of various measures • Small tests of change + High confidence in success = Low cost of failure • Active consent • Written consent secured during the initial phase of screening • What were the drawbacks? • How can we build the capacity to screen students more readily?

  15. Evolving Practice: Seeking Innovative Strategies Initial Phase of Implementation • Active Consent • Paper and pencil screening • Single-student or small group screening • Administration facilitated by SMH staff Improved Practices • Passive Consent and Opt-out • Electronic screening • Grade-level or school-wide screening • Administration through advisory and tech courses

  16. Piloting Non-ODR Data Collection • How can we better understand the minor behaviors that occur at a high rate but that do not warrant an ODR? • What will the collection of non-ODR data support for our implementation efforts? • Can we create a system that is efficient, accessible to all, and that produces actionable data to allow us to address minor behaviors before they become more problematic?

  17. Piloting Non-ODR Data Collection • Small pilot data collection project • Six weeks of data collection • ~30 staff members • 759 reports on minor behavior (non-ODRs) • Electronic submission that allowed for ease of analysis • What did we learn? • Highest reported non-ODR behaviors • Frequency of minor behaviors that impact learning • Window of time during the day with the greatest number of non-ODR reports • Why was this important? • Allowing for a more proactive response to behavior • Needs assessment related to behavior • Starting the discussion about policy revision
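For illustration, a minimal pandas sketch of the kind of summary the pilot supported, assuming a hypothetical flat export with "timestamp" and "behavior" columns (the district's actual tool and field names are not specified in the slides):

```python
import pandas as pd

# Hypothetical export of the six-week pilot: one row per non-ODR report.
# File name and column names ("timestamp", "behavior") are assumptions.
reports = pd.read_csv("non_odr_pilot.csv", parse_dates=["timestamp"])

# Most frequently reported minor behaviors.
top_behaviors = reports["behavior"].value_counts().head(5)

# Bucket reports by hour of the school day to find the window with the
# greatest number of non-ODR reports.
reports_by_hour = reports["timestamp"].dt.hour.value_counts().sort_index()
peak_hour = reports_by_hour.idxmax()

print(top_behaviors)
print(f"Most non-ODR reports are logged during the {peak_hour}:00 hour")
```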

  18. Splitting Reported Behavior Down the Middle • Prior to the pilot data collection, there was no formal distinction between reported minor and major behavior • Were all behaviors created equally? • How can we be proactive if we are not aware of small problems that become much larger when unchecked? • Minor behaviors add up • Prevention through instruction • Leveraging all staff to explicitly teach behavior and intervene to address problem behaviors (Classroom managed behavior) • How can we change how behavior has been managed pre-PBIS?

  19. Changing the Philosophy of Behavior Management: The Behavior Flowchart • Aids in the decision making process when reporting behavior • Allows for multiple points of staff intervention and explicit instruction regarding behavioral expectations • Reinforces staff use of PBIS vetted classroom management strategies • Aligns behavior management practices across the district

  20. 2016-2017 - Scaling Up Data Collection • Collecting the right data using a custom data collection tab • Incorporating elements from the pilot data collection • Aligning data collection efforts across the district • Collecting data efficiently • Reports are entered in seconds from any device • The reporting tool incorporates drop-down menus for ease of use and allows for improved data analysis • Allowing equal access to reporting for all staff • Creating a custom reporting tool that makes the data actionable • Data reports can be generated to display aggregate-level data across multiple periods of time • Data can be disaggregated to identify individual students in seconds
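As a sketch of the two reporting views described above, the snippet below assumes the custom tool can export a flat table with "date", "behavior", and "student_id" fields; these names, the export format, and the student ID are illustrative, not the district's actual schema:

```python
import pandas as pd

# Illustrative flat export from the custom non-ODR reporting tool.
df = pd.read_csv("non_odr_reports.csv", parse_dates=["date"])

# Aggregate-level view: counts of each behavior per month, district-wide.
aggregate = (
    df.groupby([pd.Grouper(key="date", freq="M"), "behavior"])
      .size()
      .unstack(fill_value=0)
)

# Disaggregated view: every report for one student, newest first.
student_view = (
    df[df["student_id"] == "S12345"]   # hypothetical student ID
      .sort_values("date", ascending=False)
)
```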

  21. Using Data to Identify Students for Intervention

  22. The Impact of Universal Mental Health Screening Increasing proactive service delivery for students who require mental health services. • Identification of individual students who may require mental health services and supports (group therapy, individual therapy, etc.) • Proactive identification and referral for services serves to reduce the overall impact of mental health problems on students. • The reduction of crises through preventative care improves the overall functioning of a mental health system and decreases the larger impact of crises on the school as a whole. 66% increase in identification of students who require mental health services following implementation of mental health screening in 17-18.

  23. Screening by Area of Concern

  24. Group Therapy - Tier II Step 1: Analyze Screening Data to Identify Potential Group Members Step 2: Referral Process to Identify Group Members Step 3: Counselor Interviews with Identified Population/Collect Pre-Group Data Step 4: Obtain Informed Consent Step 5: Group Sessions & Progress Monitoring Step 6: Collect Post-Group Data/Group Evaluation

  25. Intervention / Treatment Planning (cycle): ID presenting problem → baseline data collection → implementation of EBP → progress monitoring → adjustment to practice → progress monitoring → adjustment to practice → termination of EBP → outcome data collection

  26. Intervention Plan / Treatment Planning Tool

  27. Intervention/Treatment Planning - Tier III Intervention plans have been implemented for approximately 5% of the student population since the 16-17 school year. Intervention plans consist of: • Documentation of the presenting problem • An articulated treatment plan using evidence-based services and supports to directly address the presenting problem • A data collection plan that outlines the frequency of data collection and the type of data to be collected related to the presenting problem Use of intervention plans has supported: • Measurement of individual student growth after the start of services • Assessment of the efficacy of implemented services and supports • Self-reflection and adjustment to practice • Accountability for individual staff members and the larger CSMHS

  28. Using Behavioral Data to Identify Students How can we use this data to identify students who may be candidates for more targeted services? • Establishing preliminary data rules (3+ non-ODRs in a two week period) • Assessing the needs of our students • Assessing the accuracy of the data related to the fidelity of implementation How can we truly make this data actionable by designing an intervention protocol to close out referrals for students who have met our data rule?
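A hedged sketch of how the preliminary data rule (3+ non-ODRs in a two-week period) could be applied programmatically, again assuming a flat export with "student_id" and "date" columns:

```python
import pandas as pd

def students_meeting_rule(reports: pd.DataFrame,
                          window_days: int = 14,
                          threshold: int = 3) -> list:
    """Return IDs of students with `threshold`+ non-ODRs in any rolling
    `window_days`-day span. Column names are assumptions."""
    flagged = []
    for student_id, group in reports.groupby("student_id"):
        # One observation per report, indexed by report date.
        ones = pd.Series(1, index=group["date"].sort_values())
        rolling_counts = ones.rolling(f"{window_days}D").sum()
        if (rolling_counts >= threshold).any():
            flagged.append(student_id)
    return flagged

reports = pd.read_csv("non_odr_reports.csv", parse_dates=["date"])
referral_list = students_meeting_rule(reports)
```

The bi-weekly referral cycle described on the following slides could then work from a list generated this way.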

  29. Table

  30. Managing Referrals: The Non-ODR Intervention Team • Year 3 of implementation; 733 referrals made to date • The non-ODR intervention team consists of: • Counselors • Administrators • PBIS team members • Team members have access to the referral list of students who meet the non-ODR data rule • Referral cycles run bi-weekly • Team members: • Meet with the student • Explicitly teach the expected behavior • Assess the function of the behavior • Process and problem solve to reduce the behavior

  31. Managing Referrals: The Non-ODR Intervention Team • Information gained through the processing and problem solving meeting is communicated back to the staff working with the student (including all individuals who entered a non-ODR for the student) • Follow-up will assess the efficacy of the intervention and lead into Tier II interventions if necessary • Students who do not respond to the processing / problem-solving meeting are referred for a tier II intervention, such as Check-in / Check-out (CICO) • Program is scaling-up to grammar schools in the 18-19 school year

  32. Progress Monitoring

  33. Activity In small groups, discuss the following questions: • How do we know when our therapeutic interventions are working? Not working? • What data do we typically use to assess the efficacy of our interventions? • How often are qualitative data or secondary / tertiary outcomes used to evaluate the impact of SMH staff?

  34. What is Progress Monitoring? “Progress monitoring is used to assess students’ academic performance, to quantify a student rate of improvement or responsiveness to instruction, and to evaluate the effectiveness of instruction. Progress monitoring can be implemented with individual students or an entire class. In progress monitoring, attention should focus on fidelity of implementation and selection of evidence-based tools, with consideration for cultural and linguistic responsiveness and recognition of student strengths.” -Center on Response to Intervention (RtI)

  35. Progress Monitoring: A Research-driven Approach “Although monitoring of treatment response is standard practice for many medical conditions, practitioners in mental health treatments, and substance abuse treatment in particular, have been slow to adopt these practices. Progress monitoring (PM), consisting of measurement and feedback, has the potential to significantly improve treatment outcomes.” -Goodman, McKay, & DePhilippis (2013) “Research shows that when both therapists and clients receive feedback on progress, clients tend to have better outcomes.” -Lambert et al. (2002)

  36. The Importance of Progress Monitoring • Gauge the efficacy of the therapeutic or behavioral approach - Determine what is working and what is not • Adjustment to practice - Change the treatment / intervention plan if the student is not responding to the therapeutic or behavioral approach • Improves: • Student engagement in services • Quality of services • Consistency of therapy sessions or behavioral feedback • Staff self-assessment and adjustments to practice

  37. Measure Twice, Cut Once... What specific problem am I hoping to help the student with? Does my therapeutic approach / behavioral intervention match the needs of the student? If the student is making progress, what will change? What tools exist to measure this change? How often should I measure this change? Are there multiple changes that I can measure? How will this data inform my practice?

  38. What Are We Measuring? • Symptom presentation • Emotional / behavioral regulation • Specific behaviors • Engagement • Self-concept • Overall functioning Consider multiple measures of progress to gain a more complete picture of the impact of the intervention.

  39. SHAPE: School Health Assessment and Performance Evaluation System

  40. Methods for Conducting Progress Monitoring • Embedding progress monitoring into individual and group therapy sessions • Leveraging the behavioral data system used by the district • Leveraging observations from parents and staff (behavior point sheets; teacher-completed scales) • Collecting wide-scale baseline data using universal mental health screening

  41. Chart

  42. Line Chart

  43. Line Chart 2 • Progress monitoring intervals of two weeks (GAD-7, PHQ-9, and SDQ subscales) • Graphical history of the student’s response to treatment
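A minimal matplotlib sketch of such a graphical history, using made-up GAD-7 scores collected at two-week intervals (the data points and cutoff annotation are illustrative only):

```python
import matplotlib.pyplot as plt

# Hypothetical GAD-7 scores for one student, collected every two weeks.
dates = ["Oct 1", "Oct 15", "Oct 29", "Nov 12", "Nov 26"]
gad7 = [15, 13, 11, 9, 8]

plt.plot(dates, gad7, marker="o")
plt.axhline(10, linestyle="--", color="gray",
            label="GAD-7 moderate-anxiety cutoff (10)")
plt.ylabel("GAD-7 score")
plt.title("Response to treatment: bi-weekly progress monitoring")
plt.legend()
plt.show()
```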

  44. Sample CICO Run-chart

  45. Post-Group Data/Group Evaluation Average GAD-7 score pre-group: 15.22 Average GAD-7 score post-group: 8.42 Indicates an average decrease of roughly 7 points on the GAD-7, bringing the post-group average into the mild anxiety range
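As a quick check of the arithmetic, using the standard published GAD-7 severity bands (0-4 minimal, 5-9 mild, 10-14 moderate, 15-21 severe):

```python
# Pre/post group means from the slide.
pre_mean, post_mean = 15.22, 8.42

change = pre_mean - post_mean            # 6.80, i.e. roughly a 7-point drop
post_is_mild = 5 <= post_mean <= 9       # 8.42 falls in the "mild" band

print(f"Average decrease on the GAD-7: {change:.2f} points")
print(f"Post-group mean in the mild anxiety range: {post_is_mild}")
```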

  46. Ongoing Needs Assessment & System Evaluation

  47. Using Aggregated Behavioral & Psychosocial Data Understanding the mental health needs of the district comprehensively to inform the design of the mental health system. • Aggregated data can function as a needs assessment • Informs SEL and behavioral expectations curriculum design and delivery • Informs prevention work • Informs the design of Tier II interventions that target specific areas of need • Identifies funding and resource gaps • Understanding the impact of behavioral and psychosocial functioning on academic achievement

  48. 13.36 percent of students in grades 5-8 scored in the moderate to severe ranges for internalizing issues (depression, anxiety, etc.). Source: Methuen Public Schools (2018), Table 2

  49. Behavioral Data Analysis • More than 101,000 non-ODRs have been logged district-wide since the inception of data collection for minor behaviors • Collection of this data has supported: • Understanding the top minor behaviors in the school / district in order to aid in the design of Tier I instruction that addresses targeted needs (Late to Class) • Understanding how staff intervene when behaviors occur in the classroom (Teach, Process, Problem Solve) • Understanding which responses most effectively extinguish specific minor behaviors

  50. Behavioral Data Analysis • Collection of this data has supported: • Identification of students who may require more intensive interventions (~150 students per year = ~7.3% of the total student population receive 3+ non-ODRs in a two-week period) • Understanding the fidelity of implementation (% of staff using the reporting tool for non-ODRs = ~91%) • How does this impact our identification of students? • How does this impact our total data analysis?
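For illustration, the two percentages cited above can be reproduced from hypothetical underlying counts (the enrollment and staff totals below are assumptions chosen only to match the reported figures):

```python
# Assumed counts; only the resulting percentages appear on the slide.
students_flagged_per_year = 150
total_enrollment = 2050        # assumption: ~150 / 2050 ≈ 7.3%
staff_using_tool = 91
total_staff = 100              # assumption: 91 / 100 = 91%

print(f"Students meeting the 3+ non-ODR rule: "
      f"{students_flagged_per_year / total_enrollment:.1%}")
print(f"Staff using the non-ODR reporting tool: "
      f"{staff_using_tool / total_staff:.1%}")
```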
