Best Practices in Data-Based Decision Making Within an RTI Model


Presentation Transcript


  1. Best Practices in Data-Based Decision Making Within an RTI Model Gary L. Cates, Ph.D. Illinois State University GaryCates.net Ben Ditkowsky, Ph.D. Lincolnwood School District 74 MeasuredEffects.Com

  2. Acknowledgments • Cates, Blum, & Swerdlik (2011), authors of Effective RTI Training and Practices: Helping School and District Teams Improve Academic Performance and Social Behavior (Champaign, IL: Research Press) and of this PowerPoint presentation.

  3. Response to Intervention Is Data-Based Decision Making • Comprehensive system of student support for academics and behavior • Has a prevention focus • Matches instructional needs with scientifically based interventions/instruction for all students • Emphasizes data-based decision making across a multi-tiered framework

  4. Data-Based Decision Making with Universal Screening Measures

  5. Presentation Activity 1 • What have you heard about universal screening measures? • What are your biggest concerns?

  6. 3 Purposes of Universal Screening • Predict which students are at risk of not meeting AYP (or long-term educational goals) • Monitor progress of all students over time • Reduce the need for more in-depth diagnostic assessment with all students • Screening is needed for reading, writing, math, and behavior

  7. Rationale for Using Universal Screening Measures • It is analogous to medical check-ups (but three times a year, not once) • Determine whether all students are meeting milestones (i.e., benchmarks) that predict adequate growth • Provide intervention/support if they are not

  8. Characteristics of Universal Screening Measures • Brief to administer • Allow for multiple administrations • Simple to score and interpret • Predict fairly well which students are at risk of not meeting AYP

  9. Presentation Activity 2 • What universal screening measures do you have in place currently for: • Reading? • Writing? • Math? • Behavior? • How do these fit with the characteristics of USM outlined on the previous slide?

  10. Examples of Universal Screening Measures for Academic Performance (USM-A): Curriculum-Based Measurement

  11. Data-Based Decision Making with USM-A

  12. Student Identification: Percentile Rank Approach • Dual discrepancy to determine a change in intensity (i.e., tier) of service • Cut Scores • Consider percentiles • District-derived cut scores are based on a screening instrument's ability to predict state scores • Rate of Improvement • Average gain made per day or per week
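Not part of the original deck: a minimal sketch of the percentile-rank approach in Python, assuming invented student scores and a simple nearest-rank percentile. A district would substitute its own cut score, such as one derived from the screener's ability to predict state scores.

```python
# Minimal sketch of a percentile-rank cut: flag students scoring below the
# chosen percentile on a fall benchmark. All names and scores are invented.
import math

def percentile_cut(scores, pct):
    """Score at the given percentile, using a simple nearest-rank method."""
    ordered = sorted(scores)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

fall_wcpm = {"Ana": 42, "Ben": 71, "Cruz": 38, "Dee": 90, "Eli": 55}
cut = percentile_cut(list(fall_wcpm.values()), 25)  # 25th percentile (assumed)

at_risk = [name for name, score in fall_wcpm.items() if score < cut]
print(f"Cut score: {cut}; flagged for additional support: {at_risk}")
```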

  13. [Chart slide: a sampling of students vs. all students included]

  14. Student Identification: Dual-Discrepancy Approach • Rate of Improvement • Average gain made per day or per week • Compared to peers (or a cut score) over time

  15. [Chart slide: a sampling of students vs. all students included]

  16. Dual Discrepancy • Discrepant from peers (or an empirically supported cut score) at data collection point 1 (e.g., fall benchmark) • Discrepancy continues or becomes larger at point 2 (e.g., winter benchmark) • This is referred to as a student's rate of improvement (ROI)
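To make the dual-discrepancy check concrete, here is a hedged sketch in Python. The benchmark cut score, the expected weekly gain, and the student data are all assumptions for illustration, not values from the presentation.

```python
# Hedged sketch of a dual-discrepancy check: a student is flagged when both
# (1) the level at benchmark 2 is below the cut score and (2) the rate of
# improvement (ROI) is below the expected weekly gain. All values invented.

def roi(score_start, score_end, weeks):
    """Average gain per week between two benchmark points."""
    return (score_end - score_start) / weeks

WINTER_CUT = 70               # assumed winter benchmark cut score
EXPECTED_WEEKLY_GAIN = 1.2    # assumed typical peer ROI (e.g., wcpm per week)
WEEKS_BETWEEN_BENCHMARKS = 18

students = {"Ana": (42, 58), "Ben": (71, 95), "Cruz": (38, 44)}  # (fall, winter)

for name, (fall, winter) in students.items():
    rate = roi(fall, winter, WEEKS_BETWEEN_BENCHMARKS)
    if winter < WINTER_CUT and rate < EXPECTED_WEEKLY_GAIN:
        print(f"{name}: dual discrepancy (winter = {winter}, ROI = {rate:.2f}/week)")
```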

  17. Resources as a Consideration • Example: comparing students to a percentile rank or some national cut score without considering available resources • You want to minimize: • False positives (students flagged who did not need support) • False negatives (students who needed support but were missed) • This can be facilitated with an educational diagnostic tool
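One way to weigh false positives against false negatives is to check a cut score's classification accuracy against a later outcome. The sketch below is illustrative only; the records and the cut score of 50 are invented.

```python
# Sketch: classification accuracy of an assumed cut score against a later
# outcome (met the state standard or not). All records are invented.

# (screening score, met standard later)
records = [(38, False), (42, False), (48, True), (55, True),
           (65, False), (71, True), (90, True)]
CUT = 50  # assumed cut score: below 50 is flagged as at risk

tp = sum(1 for s, met in records if s < CUT and not met)   # correctly flagged
fp = sum(1 for s, met in records if s < CUT and met)       # false positive
fn = sum(1 for s, met in records if s >= CUT and not met)  # false negative
tn = sum(1 for s, met in records if s >= CUT and met)      # correctly passed over

print(f"sensitivity = {tp / (tp + fn):.2f}")  # share of truly at-risk students caught
print(f"specificity = {tn / (tn + fp):.2f}")  # share of on-track students left alone
```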

  18. Correlations • Direction (positive or negative) • Magnitude/strength (0 to 1) • To see how much overlap (i.e., shared variance) the two variables have, square the correlation: if r = .70, then about 49% of the variance is shared

  19. A Word About Correlations • A correlation tells us about the strength of a relationship • A correlation does not tell… • …the direction of the relationship (whether A causes B or B causes A) • …whether the relationship is causal at all, or whether a third variable C causes both A and B • Strong correlations do not always equate to accurate prediction for specific populations
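A small worked example of the r-squared point, using invented screening and state-test scores (statistics.correlation requires Python 3.10 or later):

```python
# Sketch: Pearson r between a screening measure and a later state test, plus
# shared variance (r squared). All scores are invented for illustration.
import statistics

screen = [38, 42, 55, 61, 71, 90]        # fall screening scores (assumed)
state = [180, 195, 210, 215, 240, 265]   # later state-test scores (assumed)

r = statistics.correlation(screen, state)  # Pearson r (Python 3.10+)
print(f"r = {r:.2f}; shared variance = {r * r:.0%}")
# A correlation of .70 would mean about 49% shared variance: strong, but
# still leaving half of the variance in state scores unexplained.
```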

  20. Presentation Activity 3 • How are you currently making data-based decisions using the universal screening measures you have? • Do you need to make some adjustments to your decision-making process? • If you answered yes, what might those adjustments be?

  21. Data-Based Decision Making with USM-B

  22. Some Preliminary Points • Social behavior screening is just as important as academic screening • We will focus on procedures (common sense is needed: If a child displays severe behavior, then bypass the system we will discuss today) • We will focus on PBIS and SSBD • The programs are examples of basic principles • You do not need to purchase these exact programs

  23. Office Discipline Referrals • Good as a stand-alone screening tool for externalizing behavior problems • Also good for analyzing schoolwide data • Discussed later

  24. Teacher Nomination • Teachers are generally good judges • Nominate three students as externalizers • Nominate three students as internalizers • Trust your instincts and make a decision • There will be a more sophisticated process to confirm your choices

  25. Confirming Teacher Nominations with Other Data • Teacher, parent, and student rating scales • BASC (Behavior Assessment System for Children) • CBCL (Child Behavior Checklist; Achenbach)

  26. Example: Systematic Screening for Behavior Disorders (SSBD) • Critical Events Inventory: • 33 severe behaviors (e.g., physical assault, stealing) in checklist format • Room for other behaviors not listed • Adaptive Scale: Assesses socially appropriate functional skills (e.g., following teacher directions) • Maladaptive Scale: Assesses risk for developing antisocial behavior (e.g., testing teacher limits)

  27. Data-Based Decision Making Using Universal Screening Measures for Behavior • Computer software available • Web-based programs also available • See handout (Microsoft Excel Template)

  28. Average Referrals Per Day Per Month
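A sketch of how the "average referrals per day per month" chart might be computed from a raw ODR log; the referral dates and school-day counts below are invented for illustration.

```python
# Sketch: average office discipline referrals (ODRs) per school day, by month.
# The referral log and the school-day calendar below are invented.
from collections import Counter

odr_log = ["2010-09-03", "2010-09-03", "2010-09-17", "2010-10-05",
           "2010-10-05", "2010-10-12", "2010-10-26", "2010-11-02"]
school_days = {"2010-09": 20, "2010-10": 21, "2010-11": 18}  # assumed calendar

referrals_per_month = Counter(date[:7] for date in odr_log)  # "YYYY-MM" key
for month, days in school_days.items():
    print(f"{month}: {referrals_per_month.get(month, 0) / days:.2f} ODRs per school day")
```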

  29. ODR Data by Behavior

  30. ODR Data by Location

  31. ODR Data by Time of Day

  32. ODR Data by Student
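Slides 29 through 32 disaggregate the same ODR log along different dimensions. A minimal sketch of those breakdowns, with invented records, might look like this:

```python
# Sketch: disaggregating an ODR log by behavior, location, student, and time
# of day, mirroring the four charts above. All records are invented.
from collections import Counter

odrs = [
    {"student": "S01", "behavior": "disruption", "location": "classroom", "time": "10:05"},
    {"student": "S02", "behavior": "aggression", "location": "playground", "time": "12:15"},
    {"student": "S01", "behavior": "disruption", "location": "hallway", "time": "12:20"},
    {"student": "S03", "behavior": "defiance", "location": "classroom", "time": "14:05"},
]

for field in ("behavior", "location", "student"):
    print(field, dict(Counter(record[field] for record in odrs)))

# Time of day binned to the hour, as in the "ODR Data by Time of Day" chart
print("hour", dict(Counter(record["time"][:2] for record in odrs)))
```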

  33. Review of Important Points: Academic Performance • USMs are used for screening and progress monitoring • It is important to adhere to the characteristics above when choosing a USM • USM-As are typically similar to curriculum-based measurement procedures • There are many ways to choose appropriate cut scores, but it is critical that available resources be considered

  34. Review of Important Points: Behavior • Social behavior is an important area for screening • The number of office discipline referrals is a strong measure for schoolwide data analysis and externalizing behavior • Both internalizing and externalizing behaviors should be screened using teacher nominations • Follow up with rating scales • Use computer technology to facilitate the data-based decision-making process

  35. Data-Based Decision Making with Diagnostic Tools for Academic Performance and Social Behavior

  36. Presentation Activity 1 • What have you heard about diagnostic tools? • What are your biggest concerns?

  37. 3 Purposes of Diagnostic Tools • Follow up with any student identified on the USM as potentially needing additional support • Identify a specific skill or subset of skills for which students need additional instructional support • Assist in linking students with skill deficits to empirically supported intervention

  38. Rationale for Using Diagnostic Tools • Rule out (or confirm) concerns flagged by a universal screening measure • Find an appropriate diagnosis • Identify an effective treatment

  39. Characteristics of Diagnostic Tools • Might be administered in a one-to-one format • Require more time to administer than a USM • Generally contain a larger sample of items than a USM • Generally have a wider variety of items than a USM

  40. Presentation Activity 2 • What diagnostic tools (DT) do you have in place currently for: • Reading? • Writing? • Math? • Behavior? • How do these fit with the characteristics of DTs outlined on the previous slide?

  41. Examples of Diagnostic Tools for Academic Skills (DT-A) at Tier III and Special Education: Curriculum-Based Evaluation

  42. Curriculum-Based Evaluation • Answer this: What does the student need in addition to what is already being provided (i.e., intensification of service)? • Conduct an analysis of student responding • Record review: Work samples • Observation: Independent work time • Interview: Ask the student why he or she struggles • Develop a hypothesis based on the above • Formulate a “test” of this hypothesis

  43. Data-Based Decision Making with DT-A
