
RTI Data Analysis


Presentation Transcript


    1. RTI Data Analysis: Making Sense of It All

    2. Quick review of RTI Core Components

    3. Universal Screening, Tiered Interventions, Research-based Practices, Fidelity Monitoring, Data-based Decision Making

    4. Tier I meets the needs of 80-85% of students. Tier II is provided to the remaining 15-20% of students and, in conjunction with Tier I, meets the needs of all but about 5-8%. Tier III is provided to the Tier II non-responders.

    5. Data-based Decision Making: Universal screening FIRST identifies systemic problems (instructional or curricular), THEN identifies individual student problems.

    6. One of the most common mistakes in RTI data analysis is skipping the first step.

    7. Universal Screening Data Analysis

    8. What are you looking for? Is Tier I working? How can you tell? Do you have any curriculum problems? Do you have any instructional problems?

    9. Is Tier I Working?

    10. Yes!

    11. Is Tier I Working?

    12. No.

    13. Is the problem with the curriculum or is it with instruction? How can you tell?

    14. Using CBM to Identify Curriculum Problems

    15. AIMSweb

    16. District-wide Screening Results

    17. Universal Screening Interpreting Results Between and Within Campuses

    18. Here's a sample of universal academic screening data in reading fluency at second grade. Data collected via CBM in August. Do you see any problems? Well, first of all, Barringer Elementary seems EXEMPLARY!! Yea!

    19. Montoya Elementary School has a problem with one classroom. Saenz Elementary School has a systemic problem at second grade. Could be instructional; could be curricular. Need to find out what's up before analyzing individual student performance.

    20. Selecting Universal Screening Tools

    22. Tier II: RTI at the Individual Student Level

    23. Quick Review of Problem-Solving Method

    25. Collect Data and Define the Problem: The team uses data to analyze the problem and develop a hypothesis about the core deficit or reason for the problem.

    26. Define the Problem: Concrete and measurable, usually stated as the difference between the student's performance and a benchmark standard. Collect baseline data on the behavior or performance. The data must be high quality, because this is what you will analyze later.

    27. 5 Components of Data-Driven Instruction: good baseline data, measurable instructional goals, frequent formative assessment, professional learning communities, and focused instructional interventions.

    28. Teachers and administrators have access to lots of data. How are you using it? You must have high-quality data, because...

    29. More is not always better. And not all data is worth analyzing.

    30. GIGO (garbage in, garbage out)

    31. How do you select a Progress Monitoring tool that meets your needs?

    32. High Quality Academic Data: 6 Characteristics of Effective Progress Monitoring Systems (adapted from Fuchs and Fuchs, 1999)

    37. Curriculum-Based Measurement: CBM is one form of progress monitoring with a growing research base. CBA vs. CBM. Research and application date to the '70s (Deno). First used to assess progress toward IEP goals. More research in reading than in math. Is it a General Outcome Measure? Does NOT assess fluency for the sake of fluency!

    38. Why not use end-of-unit tests? Susie is a 4th grader referred for special education evaluation because she is falling further behind her peers. She is in a Tier 2 reading intervention, working at the second grade level.

    39. Data from End of Unit Tests

    41. BUT WHAT ABOUT TAKS?

    42. Do not make me bring out my friend

    43. Do not despair

    44. Research on CBM Oral Reading Fluency and Performance on Statewide Assessments Colorado (Shaw & Shaw, 2002) Florida (Buck & Torgeson, 2003; Castillo, Torgeson, Powell-Smith & Al Otaiba, 2003) Illinois (Sibley, Biwer, & Hesch, 2001) Michigan (McGlinchey & Hixson, 2004) Minnesota (Hintze & Silberglitt, 2005) North Carolina (Barger, 2003) Oregon (Crawford, Tindal & Stieber, 2001) Washington (Stage & Jacobson)

    45. Reading: Average correlation between CBM ORF and performance on state assessments was in the .60 to .75 range.

    46. Math: Helwig, Anderson, and Tindal, 2002. CBM math probe: 48 problems, including both computation and problem solving, untimed. Predicted which students would meet the state math standards with 87% accuracy.

    47. Math: Shapiro, Keller, Lutz, Santoro, & Hintze, 2006. CBM & state assessment (PSSA). CBM Math Computation (25 mixed-operation problems): positive predictive power .85 for spring administration. CBM Math Concepts (18 or 24 problems): positive predictive power .88 to .91 for spring administration.
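    For readers unfamiliar with the statistic on this slide: positive predictive power is the proportion of students the CBM probe flags as at risk who actually go on to miss the standard. A minimal sketch follows; the counts are invented for illustration, not taken from the Shapiro et al. study.

        # Positive predictive power: of the students the probe flagged as
        # at risk, what fraction truly failed to meet the state standard?
        def positive_predictive_power(true_positives: int, false_positives: int) -> float:
            return true_positives / (true_positives + false_positives)

        # Hypothetical: 34 of 40 flagged students later missed the standard.
        print(positive_predictive_power(34, 6))  # 0.85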

    48. Set a Goal: Use baseline data and the results of problem analysis to set a measurable goal for the student. The goal must be challenging yet attainable.

    49. SMART Goals: Specific, Measurable, Attainable, Results-Oriented, and Time-Bound. Example: The percentage of third grade students scoring 2100 or higher on the state mathematics test will increase from 64% in Spring 2008 to 82% in Spring 2009. Focus areas for improvement: number sense, computation, measurement.

    50. Match an intervention to the student's deficit. Check the research supporting the proposed intervention to verify that the effect size is adequate (does the intervention result in large enough improvement to allow for goal attainment?) and the duration is appropriate (will the intervention result in improvement within the timeline established for goal attainment?). www.ies.ed.gov (What Works Clearinghouse)

    51. Implement the Intervention. Monitor Progress. Monitor Treatment Fidelity.

    52. Charting Progress Data Using Microsoft Excel. Pros: fast, accessible, easy to learn, easy to make changes. Cons: can be a little intimidating for the novice; requires continuous access to a computer. Sample: CASTLE School Data Tutorials, www.schooldatatutorials.org
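    For anyone who would rather script the chart than build it in Excel, here is a minimal sketch using Python and matplotlib. The weekly scores and the goal-line growth rate are invented for illustration. Note the equal weekly intervals on the X axis, which the next slide asks about.

        import matplotlib.pyplot as plt

        weeks = list(range(1, 11))                          # equal weekly intervals on the X axis
        scores = [12, 14, 13, 17, 18, 21, 20, 24, 26, 27]   # words read correctly per minute
        goal = [12 + 2 * (w - 1) for w in weeks]            # goal line: baseline + 2 words/week

        plt.plot(weeks, scores, marker="o", label="CBM WRCPM")
        plt.plot(weeks, goal, linestyle="--", label="Goal line")
        plt.xlabel("Instructional week")
        plt.ylabel("Words read correctly per minute")
        plt.title("Weekly Progress Data")
        plt.legend()
        plt.show()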

    54. Why is it important to use equal time intervals on the X axis?

    55. Sample

    56. Trend line: a straight line that best estimates the trend in a set of data points. Also referred to as the slope of improvement.

    57. Tukey Method Step 1: Divide the data points into three equal sections by drawing two vertical lines. Step 2: In the first and third sections, find the median data-point and median instructional week. Locate the place on the graph where the two values intersect and mark with an X. Step 3: Draw a line through the two Xs. (Hutton, Dubes, & Muir, 1992)

    59. Step 1: Divide the data points into three equal sections by drawing two vertical lines.

    61. Step 2: In the first and third sections, find the median data-point and median instructional week. Locate the place on the graph where the two values intersect and mark with an X.

    62. Tukey Method

    63. Step 3: Draw a line through the two Xs.

    64. Tukey Method
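    The three Tukey steps just walked through can be computed as well as drawn. Here is a minimal sketch in Python, assuming one score per instructional week; the sample scores are invented, and when the number of points is not divisible by three, the middle section simply absorbs the remainder.

        from statistics import median

        def tukey_trend_line(weeks: list[int], scores: list[float]) -> tuple[float, float]:
            """Return (slope, intercept) of the Tukey trend line."""
            n = len(scores)
            third = n // 3
            # Step 1: divide the data into three sections; keep the first and third.
            first_w, first_s = weeks[:third], scores[:third]
            last_w, last_s = weeks[n - third:], scores[n - third:]
            # Step 2: median instructional week and median data point in each section.
            x1, y1 = median(first_w), median(first_s)
            x2, y2 = median(last_w), median(last_s)
            # Step 3: the line through the two X's.
            slope = (y2 - y1) / (x2 - x1)
            return slope, y1 - slope * x1

        slope, _ = tukey_trend_line(list(range(1, 10)),
                                    [10, 12, 11, 14, 15, 15, 18, 17, 20])
        print(f"slope of improvement: {slope:.2f} words per week")  # ~1.17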

    65. Evaluating the Response: The problem-solving team reviews data to determine how the student has responded to the intervention. Document the team's actions.

    66. Evaluating Response: Sample Decision Rules. In CBM screening data, some (Fuchs) have recommended that students at the 20th percentile on CBM data be identified as non-responders. Data: 4 data points below the goal line. Decision: change the intervention. Data: 7 data points above the goal line. Decision: increase the goal.
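    The two goal-line rules above are mechanical enough to script. A minimal sketch, assuming the rules apply to the most recent consecutive weekly data points and using the 4-below/7-above thresholds from this slide; the function name and sample scores are hypothetical.

        def evaluate_response(scores: list[float], goal_line: list[float]) -> str:
            """Apply the sample decision rules: compare each week's score
            to the goal-line value expected for that week."""
            paired = list(zip(scores, goal_line))
            # 4 consecutive points below the goal line -> change the intervention
            if len(paired) >= 4 and all(s < g for s, g in paired[-4:]):
                return "change the intervention"
            # 7 consecutive points above the goal line -> increase the goal
            if len(paired) >= 7 and all(s > g for s, g in paired[-7:]):
                return "increase the goal"
            return "continue the intervention and keep monitoring"

        # Example: four straight weeks below the goal line.
        print(evaluate_response([11, 12, 12, 13], [13.0, 13.6, 14.2, 14.8]))
        # -> change the intervention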

    67. Evaluating Response. Data: the student's slope has remained essentially the same. Decision: the team may try different Tier II interventions, add supplemental interventions, or, eventually, move to Tier III. Data: the student's learning slope has declined. Decision: the team will need to change the intervention or move the student to Tier III.

    71. Progress Monitoring Resources Dynamic Indicators of Basic Early Literacy Skills (DIBELS) http://dibels.uoregon.edu/index.php Edcheckup http://www.edcheckup.com EdProgress http://www.edprogress.com

    72. Progress Monitoring Resources Evidence-Based Progress Monitoring and Improvement System http://www.aimsweb.com McGraw-Hill Digital Learning www.ctb.com/mktg/ypp/ypp_index.jsp Intervention Central http://www.interventioncentral.org

    73. Progress Monitoring Resources Monitoring Basic Skills Progress (MBSP) www.studentprogress.org/chart/progressmonitoringtools/mbsp_reading.htm National Center on Accessing the General Curriculum http://www.cast.org/publications/ncac/ncac_curriculumbe.html

    74. Progress Monitoring Resources National Center on Student Progress Monitoring www.studentprogress.org National Consortium on Oral Reading Fluency www.cast.org/system/galleries/download/ncac/CurBasEval.pdf Read Naturally http://www.readnaturally.com

    75. Managing Progress Data: Commercial products/systems include AIMSweb and DIBELS. The CASTLE website lists 10 software packages for monitoring daily/weekly and 17 packages for monitoring 3 to 10 times per year: www.scottmcleod.net/storage/2006%20-%20CASTLE%20-%20Formative%20Assessment%20Software.pdf

    76. Online help for Do-It-Yourselfers: Intervention Central (Chartdog), www.interventioncentral.org. CASTLE School Data Tutorials (tutorials for managing and graphing data in Excel), www.schooldatatutorials.org

    77. Adam. Adam's teacher is concerned about his reading.

    78. Existing Data: Universal interventions have been in place for 6 weeks. The teacher has consulted with the lead teacher and implemented recommended modifications. The teacher has provided differentiated instruction.

    79. Adam's teacher is using Hasbrouck's norms for fluency as her comparison standard.

    80. Adam attends Montoya Elementary School. Universal Screening Tool: CBM WRCPM (words read correctly per minute), administered 3 times per year. Universal Screening Data (Fall): Adam's score = 11 (10th percentile); class average score = 51.

    81. Adam's Progress, 1st Six Weeks: CBM progress data collected weekly:

    83. Decision Point: Adam is referred to the Problem-Solving Team. Decisions: set a goal and implement a Tier II intervention. Goal: Adam's performance will increase from the 10th percentile to the 25th percentile by the date of the third universal screening.

    84. Note: The goal is long term (30 weeks), but the goal line can be used to assess progress at much smaller intervals (weekly, or twice weekly if necessary).

    85. Hasbrouck's Fluency Norms

    87. Creating Adam's Goal Line

    88. Creating Adam's Goal Line
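    One way to build the goal line is to connect the baseline score to the target score across the goal period; a minimal sketch follows. Adam's baseline of 11 WRCPM and the 30-week window come from the earlier slides, but the spring target of 61 WRCPM (second-grade 25th percentile) is an assumed value here: substitute the actual figure from the Hasbrouck norms table.

        # Goal line: a straight line from baseline to goal over the goal period.
        baseline = 11      # Adam's fall score (WRCPM), from the screening data
        target = 61        # ASSUMED 25th-percentile spring norm; check the Hasbrouck table
        weeks = 30         # long-term goal window, per the slides

        growth_per_week = (target - baseline) / weeks          # ~1.67 words/week
        goal_line = [baseline + growth_per_week * w for w in range(weeks + 1)]

        print(f"expected weekly growth: {growth_per_week:.2f} WRCPM")
        print(f"goal-line value at week 15: {goal_line[15]:.1f} WRCPM")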

    90. Decision Point. Decision: Adam is making progress. Continue to implement the intervention and monitor progress.

    92. Decision Point. Decision: Raise Adam's goal and continue to implement the intervention.

    94. The team continues to review data and make decisions, following pre-established decision rules.

    95. Thank You! Mary Barringer, Ph.D. mary@thesbsgroup.org 979-220-4436
