Data Analysis within an RtI2 Framework: Linking Assessment to Intervention

Aimee R. Holt, PhD

Middle Tennessee State University

What is RTI2?

  • A systematic and data-based method for addressing academic concerns:

    • identifying

    • defining &

    • resolving

      Brown-Chidsey & Steege (2010)

RTI2 is a general education initiative….

  • Components of RTI2

    • High-quality instruction

    • Frequent assessment of academic skills

    • Data-based decision making

      Brown-Chidsey & Steege (2010)

Problem solving

  • Analyze the Assessment Plan Results

  • Develop an Intervention Plan

  • Analyze the Results of Implementation

  • Determine Next Steps

Implement Plan

Progress Monitor

Problem Solving

  • At each tier within RTI2, a problem solving model is employed to make decisions

Problem Identification

Plan Evaluation

Problem Analysis

Universal Screeners

  • LEAs are required to:

    • Administer a nationally normed, skills-based universal screener to students at their grade level

  • For K-8, Universal Screeners should be administered 3 times per year

  • In grades 9-12, there are multiple sources of data that can be reviewed, such as:

    • EXPLORE, PLAN, and ACT; the Tennessee Comprehensive Assessment Program (TCAP), which includes Writing (TCAP-WA), End of Course (EOC), and 3-8 Achievement; in 2014-2015, the Partnership for Assessment of Readiness for College and Careers (PARCC); and TVAAS

Characteristics of Appropriate Universal Screening Tools

  • Helps answer questions about the efficiency of the core program

    • Aligns with curriculum for each grade level

      • Skills mastery aligns with state mandated year-end assessment

        Ikeda, Neessen, & Witt (2008).

3 Types of CBMs

  • General Outcome Measures (GOMs)

  • Skill Based Measures

  • Sub-skill Mastery Measures

General Outcome Measures

  • GOMs

    • sample performance

    • across several goals at the same time

    • capstone tasks

      • Ex. Oral reading fluency

  • Can be used for

    • screening (benchmarking),

    • survey & specific level assessment

    • progress monitoring

Skills-Based Measures

  • SBMs are similar to GOMs but can be used when capstone tasks are not available

    • Ex. Math computation

  • Can be used for

    • screening (benchmarking),

    • survey & specific level assessment

    • progress monitoring

Subskill Mastery Measures

  • SMMs are very narrow in focus

    • Ex. Names of letters

      • Should not be used for benchmarking

        • (exception… early skills such as Letter Naming Fluency, Letter Sound Fluency, Number Naming Fluency)

Making Decisions about Group Data

  • Review universal screening data to answer the following questions:

    • Is there a class-wide problem?

    • Who needs a Tier II intervention?

      • Be sure to examine students at the margin

    • Does anyone need Tier III now?
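The three screening questions above can be sketched as a small helper. This is only an illustration: the function name is hypothetical, the cutoffs are the winter ORF percentile benchmarks cited later in the presentation, and the 50% class-wide threshold is an assumed convention, not a rule from the slides.

```python
def screening_decision(scores, p25=72, p10=44, class_problem_frac=0.5):
    """Sort students into tier suggestions from universal screening scores.

    scores: dict of student -> wcpm (words correct per minute).
    Returns (per-student suggestions, class-wide-problem flag).
    """
    tiers = {}
    for student, wcpm in scores.items():
        if wcpm < p10:
            tiers[student] = "consider Tier III now"
        elif wcpm < p25:
            tiers[student] = "Tier II intervention"
        else:
            tiers[student] = "core (Tier I)"
    # A class-wide problem is flagged when a large share of the class
    # falls below the 25th-percentile benchmark (threshold is illustrative).
    below = sum(1 for w in scores.values() if w < p25)
    class_wide = below / len(scores) >= class_problem_frac
    return tiers, class_wide
```

For example, a class where most students score below the 25th percentile would be flagged for a core-instruction (Tier I) problem before individual interventions are considered.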

Who Needs a Tier II or Enrichment?

  • Winter Benchmark for ORF:

    • 90th percentile – 153

    • 25th percentile – 72

  • Winter Benchmark for Maze:

    • 90th percentile – 25

    • 25th percentile – 9

  • Instructional level criteria

    • For contextual reading – 93-97% correct

    • For most other academic skills – 85-90% correct
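The instructional-level bands above can be expressed as a quick check. `instructional_match` is a hypothetical helper; the band edges come straight from the criteria on this slide, while the "frustration"/"independent" labels are standard curriculum-based assessment terms assumed here.

```python
def instructional_match(pct_correct, contextual_reading=True):
    """Classify material difficulty against instructional-level criteria:
    93-97% correct for contextual reading, 85-90% for most other skills."""
    lo, hi = (93, 97) if contextual_reading else (85, 90)
    if pct_correct < lo:
        return "frustration level - material too hard"
    if pct_correct > hi:
        return "independent level - material too easy"
    return "instructional level"
```

A student reading connected text at 79% accuracy is at frustration level, so intervention materials should be stepped down until accuracy lands in the 93-97% band.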

[Slide graphic: sample student scores – 26/98%, 26/79%, 68/95%, 08/80% (score / accuracy)]
Examining Students at the Margins

  • Winter Benchmark for ORF:

    • 90th percentile – 153

    • 25th percentile – 72

  • Winter Benchmark for Maze:

    • 90th percentile – 25

    • 25th percentile – 9

  • Instructional level criteria

    • For contextual reading – 93-97% correct


[Slide graphic: marginal student's scores – 11/100% (score / accuracy)]





Identifying Who Needs Tier III

  • Winter Benchmark for ORF:

    • 25th percentile – 72

    • 10th percentile – 44

  • Winter Benchmark for Maze:

    • 25th percentile – 9

    • 10th percentile – 6

  • Instructional level criteria

    • For contextual reading – 93-97% correct

[Slide graphic: candidate students' scores – 46/76%, 6/80%, 42/83%, 5/75% (score / accuracy)]

Linking Assessment to Interventions….

  • Research has shown that effective interventions have certain features in common:

    • Correctly targeted to the student’s deficit

    • Appropriate level of challenge (instructional range)

    • Explicit instruction in the skill

    • Frequent opportunities to practice (respond)

    • Provide immediate corrective feedback

      (e.g., Brown-Chidsey & Steege, 2010; Burns, Riley-Tillman, & VanDerHeyden, 2013; Burns, VanDerHeyden, & Boice, 2008)

Academic Instruction in Reading

  • Both NCLB and IDEA require that instruction in the general education setting cover all 5 areas of reading identified by the National Reading Panel

Phonological Awareness

  • A metacognitive understanding that words we hear have internal structures based on sound

    • Research on PA has shown that it exerts an independent causal influence on word-level reading (Berninger & Wagner, 2008)

    • Phoneme – the smallest unit of speech

      • The English language has 44-46 phonemes

Phonics

  • Alphabetic principle - Linking phonological (sound) and orthographic (symbol) features of language (Joseph, 2006)

    • Important for learning how to read and spell

      • National Reading Panel – students with explicit AP instruction showed benefits through the 6th grade

    • Phonological awareness is a prerequisite skill

  • Word identification: the instance when a reader accesses one or more strategies to aid in reading words (e.g., applying phonic rules or using analogies)

    • Decoding – blending sounds in words or using letters in words to cue the sounds of others in a word (Joseph, 2006)

  • Word recognition: the instant recall of words or reading words by sight; automaticity

Fluency

    • “The ability to read a text quickly, accurately, and with proper expression” (NRP, 2000, p. 3-5)

    • Most definitions of fluency include an emphasis on prosody – the ability to read with correct expression, intonation and phrasing (Fletcher et al., 2007)

    • National Reading Panel – good reading fluency skills improved recognition of novel words, expression during reading, accuracy, and comprehension

Vocabulary & Text Comprehension Skills

    • Vocabulary knowledge – including understanding multiple meanings of words, figurative language, etc.

    • Identifying stated details

    • Sequencing events

    • Recognizing cause and effect relationships

    • Differentiating facts from opinions

    • Recognizing main ideas – getting the gist of the passage

    • Making inferences

    • Drawing conclusions

What Would Assessment at Tier II Look Like?

So you have identified your “at-risk students” – now what?

    • You will need to conduct Survey Level Assessment (SLA) for these students

    • Survey Level Assessment (SLA)

      • Can be used to: (a) provide information on the gap between prior knowledge and skill deficits, to plan instructional interventions, & (b) serve as a baseline for progress monitoring

Why is it important to conduct Survey Level Assessments before beginning Tier II interventions?

    • The primary question being addressed by the survey level assessment at Tier II is

      • “What is the CATEGORY of the problem”

      • (What is the specific area of academic deficit?)

        (e.g., Riley-Tillman, Burns, Gibbons, 2013)

An Example of Survey Level Assessment Using DIBELS

    1) Start at student’s grade level

    2) Test backwards by grade until the student has reached the “low risk” benchmark for a given skill

    • Low risk / established indicates the student has “mastered” that skill
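The backward-testing procedure can be sketched as a short loop. Here `administer_probe` and the cutoff table are stand-ins for actually giving the DIBELS probe at each grade and for the published low-risk benchmarks; the names are illustrative.

```python
def survey_level_assessment(start_grade, administer_probe, low_risk_cutoffs):
    """Test backward from the student's grade until a 'low risk' benchmark
    is met; return the highest grade level mastered (None if none).

    administer_probe(grade) -> score from the CBM probe at that grade.
    low_risk_cutoffs: dict of grade -> low-risk benchmark score.
    """
    for grade in range(start_grade, 0, -1):
        if administer_probe(grade) >= low_risk_cutoffs[grade]:
            return grade  # skill is mastered at this level
    return None
```

The returned grade marks where instruction is solid; the deficit category lies in the skills above that level.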

For example….. in reading

  • Low comprehension & low fluency =

    • comprehension intervention

  • Low comprehension + low fluency, but adequate decoding =

    • fluency intervention

  • Low comprehension + low fluency + low decoding, but adequate phonemic awareness skills =

    • decoding intervention

      Riley-Tillman et al. (2013)
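These decision rules amount to intervening on the most foundational deficient skill. A minimal sketch, with illustrative argument names (the slide's rules, restated in code, not an official algorithm from Riley-Tillman et al.):

```python
def reading_intervention(comprehension_low, fluency_low, decoding_low):
    """Map a survey-level skill profile to an intervention category by
    targeting the most foundational skill that is deficient."""
    if comprehension_low and fluency_low and decoding_low:
        return "decoding intervention"
    if comprehension_low and fluency_low:
        return "fluency intervention"
    if comprehension_low:
        return "comprehension intervention"
    return "no reading intervention indicated"
```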

Let’s look at Michael, a 2nd grade student

    • At the fall benchmark, he was identified on ORF as being in the “some risk” range.

      • His score was 30 wcpm

  • Survey level assessments were conducted using:

    • DORF 1st grade – (fluency)

    • DNWF 1st grade – (decoding)

    • DPSF 1st grade – (phonemic awareness)

    Michael’s Scores

    DIBELS Scores Representing Skills Mastery

    • DORF – 35 wcpm

    • DNWF – 28 scpm

    • DPSF – 38 pcpm

What next….

    • You link your assessment data to an intervention that targets the category of skill deficit that was identified

    • You select progress monitoring probe(s) that assess that skill

    • You set the student’s goal for improvement

      • You can use ROI & Gap Analysis Worksheets to help with this
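The goal-setting arithmetic behind the ROI worksheet reduces to one line: the gap between the benchmark and the baseline, spread over the weeks remaining. The numbers in the example are illustrative, not values from Michael's actual worksheet.

```python
def target_roi(baseline, benchmark, weeks):
    """Weekly rate of improvement (ROI) needed to reach the benchmark
    by the goal date, given the current baseline score."""
    return (benchmark - baseline) / weeks
```

For instance, a student at 30 wcpm aiming for a 72 wcpm benchmark in 28 weeks needs (72 - 30) / 28 = 1.5 wcpm of growth per week.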

What progress monitoring is not…

    • It is NOT an instructional method or intervention

    • Think of progress monitoring as a template that can be laid over goals and objectives from an assortment of content areas

Does a student require Tier III intervention?

    • Step 1: Need to check to see if the data can be interpreted

      • A minimum of 8-10 data points (if progress monitoring every other week) OR 10-15 data points (if progress monitoring weekly) is needed to make a data-based decision to change to Tier III.

    • Step 2: Examine Rate of Improvement

      • You can compare the student’s actual ROI to the goal that was established

      • You can use the ROI worksheets

    • Let’s complete one for Michael
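The Step 2 comparison can be sketched in plain Python: estimate the attained ROI as the ordinary-least-squares slope of the weekly progress-monitoring scores, then compare it with the goal ROI. The helper names are illustrative.

```python
def attained_roi(weekly_scores):
    """OLS slope of weekly progress-monitoring scores (e.g., wcpm/week),
    a common estimate of a student's actual rate of improvement."""
    n = len(weekly_scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(weekly_scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, weekly_scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def responding(actual_roi, goal_roi):
    """True when the student's growth is keeping pace with the goal."""
    return actual_roi >= goal_roi
```

A student gaining one word per week against a 1.5 wcpm/week goal would not be considered responsive, pointing toward Tier III.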

    [ROI worksheet for Michael: values 13 and 1.44 shown on slide]
    You also can visually analyze the graphed progress monitoring data

    • Calculate the trend line of the intervention data points and compare it to the aim (goal) line.

      • If the slope of the trend line is less than the slope of the aim line, the student may need to be moved to Tier III.

        • Especially if it appears that, given the student’s current ROI, they will not meet year-end grade-level standards

Dual Discrepancy

    • A student should be deficient in level and have a poor response to evidence-based interventions (slope), to the degree that he/she is unlikely to meet benchmarks in a reasonable amount of time without intensive instruction, in order to move:

      • from Tier II to Tier III, as well as from Tier III to referral for a comprehensive special education evaluation.

      • (e.g., Brown-Chidsey & Steege, 2008; Lichtenstein, 2008)
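The dual-discrepancy rule is the conjunction of two comparisons, one on level and one on slope. The benchmarks and expected ROI must come from local norms or published growth rates, so this is only a sketch with illustrative inputs:

```python
def dual_discrepancy(student_level, benchmark_level, student_roi, expected_roi):
    """True when the student is BOTH below benchmark in level AND growing
    more slowly than expected (slope) - the dual-discrepancy criterion."""
    low_level = student_level < benchmark_level
    poor_slope = student_roi < expected_roi
    return low_level and poor_slope
```

Note that a low-level student with a strong slope does not meet the criterion: the intervention is working and should continue.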

Specific Level Assessment

    • Functional analysis of skills

      • Are used to:

        • (a) identify specific skills deficits;

        • (b) identify students’ prior knowledge; &

        • (c) serve as baseline for progress monitoring

      • Specific level assessments rely primarily on subskill mastery measures.

        • “drill down” to specific deficits


    R – Review

    I – Interview

    O – Observe

    T – Test

    I – Instruction

    C – Curriculum

    E – Environment

    L – Learner

    Functional Analysis

    RIOT/ICEL Matrix

    Linking Assessment Data to Intervention at Tier III

    [Slide diagram: matching the student, the task, and the instruction – Match = Success]

    • The learner

      • focus on alterable learner variables

      • identify academic entry level skills

    • The task

      • level of the material the student is expected to master

    • The instruction

      • research-based methods and management strategies used to deliver curriculum

    Targets for academic instructional materials
    Targets monitoring datafor Academic Instructional Materials

    • Instructional level

    • contextual reading – 93-97% correct

    • other academic skills – 85-90% correct

      • Produce larger gains more quickly

        Gravois, T. A., & Gickling, E. E. (2008). Best practices in instructional assessment. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology (5th ed., pp. 503–518). Bethesda, MD: National Association of School Psychologists.

Phonemic Awareness Hierarchy

    • Daly, Chafouleas, & Skinner (2005)

Let’s look at Michael again…..



    • Specific Level Assessment –

    • Phonics:

      • Decoding Skills Test

      • Developmental Spelling Analysis

  • Sight words:

    • Graded word list

  • Phonemic Awareness:

    • LAC 3

Linking specific level assessment data to interventions….

    • Basing interventions on direct samples of student’s academic skills has been shown to result in larger effect sizes than interventions derived from other data

      • This is also known as a skill by treatment interaction

      • Burns, Codding, Boice, & Lukito (2010)

Level

    • Central location of data within a phase

      • often compared to benchmark (goal/aim line)

      • can also look at the mean or median for each phase

        • (e.g., Daly III et al., 2010; Hixson et al., 2008; Riley-Tillman & Burns, 2009)

      • Can conduct a Gap Analysis using the worksheet

Slope/Trend

    • How the central location changes over time

      • With academic data we are usually looking for an increase in skills

      • The target student’s ROI can be compared with a peer group’s ROI or a benchmark

        (e.g., Daly III et al., 2010; Hixson et al., 2008; Riley-Tillman & Burns, 2009)

2 approaches for analyzing slope

    • Calculate ROI and compare to an identified peer group using the ROI worksheet

    • Plot the trend line and compare the aim (goal) line to the slope (trend) line

Variability

    • Should be examined both within and between phases

      • General rule – most of the variability in the data should be explained by the trend line

        • 80% of the data points should fall within 15% of the trend line
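This rule of thumb can be checked mechanically: fit an OLS trend line and count how many points fall within 15% of the predicted value. A minimal sketch, assuming scores are collected at evenly spaced weekly intervals:

```python
def trend_explains_variability(scores, pct_band=0.15, required=0.80):
    """Check whether >= `required` of the points fall within `pct_band`
    of the OLS trend line fitted to equally spaced scores."""
    n = len(scores)
    xs = list(range(n))
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    # Count points whose residual is within 15% of the trend-line value.
    within = sum(1 for x, y in zip(xs, scores)
                 if abs(y - (intercept + slope * x))
                 <= pct_band * (intercept + slope * x))
    return within / n >= required
```

When this check fails, the trend line should not be trusted for decision making; collect more data or examine what is driving the variability.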

Deciding to refer for SLD evaluation

    • As part of the team’s decision to refer for an SLD evaluation, a Gap Analysis should be conducted

    • Let’s look at how to complete the Gap Analysis worksheet with Michael
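A common worksheet convention computes the gap as the benchmark divided by the student's current performance, with ratios at or above 2.0 often treated as significant. Both the convention and the cutoff here are illustrative assumptions, not values prescribed by the presentation.

```python
def gap_ratio(current, benchmark):
    """Gap between a student's current score and the benchmark,
    expressed as a ratio (benchmark / current)."""
    return benchmark / current

def significant_gap(current, benchmark, cutoff=2.0):
    """True when the gap ratio meets or exceeds the cutoff
    (2.0 is a commonly used but illustrative threshold)."""
    return gap_ratio(current, benchmark) >= cutoff
```

For example, a student reading 30 wcpm against a 72 wcpm benchmark has a gap ratio of 72 / 30 = 2.4, which would flag a significant gap under this convention.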

Gap Analysis




Conducting a Gap Analysis

    • Step 2: [Gap Analysis worksheet shown on slide]
Additional Consideration

SEM

    • Additionally, we cannot ignore issues such as interpreting CBM scores in light of the SEM or confidence intervals (CI) when those scores are used for purposes such as diagnosis and eligibility determinations

      • For more detailed discussion including suggested SEM guidelines for oral reading fluency scores in grades 1-5 see:

        • Christ, T. J., & Silberglitt, B. (2007). Estimates of the standard error of measurement for curriculum-based measures of oral reading fluency. School Psychology Review, 36, 130–146.
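Applying the SEM is a one-line calculation: a confidence interval around the observed score. The SEM value itself must come from published estimates such as Christ & Silberglitt (2007); the numbers below are illustrative only.

```python
def score_ci(observed, sem, z=1.96):
    """Confidence interval around an observed CBM score given its standard
    error of measurement (z = 1.96 gives an approximate 95% interval)."""
    return (observed - z * sem, observed + z * sem)
```

For example, an observed ORF score of 60 wcpm with an SEM of 5 yields a 95% interval of roughly 50 to 70 wcpm, a range wide enough to matter when the cut score for eligibility falls inside it.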

Use of Progress Monitoring in Special Education

    • Because CBM data

      • can be directly tied to the skill development necessary to be successful in the curriculum,

      • possess a higher level of sensitivity, and

      • allow for graphic representation,

      • they allow for the development of a higher quality IEP

    • Progress monitoring should continue after the IEP is initiated

    • Exit criteria can be set to determine if early reevaluation can be completed due to student success.

Helpful Resources

Helpful Resources from NASP

Additional Helpful Resources

    • Guilford Press