
Response to Intervention: Using Data to Enhance Outcomes for all Students

Amanda VanDerHeyden

Education Research and Consulting, Inc.


Objectives Today

  • Overview of RTI, RTI decision making, and expected outcomes

  • Specific How-To for RTI

    • Interpreting Assessment Data to determine need for Tiers 1, 2, and 3 Interventions

    • Selection and implementation of Tier 2 Interventions

    • Selection and implementation of Tier 3 Interventions

  • Implementing intervention for sustenance and system change


16 x 3 = 48 hours


What is RTI?

  • A science of decision making and way of thinking about how educational resources can be allocated (or reallocated) to best help all children learn

  • Major premium on child outcomes


RTI is Not

  • A program, a curriculum, an intervention, a particular model


Data allow us to

  • Provide faster, more effective services for ALL children

  • Work “smarter,” not harder; better utilize the talents of the school psychologist and school-based assessment and intervention teams

  • Make implementation SIMPLE and EASY for teachers (low cost, few errors)

  • Prevent diagnosis


Early Screening Identifies Children At Risk of Reading Difficulty

[Chart: reading grade level plotted against the grade level corresponding to age (grades 1-4). Children at low risk on the early screening reach about a 5.2 reading grade level; children at risk on the early screening reach only about 2.5. Slide from Reading First experts.]


Early Intervention Changes Reading Outcomes

[Chart: reading grade level plotted against the grade level corresponding to age (grades 1-4). Children at low risk on the early screening reach about 5.2. Children at risk reach about 4.9 with substantial instructional intervention, about 3.2 with a research-based core but without extra instructional intervention, and about 2.5 without intervention. Slide from Reading First experts.]


Evolution

  • “Wait to Fail”

    • Let’s provide services early!

  • Costly sp ed programs not improving learning

    • Let’s shift resources to provide services in less restrictive setting!

  • Increasing numbers of children struggling in general ed

    • Let’s provide help in general education!

  • Traditional measures are de-contextualized and the constructs are problematic

    • Let’s help children who struggle academically by measuring performance in response to certain intervention strategies and then deliver what works!


Rationale for System Change

  • Level and Rate of Performance

  • Return to General Education

  • Lack of Certified Teachers

  • No demonstrated instructional techniques that differentially benefit SLD

  • Drop-out

  • Disproportionate Representation by Ethnicity

200-300% increase in SLD


History of RTI

  • Effective instruction literature; CBA/CBM literature

  • Lab-Quality Intervention Programs

  • Progress Monitoring Data and Problem-Solving Models

  • Reading First data


Consistent with

  • NCLB

  • Reauthorized IDEA

  • Recommendations of panel reports on: Minority students in special education, National Reading Panel, Science and Math Initiative


Impetus

  • Faster, more effective services for ALL children

  • Work “smarter,” not harder; better utilize the talents of the school psychologist and school-based assessment and intervention teams

  • Make implementation SIMPLE and EASY for teachers (low cost, few errors)


Why RTI?

  • Viable alternative to traditional diagnosis of high-incidence disabilities, particularly Learning Disability (LD)

  • Reauthorized IDEA guidelines for identifying LD state that:

    a) A severe discrepancy between achievement and intellectual ability shall not be required

    b) A response to intervention (RTI) may be considered


Why RTI?

  • RTI can address the problem of disproportionate identification of children with LD by race and gender

  • The utility of curriculum-based measures (CBM) has been established for:

    • Identifying children not likely to benefit from the general education curriculum without assistance

    • Predicting important long-term outcomes

    • Tracking individual student growth and informing instructional programming changes


Considerations

  • There has been some consensus concerning the need for change; however, there has not been consensus on how this change can best be achieved.

  • Whereas RTI has considerable promise as a tool within special and general education, it is a vulnerable construct if misapplied.


Improved Treatment Validity

  • Direct link to treatment or consequential validity

  • Efforts are focused to:

    • Properly articulate a concern

    • Develop targeted intervention to resolve the concern

    • Collect information to determine whether or not the concern has been adequately addressed or whether different solution efforts need to be implemented

  • This approach changes the goal of assessment from what some have described as “admiration” of the problem to problem-solving.


Contextualized Decision Making

  • RTI emphasizes the pre-referral conditions (child and environment) and this context becomes part of the decision-making equation.

  • Allows practitioners to quantify:

    • The state of instructional affairs in the child’s regular education environment

    • Potential learning given optimal instructional conditions

  • RTI may enable improvement in general education programming leading to children receiving assistance in a more efficient manner.


Improved Identification Accuracy for LD

  • Under RTI models intervention becomes a specified, operationalized variable, thus false positive identification errors should be reduced dramatically.

  • Removing the current reliance on teacher identification and requiring direct measures of child performance in context will enhance identification accuracy.


More Effective Intervention

  • RTI is likely to facilitate less restrictive interventions and placements for children.

  • RTI allows school psychologists to bring their expertise to bear on assessment strategies at the classroom level and assist teachers to use data formatively to enhance their instructional programming.



Decision-making Criteria

  • Must be operationalized and validated through research

  • The purpose of RTI will be critical to determining how implementation should proceed


New Challenges for Teams

  • Effective intervention delivery will depend on relevant intervention variables

  • To be effective the intervention must have been:

    • Properly identified

    • Implemented with integrity and with sufficient frequency, intensity, and duration


STEEP Model

Screening to Enhance Educational Progress


Tier 1: Math Screening

  • Math Probe:

    • Group administered.

    • Materials: Worksheet consisting of a series of problems sampling the target skill(s) (e.g., sums to 5, double digit multiplication with regrouping).

    • Timing: 2 minutes

    • Information obtained: digits correct in two minutes.


Math Probe Example

  • Total Digits: 38

  • Errors: 5

  • Digits Correct: 33
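The scoring arithmetic above is simple to automate. Below is a minimal sketch, assuming the score is reported as digits correct in two minutes (total digits attempted minus digit errors); the function name is illustrative and not part of STEEP.

```python
# Minimal sketch of scoring a timed CBM math probe.
# Assumes the score is digits correct in two minutes:
# total digits attempted minus digit errors.

def score_math_probe(total_digits: int, errors: int) -> int:
    """Return digits correct for one 2-minute math probe."""
    if errors > total_digits:
        raise ValueError("errors cannot exceed total digits attempted")
    return total_digits - errors

if __name__ == "__main__":
    # Example from the slide: 38 total digits, 5 errors -> 33 digits correct.
    print(score_math_probe(total_digits=38, errors=5))  # 33
```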


Tier 1: Writing Screening

  • Writing Probe:

    • Group administered.

    • Materials: story starter (e.g., If I had a million dollars…) printed at the top of a blank page.

    • Timing: 1 minute to think, 3 minutes to write.

    • Scoring: words written or correct word sequences in three minutes.


Writing Example


Tier 1: Reading Screening

  • Reading Probe:

    • Individually administered

    • Materials: A content-controlled reading passage.

    • Procedure: The student reads aloud as the teacher listens and records errors.

    • Timing: 1 minute

    • Information obtained: words read correctly in one minute.


CBM Reading: Sample Scoring

  • TRW=63

  • Errors=6

  • CRW=58


Class-wide Screening


Feedback to Teachers




Tier 2: Can’t Do/Won’t Do Assessment

  • “Can’t Do/Won’t Do”

  • Individually-administered

  • Materials

    • Academic material that the student performed poorly on during the class-wide assessment.

    • Treasure chest: plastic box filled with tangible items.

3-7 minutes per child





Response to Intervention

[Chart: one student's intervention in reading. Number correct is plotted before and during intervention against the average for his class; each dot is one day of intervention.]


Response to Intervention

[Chart: number correct before and during intervention, compared to the average for his class.]


Vehicle for System Change: System-wide Math Problem

[Chart: each bar is a student's performance, marked as falling in the instructional range or the frustrational range.]



Rest of Grade at Standard

[Chart: performance shown by classroom, A through F.]




Class-wide Intervention

[Chart: Teacher F, multiplication 0-12. Digits correct in two minutes (0-120) plotted weekly from 10/24/2003 through 11/18/2003.]




Growth Obtained

[Chart: actual growth plotted against the aimline.]


Effect on High-Stakes Scores

VanDerHeyden, in prep


Effect on High-Stakes Scores

VanDerHeyden, in prep


District-wide Implementation Data

  • Vail Unified School District

    • www.vail.k12.az.us

  • Three years of system-wide implementation of STEEP in grades 1-8


System Outcomes

  • Referrals reduced by more than half

  • Percentage of referred children who qualified rose from a stable 50% baseline over three years to nearly 100%

  • SLD down from 6% of children in the district in 2001-2002 (with an upward baseline trend) to 3.5% in the 2003-2004 school year

  • Corresponding gains on high-stakes tests (VanDerHeyden & Burns, 2005)

  • Intervention successful for about 95 to 98% of children screened

VanDerHeyden, Witt, & Gilbertson, 2007


Cost Reduction

VanDerHeyden, Witt, & Gilbertson, 2007


Findings

  • Number of evaluations dramatically reduced (by 70% at the highest-referral school)

  • Diverse settings; psychologists of diverse backgrounds with no prior experience with CBM or functional academic assessment

  • Percentage who qualified increased at 4 of 5 schools

  • Disproportionate representation of males positively affected

  • Number of children placed dramatically reduced

VanDerHeyden, Witt, & Gilbertson, 2007


Team Decision-Making Agreement

VanDerHeyden, Witt, & Gilbertson, 2007


Team Decision-Making

VanDerHeyden, Witt, & Gilbertson, 2007


Fall to Spring Reading Growth

VanDerHeyden & Witt, 2005


What Proportion of Ethnicity Represented Before and After Intervention in Risk Category?

VanDerHeyden & Witt, 2005


[Table: RTI decision (PVS + / PVS -) cross-tabulated against criterion status (Criterion + / Criterion -).]

Sensitivity = .76, Specificity = .89, Positive predictive power = .59, Negative predictive power = .95

+LR = sensitivity / (1 - specificity) = .76/.11 = 6.9: children with a learning problem are about 7 times more likely to have a failed RTI than children without a learning problem.

-LR = (1 - sensitivity) / specificity = .24/.89 = .27: children without a learning problem are about 3.7 times more likely to have a successful RTI than children with a learning problem.
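The likelihood ratios above follow directly from sensitivity and specificity, so they are easy to verify. A minimal sketch in Python, using only the values reported on the slide (the helper function is illustrative, not part of the original analysis):

```python
# Minimal sketch: positive and negative likelihood ratios from the
# sensitivity and specificity reported on the slide for the RTI decision.

def likelihood_ratios(sensitivity: float, specificity: float) -> tuple:
    """Return (LR+, LR-) for a binary decision rule."""
    lr_plus = sensitivity / (1.0 - specificity)    # LR+ = sens / (1 - spec)
    lr_minus = (1.0 - sensitivity) / specificity   # LR- = (1 - sens) / spec
    return lr_plus, lr_minus

if __name__ == "__main__":
    lr_plus, lr_minus = likelihood_ratios(sensitivity=0.76, specificity=0.89)
    print(round(lr_plus, 1), round(lr_minus, 2))   # 6.9 0.27
```

Running the same function with the screening-probe values on the next slide (sensitivity .94, specificity .54) reproduces its reported ratios of roughly 2.0 and .11.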


[Table: screening probe result (Probe + / Probe -) cross-tabulated against criterion status (Criterion + / Criterion -).]

Sensitivity = .94, Specificity = .54, Positive predictive power = .29, Negative predictive power = .98

+LR = .94/.46 = 2.0: children with a learning problem are about 2 times more likely to perform below the screening benchmark than children without a learning problem.

-LR = .06/.54 = .11: children without a learning problem are about 9 times more likely to have an above-criterion screening score than children with a learning problem.


Identification Accuracy

  • High-achieving classrooms (<20%)

    • Procedures paired with RTI criterion were more accurate than other commonly used screening devices

  • Low-achieving classrooms (>50%)

    • Procedures paired with RTI criterion were more accurate than other commonly used screening devices

VanDerHeyden & Witt, 2005


Questions

  • Is there a classwide problem?

  • Is there a gradewide problem?

  • What’s the most efficient way to deliver intervention?


Screening tells you

  • How is the core instruction working?

  • What problems might exist that could be addressed?

  • Most bang-for-the-buck activity

  • Next most high-yield activity is classwide intervention at Tier 2.


Screening Guidelines

  • Efforts at Tier 1 pay off with fewer children needing individual intervention

  • 3 times per year, single probe

  • Use small team of trained coaches

  • Prepare all needed materials in a packet for each teacher

  • Score and return results on a graph within 1 week

  • Use the data to generate aimlines; the same data can be used to set benchmarks (see the sketch below)
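As an illustration of the aimline idea in the last guideline, here is a minimal sketch that draws a straight goal line from a fall baseline to an end-of-year target. The baseline of 40 and the target of 95 words correct per minute are assumptions for the example, not district benchmarks.

```python
# Minimal sketch of generating an aimline from screening data. The baseline,
# goal, and number of weeks below are illustrative, not district benchmarks.

def aimline(baseline: float, goal: float, weeks: int) -> list:
    """Expected score at each weekly check if growth follows a straight line."""
    weekly_gain = (goal - baseline) / weeks
    return [round(baseline + weekly_gain * week, 1) for week in range(weeks + 1)]

if __name__ == "__main__":
    # e.g., 40 words correct/min at the fall screening, aiming for 95 by week 30
    print(aimline(baseline=40, goal=95, weeks=30)[:5])  # [40.0, 41.8, 43.7, 45.5, 47.3]
```

In typical progress-monitoring practice, monitoring scores that fall below the aimline for several consecutive points signal that the instruction or intervention needs to change.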


Here is 3rd grade at Cottonwood. Each circle corresponds to a student's score on a reading CBM probe in March and the AIMS reading score from the same month. So the circle near the blue lines is a child who read 158 wc/min and scored 486 on the AIMS Reading. The diagonal line represents the “best fit” line, the line closest to all the circles. This shows there is a strong positive correlation between CBM and AIMS reading scores for this group.


How to Set a Benchmark

This is words read correctly per minute. You move this line up and down to “catch” as many of those who will not pass as possible.

431 = “pass” for AIMS. This line does not move


Setting 95 wc/min as the “pass” standard for CBM

These are children who were predicted to pass AIMS based on CBM and did pass. “Hits”

These are the children predicted to pass AIMS who actually failed. “False Negative Errors.” The worst kind of error.

These are the children predicted to fail AIMS who actually passed. “False Positive Errors.”

These are the children who were predicted to fail AIMS based on CBM who did fail. “Hits”


Moving the horizontal line up will “catch” two more cases who failed the AIMS, but will result in many more “false positive errors.”

Thus, 95 wc/min at 3rd grade at Cottonwood is a good standard that will tell you which children are likely to fail the AIMS reading section.
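The benchmark-setting logic just described, sliding the cut line to catch as many eventual AIMS failures as possible without flooding the list with false positives, amounts to sweeping candidate cut scores and counting the resulting hits and errors. A minimal sketch follows; the 431 AIMS pass score comes from the slides, while the (CBM, AIMS) pairs are invented purely for illustration.

```python
# Minimal sketch of the benchmark-setting logic described above: sweep candidate
# CBM cut scores and count hits and errors against the AIMS pass score of 431.
# The (cbm, aims) pairs are invented for illustration; real data would pair the
# spring screening score with the same student's AIMS score.

AIMS_PASS = 431

students = [(158, 486), (120, 455), (96, 438), (92, 425), (80, 410), (60, 395)]

def evaluate_cut(cut, records):
    """Tally decisions for one candidate cut score (predict AIMS failure if CBM < cut)."""
    counts = {"hits": 0, "false_negatives": 0, "false_positives": 0}
    for cbm, aims in records:
        predicted_fail = cbm < cut
        actually_failed = aims < AIMS_PASS
        if predicted_fail == actually_failed:
            counts["hits"] += 1
        elif actually_failed:            # predicted to pass but failed: the worst error
            counts["false_negatives"] += 1
        else:                            # predicted to fail but passed
            counts["false_positives"] += 1
    return counts

if __name__ == "__main__":
    for cut in (85, 95, 105):
        print(cut, evaluate_cut(cut, students))
```

The ROC printout on the next slide is the formal version of this sweep: it reports sensitivity against the false positive rate for every candidate cut score.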


ROC printout


Mastery Model Measurement (CBA)

Letter naming fluency

Isolated sound fluency

Beginning sound fluency

Ending sound fluency


General Outcome Measurement (CBM)

Beginning sound fluency

Letter naming fluency

Isolated sound fluency

Ending sound fluency

Words read correctly per minute


Tracking Year-Long Growth

[Chart: student scores tracked across the year against an aimline, with mastery and instructional ranges marked.]


[Chart: digits correct in two minutes plotted across weeks 1-12, with a target line marking the level needed to pass the AIMS.]


Academic Systems and Behavioral Systems

[Diagram: the three-tier triangle, applicable to any curriculum area, with parallel academic and behavioral sides.]

  • Intensive, Individual Interventions (1-5% of students): individual students; assessment-based; high intensity and longer duration on the academic side, intense and durable procedures on the behavioral side

  • Targeted Group Interventions (5-10% of students): some students (at-risk); high efficiency; rapid response

  • Universal Interventions (80-90% of students): all students, all settings; preventive, proactive

Dave Tilly, 2005


“Weighing a cow doesn’t make it fatter.”


[The three-tier Academic Systems / Behavioral Systems diagram appears again (Dave Tilly, 2005).]


Class-wide Intervention

  • Use peer-paired practice (classwide peer tutoring, PALS)

  • Model, Guided Practice, Independent timed practice with delayed error correction, reward contingency


Instructional Hierarchy

  • Acquisition: to gain the steepest growth, introduction of new skills should happen here (core instruction; not manipulated)

  • Fluency: fluency building should happen here with an instructional-level skill (the intervention focus was here)

  • Generalization: finally, problem-solving/application practice should occur here with a mastery-level skill (core instruction; not manipulated, but could be)


Reading classwide intervention


Select a Few Good Interventions to Keep it Simple


Intervention Plan: 15 Min per Day

  • Protocol-based classwide peer tutoring, randomized integrity checks by direct observation

  • Model, Guided Practice, Independent Timed Practice with delayed error correction

  • Group performance contingency

  • Teachers encouraged to

    • Scan papers for high error rates

    • Do 5-min re-teach for those with high-error rates

    • Provide applied practice using mastery-level computational skill


Sample Sequence


Intervention Plan

  • When the class median reaches the mastery range for a skill, the next skill is introduced (see the sketch below)

  • Following promising results at one site in 2002-2003, implementation expanded district-wide to grades 1-8 for all children by 2004-2005.
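A minimal sketch of the advancement rule in the first bullet; the mastery cutoff of 40 digits correct in two minutes and the skill sequence are assumptions for illustration, since actual mastery ranges depend on grade and skill.

```python
# Minimal sketch of the class-wide advancement rule: introduce the next skill
# once the class median for the current skill reaches the mastery range.
# The mastery cutoff and skill sequence are assumed for illustration only.
from statistics import median

MASTERY_CUTOFF = 40  # digits correct in two minutes; illustrative, not a real benchmark

def next_skill_index(scores, skills, current):
    """Return the index of the skill the class should practice next week."""
    if median(scores) >= MASTERY_CUTOFF and current + 1 < len(skills):
        return current + 1   # class median at mastery: introduce the next skill
    return current           # otherwise keep practicing the current skill

if __name__ == "__main__":
    sequence = ["sums to 12", "subtraction 0-12", "multiplication 0-12"]
    this_week = [44, 39, 51, 42, 47, 38, 45]          # hypothetical class scores
    print(next_skill_index(this_week, sequence, 0))   # 1 -> "subtraction 0-12"
```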


Class-wide Math Intervention


With teacher support

  • Consider time, resources, materials

  • Remove skill barriers with

    • classroom training for students

    • classroom coaching for teachers

  • Remove implementation barriers after the new steps are in use, with

    • follow-up supportive meetings to problem solve.

    • frequent acknowledgment of a teacher’s efforts


Training Package

Tell

  • Rationale

  • Step-by-step protocol

Show

  • Model

Do

  • Train students

  • Implement with guided practice

  • Implement independently with support


88% of interventions are not used without support

Decisions do not always correspond to data (someone must check)


Tier 2 Intervention Effects


Class 1 at Screening


Class 1: Following 10 Days Intervention


Class 1: Following 15 Days Intervention


Class 2 at Screening


Class 2: Following 5 Days Intervention


Class 2: Following 10 Days Intervention


Class 3 at Screening


Class 3: Following 5 Days Intervention


Following 10 Days Intervention


Tier 1 Screening Indicates Class-wide Problem


Tier 2: Class-wide Intervention

[Chart: Teacher F, multiplication 0-12. Digits correct in two minutes (0-120) plotted weekly from 10/24/2003 through 11/18/2003.]


Increased Difficulty: Intervention Continues


Contextually-Relevant Comparisons and Use of Trend Data


5th Grade Math Intervention


Pre-post changes to performance detected by CBM

[Chart: each bar is a student's performance, marked as falling in the instructional range or the frustrational range.]


Fourth Grade


Effect on SAT-9 Performance


Effect on CBM Scores


Computation Gains Generalized to High-Stakes Test Improvements (Gains within Multiple Baseline shown as pre-post data)


Gains within Multiple Baseline (shown as pre-post data)


Identification Accuracy

VanDerHeyden, et al., 2003


Percent Identified at each Tier

VanDerHeyden, et al., 2003


[The three-tier Academic Systems / Behavioral Systems diagram appears again (Dave Tilly, 2005).]


Tier 3

  • Assessment Data

    • Instructional level performance

    • Error analysis (high errors, low errors, pattern)

    • Effect of incentives, practice, easier task

    • Verify intervention effect

  • Same implementation support as Tier 2

  • Instructional-level materials; Criterion-level materials


Assessment Data


Assessment Data


Tier 3

  • Implement for 5-15 consecutive sessions with 100% integrity

  • Link to referral decision

  • Weekly graphs to teacher and weekly generalization probes outside of classroom, supply new materials

  • Troubleshoot implementation weekly


Lessons Learned

  • Most individual interventions for reading

  • Standard protocols with slight modifications are best

  • Most interventions are successful (should be successful)

  • Generalization must be attended to

  • Team will not follow data without support and training to do so

  • Coordinate intervention start times with principal and stagger start dates (10-15 at a time plus Tier 2’s).

  • Organize master schedule for data collection and Tier times


Tier 3 Intervention

  • >5% of children screened (total population) IF solid Tier 1

  • Possibly as low as 2% IF solid Tier 1 and Tier 2

  • About 1-2% failed RTI; 10% of most at-risk


Successful RTI


2004: 3rd Grade


Successful RTI


Successful Writing


Successful RTI


Measurement Error


Successful Math


Unsuccessful Math


Tips for Effective Implementation


Our Recipe for Intervention Success

PREPARE

Identify and use standard protocols for intervention

Develop all needed materials

Develop packets or put on a central web site

Determine graphing program


Our Recipe for Intervention Success

TRAIN

Explain

Watch the teacher do it with the actual child before you leave

Call or meet teacher after first day to problem solve


Our Recipe for Intervention Success

COLLECT DATA AND SUPPORT

Each week, graph intervention performance and do a generalization check with the child.

Graphed feedback to teachers with generalization checks for individual intervention once per week

Response-dependent performance feedback to sustain implementation accuracy

Monthly CBM to track growth and enhance existing Tier 1 Programs or advise new Tier 1

Data to principal weekly. Summarize effects and integrity of procedures.



Tracking Title Progress


Our Recipe for Intervention Success

USE DATA TO MAKE DECISIONS

RTI is successful if the child performs the criterion-level probe (from screening) in the instructional range. RTI is unsuccessful if, after 15 consecutive intervention sessions, the criterion probe is still not in the instructional range.

Increase task difficulty for intervention if the child scores at mastery on the task during intervention sessions (a decision-rule sketch follows below).
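A minimal sketch of these decision rules as a single weekly check; the instructional-range and mastery cutoffs are placeholders, since the actual values depend on the grade level and the measure.

```python
# Minimal sketch of the decision rules above. The instructional-range and
# mastery cutoffs are placeholders; actual values depend on grade and measure.

INSTRUCTIONAL_CUTOFF = 40   # criterion-level probe score counted as instructional range
MASTERY_CUTOFF = 80         # intervention-task score counted as mastery
MAX_SESSIONS = 15

def rti_decision(sessions_completed, criterion_probe_score, intervention_task_score):
    """Classify RTI status after the most recent weekly check."""
    if criterion_probe_score >= INSTRUCTIONAL_CUTOFF:
        return "RTI successful"
    if sessions_completed >= MAX_SESSIONS:
        return "RTI unsuccessful: link to referral decision"
    if intervention_task_score >= MASTERY_CUTOFF:
        return "continue: increase task difficulty"
    return "continue intervention"

if __name__ == "__main__":
    print(rti_decision(sessions_completed=10,
                       criterion_probe_score=28,
                       intervention_task_score=85))   # continue: increase task difficulty
```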


Infrastructure for Implementation

  • Grade-level planning periods can be utilized

  • Special education “team” at school can be utilized

  • School Psych must be on-site 1 day/week

  • Developing master schedule for Tier 1, 2, and 3 intervention times is useful

  • Integrate efforts with evaluation referral team efforts (consider major reduction in meeting time and shift to intervention efforts!)


Materials needed

  • Computer and software to organize data

  • Student data imported. Clerical person to enter data on-site for tier 1 screen only.

  • Color printer to print graphs + extra color cartridges

  • Probe materials, digital count-down timers

  • Intervention protocols, intervention materials (e.g., flashcard sets, reading materials)

  • Access to copier and some assistance with copying

  • Reinforcers for treasure chest (no more than $500 per school)


Guidelines for Implementers

Use single trial scores for screening

Following screening, grade-wide graphs to principal

Return data to teachers within 48 hours with personal interpretation at grade-level team meeting

Include principal in critical meetings

Involve teachers at all stages


Guidelines for Implementers

Learn about curriculum and instruction.

Integrate RTI with ongoing school and system reform efforts

Thoughtfully merge efforts to eliminate duplicate activities and to enhance any more comprehensive supplemental and core instructional support activities already in place

Use RTI data to evaluate the value of ALL instructional programs or resource allocation decisions. Quantify bang for the buck using student performance data.


Lessons Learned from Vail USD

Infrastructure for education being a results-based enterprise

Accountability

Principal is the Instructional Leader of a school

Principal as change agent

School psychologist as change agent

Replace resources/substitute don’t add

Minimize meetings

Track outcomes that matter


[Diagram: the principal at the center of competing demands: an upset parent, a tantruming child, Bus 11 is late, police on site for a child abuse report, a visit from the health department (the bathroom is out of paper towels again), a teacher out sick for the rest of the year, and a meeting at the district office with items due.]


Principal

[Diagram: the principal applies a FILTER (How much time is allocated to instruction? Are children actively engaged? Are standards introduced? Is effective instruction occurring?) so that data on learning, goal setting, teacher evaluation, and allocation of instructional resources take priority over the upset parent, the check on the health department, the check on the police interview, and so on.]


Great Instructional Leaders

Have a filter

Allocate time and resources according to their filter

Use an AIMLINE

Have a framework for making data-driven decisions (know how to access the data they need to reach timely decisions)

Hold teachers, staff, students accountable

Know the research findings on effective schooling


Great School Psychologists

Hand the principal the data the principal did not know to ask for but can’t live without

Follow the aimline and attend to implementation integrity

Understand the variables of effective instruction and engage in contextualized assessment that is technically valid for the purposes needed AND has treatment utility

Minimize meeting time and “avoid the science of strange behavior…”


Great Districts

Minimize time away from school, but use time together to review school improvement implementation efforts and ongoing results

Have the will to proactively chart the course of a district

Provide adequate resources and space for principals to be effective instructional leaders and hold them accountable for results

Respect the role of parents and actively engage them

Have a framework for evaluating results (know how to access data for decision making)

Evaluate quality of all programs locally and make decisions about continued use based on DATA.


Great Teachers

  • Use data to identify where more/different/less instruction is needed

  • Have as a goal to accelerate the learning of all children

  • Proactively address barriers to learning

  • Take responsibility for learning that occurs in the classroom

  • Are confident and ready to collaborate in the classroom

  • Appreciate childhood and children (a little humor, lots of patience, enthusiasm)


For More Information

[email protected]

www.isteep.com

Thank you to the US Dept of Education for providing all film clips shown in this presentation

