Evaluating the Technical Adequacy of FBAs and BIPs: How are Schools Doing?

Rose Iovannone, Ph.D., BCBA-D

[email protected]



Objectives

  • Participants will:

    • Describe the purpose of the Technical Adequacy evaluation tool

    • Apply a scoring rubric to case examples

    • Discuss further use of the evaluation in their settings

Advance Organizer

  • Essential Features of Tier 3 Behavior (FBA/BIPs)

  • Review of the Technical Adequacy Evaluation Tool and Rubric

  • Practice scoring

  • Discussion of how to use the tool in the future

Context for FBAs/BIPs

  • FBA/BIP—substantial evidence base

  • The behavioral ‘gold standard’ for nearly 20 years

  • Systemic and skill issues impeding implementation

  • Wealth of literature providing the evidence base

    • BUT, does not address the contextual fit of FBA in school culture (Scott & Kamps, 2007)

      • Educators’ willingness and ability to engage in process

      • Level and intensity of FBA necessary to result in improvements

  • Conceptually, FBA seen as tool for use in multi-tiered system of supports rather than separate process

    • If part of process, may change traditional definition of what and who is involved in FBA

Examples of the Problem

  • Forms vs. skills

    • “Let’s create new forms” common solution

  • Paperwork vs. implementation

  • General vs. individualized

  • Training vs. coaching

  • Expert vs. collaborative team model

  • Separate silos vs. integrated, consistent process

  • Legalities vs. problem-solving

The Top Twelve List of Things Needed at Tier 3/Individualized Behavior Supports (Iovannone & Kincaid, in prep.)

  • Multiple levels of Tier 3

  • Consistent, fluent process within a problem-solving framework

  • Collaborative teaming

  • Problem identification

  • Data collection, simplified

  • Linking hypothesis to the FBA

  • Linking BIP to hypothesis

  • Multi-component behavior intervention plan matched to classroom context

  • Task-analyzed strategies

  • Teacher and classroom coaching/support

  • Array of outcome measures (child-specific, teacher fidelity, social validity, alliance, fidelity of process, technical adequacy of products)

  • Maintenance (beyond “warranty”)

1. Multiple Levels of Tier 3 FBA

  • Three levels of Tier 3

  • Match the level of need to the student

    • Level 1: Classroom consultation (Facilitator and teacher)

      • Brief PTR

      • ERASE (Terry Scott)

      • Guess and Check (Cindy Anderson)

    • Level 2: Comprehensive support (e.g., PTR; team-based process)

    • Level 3: Wrap around with person-centered planning

  • Tier 3 most effective if Tiers 1 and 2 implemented with fidelity

2. Consistent Tier 3 Process

  • Standardized process for ALL students requiring FBAs/BIPs

  • Incorporates following features:

    • Identifying students needing Tier 3

    • Determining level of FBA support necessary to answer referral concern

    • Decision points

    • Timelines between FBA, BIP, Support, Follow-up

    • Data tracking system

    • Coaching and fidelity

  • Flowchart

2. Consistent Tier 3 Process—Problem Solving Process


  • What is the behavior of concern? What do we want to see less of? What do we want the student to do more of?

  • Functional Behavior Assessment hypothesis

  • Behavior strategies linked to hypothesis; coaching/support

  • Is the plan effective? What are the next steps?

3. Collaborative Teaming

  • Discontinue expert model – need proficient facilitator to guide team

  • Three levels of knowledge represented on teams

    • Knowledge of student

    • Knowledge of ABA principles

    • Knowledge of district/campus context

  • Consensus process established

4. Problem Identification

  • Primary problem with many ineffective FBA/BIPs is that the problem is not clearly identified:

    • Too general

    • Not defined

    • Baseline data confirming problem absent

    • Often, several behaviors listed and unclear which behavior was the focus of the FBA

    • Not uncommon to see behaviors of concern “change” throughout one FBA/BIP

  • Need to identify both the replacement behavior to increase as well as problem behavior to decrease—consider broad categories including academic, social, behavior

5. Simplify Data Collection

  • Progress monitoring must be:

    • Feasible

    • Reliable

    • Sensitive to change

    • Flexible to match individual

    • Standardized (comparable across schools/students/districts)

  • Direct Behavior Ratings (DBRs) offer a solution

    • Research supports their effectiveness (see Chafouleas, Riley-Tillman)

    • LEAP (Phil Strain)

    • Individualized Behavior Rating Scale (IBRST) used in PTR (Iovannone et al., in press).

Case Study – Mike: Behavior Rating Scale


BRS Psychometrics (Iovannone, Greenbaum, Wang, Kincaid, & Dunlap, in press)

  • Kappa coefficients of:

    • Problem Behavior 1 (n = 105): .82

    • Problem Behavior 2 (n = 90): .77

    • Appropriate Behavior 1 (n = 103): .65

    • Appropriate Behavior 2 (n = 56): .76
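
Cohen's kappa, the agreement statistic reported above, corrects raw inter-rater agreement for agreement expected by chance. A minimal sketch of the computation; the rating data below is hypothetical, not from the study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters independently pick the same category
    expected = sum(ca[c] * cb[c] for c in ca) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical: two raters scoring the same 8 daily behavior ratings (1-5 scale)
rater_a = [5, 4, 4, 3, 5, 2, 4, 3]
rater_b = [5, 4, 3, 3, 5, 2, 4, 3]
print(round(cohens_kappa(rater_a, rater_b), 2))  # → 0.83
```

Coefficients in the .65-.82 range, as reported above, indicate substantial agreement beyond chance.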

Other Uses of BRS

  • Systemic data tracking method for Tier 3

    • Sample system created by:

      • Cindy Anderson

      • School district in Florida

6. Linking the Hypothesis to the FBA

  • Primary reason FBA is conducted

  • Hypothesis should be multi-component

    • When these contextual/environmental events (antecedents) are present…

    • It is highly likely that the behavior identified as the problem and focus of the FBA will occur

    • As a result, the student:

      • Gets out of or away from activities, people, tangibles, sensory input, pain

      • Gets activities, people, tangibles, sensory input, pain attenuation

      • Confirmed by the consequences (what others do in response to the behavior) that typically occur

  • Method of organizing information

    • Competing behavior pathway

    • PTR Assessment Organization

Step 3: Case Study – Mike: Assessment Summary Table of Problem Behavior

Screaming, Hitting

Step 3: Case Study – Mike: Assessment of Appropriate Behavior


Mike’s Hypotheses



7. Linking the Hypothesis to the BIP

  • Other primary purpose of conducting FBA

  • STOP generating list of general strategies

  • Each component of hypothesis generates an intervention

    • Antecedents modified and made irrelevant

    • Replacement behavior taught so that the problem behavior is ineffective

    • Functionally equivalent reinforcer provided so that the problem behavior is inefficient

8. Multi-Component Interventions Matched to Classroom Context

  • Multi-component interventions include prevention, teaching and reinforcement strategies

  • Team/Teacher(s) select strategies that are

    • feasible

    • effective

    • likely to be implemented

9. Task Analyzed Strategies

  • Forgotten art

  • Can’t just say “give choices”, “reinforce appropriate behavior”, etc., “student will comply”

  • Breaking down the interventions into sequence of steps

    • Allows teaching with precision

    • Allows assessment of teacher capacity

    • Provides foundation for training and for fidelity

Paris—Step 4: PTR Intervention

Case Study Jeff: PTR Intervention Plan

Jeff—Intervention Plan


10. Teacher and Classroom Coaching, Support

  • Do not assume teacher/team knows how to implement plan

  • Schedule 30 minutes to review plan and go over steps

  • Problem-solve if teacher has difficulties

    • Modify plan

    • Choose different intervention

  • Teach the student the plan

Case Study: Sample Coaching Checklist for Mike

11. Array of outcome measures (child-specific, teacher fidelity, social validity, alliance, fidelity of process, technical adequacy of products)

  • Individualized Behavior Rating Scale

  • Fidelity scores

  • Social validity: Did the teacher like the process, are they likely to use the strategies, would they do it again, etc.?

  • Alliance—Did they like you? Did they feel like you respected their input? Did you do a competent job as a consultant?

PTR Plan Self-Assessment Example for Mike

12. Maintenance (beyond warranty)

  • Dynamic process, not static

  • Decision making process based on data

  • Determine levels of support needed, fading, shaping, generalizing, extending, etc.

Steps for Evaluating Outcomes

  • Make sure you have both fidelity measures (self and/or observation scores) AND student outcomes (Behavior Rating Scale measures)

  • Decision rules

    • What constitutes adequate fidelity? 80%, 70%, something else?

    • What constitutes adequate student progress? (e.g., 3 or more consecutive ratings at or above goal line?)
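
The decision rules above can be sketched as a simple gate: check implementation fidelity first, then student progress against the goal line. A sketch only; the 80% criterion and the three-consecutive-ratings rule are the sample thresholds named on this slide, not fixed requirements, and the function name and data are illustrative:

```python
def next_step(fidelity, ratings, goal, fidelity_criterion=0.80, streak=3):
    """Sample decision rules: fidelity must be adequate before
    student outcome data can be interpreted."""
    if fidelity < fidelity_criterion:
        return "inadequate fidelity: retrain/coach, simplify, or select new interventions"
    # Progress rule: `streak` most recent consecutive ratings at or above the goal line
    recent = ratings[-streak:]
    if len(recent) == streak and all(r >= goal for r in recent):
        return "adequate progress: maintain, fade, shape, or expand"
    return "inadequate progress: intensify or modify the intervention"

# Hypothetical weekly Behavior Rating Scale data with a goal line of 4
print(next_step(fidelity=0.90, ratings=[2, 3, 4, 4, 5], goal=4))
```

Checking fidelity first matters: low student progress under low fidelity says nothing about whether the plan itself works.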

Primary Decisions

  • If fidelity scores are inadequate, determine the reasons (intervention too difficult, not feasible, not described adequately, etc.)

    • Retrain/coach the teacher/implementer

    • Modify the interventions so that they are feasible, simpler

    • Select different interventions that match the hypothesis

  • Student outcomes (decision contingent upon outcome trend)

    • Maintain intervention

    • Intensify intervention

    • Modify intervention

    • Fade intervention components

    • Shape behavior outcomes to become closer approximations of desired behavior

    • Expand the intervention (additional people, additional settings or routines)

    • Conduct another FBA if hypothesis is suspect, team has new data, or context has changed

Evaluating the Technical Adequacy of FBAs and BIPs

Current Status of FBA/BIP Implementation in Schools (Scott & Kamps, 2007)

  • Although FBA has been in special education law since 1997, no systematic policies have been adopted at the federal level

  • No guidance on key components (who should do FBAs, what features must be included, etc.)

  • Three primary flaws in school-setting use (Scott, Liaupsin, Nelson, & McIntyre, 2005).

    • Often used as reactive process

      • Loses power of prevention in developing interventions addressing minor behaviors before they get serious

    • “Expert” model overlooks valuable input gained from persons with whom student consistently interacts

    • Rigid, rigorous procedures not feasible in public school settings

  • In response, schools have “implemented a variety of inexact practices and procedures that have been loosely labeled as FBA, the majority of which are not tied to any solid evidence base” (Scott, Anderson, & Spaulding, 2008)

Technical Adequacy Research

  • Recent studies conducted exploring technical adequacy of FBAs

    • Blood, E., & Neel, R. S. (2007). From FBA to implementation: A look at what is actually being delivered. Education and Treatment of Children, 30, 67-80.

      • Evaluated FBAs/BIPs of 43 students in self-contained classrooms for EBD (K-12) in one school district in western US

      • Reviewed FBAs/BIPs for inclusion of essential components (listed in article)

      • Interviewed 6 EBD teachers about use of FBA/BIPs in planning and developing programs (e.g., “What is included on the plan?”, “How is the plan implemented?”, “How do you show progress?”)

    • Van Acker, R., Boreson, L., Gable, R. A., & Potterton, T. (2005). Are we on the right course? Lessons learned about current FBA/BIP practices in schools. Journal of Behavioral Education, 14, 35-56.

      • 71 completed FBA/BIPs submitted for review from school districts across midwest state

      • Rating scale developed for analysis (see article for scale)

Some Results of Technical Adequacy Research

  • Teaming issues:

    • Teacher and other input not included

  • Identifying behaviors

    • Target behaviors were missing or inadequately defined

  • Match of FBA to Hypothesis

    • Attempt to assign one function/hypothesis to group of target behaviors (e.g., treated all behaviors as one behavior—collected data and developed interventions)

    • Hypothesis statements missing or inadequate

  • Behavior intervention plan development

    • Behavior strategies not linked with hypothesis statement(s)

    • Predominant type of BIP was a “hierarchical stock list of possible positive and negative consequences” that follows any problem behavior.

    • Replacement behaviors not included

    • Van Acker—46% of FBA/BIPs reviewed included only aversive strategies

Some Results of Technical Adequacy Research

  • Follow-up

    • Lack of follow-up support for monitoring and evaluating plan including fidelity

    • No follow-through on next steps (promote and check maintenance and generalization of behavior change)

  • Blood interviews with teachers

    • None was able to identify behavior goals or describe the behavior intervention

    • Did not use FBA/BIPs in development of behavior interventions

Purpose of Our Tool

  • Determine the technical adequacy of FBA/BIPs and establish baseline

    • District

    • Campus/School

    • Individual

  • Second step in requesting Tier 3 technical assistance from Florida PBS/RTI:B Project (Interview of Tier 3 process first step)

  • Report generated to guide action planning

Development of Tool

  • Review of literature to identify essential components for adequate FBA/BIPs

  • Original measure included 24 items (FBA/BIP)

  • Edited to 20 items

  • Sent out to three national experts (Terry Scott, Cindy Anderson, Glen Dunlap) to review

    • Is the item essential?

    • Is the item worded clearly?

  • Final tool contains 18 items (9 FBA/9 BIP)

  • Scores range from 0 to 2 for each item.
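
Given the 18-item structure (9 FBA, 9 BIP, each item scored 0-2), a small helper can turn item scores into the subscale totals and percentage a baseline report might summarize. A sketch only; the function name, output format, and item scores below are invented for illustration:

```python
def adequacy_summary(fba_items, bip_items, max_per_item=2):
    """Summarize an 18-item (9 FBA / 9 BIP) technical adequacy evaluation.
    Each item is scored 0, 1, or 2."""
    assert len(fba_items) == 9 and len(bip_items) == 9
    fba, bip = sum(fba_items), sum(bip_items)
    total, max_total = fba + bip, max_per_item * 18
    return {"FBA": fba, "BIP": bip, "total": total,
            "percent": round(100 * total / max_total, 1)}

# Hypothetical baseline scoring of one FBA/BIP document
print(adequacy_summary([2, 1, 0, 1, 2, 1, 0, 1, 2], [1, 0, 1, 2, 1, 0, 1, 1, 2]))
# → {'FBA': 10, 'BIP': 9, 'total': 19, 'percent': 52.8}
```

Separate FBA and BIP subscales let a district see whether weaknesses sit in assessment, plan development, or both.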

Sample graphs/tables generated by tool

Sample Graphs—Baseline/post FBA

Sample Graphs BIP Baseline/Post

Sample Graph: Total FBA/BIP Baseline/Post

Sample Tables Baseline/Post

Sample Tables: Baseline/Post comparison

Practice Time

Before Practicing…

  • Review of tool items

    • Evaluation

    • Scoring guide

Practice Time

  • Team up with others

  • Try scoring the sample completed FBA/BIP given to you with the evaluation tool

  • Come to consensus on the scores

  • Debrief

    • What did you like?

    • What did you dislike?

    • What was easy?

    • What was difficult?

    • What questions do you still have?

Evaluating Your District’s FBA/BIPs

  • Within your district team, evaluate the technical adequacy of your district’s FBA/BIPs brought to the training

  • Be ready to debrief

    • You do NOT need to tell anyone your scores

    • Discuss anything you learned or didn’t learn in evaluating technical adequacy

  • Use outcomes to start developing strategic action plan steps to achieve district goals.

Next Steps

  • Action Planning

    • What will you be doing in your district to improve your FBA/BIPs?

PTR Publications

  • PTR Manual

    • Dunlap, G., Iovannone, R., Kincaid, D., Wilson, K., Christiansen, K., Strain, P., & English, C. (2010). Prevent-Teach-Reinforce: The School-Based Model of Individualized Positive Behavior Support. Baltimore: Paul H. Brookes.

  • Journal Articles

    • Iovannone, R., Greenbaum, P., Wei, W., Kincaid, D., Dunlap, G., & Strain, P. (2009). Randomized controlled trial of a tertiary behavior intervention for students with problem behaviors: Preliminary outcomes. Journal of Emotional and Behavioral Disorders, 17, 213-225.

    • Dunlap, G., Iovannone, R., Wilson, K., Strain, P., & Kincaid, D. (2010). Prevent-Teach-Reinforce: A standardized model of school-based behavioral intervention. Journal of Positive Behavior Interventions, 12, 9-22.

    • Strain, P. S., Wilson, K., & Dunlap, G. (2011). Prevent-Teach-Reinforce: Addressing problem behaviors of students with autism in general education classrooms. Behavioral Disorders, 36, 160-171.

    • Iovannone, R., Greenbaum, P., Wei, W., Kincaid, D., & Dunlap, G. (in press). Reliability of the Individualized Behavior Rating Scale-Strategy for Teachers (IBRS-ST): A Progress Monitoring Tool. Assessment for Effective Intervention.

    • Sears, K. M., Blair, K. S. C., Iovannone, R., & Crosland, K. (in press). Using the Prevent-Teach-Reinforce model with families of young children with ASD. Journal of Autism and Developmental Disorders.
