
Applying the Research to Maximize Efficiency and to Best Meet Your School and District Needs



  1. Applying the Research to Maximize Efficiency and to Best Meet Your School and District Needs
  Kim Gulbrandson, Ph.D.
  Wisconsin RtI Center

  2. Objectives
  • To provide a general overview of the research behind the tools
  • To share strengths and weaknesses of the current assessment tools
  • To provide resources to support schools/districts in using these tools in a coordinated way

  3. BoQ
  • Sound development process (multiple stages)
  • Sound psychometrics
    – Good test-retest reliability (.94)
    – High inter-rater reliability (above 90%)
    – Good internal consistency reliability (.70 or above); PBS Team is the only scale with low reliability
  • CFA and EFA:
    – Items with low factor loadings were eliminated
    – A new Classroom Critical scale was added
    – The current 10-factor structure is solid
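  For readers who want to see what these reliability figures refer to, here is a minimal Python sketch using made-up data; the item scores, sample size, and variable names are hypothetical, not actual BoQ data. Cronbach's alpha is the internal consistency statistic behind the ".70 or above" criterion, and test-retest reliability is the correlation between two administrations of the same instrument.

    # Minimal sketch with hypothetical data; not actual BoQ scores.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical item-level scores: 30 schools x 10 items, each item scored 0-3
    items = rng.integers(0, 4, size=(30, 10)).astype(float)

    def cronbach_alpha(x):
        # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
        k = x.shape[1]
        return (k / (k - 1)) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

    # Test-retest reliability: Pearson correlation of totals across two administrations
    time1 = items.sum(axis=1)
    time2 = time1 + rng.normal(0, 2, size=time1.shape)  # simulated second administration
    r = np.corrcoef(time1, time2)[0, 1]

    print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")  # .70 or above is read as acceptable
    print(f"Test-retest r:    {r:.2f}")                      # the BoQ's reported value is .94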

  4. BoQ
  • Best tool for distinguishing amongst schools implementing with fidelity
  • Detailed scoring criteria (rubric)
  • Found to be a valid instrument even when administered using diverse methods
    – When administration varied from the validated method, scores did not change significantly (if the Scoring Guide was used)

  5. BoQ
  • Schools with higher BoQ scores tend to have greater decreases in ODRs (office discipline referrals) than schools with lower BoQ scores
  • No district support, culturally responsive (CR), or coaching items
  • Family engagement items
  • Highly correlated with the TIC and SET

  6. BoQ and SET
  • Offers good cross-comparisons (several subscales represent similar elements)
  • BoQ and SET scores are significantly correlated with one another
  • BoQ measures PBIS areas with more specificity than the SET
  • BoQ measures critical features of implementation not covered by the SET:
    – Faculty buy-in
    – Lesson plans
    – Crisis plans
    – Evaluation
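  A minimal sketch of the kind of analysis behind "significantly correlated": Pearson's r with a p-value for paired BoQ and SET totals from the same schools. The scores below are invented for illustration only, not data from any study.

    # Hypothetical paired totals for 10 schools; not actual study data.
    from scipy.stats import pearsonr

    boq = [82, 74, 91, 65, 88, 79, 70, 95, 60, 85]   # hypothetical BoQ total scores (%)
    set_ = [80, 70, 93, 68, 85, 75, 72, 90, 58, 88]  # hypothetical SET total scores (%)

    r, p = pearsonr(boq, set_)
    print(f"r = {r:.2f}, p = {p:.4f}")  # p < .05 is conventionally read as significant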

  7. BoQ and SET
  • BoQ is better able to distinguish amongst schools that are implementing with fidelity than the SET is
  • SET can be used to validate BoQ reporting
  • BoQ can be used to identify additional areas in need of improvement that may not have been identified on the SET (if done within the same time frame)

  8. SET
  • Considered more sensitive for initial implementation than for sustained implementation
  • Fairly strong psychometrics
  • Drawback: a school can score 80% on the SET without having some of the critical features of PBIS in place
  • Limited feedback on the implementation process
  • Items most appropriate for elementary schools (less interpretable for middle school)

  9. SET
  • Use caution with the Expectations Taught and Management subscales
  • Time intensive
  • Less interpretable and reliable for large schools
  • Includes a district support component, but it yields high scores and has only 2 items
  • No family engagement, CR, or coaching items

  10. TIC
  • Primarily looks at start-up activities (only 6 questions track ongoing development)
  • Less useful for fully implementing schools or for looking at sustainability
  • Limited empirical research examining its reliability and validity
    – One study examined internal consistency reliability
  • Mixed criticism that it may be too lenient
  • 3 family engagement items
  • No district-level, coaching, or CR items

  11. SAS
  • The only tool that clearly breaks implementation down into 4 different systems
  • Limited reliability and validity data
  • Higher reliability for Improvement Priority than for Current Status
  • Nonclassroom Settings and Individual Student had the lowest reliability and the greatest variability across staff
  • Suggested: look at individual items

  12. SAS
  • Item 8 – interpret with caution
  • Has been used to identify specific strategies associated with reductions in racially disproportionate suspensions
  • 3 family engagement items
  • No district-level, CR, or coaching components

  13. BAT
  • Limited reliability (low test-retest reliability for subscales)
  • Not yet validated (Tier 3 is most problematic)
    – Tier 3 FBA/BIP scores are consistently high/overinflated
  • Suggestion: have people with specific knowledge of FBAs/BIPs complete the BAT
  • 6 family engagement items
  • No coaching, CR, or district items

  14. MATT
  • No formal work has been done with regard to reliability and validity
  • 3 family engagement items
  • Scoring concerns (inflated implementation scores)
  • Suggestion: look at Tier 2 and Tier 3 Organization and Critical Elements subscale scores separately, or at individual items

  15. RtI All Staff Survey
  • 5 family engagement items
  • 5 CR items
  • Aligns with the SIR (29 questions)
  • Aligns with the state graphic/model
  • Multiple levels

  16. RtI All Staff Survey
  • Has reliability and validity information, but less than the SIR
  • No coaching items
  • Few leadership items

  17. SIR
  • Aligns with the RtI All Staff Survey
  • 5 family engagement items
  • Includes leadership items
  • Includes CR items
  • Multiple levels

  18. SIR
  • Reliable and valid
  • The modified CR items have not been re-tested – be careful comparing across years
  • Missing district-focused items

  19. Considerations
  • Which is most important for you to measure?
    – Initial implementation
    – Sustainability
    – District- and/or school-level factors
    – Different settings
    – All-staff or team perceptions
    – Family engagement
    – Culturally responsive practices
    – Leadership

  20. Assessment Tool Review
  • See handout

  21. Using Assessments to Action Plan
