Evaluation in Michigan’s Model

Evaluation in Michigan’s Model

Steve Goodman

sgoodman@oaisd.org

National PBIS Leadership Forum

October 2010

http://miblsi.cenmi.org


MiBLSi Evaluation Team (the people who make this presentation possible)

  • Anna Harms, Evaluation Coordinator

  • Ed Huth, Data Analyst

  • Jennifer Rollenhagen, PBIS Assessments Coordinator

  • Terri Metcalf, Reading Assessments Coordinator

  • Nikki Matthews, Data Entry, PBIS Surveys Support

  • Nancy Centers, DIBELS, AIMSweb Support

  • Donna Golden, Data Entry

  • Steve Goodman, MiBLSi Co-Director


Mission Statement

To develop support systems and sustained implementation of a data-driven, problem-solving model in schools to help students become better readers with the social skills necessary for success.




Collecting information to evaluate implementation effects and using this information for continuous improvement

MiBLSi Project

  • Fidelity of implementation (state)

  • Systems integrity (project)

  • Student success (project-wide)

ISD Leadership Team

  • Fidelity of implementation (across districts)

  • Systems integrity (district-ISD)

  • Student success

LEA District Leadership Team

  • Fidelity of implementation (across schools)

  • Systems integrity (district-LEA)

  • Student success (district-wide)

Building Leadership Team

  • Fidelity of implementation (across grades)

  • Systems integrity (school)

  • Student success (school-wide)

Building Staff

  • Student success/Intervention effectiveness
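
The same three evaluation questions repeat at every level; only the scope widens from the classroom to the state. Purely as an illustration, the cascade above can be sketched as a small data structure (Python; the names are invented here and do not come from MiBLSi's actual tooling):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LevelEvaluation:
    """What one level of the cascade evaluates, and at what scope."""
    level: str
    fidelity_scope: Optional[str]   # where implementation fidelity is checked
    systems_scope: Optional[str]    # which system's integrity is reviewed
    outcome_scope: str              # whose student success is tracked

# One entry per level, mirroring the list above.
CASCADE = [
    LevelEvaluation("MiBLSi Project", "state", "project", "project-wide"),
    LevelEvaluation("ISD Leadership Team", "across districts", "district-ISD", "district"),
    LevelEvaluation("LEA District Leadership Team", "across schools", "district-LEA", "district-wide"),
    LevelEvaluation("Building Leadership Team", "across grades", "school", "school-wide"),
    LevelEvaluation("Building Staff", None, None, "student success / intervention effectiveness"),
]
```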


Several Purposes of MiBLSi Assessments

  • Audit

    • for “taking stock” of current strengths/weaknesses and action planning

  • Formative evaluation

    • for improving the program while it is being implemented

  • Summative evaluation

    • for improving future iterations of the program


Assessments

Elementary Schools

  • Major Discipline Referrals

  • PBIS Self-Assessment Survey

  • PBIS Team Implementation Checklist

  • Benchmarks of Quality (BOQ)

  • Schoolwide Evaluation Tool (SET)

  • Benchmarks for Advanced Tiers (BAT)

  • Dynamic Indicators of Basic Early Literacy Skills (DIBELS)

  • Planning and Evaluation Tool (PET) for Effective Schoolwide Reading Programs

  • Effective Reading Support Team Implementation Checklist

  • Special Education Data Collection Form

  • Schoolwide Reading Analysis Support Page

Middle/Junior High Schools

  • Major Discipline Referrals

  • PBIS Self-Assessment Survey

  • PBIS Team Implementation Checklist

  • Benchmarks of Quality (BOQ)

  • Schoolwide Evaluation Tool (SET)

  • ORF/MAZE through AIMSweb

  • School-Wide Evaluation and Planning Tool for Middle School Literacy (SWEPT)

  • Middle School Reading Team Implementation Checklist

  • Special Education Data Collection Form


Building Level


Assist Teams in Using Data for Decision-Making

  • First Year

    • Winter systems review

    • Spring data review

  • Second Year

    • Fall data review

    • Winter data review

    • Spring data review

  • Third Year

    • Fall data review

    • Winter data review

    • Spring data review


Assessment Booklet

  • Description of assessments

  • Data collection schedule

  • Data summary

  • Data forms and assessment forms


Team Evaluation of Outcome, Process, and Systems Data


Assessment Schedule (for Cohort 7, from the MiBLSi website)




Developing Fluency with Discipline Referral Categories

Example Exercise 2: Match the example situation below to the correct problem behavior on the discipline categories answer sheet. Write the letter in the column for Exercise 2.


District Level


Focus on Implementing with Fidelity, using Benchmarks of Quality (BoQ)/ODR, ’06-’07 and ’07-’08

[Chart: change in major discipline referrals from ’06-’07 to ’07-’08: Decrease 14.6%; Increase 8%]
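
The chart reduces to a year-over-year percent change in referral counts. A minimal sketch of that calculation (the counts below are invented; only the resulting percentages appear on the slide):

```python
def pct_change(before: float, after: float) -> float:
    """Percent change from before to after; negative means a decrease."""
    return (after - before) / before * 100.0

# Invented example counts chosen to reproduce the slide's percentages:
print(f"{pct_change(520, 444):+.1f}%")  # -14.6% (decrease)
print(f"{pct_change(500, 540):+.1f}%")  # +8.0% (increase)
```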




Project Level


FileMaker Pro Database



One major activity of MiBLSi involves professional development. Over 422 training days are currently scheduled for the 2010-2011 school year.


On-Line Evaluation

  • Trainer evaluation of trainer work days

  • Participant evaluation of training sessions



Trainer Work Day Questions

  • The training goals were clearly defined and reviewed frequently with checking for understanding.

  • The trainers were knowledgeable about the training content and were able to respond to participants' questions and share experiences to support understanding.

  • The trainers presented the content in a way that promoted active engagement and opportunities for processing, working with, and/or learning the content.

  • The materials were accessible in a timely manner (posted two weeks prior to trainer work day).

  • The trainer notes and activities of the day were a valuable use of my time as they relate to preparing for this upcoming training.

  • Potential challenges that participants may experience were highlighted with some ideas for addressing those challenges.

  • The big ideas of the day's training were emphasized, and areas to cut or condense were described in enough detail so that I am confident about how to adjust for different groups.
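
Once each statement is rated on a scale, summarizing the results is a simple aggregation. A sketch assuming a 1-5 Likert scale and invented responses (the actual on-line form and scale are not described in the slides):

```python
from statistics import mean

# Hypothetical participant ratings per item, on an assumed 1-5 scale.
responses = {
    "Training goals clearly defined": [5, 4, 4, 5, 3],
    "Trainers knowledgeable": [5, 5, 4, 4, 4],
    "Materials posted two weeks prior": [3, 2, 4, 3, 3],
}

for item, ratings in responses.items():
    favorable = 100 * sum(r >= 4 for r in ratings) / len(ratings)
    print(f"{item}: mean {mean(ratings):.2f}, {favorable:.0f}% rated 4 or 5")
```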


MiBLSi Project Data

Behavior and Reading Interaction


Proficiency on 4th Grade High-Stakes Reading Test and Percent of Major Discipline Referrals from Classroom: 132 Elementary Schools


Average MEAP Reading Scores and Fidelity in PBIS Implementation, Based on Benchmarks of Quality

*29 Elementary Schools from multiple districts
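
The comparison behind this chart is a group mean: average MEAP reading score for schools at or above a BoQ fidelity criterion versus schools below it. A pandas sketch with invented data and an assumed cut score (the slide does not state the criterion used):

```python
import pandas as pd

# Invented example: each row is a school with its Benchmarks of Quality
# (BoQ) score and its average MEAP reading scale score.
schools = pd.DataFrame({
    "school": ["A", "B", "C", "D", "E"],
    "boq": [82, 61, 90, 55, 74],
    "meap_reading": [435, 410, 441, 405, 428],
})

BOQ_CRITERION = 70  # assumed cut point for "implementing with fidelity"
schools["at_fidelity"] = schools["boq"] >= BOQ_CRITERION
print(schools.groupby("at_fidelity")["meap_reading"].mean())
```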


MiBLSi Project Data

Implementation Fidelity


Comparison of Schoolwide Evaluation Tool (SET) Scores After Training and After MiBLSi Implementation


MiBLSi School-wide Evaluation Tool (SET) Average Scores for Elementary and Middle Schools for 2009-10


MiBLSi Project Data

Student Outcome





Elementary Schools with complete data sets: Average Major Discipline Referrals per 100 Students per Day
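
Normalizing referral counts this way makes schools of different sizes and reporting windows comparable on one chart: total major referrals divided by school days, scaled to a rate per 100 enrolled students. A sketch of the calculation (example numbers invented):

```python
def odrs_per_100_per_day(total_odrs: int, enrollment: int, school_days: int) -> float:
    """Major office discipline referrals per 100 students per school day."""
    return (total_odrs / school_days) / (enrollment / 100)

# e.g., 450 major referrals over a 180-day year in a school of 500 students:
print(f"{odrs_per_100_per_day(450, 500, 180):.2f}")  # 0.50
```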



Spring Reading Curriculum-Based Measurement “Established Level” for Cohorts 4-6 Middle Schools



Special Education Referral and Eligibility Rates for Cohorts 1-4 Schools

(Comparison of 2007-08 and 2008-09)

*n = 84 schools



Middle Schools with Complete Data Sets: Average Major Discipline Referrals per 100 Students per Day


Lessons Learned

  • Teams need to be taught how to analyze and use data

  • Emphasis on directing resources to areas of need and on removing competing activities

  • As we grow, it is even more important to systematically gather accurate data and then act on it for continuous improvement



“Even if you’re on the right track, you’ll get run over if you just sit there.” - Will Rogers