Data Driven Decisions: Using the Tools



  1. Data Driven Decisions: Using the Tools Susan Barrett, Jerry Bloom PBIS Maryland Coaches Meeting October 2007 sbarrett@pbismaryland.org jbloom@pbismaryland.org

  2. Acknowledgements • Dr. Rob Horner • University of Oregon • Dr. George Sugai • University of Connecticut

  3. Goals • Define the use of data-driven decision making to reach full implementation of school-wide PBS • IPI • Team Checklist • SWIS

  4. Assumptions • School teams will be successful if: • They start with sufficient resources and commitment • They focus on the smallest changes that will result in the biggest difference • They have a clear action plan • They use on-going self-assessment to determine if they are achieving their plan • They have access to an external agent/coach who is supportive, knowledgeable and persistent.

  5. Data Driven Solutions: Using the Process Measures • Implementation Phase Inventory (IPI) • Team Checklist, Form A (TIC) • Self-assessment for Primary Prevention systems • Emphasis is on milestones • Are we doing what we should be doing?

  6. IPI • Two times/year • Due November 10, April 10 • Coach completes with Team • Four Phases • Preparation • Initiation • Implementation • Maintenance

  7. Team Checklist • Self-assessment tool for monitoring implementation of School-wide PBS • Start-Up Elements (17 items) • Establish Commitment • Establish and Maintain Team • Self-assessment • Establish school-wide expectations (Prevention) • Establish consequences for behavioral errors • Establish information system • Establish capacity for function-based support • On-going Elements (6 items)

  8. Use of the Team Checklist • Who completes the Team Checklist? • The school-team (completed together) • When is Team Checklist completed? • At least quarterly, best if done monthly • Check with your local coordinator (www.pbssurveys.org) • Who looks at the data? • Team • Coach • Trainers/State Evaluation • Action Planning

  9. Action Planning with the Team Checklist • Score each item as In place, Partially in place, or Not in place • Points: 2 = in place, 1 = partial, 0 = not in place • Identify the items that will make the biggest impact • Define a task analysis of activities to achieve items • Allocate tasks to people, timelines, and reporting events
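
Not part of the original slides: a minimal Python sketch of the point scheme above, with item names and scores invented for illustration.

```python
# Hedged sketch of the Team Checklist point scheme described above:
# 2 = in place, 1 = partially in place, 0 = not in place.
# Item names and scores are invented examples.
scores = {
    "Establish commitment": 2,
    "Establish and maintain team": 2,
    "Conduct self-assessment": 1,
    "Establish information system": 0,
}

total_points = sum(scores.values())
max_points = 2 * len(scores)

pct_points = 100 * total_points / max_points  # % of implementation points
pct_full = 100 * sum(s == 2 for s in scores.values()) / len(scores)
pct_partial = 100 * sum(s == 1 for s in scores.values()) / len(scores)

print(f"{pct_points:.0f}% of points; {pct_full:.0f}% in place; {pct_partial:.0f}% partial")
```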

  10. Implementation by Feature This report shows, for each completed Checklist, the percentage implemented and partially implemented for each of the following features: • Establish commitment (questions 1-2) • Establish & maintain team (3-5) • Conduct self-assessment (6-8) • Define expectations (9) • Teach expectations (10-12) • Establish reward system (13) • Establish violations system (14) • Establish information system (15) • Build capacity for function-based support (16-17)
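
A hedged sketch of how such a per-feature report could be computed; the feature-to-question mapping follows the list above, while the example responses are invented and this is not the actual report code.

```python
# Feature -> Team Checklist question numbers, per the groupings above.
FEATURES = {
    "Establish commitment": [1, 2],
    "Establish & maintain team": [3, 4, 5],
    "Conduct self-assessment": [6, 7, 8],
    "Define expectations": [9],
    "Teach expectations": [10, 11, 12],
    "Establish reward system": [13],
    "Establish violations system": [14],
    "Establish information system": [15],
    "Build capacity for function-based support": [16, 17],
}

# Invented example responses, keyed by question number (0/1/2 as above).
responses = {q: 2 for q in range(1, 18)}
responses[7] = 1   # partially in place
responses[16] = 0  # not in place

for feature, questions in FEATURES.items():
    items = [responses[q] for q in questions]
    pct_impl = 100 * sum(s == 2 for s in items) / len(items)
    pct_partial = 100 * sum(s == 1 for s in items) / len(items)
    print(f"{feature}: {pct_impl:.0f}% implemented, {pct_partial:.0f}% partial")
```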

  11. Putting your School in Perspective • Use % of total items or % of points • Messages: • Trends • You don’t need to be perfect immediately

  12. Overall Implementation This report shows, for each completed Checklist, overall scores as (a) the percentage of items fully implemented and partially implemented and (b) the percentage of implementation points. The report displays one row of this data for each Checklist in ascending date order.  The associated column chart shows the percentage of items implemented and partially implemented on each Checklist and, in a separate area, the percentage of implementation points.

  13. Team Checklist Total Scores

  14. Data Driven Solutions: Using Outcome Measures to Make Decisions • School-wide Information System • www.swis.org

  15. Improving Decision-Making • From: Problem → Solution (problem solving) • To: Problem → Information → Solution

  16. Key features of data systems that work. • The data are accurate and valid • The data are very easy to collect (1% of staff time) • Data are presented in picture (graph) format • Data are used for decision-making • The data must be available when decisions need to be made (weekly?) • Difference between data needs at a school building versus data needs for a district • The people who collect the data must see the information used for decision-making.

  17. Why Collect Discipline Information? • Decision making • Professional Accountability • Decisions made with data (information) are more likely to be (a) implemented, and (b) effective

  18. What data to collect for decision-making? • USE WHAT YOU HAVE • Office Discipline Referrals/Detentions • Measure of overall environment. Referrals are affected by (a) student behavior, (b) staff behavior, (c) administrative context • An under-estimate of what is really happening • Office Referrals per Day per Month • Attendance • Suspensions/Expulsions • Vandalism

  19. Office Discipline Referral Processes/Form • Coherent system in place to collect office discipline referral data • Faculty and staff agree on categories • Faculty and staff agree on process • Office Discipline Referral Form includes needed information • Name, date, time • Staff • Problem Behavior, maintaining function • Location
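
As an illustration only, a referral record holding those form fields might look like the following Python sketch; the field names are assumptions, not an actual SWIS schema.

```python
from dataclasses import dataclass
from datetime import date, time

@dataclass
class OfficeDisciplineReferral:
    # Field names are illustrative assumptions, not a real SWIS schema.
    student_name: str
    referral_date: date
    referral_time: time
    referring_staff: str
    problem_behavior: str    # e.g. "disruption"
    perceived_function: str  # the maintaining function, e.g. "peer attention"
    location: str            # e.g. "cafeteria"

# Invented example record:
odr = OfficeDisciplineReferral(
    student_name="Student A",
    referral_date=date(2007, 10, 3),
    referral_time=time(12, 40),
    referring_staff="Staff B",
    problem_behavior="disruption",
    perceived_function="peer attention",
    location="cafeteria",
)
print(odr)
```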

  20. When Should Data be Collected? • Continuously • Data collection should be an embedded part of the school cycle, not something “extra” • Data should be summarized prior to meetings of decision-makers (e.g., weekly) • Data will be inaccurate and irrelevant unless the people who collect and summarize it see the data used for decision-making.

  21. Organizing Data for “active decision-making” • Counts are good, but not always useful • To compare across months use “average office discipline referrals per day per month”
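
To make the metric concrete, a minimal sketch with invented referral and school-day counts; dividing by school days rather than calendar days is what makes months of different lengths (and holiday breaks) comparable.

```python
# Average office discipline referrals per school day, by month.
# Counts below are invented examples.
monthly = {
    "Sep": {"referrals": 48, "school_days": 20},
    "Oct": {"referrals": 66, "school_days": 22},
    "Nov": {"referrals": 54, "school_days": 18},
}

for month, m in monthly.items():
    rate = m["referrals"] / m["school_days"]  # normalizes for month length
    print(f"{month}: {rate:.2f} ODRs per school day")
```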

  22. January 10

  23. Using Data for On-Going Problem Solving • Start with the decisions not the data • Use data in “decision layers” (Gilbert, 1978) • Is there a problem? (overall rate of ODR) • Localize the problem • (location, problem behavior, students, time of day) • Get specific • Don’t drown in the data • It’s “OK” to be doing well • Be efficient

  24. Is there a problem? • Office Referrals per Day per Month • Attendance • Faculty Reports

  25. SWIS summary 04-05 (Majors Only): 1,210 schools, 595,742 students

  26. Interpreting Office Referral Data: Is there a problem? • Absolute level (depending on size of school) • Middle, High Schools (> 1 per day per 100 students) • Elementary Schools (> 1 per day per 250 students) • Trends • Peaks before breaks? • Gradual increasing trend across year? • Compare levels to last year • Improvement?
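
Those rules of thumb could be checked mechanically; a hedged sketch with the thresholds taken from the slide and the example numbers invented.

```python
def odr_problem_threshold(level: str, enrollment: int) -> float:
    """ODRs per day above which the slide's rule of thumb flags a problem.

    Middle/high schools: > 1 per day per 100 students.
    Elementary schools:  > 1 per day per 250 students.
    """
    per = 100 if level in ("middle", "high") else 250
    return enrollment / per

# Invented example: a middle school of 500 students averaging 6.2 ODRs/day.
rate = 6.2
threshold = odr_problem_threshold("middle", 500)
if rate > threshold:
    print(f"Flag: {rate} ODRs/day exceeds the {threshold:.1f}/day rule of thumb")
```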

  27. Elementary School with 250 students

  28. Middle School with 500 students

  29. Middle School with 500 students

  30. Is there a problem? Middle school with 500 students (Dec)

  31. Is there a problem? Middle School with 500 students (Dec 04-05)

  32. What systems are problematic? • Referrals by problem behavior? • What problem behaviors are most common? • Referrals by location? • Are there specific problem locations? • Referrals by student? • Are there many students receiving referrals or only a small number of students with many referrals? • Referrals by time of day? • Are there specific times when problems occur?
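
Each of those four questions is a simple tally over the referral records; here is a minimal sketch using Python's collections.Counter, with invented example records whose fields mirror the hypothetical referral structure sketched earlier.

```python
from collections import Counter

# Invented example referrals; real data would come from the information system.
referrals = [
    {"behavior": "disruption", "location": "cafeteria", "student": "A", "hour": 12},
    {"behavior": "defiance",   "location": "classroom", "student": "B", "hour": 9},
    {"behavior": "disruption", "location": "cafeteria", "student": "A", "hour": 12},
    {"behavior": "disruption", "location": "hallway",   "student": "C", "hour": 8},
]

# Tally referrals by problem behavior, location, student, and time of day;
# most_common() lists the biggest problem areas first.
for field in ("behavior", "location", "student", "hour"):
    counts = Counter(r[field] for r in referrals)
    print(f"Referrals by {field}: {counts.most_common()}")
```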

  33. Elementary School

  34. Referrals per Student

  35. Quote of the Day • “Without data, you are just another person with an opinion”
