
Joseph D. Otter LMSW, Regional PBIS Specialist Eastern Region


Presentation Transcript


  1. Communities of One Project SPSE New Coaches Training Day 2 March 24, 2009 Mineville, NY Joseph D. Otter LMSW, Regional PBIS Specialist Eastern Region

  2. Acknowledgements • OSEP Center on Positive Behavioral Interventions & Supports Technical Assistance Center at University of Oregon • National Network of Partnership Schools (NNPS), Johns Hopkins University • Illinois EBD/PBIS Network • NYS-PBIS Initiative

  3. Today’s Objectives • Discuss best practices of data collection and use • Identify and describe data sources to be used by PBIS teams • Distinguish between different kinds of data • Become better data-based decision makers • Network and problem solve

  4. Today’s Agenda • Welcome, Introductions & Evaluation Data • Warm Up • Kinds and types of data • Data Sources: SET, TIC, CTIC, EBS • Data-based Action Planning • Office Discipline Referral Data as a microcosm • Presenting Data • Networking/Closing

  5. Principles of Data-based Decision-Making • When you collect data from people, always show them how it was used – you asked them to work or change in order to provide it • Data-based decision making is an ethical practice • Only collect what you plan on using

  6. Principles of Data-based Decision-Making • Whenever possible, use data for decision-making • Data must be used ethically and factually (people can use it to “say anything” with enough twisting) • Using data often results in needing to look at more data and ask more questions

  7. Process Data • Tools & instruments that measure action steps taken and the impact of those steps • A way of determining whether what needs to occur to “set the stage” for outcome data is occurring • Good for showing interim results to invested stakeholders on the way to outcome data • Data on whether the implementers’ behavior has changed

  8. Outcome Data • The ways of measuring achievement of desired goals • The data that stakeholders are truly interested in and that will get funding & district support • Can be • Changes in perception • Changes in behavior • Changes in achievement

  9. Data for decision-making in the adoption and maintenance of School-wide PBIS SYSTEM PROGRESS • Yearly EBS/PBIS Survey (EBSSAS) • School-wide Evaluation Tool (SET) • Coaches Checklist (CTIC) • Team Checklist (TIC) • Other STUDENT PROGRESS • Office Discipline Referrals (ODRs) and other discipline data • Academic Data

  10. School-wide Evaluation Tool • Conducted by a trained evaluator • Interviews with staff and students • Observation • Interview with administrators

  11. The interviews… • Focus on definitive answers, not perception • Ask about behaviors and knowledge • Emphasize being quantifiable (can be scored) • Seek specifics

  12. School-wide Evaluation Tool (SET) Categories • Expectations Defined • Behavioral Expectations Taught • On-going Reward System • System for Responding to Violations • Monitoring & Decision-making • Management • District-Level Support

  13. ES Cohort 1 School-wide Evaluation Tool (SET) % “In-Place” per SET Component, 02-03 vs. 03-04 [chart] SET Components: 1) Expectations Defined 2) Expectations Taught 3) Rewards System 4) Violations System 5) Monitoring 6) Management 7) District Support • TS = Average % “In-Place” for ALES, or “Total Score”

  14. Cohort 1 Schools School-wide Evaluation Tool (SET) % “In-Place” per SET Component, 02-03 vs. 03-04

  15. Team Implementation Checklist (TIC) • Monitors & guides implementation activities and assists in development of the action plan • Start-up Activities • On-going Activities • School, Family & Community Partnerships • Self-report data • Factual • Objective

  16. Effective Behavior Support School-wide Assessment Survey (EBSSAS) • Initial & annual assessment of effective behavior support systems in your school • Examines status & need for improvement • Looks at four behavior support systems: ~ school-wide discipline systems ~ non-classroom management systems ~ classroom management systems ~ systems for individual students

  17. Survey Completion – In Person • All staff at staff meeting • Individuals from a representative group • Team member-led focus group • Done independently • 20-30 minutes • Check current status on left • Check priority for improvement on right

  18. Survey Completion – Online • Either at staff convenience during a survey period or all at once (computer lab) • Done independently • 20-30 minutes

  19. EBS Survey results • Annual action planning • Internal decision making • Assessment of change over time • Awareness of building staff perceptions • Team validation • Buy-In

  20. Staff receive regular feedback on behavior data.

  21. Options exist to allow classroom instruction to continue when problem behavior occurs.

  22. Problem behaviors receive consistent consequences.

  23. Effective Behavior Support School-wide Survey (Current Status)

  24. Effective Behavior Support School-wide Survey (Priority for Improvement)

  25. 1. School-wide expectations apply to non-classroom settings. 2. School-wide expected behaviors are taught in non-classroom settings. 3. Supervisors actively supervise (move, scan, interact) students in non-classroom settings. 4. Rewards exist for meeting expected student behaviors in non-classroom settings. 5. Physical/architectural features modified to limit unsupervised settings, unclear traffic patterns, inappropriate entrance/exit from school. 6. Scheduling of student movement ensures an appropriate # of students in non-classroom spaces. 7. Staff receive regular opportunities for developing and improving active supervision skills. 8. Status of student behavior and management practices are evaluated quarterly from data. 9. All staff are involved directly or indirectly in management of non-classroom settings.

  26. Organizing Behavioral Data • Use whatever data already exists • THE BIG FIVE: • Problem Behavior • Location • Time of Day • Referrals by Student • Average Per Day/Per Month
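The “Big Five” reports above can be tallied from any existing referral log. A minimal sketch, assuming referral records are simple (student, behavior, location, time) tuples — the field names and sample data are illustrative, not from any particular data system such as SWIS:

```python
from collections import Counter

# Hypothetical ODR records: (student, behavior, location, time_of_day).
referrals = [
    ("S01", "disrespect", "cafeteria", "11:30"),
    ("S02", "disruption", "classroom", "09:15"),
    ("S01", "disrespect", "hallway",   "11:45"),
    ("S03", "tardy",      "hallway",   "08:05"),
    ("S01", "disruption", "cafeteria", "11:35"),
]

def big_five(records, school_days):
    """Tally referrals by behavior, location, hour block, and student,
    plus the average number of referrals per school day."""
    by_behavior = Counter(r[1] for r in records)
    by_location = Counter(r[2] for r in records)
    # Bucket times to the hour, e.g. "11:30" -> "11:00".
    by_hour = Counter(r[3].split(":")[0] + ":00" for r in records)
    by_student = Counter(r[0] for r in records)
    per_day = len(records) / school_days
    return by_behavior, by_location, by_hour, by_student, per_day

behavior, location, hour, student, rate = big_five(referrals, school_days=2)
```

Each `Counter` exposes `.most_common(n)`, so a team can pull the top problem behavior, hot-spot location, peak hour, and highest-referral students directly from the same pass over the data.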

  27. Central Region ~ Rural School (gr 3-6) ~ 2002-2003

  28. Infractions by time of day

  29. Discipline Referrals by Student Middle School - Southern Illinois • 50% of students account for all behavior referrals • 10% of students account for 61% of all behavior referrals • 5% of students account for 41% of all behavior referrals
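Concentration figures like those above are just cumulative shares after ranking students by referral count. A sketch with made-up counts (not the Illinois data):

```python
def referral_concentration(counts, top_fraction):
    """Share of all referrals generated by the top `top_fraction` of students.

    `counts` is one referral count per student; counts of zero are allowed.
    """
    ranked = sorted(counts, reverse=True)            # most referrals first
    n_top = max(1, round(len(ranked) * top_fraction))
    return sum(ranked[:n_top]) / sum(ranked)

# Hypothetical counts for a 10-student roster.
counts = [12, 9, 5, 3, 2, 1, 1, 0, 0, 0]
share_top_10pct = referral_concentration(counts, 0.10)  # top 1 of 10 students
```

Computing these shares for the top 5%, 10%, and 50% of students reproduces the kind of breakdown shown on the slide and helps a team judge whether problems are school-wide or concentrated in a small group.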

  30. Data-Based Decision Making • Office discipline referral data • Staff/Faculty input/survey data • Student input/survey data • Nurse’s office data • Family/Community input/survey data • Academic data • Kid to Kid • Kid to Adult

  31. Survey of Respectful Behavior done by Dr. Rob March • Participants • 980 middle school students • Chicago, IL • Approximately 63% of students at school receive free or reduced lunch • Survey asked students, “What are some ways that teachers show you respect?”

  32. Top 12 Answers • Of the over 2,900 responses, the ones listed were written by 50 or more students. • 1. Talk privately to students when a problem occurs. • 2. Use a calm tone of voice, even when they are upset (no yelling). • 3. Respect personal space (no touching, grabbing, eyeballing, or crowding).

  33. Top 12 Answers (continued) • 4. Listen without interrupting. • 5. Have a sense of humor. • 6. Display student work around the classroom/school. • 7. Prepare exciting lessons. • 8. Let parents/guardian know student did a good job sometimes (see a balanced picture).

  34. Top 12 Answers (continued) • 9. Use students’ names when talking to them. • 10. Be available during non-classroom times. • 11. Return work promptly. • 12. Talk sincerely – no sarcasm or “eye rolling”. • Worth noting: “Acknowledge birthdays” received multiple mentions.

  35. Use Data to Drive Decisions • Data makes you ask questions • What are the right questions? • Who, What, Where, When, Why & How? • What information is needed to answer the questions? • What are the possible interventions? What is the smallest change that can produce the biggest impact?

  36. Determine Needs • Analyze data and ask questions • WHO are the students committing this infraction? • WHAT is the most efficient solution? • WHERE are we experiencing the most problems? • WHEN is this behavior not occurring? • WHY is this a priority for us? • HOW much more data needs to be collected?

  37. Question #1: Are we doing what we should be doing? Question #2: Is it working? • Are we satisfied with the behavior patterns of students? • Are we using best practices of DEFINE, TEACH, REMIND, ACKNOWLEDGE & RETEACH in school-wide behavior support? • What are we doing that is working and should be retained? • What are we NOT doing that would fit our setting and make a big difference?

  38. Intervention (Course of Action) • What do we want instead (what changes do we want to see)? • Are we trying to change the behavior of children or adults? • What other data do we need to collect (what sources do we have and what are we using)? • Determine priorities • Brainstorm ideas (at least 10) - keep it simple • Plan implementation: Who will be involved? When will it happen?

  39. Always Remember the Three Questions for Active Decision-making: • Are we doing what we should be doing? • Is it making a difference? • What is the smallest change that will make the biggest difference?

  40. Determine Effectiveness and Success • What is our definition of success? What will it look like? • Set achievable, reasonable goals • How and how often will we measure success and effectiveness? • Do we need to modify the plan? • When do we move on?
