
The Basic Components of Inter-Rater Reliability


Presentation Transcript


  1. The Basic Components of Inter-Rater Reliability

  2. Objectives • Develop a strong understanding of rubric language and performance levels • Strengthen observation skills and explore the importance of frequent observation • Calibrate the collection of evidence through multiple measures • Rate the level of performance and provide consistent feedback that builds self-directed learners

  3. Vision of Effectiveness • Think of a time when you observed an excellent teacher in the classroom or in another professional context. • What did you see or hear that made you think you were observing an effective practitioner? • In a perfect world, what is your vision of educator effectiveness?

  4. Traits of Effectiveness • Write two pieces of evidence of an effective practitioner on two post-its (one piece of evidence per post-it) • As a table group, sort your post-its into like categories • Agree on a label for each category

  5. Traits of Effectiveness • Write two pieces of evidence of an effective practitioner on two post-its (one piece of evidence per post-it) • As a table group, sort your post-its into like categories • Agree on a label for each category Label:___ Label:___ Label:___ Label:___

  6. Traits of Effectiveness • What happens if we add the four Domains of the InTASC Standards? • Do you need to re-sort your post-its? • The Learner & Learning • Content • Instructional Practices • Professional Responsibilities

  7. Understanding the Rubric Set-up

  8. Levels of Performance • Level 1 - Does not meet standards; performs below the expectations for good performance under this standard; requires direct intervention and support to improve practice • Level 2 - Making sufficient progress toward meeting this standard; meets expectations for good performance most of the time and shows continuous improvement; improvement is expected through a focused professional learning and growth plan

  9. Levels of Performance • Level 3 - Consistently meets expectations for good performance under this standard; demonstrates effective practices and impact on student learning; continues to improve professional practice through ongoing professional learning • Level 4 - Consistently exceeds expectations for good performance under this standard; demonstrates highly effective practices and impact on student learning; continues to expand expertise through professional learning and leadership opportunities

  10. Where do you live? • The expectation is that we strive, with support, to live at Level 3 and visit Levels 2 & 4 • You’re going to move between levels within your year and your career!

  11. For Example: Designing Student Assessments

  12. Differentiated Language: Levels of Performance • Using your rubric, pick a component, standard, or criterion • Highlight key words that show the difference between levels of practice • Discuss and record responses on the organizer provided • Share examples of evidence from your practice that match the language you identified at each level

  13. Common Language

  14. Identifying Evidence • Read the evidence statements provided • Based on your rubric, which component, standard, or criterion is it evidence of? • Which statements are evidence of observable practice? Non-observable? If non-observable, where would this evidence be found? • What are examples of evidence from your practice that support components, standards, and criteria that aren’t listed? List a few examples of your own in the blank spaces. Note: Levels of performance are not determined until multiple sources of evidence are collected.

  15. Fact vs. Opinion When Collecting Evidence • Table Talk: Think about a time that you received an evaluation based on both factual evidence and opinion. What were the differences in how you felt? In the impact on your professional growth? In the relationship with your evaluator? • Why is it important to collect multiple pieces of factual evidence before trying to assess educator practice? • How can factual evidence support educator development?

  16. Which is Stronger? Why?

  17. Effective Observation Practices • Eliminate the effects of bias. Enter the classroom without judgment and work from evidence. • Collect factual evidence. Write down only what teachers and students say and do. Look for evidence of learning. • Remain, review, reflect. Pause to organize your evidence before aligning it. Only rate after multiple sources of evidence are collected. • Keep observations frequent, focused, and varied, with timely feedback.

  18. Collecting Evidence • What do you see and hear the teacher and students doing? • What evidence can you gather?

  19. Selecting Observation Focus • When taking notes, come up with a short-hand system. Don’t worry if you miss something – practice makes perfect. • A sample of how you could scaffold support for focusing your first 3 observations: • Cluster 1: Managing classroom procedures, Managing student behavior, and Organizing physical space • Cluster 2: Creating an environment of respect & rapport, Establishing a culture for learning, Communicating with students • Cluster 3: Using questioning and discussion techniques, Engaging students in learning, Using assessment in instruction • Count off at your table by “3’s” – be ready to gather evidence on the video based on your cluster

  20. Video 1 • Ms. Warburton’s 8th grade math lesson: Sorting and classifying equations • CCSS Math 8.EE.C.7a • Collect evidence statements on your cluster • https://www.teachingchannel.org/videos/sorting-classifying-equations-overview

  21. Video 1 (follow up) • Compare your evidence statements with a cluster partner • With partner, use your rubric and practice aligning/coding evidence statements with components/standards/criteria language. • Make sure each item shared is factual evidence • Each cluster shares out at their table • One piece of evidence from each cluster • In whole group, each cluster will share out one piece of evidence. • Add to your notes any factual evidence your table partners shared that you did not have written down

  22. Calibrating • Let’s all look at evidence that aligns with each cluster’s components. • At tables: Based on the evidence we have compiled, talk about the level of performance that should be assigned for each component. • On the following slides, use your clicker to report the ranking for each of the three cluster areas we collected evidence for.

  23. What is the performance level for Cluster 1: 2c, Bh, and 3.3? • 1 • 2 • 3 • 4

  24. Calibrating • Let’s all look at evidence that aligns with each cluster’s components. • At tables: Based on the evidence we have compiled, talk about the level of performance that should be assigned for each component. • On the following slides, use your clicker to report the ranking for each of the three cluster areas we collected evidence for.

  25. What is the performance level for Cluster 2: 2a, Bd, and 3.2? • 1 • 2 • 3 • 4

  26. Calibrating • Let’s all look at evidence that aligns with each cluster’s components. • At tables: Based on the evidence we have compiled, talk about the level of performance that should be assigned for each component. • On the following slides, use your clicker to report the ranking for each of the three cluster areas we collected evidence for.

  27. What is the performance level for Cluster 3: 3c, Cg, and 5.2? • 1 • 2 • 3 • 4
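
The clicker polls above are, in effect, an agreement check: if the room converges on one level, the group is calibrated, while a wide spread signals that the rubric language needs more discussion. Here is a minimal Python sketch of that tally; the function name and vote data are entirely hypothetical illustrations, not part of the training materials.

```python
from collections import Counter

def summarize_votes(votes):
    """Tally clicker responses (performance levels 1-4) and report the
    modal level plus the share of raters who chose it, as a quick check
    of how tightly the group is calibrated."""
    tally = Counter(votes)
    modal_level, modal_count = tally.most_common(1)[0]
    return modal_level, modal_count / len(votes), dict(sorted(tally.items()))

# Hypothetical clicker results for the three clusters (illustration only).
cluster_votes = {
    "Cluster 1": [3, 3, 2, 3, 3, 2, 3],
    "Cluster 2": [2, 3, 3, 4, 3, 2, 3],
    "Cluster 3": [3, 3, 3, 3, 2, 3, 3],
}

for cluster, votes in cluster_votes.items():
    level, share, tally = summarize_votes(votes)
    print(f"{cluster}: modal level {level}, {share:.0%} of raters, spread {tally}")
```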

  28. Effective Feedback

  29. Planning for IRR Implementation • Plan: best practice is a minimum of 30 hours • Rubric: • Understanding Rubric Format • Levels of Performance • Language of Instruction • Evidence: • Multiple Measures • Observable & Non-Observable • Aligning/Coding to Rubric • Fact vs. Opinion • Awareness of Bias

  30. Planning for IRR Implementation • Practice Observations: • Observe Practice (Frequent, Focused, Varied) • Organize Evidence (Remain, Review, Reflect) • Aligning/Coding Evidence to Rubric • Rate Performance • 80% rule • Effective Feedback • Preparing / Sharing model • What is the plan to come together multiple times to re-calibrate throughout the year(s)? • How will you assess the effectiveness of this IRR/Calibration plan?
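
The deck cites an “80% rule” for rating practice without spelling it out; a common reading in calibration training is that pairs of raters should reach at least 80% exact agreement before rating independently. A minimal sketch under that assumption, with hypothetical observer ratings and the 0.80 threshold:

```python
def percent_agreement(rater_a, rater_b):
    """Exact-agreement rate between two raters who scored the same
    components on the 1-4 performance scale."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must rate the same set of components")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical ratings from two observers of the same lesson (8 components).
observer_1 = [3, 2, 3, 3, 4, 2, 3, 3]
observer_2 = [3, 2, 3, 2, 4, 2, 3, 3]

agreement = percent_agreement(observer_1, observer_2)
print(f"Agreement: {agreement:.0%} -> "
      f"{'calibrated' if agreement >= 0.80 else 'keep practicing'}")
```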

  31. Video 2 • Ms. Bannon’s 3rd grade reading lesson: Understanding main idea • CCSS ELA RL.3.2 & ELA.SL.3.2 • Collect evidence statements on your cluster • https://www.teachingchannel.org/videos/3rd-grade-ela-lesson

  32. Video 2 (follow up) • Compare your evidence statements with a cluster partner • With partner, use your rubric and practice coding evidence statements with components/standards/criteria language. • Make sure each item shared is factual evidence • Each cluster shares out at their table • One piece of evidence from each cluster • In whole group, each cluster will share out one piece of evidence. • Add to your notes any factual evidence your table partners shared that you did not have written down

  33. Calibrating • Let’s all look at evidence that aligns with each cluster’s components. • At tables: Based on the evidence we have compiled, talk about the level of performance that should be assigned for each component. • On the following slides, use your clicker to report the ranking for each of the three cluster areas we collected evidence for.

  34. What is the performance level for Cluster 1: 3e, Aj, and 3.1? • 1 • 2 • 3 • 4

  35. Calibrating • Let’s all look at evidence that aligns with each cluster’s components. • At tables: Based on the evidence we have compiled, talk about the level of performance that should be assigned for each component. • On the following slides, use your clicker to report the ranking for each of the three cluster areas we collected evidence for.

  36. What is the performance level for Cluster 2: 3a, Cc, and 3.2? • 1 • 2 • 3 • 4

  37. Calibrating • Let’s all look at evidence that aligns with each cluster’s components. • At tables: Based on the evidence we have compiled, talk about the level of performance that should be assigned for each component. • On the following slides, use your clicker to report the ranking for each of the three cluster areas we collected evidence for.

  38. What is the performance level for Cluster 3: 3b, __, and __? • 1 • 2 • 3 • 4

  39. Wrapping Up • First of many professional learning opportunities • Bend Summit – October 22nd • CCSS Regional Series • Pendleton – October 29 & 29 • Wilsonville – November 4 & 5 • Redmond – November 7 & 8 • Resources • SLG Goal Guidance and samples • Updated Framework, FAQs, “Who is evaluated?” • Toolkit Implementation Web Page www.ode.state.or.us/search/page/?id=3904

  40. Providing Feedback • Summit Evaluation • General Questions/Feedback: ode.evaluation@state.or.us • Thank you for attending today!
