
Educator Evaluation: Research-Informed Practice




Presentation Transcript


  1. Educator Evaluation: Research-Informed Practice Dr. Don Beaudette Rachel Bradshaw Richard Fournier Elizabeth Nolan

  2. Educator Evaluation: Research-Informed Practice Agenda • Student Input in Teacher Evaluation (Rachel Bradshaw) • Instructional Leadership in the Context of Teacher Evaluation (Elizabeth Nolan) • The Evaluation of Special Educators (Richard Fournier)

  3. Educator Evaluation: Research-Informed Practice Part 1 Student Input in Teacher Evaluation

  4. Educator Evaluation: Research-Informed Practice What is worth knowing? • In this class, we learn to correct our mistakes. • This teacher is nice to me when I need help. • We learn a lot in this class. • I look forward to coming to this class. • In this class, we stay busy and do not waste time. • This teacher pushes us to use our thinking skills. • Students do what this teacher wants them to do. • This teacher makes learning interesting. • This teacher wants us to share our ideas. • This is a good teacher.

  5. Educator Evaluation: Research-Informed Practice Are student perceptual data trustworthy? • Yes …with a well-designed survey • Witnesses to behaviors, not judges of skill • Redundant, research-based items • Reliability • Inter-rater: ten or more students • Test-retest: two or more years? • Validity • Convergent: observations, self-reports • Predictive: test scores
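The reliability checks named on this slide can be made concrete with a small calculation. The sketch below computes Cronbach's alpha, a standard internal-consistency statistic for a set of redundant survey items; the slide does not name a specific statistic, so this is an illustrative assumption, and the five-student, three-item score matrix is invented for the example.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal-consistency reliability of a survey.

    scores: 2-D array, rows = respondents (students), columns = items.
    """
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: five students rating three redundant items on a 1-5 scale.
items = np.array([[1, 1, 2],
                  [2, 2, 2],
                  [3, 3, 3],
                  [4, 4, 3],
                  [5, 5, 5]], dtype=float)
print(round(cronbach_alpha(items), 2))  # high alpha: items agree with each other
```

Because the three items move together across students, alpha comes out near 1; items that disagreed would pull it toward 0, signaling that the survey is not measuring one underlying construct.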

  6. Educator Evaluation: Research-Informed Practice What roles can student surveys play among multiple measures? • Monitoring implementation processes • Alignment with Danielson framework • Adding nuance to evaluation • Complementing, not just verifying • Providing “fast and frequent feedback” • Multiple administrations per year • Engaging and empowering students… • …and teachers!

  7. Educator Evaluation: Research-Informed Practice What are best practices for implementation? • Understanding and buy-in • For data as well as for people • Equity and protection for students • Translations, accommodations, proctors • Norming by age, subject, etc. • National data sets, anti-bias algorithms • Focus on improvement • Actionable feedback, professional development

  8. Educator Evaluation: Research-Informed Practice Part 2 Instructional Leadership in the Context of Teacher Evaluation

  9. Educator Evaluation: Research-Informed Practice

  10. Educator Evaluation: Research-Informed Practice Which do you emphasize more in your district?

  11. Educator Evaluation: Research-Informed Practice What is the purpose of teacher evaluation? Accountability • State/District Standards • Retention • Dismissal Improvement • Formative Evaluations • Feedback • Professional Development

  12. Educator Evaluation: Research-Informed Practice What does the research say about observation protocols as a tool for improvement? • Error (Marzano, 2012) • Measurement • Sampling • 3 Types of Lessons • Introductory • Practice and Understanding • Application • Assumptions • Principals are content experts • Generic Instruments

  13. Educator Evaluation: Research-Informed Practice How can observations be more accurate and promote teacher improvement? • “Differentiating” the number of observations • Looking for content expertise • Conferring with teachers • Strategically planning observations • Conducting walk-throughs • Videotaping lessons • Peer Assistance and Review (PAR)

  14. Educator Evaluation: Research-Informed Practice How can a principal make this all work? • Use Technology • TeachPoint • Evernote • Excel • Limit the focus • Ask teachers to do some homework • Pre-conference worksheets • Data • Devote a day • Setting targets and goals

  15. Educator Evaluation: Research-Informed Practice What is instructional leadership in the context of teacher evaluation? • Traditional Idea • Focus on curriculum and instruction • Limited if any improvements in student growth • Contemporary Idea • Focus on organizational management • Growth in student achievement

  16. Educator Evaluation: Research-Informed Practice How can central office administrators support principals as instructional leaders? • Training • Ongoing professional development • Time with staff to focus on the evaluation • Instructional coaching • Credentialing • Provide opportunities for principal feedback on the evaluation process • Allow for flexibility with the number of observations • Allocate funding for support personnel to assist • Provide technology to facilitate observations

  17. Evaluating Special Education Teachers: Current Issues & Recommendations Richard Fournier Doctoral Fellow rfournie@bu.edu

  18. Introduction: Why Special Education?

  19. Evaluating Special Education Teachers: Major Issues • Special education has unique characteristics evaluation models rarely consider • These features challenge current models of evaluating teacher effectiveness: • Diversity in population of students with disabilities • Intensive, individualized instruction found in SPED

  20. Evaluating Special Education Teachers: Major Issues • Problems • Current mixed-method measures (e.g., VAM) are not easily transferable to special education • Current models tend to be “one-size-fits-all” • Current teacher observations are problematic

  21. What does the research say? Rigorous research is limited, but growing Little understanding of what models work or of alternative solutions

  22. What does the research say? Observation of Special Education Teachers Examples: • Danielson’s Framework for Teaching (FFT) • Meant to be comprehensive of all aspects of one’s teaching • Classroom Assessment Scoring System (CLASS) • Focuses on teacher-student interactions • Recent Findings: • Observation scores tend to be unstable • Raters tend to remain high or low in ratings • Consistency varies across observation systems • Not strongly correlated with VAM scores • BIG ISSUE: How do we deal with issues of fairness? Dr. Nathan Jones (2013), Presentation with Office of Special Education Programs (OSEP)

  23. What does the research say? Observation of Special Education Teachers • Practical, Common Issues: • Evaluators unfamiliar with environment • Differences in ideas and definitions of effective teaching • Roles & Responsibilities not captured in protocol Dr. Nathan Jones (2013), Presentation with Office of Special Education Programs (OSEP)

  24. What does the research say? Observation of Special Education Teachers Unique roles and responsibilities Johnson, E., & Semmelroth, C. L. (2014).

  25. What does the research say? New research emerging… What doesn’t work? What can be done to improve existing models? Practical Recommendations for Observations: Ensure that special education teachers are familiar with checklist items OR rubric (e.g., FFT) Have evaluators and teachers select specific domains to work on Encourage and allow time for teachers to explain their lesson plans or actions during evaluation Video exemplars for administrators Provide extra training for administrators evaluating special education teachers

  26. What does the research say? Observation Tool in Development RESET Recognizing Effective Special Education Teachers What will it include? Video component Acknowledge unique roles and settings Scoring on specific instruction strategies Collecting student growth data (still in development) Johnson, E., & Semmelroth, C. L. (2014).

  27. What does the research say? PEER Observation/Review of Special Education Teachers Recommendation to improve observations for special education teachers Inform general education observers/reviewers Document peer observer notes and review Peer observer and teacher identify specific areas Encourage teachers to take feedback seriously

  28. What does the research say? Portfolios for Special Education Teachers May want to consider e-portfolio system Administrators should review often (with teachers) Provide model portfolios & handbook Enlist special education teacher/administrator to review portfolios with main reviewer

  29. Educator Evaluation: Research-Informed Practice What does the research say? Dynamic Instructional Practices Portfolio Tool: MyiLOGS (My Instructional Learning Opportunities Guidance System)

  30. Educator Evaluation: Research-Informed Practice Lastly . . . • Recognize professionalism • Incorporate Research • Where? Regional Educational Laboratory (Northeast & Islands) www.relnei.org

  31. Educator Evaluation: Research-Informed Practice Free Research Resources! Ask A REL is a free reference desk service providing brief responses to your education-related questions.

  32. Free Research Resources! • Education Resources Information Center (ERIC) • Free access to > 1 million records of journal articles • Often has full text! (http://www.eric.ed.gov/) • What Works Clearinghouse (WWC) • Free, easily accessible databases and reports • Provides high-quality reviews of educational interventions (programs, products, practices, policies) • Doing What Works (DWW) • Free; helps educators identify and use effective teaching practices • Provides examples of ways educators might apply research
