
Central Michigan University Educator Evaluation Discussion

Presentation Transcript


  1. Central Michigan University Educator Evaluation Discussion Jennifer S. Hammond, Ph.D., Grand Blanc High School Principal; Michigan Association of Secondary School Principals Past President; Michigan Council for Educator Effectiveness Member

  2. Objectives: • Review National Reform on Educator Evaluations • Review changes to Michigan law regarding evaluations • Discuss the charge of the Michigan Council for Educator Effectiveness (MCEE) • Summarize the MCEE Interim Report • Review the 2012-13 Pilot

  3. National Council on Teacher Quality • Teacher quality is the most important school-level variable in student achievement. • Recognition that increasing teacher quality is key to raising student achievement. • Specific emphasis on teacher effectiveness.

  4. The Effect of Teacher Quality (Sanders and Rivers, 1996: Cumulative and Residual Effects of Teachers on Future Student Academic Achievement)

  5. Implementation Issues • Timeline • State Data System • Training • Teachers • Principals/Other Evaluators • Validation • Funding • Rewards/Consequences

  6. The Education Trust Whether schools are charters or traditional public schools, several features distinguish the high performers from all the rest. They don’t leave anything about teaching and learning to chance. An awful lot of our teachers—even brand new ones—are left to figure out on their own what to teach and what constitutes “good enough” work. www.edtrust.org

  7. The Education Trust The Widget Effect: “When it comes to measuring instructional performance, current policies and systems overlook significant differences between teachers. There is little or no differentiation of excellent teaching from good, good from fair, or fair from poor. This is the Widget Effect: a tendency to treat all teachers as roughly interchangeable, even when their teaching is quite variable. Consequently, teachers are not developed as professionals with individual strengths and capabilities, and poor performance is rarely identified or addressed.” • The New Teacher Project, 2009 www.edtrust.org

  8. www.edtrust.org

  9. Michigan Public Act 102 of 2011 (HB 4627) • Evaluations • Evaluations must occur annually, take place at the end of the year, and be based on “multiple” observations rather than on at least two • Districts MUST implement the rating system by September 10, 2011: • highly effective • effective • minimally effective • ineffective • Beginning with the 2013-2014 school year, the evaluation system for teachers and administrators must be based largely on student growth and assessment data

  10. Student Growth (minimum share of the annual evaluation based on student growth and assessment data): • 2013-2014 school year, at least 25% • 2014-2015 school year, at least 40% • 2015-2016 school year, at least 50%
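A minimal sketch of how these statutory minimums could combine a growth score with an observation-based practice score into a composite rating; the 0-100 scale, function name, and use of the complementary weight for practice are illustrative assumptions, not part of PA 102.

GROWTH_WEIGHT_BY_YEAR = {
    "2013-2014": 0.25,
    "2014-2015": 0.40,
    "2015-2016": 0.50,
}

def composite_score(growth_score: float, practice_score: float, school_year: str) -> float:
    """Weight a 0-100 growth score against a 0-100 practice score (hypothetical scale)."""
    w = GROWTH_WEIGHT_BY_YEAR[school_year]  # statutory minimum share for student growth
    return w * growth_score + (1 - w) * practice_score

# Example: growth score 72, observed practice 85, in 2014-2015.
print(composite_score(72, 85, "2014-2015"))  # 0.40 * 72 + 0.60 * 85 = 79.8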

  11. Student Growth: • The annual year-end evaluation shall be based on the student growth and assessment data for the most recent three consecutive school years. • If such data are not available for a teacher for at least three school years, the annual year-end evaluation shall be based on all assessment data that are available for the teacher.

  12. Note: • Observations • The manner in which observations are conducted shall be prescribed in the evaluation tool • Shall include a review of the lesson plan • Shall include the state curriculum standard being used in the lesson • Shall include a review of pupil engagement in the lesson • An observation does not have to cover an entire class period • Multiple observations per year are required for those rated below effective

  13. Michigan Public Act 102 of 2011 (HB 4627) Ineffective Ratings • Beginning with the 2015-16 school year, a board must notify the parent of a student assigned to a teacher who has been rated ineffective on his or her two most recent annual year-end evaluations. • Any teacher or administrator who is rated ineffective on three consecutive annual year-end evaluations must be dismissed from employment.

  14. Note: Districts are not required to comply with the Governor’s teacher/administrator evaluation tools if they have an evaluation system that: • Bases its most significant portion on student growth and assessment data • Uses research-based measures to determine student growth • Factors teacher effectiveness ratings, as measured by student achievement and growth data, into teacher retention, promotion, and termination decisions • Uses teacher/administrator results to inform professional development for the succeeding year • Ensures that teachers/administrators are evaluated annually Districts must notify the Governor’s Council of the exemption by November 1st.

  15. Michigan Council for Educator Effectiveness (MCEE) • Appointed by Governor Snyder: • Deborah Ball • Mark Reckase • Nick Sheltrown • Appointed by Senate Majority Leader: • David Vensel • Appointed by Speaker of the House: • Jennifer Hammond • Appointed by Superintendent of Public Instruction: • Joseph Martineau

  16. Michigan Council for Educator Effectiveness (MCEE) Advisory committee appointed by the Governor • Provide input on the Council’s recommendations • Teachers, administrators, parents

  17. Michigan Council for Educator Effectiveness (MCEE) • No later than April 30, 2012, the Council must submit: • A student growth and assessment tool • A value-added model • Measures growth in core areas and other areas • Complies with laws for students with disabilities • Has at least a pre- and post-test • Can be used with students of varying ability levels

  18. Michigan Council for Educator Effectiveness (MCEE) • No later than April 30, 2012, the Council must submit: • A state evaluation tool for teachers (general and special education teachers) • Including instructional leadership abilities, attendance, professional contributions, training, progress reports, school improvement progress, peer input, and pupil and parent feedback • The Council must seek input from local districts

  19. Michigan Council for Educator Effectiveness (MCEE) • No later than April 30, 2012, the Council must submit: • A state evaluation tool for administrators • Including attendance, graduation rates, professional contributions, training, progress reports, school improvement plan progress, peer input, and pupil/parent feedback • Recommended changes to the requirements for the professional teaching certificate • A process for evaluating and approving local evaluation tools

  20. Interim Report Vision Statement: The Michigan Council for Educator Effectiveness will develop a fair, transparent, and feasible evaluation system for teachers and school administrators. The system will be based on rigorous standards of professional practice and of measurement. The goal of this system is to contribute to enhanced instruction, improved student achievement, and ongoing professional learning.

  21. Interim Report, Teacher Evaluation: Observation Tool Selection Criteria • Alignment with State Standards • Instruments describe practice and support teacher development • Rigorous and ongoing training program for evaluators • Independent research to confirm validity and reliability • Feasibility

  22. Interim Report, Teacher Evaluation: Observation Tool Systems • Marzano Observation Protocol* • Thoughtful Classroom* • Five Dimensions of Teaching and Learning* • Framework for Teaching* • Classroom Assessment Scoring System • TAP

  23. Interim Report, Teacher Evaluation: Observation Tool Lessons Learned from Other States: • A pilot is essential • Phasing in • Number of observations • Other important components

  24. Interim Report, Teacher Evaluation: Observation Tool Challenges • Being fiscally responsible • Ensuring fairness and reliability • Assessing the fidelity of protocol implementation • Determining the equivalence of different instruments

  25. Interim Report, Teacher Evaluation: Student Growth Model • Recognize that student growth can give insight into teacher effectiveness • Admit that “student growth” is not clearly defined • Descriptions of growth vary and include: • Tests • Analytic techniques for scoring • Measures of value-added modeling • Simple vs. complex statistics • VAM
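The slide names value-added modeling (VAM) without defining it. Below is a minimal illustrative sketch of the core idea, regressing a post-test on a pre-test and averaging each teacher's residuals, using made-up data; operational VAMs add student covariates, multiple years, and shrinkage, and nothing here reflects the MCEE's actual model.

import numpy as np

# Made-up scores for six students taught by two hypothetical teachers.
pre     = np.array([48.0, 55.0, 62.0, 40.0, 70.0, 58.0])
post    = np.array([52.0, 60.0, 70.0, 41.0, 78.0, 66.0])
teacher = np.array(["A", "A", "A", "B", "B", "B"])

# Ordinary least squares fit of post = b0 + b1 * pre.
X = np.column_stack([np.ones_like(pre), pre])
coef, *_ = np.linalg.lstsq(X, post, rcond=None)
residuals = post - X @ coef

# A teacher's "value added" is the mean residual of that teacher's students.
for t in np.unique(teacher):
    print(t, round(float(residuals[teacher == t].mean()), 2))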

  26. Interim Report, Teacher Evaluation: Student Growth Model Challenges • Measurement error in standardized and local measurements • Balancing fairness toward educators with fairness toward students • Non-tested grades and subjects • Tenuous roster connections between students and teachers • Number of years of data

  27. Considerations for Student Growth Measures • Should the State evaluation data (i.e., MEAP, MME, etc.) be the only source of student growth data? Why or why not? • Should local student growth models be allowed? Why or why not? • If you agree that multiple measures should be allowed, what percentage would you give each of the multiple measures? For example, if educators are permitted to use MME data, a local tool such as an end-of-course assessment, and a personally developed measure, how should those three measures be weighted? • How should we measure teachers in non-tested subjects such as band or auto mechanics?
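One hypothetical way to approach the weighting question above: rescale each measure onto a common 0-1 range, then take a weighted average with weights that sum to 1. The measure names, scales, and weights below are assumptions for illustration only, not anything recommended by the MCEE or required by PA 102.

def normalize(raw: float, lo: float, hi: float) -> float:
    """Rescale a raw score onto a common 0-1 range."""
    return (raw - lo) / (hi - lo)

# name: (raw score, scale minimum, scale maximum, weight) -- all hypothetical
measures = {
    "MME growth":                (62.0, 0.0, 100.0, 0.50),
    "End-of-course assessment":  (3.2,  1.0,   4.0, 0.30),
    "Teacher-developed measure": (78.0, 0.0, 100.0, 0.20),
}

combined = sum(normalize(raw, lo, hi) * w for raw, lo, hi, w in measures.values())
print(round(combined, 3))  # weights sum to 1, so the combined score stays in 0-1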

  28. Pilot for 2012-2013 • 14 school districts • Pilot the teacher observation tool • Pilot the administrator evaluation tool • Train evaluators • Provide information on validity • Gather feedback from teachers and principals • 4 observation tools • Student growth model/VAM pilot

  29. Pilot Districts by Observation Tool • 5 Dimensions of Teaching and Learning: Clare, Leslie, Marshall, Mt. Morris • Framework for Teaching: Garden City, Montrose, Port Huron • Marzano Evaluation Framework: Big Rapids, Farmington, North Branch • The Thoughtful Classroom: Cassopolis, Gibraltar, Harper Creek, Lincoln

  30. Pilot for 2012-2013 Questions to be answered about observational tool: • Ratings of teachers • Satisfaction with tool • Adequate training • Correlation between observation tool and student growth

  31. Pilot for 2012-2013 Testing Protocol • NWEA: grades K-2 and 3-6 • Explore: grades 7 and 8 • PLAN: grades 9 and 10 • ACT: grades 11 and 12 Value-Added Modeling • Sample size • State-wide data collection tool • Vendors

  34. Interim Report: Timeline for MCEE Recommendations

  35. Jennifer S. Hammond, Ph.D. jhammond@grandblancschools.org (810) 591-6637 Educator Evaluation Discussion www.mcede.org
