
Michigan’s Council on Educator Effectiveness



Presentation Transcript


  1. Michigan’s Council on Educator Effectiveness Educator Evaluation Discussion

  2. Objectives: • Discuss the charge of the Michigan Council for Educator Effectiveness (MCEE) • Summarize the MCEE Interim Report • Provide an Overview of the Pilot

  3. Michigan Public Act 102 of 2011 (HB 4627) • Evaluations • Evaluations must occur annually, must take place at the end of the school year, and must be based on "multiple" classroom observations rather than at least two • Districts MUST implement the rating system by September 10, 2011: • highly effective, • effective, • minimally effective, or • ineffective • Beginning with the 2013-2014 school year, the evaluation system for teachers and administrators must be based largely on student growth and assessment data

  4. Student Growth: • 2013-2014 school year: at least 25% of the annual evaluation • 2014-2015 school year: at least 40% • 2015-2016 school year: at least 50%

  5. Student Growth: • The annual year-end evaluation shall be based on the student growth and assessment data for the most recent 3 consecutive school years. • If data are not available for a teacher for at least 3 school years, the annual year-end evaluation shall be based on all assessment data that are available for the teacher.

  6. Note: • Observations • Shall include a review of the lesson plan • Shall include the state curriculum standard being used in the lesson • Shall include a review of pupil engagement in the lesson • An observation does not have to last the entire class period

  7. Michigan Public Act 102 of 2011 (HB 4627) • Ineffective Ratings • Beginning with the 2015-16 school year, a board must notify the parent of a student assigned to a teacher who has been rated ineffective on his or her two most recent annual year-end evaluations. • Any teacher or administrator who is rated ineffective on three consecutive annual year-end evaluations must be dismissed from employment.

  8. Note: • Districts are not required to comply with the Governor's teacher/administrator evaluation tools if they have an evaluation system that: • Bases the most significant portion on student growth and assessment data • Uses research-based measures to determine student growth • Factors teacher effectiveness ratings, as measured by student achievement and growth data, into teacher retention, promotion, and termination decisions • Uses teacher/administrator results to inform professional development for the succeeding year • Ensures that teachers/administrators are evaluated annually • Districts must notify the Governor's Council of the exemption by November 1st

  9. Michigan Council for Educator Effectiveness (MCEE) • Appointed by Governor Snyder: • Deborah Ball • Mark Reckase • Nick Sheltrown • Appointed by Senate Majority Leader: • David Vensel • Appointed by Speaker of the House: • Jennifer Hammond • (Non-voting Member) Appointed by Superintendent of Public Instruction: • Joseph Martineau

  10. Michigan Council for Educator Effectiveness (MCEE) Advisory committee appointed by the Governor • Provide input on the Council's recommendations • Teachers, administrators, parents

  11. Michigan Council for Educator Effectiveness (MCEE) • No later than April 30, 2012, the Council must submit: • A student growth and assessment tool • A value-added model • Measures growth in core areas and other areas • Complies with laws for students with disabilities • Has at least a pre- and post-test • Can be used with students of varying ability levels

  12. Michigan Council for Educator Effectiveness (MCEE) • No later than April 30, 2012, the Council must submit: • A state evaluation tool for teachers (general and special education teachers) • Including instructional leadership abilities, attendance, professional contributions, training, progress reports, school improvement progress, peer input, and pupil and parent feedback • The Council must seek input from local districts

  13. Michigan Council for Educator Effectiveness (MCEE) • No later than April 30, 2012, the Council must submit: • A state evaluation tool for administrators • Including attendance, graduation rates, professional contributions, training, progress reports, school improvement plan progress, peer input, and pupil/parent feedback • Recommended changes to the requirements for the professional teaching certificate • A process for evaluating and approving local evaluation tools

  14. Interim Report Vision Statement: The Michigan Council for Educator Effectiveness will develop a fair, transparent, and feasible evaluation system for teachers and school administrators. The system will be based on rigorous standards of professional practice and of measurement. The goal of this system is to contribute to enhanced instruction, improved student achievement, and ongoing professional learning.

  15. Interim Report Teacher Evaluation: Observation Tool Selection Criteria • Alignment with State Standards • Instruments describe practice and support teacher development • Rigorous and ongoing training program for evaluators • Independent research to confirm validity and reliability • Feasibility

  16. Interim Report Teacher Evaluation: Observation Tool Systems • Marzano Observation Protocol • Thoughtful Classroom • Five Dimensions of Teaching and Learning • Framework for Teaching • Classroom Assessment Scoring System • TAP

  17. Interim Report Teacher Evaluation: Observation Tool Lessons Learned • A pilot is essential • Phasing in • Number of observations • Other important components

  18. Interim Report Teacher Evaluation: Observation Tool Challenges • Being fiscally responsible • Ensuring fairness and reliability • Assessing the fidelity of protocol implementation • Determining the equivalence of different instruments

  19. Interim Report Teacher Evaluation: Student Growth Model • Recognize that student growth can give insight into teacher effectiveness • Admit that "student growth" is not clearly defined • Descriptions of growth vary and include: • Tests • Analytic techniques for scoring • Value-added modeling (VAM) • Simple vs. complex statistics (see the sketch below)
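Because the interim report leaves "student growth" open-ended, the following is only a minimal sketch contrasting a simple gain-score measure with a crude value-added-style measure. The teacher names, student records, and scores are hypothetical and are not drawn from the report; real VAMs use multiple years of data and far richer statistical models.

from statistics import mean

# Hypothetical student records: (teacher, pre-test score, post-test score)
records = [
    ("Teacher A", 62, 71), ("Teacher A", 55, 66), ("Teacher A", 78, 82),
    ("Teacher B", 60, 64), ("Teacher B", 72, 75), ("Teacher B", 50, 58),
]

# 1) Simple growth: average (post - pre) gain per teacher.
gains = {}
for teacher, pre, post in records:
    gains.setdefault(teacher, []).append(post - pre)
simple_growth = {t: mean(g) for t, g in gains.items()}

# 2) Crude value-added-style measure: fit post = a + b * pre across all
#    students, then average each teacher's residuals (actual - predicted).
pres = [pre for _, pre, _ in records]
posts = [post for _, _, post in records]
pre_bar, post_bar = mean(pres), mean(posts)
b = sum((x - pre_bar) * (y - post_bar) for x, y in zip(pres, posts)) \
    / sum((x - pre_bar) ** 2 for x in pres)
a = post_bar - b * pre_bar

residuals = {}
for teacher, pre, post in records:
    residuals.setdefault(teacher, []).append(post - (a + b * pre))
value_added = {t: mean(r) for t, r in residuals.items()}

print("Average gain per teacher:", simple_growth)
print("Average residual per teacher (value-added style):", value_added)

The point of the contrast is the "simple vs. complex statistics" bullet above: the gain score ignores where students started, while the residual-based measure adjusts for prior achievement before crediting a teacher with growth.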

  20. Interim Report Teacher Evaluation: Student Growth Model • New York: 40% state assessments w/ local assessments approved by the State • Arizona: 33% - 50% of evaluation; locals determine multiple measures; tested subject areas must use State data for one measure • Colorado: 50% growth on state assessments and "other" measures for non-tested subjects • Delaware: 50% school-wide assessment measure based on State assessment (30%) and student cohort assessment (20%) • Florida: 50% on State assessments for teachers in tested subjects (40% if less than 3 years of data) and 30% on State assessment for teachers in non-tested subjects

  21. Interim Report Teacher Evaluation: Student Growth Model Challenges • Measurement error in standardized and local measurements • Balancing fairness toward educators with fairness toward students • Non-tested grades and subjects • Tenuous roster connections between students and teachers • Number of years of data

  22. Questions Surrounding Student Growth Measures… Question #1: Should the State evaluation data (e.g., MEAP, MME) be the only source of student growth data? Why or why not? Question #2: Should local student growth models be allowed? Why or why not? Question #3: If you agree that multiple measures should be allowed, what percentage would you give each of the multiple measures? • For example, if educators are permitted to use MME data, a local tool such as an end-of-course assessment, and a personally developed measure, how should those three measures be weighted? (See the illustration below.) Question #4: How should we measure teachers in non-tested subjects such as band or auto mechanics?
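Question #3 asks how several growth measures might be weighted. The snippet below is a hypothetical illustration of one way to combine three measures into a single weighted growth component; the measure names, scores, and weights are invented for the example and are not MCEE recommendations.

# Hypothetical illustration: combining three student growth measures into one
# weighted component. Measure names, scores, and weights are invented here.

measures = {
    "MME growth": 0.70,                 # state assessment growth, 0-1 scale
    "End-of-course assessment": 0.55,   # local assessment growth, 0-1 scale
    "Teacher-developed measure": 0.80,  # personally developed measure, 0-1 scale
}

weights = {
    "MME growth": 0.50,
    "End-of-course assessment": 0.30,
    "Teacher-developed measure": 0.20,
}

assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"

composite = sum(score * weights[name] for name, score in measures.items())
print(f"Weighted student growth component: {composite:.2f}")

Under these invented weights the state assessment dominates the growth component, which is the trade-off Question #3 is asking districts to consider.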

  23. Interim Report Pilot for 2012-2013 • 12 school districts • Pilot the teacher observation tool • Pilot the administrator evaluation tool • Train evaluators, principals, and teachers • Provide information on validity • Gather feedback from teachers and principals • 3 observation tools • Student growth model/VAM pilot

  24. Interim Report Timeline for MCEE Recommendations

  25. New Website! www.mcede.org
