
Educator Evaluation Spring Convening Connecting Policy, Practice and Practitioners

Educator Evaluation Spring Convening Connecting Policy, Practice and Practitioners. May 28-29, 2014 Marlborough, Massachusetts. Our goal. Our core strategies. Prepare all students for success after high school by: Improving educator effectiveness





Presentation Transcript


  1. Educator Evaluation Spring Convening: Connecting Policy, Practice and Practitioners May 28-29, 2014 Marlborough, Massachusetts

  2. Our goal

  3. Our core strategies Prepare all students for success after high school by: • Improving educator effectiveness • Strengthening curriculum, instruction, and assessment • Turning around the lowest performing districts and schools • Using data and technology to support student performance

  4. Educator Evaluation Team: Three Key Strategies • Teach: Teaching the components of the Educator Evaluation framework and sharing implementation resources to build capacity within districts and schools. • Learn: Learning from and with educators about their successes, challenges, and needs to ensure educator voices are reflected in Educator Evaluation policies and practices. • Connect: Connecting and aligning Educator Evaluation implementation with other state and district initiatives to improve professional growth and student learning; creating opportunities for educators to connect and share with one another and ESE.

  5. Welcome! Who’s here today? • More than half of MA districts attending (n~235) • Over 1,000 educators: • District Administrators, • School Administrators, • Teachers and local union leaders, • Specialized Instructional Support Personnel, and • Collaborative Leaders and Staff • ESE Staff

  6. Implementation Milestones (past) • June 2010: Educator Evaluation Task Force convened multiple times over 6 months • June 2011: MA Board of Education passed new educator evaluation regulations • September 2011: Implementation began in 347 Level 4 schools, 11 Early Adopter districts, and 4 special education collaboratives • September 2012: Implementation began in all RTTT districts

  7. Implementation Milestones (present) • September 2013: Implementation began in all Non-RTTT districts • September 2013: All districts began piloting District Determined Measures • April 2014: Model Student and Staff Survey Pilots completed

  8. Goals of the System • Promote growth and development; • Place student learning at the center; • Set a high bar for professional teaching status; • Shorten timelines for improvement; • Recognize excellence.

  9. Expanding our Understanding of the Problem

  10. How are we doing? Spring 2014 Teacher Survey on Educator Evaluation • Almost all respondents have experienced at least some parts of the system. • 70%-80% agree that they have received sufficient training on the various parts of the process. • 87.9% agree or strongly agree that their evaluator’s assessment of performance is fair. • Among those teachers evaluated last year, 82.7% think the ratings they received last year were fair.

  11. How are we doing? Spring 2014 Teacher Survey on Educator Evaluation • There is still some anxiety: Less than half the respondents (43.0%) think the new system provides a fair process and about half (49.5%) feel anxious about their evaluator’s assessment of their performance. 63.5% feel more anxious this year because of the educator evaluation system. • Only about a quarter (27.2%) think that compared to the prior system, the new system enables educators to better distinguish between exceptional, capable, and weak educator practice, and only about a third (32.1%) think that compared to the prior system, the new evaluation system provides educators with more meaningful feedback. • 81.4% think that the feedback they receive from their evaluator is timely and 72% reported that the feedback is helpful. • Most of those who were evaluated last year agree that they feel more knowledgeable and informed about the process this year.

  12. Ed Eval – Original Timeline (in Regulations)

  13. Ed Eval – Revised Timeline: Extension by Exception (Massachusetts Department of Elementary and Secondary Education)

  14. Educator Evaluation Results – State 2012-2013

  15. Educator Evaluation Results – State 2012-2013

  16. Educator Evaluation Results – State 2012-2013

  17. Why Look at SGP? • Examine where there are similarities and differences between SGP and evaluation results • SGP is not the sole determinant in an educator’s evaluation • However, large differences would signal to the state and districts that there might be a need for additional training and calibration • A document on the Educator Evaluation website explains these uses: http://www.doe.mass.edu/edeval/ddm/GrowthPercentiles.pdf

  18. Educator Evaluation Results vs. SGP: State Results

  19. Educator Evaluation Results vs. SGP: State Results

  20. Educator Evaluation Results vs. SGP: State Results

  21. Educator Evaluation Results vs. SGP: State Results

  22. Educator Evaluation Results vs. SGP: State Results

  23. Integration of Initiatives You said ESE was not linking the implementation of the Curriculum Frameworks and Ed Eval, so we listened and produced integrated support: • Educator Evaluation and Curriculum Frameworks Quick Reference Guide • Ed Eval and Professional Development Quick Reference Guide • Using Current Assessments in DDMs Guidance Document • 2013-14 Curriculum Summit – Curriculum-Embedded Performance Assessments (CEPAs) and DDMs • Professional Practice Innovation Grant

  24. Stakeholder Engagement Including, but not limited to: • Superintendents Advisory Council • Principal Dialogue Tours • Principal Cabinets • Educator Effectiveness Teacher Cabinet • State Student Advisory Council

  25. Educator Evaluation Spring Convening: Connecting Policy, Practice, and Practitioners • Today we will focus on four key areas: • District Determined Measures (DDMs) • Evaluator Calibration • Student and Staff Feedback • Professional Development
