
Educator Evaluation Systems & Effectiveness Labels


Presentation Transcript


  1. Educator Evaluation Systems & Effectiveness Labels Venessa Keesler, Ph.D. Office of Evaluation, Strategic Research and Accountability Michigan Department of Education and Carla Howe West Virginia Department of Education

  2. Overview of Current Plan and Issues • Key messages: • This was the FIRST YEAR; there were 800+ different systems (we have data to show this) • Districts did MASSIVE amounts of work to accomplish this • We do not believe that huge numbers of MI teachers are ineffective

  3. Current Circumstances Our current legislation has allowed for local systems of evaluation, which has given districts flexibility to design systems that work best for them. • Over 800 systems across the state • Varying degrees of implementation across the state Public reporting of effectiveness labels is required by SFSF (the State Fiscal Stabilization Fund) • Released in November via mischooldata.org • Teacher labels reported in aggregate by school (number of teachers in each of the four categories) • Principals/administrators reported at the district level.

  4. Important Context for the 2011-12 Results • First year of implementation of NEW systems based on student growth measures • State-provided student growth measures are only available in grades 4-8 for reading and mathematics • Varying components across systems (i.e., between districts) • Varying percentages of growth across systems (i.e., between districts) • Some districts were on a prior contract (i.e., no new system, but reporting labels was still required)

  5. K-12 Educator Evaluation Survey • 792 districts completed the survey about their evaluation systems from April to August • Completion was required under SFSF • Results provide valuable insight into local systems • The types of frameworks used • The % of student growth as a component (the law states “significant,” but the term is not defined until 2013-14) • Types of growth measures included • Types of decisions informed by the results of evaluations

  6. PRELIMINARY/DRAFT FINDINGS [Chart: frameworks used, in # of districts; callout: 50% of reporting districts] • 54 districts with a prior contract did not have to incorporate growth or a new system in 2011-12 • Other frameworks reported include: Charlotte Danielson Framework AND a local component, Teacher Advancement Program, My Learning Plan, 5 Dimensions of Teaching and Learning, local district or ISD framework, McREL, STAGES, Kim Marshall Rubrics

  7. [Chart: # of districts] Appropriate given the FIRST year of local evaluation systems

  8. [Chart: # of districts] Other ways growth data are measured include: a combination of data from multiple assessments, pre/post-test data, a combination of local, state, and national measures, benchmark testing, and several sources as agreed upon in the professional growth plan

  9. [Chart: # of districts] Other types of assessment data reported that factor into educator evaluations include: AIMSweb, DRA, Ed Performance Series, Fountas & Pinnell, STAR Reading and Math, CBM for Math, DELTA Math

  10. [Chart: # of districts] Other types of assessment data reported that factor into educator evaluations include: AIMSweb, DRA, Ed Performance Series, Fountas & Pinnell, STAR Reading and Math, TerraNova, ITBS, DELTA Math

  11. [Chart: # of districts] Other types of assessment data reported that factor into educator evaluations include: AIMSweb, Ed Performance Series, STAR Reading and Math, Study Island

  12. [Chart: # of districts] Other types of assessment data reported that factor into educator evaluations include: common assessments, district benchmark assessments, Scantron Performance Series PRELIMINARY/DRAFT FINDINGS

  13. [Chart: # of districts] Other types of decisions include: assignment to committees or roles beyond the classroom, classroom support and assistance, layoff/recall/transfer, mentoring, staff placement, scheduling, setting improvement goals, merit pay

  14. Other Factors Reported As Part of Evaluations PRELIMINARY/DRAFT FINDINGS

  15. Overview of Statewide Results Understanding educator evaluation labels in MI

  16. Caveat… • Labels are not EQUAL across districts • However, we know that people will want this type of analysis and we want it done appropriately

  17. Statewide Results • IMPORTANT NOTES: • Based on the labels as determined by the local evaluation system; the rigor of label designation is not consistent across districts • There IS differentiation in label reporting: 22% of teachers are now reported as “highly effective,” moving away from a satisfactory/unsatisfactory system • We do not believe that 1% of teachers labeled as “ineffective” is unreasonable in the first year

  18. Impact of Growth • The law required districts to implement systems based in “significant part” on student growth • How do the labels look different when districts used growth in greater percentages?

  19. Growth and Evaluation Labels • More differentiation in labels when growth counts at a higher rate • LESS differentiation without growth

  20. Distribution of Labels By Percent of Evaluation Based on Growth
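A minimal sketch of the kind of tabulation behind a chart like this one: group teachers by the share of their evaluation based on growth, then tabulate the label distribution within each band. The data frame, the column names (growth_pct, label), and the band cutoffs are illustrative assumptions, not details taken from the presentation.

    import pandas as pd

    def label_distribution_by_growth(df: pd.DataFrame) -> pd.DataFrame:
        """Cross-tabulate effectiveness labels against growth-percentage
        bands, with each band's row normalized to percentages."""
        # Band cutoffs are illustrative; the slides do not specify them.
        bands = pd.cut(
            df["growth_pct"],
            bins=[0, 10, 25, 40, 100],
            labels=["<10%", "10-25%", "25-40%", ">40%"],
            include_lowest=True,
        )
        return pd.crosstab(bands, df["label"], normalize="index").mul(100).round(1)

    # Toy example with fabricated rows (not real Michigan data):
    toy = pd.DataFrame({
        "growth_pct": [5, 5, 30, 30, 50, 50],
        "label": ["effective", "effective", "highly effective",
                  "effective", "highly effective", "minimally effective"],
    })
    print(label_distribution_by_growth(toy))

Each row of the output shows, for one growth band, the percent of teachers receiving each label, which is the comparison the slide's chart makes.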

  21. Key Takeaways • Distribution of labels (i.e., the number of teachers in each category): • Is appropriate in Year 1 of implementation • Reflects differentiation (especially highly effective vs. effective) • BUT we also see that systems using higher proportions of growth are able to make those differentiations more accurately • The statewide evaluation system will move us toward more growth measures at higher rates

  22. Who is more likely to be rated as highly effective or effective? Teachers more likely to appear in the highly effective category (versus the other three) and in the effective category (versus the other two): • Female teachers • Those with more time in the same district • Teachers with a professional certificate (as opposed to all others) • Those with a master’s degree or higher • Teachers in districts with growth over 40% in their system

  23. Who is less likely to be rated as effective or highly effective? • Older teachers • New teachers (those in their first year of teaching) • Mathematics, science, social science, special education, and world language teachers (relative to elementary teachers) • Teachers in systems where growth is less than 10% of the evaluation system
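Findings like those on the last two slides are typically estimated with something like a logistic regression of a 0/1 label indicator on teacher characteristics. The sketch below is one plausible setup, assuming a hypothetical teacher-level table with 0/1 columns (female, professional_cert, first_year, and so on); the slides do not specify the actual model or data.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    def highly_effective_odds_ratios(df: pd.DataFrame) -> pd.Series:
        """Fit a logistic regression of a 0/1 highly-effective indicator
        on illustrative covariates; return odds ratios."""
        result = smf.logit(
            "highly_effective ~ female + years_in_district + professional_cert"
            " + masters_or_higher + growth_over_40pct + first_year + C(subject)",
            data=df,
        ).fit(disp=False)
        return np.exp(result.params).round(2)

Covariates with odds ratios above 1 would correspond to the “more likely” bullets, and those below 1 to the “less likely” ones.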

  24. Relationship between effectiveness labels and Priority/Focus/Reward • Important to remember: • A school-level designation does not mean that all teachers within that school are in a given level of effectiveness • Example: In a Priority School, there will be effective teachers as well as ineffective teachers

  25. Effectiveness Labels in Priority, Focus and Reward Schools Notes: Significantly more teachers are reported as ineffective or minimally effective in Priority Schools than statewide and than in Focus or Reward Schools.
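A claim that one group's rate is “significantly” higher implies a statistical comparison of proportions. A minimal sketch of such a test, using placeholder counts rather than the actual figures (which the slide does not give):

    from statsmodels.stats.proportion import proportions_ztest

    # Teachers labeled ineffective or minimally effective, and totals,
    # for Priority schools vs. statewide. All four numbers are placeholders.
    counts = [120, 980]
    totals = [2000, 95000]

    stat, p_value = proportions_ztest(counts, totals)
    print(f"z = {stat:.2f}, p = {p_value:.4f}")  # small p => rates differ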

  26. Key Takeaways from the Results • These results are reasonable for the first year and represent a huge effort on the part of districts • There is differentiation in the system; there will be more as growth becomes a larger component; but we still do not believe large numbers of Michigan teachers are “ineffective”

  27. Questions? Contact Michigan Department of Education Office of Evaluation, Strategic Research and Accountability 877-560-8378 Option 6

  28. From Statehouse to School House: Delaware’s Statewide Evaluation System

  29. Delaware Performance Appraisal System (DPAS II) • Pre- and post-conference meetings • Evidence-based formative and summative reports • Professional-responsibility teacher self-reflection component • Student improvement and growth targets • Supports for improving teacher performance through expectations and individual improvement plans

  30. DPAS II Legislation: Title 14 Education, 100 Accountability, 106 Teacher Appraisal Process • 2.0 Definitions • 3.0 Appraisal Cycles • 4.0 DPAS II Guide for Teachers • 5.0 Appraisal Components and Appraisal Criteria • 6.0 Summative Evaluation Ratings • 7.0 Pattern of Ineffective Teaching Defined • 8.0 Improvement Plan • 9.0 Challenge Process • 10.0 Evaluator Credentials • 11.0 Evaluation of Process • 12.0 Effective Date

  31. DPAS II: 5 Components

  32. Component 3: Instruction [Diagram: component → criteria → elements]

  33. DPAS IIr [Slide graphic listing what DPAS IIr is NOT; individual items not recoverable]

  34. [Diagram: Observed Lesson → Recommendations → Beyond Expectations / Expectations / Improvement Plan] Typically you will use expectations to improve teacher performance. However, there may be situations that require that an IIP (Individual Improvement Plan) be implemented.

  35. Department of Education Monitoring • RTTT (Race to the Top) – Development Coach Project • EMS – Electronic Management System • Audit of Districts

  36. Audit Tool: DPAS-II Monitoring Summary 2012-2013 [Form fields: ___________ District, Rating:]

  37. Development Coach RTTT Project • 63 participating schools representing all 19 Delaware school districts. • Supported over 140 principals, assistant principals, and district expert evaluators. • Reviewed and provided feedback on over 2,000 formative and summative evaluation reports for school administrators. • Participated in hundreds of pre- and post-conferences, observations, and walkthroughs with school administrators. • Designed strategies to enhance the effectiveness of the DPAS II process with school administrators and teachers.

  38. Development Coach RTTT Project • Gained expertise in understanding DPAS II regulations, evidence-based technical writing, rubric scoring, and teacher levels of performance. • Developed relationships with their principals and the districts. • Provided countywide and school-district DPAS II training. • Supported “deeper and richer” conversations among school administrators with an instructional focus. • Increased the level and degree of accountability with participating school administrators.

  39. Development Coach • Worked three hours a week with principals and assistant principals. • Provided support for the implementation of the revised DPAS process. • Accompanied the principal during pre-conferences, observations, post-conferences, and walkthroughs, and provided feedback for improvement.

  40. Development Coach • Collaborated with District Office to calibrate formatives, provide professional development, and ascertain training needs. • Supported principal as the instructional leader of the school. • Worked with principal to interpret teacher data and to work toward student improvement through instructional growth.

  41. Principal • Debriefed with Development Coach weekly. • Learned to prioritize formative observations. • Enhanced knowledge of the DPAS process. • Technical writing & evidence collection • Management & scheduling of DPAS activities

  42. Principal • Received support for non-DPAS-related areas. • Learned to use data from PLCs and Component V to help individual teachers work toward instructional growth. • Received additional instructional resources for teacher development.

  43. Challenges • Principal and Development Coach finding time to meet. • Moving from narrative to evidence-based technical writing. • Working with administrators who do not want a coach. • Monitoring the process. • Districts not supporting principals in moving forward to remove incompetent teachers.

  44. Triumphs • Increased number of principals meeting the required number of formative and summative evaluations. • Increased number of IIPs and expectations over a two-year period. • Going beyond compliance. • Principals with a Development Coach see DPAS as a tool to improve instruction. • Principals are using expectations and IIPs to improve instruction. • Teachers and administrators recognize the value of pre- and post-conferences as a means to improve. • 64 schools have signed on for a third year.

  45. Lessons Learned • Hiring people who had the skills and knowledge to be coaches was essential (principals). • New administrators adapted to evidence-based technical writing faster than veteran administrators. • Over time and with appropriate support in the process, administrators improved their practice and took less time completing evaluations. • When implemented effectively and with fidelity, teachers have embraced the new system. • Creating a trusted and confidential relationship with the coachee is critical to the success of the program.

  46. Implications • What gets monitored gets done. • Electronic reporting • State audits • Expert evaluators • School districts and DOE need to monitor adherence to the evaluation process and the quality of written formatives and summatives. • Calibration activities • Administrative PLCs • Non-binding regulatory guidance • School leaders need to commit to the process, allocate time, and provide professional learning resources.

  47. Questions? http://www.doe.k12.de.us/csa/dpasii Dr. Jacquelyn Wilson, Director, DASL, University of Delaware, Email: jowilson@udel.edu Dr. Sharon Brittingham, Project Director, Development Coach, Email: sharonbr@udel.edu Dr. Steve Godowsky, Project Director, Development Coach, Email: sgodow@udel.edu Linda Grace, Development Coach, Email: lgrace@udel.edu Lewis Cheatwood, Principal, Email: lewis.cheatwood@bsd.k12.de.us Eric Niebrzydowski, Deputy Officer, Special Projects, Teacher and Leader Effectiveness Unit, Delaware Department of Education, Email: eric.niebrzydowski@doe.k12.de.us
