
Presentation Transcript


  1. Educator Evaluations: The Current Landscape, Your Role, and Best Practices. Designing an Educator Evaluation System for Learning, Growth, and Adaptation, Day 2

  2. It’s a bit like reading hieroglyphics: “I’m not sure what the story is about, but I think it has a happy ending.”

  3. Today’s Workshop • Pat Reeves, WMU & MASA • Pat McNeill, Michigan ASCD • Linda Wacyk, MASA • MASA Performance Evaluation Resource Center: www.gomasa.org/educator-evaluation

  4. Agenda for the Day Part 1: Growing your district performance evaluation system Part 2: Aligning administrator evaluations Part 3: Managing the politics

  5. First Things First Revisiting assumptions Revisiting research-based principles Revisiting paradigms: Demonstration vs. Inspection

  6. Revisiting the 10 Assumptions Activity 1 Using your workbook, select an assumption that particularly resonates with you. With a partner, take 2 minutes to share your choice and the reason why. Listen to your partner do the same. We need 4 volunteers to report out to the whole group. Total time for this activity: 10 minutes.

  7. Adaptive Change Model for Evaluation [slide graphic: Authentic, Professional, Inclusive, Evidence Based, Purpose Driven; Great Teachers and Leaders; so we can do what it takes to serve all students well] (Reeves, P. & McNeill, P., 2012)

  8. Jigsaw with the 6 Principles Activity 2 Spend 2 minutes deciding how you would “introduce” one principle to others. Prepare a one-minute synopsis of your principle. Rehearse your one-minute synopsis by sharing it with your table group. We will need 4 new people to report out to the whole group. Total time for this activity: 15 minutes.

  9. A Look at Two Classic Evaluation Paradigms Inspection (external locus of control) Demonstration (internal locus of control) Adapted from: Educator Evaluation – Models, Parameters, Issues and Implementation, Dr. Edward Roeber, Professor of Education, Michigan State University, February 2011. (White paper commissioned by the Michigan Education Association)

  10. Common to Both Paradigms • Standards of professional practice • Operational definitions of applied practice (i.e. what it looks like) • Evidence • Interpretation and judgment • Formative and summative decisions

  11. How will you blend the two? To what degree and how will you use inspection-based approaches (e.g. observation, rating scales, scoring guides, external goals, perception data, and examination of work artifacts and results)? To what degree and how will you use demonstration-based approaches (self-assessments, evidence portfolios, practice- and results-based reflections, 360° feedback, personal goals, and job specificity)?

  12. How will you blend the two? Activity 3 Using the chart in your handbook, take inventory of all the pieces in your evaluation system. Where does each piece fit? Inspection, Demonstration, or both? We need 4 volunteers to report out to the whole group on what you discovered. Time for this activity: 15 min.

  13. What is it all about? Leaving a mark we can be proud of?

  14. Three Purposes for Performance Evaluation • Achieve organizational goals, i.e. student outcomes • Guide the learning, growth, and development of personnel, e.g. teachers and leaders • Make decisions about hiring, placement, retention, promotion, and compensation

  15. Common to All 3 Purposes • Build Professional Competence • Build Organizational Capacity • Achieve Better Results • Adapt and Innovate • Replicate Success

  16. When is it a System? • When there are clear goals • When there are guiding principles for achieving the goals • When there are strategies to achieve the goals • When all the parts work together in a complementary way • When it serves users well

  17. Creating Constructive Conversations Activity 4 • In your handbook, take 3 minutes to write your reflections on the following questions. • In our district, are we approaching performance evaluation as a systems issue? • What are we talking about? • What do we need to be talking about? • How do we change the conversation? • We need 4 volunteers to report out to the whole group.

  18. What does a system look like? • Actions • Principles

  19. A System Overview [slide diagram: Evidence (artifacts, data) informs Formative Decisions (growth status, revise goals, revise plans) and Summative Decisions (compensation, promotion, retention, performance status); together these evaluate the system: overall growth status of personnel, overall performance status of personnel, attainment of goals]

  20. How’s your system developing? Activity 5 On a 5-point scale (0-4), rank your district’s level of development on each of the system components in the Overview handout. Mark your ratings for each component right on the handout. Circle the 1-2 components you think may be the most challenging to implement. We will ask 4 people to report out on which ones they circled and why.

  21. Part A: Growing your District Performance Evaluation (PE) System

  22. Growing your District Performance Evaluation (PE) System Part A -1: Laying the Foundation • Using PE to achieve District Goals • Using PE to empower Learning, Growth and Adaptation • Finding the balance between Demonstration and Inspection • Establishing Guiding Principles

  23. Principle 1: Authentic Issues in Laying the Foundation • Start with the WHY? • Do you have a district vision? • Can everyone* name it? • How clear are your district/building goals? • What role will performance evaluation play in helping the district reach those goals? • How will you talk about it, then? * and I mean everyone

  24. Issues in Laying the Foundation • Shift the thinking about performance evaluation • From doing to, to doing with • From an event to a process • From “gotcha” to “empowerment” • From fuzzy focus, to clear collaborative goals • From dangerous detours to work that matters

  25. Things to Think and Talk About Activity 6 Is it clear how district and school goals will drive your district and school PE process? Is it understood what assumptions and principles will drive your district and school PE process? Do you have a target for how you will eventually balance inspection and demonstration? What steps can you take to get to clarity on these issues? We will ask 4 people to report out on ideas for creating answers to the above four questions

  26. Identifying Your Growth Edges Principle 1: Authentic Activity 7 Using the Levels of Implementation handout, take 2 minutes to study the Level 1, 2, & 3 descriptors for the principle: Authentic. Use the descriptors to diagnose where your district stands and to identify possible growth targets for developing your system further under that principle. We will ask 4 people to share the growth edges they identified for their district.

  27. Growing your District Performance Evaluation (PE) system Part A-2: Building the Infrastructure • Selecting and deciding how to use tools • Defining roles and responsibilities • Providing training and support • Identifying student growth measures • Deciding what goes into the official performance evaluation report (personnel file)

  28. Building the Infrastructure: Evaluating, Choosing, Using Tools • What tools do you need? • Assessment Instruments: rubrics, rating scales, observation guides, etc. • Record keeping systems • Data collection, storage, retrieval • Data analysis systems • Professional portfolio (evidence) systems TIP: Most commercially available products, with or without web management tools, do not provide value-added growth statistical analysis. CAUTION: This is an emerging and rapidly changing industry. Costs and features vary widely, and most products are still under development.

  29. Building the Infrastructure: Evaluating, Choosing, Using Tools • How will you evaluate PE instruments? • Standards base • Research base • Format: Rubrics vs. Rating Scales • Alignment with or adaptability to district/school goals and priorities • Alignment with or adaptability to specific job responsibilities • Alignment or compatibility with other system tools • Trustworthiness: • Validation studies (pending/in progress)? • Reliability studies (pending/in progress)? • Derived or adapted from work submitted to reliability and validity studies, or based on research meta-analyses

  30. Building the Infrastructure: Evaluating, Choosing, Using Tools • How will you use the tools you adopt? • Entire instrument vs. cherry picking or adapting. • Pre-set scoring guide or custom designed. • Prioritizing and weighting. • Who completes or responds to the instrument? • How the instrument fits into the full evaluation. CAUTION: Reliability and validity findings apply to uses of the instrument that are consistent with the conditions under which reliability and validity were established; however, scoring may or may not be flexible.

  31. Examining the Tools We Are Currently Using Activity 8: Table Talk on 7 Tools Discuss how confident you are with the tools you are currently using and why. Discuss how comfortable you are with how you are currently using the tools and why. We will ask 4 people to report out the highlights of the table conversations

  32. Principle 2: Professional. Portfolios as a tool to increase ownership and empower educators as professionals.

  33. Being the Key Person (Reeves, P. & McNeill, P., 2012) • Strong demonstration to achieve more reliable inspection • High quality evidentiary portfolios • Authentic self-assessment • The courage to look at the data and learn from it • Confidence, competence, and humility • Passion and commitment (to students and to excellence)

  34. Professional Practice Portfolio • Compiled, maintained, and updated by the Educator throughout the year and from year to year • Based on established performance criteria; standards of professional practice plus established performance goals and priorities • Critical aspect of aligning the performance evaluation system with the 6 principles

  35. Professional Practice Portfolio Example • Developed as an ongoing compilation of recent and relevant performance evidence • Developed in conjunction with Professional Growth Plan or Individual Development Plan (PGP or IDP) • Goals for Individual Educator • Goals from School/District Improvement Plans • Individual Goals from Performance Assessments (long & short term) • Plans for Growth & Improvement • Plans to accomplish school or team goals • Plans to accomplish individual goals • Measures of Performance (Artifacts and Evidence) • State Measures: (evaluation tools, growth models) • School Measures: (observations, feedback, data, etc.) • Educator Measures: (work samples, feedback, self-assessment, data, etc.) Adapted from: Educator Evaluation – Models, Parameters, Issues and Implementation, Dr. Edward Roeber, Professor of Education, Michigan State University, February 2011. (White paper commissioned by the Michigan Education Association)

  36. Educator Performance Evaluation System

  37. Quality Performance Portfolios (Reeves, P. & McNeill, P., 2012) • Tell a story: • Your learning and growth • Your use of research-grounded practice • Your performance improvement goals and priorities • Your innovations and adaptations • Your student results • Include artifacts, data displays, and work samples that help tell the story, and annotate them • Cross-reference to the Evaluation Rubrics and IDP • Keep the story up-to-date with additions, deletions, and refinement • Enhance self-assessment and reflective practice

  38. Measuring Credibility (Reeves, P. & McNeill, P., 2012) Is the Professional Portfolio current, well organized, and consistent with district requirements? Are annotations, reflections, and commentary clear, authentic, and helpful for interpreting the artifacts and data? Do the artifacts and data demonstrate the performance standards convincingly? Is the evidence of professional practice, growth, and improvement compelling? Are growth and improvement targets addressed convincingly and adequately and supported by data? Is the evidence (including the self-assessment) corroborated by other evidence (e.g. observations, feedback data, results data, and work products)? Does the portfolio address aspects of performance not addressed by other evidence sources, i.e. does it “round out the picture”?

  39. Identifying Your Growth Edges Principle 2: Professional Activity 9 Using the Levels of Implementation handout, take 2 minutes to study the Level 1, 2, & 3 descriptors for the principle: Professional. Think about how you can raise the level of professionalism in your system through the use of professional portfolios, i.e. demonstration. Diagnose your district’s professional quotient. Identify possible growth targets for developing your system further. We will ask 4 people to share the growth edges they identified for their district.

  40. Building the Infrastructure: Determining Performance Evaluation Cycles/Schedules What parts and processes will occur every year? What parts and processes will occur on a cycle? What are the required timelines? What are the record keeping requirements? What is the procedure for monitoring and tracking processes, cycles, and timelines?

  41. Building the Infrastructure: Choosing a data management system What evaluation tools/instruments will it support? What evaluation processes will it support, e.g. aggregating evaluation data, portfolio alignment, analytics, etc.? What management processes and tools does it offer?

  42. Technology Checklist: Things to consider before subscribing to an on-line management system • Customization • Ease of Use • Data and Documentation Capacity • Compatibility • Support • References • Costs • Other

  43. It Will Take a Learning Community to Get it Right

  44. And did they ever get it right!

  45. So, let’s take time out for a few questions and observations

  46. Growing your District Performance Evaluation (PE) system Part A-3: Defining Roles & Responsibilities • For Developing, Implementing, and Evaluating the Performance Evaluation System • As an Evaluatee within the System • As an Evaluator (or contributor to evaluations) within the System (see Appendix G)

  47. Growing your District Performance Evaluation (PE) system Part A-4: Providing Training and Support • Principals cannot differentiate performance for approximately 60% of teachers whose effectiveness is average or near average. • Principal preparation programs have little or no focus on evaluating teachers. • Veteran principals have not conducted evaluations in which multiple measures are used and high stakes are attached. Source: NGA Center for Best Practices, Issue Brief, Executive Summary, 10/31/11

  48. Providing Training & Support Beginning • Orientation: Review/Understanding of model, tools, processes and procedures • Training on how to use the instruments and tools • Observer Training for inter-rater reliability • Documentation Training for Portfolio Development • Development of IDPs

  49. Providing Training & Support During evaluation • Sources of evidence and data • Interpretation of evidence and data • Self Assessment • Recognition of growth • Refinement of procedures • Focus on growth

  50. Providing Training & Support Ending an evaluation cycle • Debrief and Continue Coaching and Training as needed • Implementation fidelity • Final evaluation quality • Interpreting Evidence • Scoring • Identifying Improvement Targets • Goal setting • Revising IDPs and Summative Reporting
