
Value Added for Teacher Evaluation in the District of Columbia

Value Added for Teacher Evaluation in the District of Columbia. Robin Chait, Office of the State Superintendent of Education; Anna Gregory, District of Columbia Public Schools; Eric Isenberg, Mathematica Policy Research. Association for Education Finance and Policy 37th Annual Conference


Presentation Transcript


  1. Value Added for Teacher Evaluation in the District of Columbia Robin Chait, Office of the State Superintendent of Education Anna Gregory, District of Columbia Public Schools Eric Isenberg, Mathematica Policy Research Association for Education Finance and Policy 37th Annual Conference March 16, 2012

  2. Value Added Teacher value added = Students’ actual end-of-year test scores – Students’ predicted end-of-year test scores • Statistical model predicts student achievement • Account for pretests, student characteristics • Ranks teachers relative to an average teacher
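
The "actual minus predicted" idea above can be sketched as a toy regression: predict each student's posttest from the pretest, then average each teacher's residuals relative to the overall fit. This is a hedged illustration only; the actual DCPS model also adjusts for student characteristics and uses a far richer specification, and all names and data below are hypothetical.

```python
# Toy value-added sketch: actual minus predicted scores, averaged by
# teacher. NOT the DCPS model -- illustration only.
from collections import defaultdict

def value_added(pretests, posttests, teachers):
    """Value added per teacher: mean residual from a simple linear
    prediction of posttest from pretest, relative to the average teacher."""
    n = len(pretests)
    mx = sum(pretests) / n
    my = sum(posttests) / n
    # Least-squares slope and intercept for posttest ~ pretest
    b = sum((x - mx) * (y - my) for x, y in zip(pretests, posttests)) \
        / sum((x - mx) ** 2 for x in pretests)
    a = my - b * mx
    # Residual = actual end-of-year score minus predicted score
    residuals = [y - (a + b * x) for x, y in zip(pretests, posttests)]
    by_teacher = defaultdict(list)
    for t, r in zip(teachers, residuals):
        by_teacher[t].append(r)
    return {t: sum(rs) / len(rs) for t, rs in by_teacher.items()}
```

With toy data such as `value_added([1, 2, 1, 2], [2, 3, 3, 4], ["A", "A", "B", "B"])`, teacher B's students outscore the linear prediction by half a point on average, so B's score is +0.5 and A's is −0.5, mirroring the "ranks teachers relative to an average teacher" framing above.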

  3. Value Added in DCPS Evaluation System

  4. Key Points • Implementation requires sufficient capacity • Communication strategy is vital • Value added is worth the investment

  5. Where We Were in 2007 • Teachers meeting or exceeding expectations: 95% • 8th grade reading proficiency (2007 NAEP): 12%

  6. Why Value Added for DCPS? • Fairest way to evaluate teachers • Objective, data-based measure • Focused on student achievement

  7. Value Added in DCPS Evaluation System • IVA: Individual value added • TLF: Teaching and learning framework (classroom observations) • CSC: Commitment to school community • SVA: School value added • Individual value-added measures: 50 percent of eligible teachers’ IMPACT scores
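
One way to picture the weighting is a simple weighted average of component scores. The 50 percent weight on IVA comes from the slide; the other weights and the example scores below are hypothetical placeholders, not DCPS's actual rules.

```python
# Illustrative weighted combination of IMPACT components.
# Only the 0.50 weight on IVA is taken from the slide; the remaining
# weights are invented for this sketch.
def impact_composite(components, weights):
    """Combine component scores (on a common scale) into one
    weighted score; weights must sum to 1."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(weights[name] * components[name] for name in weights)

# Hypothetical weights: IVA 50%, with the rest split for illustration.
example_weights = {"IVA": 0.50, "TLF": 0.35, "CSC": 0.10, "SVA": 0.05}
```

For example, `impact_composite({"IVA": 300, "TLF": 280, "CSC": 350, "SVA": 260}, example_weights)` yields 296.0 under these made-up weights.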

  8. IMPACT Is High Stakes • Highly effective: performance pay • Ineffective (one year): subject to separation • Minimally effective (consecutive years): subject to separation • Score scale: 100–400, with markers at 175, 250, and 350
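
The numbers on the slide (100, 175, 250, 350, 400) appear to mark the IMPACT score scale and its rating cutoffs. A minimal sketch under that assumption, with band names taken from the slide:

```python
# Map an IMPACT score to a rating band, assuming the slide's numbers
# (100-400 scale, cutoffs at 175, 250, 350) are the band boundaries.
def impact_rating(score):
    if not 100 <= score <= 400:
        raise ValueError("IMPACT scores range from 100 to 400")
    if score < 175:
        return "Ineffective"
    if score < 250:
        return "Minimally Effective"
    if score < 350:
        return "Effective"
    return "Highly Effective"
```

Under this reading, a teacher at 174 falls in the band subject to separation after one year, while 350 and above qualifies for performance pay.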

  9. Overall Performance Distribution: PPEP vs. IMPACT (n = 3,469)

  10. Value Added in DC

  11. Help for DC Public Schools • Mathematica Policy Research • Technical Advisory Board [2012] • Steve Cantrell, Gates Foundation • Laura Hamilton, RAND Corporation • Rick Hanushek, Stanford University • Kati Haycock, Education Trust • David Heistad, Minneapolis Public Schools • Jonah Rockoff, Columbia Business School • Tim Sass, Georgia State University • Jim Wyckoff, University of Virginia

  12. Mathematica’s Work with DC Schools

  13. Challenges • Consider face validity, incentive effects • Teacher-student link data can be challenging • All data decisions shared with district • Timeline must allow DCPS to transition out poor performers, hire new teachers

  14. No One-Size-Fits-All Value Added Model • Choosing student characteristics: communications challenge for race/ethnicity • Multiple years of data: bias/precision trade-off • Joint responsibility for co-teaching • Cannot estimate model of separate teacher effects • Can estimate “teams” model, but should team estimates count? • Comparing teachers of different grades

  15. Roster Confirmation • Teacher-student links critical for value added • Administrative data can be challenging • Specialized elementary school teachers • Co-teaching • Pull-out and push-in programs • Midyear student transfers • Teachers surveyed to confirm administrative roster data (Battelle for Kids)
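
Once rosters are confirmed, each student must be attributed to one or more teachers. A common approach, sketched here purely for illustration (not the DCPS business rule), splits a co-taught student's weight evenly across that student's confirmed teachers:

```python
# Illustrative attribution of students to teachers after roster
# confirmation: a co-taught student contributes fractional weight to
# each confirmed teacher. The even-split rule here is an assumption.
from collections import defaultdict

def teacher_weights(confirmed_rosters):
    """confirmed_rosters: {student: [teachers confirmed for that student]}.
    Returns total student weight per teacher, splitting co-taught
    students evenly among their teachers."""
    weights = defaultdict(float)
    for student, teachers in confirmed_rosters.items():
        for t in teachers:
            weights[t] += 1.0 / len(teachers)
    return dict(weights)
```

For example, `teacher_weights({"s1": ["A"], "s2": ["A", "B"]})` credits teacher A with 1.5 students and teacher B with 0.5, which is why pull-out, push-in, and midyear-transfer cases make the underlying link data so consequential.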

  16. Business Rules: Documenting Data Decisions • Every data decision defined, discussed, documented beforehand • Let OSSE, DCPS review all decisions • Document entire process • Make quick progress when final data arrive

  17. Production: Meeting Timelines, Ensuring Accuracy • October data: formulate business rules • February data • Establish data cleaning programs • Begin trial runs from analysis file to final output • April data: Final student data in trial runs • June (test score) data: produce final results

  18. Perspective of State Education Agency

  19. Race To The Top • Federal competition between states • Required student achievement to contribute 50% of teacher evaluation score • Decision to use DCPS value-added model for all eligible DC teachers • Brought DCPS and charter schools together • Each charter school LEA has own evaluation system used to inform personnel decisions

  20. Common Decision-Making • Need to make decisions on value added • Quickly to meet production schedule • Informed by best available data • With buy-in from charter schools and DCPS • Technical Support Committee (TSC) • Six members: five charter, one DCPS • Meets periodically • Consensus decisions sought

  21. Data Infrastructure • Most data elements for value added exist • . . . but not necessarily collected on right schedule • Student background characteristics • Collected twice a year for AYP purposes • Need three-time-a-year collection, earlier schedule for value added

  22. Need Capacity Within District • Do not just hire a contractor • Need dedicated staff to answer questions • Data team • Technical Support Committee

  23. Communicating Results to DC Teachers

  24. Communication Strategy • Value added hard to understand • Requires a strong statistical background • Final information is hard to connect to familiar test scores • Different from other student achievement measures teachers commonly use • Communication tools • Guidebooks • Information sessions

  25. What Factors Affect a Student’s Achievement? • Teacher factors: level of expectations, content knowledge, pedagogical expertise, ability to motivate • Student factors: prior learning, resources at home, disability (if any), English proficiency • Student achievement as measured by the DC CAS • Value added isolates the teacher’s impact on student achievement

  26. Initiatives Under Development • Student-level output for DC teachers • Would show pretest, predicted posttest, actual posttest score for each student • May be in graphical format • Intermediate value-added scores • Individual value-added scores based on intermediate tests • Could be given to teachers midyear

  27. Conclusions • Implementing value added requires . . . • Availability and accessibility of current data • Confirmation of teacher-student links • Careful planning of production process • Sufficient capacity within local and/or state education agency to interact with value-added contractor • Teacher buy-in is not a given – communication strategy is vital • Properly implemented, value added is worth the investment • Fairest measure of teacher effectiveness • Provides data for answering research questions
