
Evaluating the Impact of Performance-related Pay for teachers in England


Presentation Transcript


  1. Evaluating the Impact of Performance-related Pay for teachers in England Adele Atkinson, Simon Burgess, Bronwyn Croxson, Paul Gregg, Carol Propper, Helen Slater, Deborah Wilson

  2. Background • Improving education outcomes is a key priority for governments, but evidence suggests poor returns from simply raising school resources. • One alternative mechanism: incentives for teachers, but relatively little evidence on impact. • 1999: UK government introduced a performance-related pay scheme for teachers (the “Performance Threshold”). • Performance assessed across five criteria, inc. pupil progress (value-added). www.bris.ac.uk/Depts/CMPO/

  3. What we do in this paper • Quantitative evaluation of the impact of this PRP scheme for teachers on pupil test score gains. • Design • Longitudinal teacher-level data and a difference-in-difference research design. • Link pupils to their teachers for each subject; collect prior attainment data for each pupil. • So control for teacher and pupil fixed effects. • Also control for differences in teacher experience. • Incentive scheme had significant effects on pupil progress.

  4. Outline of the talk • Current evidence (in paper, not here) • The National Curriculum • The PRP scheme • Data • Evaluation methodology • Results • Conclusion

  5. The National Curriculum • Centralised system of control over national exams and teacher pay scales. • All pupils tested at the end of each Key Stage of the National Curriculum. • KS1 and KS2 tests taken at ages 7 and 11; KS3 and KS4 (GCSE) taken at ages 14 and 16. • KS1, 2, 3 tests taken in English, maths, science. • These subjects compulsory also at KS4. • We focus on KS4 and value added between KS3 and KS4.

  6. The PRP scheme • Labour administration 1998 Green Paper: range of reforms to education, inc. performance-related element to teacher pay. • The ‘Performance Threshold’ introduced in 1999/2000; first applications in July 2000. • The Performance Threshold itself was one element of larger pay reform, designed to affect teacher effort, as well as recruitment and retention.

  7. The PRP scheme II • Prior to the PRP scheme, all teachers paid on unified basic salary scale which had 9 full points. • Position on scale depended on qualifications and/or experience; progress through annual increments. Plus additional management points available. • 1999/2000: approx. 75% of teachers at top of scale, at spine point 9.

  8. The PRP scheme III • After the reforms, teachers on spine point 9 could apply to pass the Performance Threshold. 2 effects: • Annual bonus of £2,000. • Move onto the Upper Pay Scale (UPS): additional spine points, each of which related to performance.

  9. The PRP scheme IV • To pass the Threshold, teachers had to demonstrate effectiveness in five areas, including pupil progress (value added). • Forms submitted by July 2000. Assessed by headteacher and external assessor. • Initial Threshold payments funded out of a separate, central budget; no quota or limit. • The vast majority of eligible teachers both applied and were awarded the bonus.

  10. Was it incentive pay? • Wragg et al (2001) survey of 1000 schools • In these schools, 88% of the eligible teachers applied, and of these 97% were awarded the bonus • Unconditional pay increase - little effect on teacher effort. • But • Ex ante (Marsden) survey suggests teachers believed it to be ‘real’ • UPS element clearly performance related

  11. Teacher survey before implementation

  12. Data requirements • Control for pupil prior attainment to measure progress or value added: • KS3-GCSE; English, maths, science. • Longitudinal element: • Follow same teachers through two complete KS3-GCSE teaching cycles (before and after scheme introduced). • Link pupils to teachers: • Obtain class lists direct from schools.

  13. Sample • First approached schools in 2000. • Onerous data requirements; problems with school information systems; teacher and headteacher turnover. • Final sample: • 18 schools. • 181 teachers (145 eligible; 36 not eligible). • Approx. 23,000 pupils. • No presumption that sample is representative.

  14. Evaluation Methodology • Pupil i; teacher j; teaching cycle t. • Teacher effectiveness, X; test score, g; value added, v.

  15. Teacher mean scores: g̅_jt = (1/N_jt) Σ_i g_ijt • Difference between two tranches: Δg̅_j = g̅_j1 − g̅_j0
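These two quantities can be sketched in a few lines (teacher names and scores below are invented for illustration): the mean pupil score for each teacher in each teaching cycle, and the within-teacher change between the two tranches.

```python
from statistics import mean

# scores[j][t] = list of pupil test scores g for teacher j in cycle t (0 = pre, 1 = post)
scores = {
    "teacher_A": {0: [4.0, 5.0, 6.0], 1: [5.0, 6.0, 7.0]},  # invented data
}

def teacher_mean(j, t):
    """Mean pupil score for teacher j in teaching cycle t."""
    return mean(scores[j][t])

def tranche_difference(j):
    """Within-teacher change in mean score between cycle 1 and cycle 0."""
    return teacher_mean(j, 1) - teacher_mean(j, 0)

print(tranche_difference("teacher_A"))  # 1.0
```

Differencing within teacher in this way sweeps out the teacher fixed effect before any comparison across groups is made.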

  16. Differencing between eligible and ineligible teachers. The D(x) operator means: D(x) ≡ E(x|DI=1) – E(x|DI=0) • This is the difference-in-difference. • If the underlying change in effectiveness is common to both groups, this yields the diff-in-diff estimate of the Threshold effect.
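A minimal sketch of the D(x) operator applied to the within-teacher changes (all teachers and numbers below are invented): average the change for eligible teachers, average it for ineligible teachers, and take the difference.

```python
from statistics import mean

# change[j] = within-teacher change in mean score between the two cycles (invented);
# eligible[j] = 1 if teacher j could apply for the Performance Threshold.
change = {"A": 1.5, "B": 0.5, "C": 0.25, "D": 0.75}
eligible = {"A": 1, "B": 1, "C": 0, "D": 0}

def D(x):
    """D(x) = E(x | DI=1) - E(x | DI=0)."""
    return (mean(x[j] for j in x if eligible[j] == 1)
            - mean(x[j] for j in x if eligible[j] == 0))

did = D(change)  # the difference-in-difference estimate
print(did)  # 0.5
```

The first difference (within teacher, across cycles) removes teacher effects; this second difference (across eligibility groups) removes common time effects.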

  17. For value-added: • And same steps as before yield the following as the diff-in-diff:

  18. Key issues • Parameters of interest are: • γ3b for gross test score • μ2b for value added • Role of experience profile: • If f(W) is linear, no problem, as ΔΔf(W) = 0 • If concave, diff-in-diff underestimates parameters of interest, as ΔΔf(W) < 0.
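The concavity point can be illustrated numerically (the profile, experience levels, and cycle length below are invented): because eligible teachers are more experienced, a concave profile f(W) gains them less over one cycle than the ineligible group, making the double difference negative.

```python
import math

def DD(f, W_elig, W_inel, cycle):
    """Double difference of the experience profile f over one teaching cycle."""
    return (f(W_elig + cycle) - f(W_elig)) - (f(W_inel + cycle) - f(W_inel))

# Invented example: eligible teachers have 15 years' experience, ineligible 3;
# one KS3-GCSE teaching cycle lasts 3 years.
print(DD(math.log, 15, 3, 3) < 0)    # concave profile: DD < 0, estimate biased down
print(DD(lambda w: 2 * w, 15, 3, 3))  # linear profile: DD = 0, no bias
```

So with a concave profile the diff-in-diff is conservative: the true effect of the scheme is at least as large as the estimate.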

  19. Experience Profile

  20. Key issues (cont.) • Experimental design and pupil assignment: • No grouping on effort. • Timing of class assignment.

  21. Results • Difference-in-difference results • Regressions • Robustness checks • Interpretation and evaluation

  22. Table 2: D-in-D analysis: GCSEs

  23. Table 3: D-in-D VA Means

  24. Experience Difference • Potentially need to control for systematic differences in experience: ideal: non-parametrically defined experience-effectiveness profile. • Not enough data to do that, so define a ‘novice’ teacher dummy picking out teachers with least experience.

  25. Table 4: GCSE Analysis

  26. Table 5: Value Added Analysis

  27. Table 6: Subject Differences

  28. Robustness checks • Leave out novices to compare eligibles and ineligibles with broadly similar experience. • Ceiling effects on marks: look only at pupils in the bottom 75% of the KS3 distribution. • Results robust to both checks.

  29. Evaluation • One standard deviation in the teacher-mean change in GCSE is 1.29, and 0.58 for VA • Coefficients on eligibility of 0.890 for GCSE change and 0.422 for VA change • As percentages of a standard deviation these are 69% and 73% • Alternatively, eligibility dummy is 67% of the novice teacher dummy for GCSE change, and 78% for VA change.
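The effect-size percentages are simple ratios of the eligibility coefficients to the standard deviations; a quick arithmetic check using the figures stated on the slide:

```python
# Figures from the slide: SD of the teacher-mean change and the eligibility
# coefficient, for GCSE change and for value-added (VA) change.
sd_gcse, sd_va = 1.29, 0.58
coef_gcse, coef_va = 0.890, 0.422

pct_gcse = round(100 * coef_gcse / sd_gcse)  # 69 (% of a standard deviation)
pct_va = round(100 * coef_va / sd_va)        # 73
print(pct_gcse, pct_va)  # 69 73
```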

  30. Conclusions • Rich data; research design controls for teacher and pupil effects • Results: around 0.5 of a GCSE grade per pupil • Caveats: • Was it incentive pay? The shape of the experience-effectiveness profile. • Extra effort or effort diversion?

  31. Additional slides

  32. Table 2: Summary teacher stats

  33. Table 3: Summary pupil stats

  34. Table 4: Comparative stats

  35. Data requested from schools


  39. Table 12: Distributional Impacts

  40. Table 13: Robustness checks
