
New Approaches in Education Evaluation: Design Challenges, Perspectives

Explore the innovative approaches, different perspectives, and design challenges in education evaluation, with a focus on impact and process evaluations, rigorous methodologies, and the importance of user-centred design. Learn how evaluation in education compares to other fields and about the strategies used to maximise effectiveness and efficiency.


Presentation Transcript


  1. Evaluation in Education: 'new' approaches, different perspectives, design challenges Camilla Nevill, Head of Evaluation, Education Endowment Foundation, 24th January 2017 camilla.nevill@eefoundation.org.uk www.educationendowmentfoundation.org.uk @EducEndowFoundn

  2. Introduction • The EEF is an independent charity dedicated to breaking the link between family income and educational achievement. • The Education Endowment Foundation was set up in 2011 by the Sutton Trust, as lead charity, in partnership with the Impetus Trust. The EEF is funded by a £125m Department for Education grant and will spend over £220m over its fifteen-year lifespan. • In 2013, the EEF was named, with the Sutton Trust, as the government-designated 'What Works' centre for improving education outcomes for school-aged children.

  3. The EEF Our approach Two aims: 1. Break the link between family income and school attainment 2. Build the evidence base on the most promising ways of closing the attainment gap

  4. The Teaching and Learning Toolkit • A meta-analysis of education research • Contains c.10,000 studies • Cost, impact & security included to aid comparison
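The toolkit's pooling of thousands of studies rests on standard meta-analytic machinery. As an illustration only (the EEF's actual method is not described in the slides), here is a minimal fixed-effect meta-analysis using inverse-variance weighting, with invented effect sizes and standard errors:

```python
# Hypothetical sketch of fixed-effect, inverse-variance-weighted pooling,
# the basic building block of a meta-analysis like the Toolkit's.
# The effect sizes and standard errors below are invented examples,
# not real EEF data.
def pooled_effect(effects, std_errs):
    """Return the inverse-variance-weighted mean effect and its SE."""
    weights = [1.0 / se**2 for se in std_errs]          # precision weights
    total = sum(weights)
    estimate = sum(w * d for w, d in zip(weights, effects)) / total
    pooled_se = (1.0 / total) ** 0.5
    return estimate, pooled_se

# Three invented studies: effect sizes (standardised) and standard errors.
est, se = pooled_effect([0.20, 0.35, 0.10], [0.05, 0.10, 0.08])
```

Larger, more precise studies (smaller standard errors) dominate the pooled estimate, which is why the weighted mean here sits close to the 0.20 study rather than the midpoint of the three effects.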

  5. EEF, March 2016 • 7,500 schools currently participating in projects • 133 projects funded to date • 750,000 pupils currently involved in EEF projects • £220m estimated spend over lifetime of the EEF • 26 independent evaluation teams • £82m funding awarded to date • 66 published reports

  6. New approach, different perspectives, design challenges • Design with the end user in mind • There is no one right answer – communicate and compromise

  7. New approach: rigorous, independent evaluations of projects • Independent evaluation • Longitudinal outcomes • Impact and process evaluations • Robust counterfactual (RCTs)

  8. Education v other fields How does this compare to evaluation in your field?

  9. Trials: Education v public health

  10. Main messages • Design with the end user in mind • There is no right answer – communicate and compromise

  11. Process for appointing evaluators • Grants team identify projects; 1st Grants Committee shortlist • Evaluation teams receive 1-page project descriptions • Teams submit 2-page Expression of Interest (EoI) • Teams chosen to submit proposal • Teams submit 8-page proposal • 2nd Grants Committee shortlist • Teams chosen to evaluate projects • First set-up meeting with evaluation team, project team and EEF: share understanding of intervention logic; decide overall design, timeline, sample size, control group condition; developer (& evaluator) budgets set • 2nd set-up meeting with evaluation team, project team and EEF: finalise evaluation design; decide eligibility criteria, details of protocol, process evaluation measures linked to logic model

  12. Different perspectives The EEF, the evaluator and the developer come together at the set-up meeting.

  13. Different perspectives, at the set-up meeting • EEF: useful results; quick results; keep costs down • Evaluator: publications; funding to do research; personal interests • Developer: funding to deliver programme; demonstrate impact; good relationships with schools; publications?

  14. Design challenges Improving Working Memory • Teaching memory strategies by playing computer games • For 5 year-olds struggling at maths • Delivered by Teaching Assistants • Developed by Oxford University educational psychologists • Evidence of improvement in WM from two small (30 and 150 children) controlled studies

  15. Design challenges How many arms? • Working Memory (WM) • WM blended with maths • Matched time maths support • Business as usual (BAU)
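One practical consequence of a four-arm design is the allocation itself: each pupil must land in exactly one arm, with arm sizes as balanced as possible. A minimal sketch (arm names taken from the slide; the pupil list and seed are illustrative, not the trial's actual procedure):

```python
import random

# Hypothetical sketch: balanced random allocation of pupils to the four
# arms named on the slide. Pupil IDs and the seed are invented examples.
ARMS = [
    "working_memory",       # Working Memory (WM) training
    "wm_blended_maths",     # WM blended with maths
    "matched_time_maths",   # matched-time maths support
    "business_as_usual",    # BAU control
]

def randomise_pupils(pupil_ids, seed=0):
    """Shuffle pupils, then deal them round-robin into the arms so
    arm sizes differ by at most one."""
    rng = random.Random(seed)       # fixed seed makes allocation auditable
    shuffled = list(pupil_ids)
    rng.shuffle(shuffled)
    return {pid: ARMS[i % len(ARMS)] for i, pid in enumerate(shuffled)}

allocation = randomise_pupils([f"pupil_{n}" for n in range(20)])
```

The round-robin deal after a single shuffle is one common way to guarantee near-equal arm sizes; simple per-pupil coin flips would not.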

  16. Design challenges When would you randomise? • School recruited • Identify pupils (bottom 1/3 in maths) • Identify TAs and link teacher • One-day training for TAs by Oxford University • Deliver programme (10 hours): computer games for 5 hours; one-to-one support for 20-30 mins, 5 hours in total • Improved working memory • Maths attainment

  17. Design challenges • School recruited • Identify pupils (bottom 1/3 in maths) • Identify TAs and link teacher • Randomisation • One-day training for TAs by Oxford University • Deliver programme (10 hours): computer games for 5 hours; one-to-one support for 20-30 mins, 5 hours in total (delivery log; survey, observations, interviews) • Improved working memory (WM test) • Maths attainment (maths test)

  18. Design challenges Catch Up Numeracy • For 4 to 11 year-olds struggling at maths • Delivered by Teaching Assistants • 10 modules of tailored support • Flexible delivery model (no fixed length) • Evidence from EEF pupil-randomised efficacy trial:

  19. Design challenges What control group would you use?

  20. Design challenges Catch Up Numeracy • 150 schools recruited • Identify TAs and ~8 children in years 3-5 behind in maths • Randomise • 75 schools, 600 children: flexible Catch Up delivery model • 75 schools, 600 children: business-as-usual control group • Follow-up maths test
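Because whole schools are assigned to an arm here, this is a cluster-randomised design: every child in a school shares that school's allocation. A minimal sketch of the 75/75 school-level split (school names and the seed are illustrative; a real trial would randomise the actual recruitment list under a pre-specified procedure):

```python
import random

# Hypothetical sketch of school-level (cluster) randomisation for a
# 150-school, two-arm trial like Catch Up Numeracy. School IDs and the
# seed are invented examples.
def cluster_randomise(schools, seed=2013):
    """Shuffle the recruited schools and split them 50/50 between the
    intervention arm and a business-as-usual control."""
    rng = random.Random(seed)
    order = list(schools)
    rng.shuffle(order)
    half = len(order) // 2
    return {
        "catch_up": sorted(order[:half]),           # flexible delivery arm
        "business_as_usual": sorted(order[half:]),  # control arm
    }

arms = cluster_randomise([f"school_{n:03d}" for n in range(150)])
```

All ~8 identified children in a school inherit their school's arm, which is what protects the trial from contamination between treated and control pupils within a school (at the cost of a larger required sample than pupil-level randomisation).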

  21. Problems with interpretation What if we see no effect of Catch Up and the control group has received lots more support? What if we see a big effect of Catch Up and the control group has received lots less support?

  22. A radical idea: Pre-specify interpretation!

  25. Design challenges Boarding school • Children in need at risk of going into care • Referred by Local Authorities Teenage Sleep • Changing school start times to later • Positive effects from US trials (8am start v 11am start)

  26. Main messages (and sub-messages) • Design with the end user in mind • Test the right intervention • Make sure your comparison is relevant • Measure implementation and cost • There is no right answer – communicate and compromise • Use logic model to understand the intervention • Pre-specify the interpretation to aid decision making • Not all interventions can be randomised

  27. Thank you camilla.nevill@eefoundation.org.uk www.educationendowmentfoundation.org.uk @EducEndowFoundn

  28. Measuring the security of trials • Summary of the security of evaluation findings • ‘Padlocks’ developed in consultation with evaluators • Five categories – combined to create overall rating:
