
Evidence Informing Practice






Presentation Transcript


  1. Evidence Informing Practice Robert Coe ASCL Annual Conference, 21 March 2014

  2. Outline • What can research tell us about the likely impacts and costs of different strategies? • How do we implement these strategies to … • Focus on what matters • Change classroom practice • Target areas of need • Produce demonstrable benefits Improving Education: A triumph of hope over experience http://www.cem.org/attachments/publications/ImprovingEducation2013.pdf

  3. Evidence about the effectiveness of different strategies

  4. Toolkit of Strategies to Improve Learning The Sutton Trust-EEF Teaching and Learning Toolkit http://www.educationendowmentfoundation.org.uk/toolkit/

  5. www.educationendowmentfoundation.org.uk/toolkit [Scatter chart: impact vs cost. Vertical axis: effect size (months’ gain), 0 to 8; horizontal axis: cost per pupil, £0 to £1000. Plotted strategies: Feedback, Meta-cognitive, Peer tutoring, Early Years, Homework (Secondary), 1-1 tuition, Collaborative, Behaviour, Small group tuition, Phonics, Parental involvement, Smaller classes, ICT, Social, Summer schools, Individualised learning, After school, Mentoring, Homework (Primary), Teaching assistants, Performance pay, Aspirations, Ability grouping. Regions labelled ‘most promising for raising attainment’, ‘may be worth it’ and ‘small effects / high cost’.]
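The ‘months gain’ scale on the Toolkit chart is derived from a standardised effect size (the EEF publishes its own conversion table, which is not reproduced here). A minimal sketch of the underlying calculation, using invented scores purely for illustration:

```python
import statistics

def cohens_d(treated, control):
    """Standardised mean difference between two groups, using the pooled SD."""
    n1, n2 = len(treated), len(control)
    s1, s2 = statistics.stdev(treated), statistics.stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treated) - statistics.mean(control)) / pooled_sd

# Hypothetical end-of-year test scores for two classes (made-up numbers)
treated = [58, 62, 65, 70, 71, 74, 76, 80]
control = [55, 57, 60, 63, 64, 68, 70, 72]
d = cohens_d(treated, control)
print(f"Effect size d = {d:.2f}")
```

With these invented numbers d comes out around 0.86; the Toolkit translates effect sizes of this kind into a months-of-progress figure via its published conversion.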

  6. Key messages • Some things that are popular or widely thought to be effective are probably not worth doing • Ability grouping (setting); After-school clubs; Teaching assistants; Smaller classes; Performance pay; Raising aspirations • Some things look ‘promising’ • Effective feedback; Meta-cognitive and self-regulation strategies; Peer tutoring/peer-assisted learning strategies; Homework

  7. Clear, simple advice: • Choose from the top left • Go back to school and do it “For every complex problem there is an answer that is clear, simple, and wrong” (H.L. Mencken)

  8. Why not? • We have been doing some of these things for a long time, but have generally not seen improvement • Research evidence is problematic • Sometimes the existing evidence is thin • Research studies may not reflect real life • Context and ‘support factors’ may matter • Implementation is problematic • We may think we are doing it, but are we doing it right? • We do not know how to get large groups of teachers and schools to implement these interventions in ways that are faithful, effective and sustainable

  9. So what should we do?

  10. Four steps to improvement • Think hard about learning • Invest in good professional development • Evaluate teaching quality • Evaluate impact of changes

  11. 1. Think hard about learning

  12. www.educationendowmentfoundation.org.uk/toolkit [The same impact-vs-cost scatter chart as slide 5: effect size (months’ gain, 0 to 8) against cost per pupil (£0 to £1000), with regions labelled ‘most promising for raising attainment’, ‘may be worth it’ and ‘small effects / high cost’.]

  13. • Which strategies/interventions do you find very surprising (you really don’t believe it)? • For which strategies/interventions can you explain why they do (or don’t) improve attainment? • Which strategies/interventions do you want to know more about?

  14. Poor Proxies for Learning • Students are busy: lots of work is done (especially written work) • Students are engaged, interested, motivated • Students are getting attention: feedback, explanations • Classroom is ordered, calm, under control • Curriculum has been ‘covered’ (i.e. presented to students in some form) • (At least some) students have supplied correct answers, even if they • Have not really understood them • Could not reproduce them independently • Will have forgotten them by next week (tomorrow?) • Already knew how to do this anyway

  15. Do children learn better in the morning or afternoon?

  16. A better proxy for learning? Learning happens when people have to think hard

  17. Hard questions about your school • How many minutes does an average pupil on an average day spend really thinking hard? • Do you really want pupils to be ‘stuck’ in your lessons? • If they knew the right answer but didn’t know why, how many pupils would care?

  18. 2. Invest in effective CPD

  19. How do we get students to learn hard things? Eg • Place value • Persuasive writing • Music composition • Balancing chemical equations • Explain what they should do • Demonstrate it • Get them to do it (with gradually reducing support) • Provide feedback • Get them to practise until it is secure • Assess their skill/ understanding

  20. How do we get teachers to learn hard things? Eg • Using formative assessment • Assertive discipline • How to teach algebra • Explain what they should do

  21. What CPD helps learners? • Intense: at least 15 contact hours, preferably 50 • Sustained: over at least two terms • Content focused: on teachers’ knowledge of subject content & how students learn it • Active: opportunities to try it out & discuss • Supported: external feedback and networks to improve and sustain • Evidence-based: promotes strategies supported by robust evaluation evidence Do you do this?

  22. 3. Evaluate teaching quality

  23. Why monitor? • Strong evidence of (potential) benefit from • Performance feedback (Coe, 2002) • Target setting (Locke & Latham, 2006) • Intelligent accountability (Wiliam, 2010) • Individual teachers matter most • Everyone can improve • Teachers stop improving after 3-5 years • Judging real quality/effectiveness is very hard • Multidimensional • Not easily visible • Confounded

  24. Monitoring the quality of teaching • Progress in assessments • Quality of assessment matters (cem.org/blog) • Regular, high quality assessment across curriculum (InCAS, INSIGHT) • Classroom observation • Much harder than you think! (cem.org/blog) • Multiple observations by multiple observers, trained and quality-assured • Student ratings • Extremely valuable, if done properly (http://www.cem.org/latest/student-evaluation-of-teaching-can-it-raise-attainment-in-secondary-schools) • Other • Parent ratings/feedback • Student work scrutiny • Colleague perceptions (360) • Self-assessment • Pedagogical content knowledge
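The call for multiple, trained observers has a standard statistical rationale: averaging several independent ratings raises reliability in a way the Spearman-Brown formula predicts. A quick sketch (the single-observation reliability of 0.45 is an illustrative assumption, not a figure from the talk):

```python
def spearman_brown(r_single, k):
    """Predicted reliability of the mean of k independent ratings,
    given the reliability r_single of a single rating."""
    return k * r_single / (1 + (k - 1) * r_single)

r = 0.45  # assumed reliability of one lesson observation (illustrative)
for k in (1, 2, 4, 6):
    print(f"{k} observation(s): reliability approx {spearman_brown(r, k):.2f}")
```

Even with modest single-lesson reliability, pooling four to six observations pushes the combined rating markedly higher, which is one reason a one-off graded observation is such a noisy judgement of a teacher.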

  25. Teacher Assessment • How do you know that it has captured understanding of key concepts? • vs a ‘check-list’ approach (e.g. uses a ‘;’ = Level 5, three tenses = Level 7) • How do you know standards are comparable? • Across teachers, schools, subjects • Is progress good? • How have you resolved the tensions that arise when teacher judgments are used to judge teachers?

  26. Evidence-Based Lesson Observation • Behaviour and organisation • Maximise time on task, engagement, rules & consequences • Classroom climate • Respect, quality of interactions, failure OK, high expectations, growth mindset • Learning • What made students think hard? • Quality of: exposition, demonstration, scaffolding, feedback, practice, assessment • What provided evidence of students’ understanding? • How was this responded to? (Feedback)

  27. Next generation of CEM systems … • Assessments that are • Comprehensive, across the full range of curriculum areas, levels, ages, topics and educationally relevant abilities • Diagnostic, with evidence-based follow-up • Interpretable, calibrated against norms and criteria • High psychometric quality • Feedback that is • Bespoke to individual teacher, for their students and classes • Multi-component, incorporating learning gains, pupil ratings, peer feedback, self-evaluation, … • Diagnostic, with evidence-based follow-up • Constant experimenting

  28. 4. Evaluate impact of changes

  29. School ‘improvement’ often isn’t • School would have improved anyway • Volunteers/enthusiasts improve: misattributed to intervention • Chance variation (esp. if start low) • Poor outcome measures • Perceptions of those who worked hard at it • No robust assessment of pupil learning • Poor evaluation designs • Weak evaluations more likely to show positive results • Improved intake mistaken for impact of intervention • Selective reporting • Dredging for anything positive (within a study) • Only success is publicised (Coe, 2009, 2013)

  30. Key elements of good evaluation EEF DIY Evaluation Guide • Clear, well defined, replicable intervention • Good assessment of appropriate outcomes • Well-matched comparison group What could you evaluate?
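A toy simulation shows why the well-matched comparison group matters: if schools improve anyway (slide 29’s first point), a naive pre/post gain overstates the intervention’s effect, while a difference against the comparison group recovers it. All numbers below are invented for illustration:

```python
import random

random.seed(1)

TRUE_EFFECT = 3.0    # points added by the intervention (invented)
SECULAR_TREND = 5.0  # improvement that would have happened anyway (invented)

def simulate(n, treated):
    """Pre/post scores for n pupils, sharing the same background upward trend."""
    pre = [random.gauss(60, 8) for _ in range(n)]
    post = [p + SECULAR_TREND + (TRUE_EFFECT if treated else 0.0) + random.gauss(0, 3)
            for p in pre]
    return pre, post

def mean(xs):
    return sum(xs) / len(xs)

t_pre, t_post = simulate(200, treated=True)
c_pre, c_post = simulate(200, treated=False)

naive = mean(t_post) - mean(t_pre)  # pre/post gain, no comparison group
did = (mean(t_post) - mean(t_pre)) - (mean(c_post) - mean(c_pre))

print(f"Naive pre/post gain:            {naive:.1f}")
print(f"Gain vs comparison group (DiD): {did:.1f}")
```

The naive estimate absorbs the background trend (about 5 points here) on top of the true 3-point effect; subtracting the comparison group’s gain strips the trend out, which is exactly what a well-matched comparison group buys you.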

  31. Summary … • Think hard about learning • Invest in good CPD • Evaluate teaching quality • Evaluate impact of changes Robert.Coe@cem.dur.ac.uk @ProfCoe www.cem.org
