
Implementation: what is it, why is it important and how can we assess it?






Presentation Transcript


  1. Session 2: Implementation and process evaluation • Implementation: Neil Humphrey (Manchester), Ann Lendrum (Manchester) • Process evaluation: Louise Tracey (IEE, York)

  2. Implementation: what is it, why is it important and how can we assess it? Neil Humphrey, Ann Lendrum and Michael Wigelsworth Manchester Institute of Education University of Manchester, UK neil.humphrey@manchester.ac.uk

  3. Overview • What is implementation? • Why is studying implementation important? • How can we assess implementation? • Activity • Feedback • Sources of further information and support

  4. What is implementation? • Implementation is the process by which an intervention is put into practice • If assessment of outcomes answers the question of ‘what works’, assessment of implementation helps us to understand how and why it works • Implementation science has grown dramatically in recent years. For example, in the field of social and emotional learning, an early review by Durlak (1997) found that only 5 per cent of intervention studies provided data on implementation. This figure had risen to 57 per cent 14 years later (Durlak et al., 2011)

  5. What is implementation? • Aspects of implementation • Fidelity/adherence • Dosage • Quality • Participant responsiveness • Programme differentiation • Programme reach • Adaptation • Monitoring of comparison conditions • Factors affecting implementation • Preplanning and foundations • Implementation support system • Implementation environment • Implementer factors • Programme characteristics • See Durlak and DuPre (2008), Greenberg et al (2005), Forman et al (2009)

  6. Why is studying implementation important? • Domitrovich and Greenberg (2000) • So that we know what happened in an intervention • So that we can establish the internal validity of the intervention and strengthen conclusions about its role in changing outcomes • To understand the intervention better – how different elements fit together, how users interact, etc. • To provide ongoing feedback that can enhance subsequent delivery • To advance knowledge on how best to replicate programme effects in real-world settings • However, there are two very compelling additional reasons! • Interventions are rarely, if ever, implemented as designed • Variability in implementation has been consistently shown to predict variability in outcomes • So, implementation matters!

  7. Why is studying implementation important? [figure slide; chart not included in transcript]

  8. Why is studying implementation important? [figure slide; chart not included in transcript]

  9. How can we assess implementation? • Some choices • Quantitative, qualitative or both? • Using bespoke or generic tools? • Implementer self-report or independent observations? • Frequency of data collection? • Which aspects to assess? • Some tensions • Implementation provides natural variation – we cannot randomise people to be good or poor implementers! (although some researchers are randomising key factors affecting implementation, such as coaching support) • Assessment of implementation can be extremely time-consuming and costly • Fidelity and dosage have been the predominant aspects studied because they are generally easier to quantify and measure (see the sketch below) – we therefore know much less about the influence of programme differentiation, quality and so on • The nature of a given intervention can influence how easily implementation can be assessed accurately (for example, assessing fidelity is relatively straightforward in highly prescriptive, manualised interventions)
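
The point that fidelity and dosage are easier to quantify can be made concrete: both reduce to counts and ratios over structured observation records, whereas quality and participant responsiveness require graded judgement by trained observers. A minimal Python sketch, assuming an entirely hypothetical record format (the field names below are illustrative, not taken from any real observation schedule):

```python
# Hedged sketch: quantifying fidelity and dosage from structured
# observation records. All field names here are hypothetical.

from statistics import mean

# Hypothetical per-lesson observation records for one class
lessons = [
    {"steps_delivered": 9, "steps_planned": 10, "minutes": 40},
    {"steps_delivered": 7, "steps_planned": 10, "minutes": 25},
    {"steps_delivered": 10, "steps_planned": 10, "minutes": 45},
]

# Fidelity/adherence: mean proportion of prescribed lesson steps delivered
fidelity = mean(l["steps_delivered"] / l["steps_planned"] for l in lessons)

# Dosage: total exposure to the intervention, here in minutes
dosage = sum(l["minutes"] for l in lessons)

print(f"Mean fidelity: {fidelity:.2f}")   # 0.87
print(f"Total dosage: {dosage} minutes")  # 110 minutes
```

Quality and responsiveness, by contrast, typically need ordinal rating scales and inter-rater checks, which is part of why they have been studied less often.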

  10. Activity • Think about a school-based intervention that you are evaluating – whether for the EEF or another funder • How are you assessing implementation? • What choices did you make (see previous slide) and why? • What difficulties have you experienced? How are these being overcome? • How do you plan to analyse your data? • What improvements could be made to your implementation assessment protocol?

  11. Assessment of implementation – case study (PATHS trial) • PATHS trial overview • Universal social-emotional learning curriculum delivered in twice-weekly lessons, augmented by generalisation techniques and home-link work • 45 schools randomly allocated to deliver PATHS or continue practice as usual for 2 years • c.5,000 children aged 7-9 at start of trial • Outcomes assessed: social-emotional skills, emotional and behavioural difficulties, health-related quality of life, various school outcomes (attendance, attainment, exclusions) • Assessment of implementation • Independent observations • Structured observation schedule developed, drawing upon existing tools • Piloted and refined using video footage of PATHS lessons; inter-rater reliability established • 1 lesson observation per class; moderation by AL in 10% of observations to promote continued inter-rater reliability • Provides quantitative ratings of fidelity/adherence, dosage, quality, participant responsiveness and reach, plus qualitative field-notes on each of these factors • Teacher self-report • Teacher implementation survey developed following the structure/sequence of the observation schedule to promote comparability • Teachers asked to report on their implementation on each of the above factors over the course of the school year, in addition to providing information about the nature of adaptations made (surface vs deep) • School liaison report • Annual survey on usual practice in relation to social-emotional learning (both universal and targeted) to provide data on programme differentiation • Analysis using 3-level multilevel models (school, class, pupil) – see the sketch below • Plus! Lots of qualitative data derived from interviews with teachers and further quantitative data on factors affecting implementation
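
Two of the analytic steps on this slide can be illustrated. First, inter-rater agreement between a pair of observers is commonly checked with Cohen's kappa; a sketch using scikit-learn, with invented ratings (the slide does not say which statistic the trial actually used):

```python
# Hedged sketch: inter-rater agreement on ordinal observation ratings.
# The ratings are invented for illustration.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 4, 2, 5, 4, 3, 4, 2]  # e.g. lesson-quality ratings on a 1-5 scale
rater_b = [3, 4, 3, 5, 4, 3, 5, 2]

# Linear weights penalise near-misses less than distant disagreements,
# which suits ordinal rating scales.
kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"Weighted kappa: {kappa:.2f}")
```

Second, a 3-level model (pupils nested in classes nested in schools) could be specified as follows. R's lme4 or MLwiN are more usual for trials of this kind; the Python sketch below uses statsmodels, and every file and variable name is hypothetical rather than taken from the actual PATHS analysis:

```python
# Hedged sketch: pupils (level 1) nested in classes (level 2) nested in
# schools (level 3). File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("paths_trial.csv")  # hypothetical dataset

# 'school' is the top-level grouping factor; a variance component for
# 'classroom' within each school supplies the middle level.
model = smf.mixedlm(
    "outcome ~ baseline + fidelity",  # class-level fidelity predicting pupil outcomes
    data=df,
    groups=df["school"],                           # random intercept: school
    re_formula="1",
    vc_formula={"classroom": "0 + C(classroom)"},  # random intercept: class within school
)
result = model.fit()
print(result.summary())
```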

  12. Sources of further information and support • Some reading • Lendrum, A. & Humphrey, N. (2012). The importance of studying the implementation of school-based interventions. Oxford Review of Education, 38, 635-652. • Durlak, J.A. & DuPre, E.P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327-350. • Kelly, B. & Perkins, D. (Eds.) (2012). Handbook of implementation science for psychology in education. Cambridge: Cambridge University Press. • Organisations • Global Implementation Initiative: http://globalimplementation.org/ • UK Implementation Network: http://www.cevi.org.uk/ukin.html • Journals • Implementation Science: http://www.implementationscience.com/ • Prevention Science: http://link.springer.com/journal/11121

  13. Developing our approach to process evaluation Louise Tracey

  14. Process Evaluation: ‘documents and analyses the development and implementation of a programme, assessing whether strategies were implemented as planned and whether expected output was actually produced’ (Bureau of Justice Assistance, 1997; cited in EEF, 2013)

  15. Reasons for Process Evaluation • Formative • Implementation/fidelity • Understanding impact

  16. Methods of Process Evaluation • Quantitative/qualitative • Observations • Interviews • Focus groups • Surveys • Instruments • Programme data

  17. Plymouth Parent Partnership: SPOKES • Literacy programme for parents of struggling readers in Year 1 • 6 cohorts • Impact evaluation: pre-test, post-test, 6-month & 12-month follow-up

  18. Plymouth Parent Partnership: SPOKES • Process evaluation: • Parent telephone interview • PARYC (Parenting Young Children scale) • Teacher SDQs (Strengths and Difficulties Questionnaires) • Parent questionnaire • Attendance records • Parent programme evaluation survey

  19. SFA (Success for All) Primary Evaluation • Impact evaluation: RCT of SFA in 40 schools • Pre-test/post-test (Reception) & 12-month follow-up (Year 1) • National data (KS1/KS2) • Process evaluation: • Observation • Routine data

  20. Discussion Questions 1. What are the key features of your process evaluation? Why did you choose them? 2. What were the main challenges? How have you overcome them?

  21. Key Features? Why chosen? • Key stakeholders • Inclusivity • Reliability • Costs • Inform impact evaluation

  22. Main challenges? How overcome? • Shared understanding with key stakeholders/implementers • Reliability • Burden on schools • Control groups • Costs

  23. Any Questions?

  24. Thank you! louise.tracey@york.ac.uk
