
Issues in impact evaluation in design






Presentation Transcript


  1. Issues in impact evaluation in design Howard White International Initiative for Impact Evaluation (3ie)

  2. Impact evaluation An impact evaluation seeks to attribute all, or part, of the observed change in outcomes to a specific intervention.

  3. Impact in the log frame: basic education

  4. So what is impact?
  • Focus on final welfare outcomes, e.g.:
    • Infant mortality
    • Income poverty
    • Security
  • Usually long-term, but not necessarily so (when it is not, sustainability becomes an issue)

  5. What sort of things can we do impact analysis on?
  • Projects (or specific interventions)
    • Individual projects are the ‘backbone’ of impact analysis
    • But even then it may only be possible to do rigorous impact analysis of some components
  • Programmes
    • Sector-wide programmes can be conceived of as supporting a range of interventions, many of which can be subject to rigorous impact evaluation
  • Policies
    • In general different approaches are required, such as CGE models – these are not being discussed today

  6. Engagement 1 Pick a named intervention for an impact evaluation and make a short list of indicators (using the log frame) for evaluating this intervention

  7. What do we need to measure impact? Girls’ secondary enrolment

  8. Post-treatment control comparison But we don’t know if the groups were similar before the intervention… though there are ways of addressing this

  9. Before versus after comparison Sometimes this can work … but usually not

  10. Double difference =(66-40)-(55-44) = 26-11 = 15
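The three comparisons on slides 8–10 can be sketched numerically. Assuming, as the figures on this slide suggest, that the treatment group's outcome rose from 40 to 66 while the control group's rose from 44 to 55 (the labels and units are illustrative, e.g. girls' secondary enrolment rates), the double difference nets out the common trend that contaminates the simpler comparisons:

```python
# Double difference (difference-in-differences) using the slide's figures.
# Assumed values: treatment 40 -> 66, control 44 -> 55.
treat_before, treat_after = 40, 66
control_before, control_after = 44, 55

# Before-versus-after: attributes the whole change to the intervention.
before_after = treat_after - treat_before                      # 26

# Post-treatment comparison: ignores any pre-existing gap between groups.
post_only = treat_after - control_after                        # 11

# Double difference: change in treatment group minus change in control group.
double_diff = (treat_after - treat_before) - (control_after - control_before)

print(before_after, post_only, double_diff)  # 26 11 15
```

The double difference (15) matches the slide's arithmetic: (66 − 40) − (55 − 44) = 26 − 11 = 15.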

  11. THE IMPORTANCE OF BASELINE DATA
  • Ex ante design is preferred to ex post: impact evaluation design is much stronger if baseline data are available (but may still be possible even if they are not)
  • Means collecting data before the intervention starts, which can affect the design of the intervention
  • But can sometimes use secondary data, that is, an existing survey

  12. Issues in conducting impact evaluation
  • Confounding factors
  • Selection effects
  • Spillovers and contagion
  • Impact heterogeneity
  • Ensuring policy relevance

  13. Confounding factors
  • Other things happen – so before-versus-after comparison is rarely sufficient
  • So get a control group… but different things may happen there too
  • So collect data on more than just outcome and impact indicators
  • And collect baseline data
  • But …

  14. Selection bias
  • Programme placement and self-selection
  • Programme beneficiaries have particular characteristics correlated with outcomes – so impact estimates are biased
  • Need to use experimental or quasi-experimental methods to cope with this; this is what is usually meant by ‘rigorous’ impact evaluation
  • But it is just one facet of impact evaluation design
  • Other things can also bias impact estimates

  15. How to address selection bias
  • Experimental (randomized) designs:
    • Limited application, but there are applications and it is a powerful approach
    • Many concerns (e.g. budget and ethics), and not always valid
  • Quasi-experimental designs (regression based):
    • Propensity score matching is the most common
    • Regression discontinuity
    • Interrupted time series
    • Regression modelling of outcomes
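As a rough sketch of the matching idea behind propensity score matching: each unit gets a propensity score (the estimated probability of receiving treatment given observed covariates, typically from a logistic regression), each treated unit is paired with the control whose score is closest, and the average outcome gap across matched pairs estimates the effect on the treated. All numbers below are made up for illustration; in practice the scores would be estimated, not assumed.

```python
# Nearest-neighbour matching on the propensity score (illustrative sketch).
# Each tuple is (propensity score, outcome); the scores here are invented,
# standing in for fitted values from a logistic regression of treatment
# status on covariates.
treated = [(0.62, 70), (0.48, 64), (0.55, 68)]
controls = [(0.60, 58), (0.50, 55), (0.30, 49), (0.57, 57)]

def match_effect(treated, controls):
    """Average treated-minus-matched-control outcome: a crude ATT estimate."""
    diffs = []
    for score, outcome in treated:
        # Pair this treated unit with the control closest in propensity score.
        _, c_outcome = min(controls, key=lambda c: abs(c[0] - score))
        diffs.append(outcome - c_outcome)
    return sum(diffs) / len(diffs)

print(round(match_effect(treated, controls), 2))
```

Real applications add refinements this sketch omits: a common-support restriction, matching with replacement or calipers, and balance checks on the covariates after matching.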

  16. Spillover and contagion
  • Spillover – positive and negative impacts on non-beneficiaries
  • Contagion – similar interventions occurring in control areas
  • Need to collect data on these aspects and may need to revise the evaluation design

  17. Engagement 2 WHAT ARE THE MAJOR CONFOUNDING FACTORS FOR YOUR OUTCOME AND IMPACT INDICATORS? HOW MIGHT SELECTION BIAS, SPILLOVER AND CONTAGION AFFECT THE EVALUATION OF THE INTERVENTION YOU HAVE SELECTED?

  18. Impact heterogeneity
  • Impact varies by intervention (design), beneficiary and context
  • ‘Averages’ can be misleading
  • Strong implications for evaluation design
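A minimal numeric illustration of why 'averages' can mislead (all figures invented): an intervention whose average impact is zero may still have large, offsetting impacts in different beneficiary groups, so reporting only the pooled mean would hide both effects.

```python
# Hypothetical per-beneficiary impacts in two subgroups.
impacts = {
    "infants":        [12, 10, 14],    # large positive impacts
    "older_children": [-11, -13, -12], # offsetting negative impacts
}

all_impacts = [x for group in impacts.values() for x in group]
overall = sum(all_impacts) / len(all_impacts)
by_group = {g: sum(v) / len(v) for g, v in impacts.items()}

print(overall)   # 0.0 -- the pooled average suggests no impact at all
print(by_group)  # {'infants': 12.0, 'older_children': -12.0}
```

This is why the evaluation design should plan subgroup analysis (by beneficiary, context and intervention design) from the start, including sample sizes large enough to detect effects within each group.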

  19. Impact heterogeneity by design: complements or substitutes?
  • Is the impact of doing X and Y together bigger than, equal to, or less than the impacts of doing X and Y separately?
  • For example, hygiene promotion and sanitation facilities
  • Evidence suggests they are substitutes – either one reduces the incidence of child diarrhea by 40–50%, but the reduction is not larger when the two are combined

  20. Impact heterogeneity by beneficiary: nutrition interventions
  • Irreparable damage to physical and cognitive development results from nutritional deprivation in the first two years of life
  • Hence interventions aimed at infants have greater long-run impact on many outcomes than those aimed at older children (such as school feeding programmes)

  21. Impact heterogeneity by context: expected impact of irrigation project under different scenarios

  22. ENGAGEMENT 3 What sort of differences in impact would you expect for your intervention with respect to intervention (design), context and beneficiary?

  23. Ensuring policy relevance
  • Process
    • Stakeholder engagement
    • Packaging messages
  • Design
    • Theory-based approach
    • Mixed methods
    • Capture all costs and benefits, including cross-sectoral effects
    • Cost-effectiveness analysis and CBA

  24. THEORY-BASED EVALUATION
  • Makes explicit the underlying theory about how inputs lead to intended outcomes and impacts
  • Documents every step in the causal chain
  • Draws on multiple data sources and approaches
  • Stresses context: why the intervention is, or is not, working

  25. An example: Bangladesh nutrition project

  26. Data collection
  • Need to collect survey data at the unit of intervention (child, firm, etc.)
  • Will also need facility/project data
  • Need data across the log frame and for confounding factors – and for your instrumental variables (lack of valid instruments is a major obstacle to performing IE)
  • Designing data collection instruments takes time and should be iterated with qualitative data collection

  27. Data collection in IEG impact studies

  28. Group exercise OUTLINE YOUR PROPOSED EVALUATION DESIGN (TIMING OF DATA COLLECTION, IDENTIFICATION OF CONTROL, IF ANY) WHAT DATA SOURCES WOULD YOU USE FOR YOUR PROPOSED EVALUATION?

  29. Thank you VISIT WWW.3IEIMPACT.ORG
