Steps in Implementing an Impact Evaluation


Presentation Transcript


  1. Victor Orozco, Development Impact Evaluation Initiative (DIME). Steps in Implementing an Impact Evaluation

  2. Steps

  3. Step 1: Build capacity for IE
  • Objectives:
    • Become informed consumers of impact evaluation
    • Set the learning agenda
    • Use IE as an internal management tool to improve the program over time
  • How?
    • Training
    • Learning by doing

  4. Step 2: Set the learning agenda
  • Objective:
    • Get answers to relevant policy and operational questions
  • How?
    • Dialectic discussion involving key policy makers and program managers
    • Technical facilitation to structure the framework of analysis
    • Focus on a few critical policy (what) and operational (how to) questions
    • Discuss the agenda with the authorizing environment and constituencies

  5. Cont. Step 2: Questions
  • Operational questions: design choices of the program
    • Institutional arrangements, delivery mechanisms, packages, pricing/incentives
    • Management purpose
    • Use randomized trials to test alternatives
    • Measure effects on short-term outcomes (months): take-up rates, use, adoption
    • Scale up the better implementation modalities
  • Policy questions: effectiveness of the program
    • Accountability purpose
    • Use random assignment or the next-best method
    • Measure effects over the medium to long term
    • Scale up/down, negotiate budget, inform

  6. Step 3: Design the IE
  • Exploit opportunities:
    • Will roll-out take time?
    • Is the allocated budget insufficient to cover everyone?
    • Are there quantitative eligibility rules?
    • If the program has universal access, does it have imperfect take-up?
  • Set the scale:
    • Pilot to try out an intervention
    • Large scale with a representative sample: more costly, externally valid
    • Large scale with a purposeful sample: less costly, indicative
  • Do a power calculation to determine the minimum sample size (a sketch follows below)
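A minimal sketch of the power calculation mentioned above, using statsmodels' TTestIndPower for a two-arm, individual-level design. The effect size, significance level, and power are illustrative assumptions, not figures from the presentation.

```python
# Power calculation sketch: solve for the minimum sample size per arm.
# All numbers below are illustrative assumptions.
from math import ceil

from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_arm = analysis.solve_power(
    effect_size=0.2,   # assumed minimum detectable effect (Cohen's d)
    alpha=0.05,        # assumed significance level
    power=0.8,         # assumed statistical power
)
print(f"Minimum sample size per arm: {ceil(n_per_arm)}")
```

Smaller detectable effects or higher power raise the required sample size quickly, which is why this calculation belongs at the design stage, before the scale is fixed.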

  7. Cont. Step 3
  • Select the "best" method for each of your questions:
    • Feasible
    • Requires the fewest assumptions
  • Ethics:
    • Do not deny access to something for which there is irrefutable evidence
    • Test interventions before scale-up when you have no solid evidence

  8. Step 4: Plan implementation
  • Budget cost items:
    • Staff time (PROJECT FUNDS) and training (DIME)
    • Analytical services and field coordination (DIME)
    • Data collection (PROJECT FUNDS)
    • Discussions and dissemination (shared)
  • Timeline:
    • Use it to organize activities and responsibilities, and work backwards to know when to start
  • Team:
    • Government (program manager, economist/statistician); WB project team (task manager or substitute); research team (lead researcher, co-researchers, field coordinator); data collection agency

  9. Step 5: Assignment to treatment and control
  • The smallest unit of assignment is the unit of intervention:
    • Training and credit: individuals and groups
    • Municipal registration system: municipality
  • Create a listing of treatment units assigned to the intervention and control units that are not (a sketch follows below)
  • Explain the assignment to responsible parties to avoid contamination
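A minimal sketch of producing such a listing with reproducible random assignment. The unit names, the number of units, and the 50/50 split are hypothetical; the fixed seed makes the draw auditable.

```python
# Reproducible treatment/control listing at the unit of intervention.
import csv
import random

units = [f"municipality_{i:03d}" for i in range(1, 101)]  # hypothetical units
rng = random.Random(2024)   # fixed seed so the assignment can be re-verified
rng.shuffle(units)
half = len(units) // 2
listing = [(u, "treatment") for u in units[:half]] + \
          [(u, "control") for u in units[half:]]

# Write the listing to share with the parties responsible for roll-out.
with open("assignment.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["unit", "arm"])
    writer.writerows(sorted(listing))
```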

  10. Step 6: Baseline data
  • Quality assurance: the IE team (not the data collection agency) should
    • Design the questionnaire and sample
    • Define terms of reference for the data collection agency
    • Train enumerators
    • Conduct a pilot
    • Supervise data collection
  • Do not collect data before your design is ready and agreed

  11. Cont. Step 6: Baseline data
  • Contract a data collection agency:
    • Bureau of Statistics: integrate with existing data
    • Ministry concerned: Ministry of Agriculture/Water Resources/Rural Development
    • Private agency
  • Analyze baseline data and feed it back into program and evaluation design if needed
  • Check for balance between treatment and control groups: do they have similar average characteristics? (a sketch follows below)
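A sketch of the balance check, comparing average baseline characteristics across arms with Welch's t-test. The file name, the arm column, and the covariate names are hypothetical placeholders.

```python
# Baseline balance check: means by arm plus a two-sample t-test per covariate.
import pandas as pd
from scipy import stats

df = pd.read_csv("baseline.csv")            # assumed baseline dataset
for var in ["age", "income", "hh_size"]:    # hypothetical covariates
    t = df.loc[df["arm"] == "treatment", var]
    c = df.loc[df["arm"] == "control", var]
    _, p = stats.ttest_ind(t, c, equal_var=False)  # Welch's t-test
    print(f"{var}: treatment={t.mean():.2f}, control={c.mean():.2f}, p={p:.3f}")
```

With successful randomization, most covariates should show small, statistically insignificant differences.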

  12. Step 7: Roll out the intervention
  • Conduct intensive monitoring of the roll-out to ensure the evaluation is not compromised:
    • What if both treatment and control receive the intervention?
    • What if the entire control group receives some other intervention?

  13. Step 8: Follow-up data
  • Collect follow-up data with the same sample and questionnaire as the baseline (a tracking sketch follows below)
  • Collect at appropriate intervals
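One way to monitor whether the follow-up round reaches the same sample is to merge it back to the baseline listing and compare re-interview rates by arm. A sketch under assumed file and column names:

```python
# Follow-up tracking check: re-interview rate by arm (attrition diagnostic).
import pandas as pd

base = pd.read_csv("baseline.csv")      # assumed files and 'unit_id' key
follow = pd.read_csv("followup.csv")
merged = base.merge(follow[["unit_id"]], on="unit_id",
                    how="left", indicator=True)
merged["reinterviewed"] = merged["_merge"] == "both"
print(merged.groupby("arm")["reinterviewed"].mean())  # differs across arms?
```

Differential attrition between treatment and control can bias the estimated effects even when the original assignment was clean.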

  14. Step 9: Estimate program effects
  • Randomization: compare average outcomes for the treatment and control groups (a sketch follows below)
  • Other methods: use the relevant econometric analysis, test assumptions, check robustness
  • Are the effects statistically significant?
    • Basic statistical tests tell whether differences are due to the program or to noisy data
  • Are they significant in real terms?
    • If a program is costly and its effects are small, it may not be worthwhile
  • Are they sustainable?
    • Is the trajectory of results sustained?
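A sketch of the randomized case: regressing the follow-up outcome on a treatment dummy, which reproduces the difference in means and delivers a significance test in one step. The file, column, and variable names are illustrative.

```python
# Simplest randomized-evaluation estimate: outcome regressed on a
# treatment dummy with heteroskedasticity-robust standard errors.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("followup.csv")                       # assumed dataset
df["treated"] = (df["arm"] == "treatment").astype(int)
model = smf.ols("outcome ~ treated", data=df).fit(cov_type="HC1")
print(model.summary())   # the 'treated' coefficient is the impact estimate
```

The p-value on the treatment dummy answers the statistical-significance question; whether the coefficient is large relative to program cost is the real-terms question the slide raises.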

  15. Step 10: Discuss, disseminate, and feed back into policy
  • Are you thinking about this only now?
  • Discuss the policy implications of the results:
    • What actions should be taken?
    • How to present them to higher-ups to justify changes, budget, or scale-up?
  • Talk to policy-makers and disseminate to a wider audience:
    • If no one knows about it, it won't make a difference
    • Make sure the information gets into the right policy discussions
    • Real-time discussions, workshops, reports, policy briefs

  16. Final step: Iterate
  • What do you need to learn next?
