
Steps in Implementing an Impact Evaluation


Presentation Transcript


  1. Steps in Implementing an Impact Evaluation Aïchatou Hassane Africa Impact Evaluation Initiative World Bank

  2. Step 1: Identify priorities • Examine the sector plan • Examples • Poverty Reduction Strategy Paper • Education Long-term Strategic and Financial Framework • Identify the highest priorities for learning • Education • Teacher incentives and management • Learning materials • School-based management (SBM) • If most resources are going into SBM, then that is the opportunity to learn • New policies can also present an opportunity for learning (e.g., Rwanda)

  3. Step 2: Understand roll-out of intervention and opportunities for impact evaluation • How will the school-based management program roll out? • Piloted in a random sample of schools? • Rolled out nationwide? • Rolled out in schools satisfying certain clear criteria? • Each roll-out strategy yields distinct opportunities for impact evaluation

  4. Step 2+: Understand roll-out of intervention and opportunities for impact evaluation • How will the different incentives for contract teachers be implemented? • Piloted in a random sample of districts/schools? • Rolled out in certain districts? • Rolled out in a nationally representative sample? • Each roll-out strategy yields distinct opportunities for impact evaluation

  5. 3. Appropriate design • Keep in mind • the needs of the intervention: target the neediest schools • the needs of the evaluation: take advantage of opportunities for random assignment • School grants example: 1,000 schools to receive grants over 3 years • Randomly assign 300 to each of Phases 1-3, or • Identify the 500 neediest and randomly assign them to Phases 1 and 2 • Rwanda: 3,000 contract teachers to be hired per year • Draw a nationally representative random sample of schools with contract teachers
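The phased assignment in the school-grants example can be sketched in a few lines. The numbers (1,000 schools, 300 per phase, 500 neediest) come from the slide; the school IDs, the need ranking, and the choice to leave leftover schools unassigned are hypothetical:

```python
import random

random.seed(42)  # fixed seed so the assignment is reproducible

# Hypothetical IDs for the 1,000 grant-eligible schools, ranked by a
# (made-up) need index, neediest first.
schools = list(range(1000))

# Option A: pure phase-in. Shuffle and give 300 schools to each of
# Phases 1-3; the remaining 100 are left unassigned here.
shuffled = schools[:]
random.shuffle(shuffled)
phase_a = {sid: 1 + i // 300 for i, sid in enumerate(shuffled[:900])}

# Option B: respect targeting. The 500 neediest schools are randomized
# only between Phases 1 and 2; later phases cover the rest.
neediest = schools[:500]
random.shuffle(neediest)
phase_b = {sid: 1 + i % 2 for i, sid in enumerate(neediest)}
```

Option B shows the compromise the slide describes: targeting constrains *who* can be treated early, while randomization within the targeted group preserves a valid comparison.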

  6. 3+. More on design • Determine scale: at scale or small pilot? • At scale • Nationally representative • More costly to implement • Better information about national effectiveness • Small pilot (e.g., in two districts) • Easier to implement • Not as informative

  7. 4. Random assignment • Randomly assign […] to treatment and control groups • Can randomly assign at the individual, school, clinic, or community level • School grants: at the school level • Treatment package: at the individual level • Contract teachers: at the school level • Trade-off: randomizing at a higher level requires a bigger sample for the same statistical power
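A minimal sketch of school-level (cluster) assignment, with hypothetical pupil and school IDs: because treatment is assigned to whole schools, every pupil in a school shares the same status, which is why the effective sample size is driven by the number of schools rather than the number of pupils.

```python
import random

random.seed(0)

# Hypothetical roster: (pupil_id, school_id); 200 pupils across 20 schools.
pupils = [(p, p % 20) for p in range(200)]

# Randomize at the school (cluster) level, not the pupil level.
school_ids = sorted({s for _, s in pupils})
random.shuffle(school_ids)
treated_schools = set(school_ids[: len(school_ids) // 2])

# Every pupil inherits the treatment status of their school.
assignment = {p: s in treated_schools for p, s in pupils}
```

For individual-level randomization (the treatment-package case on the slide), you would shuffle pupil IDs directly instead of school IDs.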

  8. 5. Collect baseline data • Baseline data are not strictly necessary (randomization implies treatment and control are similar), but they • Allow you to verify that treatment and control appear balanced • Provide valuable data for impact analysis • Did the program mostly benefit patients who were poor at baseline? Or students who performed well at baseline? • Allow analysis of targeting efficiency • Take advantage of ongoing data collection

  9. 5+. Baseline questionnaires • Include areas essential to the impact evaluation • Ultimate outcomes we care most about • Intermediate outcomes we expect to change first • Take advantage of the opportunity to collect essential sector data • Gambia: corporal punishment • Rwanda: double-shift teaching (location and teacher), bonus for teachers paid by parents • Focus groups on contract teachers for more information • Who collects it? • Bureau of Statistics: integrate with existing data collection • Ministry concerned (e.g., Rwanda: Ministry of Education) • Private agency: sometimes easier to monitor quality

  10. 6. Check for balance • Do treatment and control groups look similar at baseline? • If not, all is not lost! • Even in the absence of perfect balance, you can use the baseline data to adjust the analysis
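A balance check is usually just a comparison of baseline means between the two groups. The sketch below computes a Welch two-sample t statistic for one covariate; the test scores are made-up illustrative numbers, and in practice you would repeat this for each baseline covariate.

```python
import statistics

def balance_t(a, b):
    """Welch two-sample t statistic for a difference in means."""
    se = (statistics.variance(a) / len(a) + statistics.variance(b) / len(b)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

# Hypothetical baseline test scores from treatment and control schools.
treatment_scores = [52, 48, 61, 55, 47, 58, 50, 53]
control_scores = [51, 49, 60, 54, 46, 57, 52, 50]

t = balance_t(treatment_scores, control_scores)
# Rule of thumb: |t| below ~2 suggests the groups look similar on this covariate.
print(f"t = {t:.2f} -> {'balanced' if abs(t) < 2 else 'imbalanced'}")
```

If a covariate turns out imbalanced, it can be included as a control variable in the impact regression, which is the adjustment the slide refers to.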

  11. 7. Roll out the intervention • Monitor the roll-out to ensure the evaluation is not compromised • What if the benefits are accidentally rolled out to everyone, all at once? • Example: new chalkboards go to all schools • The evaluation is compromised: monitoring was needed! • What if all the control households receive some other benefit? • Example: an NGO targets control schools to receive lunches (WFP does this in some schools in Rwanda) • This changes the evaluation • Example: PTA training delivered at the district level

  12. 7+. Gather information on roll-out • In reality, who receives which benefits, and when? • This could affect the impacts measured • Does the intervention involve something other than what was initially planned? • Example: you learn that those delivering resources to clinics also gave detailed guidance on clinic management • The program impact now includes the effect of that guidance

  13. 8. Follow-up data • Collect follow-up data for both the treatment and control groups • At appropriate intervals • Consider how long it should take for outcomes to change • Sub-sample at six months? Captures intermediate changes • One year • Provides initial outcomes • Allows the program to be adjusted if needed • Two years: changes in longer-term outcomes? • After the end of the program: do effects endure? • Example: school feeding in Kenya • What happens once contract teachers have (or have not) obtained a permanent position?

  14. 9. Estimate program impacts • Randomization: Simply compare average outcomes for treatment and comparison • Other methods: Make statistical assumptions to estimate impact of program
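The "simply compare average outcomes" point can be made concrete: under randomization, the impact estimate is the difference in mean outcomes between the two groups. The follow-up scores below are hypothetical.

```python
import statistics

def impact_estimate(treatment, control):
    """Under randomization, the impact estimate is simply the
    difference in average outcomes between treatment and control."""
    return statistics.mean(treatment) - statistics.mean(control)

# Hypothetical follow-up test scores.
treatment_scores = [58, 64, 55, 61, 60, 57]
control_scores = [52, 55, 50, 54, 53, 48]

# Estimated program impact, in test-score points.
print(round(impact_estimate(treatment_scores, control_scores), 2))
```

The non-experimental methods mentioned on the slide (matching, difference-in-differences, regression discontinuity, and so on) replace this simple comparison with estimators that rest on additional statistical assumptions.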

  15. 10. Are the effects big enough to matter? • Are they statistically significant? • A basic statistical test tells you whether differences are due to the program or to noisy data • Are they policy significant? • If the anti-HIV media campaign costs a million dollars and has a positive but tiny effect, it may not be worthwhile
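To make "policy significance" concrete, here is a back-of-the-envelope cost-effectiveness calculation for the media-campaign example. Only the $1,000,000 cost comes from the slide; the effect size and population reached are assumed numbers for illustration.

```python
# Hypothetical cost-effectiveness check for the media campaign example.
cost = 1_000_000       # campaign cost in dollars (from the slide)
effect = 0.002         # assumed drop in HIV incidence (share of population)
population = 100_000   # assumed population reached

infections_averted = effect * population           # about 200 infections
cost_per_infection_averted = cost / infections_averted
# Roughly $5,000 per infection averted under these assumptions.
print(round(cost_per_infection_averted))
```

Whether that figure is "worthwhile" is a policy judgment: it should be compared against the cost-effectiveness of alternative uses of the same budget, which is exactly the point the slide is making.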

  16. 11. Disseminate! • If no one knows about the results, they won't make a difference to policy! • Make sure the information gets into the right policy discussions • Build government ownership and capacity • Forums • Workshops • Reports • Policy briefs

  17. 12. Iterate • Re-examine sector priorities: identify the next learning opportunity • Or suppose the effects aren't as big as you hoped • Test variations (e.g., different teacher or clinic-officer training) • Test other interventions to affect the same outcomes (e.g., better equipment or school materials) • Test, test, test!

  18. Thank you
