Cultivating Demand Within USAID for Impact Evaluations of Democracy and Governance Assistance


Presentation Transcript


  1. Cultivating Demand Within USAID for Impact Evaluations of Democracy and Governance Assistance. Mark Billera, USAID Office of Democracy and Governance. Perspectives on Impact Evaluation, Cairo, Egypt, April 1, 2009.

  2. Goal of USAID Democracy and Governance (DG) Assistance Promote the transition to and consolidation of: • Democratic institutions, • Civic values, and • Good governance, and in doing so directly advance broader stabilization and development objectives.

  3. Why Promote Democracy and Strengthen Good Governance? • Consistent with the U.S. political system and the conviction that all people share fundamental rights and freedoms. • Central to U.S. national security priorities. • Enhances the broader development agenda: economic growth and the provision of crucial services such as education and health care.

  4. Growth of DG Assistance • Democracy and Governance (DG) is a relatively young sector in the field of development. • Over the past 25 years, USAID has supported democratization in over 125 countries, in every region of the world. • Investment has increased dramatically.

  5. What works? • What we know is based on insight and anecdote rather than empirical evidence. • Policy makers and practitioners need better information. • Current evaluation practices provide insufficient answers.

  6. Social Science Research Council (SSRC) In 2003, USAID commissioned the SSRC to develop a methodological and analytical strategy for evaluating DG programs. Its recommendations: • Focus on the future as well as the past. • Focus on democracy ‘activities’ rather than the more general sectoral level. • Use multiple methodologies.

  7. Responses to the SSRC • Cross-national quantitative studies of the impact of USAID democracy assistance. • Voices from the field. • USAID asked the National Academy of Sciences (of the National Academies) to recommend specific methodologies for implementing the SSRC's strategy.

  8. Improving Democracy Assistance: Building Knowledge Through Evaluations and Research Published by the National Academies in 2008, the report recommends: • A pilot program of impact evaluations, including use of randomized designs. • Developing better DG indicators at the sectoral level. • Conducting country case studies in clusters to place DG assistance in the context of national and international factors. • Rebuilding USAID's institutional mechanisms for learning.

  9. Definition. Impact evaluation seeks to establish the effect of a project (intervention). The effect is defined as the difference between the outcome that occurred in the presence of the project and the outcome that would have occurred in its absence. Because the outcome in the project's absence (the counterfactual) can never be observed directly, the effect is typically estimated by comparing outcomes in target and comparison groups.
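
A minimal sketch of that comparison, with all data invented for illustration (a "civic-knowledge score" from a made-up voter-education project); a real evaluation would also test whether the difference could have arisen by chance:

```python
import statistics

# Hypothetical data: post-project scores on a 0-100 civic-knowledge index.
# All names and numbers are invented for illustration; this is not USAID data.
target_scores = [62, 71, 58, 66, 74, 69, 63, 70]      # received the project
comparison_scores = [55, 60, 52, 58, 61, 57, 54, 59]  # did not receive it

# The estimated effect is the difference in mean outcomes between the target
# group and the comparison group, which stands in for the unobservable
# "outcome in the project's absence".
effect = statistics.mean(target_scores) - statistics.mean(comparison_scores)
print(f"Estimated project effect: {effect:.1f} points")
```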

  10. Steps to Build Demand for Impact Evaluations • Publicize report by the National Academies. • Training in impact evaluation for USAID DG Officers and implementing partners. • Meetings with host country and implementing partners. • Communicate with other donors. • Demonstrate that impact evaluations can work in DG.

  11. Publicize the Report • Public event in Washington for USAID implementing partners and staff (May 2008). • The report is featured prominently in USAID DG informational materials: glossy brochures and the web site. • The report was emphasized at the 2008 USAID DG Partners Conference.

  12. Training Why use impact evaluations? What would they look like with DG programs? • Trained approximately 65 DG Officers from around the world at the 2008 USAID DG Officers' Conference. • Trained approximately 45 people from 29 partner organizations in February 2009. • Both trainings will be repeated.

  13. Pre/Post-Test Results for USAID DG Officers How likely are you to use comparison (control) groups in the M&E/PMPs of new DG projects that you manage or supervise? Pre-test: 31% answered likely or very likely (18 of 58). Post-test: 78% answered likely or very likely (42 of 54).

  14. Pre/Post-Test Results for USAID DG Officers How likely are you to use random selection of target beneficiaries in the design of new DG projects that you manage or supervise? Pre-test: 38% answered likely or very likely (21 of 56). Post-test: 64% answered likely or very likely (34 of 53).
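
A hedged sketch of what random selection of target beneficiaries can look like at the design stage, assuming a hypothetical pool of 40 eligible communities (the identifiers and the seed are invented):

```python
import random

# Hypothetical pool: 40 communities eligible for a made-up local-governance
# project. Identifiers are illustrative only.
eligible = [f"community_{i:02d}" for i in range(1, 41)]

rng = random.Random(2009)  # fixed seed makes the assignment auditable
rng.shuffle(eligible)

half = len(eligible) // 2
treatment_group = sorted(eligible[:half])    # receive the project first
comparison_group = sorted(eligible[half:])   # held back as the comparison

print(len(treatment_group), "treated,", len(comparison_group), "comparison")
```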

  15. Legitimate Concerns about DG Impact Evaluations (Especially Randomized) Expressed by DG Officers and partners: • Randomized evaluations are expensive. • Ethical considerations. • Spillover effects. • Flexibility of programming. • Requires political will.

  16. Demonstrate Impact Evaluations Can Work in DG • Provide technical assistance and incentive funds to early adopters within USAID. • Others want to see proof on the ground.

  17. What have we done in eight months? • Fielded around 15 expressions of interest from USAID Missions that are launching new programs. • Provided three expert trips to the field. • At least three Missions have reached implementation stage of programs that include some form of impact evaluation (two as a result of prior work). • Several more Missions will soon implement some form of impact evaluation.

  18. What have we learned in eight months? • Ethical considerations around denying treatment to comparison groups have not been a problem; the opposite is true: programs often do not treat enough recipients to support statistically significant comparisons with non-treated groups (see the sketch after this slide). • USAID DG programming is a complex combination of many interventions in smaller doses, which is more difficult to evaluate than large doses of a single intervention. • Timing is everything, but we have the greatest leverage as evaluators during the early but less important stages of program design. • Do not forget about bureaucracy; many people can say no.
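
A rough back-of-the-envelope way to see why "not enough recipients" bites, using the standard two-sample sample-size approximation at 5% significance and 80% power (the effect sizes below are illustrative conventions, not figures from the presentation):

```python
import math

def n_per_group(effect_sd, z_alpha=1.96, z_power=0.84):
    """Approximate sample size per group to detect a difference in means of
    `effect_sd` standard deviations, via n = 2 * (z_a + z_b)^2 / d^2."""
    return math.ceil(2 * (z_alpha + z_power) ** 2 / effect_sd ** 2)

# Small effects, which are plausible for diffuse DG interventions delivered
# in "smaller doses", require large treated and comparison groups.
for d in (0.2, 0.5, 0.8):
    print(f"effect of {d} SD -> about {n_per_group(d)} recipients per group")
```

Under these assumptions, detecting a small (0.2 SD) effect takes roughly 400 recipients per group, far more than many DG activities reach.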

  19. Remaining Questions • How intrusive should we be as evaluators? • Should the research design test the project as it would occur without the evaluation? • Or is it better to design the project and the evaluation with the primary goal of testing the development hypothesis? • Should the strategy be to conduct a small number of large impact evaluations or to promote the general approach as widely as possible? • Should we focus on one or a few research questions with a number of evaluations or work on more questions with fewer evaluations for each?
