Prioritizing among impact questions



  1. Prioritizing among impact questions (Module 3.3)

  2. How important will the results be? • Is the program (or similar programs) large or expanding? • If the program works, will it be cheap and easy to scale, i.e., does it have high potential impact? • How much existing evidence is there already? • Is the question general (theory-based) or tied to one specific context?

  3. Can the question be answered well? • There must be a good way to measure outcomes • When evaluating an HIV education program, it is critical to get measures that go beyond self-reported attitudes and behavior, or the results may mislead • There must be sufficient sample size to pick up a reasonable effect (more on this later; see the sketch below) • A study that finds a treatment effect not statistically different from zero, but that cannot rule out a reasonably large impact, may be worse than useless • It may be misread as proving there is no effect
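
To make the sample-size point concrete, here is a minimal sketch of the standard two-arm power formula. The 0.2-SD "reasonable effect" and the conventional 5% significance / 80% power values are illustrative assumptions, not figures from the module.

```python
from scipy.stats import norm

def n_per_arm(effect_size_sd, alpha=0.05, power=0.8):
    """Approximate sample size per arm for a two-arm, individually
    randomized trial, for an effect expressed in standard deviations."""
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value, two-sided test
    z_beta = norm.ppf(power)           # quantile giving desired power
    return 2 * ((z_alpha + z_beta) / effect_size_sd) ** 2

# Halving the minimum detectable effect quadruples the required sample,
# which is why an underpowered null result cannot rule out a big impact.
for mde in (0.4, 0.2, 0.1):
    print(f"MDE = {mde:.1f} SD -> n per arm ~ {n_per_arm(mde):.0f}")
```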

  4. Is the program representative? • Representative program design • Testing the impact of a standard program provides a benchmark against which variants can be measured • Representative geography and cultural context • Results are more likely to generalize to a similar context, so testing in a context that looks like many others is a plus • Representative partner • If implementation is of unusually high quality, we may worry that results will not generalize to other partners • But doing an impact evaluation of a poorly implemented program is a waste: process evaluation can already tell you it won't work • The partner must be committed to the evaluation, or the evaluation will fail

  5. When being unrepresentative is good • Proof-of-concept evaluations can be very informative • These test: does this type of program work when implemented very well? • If a proof-of-concept evaluation shows positive results, cheaper, more scalable versions of the program can then be tried • If a proof-of-concept evaluation finds no impact, cheaper or more scalable versions run by weaker partners are unlikely to work

  6. Is the implementing partner right? • Implementation partner commitment is essential, both at the local and the headquarters level • Q: Why is implementer commitment so important? How could lack of commitment disrupt an evaluation? • There is a potential tradeoff between having a partner committed to evaluation and having a representative partner • One approach is to work with a committed partner but make sure the program does not depend on highly specific, hard-to-replicate skills • E.g., that it does not rely on unusually committed staff

  7. Cost-effective evaluation • Rigorous impact evaluations vary enormously in cost, so it is important to balance costs and benefits • What makes an evaluation expensive? • Collecting new data rather than using administrative data • Randomizing by group rather than by individual (see the design-effect sketch below) • Rural and remote locations, which drive up transport costs • Expensive skilled labor (as in much of Africa) • Multiyear evaluations • Many treatment arms • Non-survey outcomes (such as biological tests) • Even large, expensive evaluations can be cost-effective if they improve the effectiveness of spending
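
A sketch of why group-level randomization raises costs: under the standard Kish design effect, clustering inflates the sample, and hence the survey budget, needed to keep the same statistical power. The cluster size, intra-cluster correlations, and baseline sample below are hypothetical.

```python
def design_effect(cluster_size, icc):
    """Kish design effect: factor by which a cluster-randomized design
    inflates the required sample relative to individual randomization."""
    return 1 + (cluster_size - 1) * icc

base_n = 800  # hypothetical n needed under individual randomization
for icc in (0.01, 0.05, 0.10):
    deff = design_effect(40, icc)  # 40 respondents per cluster (assumed)
    print(f"ICC = {icc:.2f}: design effect = {deff:.2f}, "
          f"n ~ {base_n * deff:.0f}")
```

Even a modest intra-cluster correlation of 0.05 roughly triples the required sample here, which is why randomizing by group rather than by individual is listed among the main cost drivers.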

  8. Developing an evaluation strategy • If an organization wants a plan for where to invest its evaluation budget, start with the key questions facing the organization • Determine which can be answered from existing evidence and which from process evaluation • Select the top-priority questions for impact evaluation • Find the best context for answering them: where is representative, and where are there opportunities to evaluate?

  9. From impact back to process • Once key questions are answered by impact evaluations, process evaluation can be used to estimate overall impact • Example: bednets and malaria • Impact evaluation shows that hung bednets reduce malaria • Impact evaluation shows that free bednets do get hung • A needs assessment finds areas where malaria is high and bednet coverage is low • Process evaluation monitors how many bednets are given out • It is then possible to impute lives saved from the number of bednets distributed (see the sketch below)
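
A back-of-the-envelope sketch of the imputation chain this slide describes, combining a process-evaluation count with rates from impact evaluations. Every number below is a placeholder chosen for illustration, not a result from the bednet studies the slides refer to.

```python
# Hypothetical inputs: replace with rates estimated in your own context.
nets_distributed = 100_000        # from process-evaluation monitoring
hang_rate = 0.60                  # share of free nets hung (impact eval)
cases_averted_per_hung_net = 0.5  # annual malaria cases averted per net
case_fatality = 0.002             # deaths per malaria case in the area

hung_nets = nets_distributed * hang_rate
cases_averted = hung_nets * cases_averted_per_hung_net
deaths_averted = cases_averted * case_fatality
print(f"Imputed deaths averted per year: {deaths_averted:.0f}")
```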
