Evaluating the Options

Presentation Transcript

  1. Evaluating the Options. The analyst’s job is to gather the best evidence possible in the time allowed and to compare the potential impacts of policies.

  2. When do we ask each question? Activities, policies, and programs produce outcomes over time. • Before (“What should we do?”): policy analysis, using benefit-cost analysis, cost-effectiveness analysis, and needs assessment • During (“What are we doing? How does it work?”): process evaluation and performance measurement • After (“What was the impact?”): outcome evaluation

  3. WSIPP: “Return on Investment: Evidence-Based Options to Improve Statewide Outcomes” • What information do we get from the WSIPP study? • How did they create it? • What principles can we take away for our predictions?

  4. Benefits: To whom? For what period?

  5. Costs: What’s included? How can they be positive?

  6. Summary stats: What are they? How are they different?

  7. Net present value (WSIPP 2011, technical appendix p. 6): Q is how much of the outcome you get with the program; P is the value of the outcome; C is the cost of the program; Dis is the discount rate; proage is the age of the participant when the program starts.
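
In formula form, these variables plausibly combine as a discounted sum over participant ages. This is a hedged reconstruction of the general shape only; the exact indexing and the end age N in the WSIPP technical appendix may differ:

    NPV = \sum_{y=\mathrm{proage}}^{N} \frac{Q_y \, P_y - C_y}{(1 + \mathrm{Dis})^{\, y - \mathrm{proage}}}

where y indexes the year (participant age) at which each benefit or cost occurs.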

  8. Benefit-cost ratio (WSIPP 2012, technical appendix p. 6). Internal rate of return: the discount rate at which NPV equals zero.
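
Using the same notation, the two summary statistics can be sketched as follows (a hedged restatement of the standard definitions, not a quotation from the WSIPP appendix):

    \text{B/C ratio} = \frac{\sum_y Q_y P_y \,/\, (1+\mathrm{Dis})^{\,y-\mathrm{proage}}}{\sum_y C_y \,/\, (1+\mathrm{Dis})^{\,y-\mathrm{proage}}},
    \qquad
    \mathrm{IRR}: \text{the value of } \mathrm{Dis} \text{ at which } NPV = 0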

  9. WSIPP study elements: • Meta-analysis: averages results across multiple studies to get program impacts, Q (meta example) • Estimates monetary benefits, P: • private gains to participants, • public value of avoiding outcomes like abuse and crime, and • private value of not being a victim • Puts these together to get the impact on outcomes over a lifetime, with a discount (Dis) for later benefits and costs • Benefit-cost analysis adds up the benefits and costs
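
As a rough illustration of the meta-analysis step, the sketch below pools several study estimates into one impact Q using inverse-variance weights. It is a minimal sketch with made-up numbers; WSIPP’s actual procedure involves additional adjustments.

    # Pool effect sizes from several studies into one impact estimate Q.
    # Numbers are hypothetical, not WSIPP data.
    effects = [0.12, 0.08, 0.20]       # program impact estimated by each study
    std_errors = [0.05, 0.03, 0.10]    # standard error of each estimate

    weights = [1 / se ** 2 for se in std_errors]   # more precise studies count more
    q_pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    print(f"pooled impact Q = {q_pooled:.3f}")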

  10. WSIPP adjusts estimates for the quality of the evaluation evidence it collects (technical appendix p. 17). The adjustments are determined empirically when there are enough studies (i.e., by comparing which types of estimates tend to be largest).
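
A hedged sketch of what such a quality adjustment can look like: effect sizes from weaker study designs are discounted before they are pooled. The multipliers below are invented placeholders, not WSIPP’s published values.

    # Hypothetical discount factors by study-design quality;
    # weaker evidence counts for less in the pooled estimate.
    quality_discount = {
        "randomized_trial": 1.00,
        "quasi_experimental": 0.75,
        "weak_comparison_group": 0.50,
    }

    raw_effect = 0.20
    adjusted_effect = raw_effect * quality_discount["quasi_experimental"]
    print(f"adjusted effect = {adjusted_effect:.2f}")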

  11. From the WSIPP case: • What outcomes will you account for? • Impacts on whom? • (state or local budget, participants, bystanders) • What time frame will you use? • (discounting and NPV) • How will you weight multiple sources of evidence, given their quality? • How will you add it all up? • Will you demonstrate a sensitivity analysis of results? • What criteria “count” other than monetized ones?

  12. General starting points for predictions: • Need detailed descriptions of the alternatives (but not TOO detailed) • Focus on key impacts and most important costs • Common metrics (dollars, DALYs, etc.) are useful if they capture key outcomes • May need to adjust estimates for your scale or context • Get the best possible evidence; you won’t get perfect information • Need to understand the strengths and weaknesses of your evidence and communicate them

  13. Where to get evidence on costs and impacts (from Hatry): • Previous experience with similar changes • Pilot study in your organization • Information from other organizations that implemented similar policies (program evaluations) • Academic or think tank studies (academic journal search and web search) • Modeled or “engineered” estimates • Theories and logical inference about causal connections (tragically leads to “high,” “medium,” or “low”!) WEAKEST!

  14. Does the evidence from elsewhere apply to your organization (external validity)? • Is the policy or political context different in important ways? • Are the economic conditions different? • Is the target of the policy (e.g., client population or location) different in critical ways? • Would the policy or program be implemented in the same way? To the same scale? You must assess the severity of the differences and predict their impacts on your outcomes.

  15. Sources of uncertainty in estimates: • Validity of the comparison and study methodology (see WSIPP report) • Statistical uncertainty (randomness) • Uncertainty in how the policy would be implemented in a new context • Possible changes in other policies or conditions (e.g., economic or social)

  16. What do you do with uncertainty? • Give explicit range estimates for costs or impacts • Perform sensitivity analysis and discuss effects on trade-offs (e.g., Monte Carlo; a sketch follows below) • Use worst-case/best-case scenarios • Give best-guess estimates with caveats • Build resilience into your policy options But… • Clients like certainty • There is limited time/space to explain details • Need to make decisions in the face of uncertainty
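
A minimal Monte Carlo sketch of that idea (all distributions and parameter values below are hypothetical placeholders, not estimates from any study):

    import random

    def simulate_npv():
        # Draw uncertain per-participant benefit and cost, return net value.
        benefit = random.gauss(5000, 1500)   # monetized benefit, uncertain
        cost = random.gauss(3000, 500)       # program cost, uncertain
        return benefit - cost

    draws = sorted(simulate_npv() for _ in range(10_000))
    median = draws[len(draws) // 2]
    low, high = draws[int(0.05 * len(draws))], draws[int(0.95 * len(draws))]
    share_positive = sum(d > 0 for d in draws) / len(draws)
    print(f"median NPV: {median:.0f}, 90% range: {low:.0f} to {high:.0f}")
    print(f"share of simulations with positive NPV: {share_positive:.0%}")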

  17. How do you add up? • Most likely you don’t! Any adding-up scheme inherently weights the criteria • Can use cost-benefit analysis (monetize) • Use a go/no-go (minimum threshold) test for each criterion and pick the policy that meets all of them, then maximize one (see the sketch below) You don’t have to recommend one policy, but you MUST point out the KEY trade-offs across policy options
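
A hedged sketch of the go/no-go screen described above (option names, criteria, scores, and thresholds are invented for illustration):

    # Score each option on several criteria (hypothetical values).
    options = {
        "Option A": {"net_benefit": 1.2, "equity": 3, "feasibility": 4},
        "Option B": {"net_benefit": 2.5, "equity": 2, "feasibility": 3},
        "Option C": {"net_benefit": 0.8, "equity": 5, "feasibility": 5},
    }
    thresholds = {"equity": 3, "feasibility": 3}   # minimum acceptable levels

    # Keep options that clear every threshold, then maximize one criterion.
    passing = {name: scores for name, scores in options.items()
               if all(scores[k] >= v for k, v in thresholds.items())}
    best = max(passing, key=lambda name: passing[name]["net_benefit"])
    print("pass all thresholds:", list(passing))
    print("pick (highest net benefit):", best)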

  18. Your mandate: • Find at least one quantitative outcome criterion for which you can find evidence to make an estimate. • You must provide cost estimates for your options • Use at least one academic or think tank study as evidence for at least one outcome (and preferably more) • I challenge you to find the most informative quantitative and qualitative evidence from the broadest sources.