  1. Understanding Economic Evaluations 101: The Challenge of Demonstrating Value. Graham Clyne, Executive Director, Peel Children and Youth Initiative. Crime Prevention Conference, Mississauga, Friday November 9, 2012. CANADIAN INSTITUTE FOR ECONOMIC EVALUATION: "Making the Case for Value: the Tools and Methods of Economic Evaluation" CIEE

  2. Your presenter • Special interest in children and prevention • Researcher, administrator, funder, community developer, volunteer, consultant – diverse perspectives • Background in public policy • Prevention Dividend Project 2000 • Canadian Institute for Economic Evaluation 2001 CIEE

  3. Background • Founding Director: Canadian Institute for Economic Evaluation • Associate Researcher: System-Linked Research Unit, McMaster University • Organizational Consultant: Public and Non Profit Sectors • Research Director: Prevention Dividend Project • Currently, Executive Director: Peel Children & Youth Initiative CIEE

  4. Today’s presentation • Prevention Dividend Project /CIEE • Key Principles and Concepts Involved • Common methods of Economic Evaluation • Costing Studies: Purpose and Method • Measuring System Service Utilization • Communication Strategies • Questions and Discussion CIEE

  5. The Prevention Dividend Project: Why do Economic Evaluation? Dr. Paul Steinhauer: "If prevention really works, why do we seem to struggle in the battle for public support?" GC: "Because we – the people in our business – haven't really demonstrated that prevention is a better, more efficient use of scarce resources. We get by implying that it works and telling stories… but the burden of proof is going up." CIEE

  6. Prevention Dividend Project: The Burden of Proof Goes Up • Exponential growth in programs and services; many competing societal "goods" • Narrow organizational mandates = an evaluation focus on very few outcomes • "Outcomes" described in the language of professionals and service providers (e.g., improved well-being) CIEE

  7. Prevention Dividend Project: Burden of Proof… • Weak evaluative structures and processes – inverse relationship between quality of evaluation and program assessment • Confusion of process and outputs – with end-state and/or intermediary outcomes • Singular focus on effectiveness – not efficiency • Evaluations tend to rationalize the ongoing use of resources CIEE

  8. Prevention Dividend Project: Societal Motivations • General growing public skepticism re: effecting change/risk factors. Phillip Donne, Kelloggs Canada: "We spend $12 million annually for new products expecting that only a few will work, but in your business everything stays on the shelf" • United Way annual review process – Units of Service Assessments… is this really cost-effective? CIEE

  9. Prevention Dividend Project: Motivations… • Social health and community services always described as "soft" services… despite measurable economic impacts • Changes in donor preferences: measurable impact / direct linkage to outcomes expressed in terms of value • Requests for "Return on Investment": impacting all sectors of public and charitable work CIEE

  10. Initial Negative Reactions… "Prevention actually costs society money – re: smoking kills – saves money!?!" "Prevention is about improving the quality of our lives – not about saving money" "Do this in another sector – we're not fat, comparatively speaking" "We don't know enough about outcomes – much less value" CIEE

  11. Economic Evaluations: Common Myths • E.E. is a difficult or impossible methodology for health promotion and community-based services • E.E. is an inappropriate tool for measuring human care services • E.E.'s primary focus is on budget reductions and generating savings CIEE

  12. Economic Evaluations: Actual Barriers • Practitioners' perspective/justification paradigm • Comparative nature of economic evaluation – "It's all good" • Motivated by external threat or funding and resource challenges • Poor outcome measures and causality issues • Training, skills, tools, resources CIEE

  13. Economic Evaluations: Applications • Provides new and better information for decision making – “at what cost” • Speaks to effectiveness – and efficiency; where and when “returns” will diminish • As a Management or funding tool – choosing between competing options • Speaks in a language that those outside the business can understand CIEE

  14. Economic Evaluations: Applications • As a “tipping point” – a necessary but not sufficient element • Allows for creative and useful cross-sectoral comparisons • Advocating for support without a shared “social justice” perspective • Adds a greater policy relevance to “soft” services and programs CIEE

  15. Economic Evaluations: Applications • Shows the interconnectedness of programs and services • Captures wider range of outcomes and benefits (societal perspective) • “Cost of” (burden of illness) studies: a community motivator and a different approach to collaborative action CIEE

  16. Economic Evaluations: Applications • Creating new allies; those who would benefit from change • Building and sustaining funder/donor support • Studies and messages can be tailored to suit the need • “Community Benefit” – better described in monetary terms with broad impacts CIEE

  17. Economic Evaluations: Limitations • Ideological opposition may prove intractable • Policy outcomes – blend of art, science, politics, timing, etc. • Resonance of "viewpoint" to intended audience • Some outcomes can't be valued with credibility CIEE

  18. Economic Evaluations: Limitations • Professional/internal resistance • Re-stigmatizing populations as “expensive” • Dilution of language/terminology • Logistical/resource challenges • Imposed methods – may be very poorly suited to your work CIEE

  19. Economic Evaluations: Looking Ahead • Impacting a wide range of sectors • Successful illustrations, creative applications growing • Shift to S.R.O.I – extrapolating “benefits” • Largest challenges – expert support and fiscal resources • Limited training opportunities CIEE

  20. Key concepts, principles and methods of economic evaluation CIEE

  21. What do we mean by economics? ECONOMICS TOPICS: • Things to do with the economy (e.g., employment, interest rates, etc.) • Many disciplines contribute to understanding the economy (e.g., sociology, finance, psychology, etc.) ECONOMICS DISCIPLINE: • The tools and methods used by economists to analyse problems • Not confined to contributing to understanding economics topics CIEE

  22. Fundamental Concepts SCARCITY OF RESOURCES: • Possible uses of resources exceed their availability CHOICES: • Choices are required about how to use available resources – often implicit, but they cannot be avoided OPPORTUNITY COST: • Choosing to use resources one way means choosing not to use them in other ways Question: Do these "concepts" apply in your work? CIEE

  23. Economics and Money Canada's Health Care "Solution": increase expenditures by $15 billion • Does more expenditure lead to more resources? • Do more resources lead to more care? • Does more care lead to more health? • Does more health lead to more well-being? MONEY: A MEANS OF EXCHANGE, NOT A RESOURCE FOR PRODUCTION CIEE

  24. Other Critical Concepts • Equity – in your indicators • Discounting – time preferences • Sensitivity analysis CIEE

  25. Discounting and Time Preferences • Discounting reflects the loss in economic value that occurs when there is a delay in incurring a cost or realizing a benefit (value today vs. value tomorrow) • Not to be confused with inflation – all calculations should be in common (current-year) dollars • All expenditures and benefits must be included and discounted at the chosen rate over the number of years in which those costs and benefits occur • The discount rate is pivotal, especially in prevention/early intervention CIEE
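A minimal sketch of the present-value calculation this slide describes, assuming a constant annual discount rate; the function name, rates, and dollar figures below are illustrative, not taken from the presentation.

```python
def present_value(amount, rate, years):
    """Discount a future cost or benefit back to today's dollars.

    amount: value in constant (current-year) dollars, so inflation is
            already excluded, as the slide notes.
    rate:   annual discount rate (e.g., 0.03 for 3%).
    years:  how far in the future the cost or benefit occurs.
    """
    return amount / (1 + rate) ** years


# Illustrative only: a $100,000 benefit realized 20 years after a
# prevention program shrinks sharply in today's terms.
for rate in (0.03, 0.05, 0.08):
    print(f"{rate:.0%}: {present_value(100_000, rate, 20):,.0f}")
# 3% -> 55,368   5% -> 37,689   8% -> 21,455
```

Because prevention benefits typically arrive years or decades after the costs are incurred, the choice of rate alone can swing the apparent value this much, which is why the slide calls it pivotal.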

  26. Discount Table CIEE
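For illustration, discount factors 1/(1+r)^t at a few rates and time horizons a prevention study might use (these particular rates and values are computed for this sketch, not drawn from the original table):

  Years     3%      5%      8%
    1     0.971   0.952   0.926
    5     0.863   0.784   0.681
   10     0.744   0.614   0.463
   20     0.554   0.377   0.215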

  27. Sensitivity Analysis • Every evaluation will contain some degree of uncertainty, imprecision, or methodological controversy • Sensitivity analysis is performed to test whether variations in the assumptions or estimates underlying an analysis produce significant alterations in the results • Deterministic (i.e., given as point estimates) vs. stochastic (i.e., given as distributions) data • Know the implications of your sensitivity analysis CIEE
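A minimal sketch of a one-way deterministic sensitivity analysis, holding everything fixed except the discount rate; the program figures below (annual benefit, time horizon, up-front cost) are hypothetical.

```python
def net_present_benefit(annual_benefit, years, cost_today, rate):
    """Discounted sum of annual benefits minus the up-front program cost."""
    pv_benefits = sum(annual_benefit / (1 + rate) ** t
                      for t in range(1, years + 1))
    return pv_benefits - cost_today


# One-way sensitivity analysis: vary only the discount rate and watch
# whether the conclusion (positive vs. negative net benefit) holds.
for rate in (0.03, 0.05, 0.08, 0.10):
    npb = net_present_benefit(annual_benefit=25_000, years=10,
                              cost_today=180_000, rate=rate)
    print(f"rate {rate:.0%}: net present benefit {npb:,.0f}")
```

In this made-up case the sign of the result flips somewhere between 5% and 8%, exactly the kind of finding that would prompt a stochastic analysis, where uncertain inputs are given as distributions rather than point estimates.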

  28. Viewpoint and Cost Identification • Whose costs and whose consequences are included in the analysis? • Some possible viewpoints: • Agency • Government • Consumer/client • Regional, State • Board of Directors/market share • Societal (best and most inclusive) • Broader viewpoints are useful to capture any cost shifting – a common practice CIEE

  29. What is Economic Evaluation? • The comparative analysis of alternative courses of action in terms of both their costs and consequences • Involves both the concept of technical efficiency and allocative efficiency • Differs from SROI – a calculation or roll-up of (attributable) benefits CIEE

  30. Basic Methods of Economic Evaluations • Cost Effectiveness Analysis (CEA) • Cost Utility Analysis (CUA) • Cost-Benefit Analysis (CBA) • Costing Studies – Burden of Illness • Service System Utilization CIEE

  31. Cost Effectiveness Analysis • Compares different interventions with same sort of unambiguous outcome • Expressed in common units of outcome – lives saved, employment attained • Determines which intervention most efficiently produces the desired outcome by comparing costs per unit of outcome (search for dominance) CIEE

  32. Cost Effectiveness Analysis • Demonstrates correlation between amount spent and results achieved (generally more cost = improved effectiveness) • Used to capture marginal or incremental rates of effectiveness (i.e., generally diminishing marginal returns) • Ratios demonstrate the additional cost of one intervention per unit of outcome achieved – a more accurate reflection of alternative costs/benefits • Associated with the rarely used Cost Minimization Analysis (CMA) (the search for the lowest-cost alternative where outcomes are identical) CIEE
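A minimal sketch of the incremental ratio described above, comparing a hypothetical new intervention against current practice; all costs and outcome counts are invented for illustration.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    outcome (with QALYs as the effect measure, the same ratio becomes the
    cost-per-QALY figure used in cost utility analysis)."""
    return (cost_new - cost_old) / (effect_new - effect_old)


# Hypothetical comparison on a single, unambiguous outcome ("avoided episodes").
ratio = icer(cost_new=500_000, effect_new=120,   # proposed intervention
             cost_old=320_000, effect_old=75)    # current practice
print(ratio)  # 4000.0 -> each additional avoided episode costs $4,000
```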

  33. Choosing a Measure of Effectiveness • Program’s effects are measured in natural units – e.g. avoided episodes • The Chosen measure of effectiveness depends on the objective(s) of the programs being evaluated • A CEA requires one unambiguous objective of the intervention and therefore a clear dimension along which effectiveness can be assessed CIEE

  34. Cost Utility Analysis • Commonly used in health care – to compare complex / multiple impacts from various interventions (uncertainty) • Creates a single - subjective - measure of utility or preference • Subsumes multiple effects into a single – most important – measure of quality (e.g., QALYs, HYEs) • Most efficient option = best incremental ratio of costs/outcomes – or cost per QALY gained CIEE

  35. Cost Benefit Analysis • Generally broadest, most inclusive assessment of costs and consequences using the societal perspective • All inputs valued in monetary terms – allows differing interventions with differing outcomes to be compared • Describes and compares the net gain (assuming benefits surpass costs) of various options CIEE

  36. Cost Benefit Analysis • CBA requires program consequences to be valued in the same units as the costs (i.e., typically monetary units), thus enabling a direct comparison of the program's incremental costs with its incremental consequences in commensurate units of measurement. • CBA is not "cost analysis" or "cost saving" analysis. CIEE
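A minimal sketch of the comparison CBA makes once every cost and consequence has been expressed in present-value dollars; the line items and amounts below are hypothetical.

```python
def net_benefit(pv_benefits, pv_costs):
    """Net gain of an option: present-value benefits minus present-value costs."""
    return sum(pv_benefits) - sum(pv_costs)


# Hypothetical societal-perspective tally for one program option.
benefits = [240_000,   # reduced remedial service use
            90_000,    # caregiver productivity gains
            35_000]    # avoided justice-system costs
costs = [210_000,      # program delivery
         40_000]       # participant time and out-of-pocket costs

print(net_benefit(benefits, costs))          # 115000 -> positive net gain
print(round(sum(benefits) / sum(costs), 2))  # 1.46 -> benefit-cost ratio
```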

  37. Costing Studies / Burden of Illness • Commonly used in health care and disease prevention messages – cost of illness (COI) • Used to value morbidity outcomes in CBA • Aggregate assumed costs of treatment, lost productivity, out-of-pocket expenses (cost shifting), cash transfers – some include suffering and caregiver costs. Administrative costs usually excluded due to differing cost structures (US and Canada) • Not a full economic evaluation (no comparison with next-best alternative uses of resources) • Problems with assumptions re: lost productivity – not a measure of preferences to avoid the outcome CIEE

  38. Measuring Service Utilization: System-Linked Philosophy and Approach… • Assumes multiple / overlapping conditions • Expects differential effectiveness – "do no harm" • Takes a comprehensive view of economic impacts • Uses a straightforward methodology – shopping basket approach… add it up! Effectively demonstrates the substantial cost of "doing nothing" CIEE
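A minimal sketch of the "shopping basket" approach: price each service a participant used at a unit cost and add it up. The service names, unit costs, and utilization counts below are hypothetical placeholders.

```python
# Hypothetical unit costs per visit/contact, in current-year dollars.
UNIT_COSTS = {
    "family_physician": 45,
    "specialist": 160,
    "emergency_dept": 320,
    "social_worker": 95,
    "probation_officer": 110,
}


def basket_cost(utilization):
    """Total service-system cost for one participant.

    utilization: dict mapping service name -> number of units used
    during the follow-up period.
    """
    return sum(UNIT_COSTS[service] * units
               for service, units in utilization.items())


# Two illustrative participants over a follow-up year.
comprehensive = {"family_physician": 4, "specialist": 1, "social_worker": 2}
self_directed = {"family_physician": 6, "specialist": 3, "emergency_dept": 2,
                 "social_worker": 5, "probation_officer": 1}
print(basket_cost(comprehensive))  # 530
print(basket_cost(self_directed))  # 1975
```

Summing these per-participant baskets across the comparison groups is what produces the utilization differences reported in the next slides.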

  39. Comprehensive vs. Self-Directed Care: Measuring Service Utilization • Randomized large sample of single parents / children on welfare • Comprehensive care includes health, recreation, daycare, public health, employment training, etc. • Self-directed care = fending for themselves • Blend / mix of single services • Recorded all program costs; measured against reduced use of inappropriate remedial services • Findings: comprehensive care pays for itself in a single year CIEE

  40. Total Per Child Expenditures for 214 Children’s Direct Use of Health and Social Services 2 Years After Proactive, Subsidized Recreation CIEE

  41. Recreation/Childcare PAYS for itself through reductions in service use • ½ the use of specialists • ½ use of C.A.S. services • ¼ use of occupational therapists • 1/3 use of physiotherapists • ½ use of psychologists • 1/10 use of social workers • 1/10 use of probation officers • ½ use of chiropractors • ½ use of 911 services CIEE

  42. Communications Strategies for Economic Evaluations CIEE

  43. Communicating Results: Intentions… • Creating, building and sustaining support • Demonstrating the interconnectedness of services, decision-making and cost shifting • Communicating impact in a new way; appeal to different audiences in the language of business • A better and more comprehensive description of "Community Benefit" CIEE

  44. Communicating Results: Opportunities… • Meaningful data for communities and other service organizations; collaboration possibilities • "Hardens" softer services / program relevance during difficult economic circumstances – the business case • Framing the economic question focuses programming on results/efficiency questions, e.g., "maximizing benefits" CIEE

  45. Communicating Results: Content… • Success starts with the quality of content, data and sources used • Use transparent and reasonable assumptions • Be tight on effectiveness, causality, and attribution • Ensure relevance of viewpoint to audience • Show it all: table your results if appropriate • Know your underlying equity statements. Most of all: don't overstate your results! CIEE

  46. Communications: Limitations • Professional and internal resistance to sharing information • Re-stigmatizing populations (sub-sets, or those with particular risk characteristics) as expensive • Ongoing dilution of the language and terminology of "investment" • Imposition of inappropriate external frameworks • Expert and resource issues CIEE

  47. Communications: Internal Messages • EE doesn’t change our Mission – but helps to demonstrate and clarify its impact • EE can show there are real costs associated with “doing nothing” • EE demonstrates how we impact other organizations / systems • EE can create new allies – including some non-traditional partners CIEE

  48. Communications: Internal Messages • EE is a tool: a method that helps us put resources where we have the greatest impact – and frees up resources for reallocation • Cost-effectiveness / cost-benefit analysis will be one – but never the only – consideration of what we do • Some things we do will always be driven by our Mission CIEE

  49. Some closing thoughts • Economic Evaluations: all about the relative relationship between costs and benefits • Effectiveness, causality, and measurable outcomes – a priori requirements for any EE • Portability of ratios from research is problematic (population, duration/intensity, measures) • SROI – understand the method, its use and limitations • Economic Evaluations: a powerful tool if we use it for good instead of evil CIEE

  50. Thanks for listening… your turn. QUESTIONS….. COMMENTS…. DISCUSSION….. CIEE
