
Presentation Transcript


  1. Evidence-Based Public Health: A Course in Chronic Disease Prevention. MODULE 9: Evaluating the Program or Policy. Ross Brownson, Anjali Deshpande, Darcy Scharff. March 2013

  2. Learning Objectives • Understand the basic components of program evaluation. • Describe the differences and unique contributions of quantitative and qualitative evaluation. • Understand the various types of evaluation designs useful in program evaluation. • Understand the concepts of measurement validity and reliability. • Understand some of the advantages and disadvantages of various types of qualitative data. • Understand some of the steps involved in conducting qualitative evaluations. • Describe organizational issues in evaluation.

  3. [Figure: evidence-based decision-making as the intersection of the best available research evidence; population characteristics, needs, values, and preferences; and resources, including practitioner expertise, all within the environmental and organizational context.]

  4. [Figure: possible decisions once evaluation results are in: disseminate widely, retool, or discontinue.]

  5. What is program evaluation? “a process that attempts to determine as systematically and objectively as possible the relevance, effectiveness, and impact of activities in light of their objectives.” (A Dictionary of Epidemiology, 2008) • The best evaluations often “triangulate,” combining quantitative and qualitative methods: looking into a room from two windows • Prominent example: the evaluation of California Proposition 99

  6. Evaluation is basically … a process of measurement & comparison
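To make that idea concrete, here is a minimal sketch of evaluation as measurement plus comparison: measure an indicator before and after a program, then compare. The numbers are hypothetical, not from the course.

```python
# Evaluation as measurement and comparison: measure an indicator before
# and after a program, then compare. All numbers are hypothetical.

baseline_smoking_rate = 0.22   # measured before the program
followup_smoking_rate = 0.18   # measured after the program

absolute_change = followup_smoking_rate - baseline_smoking_rate
relative_change = absolute_change / baseline_smoking_rate

print(f"Absolute change: {absolute_change:+.1%}")   # -4.0 percentage points
print(f"Relative change: {relative_change:+.1%}")   # -18.2%
```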

  7. Why evaluate? • Improve existing programs • Measure effectiveness • Demonstrate accountability • Share effective strategies and lessons learned • Ensure funding and sustainability Evaluation is a tool that can both measure and contribute to the success of your program.

  8. Evaluation versus research. Evaluation: • Controlled by stakeholders • Flexible design • Ongoing • Used to improve programs. Research: • Controlled by investigator • Tightly controlled design • Specific timeframe • Used to further knowledge

  9. Do you have… • A research/evaluation person on staff? • Time and other resources? • Staff to assist? • Necessary skills? From Mattessich, 2003

  10. What are the most significant challenges you face in program evaluation? • Program personnel may be threatened by the evaluation • Need for personnel involvement vs. objectivity • Comprehensive evaluation versus nothing at all • The “10% Rule” as you design and implement programs (a common rule of thumb: devote roughly 10% of program resources to evaluation)

  11. In program planning, when should you begin planning an evaluation?

  12. Logic Model (Analytic Framework) Worksheet: Evidence-Based Public Health
  Program Title: __________ Goal: __________ Long-Term Objective: __________
  For each level of the model (individual, social, government/organizational, environmental), the worksheet asks:
  • Intermediate objective: What are the evidence-based determinants?
  • Activities: Based on an evidence review, what activities will address these determinants? What do you do?
  • Costs: How long will it take? How much will it cost? What other resources are needed?
  Instructions: First discuss your target population. Using data, evidence-based recommendations (the Community Guide or others), your own knowledge, and group discussion, develop a program strategy for controlling diabetes in your community. Define the goal, objectives, activities, and costs, and describe them in this sample logic model.
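One way to keep the worksheet's pieces linked is to hold them in a simple data structure. The sketch below is a hypothetical illustration; the field names and example entries are ours, not the course's.

```python
# A logic-model worksheet held as a nested dictionary, so each level's
# intermediate objective, activities, and costs stay together.
# All field names and example entries are illustrative.

logic_model = {
    "program_title": "Community Diabetes Control",
    "goal": "Reduce the burden of type 2 diabetes in the community",
    "long_term_objective": "Lower diabetes incidence within ten years",
    "levels": {
        "individual": {
            "intermediate_objective": "Increase physical activity",
            "activities": ["Walking groups", "Self-management classes"],
            "costs": "Staff time, class materials",
        },
        "environmental": {
            "intermediate_objective": "Improve access to healthy food",
            "activities": ["Farmers' market vouchers"],
            "costs": "Voucher subsidies",
        },
        # social and government/organizational levels take the same shape
    },
}

for level, plan in logic_model["levels"].items():
    print(f"{level}: {plan['intermediate_objective']}")
```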

  13. Some important questions: What are sources of data? How might program evaluation differ from policy evaluation?

  14. Some notes on policy evaluation • Same principles apply • Lack of control over the intervention (policy) • Time frame may be much shorter • No evaluation is completely “objective,” value-free, or neutral

  15. [Figure: logic model linking program planning (goal, objective, activities) to evaluation (formative/process, impact, outcome).]

  16. [Figure: evaluation framework, adapted from Green et al., 1980. Process evaluation examines the program (instructors? content? methods? time allotments? materials?); impact evaluation examines behavior/cognition (knowledge gain? attitude change? habit change? skill development?); outcome evaluation examines health (mortality? morbidity? disability? quality of life?).]

  17. Types of Evaluation Formative evaluation • Is an element of a program or policy (e.g., materials, messages) feasible, appropriate, and meaningful for the target population? • Often, in the planning stages of a new program • Often, examining contextual factors

  18. Types of Evaluation • Considerations for formative evaluation 1. Sources of data • (pre-) program data 2. Limitations of data (completeness) 3. Time frame 4. Availability & costs • Examples • Attitudes among school officials toward a proposed healthy eating program • Barriers in policies toward healthy eating

  19. Types of Evaluation Process evaluation • “Field of Dreams” evaluation • shorter-term feedback on program implementation, content, methods, participant response, practitioner response • what is working, what is not working

  20. Types of Evaluation Process evaluation (cont.) • a direct extension of action planning in the previous module • uses quantitative or qualitative data • data usually involve counts, not rates or ratios

  21. Unraveling the “Black Box”

  22. Types of Evaluation • Considerations for process evaluation 1. Sources of data • program data 2. Limitations of data (completeness) 3. Time frame 4. Availability & costs • Examples • Satisfaction with a diabetes self-management training • How resources are being allocated

  23. [Figure: California local health department funding by core indicator, comparing 2001/04 with 2004/07 (indicators include cessation, sponsorship, SSD, licensing, bars, and outdoor).]

  24. Types of Evaluation Impact evaluation • long-term or short-term feedback on knowledge, attitudes, beliefs, behaviors • uses quantitative or qualitative data • also called summative evaluation • probably more realistic endpoints for most public health programs and policies

  25. Types of Evaluation • Considerations for impact evaluation 1. Sources of data • surveillance or program data 2. Limitations of data (validity and reliability) 3. Time frame 4. Availability & costs • Example • Smoking rates (tobacco consumption) in California

  26. [Figure: California and U.S.-minus-California adult per capita cigarette pack consumption (packs/person), 1984/85 to 2004/05, annotated with the $0.02, $0.25, and $0.50 tax increases. Source: California State Board of Equalization (packs sold), California Department of Finance (population), U.S. Census, Tax Burden on Tobacco, and USDA. Data are by fiscal year (July 1 to June 30). Prepared by the California Department of Health Services, Tobacco Control Section, February 2006.]
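As a rough sketch of how an indicator like the one in this figure is computed (per capita consumption equals packs sold divided by population), with invented numbers; the real data come from the sources cited above.

```python
# Per capita cigarette pack consumption by fiscal year:
# packs sold divided by population. All figures are invented.

packs_sold = {"1988/89": 2.4e9, "1989/90": 2.1e9}     # packs per fiscal year
population = {"1988/89": 2.10e7, "1989/90": 2.14e7}   # persons

for year in packs_sold:
    per_capita = packs_sold[year] / population[year]
    print(f"{year}: {per_capita:.0f} packs/person")
```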

  27. Obesity maps in the US

  28. Types of Evaluation Outcome evaluation • long-term feedback on health status, morbidity, mortality • uses quantitative data • also called summative evaluation • often used in strategic plans

  29. Types of Evaluation Considerations for outcome evaluation 1. Sources of data • routine surveillance data 2. Limitations of data (validity and reliability) 3. Time frame 4. Availability & costs - often the least expensive to find Example • Geographic dispersion of heart disease

  30. Acute myocardial infarction rates, Missouri, 2010-2011 (age-adjusted)

  31. Quantitative Evaluation

  32. Program Evaluation Designs Reasons for research on causes • to identify risks associated with health-related conditions (Type 1 evidence) Reasons for evaluating programs • to evaluate the effectiveness of public health interventions (Type 2 evidence)

  33. Program Evaluation Designs [Figure: a rate plotted over time, with the point where the intervention was initiated marked.] Can we conclude that the intervention is effective?

  34. Program Evaluation Designs [Figure: another rate plotted over time, with the intervention initiation marked.] Can we conclude that the intervention is effective?

  35. Program Evaluation Designs Experimental • randomized controlled trial • group randomized trial Quasi-experimental • pre-test / post-test with external control group (non-randomized trial) • pre-test / post-test without external control group (before-after or time series) Observational • cohort • case-control • cross-sectional
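To illustrate the quasi-experimental pre-test / post-test design with an external comparison group, here is a hedged sketch: the program effect is estimated as the change in the intervention group minus the change in the comparison group (a difference-in-differences). The data are hypothetical.

```python
# Pre-test / post-test with an external comparison group:
# estimate the effect as (post - pre) in the intervention group
# minus (post - pre) in the comparison group. Hypothetical data.

intervention = {"pre": 0.30, "post": 0.24}   # e.g., smoking prevalence
comparison   = {"pre": 0.31, "post": 0.29}

intervention_change = intervention["post"] - intervention["pre"]
comparison_change = comparison["post"] - comparison["pre"]
effect = intervention_change - comparison_change

print(f"Intervention change: {intervention_change:+.2f}")   # -0.06
print(f"Comparison change:   {comparison_change:+.2f}")     # -0.02
print(f"Estimated effect:    {effect:+.2f}")                # -0.04
```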

  36. Program Evaluation Designs [Figure: generic design flow. Population → eligibility vs. ineligibility → participation vs. no participation → randomization? → intervention group vs. no-intervention group → outcome(s) measured in each group.]
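The randomization step in this flow can be sketched in a few lines; the participant IDs and the even split below are our illustrative assumptions.

```python
# Randomly allocate eligible, consenting participants to the
# intervention or no-intervention group. IDs are hypothetical;
# the fixed seed just makes the illustration reproducible.

import random

random.seed(42)

participants = ["P01", "P02", "P03", "P04", "P05", "P06"]
random.shuffle(participants)

half = len(participants) // 2
intervention_group = participants[:half]
no_intervention_group = participants[half:]

print("Intervention:   ", intervention_group)
print("No intervention:", no_intervention_group)
```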

  37. Pretest-posttest with comparison group [Figure: study design diagram; follow-up July to September 2004.] Reference: Brownson et al. Preventive Medicine 2005;41:837-842

  38. Pretest-posttest with comparison group Reference: Brownson et al. Preventive Medicine 2005;41:837-842

  39. Pre-test / post-test without external control group: employees and visitors at JHMC, measured before and after the new smoke-free policy on • cigarettes smoked per day • cigarette remnant counts per day • nicotine concentrations. Reference: Stillman et al. JAMA 1990;264:1565-1569

  40. % People Smoking. Reference: Stillman et al. JAMA 1990;264:1565-1569

  41. Avg. Daily Cigarette Remnant Counts. Reference: Stillman et al. JAMA 1990;264:1565-1569

  42. Median Nicotine Vapor Concentrations (µg/m³). Reference: Stillman et al. JAMA 1990;264:1565-1569

  43. % Employees Smoking

  44. Pre-test / post-test without external control group: per capita cigarette sales among California residents across three periods • 1980-1982 (before the federal tax) • 1983-1988 (before the state tax) • 1989-1990 (after the state tax). References: Emery et al. AJPH 2001;21(4):278-283; Flewelling et al. AJPH 1992;82:867-869; Siegel et al. AJPH 2000;90:372-379
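A minimal sketch of the comparison behind this design: average per capita sales in each policy period. The values below are invented; the cited papers report the real figures.

```python
# Compare mean per capita cigarette sales across policy periods.
# All sales values are invented for illustration.

sales_by_period = {
    "1980-1982 (before federal tax)": [130, 128, 126],
    "1983-1988 (before state tax)":   [118, 115, 112, 110, 108, 106],
    "1989-1990 (after state tax)":    [96, 90],
}

for period, yearly_sales in sales_by_period.items():
    mean_sales = sum(yearly_sales) / len(yearly_sales)
    print(f"{period}: mean {mean_sales:.1f} packs/person/year")
```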

  45. [Figure: California and U.S.-minus-California adult per capita cigarette pack consumption, 1984/85 to 2004/05, with the $0.02, $0.25, and $0.50 tax increases marked (repeated from slide 26, with the same sources and notes).]

  46. Program Evaluation Designs The quality of evidence from a program evaluation depends on … • the type of program evaluation design • the execution of the program evaluation • the generalizability of program evaluation results

  47. Consider for a ‘generic’ evaluation design: [Figure: population → eligibility vs. ineligibility → participation vs. no participation → randomization? → intervention group vs. control group → outcome(s) measured in each group.]

  48. Concepts of validity and reliability and their importance (evaluation “threats”)

  49. Validity vs. Reliability
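As a concrete, hypothetical illustration of reliability: administer the same measure twice and correlate the scores. A high correlation suggests the measure is reliable (consistent); establishing validity would additionally require comparison against a gold standard.

```python
# Test-retest reliability: give the same instrument twice and
# correlate the two sets of scores. Scores are hypothetical.
# statistics.correlation requires Python 3.10+.

from statistics import correlation

test_scores   = [12, 15, 9, 20, 14, 17]
retest_scores = [13, 14, 10, 19, 15, 18]

r = correlation(test_scores, retest_scores)
print(f"Test-retest reliability (Pearson r): {r:.2f}")
```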
