1 / 33

Using the CDC Evaluation F’work to Avoid “Minefields” on the Road to Good Evaluation

By: Thomas J. Chapel, MA, MBA, Office of the Director, Office of Program Planning and Evaluation, Centers for Disease Control and Prevention. Presented to: 2002 National Asthma Conference, October 24, 2002.


Presentation Transcript


  1. Using the CDC Evaluation F’work to Avoid “Minefields” on the Road to Good Evaluation. By: Thomas J. Chapel, MA, MBA, Office of the Director, Office of Program Planning and Evaluation, Centers for Disease Control and Prevention. Presented to: 2002 National Asthma Conference, October 24, 2002.

  2. Why We Evaluate: “...The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason, that there was no punishment more severe than eternally futile labor....” – The Myth of Sisyphus, as quoted in the MMWR Framework for Program Evaluation in Public Health

  3. Defining Evaluation
  • Evaluation is the systematic investigation of the merit, worth, or significance of an “object” (Michael Scriven)
  • Program is any organized public health action/activity

  4. Research vs. Program Evaluation: a continuum, not a dichotomy, but at the far ends the two may differ in:
  • Framework and steps
  • Decision making
  • Standards
  • Key questions
  • Design
  • Data collection sources and measures
  • Analysis timing and scope
  • Role of values in making judgments
  • Centrality of attribution as conclusion
  • Audiences for dissemination of results

  5. The Continuum
  • Efficacy… does my effort work in ideal circumstances?
  • Effectiveness… does my effort work in real-world settings, and does it work the same way across settings?
  • Implementation fidelity… is my (efficacious and effective) effort being implemented as intended?

  6. Today’s Focus: Top Minefields on the Road to Conducting Good Evaluation!

  7. Minefield #8: Not linking planning and evaluation…

  8. Minefield #7: Evaluating only what you can measure…

  9. You Get What You Measure… “…In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world’s heaviest furniture…” (New York Times, 3/4/99)

  10. Minefield #6: Thinking evaluatively only at the end…

  11. When to Evaluate… Good program evaluation shifts our focus from “Did it (my effort) work?” to “Is it (my effort) working?”

  12. Minefield #5: Not asking “who (else) cares?”…

  13. Minefield #4: Neglecting intermediate outcomes…

  14. Forgetting Intermediate Outcomes

  15. Minefield #3: Neglecting process evaluation…

  16. Minefield #2: Confusing attribution and contribution…

  17. “Networked” Interventions

  18. Minefield #1: Using more “sticks” than “carrots”…

  19. Framework for Program Evaluation

  20. Standards for Effective Evaluation: not HOW TO do an evaluation, but help in directing choices among options. At each step, the standards ask which choice(s):
  • Utility (7): best serve the information needs of intended users
  • Feasibility (3): are most realistic, prudent, diplomatic, and frugal given resources
  • Propriety (8): best meet law, ethics, and due regard for the welfare of those involved and affected
  • Accuracy (12): best reveal and convey technically accurate information
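To make the standards concrete, here is a minimal sketch (an illustration, not part of the original deck) that treats the four standards as a screening checklist applied to a candidate evaluation choice at any step of the framework. The standard descriptions paraphrase the slide; the `screen_choice` function, the example choice, and the pass/fail scheme are hypothetical.

```python
# Hypothetical sketch: the four evaluation standards as a screening
# checklist for a candidate evaluation choice. Standard names and
# descriptions paraphrase the slide; everything else is illustrative.

STANDARDS = {
    "Utility":     "best serves the information needs of intended users",
    "Feasibility": "is realistic, prudent, diplomatic, and frugal given resources",
    "Propriety":   "meets law, ethics, and due regard for those involved and affected",
    "Accuracy":    "reveals and conveys technically accurate information",
}

def screen_choice(choice, answers):
    """Flag which standards a candidate choice still fails to meet."""
    failed = [name for name in STANDARDS if not answers.get(name, False)]
    if failed:
        print(f"'{choice}' needs rework on: {', '.join(failed)}")
    else:
        print(f"'{choice}' passes all four standards")
    return not failed

# Example: a proposed data-collection choice that is useful, doable,
# and ethical, but whose technical accuracy is still in question.
screen_choice(
    "telephone survey of program participants",
    {"Utility": True, "Feasibility": True, "Propriety": True, "Accuracy": False},
)
```

In practice the standards guide judgment rather than a literal pass/fail test; the point of the sketch is only that all four are consulted at every step.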

  21. Broadening Our Thinking About Evaluation
  • What to evaluate
  • When to evaluate
  • Who should be involved in evaluation
  • How to evaluate

  22. Who Should Evaluate?

  23. Why Involve Stakeholders
  • Smoke out disagreements in:
    • Definition of the problem
    • Activities and priorities of the program
    • Outcomes that equate to success
    • What constitutes “proof” of success
  • Get their help with:
    • Credibility of findings
    • Access to key players
    • Follow-up
    • Dissemination of results

  24. Using Logic Models for Evaluation
  • Clarity on:
    • What the activities are
    • What the intended effects are
    • The sequence/order of intended effects
    • Which activities are to produce which effects
  • Consensus with stakeholders on all of the above
  • Focus for the evaluation design
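As a minimal sketch of the idea on this slide (an assumed structure, not content from the presentation), a logic model can be represented as a mapping from each activity to the ordered chain of effects it is intended to produce, which is exactly the information an evaluation design needs. The asthma activity and outcome names below are hypothetical.

```python
# Hypothetical sketch of a logic model as a small data structure:
# each activity is linked to the ordered chain of effects it is
# intended to produce (short-term first, long-term last).

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    links: dict = field(default_factory=dict)  # activity -> ordered effects

    def add(self, activity, effects):
        self.links[activity] = list(effects)

    def effects_of(self, activity):
        """The sequence of intended effects for one activity."""
        return self.links.get(activity, [])

model = LogicModel()
model.add(
    "asthma self-management classes",
    [
        "improved inhaler technique",              # short-term
        "fewer symptom days",                      # intermediate
        "reduced ER visits and hospitalizations",  # long-term
    ],
)

# An evaluation can now be focused on specific links in the chain,
# e.g., the intermediate outcome rather than only the end point.
print(model.effects_of("asthma self-management classes"))
```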

  25. Some Factors That Influence Choice of Evaluation Focus
  • Users and uses: who wants the information, and what are they interested in?
  • Accountability to (other) stakeholders: for what effects are key stakeholders expecting to see results?
  • Resources: time, money, expertise
  • Stage of development: how long has the program been in existence?
  • “Ripple effect”: how far out would an intervention of this intensity reasonably be expected to have an effect?

  26. Setting Evaluation Focus: Some “Process” Issues
  • What are the likely key challenges to “implementation fidelity”?
  • “Dropped baton” issues are key:
    • A partner failed to do their part
    • A client/family/patient failed to fulfill their referral
  • Other common challenges:
    • Inadequate dosage
    • Poor access
    • Failure to retain participants
    • Wrong match of staff and participant

  27. Evidence Gathering: Choosing Design
  • What intervention was actually delivered?
  • Were impacts and outcomes achieved?
  • Was the intervention responsible for the impacts and outcomes?

  28. Justifying Claims About Intervention Effectiveness
  • Performance vs. a comparison/control group
  • Time sequence
  • Plausible mechanisms (or pathways toward change)
  • Accounting for alternative explanations
  • Similar effects observed in similar contexts

  29. Choosing Data Collection Methods, a function of:
  • Time
  • Cost
  • Sensitivity of the issue
  • “Hawthorne effect”
  • Ethics
  • Validity
  • Reliability

  30. Maximizing Use of Results: Key Questions
  • Who is the audience?
  • What will be of greatest importance to them?
  • How will they use the information provided?
  • How much time will they be willing to spend reading and assimilating the material?
  • What type of vocabulary will express the information most clearly?

  31. Some CDC Asthma Examples
  • Comprehensive School-Based Asthma Project
  • Controlling Asthma in American Cities (CAAP) Project

  32. Helpful Publications @ www.cdc.gov/eval

  33. Community Tool Box: http://ctb.ku.edu/
