
Program Evaluation Webinar Series Part 1:
"Top Roadblocks on the Path to Good Evaluation – And How to Avoid Them"
Presented by: Thomas J. Chapel, MA, MBA, Chief Performance Officer (Acting)


Presentation Transcript


  1. Program Evaluation Webinar Series Part 1: “Top Roadblocks on the Path to Good Evaluation– And How to Avoid Them” Presented by: Tom Chapel

  2. Top Roadblocks on the Path to Good Evaluation– And How to Avoid Them Thomas J. Chapel, MA, MBA Chief Performance Officer (Acting) CDC/Office of the Director/OCOO Presented November 20, 2008 Tchapel@cdc.gov 404-498-6073

  3. Objectives • Program evaluation and the typical "roadblocks" to doing good evaluation. • CDC's Evaluation Framework as a way to surmount those roadblocks.

  4. Key Points • In today's session we will discuss: • What is important about CDC's framework? • Why does it lead to better use of findings? • STEPS: Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Ensure use and share lessons learned. • STANDARDS: Utility; Feasibility; Propriety; Accuracy.

  5. Why We Evaluate… • “... The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. • They thought, with some reason…

  6. Why We Evaluate… …there was no punishment more severe than eternally futile labor....” The Myth of Sisyphus

  7. The Problem • The stuff I do doesn't make a difference! • Why don't things get better?!

  8. Implementing Program Evaluation • How do I motivate? • Not this… This… • What gets in the way?

  9. Today’s Focus Top Roadblocks on the Road to Good Evaluation

  10. Defining Evaluation • Evaluation is the systematic investigation of the merit, worth, or significance of any “object”. Michael Scriven

  11. Use the Findings! • If the findings don’t get used… the program will not improve.

  12. What is "Evaluation?" • Evaluation is not… a specific set of tools or techniques. • Evaluation is… an orientation to your program; the idea of continuous reflection.

  13. Defining Evaluation • Evaluation is the systematic investigation of the merit, worth, or significance of any “object”. Michael Scriven

  14. What is a “Program”? • Not only: • Big training programs • Community interventions • But also: • Recommendations and guidelines • Surveillance systems • In other words, a program is anything with an intended outcome.

  15. Roadblock #6 Not understanding where evaluation “fits in” …

  16. The “Silo” Model

  17. The Integrated or “CQI” Model • To achieve “continuous quality improvement” planners, performance measurers, and evaluators must communicate with each other.

  18. The Customer is the Key • Program evaluation must: • See planning, performance measurement, and evaluation as being integrated. • Start with the idea of having a customer or an intended user of findings. • Direct the evaluation with the customer in mind.

  19. Roadblock #5 Making the “perfect” the enemy of the “good”.

  20. Roadblock #5 What if you said, “To be cardiovascularly fit, you must run a marathon.”?

  21. Thanks, but… • That's not me. I don't have that expertise. I don't have the money to do that. I don't have those skills.

  22. Do What You Can! • There’s always an evaluation worth doing. • The biggest mistake is doing nothing because you can only do a little. • Even a little bit is going to yield some benefit.

  23. Roadblock #4 • Evaluating only what you can "measure"… …because those are the things we can measure with validity, reliability, and accuracy.

  24. Upstream Questions • How many brochures? How many trainees? • How many people showed up? Did we get a lot of product out there?

  25. Downstream Questions • What have you done for me lately? • How has it mattered? • What have you done for public health?

  26. Measuring the Right Thing… “…Sometimes, what counts can’t be counted. And what can be counted doesn’t count….” Albert Einstein

  27. Evaluation Starts By Saying… • What are the important things that need to be measured? • Can I measure them with enough rigor to meet the needs of this situation this time? • Sometimes the answer is “NO!”

  28. You Get What You Measure… “…In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world’s heaviest furniture…” (New York Times, 3/4/99)

  29. Roadblock #3: Neglecting Intermediate Outcomes… • Nothing has advanced the evaluation cause in public health more than preaching this idea of intermediate outcomes.

  30. Intermediate Outcomes Contribute to Downstream Success • How is it that my program will make a contribution to that downstream outcome? We call these "intermediate outcomes".

  31. What is the Program Logic? • What needs to happen to achieve the desired outcome? • What is the "program logic"? [Diagram: My action → Desired outcome]

  32. Don’t just ask: Did it work? What are the markers that tell me I’m on the right road? How many tomatoes did I get?

  33. Ask: Is it working? What are the markers that tell me I’m on the right road? • Are planting, watering, and weeding taking place? • Have the blossoms “set”? • Are there nematodes on the plants?

  34. Research Model • [Diagram: Develop Theory → Program Activities → Measure Outcome]

  35. Research Model • [Diagram: Develop Theory → Program Activities → Measure Outcome] • If I achieved the outcome – great! • If I didn't achieve the outcome – why?

  36. Evaluation Unpacks the "Black Box" • [Diagram: My action → (black box) → Desired outcome]

  37. The World's Best Children's Soccer Program

  38. But We Never Won a Game

  39. Focus on Intermediate Outcomes • Can we: • pass the ball? • spread out? • spend more time on the opponent’s side of the field?

  40. Forgetting Intermediate Outcomes • [Cartoon credit: ScienceCartoonsPlus.com]

  41. What's In the Box? • My program: • training • technical assistance • funding • partnerships • Desired outcome: • less morbidity • lower mortality

  42. What's In the Box? • My program: • training • technical assistance • funding • partnerships • Desired outcome: • less morbidity • lower mortality • Intermediate outcomes fill the box in between.

  43. The Power of Evaluation • Establishing intermediate outcomes allows you to determine if you are making progress in the right direction.

  44. Diabetes Intermediate Outcomes

  45. Why Intermediate Outcomes? • I’m making progress in the right direction. • I am contributing to the downstream outcome.

  46. Identifying Intermediate Outcomes • What is the ultimate outcome I’m seeking? • Who (besides me) needs to take action to achieve it? • What action do they need to take? These are the intermediate outcomes that populate the “black box” or the “program logic”.

  47. Roadblock #2 Confusing attribution and contribution… “I can’t make the case that my program was responsible for that change.”

  48. The Role of Public Health • Public health is not… a direct deliverer of services. • Public health is… a mobilizer and convener. Based on: The Future of Public Health, Institute of Medicine, 1988.

  49. "Networked" Interventions • [Diagram: outputs from multiple programs across Agencies A–D feed short-term and long-term outcomes, converging on a single system outcome.]

  50. Attribution • [Same networked-interventions diagram as slide 49: with many agencies and programs contributing, no single program can claim sole credit for the system outcome.]
