
What is Evaluation and Why we do it?

Mariana Garrettson, MPH; Shrikant I. Bangdiwala, PhD. UNC - Injury Prevention Research Center, University of North Carolina – Chapel Hill; Affiliate Safe Community Support Centre for Evaluation.


Presentation Transcript


  1. What is Evaluation and Why we do it? Mariana Garrettson, MPH; Shrikant I. Bangdiwala, PhD. UNC - Injury Prevention Research Center, University of North Carolina – Chapel Hill; Affiliate Safe Community Support Centre for Evaluation. NSC Chicago 2010

  2. Agenda • Evaluation: how we think about it; Data: what we know • What you know about evaluation; what you are already doing around evaluation (Discussion) • What we DON’T know about your coalitions and evaluation • UNC’s goals as the Affiliate Support Centre for Evaluation • Time allocations: 45 min, 15 min, 45 min, 05 min NSC Chicago 2010

  3. Main purposes of evaluation • Evaluation helps determine: • How well a program/policy works relative to its goals & objectives • Why a program/policy did or didn’t work, relative to planned process • How to restructure a program/policy to make it work, or work better • Whether to change funding for a program NSC Chicago 2010

  4. Main purposes of evaluation …is all of this worth the effort… NSC Chicago 2010

  5. Evaluation and Safe Communities Why should a Safe Community prioritize evaluation? NSC Chicago 2010

  6. Safe Communities – 6 Indicators • An infrastructure based on partnership and collaborations, governed by a cross-sectional group that is responsible for safety promotion in the community; • Long-term, sustainable programs covering both genders and all ages, environments, and situations; • Programs that target high-risk groups and environments, and programs that promote safety for vulnerable groups; • Programs that document the frequency and causes of injuries; • Evaluation measures to assess their programs, processes, and the effects of change; • Ongoing participation in national and international Safe Communities networks. NSC Chicago 2010

  7. Why should Safe Communities do evaluation? • Safe Communities requires constant improvement • Good resource allocation • Justify resource allocation • Share successes, avoid repeating mistakes • Continuous quality improvement • Funders require it • Recertification NSC Chicago 2010

  8. Evaluation needs to begin in and be part of the planning process… NSC Chicago 2010

  9. Why did Alice get lost? Alice: Which way should we go? Cat: That depends on where you are going. Alice: I don’t know where I’m going. Cat: Then it doesn’t matter which way you go! Lewis Carroll (1865), Alice’s Adventures in Wonderland NSC Chicago 2010

  10. To warm up your thinking…imagine you are organizing a vacation for your family... • What will this vacation look like? • How will you determine that it was a good vacation? NSC Chicago 2010

  11. Does your family need a vacation? • What information do you use to judge this? • Do you all need the same kind of vacation? NSC Chicago 2010

  12. Planning exercise…. • What are your vacation goals? (e.g. relaxing, seeing relatives, trying new foods) • What kind of vacation are you going to plan? • where to go? (e.g. Omaha, Paris, Grand Canyon) • how to travel? (e.g. car, plane, bus, bike) NSC Chicago 2010

  13. Evaluating your vacation…. • How will you decide if it will be a success? • What criteria will you use? For example... • in-laws got along • successfully ordered a meal in French • came back relaxed • kids didn’t fight • no rattlesnake bites • great photos NSC Chicago 2010

  14. How will you determine, during the trip, that things are going as you planned? • made all travel connections on time • stayed within credit card limit • took photos of major landmarks • no lost luggage • visited all the places you planned NSC Chicago 2010

  15. Planning process... whether for a vacation or a program… • Deciding what needs are to be addressed • Developing goals • Identifying one or more strategies for achieving the goals • Assessing progress in implementing the strategy as it was intended • Figuring out how it worked -- did it accomplish the goals? • Telling others about it NSC Chicago 2010

  16. Types of Evaluation • Formative Evaluation (Program Planning Phase): How can the program activities be improved before implementation? • Process Evaluation (Program Implementation Phase): How is/was the program (being) implemented? • Impact/Outcome Evaluation (Post-Program Phase): Did the program succeed in achieving the intended impact or outcome? NSC Chicago 2010

  17. Cycle of Program Planning and Evaluation: Identify problem & population (define target audience) → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (formative evaluation) → Implement → Evaluate (process, impact, outcome) → Disseminate NSC Chicago 2010

  18. Cycle of Program Planning and Evaluation (diagram repeated; next step: Identify population & problem) NSC Chicago 2010

  19. Identify population & problem • Surveillance data • Other needs assessment strategies • key informant interviews • focus groups • surveys • evaluations of past programs • literature • consultation with peers • other info… NSC Chicago 2010

  20. Cycle of Program Planning and Evaluation (diagram repeated; next step: Define target audience) NSC Chicago 2010

  21. Define Target Audience • To whom is the program directed? • Whose injuries need to be reduced? • Who is the target of the program? • at risk persons • care givers (e.g. parents) • general public • media • decision makers NSC Chicago 2010

  22. Understand target audience • What are their characteristics? • Special needs (e.g. literacy) • Interests, concerns, priorities • Attitudes & beliefs re: problem & solutions to problem • Cultural issues NSC Chicago 2010

  23. Cycle of Program Planning and Evaluation (diagram repeated; next step: Identify resources) NSC Chicago 2010

  24. Identify resources • Community partners • interest in topic • working on similar projects • On-going activities • Sources of financial support • Interests in community NSC Chicago 2010

  25. Cycle of Program Planning and Evaluation (diagram repeated; next step: Set goals & objectives) NSC Chicago 2010

  26. Set goals & objectives • Goal • broad statement of what program is trying to accomplish • Objectives • Specific • Measurable • Time-framed NSC Chicago 2010

  27. Cycle of Program Planning and Evaluation (diagram repeated; next step: Choose strategies) NSC Chicago 2010

  28. Choose Strategies • Identify existing strategies/programs • Literature: evidence based? promising practice? • Champion(s) • Successes from other Safe Communities • Develop new strategies: • Logic model (how would it work) • Haddon matrix NSC Chicago 2010

  29. Haddon Matrix NSC Chicago 2010

  30. Three-dimensional Haddon Matrix • Phases (rows): Pre-event, Event, Post-event • Factors (columns): Person, Vehicle/Vector, Physical Environment, Social Environment • Decision criteria (third dimension): Effectiveness, Cost, Freedom, Stigmatization, Preferences, Equity, Feasibility, Other NSC Chicago 2010 Runyan CW. Injury Prevention 1998;4:302-307.
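To make the structure of the three-dimensional matrix concrete, here is a minimal sketch in Python (hypothetical code, not part of the original slides) laying out the phases, factors, and decision criteria, with a few invented bicycle-crash cells:

# Hypothetical sketch of the three-dimensional Haddon Matrix (Runyan, 1998).
# Phases x factors form the classic grid; the decision criteria add a third
# dimension used to weigh candidate interventions. Cell entries are invented
# for illustration only.
PHASES = ["pre-event", "event", "post-event"]
FACTORS = ["person", "vehicle/vector", "physical environment", "social environment"]
CRITERIA = ["effectiveness", "cost", "freedom", "stigmatization",
            "preferences", "equity", "feasibility", "other"]

# One candidate intervention per (phase, factor) cell -- bicycle-crash example.
matrix = {
    ("pre-event", "person"): "rider training",
    ("event", "person"): "helmet use",
    ("event", "physical environment"): "forgiving road surfaces",
    ("post-event", "social environment"): "rapid access to trauma care",
}

# Scoring a cell against selected criteria (1 = low, 5 = high) turns the matrix
# into a simple decision aid for comparing interventions.
scores = {("event", "person"): {"effectiveness": 5, "cost": 4, "feasibility": 5}}

for (phase, factor), intervention in matrix.items():
    assert phase in PHASES and factor in FACTORS  # cells must use valid labels
    print(f"{phase} / {factor}: {intervention} {scores.get((phase, factor), {})}")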

  31. Cycle of Program Planning and Evaluation (diagram repeated; next step: Test & refine implementation, formative evaluation) NSC Chicago 2010

  32. Formative Evaluation • Questions it answers: What is the best way to influence the target population? Will the activities reach the people intended, and be understood and accepted by the target population? How can activities be improved? • Why it’s useful: Improves (pilot-tests) program activities before full-scale implementation; may increase the likelihood the program or policy will succeed; may help stretch resources. NSC Chicago 2010 * Modified from Thompson & McClintock, 2000

  33. Cycle of Program Planning and Evaluation (diagram repeated; next step: Implement) NSC Chicago 2010

  34. Implementation • As planned, with attention to detail • Documented clearly so others can replicate if appropriate NSC Chicago 2010

  35. Cycle of Program Planning and Evaluation (diagram repeated; next step: Process evaluation) NSC Chicago 2010

  36. Process evaluation • Purpose is to address: • What was done? • How was it implemented? • How well was it implemented? • Was it implemented as planned? NSC Chicago 2010

  37. Process evaluation – examples of questions • Who carried out intervention? • Was this the appropriate person/group? • Who supported and opposed intervention? • What methods/activities were used? NSC Chicago 2010

  38. Why is it useful? • Allows replication of programs that work. • Helps understand why programs fail. NSC Chicago 2010 * Modified from Thompson & McClintock, 2000

  39. The intervention cannot be a black box… it must be clearly understood. (Diagram: Idea → ? → Outcome) NSC Chicago 2010

  40. Cycle of Program Planning and Evaluation (diagram repeated; next step: Impact & outcome evaluation) NSC Chicago 2010

  41. Example: Bike helmets • Interventions: Physician counseling parents; Enforcement of helmet law; Media campaign • Impacts: Parental attitudes toward child helmet use; Purchase of helmets; Use of helmets by children • Outcomes: Head injury in bike crashes; Deaths from head injury in crashes NSC Chicago 2010
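A minimal sketch (hypothetical Python, not from the slide) of the same logic model as a data structure; the grouping into interventions, impacts, and outcomes comes from the slide, while the specific intervention-to-impact links drawn here are illustrative assumptions:

# Hypothetical encoding of the bike-helmet logic model. The three columns
# (interventions, impacts, outcomes) come from the slide; which intervention
# feeds which impact is assumed here for illustration.
logic_model = {
    "physician counseling of parents": ["parental attitudes toward child helmet use"],
    "enforcement of helmet law": ["purchase of helmets", "use of helmets by children"],
    "media campaign": ["parental attitudes toward child helmet use",
                       "use of helmets by children"],
}
outcomes = ["head injury in bike crashes", "deaths from head injury in crashes"]

# An impact evaluation measures the middle column (attitudes, purchase, use);
# an outcome evaluation measures the final column (injuries and deaths).
for intervention, impacts in logic_model.items():
    for impact in impacts:
        print(f"{intervention} -> {impact} -> {', '.join(outcomes)}")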

  42. Impact evaluation • Purpose is to address changes in: • knowledge • attitudes • beliefs/ values • skills • behaviors / practices NSC Chicago 2010

  43. Using impact measures to establish effectiveness. Suppose we have a public safety campaign as our strategy. We need to show: Campaign → Behavior → Outcome. If we have already demonstrated that Behavior → Outcome, we simply need to show: Campaign → Behavior. NSC Chicago 2010

  44. Outcome evaluation • Purpose is to address changes in: • injury events (e.g. frequency, type, pattern) • morbidity (e.g. frequency, severity, type) • mortality (e.g. frequency, time to death) • cost (e.g. direct and indirect) NSC Chicago 2010

  45. Evaluation – examples of questions. Did the local policy requiring smoke alarms in apartments… • get passed? • Were people aware of it? • Did people have access to smoke alarms? • Did people get them installed properly? • Do people keep them maintained? • Did it lead to a reduction in the number or rates of: events (e.g. apartment fires), injuries, deaths, costs (e.g. burn center costs, family burden, property loss)? NSC Chicago 2010
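To make the "reduction in rates" question concrete, here is a minimal, hypothetical calculation (all numbers invented, not from the slides) of a pre/post change in apartment-fire injury rates per 100,000 residents:

# Hypothetical pre/post comparison for the smoke-alarm policy example.
# Figures are invented; a real outcome evaluation would use surveillance data
# and, ideally, a comparison community to account for background trends.
pre  = {"injuries": 48, "population": 120_000}   # year before the policy
post = {"injuries": 31, "population": 124_000}   # year after the policy

def rate_per_100k(d):
    return d["injuries"] / d["population"] * 100_000

pre_rate, post_rate = rate_per_100k(pre), rate_per_100k(post)
percent_change = (post_rate - pre_rate) / pre_rate * 100

print(f"Pre-policy rate:  {pre_rate:.1f} injuries per 100,000")
print(f"Post-policy rate: {post_rate:.1f} injuries per 100,000")
print(f"Change: {percent_change:+.1f}%")  # a negative value indicates a reduction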

  46. Cycle of Program Planning and Evaluation (diagram repeated; next step: Disseminate) NSC Chicago 2010

  47. Dissemination • Dissemination not done well • Not attempted • Not based on research about how to disseminate information to intended audience • Dissemination done well • Defining audience • How to access audience • How best to communicate change message to them • Presentation of clear, straightforward messages NSC Chicago 2010

  48. Cycle of Program Planning and Evaluation (diagram repeated) NSC Chicago 2010

  49. Scope of support from UNC IPRC. The scope of an evaluation depends on: • Purpose/question to be addressed • How extensive is the evaluation? (gold standard or brass standard) • What resources are available? • How soon is evaluation required? • How often is evaluation required? • Who/how many people will be involved? NSC Chicago 2010

  50. Questions??? (Cycle of Program Planning and Evaluation diagram repeated) NSC Chicago 2010
