Evaluation


Presentation Transcript


  1. Evaluation

  2. Module 5: Evaluation Introductions • Name • Experience with evaluation

  3. Module 5: Evaluation Learning Objectives • Explain the purpose and benefits of evaluation • Describe the five parts of an evaluation plan • Understand the components and relationships depicted in a program logic model • Identify program stakeholders • Define the different types of evaluation • Determine indicators • Define data collection methods • Identify data sources • Understand basic data analysis techniques and methods of data interpretation • Describe goal(s) of an evaluation report and its written components

  4. Module 5: Evaluation Agenda: Day 1 (8:30 am – 4:00 pm)

  5. Module 5: Evaluation Agenda: Day 2 (8:30 am – 12:00 pm)

  6. Module 5: Evaluation Logistics • Parking, bathrooms, schedule • Participation • Group rules • Participant Guide

  7. Module 5: Evaluation Getting Started with Evaluation

  8. Getting Started with Evaluation Learning Objectives • Explain the purpose and benefits of evaluation • Describe evaluation planning • List and describe the five components of an evaluation plan

  9. Getting Started with Evaluation What is Evaluation? “The systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or make informed decisions about future program development.” Source: U.S. Centers for Disease Control and Prevention (CDC)

  10. Getting Started with Evaluation Why Conduct Evaluation? Evaluation can help you to: • Determine the extent to which a program or policy is happening as planned • Measure program or policy outcome and impact over time • Improve program or policy performance • Demonstrate accountability and efficient use of resources

  11. Getting Started with Evaluation Other Benefits of Evaluation Evaluation can provide: • An opportunity to listen to the people served/affected by a policy or program • Improved employee/supporter morale • Increased media coverage • Broadened program visibility and reach

  12. Getting Started with Evaluation Evaluation Planning What is an Evaluation Plan? An evaluation plan is a roadmap of the steps and processes involved in your evaluation

  13. Getting Started with Evaluation Evaluation Planning Why Have an Evaluation Plan? An evaluation plan will help you: • Define the evaluation scope and expectations • Identify strategies and methods for data collection and analysis • Determine time and resources required • Organize evaluation activities • Prioritize actions and needs

  14. Getting Started with Evaluation Developing an Evaluation Plan • Program description: program goal, logic model • Evaluation design: program objective, stakeholders, process and outcome evaluation, evaluation questions • Data collection: indicators, quantitative/qualitative data, data collection methods, sample population/methods • Data analysis and interpretation: analysis strategy, quantitative/qualitative analysis, interpretation • Reporting: report goal(s), report audience, report type
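
The roadmap above can also be kept as a simple checklist. Below is a minimal sketch in Python (not part of the original module); the names of parts 3–5 are assumptions inferred from the learning objectives, while parts 1 and 2 match the section titles used later in the deck.

    # Minimal sketch of the five-part evaluation plan as nested data.
    # Section names for parts 3-5 are assumptions inferred from the
    # module's learning objectives, not taken verbatim from the slides.
    EVALUATION_PLAN = {
        "1. Program Description": ["program goal", "logic model"],
        "2. Evaluation Design": [
            "program objective", "stakeholders",
            "process and outcome evaluation", "evaluation questions",
        ],
        "3. Data Collection": [
            "indicators", "quantitative/qualitative data",
            "data collection methods", "sample population/methods",
        ],
        "4. Data Analysis and Interpretation": [
            "analysis strategy", "quantitative/qualitative analysis",
            "interpretation",
        ],
        "5. Reporting": ["report goal(s)", "report audience", "report type"],
    }

    if __name__ == "__main__":
        # Print the plan as a simple checklist.
        for section, elements in EVALUATION_PLAN.items():
            print(section)
            for element in elements:
                print(f"  [ ] {element}")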

  15. Getting Started with Evaluation Evaluation Planning [diagram: evaluation plan steps 1–2]

  16. Getting Started with Evaluation Evaluation Planning [diagram: evaluation plan step 3]

  17. Getting Started with Evaluation Evaluation Planning [diagram: evaluation plan steps 4–5]

  18. Getting Started with Evaluation Case Study • Smoke-Free Public Spaces Initiative • Reduce exposure to secondhand smoke among residents through widespread adoption and enforcement of smoke-free policies in public places. • Policy • Outreach • Education • Media

  19. Module 5: Evaluation Tea Break

  20. Module 5: Evaluation Section 1: Program Description

  21. Section 1: Program Description Evaluation: Program Description [evaluation plan roadmap with Section 1 highlighted: program goal, logic model]

  22. Section 1: Program Description Evaluation Planning: Program Description [diagram: evaluation plan step 1]

  23. Section 1: Program Description What is the Goal? The goal of your program or policy should describe the: • Overall purpose of the program or policy • Intended results of the program or policy • Specific target population that will be affected by the program or policy What are you trying to accomplish with the program or policy?

  24. Section 1: Program Description What is a Logic Model? A logic model graphically depicts the relationships among: • Resources that are used • Activities that take place • Products or immediate changes created during the program • Benefits or longer-term changes that result from the program

  25. Section 1: Program Description Logic Model - Purpose Logic models help: • Determine whether program activities logically lead to desired outcomes • Stakeholders understand the program’s purpose, the necessary resources, the activities it will conduct, and its capacity to effect change • Develop questions to be answered through the evaluation

  26. Section 1: Program Description Logic Model - Template [template columns: Inputs, Activities, Outputs, Outcomes (Short-Term, Intermediate, Long-Term)]

  27. Section 1: Program Description Logic Model - Components • Inputs: Resources to implement program • Activities: Actions taken to implement program • Outputs: Tangible, direct results of program activities • Outcomes: Changes expected as a result of program

  28. Section 1: Program Description What are Inputs? • Resources to implement the program or policy • Examples: • Funding • Local government agencies • Agency leaders • Local health care organizations • Local worksites • Local media outlets • Others?

  29. Section 1: Program Description What are Activities? • Describe events and actions that will enable the project to reach its objectives and goal • What will happen during the project? • Examples: • Develop a model smoke-free public places policy and implementation steps. • Meet with leaders in government agencies and business to educate them on smoke-free policy and implementation steps. • Develop and implement an anti-smoking media campaign for TV, radio, online, and in print. • Others?

  30. Section 1: Program Description What are Outputs? • Tangible, direct results of program activities • Examples: • Training on model smoke-free policy and implementation • Exposure to information about the dangers of secondhand smoke • Meetings with government agencies and business leaders • Others?

  31. Section 1: Program Description What are Outcomes? • Changes expected as a result of program Examples:

  32. Section 1: Program Description Question: How are outcomes different from outputs?

  33. Section 1: Program Description Outputs vs. Outcomes

  34. Section 1: Program Description Case Study: Program Goal

  35. Section 1: Program Description Case Study: Logic Model • Inputs: funding; local govt. agencies; agency leaders; local worksites; local media outlets; general public • Activities: write smoke-free public places policy; meet with govt./business leaders; create/distribute education materials; develop/implement media campaign; engage media outlets • Outputs: policy document; understanding of policy by govt., business, community; materials given to X # of agencies/businesses; media campaign; media mentions • Short-term outcomes: increased knowledge of/support for policy; policy adoption • Intermediate outcomes: enforcement of policies; reduced tolerance/promotion of tobacco use; increased public places with enforced policies; reduced use of tobacco in/around public places; increased cessation • Long-term outcomes: reduced exposure to secondhand smoke; reduced tobacco consumption; reduced tobacco-related morbidity/mortality
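
To make the four logic-model components concrete, here is a minimal Python sketch that stores the case-study model as typed fields. The class and field names are illustrative assumptions, and the split of outcomes into short-term, intermediate, and long-term follows the reconstruction of the diagram above.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class LogicModel:
        # The four components described on the preceding slides.
        inputs: List[str] = field(default_factory=list)      # resources to implement the program
        activities: List[str] = field(default_factory=list)  # actions taken to implement the program
        outputs: List[str] = field(default_factory=list)     # tangible, direct results of activities
        short_term_outcomes: List[str] = field(default_factory=list)
        intermediate_outcomes: List[str] = field(default_factory=list)
        long_term_outcomes: List[str] = field(default_factory=list)

    # Case-study logic model (Smoke-Free Public Spaces Initiative).
    smoke_free = LogicModel(
        inputs=["funding", "local govt. agencies", "agency leaders",
                "local worksites", "local media outlets", "general public"],
        activities=["write smoke-free public places policy",
                    "meet with govt./business leaders",
                    "create/distribute education materials",
                    "develop/implement media campaign",
                    "engage media outlets"],
        outputs=["policy document",
                 "understanding of policy by govt., business, community",
                 "materials given to agencies/businesses",
                 "media campaign", "media mentions"],
        short_term_outcomes=["increased knowledge of/support for policy",
                             "policy adoption"],
        intermediate_outcomes=["enforcement of policies",
                               "reduced tolerance/promotion of tobacco use",
                               "increased public places with enforced policies",
                               "reduced use of tobacco in/around public places",
                               "increased cessation"],
        long_term_outcomes=["reduced exposure to secondhand smoke",
                            "reduced tobacco consumption",
                            "reduced tobacco-related morbidity/mortality"],
    )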

  36. Module 5: Evaluation Section 2: Evaluation Design

  37. Section 2: Evaluation Design Evaluation: Design [evaluation plan roadmap with Section 2 highlighted: program objective, stakeholders, process and outcome evaluation, evaluation questions]

  38. Section 2: Evaluation Design Evaluation Planning: Design [diagram: evaluation plan step 2]

  39. Section 2: Evaluation Design What are Program Objectives? • Describe the results of actions and activities • Describe what we intend to change as a result of the program or policy • Must be specific, measurable, achievable, relevant, and time-limited

  40. Section 2: Evaluation Design Program Objectives: Case Study Objectives: • By September 2015, 75% of residents will report seeing anti-smoking messages in at least three media channels. • By September 2015, 60% of residents will believe smoking should be banned in restaurants and in public transport (e.g., airports, train stations, buses). • By January 2016, a policy that bans smoking in public places will be adopted.
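
Because each objective is specific, measurable, and time-limited, progress can be checked directly against its numeric target. The Python sketch below uses hypothetical survey figures for illustration; only the targets and deadlines come from the slide.

    from datetime import date

    # Case-study objectives expressed as (description, target %, deadline).
    OBJECTIVES = [
        ("Residents seeing anti-smoking messages in >= 3 media channels",
         75.0, date(2015, 9, 30)),
        ("Residents who believe smoking should be banned in restaurants "
         "and public transport", 60.0, date(2015, 9, 30)),
    ]

    def on_track(observed_pct: float, target_pct: float) -> bool:
        """Return True if the observed indicator meets or exceeds the target."""
        return observed_pct >= target_pct

    # Hypothetical survey results, for illustration only.
    observed = [68.0, 61.5]

    for (description, target, deadline), value in zip(OBJECTIVES, observed):
        status = "on track" if on_track(value, target) else "below target"
        print(f"{description}: {value}% vs target {target}% by {deadline} -> {status}")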

  41. Section 2: Evaluation Design What is a Stakeholder? Stakeholders are people or organizations that are invested in the program, are interested in the results of the evaluation, and/or have a stake in what will be done with the results of the evaluation. Stakeholders can include: • People served or affected by the program/policy • People involved in the program/policy operations • Program funder(s) • Intended users of the evaluation findings

  42. Section 2: Evaluation Design Stakeholders and Evaluation Stakeholders can: • Provide input at every stage of evaluation • Help implement evaluation activities • Help increase the credibility of the processes • Promote the use of evaluation findings • Provide advocacy support

  43. Section 2: Evaluation Design Question: Who would you consider to be the stakeholders for our case study evaluation?

  44. Section 2: Evaluation Design Types of Evaluation • Process evaluation • Outcome evaluation

  45. Section 2: Evaluation Design Types of Evaluation Process Evaluation: Assesses the delivery or implementation of a program or policy Outcome Evaluation: Measures the effect of the program or policy on the target population Process: Activities, materials, delivery, logistics, number of participants, number of staff, etc. Outcome: Knowledge, attitudes, actions, skills, behaviors, policy

  46. Section 2: Evaluation Design Questions: What is an example of how we might use a process evaluation in our case study? How might we use an outcome evaluation?

  47. Section 2: Evaluation Design Process Evaluation • Assesses the degree to which the program is being delivered as planned • Documents characteristics of the program each time it is delivered (across staff or sites) • Provides information for improving program elements and quality • Contributes to quality control • Provides information on program accountability Helps you fully understand how a program works

  48. Section 2: Evaluation Design Outcome Evaluation • Focuses on the results of a program or policy by answering questions such as: • Did knowledge, attitudes, and practices change as a result of the program or policy? • Did the program or policy have the desired effect on the target population? Outcome evaluation measures the effect of a program or policy on the target population over the short term, intermediate term, or long term

  49. Section 2: Evaluation Design Short-Term Outcome Evaluation Short-Term Immediate or early results of a program or policy, such as: • Amount of change in the target population’s knowledge, attitudes, and beliefs • Degree to which program/policy is meeting short-range objectives • Evidence of which materials/activities appear to be effective • Indications of how resources might be best used Knowledge, Attitude, Skills

  50. Section 2: Evaluation Design Intermediate Outcome Evaluation Intermediate Seeks to measure further progress in reaching long-term goal, including: • Type of action taken by target population to make changes related to increased knowledge • Indications about how well the program was able to help participants apply what they learned Practices, Norms, Environment
