
Fundamentals of Evaluation: Real World Evaluation Approaches

Learn about the fundamentals of program evaluation, including engaging stakeholders, describing the program, focusing the evaluation design, and gathering credible evidence.


Presentation Transcript


  1. FUNDAMENTALS OF EVALUATION • Part I: Real World Evaluation Approaches • May 16, 2012 • YALE-GRIFFIN PREVENTION RESEARCH CENTER • David L. Katz, MD, MPH • Valentine Yanchou Njike, MD, MPH • Jesse Reynolds, MS

  2. Specific Evaluation Steps • Engage stakeholders • Describe the program • Focus the evaluation design • Gather credible evidence • Justify conclusions • Ensure use and share lessons learned • Adapted from Joint Committee on Standards for Educational Evaluation. Program evaluation standards: how to assess evaluations of educational programs. 2nd ed. Thousand Oaks, CA: Sage Publications, 1994.

  3. Each step is broken down into 3 parts: • Definition • Role • Activities

  4. Step 1: Engage stakeholders (defined) • Fostering input, participation, and power-sharing among those persons who have an investment in the conduct of the evaluation and the findings. • It is especially important to engage primary users of the evaluation.

  5. Step 1: Engage stakeholders (role) • Increases the chance that the evaluation will be useful. • Can improve the evaluation’s credibility, clarify roles and responsibilities, enhance cultural competence, help protect human subjects, and avoid real or perceived conflicts of interest.

  6. Step 1: Engage stakeholders (activities) • Consulting insiders (e.g., leaders, staff, clients, and program funding sources) and outsiders (e.g., skeptics) • Taking special effort to promote the inclusion of less powerful groups or individuals • Coordinating stakeholder input throughout the process of evaluation design, operation, and use; • Avoiding excessive stakeholder identification, which might prevent progress of the evaluation.

  7. Step 2: Describe the program (defined) • Providing specific detail about the features of the program being evaluated, including its purpose and place in a larger public health context. • Description includes information regarding: • The way the program was intended to function and the way it actually was implemented. • Features of the program’s context that are likely to influence conclusions regarding the program.

  8. Step 2: Describe the program (role) • Improves evaluation’s fairness and accuracy. • Permits a fair assessment of strengths and weaknesses and helps stakeholders understand how program features fit together and relate to a larger context.

  9. Step 2: Describe the program (activities) • Characterizing the need (or set of needs) addressed by the program; • Listing specific expectations as goals, objectives, and criteria for success; • Clarifying why program activities are believed to lead to expected changes;

  10. Step 2: Describe the program (activities) • Drawing an explicit logic model to illustrate relationships between program elements and expected changes; • Assessing the program’s maturity or stage of development; • Analyzing the context within which the program operates; • Considering how the program is linked to other ongoing efforts; and • Avoiding creation of an overly precise description for a program that is under development.
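
To make the logic-model activity concrete, here is a minimal sketch of a logic model captured as a data structure, in Python. The stage names follow common logic-model convention (inputs, activities, outputs, outcomes); the sample entries are hypothetical and not drawn from any specific program.

```python
# A minimal, hypothetical sketch of a program logic model as a data
# structure. Stage names follow common logic-model convention
# (inputs -> activities -> outputs -> outcomes); the sample entries
# are illustrative only.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)      # resources invested
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # direct products
    outcomes: list[str] = field(default_factory=list)    # expected changes

    def describe(self) -> str:
        """Render the causal chain as a readable arrow diagram."""
        stages = [self.inputs, self.activities, self.outputs, self.outcomes]
        return "\n  -> ".join("; ".join(stage) for stage in stages)


model = LogicModel(
    inputs=["staff", "funding", "community partners"],
    activities=["health education sessions"],
    outputs=["sessions delivered", "participants reached"],
    outcomes=["improved knowledge", "behavior change"],
)
print(model.describe())
```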

  11. Step 3: Focus the evaluation design (defined) • Planning in advance where the evaluation is headed and what steps will be taken. • The process continues until a focused approach is found to answer evaluation questions with methods that stakeholders agree will be: • Useful • Feasible • Ethical • Accurate • Evaluation questions and methods can be adjusted to achieve an optimal match that makes the evaluation useful to primary users.

  12. Step 3: Focus the evaluation design (role) • Provides investment in quality • Increases the chances that the evaluation will succeed by identifying procedures that are: • Practical • Politically viable • Cost-effective • Failure to plan thoroughly can be self-defeating, leading to an evaluation that might become impractical or useless. • When stakeholders agree on a design focus, it is used throughout the evaluation process to keep the project on track.

  13. Step 3: Focus the evaluation design (activities) • Meeting with stakeholders to clarify the real intent or purpose of the evaluation; • Learning which persons are in a position to actually use the findings, then orienting the plan to meet their needs; • Understanding how the evaluation results are to be used; • Writing explicit evaluation questions to be answered;

  14. Step 3: Focus the evaluation design (activities) • Describing practical methods for sampling, data collection, data analysis, interpretation, and judgment; • Preparing a written protocol or agreement that summarizes the evaluation procedures, with clear roles and responsibilities for all stakeholders; and • Revising parts or all of the evaluation plan when critical circumstances change.
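
The written protocol described above can also be kept in a structured, checkable form so the agreed focus stays visible throughout the project. A minimal sketch, assuming a plain Python dictionary; every field name and entry here is hypothetical.

```python
# A minimal, hypothetical sketch: recording the core of a written
# evaluation protocol in a structured form. All names and entries
# are illustrative, not prescribed by the framework.
evaluation_protocol = {
    "purpose": "Assess whether the program meets its stated objectives",
    "primary_users": ["program director", "funding agency"],
    "questions": [
        {
            "question": "Did participant knowledge improve?",
            "method": "pre/post survey",
            "sampling": "all enrolled participants",
            "responsible": "evaluation staff",
        },
    ],
    "roles": {"data collection": "field staff", "analysis": "evaluator"},
}

# Simple consistency check: every question names a method and an owner.
for q in evaluation_protocol["questions"]:
    assert q["method"] and q["responsible"], f"incomplete entry: {q['question']}"
```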

  15. Step 4: Gather credible evidence (defined) • Compiling information that stakeholders perceive as trustworthy and relevant for answering their questions. • Such evidence can be experimental or observational, qualitative or quantitative, or it can include a mixture of methods. • Adequate data might be available and easily accessed, or new data might need to be defined and collected.

  16. Step 4: Gather credible evidence (defined cont.) • Whether a body of evidence is credible to stakeholders might depend on such factors as: • How the questions were posed • Sources of information • Conditions of data collection • Reliability of measurement • Validity of interpretations • Quality control procedures
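
One of these factors, reliability of measurement, can be checked quantitatively. A minimal sketch using Cronbach's alpha as an internal-consistency estimate for a multi-item measure; the response data below are illustrative only.

```python
# A minimal sketch of one credibility check: internal-consistency
# reliability of a multi-item measure, estimated with Cronbach's alpha.
# The response matrix is illustrative (rows = respondents,
# columns = survey items).
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / total-score variance)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)


responses = np.array([
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
    [4, 4, 5],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")  # values near 1 suggest consistency
```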

  17. Step 4: Gather credible evidence (role) • Enhances the evaluation’s utility and accuracy • Guides the scope and selection of information and gives priority to the most defensible information sources • Promotes the collection of valid, reliable, and systematic information that is the foundation of any effective evaluation.

  18. Step 4: Gather credible evidence (activities) • Choosing indicators that meaningfully address evaluation questions; • Describing fully the attributes of information sources and the rationale for their selection; • Establishing clear procedures and training staff to collect high-quality information;

  19. Step 4: Gather credible evidence (activities) • Monitoring periodically the quality of information obtained and taking practical steps to improve quality; • Estimating in advance the amount of information required or establishing criteria for deciding when to stop collecting data in situations where an iterative or evolving process is used; and • Safeguarding the confidentiality of information and information sources.
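
Estimating in advance the amount of information required often takes the form of a sample-size calculation. A minimal sketch for comparing two proportions at conventional error rates; the baseline and target rates are placeholders, not recommendations.

```python
# A minimal sketch of estimating in advance how much data is required:
# per-group sample size to detect a difference between two proportions
# at 5% two-sided alpha and 80% power, using the standard
# normal-approximation formula. The rates below are placeholders.
import math
from statistics import NormalDist


def sample_size_two_proportions(p1: float, p2: float,
                                alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-group n for a two-sided test of p1 vs. p2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_b = NormalDist().inv_cdf(power)          # critical value for power
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)


# e.g., to detect an increase from 30% to 45% in a target behavior:
print(sample_size_two_proportions(0.30, 0.45))  # roughly 160+ per group
```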

  20. Step 5: Justify conclusions (defined) • Making claims regarding the program that are warranted on the basis of data that have been compared against pertinent and defensible ideas of merit, value, or significance (i.e., against standards of values) • Conclusions are justified when they are linked to the evidence gathered and consistent with the agreed on values or standards of stakeholders.

  21. Step 5: Justify conclusions (role) • Reinforces conclusions central to the evaluation’s utility and accuracy • Involves: • Values clarification • Qualitative and quantitative data analysis and synthesis • Systematic interpretation and appropriate comparison against relevant standards for judgment

  22. Step 5: Justify conclusions (activities) • Using appropriate methods of analysis and synthesis to summarize findings; • Interpreting the significance of results for deciding what the findings mean; • Making judgments according to clearly stated values that classify a result (e.g., as positive or negative and high or low); • Considering alternative ways to compare results (e.g., compared with program objectives, a comparison group, national norms, past performance, or needs);
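
One comparison listed above, program results versus a comparison group, can be made with a standard two-sample test. A minimal sketch using scipy.stats; the outcome scores are illustrative only.

```python
# A minimal sketch of one comparison strategy: program participants
# vs. a comparison group on a continuous outcome, using Welch's
# two-sample t-test. The scores below are illustrative only.
from scipy import stats

program_group = [72, 85, 78, 90, 81, 77, 88]
comparison_group = [70, 74, 69, 80, 73, 75, 71]

t_stat, p_value = stats.ttest_ind(program_group, comparison_group,
                                  equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value alone does not justify a conclusion; it must be
# weighed against stakeholders' agreed standards and the alternative
# explanations described on the next slide.
```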

  23. Step 5: Justify conclusions (activities) • Generating alternative explanations for findings and indicating why these explanations should be discounted; • Recommending actions or decisions that are consistent with the conclusions; and • Limiting conclusions to situations, time periods, persons, contexts, and purposes for which the findings are applicable.

  24. Step 6: Ensure use and share lessons learned (defined) • Ensuring that • a) stakeholders are aware of the evaluation procedures and findings • b) the findings are considered in decisions or actions that affect the program (i.e., findings use) • c) those who participated in the evaluation process have had a beneficial experience (i.e., process use).

  25. Step 6: Ensure use and share lessons learned (role) • Ensures that evaluation achieves its primary purpose — being useful • Several factors might influence the degree of use: • Evaluator credibility • Report clarity • Report timeliness and dissemination • Disclosure of findings • Impartial reporting • Changes in the program or organizational context

  26. Step 6: Ensure use and share lessons learned (activities) • Designing the evaluation to achieve intended use by intended users; • Preparing stakeholders for eventual use by rehearsing throughout the project how different kinds of conclusions would affect program operations; • Providing continuous feedback to stakeholders regarding interim findings, provisional interpretations, and decisions to be made that might affect likelihood of use; • Scheduling follow-up meetings with intended users to facilitate the transfer of evaluation conclusions into appropriate actions or decisions; and • Disseminating both the procedures used and the lessons learned from the evaluation to stakeholders, using tailored communications strategies that meet their particular needs.
