
CSAP’s CAPT Southwest Region Service to Science Academy March 28-29, 2011 Oklahoma City, OK


Presentation Transcript


  1. CSAP’s CAPT Southwest Region Service to Science Academy, March 28-29, 2011, Oklahoma City, OK. Presented by CSAP’s CAPT Southwest Regional Team, Southwest Prevention Center, The University of Oklahoma, 1639 Cross Center Drive, Suite 254, Norman, OK 73019, (800) 853-2572

  2. Approaches to Prevention Evaluation • How Do You Feel About Evaluation? • What comes up for you when you think about it?

  3. Learning Objectives • Understand the purposes of evaluation • Understand the key components of evaluation • Use a logic model as a guide to create an evaluation plan • Identify measures and sources of data for evaluation • Identify the benefits of internal and external evaluation • Use evaluation findings to assist with decision-making

  4. Prevention Evaluation Why is it important?

  5. Evidence-Based Prevention • A prevention activity is judged to be evidence-based if “good” evaluation – research that has been shown to be rigorous according to a set of carefully defined criteria – demonstrates that the activity is effective. • The evaluation should demonstrate that: • the activity produces the positive results intended, and • these results can be attributed to the activity or program rather than to other factors.

  6. Evaluation The systematic collection and analysis of information about program activities, characteristics, and outcomes to reduce uncertainty, improve effectiveness, and make decisions.

  7. It’s About Utility • Planning programs • Monitoring implementation of programs • Improving or redesigning programs • Advancing knowledge of innovative programs • Providing evidence of program effectiveness

  8. Definitions Process Evaluation – Documenting program implementation Outcome Evaluation – Documenting effects that you expect to achieve after the program is implemented

  9. Traditional vs. Collaborative Evaluation

  10. The Collaborative Model The primary mechanism is an evaluation team made up of: • Evaluator • Program Staff • Other Stakeholders (e.g., in a school-based program, stakeholders may include curriculum designers, school board members, teachers, parents, students)

  11. Framework for Evaluation: Engage Stakeholders → Describe the Program → Focus the Evaluation Design → Select Appropriate Methods → Justify Conclusions → Ensure Use and Share Lessons Learned

  12. Step 1: Engage Stakeholders

  13. Stakeholders Those organizations and individuals who care about either the program or the evaluation findings. In general, anyone who has something to gain or lose from the program.

  14. Activity 1: Who Are Your Stakeholders? • Identify the stakeholders in your program • Identify their interests • Rank them from most to least important: • Program stakeholders • Evaluation stakeholders

  15. Activity 1: Things to Keep in Mind • Did you put yourself on the list? • Did you identify competing needs? • Were the agendas of all stakeholders explicit? • Were you clear about what could and couldn’t be accomplished?

  16. Step 2: Describe the Program

  17. Definition of a Logic Model A description of what a program is expected to achieve and how it is expected to work. A map linking together a project’s goals, activities, and services, along with the assumptions behind them.

  18. Benefits of a Logic Model • Develops understanding • Helps monitor progress • Serves as an evaluation framework • Helps expose assumptions • Helps restrain over-promising • Promotes communication

  19. Activity 2: Let Purpose Be Your Guide • Discuss among yourselves the purpose of your evaluation, that is, its utility. • Acquaint your S2S evaluator consultant and/or your invited evaluators with your program and the purpose you hope an evaluation will serve.

  20. Blank Logic Model

  21. Designing a Logic Model • Goals: What risk and protective factors will be addressed? • Focus Population: Who will participate in, or be influenced by, the program? • Strategies: What services and activities will be provided? • “If-Then” Statements: How will these activities lead to expected outcomes? • Short-Term Outcomes: What immediate changes are expected for individuals, organizations, or communities? • Long-Term Outcomes: What changes would the program ultimately like to create?
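  To make the structure above concrete, the components of a logic model can be sketched as a simple data structure. The following is a minimal sketch in Python; the example program, factors, and outcomes are hypothetical placeholders drawn loosely from the drug-free activities scenario later in this deck, not part of the Academy materials.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """One row of the logic model worksheet described above."""
    goals: List[str] = field(default_factory=list)               # risk/protective factors addressed
    focus_population: str = ""                                   # who participates or is influenced
    strategies: List[str] = field(default_factory=list)          # services and activities provided
    if_then_statements: List[str] = field(default_factory=list)  # how activities lead to outcomes
    short_term_outcomes: List[str] = field(default_factory=list) # immediate expected changes
    long_term_outcomes: List[str] = field(default_factory=list)  # ultimate changes sought

# Hypothetical example (illustrative only)
example = LogicModel(
    goals=["Strengthen bonding with non-using peers (protective factor)"],
    focus_population="Middle-school youth in the community",
    strategies=["Develop an inventory of drug-free activities", "Promote the activities to youth"],
    if_then_statements=["IF youth know what's available, THEN they'll be more likely to participate"],
    short_term_outcomes=["More youth participate in drug-free activities"],
    long_term_outcomes=["Reduced youth ATOD use"],
)
print(example.if_then_statements[0])
```

  Writing the model down this way makes gaps easy to spot: an empty field signals a column of the worksheet that still needs attention.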

  22. Sample Logic Model A

  23. Sample Logic Model A

  24. Sample Logic Model A

  25. Assumptions • Identify the assumptions underlying your program. • Do your program activities lead logically to your goals? • How and why do you expect your program to achieve your goals? • What are the steps that will lead logically from your program activities to your goals?

  26. “If-Then” Statements • IF the program invests time and money to develop an inventory of the drug-free activities…THEN youth will be more informed about opportunities within the community. • IF youth know what’s available…THEN they’ll be more likely to participate. • IF youth participate in alternative drug-free activities…THEN they’ll be more likely to develop friendships with non-using peers and THEN be less likely to use ATOD themselves.

  27. Sample Logic Model A

  28. The Short & Long of It Short-Term Outcomes – the immediate program effects that you expect to achieve (e.g., improving problem solving skills) Long-Term Outcomes – the long-term or ultimate effects of the program (e.g., reducing drug use)

  29. A Word about Outcomes • There is no right number of outcomes • The more immediate the outcome, the more influence the program has over its achievement • The longer term the outcome, the less direct influence a program has over its achievement • The fact that other forces affect an outcome doesn’t mean it shouldn’t be included • Long-term outcomes shouldn’t go beyond the program’s purpose or focus population

  30. Sample Logic Model A

  31. Activity 3: Generating a Logic Model • Review the example, “Sample Logic Model A.” • Complete Row 1 of the Blank Logic Model work sheet using your program. • Record your Logic Model on chart paper and post.

  32. Step 3: Focus the Evaluation Design

  33. Activity 4: Logic Model Focus • Focus your Logic Model on the area toward which you would like to direct your evaluation efforts. Whether you brought a Program Logic Model or need to develop one, review and assess your possibilities, recalling the purpose you identified earlier. • Discuss this at your table.

  34. Designing an Evaluation Clarify the PURPOSE of the evaluation → which leads to QUESTIONS → which require INFORMATION and data → obtained from METHODS

  35. Points to Consider • Keep in mind the purpose of the evaluation • What’s going to be evaluated • Who wants to know what • When you need the information • What you intend to do with the evaluation results • Resources you have available for the evaluation (e.g., time, money, people)

  36. Be realistic . . . But think creatively.

  37. Tips for Generating Evaluation Questions • 3 to 5 questions are often adequate. • Use open-ended questions, not “yes-no.” • Avoid compound questions, i.e., questions that include multiple statements. • A good rule of thumb is that the questions start with “To what extent . . .”

  38. Sample Evaluation Questions Process Evaluation • How are resources allocated to various activities? • To what extent was the program implemented as planned? • What obstacles were encountered during program implementation? Outcome Evaluation Over the duration of the program, to what extent has: • School attendance improved? • Community-wide prevention awareness activities changed adult norms about substance use? • Youth substance use decreased?
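  One way to keep questions grounded is to pair each one with the information it requires and a candidate method, following the PURPOSE → QUESTIONS → INFORMATION → METHODS chain from slide 34. The pairings below are a minimal sketch with hypothetical entries, not a prescribed mapping from the Academy.

```python
# Hypothetical mapping of evaluation questions to required information and methods
evaluation_plan = [
    {
        "question": "To what extent was the program implemented as planned?",
        "information": "Sessions delivered vs. scheduled; participant attendance logs",
        "method": "Process evaluation: implementation checklists, staff interviews",
    },
    {
        "question": "To what extent has school attendance improved?",
        "information": "Attendance rates before and after the program",
        "method": "Outcome evaluation: school administrative records",
    },
]

for item in evaluation_plan:
    print(item["question"])
    print("  needs:", item["information"])
    print("  via:  ", item["method"])
```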

  39. Beware of the “Black Box”! Outcome evaluations that focus solely on program effects are risky: without process data, you can’t explain why results did or did not occur.

  40. Activity 5: Developing Your Questions • Complete Row 2 of your “Blank Logic Model” work sheet, developing 2-3 evaluation questions for each column. • Select the top 3 evaluation questions you would like answered.

  41. Sample Logic Model A

  42. Step 4: Select Appropriate Methods

  43. Evaluation Methods

  44. Sample Logic Model A

  45. Activity 6: Identifying Data Collection Sources • Complete Row 3 of your “Blank Logic Model” work sheet, identifying 2-3 data sources for each column. • What gaps are you noticing? Does your logic model still link together? • Are you beginning to favor a particular method? Quantitative? Qualitative? Or both?

  46. Benefits of Quantitative Methods • Standardized • Succinct • Easily aggregated for analysis • Systematic • Easily presented in short space • The ability to generalize is widely accepted
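  As a small illustration of how readily quantitative data aggregate, the sketch below computes a simple pre/post change on a survey scale. The scores and the scale name are invented for illustration and are not drawn from the Academy materials.

```python
# Hypothetical pre/post scores on a 5-point "prevention awareness" scale (illustrative only)
pre_scores = [2, 3, 2, 4, 3, 2, 3]
post_scores = [3, 4, 3, 4, 4, 3, 4]

def mean(values):
    """Average of a list of numeric scores."""
    return sum(values) / len(values)

change = mean(post_scores) - mean(pre_scores)
print(f"Pre mean:  {mean(pre_scores):.2f}")
print(f"Post mean: {mean(post_scores):.2f}")
print(f"Average change: {change:+.2f}")
```

  A change score like this is only a starting point; attributing it to the program still requires the comparison logic discussed under evidence-based prevention.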

  47. Benefits of Qualitative Methods • Detailed and variable • Can detect unanticipated benefits and/or concerns • Offer explanations for short-term outcomes • Help generate new ideas and/or theories

  48. Benefits of Multi-Method Evaluation • Understand program processes and outcomes from multiple perspectives • Strengths of some methods compensate for weaknesses in others • Results will be useful to a variety of audiences • Results will be credible to a variety of audiences

  49. Limitations of Multi-Method Evaluation • Requires multiple evaluation skills and evaluation team members • Cost is usually higher than single method evaluations • Methodological rigor possible with single method evaluation may be sacrificed • Contradictory or inconsistent findings may require additional analysis and increase complexity of reporting
