Presentation Transcript


  1. Evelyn Gonzalez Program Evaluation

  2. Making a difference AR Cancer Coalition Summit XIV March 12, 2013 • Evaluating Programmatic Efforts

  3. Objectives • Overview of evaluation • Defining SMART objectives for your goals • Know how to use different methods of evaluation • Be more willing to evaluate your efforts

  4. What is Program Evaluation? …the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming. (Patton, Utilization-Focused Evaluation, 1997)

  5. Why Evaluate? • Did the program/intervention work? • Was it worth it? • What worked; what didn’t? • Who did we reach? • Did we get our money’s worth?

  6. When should we begin Evaluation?

  7. Evaluation Plan An evaluation plan is the “blueprint” • What will be evaluated • What information will be collected • When will it be collected • What will be done with the results

  8. CDC Framework for Evaluation

  9. [Diagram: program planning and evaluation cycle] • START: Identify an Evidence-Based Program (EBP) • Planning Phase: establish goals & objectives; establish a baseline • Implementation Phase: implement the program; gather data as you go; monitor • Evaluation Phase: evaluate as you implement and at the end of the program; collect and analyze data • The community/audience and stakeholders are involved throughout

  10. Goals: Definition • The “grand reason” for engaging in your public health effort • Span 3 or more years • State the desired end result of the program.

  11. Objectives: Definition • More specific than goals. • They state how the goals will be achieved in a certain timeframe. • Well written objectives are SMART: • Specific • Measurable • Achievable • Realistic and Relevant • Time-framed

  12. S.M.A.R.T. • Specific • Who are you reaching (priority audience)? • What intervention will you use? • Where will it take place (setting)?

  13. S.M.A.R.T. • Measurable • Dosing: how many times will you deliver the intervention? • What is the expected outcome? • Increase of X% following the intervention • Decrease in smoking of X%

  14. S.M.A.R.T. • Attainable • Is your intervention feasible? • Realistic and Relevant • Does the objective match the goal? • Is it an evidence-based program (EBP)?

  15. S.M.A.R.T. • Time-framed • By when do you anticipate the change? • End of the session • 3, 6, or 9 months • 5 years

  16. SMART Objective Exercise • You are working on an intervention that will increase awareness about breast cancer risk. • Objective 1: Participants will be aware of the major risk factors for developing breast cancer. • How can this be rewritten to be SMART?

  17. SMART Objective Exercise • Original: • Participants will be aware of the major risk factors for developing breast cancer. • SMART Objective: • On a post-test following the intervention, participants will be able to identify 3 major risk factors for developing breast cancer.

  18. Rewritten: • Original: • This program will increase screening for colorectal cancer in Arkansas. • SMART: • Colorectal cancer screening will increase by 5% over the prior year among age-appropriate males in Arkansas.
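
To make the measurable part of that objective concrete, here is a minimal sketch of the arithmetic check, assuming hypothetical screening counts (not actual Arkansas data):

```python
# Hypothetical screening counts for age-appropriate males (illustration only).
prior_year_screened = 12_400
current_year_screened = 13_100

percent_change = (current_year_screened - prior_year_screened) / prior_year_screened * 100
target = 5.0  # the SMART objective: a 5% increase over the prior year

print(f"Change in screenings: {percent_change:+.1f}%")
print("Objective met" if percent_change >= target else "Objective not met")
```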

  19. Goal: Promote and Increase the Appropriate Utilization of High-Quality Breast Cancer Screening • Objective 1: Public Education for Breast Cancer Screening – Increase knowledge and improve attitudes of all women with regard to the importance of breast cancer screening • Strategy 1 – Promote campaigns to educate the public about the importance of mammography • Action 1 – Increase awareness among all women 40 and older of the importance of regular breast cancer screening

  20. The Evaluation Procedure • Planning—Develop the questions, consult with the program stakeholders or resources, make a timeline • Data Collection—Pilot testing; how will the questions be asked? Who will ask them? • Data Analysis—Who will analyze the data and how? • Reporting—Who will report and how? Who will receive the data and when? How will it affect the program? • Application—How could your results be applied in other places?

  21. Planning for Evaluation • Look at the evaluation methods used in the original EBP. • When discussing evaluation, think about these questions: • What is important to know? • What do you need to know versus what is nice to know? • What will be measured and how? • How will this information be used?

  22. Some definitions… • Indicators or measures are the observable and measurable data that are used to track a program’s progress in achieving its goals. • Monitoring (program or outcome monitoring, for example) refers to ongoing measurement activity.
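
As a rough illustration of what ongoing monitoring of an indicator can look like in practice, a minimal sketch that records a hypothetical indicator at several time points and reports progress from a baseline toward a target (all names and values are made up):

```python
# Hypothetical indicator: participants screened per month.
baseline = 80   # value before the program started
target = 120    # value the objective aims for
monthly_counts = {"Jan": 85, "Feb": 92, "Mar": 101, "Apr": 110}

for month, count in monthly_counts.items():
    progress = (count - baseline) / (target - baseline) * 100
    print(f"{month}: {count} screened ({progress:.0f}% of the way to the target)")
```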

  23. Process Evaluation • Process evaluation can find problems early on in the program. • It includes an assessment of the staff, budget review, and how well the program is doing overall. • For this kind of evaluation, it may be useful to keep a log sheet to record each of your activities. From Windsor et al., 1994

  24. Impact Evaluation • Impact evaluation can tell if the program has a short-term effect on the behavior, knowledge, and attitudes of your population. • It also measures the extent to which you have met your objectives. From Green and Kreuter, 1991

  25. Outcome Evaluation • Outcome evaluation looks to see if the long-term program goals were met. • These goals could be changes in rates of illness or death, as well as in the health status of your population. From McKenzie & Smeltzer, 1997

  26. Application to Your Program: • Identify Program Goals • For each goal: • Identify Process Objectives • Identify Outcome Objectives • For each objective: • Identify Indicators • Identify Data Source • Plan Data Collection • Plan Data Analysis
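
One way to keep this goal-to-objective-to-indicator mapping explicit is to write the plan down as structured data before collection starts. A minimal sketch, with a hypothetical goal, objectives, indicators, and data sources:

```python
# A hypothetical evaluation plan: goal -> objectives -> indicators -> data sources.
evaluation_plan = {
    "goal": "Increase appropriate breast cancer screening",
    "objectives": [
        {
            "type": "process",
            "statement": "Deliver 12 community education sessions by December",
            "indicator": "Number of sessions held",
            "data_source": "Activity log sheets",
        },
        {
            "type": "outcome",
            "statement": "Participants identify 3 major risk factors on the post-test",
            "indicator": "Percent of post-tests listing 3 correct risk factors",
            "data_source": "Pre/post surveys",
        },
    ],
}

for obj in evaluation_plan["objectives"]:
    print(f"{obj['type'].title()} objective: {obj['statement']}")
    print(f"  Indicator: {obj['indicator']} (source: {obj['data_source']})")
```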

  27. Data Collection Methods • Surveys • Interviews • Focus Groups • Observation • Document Review

  28. Pre- and Post-Evaluation • You may develop a way to compare the baseline data from the needs assessment with the final outcome of your program. • For example, a pre/post survey in an education session. • This will let you see if you have achieved your objectives.
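
A minimal sketch of such a pre/post comparison, assuming each participant's score is the number of risk factors correctly identified (the scores below are invented for illustration):

```python
# Hypothetical pre- and post-test scores: risk factors correctly identified (0-5).
pre_scores = [1, 2, 1, 0, 3, 2, 1, 2]
post_scores = [3, 4, 3, 2, 5, 4, 3, 3]   # same participants, same order

pre_mean = sum(pre_scores) / len(pre_scores)
post_mean = sum(post_scores) / len(post_scores)

print(f"Mean pre-test score:  {pre_mean:.2f}")
print(f"Mean post-test score: {post_mean:.2f}")
print(f"Average change per participant: {post_mean - pre_mean:+.2f}")
```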

  29. Information Collection • Primary sources • Quantitative: Surveys/questionnaires • Qualitative: Focus groups, public meetings, direct observation • Qualitative: In-depth interviews with community leaders, interviews with other program planners.

  30. Strategies • Will depend on which EBP/Intervention selected • Answer these questions: • What specific behaviors do I want my audience to acquire or enhance? • What information or skills do they need to learn to act in a new way? • What resources do I need to carry out the program? • What methods would best help me meet my objectives?

  31. Using Mixed Data Sources/Methods • Involves using more than one data source and/or data collection method.

  32. Program Objectives and Evaluation • Your objectives should be measurable so that they can be evaluated. • The evaluation should be in line with your objectives. • Try not to make up new things to evaluate.

  33. Pilot Testing • You may want to do a pilot test in order to evaluate the effect of your program. • A pilot test is a practice run using a small group who are similar to your target audience.

  34. Replicating the Evaluation • Evidence-based programs have already done some type of evaluation. • Look to see how the program was evaluated before. Try to use the same methods. • You do not have to evaluate everything!

  35. Monitoring Progress

  36. Now that you’ve collected the data, what do you do with it? • Analyzing data • Who • When • How • Interpretation of results and sharing findings

  37. So what? • Must be able to answer this! • Do not just look for the good outcomes • Learn from what didn’t work • Share both the positive and negative outcomes

  38. Developing Recommendations Your evaluation’s recommendations should be: • Linked to the original goals/SMART objectives • Based on answers to your evaluation questions • Informed by stakeholder input • Tailored to the end users of the evaluation results to increase ownership and motivation to act

  39. Sharing Recommendations • Community: Executive Summary, Final Report, Newsletter article(s), Website article, Town hall meeting(s), Radio interviews, Local newspapers • Institution & Yourself: Executive Summary, Final Report, Journal articles, Professional conferences, Poster sessions, Meetings with colleagues

  40. Tips & considerations • Consult with partners with evaluation experience • Budget 10-15% for evaluation • Staffing • Build a database • Analysis • Consider pilot testing your program • Pilot test your evaluation method & tool(s)

  41. “Trust yourself. You know more than you think you do!” – Benjamin Spock
