
Evaluation 101


Presentation Transcript


  1. Evaluation 101 Presentation at The Conference for Family Literacy, Louisville, Kentucky. By Apter & O’Connor Associates, April 2013

  2. Coalition A group of individuals representing diverse organizations or constituencies who agree to work together to achieve a common GOAL - Feighery & Rogers, 1990

  3. Evaluation is . . . the systematic collection of information . . . to reduce uncertainties, improve effectiveness, and make decisions (Michael Q. Patton, 1988)

  4. Why Evaluate? Provide Accountability to Community, Funders & Stakeholders: Quality, Efficiency, Effectiveness

  5. Why Evaluate? • What gets measured, gets done • If you don’t measure results, you can’t tell success from failure • If you can’t see success, you can’t reward it • If you can’t reward success, you’re probably rewarding failure • If you can’t see success, you can’t learn from it • If you can’t recognize failure, you can’t correct it (Adapted from: Reinventing Government, Osborne and Gaebler, 1992)

  6. Why Evaluate? • Monitor overall progress toward goals • Determine whether individual interventions are producing the desired progress • Permit comparisons among groups • Support continuous quality improvement • Ensure only effective programs are maintained • Justify the need for further funding

  7. Types of Evaluation • Formative - infrastructure, functions and procedures • Process - extent of implementation • Outcome - realization of vision

  8. Collective Impact Model • Common agenda - a vision • Shared measurement system • Mutually reinforcing activities • Continuous communication • Backbone support organization

  9. Vision & Stakeholder Engagement → Implementation of Mutually Reinforcing Activities → Collective IMPACT

  10. Formative Evaluation • Why is the collaboration needed? • Do we have the resources needed? • Do we have strong leadership? • Are the right stakeholders represented? • Is a collaboration the best approach? • Is there a shared vision?

  11. Process Evaluation STRATEGIES: • Are you implementing things as planned? • Are you reaching the target population? • Are you implementing with quality? • How many are you reaching? COALITION: • Are the right people participating? • Are meetings productive? • Are workgroup charges clear? • Is the work beginning?

  12. Outcome Evaluation • What has changed or improved? • Are we achieving our intended goals? • Was the effort worth the time & costs? Outcomes may be Short, Intermediate, or Long Term
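Slides 7 and 10-12 together define the three types of evaluation. As a compact reference, here is a minimal sketch of that mapping as a Python lookup table (the structure and key names are illustrative assumptions; the wording comes from the slides):

    # The three evaluation types from slides 7 and 10-12, as a lookup table.
    EVALUATION_TYPES = {
        "formative": {
            "focus": "infrastructure, functions and procedures",
            "example_question": "Do we have the resources and leadership needed?",
        },
        "process": {
            "focus": "extent of implementation",
            "example_question": "Are you implementing things as planned?",
        },
        "outcome": {
            "focus": "realization of vision",
            "example_question": "Are we achieving our intended goals?",
        },
    }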

  13. Evaluating a Coalition Can be tricky • Each member needs to evaluate its own accomplishments without undermining the success of the whole • We are all in this together, but how do we distinguish the contribution of one agency or one stakeholder from another?

  14. CDC’s Framework for Evaluation • Engage Stakeholders • Set Goals & Plan Programs • Focus the Evaluation Design • Choose Methods & Collect Data • Analyze Data & Interpret Results • Ensure Use & Share Lessons Learned

  15. Standards to Consider: Utility, Feasibility, Propriety, Accuracy (the four evaluation standards at the center of CDC’s framework)

  16. CDC’s Framework for Evaluation • Engage Stakeholders • Set Goals & Plan Programs • Focus the Evaluation Design • Choose Methods & Collect Data • Analyze Data & Interpret Results • Ensure Use & Share Lessons Learned

  17. Engaging Stakeholders • Include stakeholders who are: implementers, partners, participants (those affected), and decision-makers • Establish the evaluation team at the outset, with areas for stakeholder input • Obtain buy-in & commitment to the plan

  18. CDC’s Framework for Evaluation • Engage Stakeholders • Set Goals & Plan Programs • Focus the Evaluation Design • Choose Methods & Collect Data • Analyze Data & Interpret Results • Ensure Use & Share Lessons Learned

  19. Set Goals and Develop Plan • Problem Statement - define the need: Who? Where? Why? • Envision the Future - set your goals • Set the Context • Select Strategies and Set Targets • Connect the Dots . . . create a Logic Model

  20. Logic Models A logic model is a road map for the shared work of all of the stakeholders... it answers the questions: • Where are we now? • Where are we going? • How will we get there?

  21. A Logic Model . . . Can it get any simpler? Needs and Strengths → Strategies → Outcomes

  22. Logical Chain of Connections: Needs & Strengths (where we are) → Strategies: Resources, Activities (what we do), Outputs (who we reach) → Outcomes: Short, Medium, Long-term (our results). University of Wisconsin-Extension, Program Development and Evaluation
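The chain above translates naturally into a simple data structure, which can help keep a coalition’s planning documents consistent. A minimal sketch in Python (the class, field names, and the family-literacy example entries are illustrative assumptions, not from the deck):

    from dataclasses import dataclass

    @dataclass
    class LogicModel:
        """One program's logic model, following the UW-Extension chain."""
        needs_and_strengths: list[str]    # where we are
        resources: list[str]              # what we invest
        activities: list[str]             # what we do
        outputs: list[str]                # who we reach
        short_term_outcomes: list[str]    # our results, by time horizon
        medium_term_outcomes: list[str]
        long_term_outcomes: list[str]

    # Hypothetical family-literacy example:
    model = LogicModel(
        needs_and_strengths=["Low adult literacy rates", "Strong library network"],
        resources=["Coalition staff", "Volunteer tutors", "Grant funding"],
        activities=["Weekly family reading nights"],
        outputs=["120 families reached per quarter"],
        short_term_outcomes=["Parents read with children more often"],
        medium_term_outcomes=["Children's early reading skills improve"],
        long_term_outcomes=["Higher school readiness community-wide"],
    )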

  23. Detailed logic model (diagram). University of Wisconsin-Extension, Program Development and Evaluation

  24. CDC’s Framework for Evaluation • Engage Stakeholders • Set Goals & Plan Programs • Focus the Evaluation Design • Choose Methods & Collect Data • Analyze Data & Interpret Results • Ensure Use & Share Lessons Learned

  25. Focus the Evaluation Design What do we want to know? • Coalition • Programs • Participants • Outcomes • Coalition impact • Influencing factors

  26. Develop Indicators • What will change? • For whom? • By how much? • By when? • Indicators for activities are process indicators; indicators for outcomes are outcome indicators • There can be more than one indicator for each activity or outcome (see the sketch below)
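One way to keep indicators complete is to require an answer to all four questions before an indicator counts as defined. A minimal sketch (the field names and both example indicators are hypothetical, for illustration only):

    from dataclasses import dataclass

    @dataclass
    class Indicator:
        """An indicator answers: what changes, for whom, by how much, by when."""
        what: str      # what will change
        who: str       # for whom
        how_much: str  # by how much
        by_when: str   # by when
        kind: str      # "process" (tracks an activity) or "outcome"

    # Hypothetical examples: one outcome indicator, one process indicator.
    indicators = [
        Indicator(what="kindergarten readiness scores", who="enrolled children",
                  how_much="+10 percentage points", by_when="June 2014",
                  kind="outcome"),
        Indicator(what="reading-night sessions held", who="participating families",
                  how_much="24 sessions", by_when="end of school year",
                  kind="process"),
    ]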

  27. Level of impact

  28. Coalition Evaluation Questions • Are we meeting our members’ needs? • Do our work groups function well? • Have we improved community awareness? • Are we influencing policies & practices? • Are we building organizational/community capacity? • Are we building strategic partnerships? • Have we strengthened our base of support? • Are we reaching our priority audiences? • Which strategies are effective? • Are we making a difference?

  29. CDC’s Framework for Evaluation • Engage Stakeholders • Set Goals & Plan Programs • Focus the Evaluation Design • Choose Methods & Collect Data • Analyze Data & Interpret Results • Ensure Use & Share Lessons Learned

  30. Choose Methods and Collect Data • Collect enough data to be reliable, but consider the burden on participants • Consider existing data sources • Don’t try to measure everything • Use mixed methods: Qualitative & Quantitative

  31. Methods • Focus Groups • Interviews • Structured Observations • Document/Record Review • Case Studies • Surveys • Participant Assessments • Statistical Analysis of Program Data • Cost-Benefit Analysis
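The last method on the list, cost-benefit analysis, comes down to simple arithmetic once costs and benefits have been monetized. A minimal sketch with made-up numbers (all figures are assumptions for illustration):

    # Hypothetical figures for one program year.
    costs = 50_000.0      # staff, materials, space (assumed)
    benefits = 120_000.0  # e.g., estimated savings from reduced remediation (assumed)

    bcr = benefits / costs          # benefit-cost ratio
    net_benefit = benefits - costs

    print(f"Benefit-cost ratio: {bcr:.2f}")     # 2.40 -> $2.40 returned per $1 spent
    print(f"Net benefit: ${net_benefit:,.0f}")  # $70,000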

  32. CDC’s Framework for Evaluation • Engage Stakeholders • Set Goals & Plan Programs • Focus the Evaluation Design • Choose Methods & Collect Data • Analyze Data & Interpret Results • Ensure Use & Share Lessons Learned

  33. Analyze Data & Interpret Results • Organize and classify the data • Tabulate: counts, numbers, descriptive statistics • Identify themes • Stratify: look at data by variables/demographics • Make comparisons: pre-post, between groups • Present data in a clear format: narratives, charts, tables, graphs, maps
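Several of these steps (tabulate, stratify, pre-post comparison) are routine with standard tools. A minimal pandas sketch, assuming a hypothetical CSV of participant records with pre/post literacy scores and a site column (the file name and column names are assumptions):

    import pandas as pd

    # Assumed columns: participant_id, site, pre_score, post_score
    df = pd.read_csv("survey_results.csv")  # hypothetical file

    # Tabulate: counts and descriptive statistics
    print(df[["pre_score", "post_score"]].describe())

    # Stratify: look at the data by a demographic/site variable
    print(df.groupby("site")[["pre_score", "post_score"]].mean())

    # Compare pre vs. post: average change per participant
    df["change"] = df["post_score"] - df["pre_score"]
    print(f"Mean improvement: {df['change'].mean():.1f} points")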

  34. CDC’s Framework for Evaluation • Engage Stakeholders • Set Goals & Plan Programs • Focus the Evaluation Design • Choose Methods & Collect Data • Analyze Data & Interpret Results • Ensure Use & Share Lessons Learned

  35. Ensure Use & Share Lessons Learned • Recommendations • Preparation - Engage & Guide Stakeholders • Feedback • Follow-up • Dissemination

  36. The only man who behaves sensibly is my tailor; he takes my measurements anew every time he sees me, while all the rest go on with their old measurements and expect me to fit them. - George Bernard Shaw

  37. QUESTIONS?

  38. RESOURCES Full resource list on the Literacy Powerline website • University of Wisconsin-Extension: www.uwex.edu/ces/pdande • U.S. Dept. HHS, CDC - Strategy & Innovation: www.cdc.gov/eval/guide/CDCEvalManual.pdf • Annie E. Casey Foundation: www.aecf.org • Two reports by Organizational Research Services: A Practical Guide to Documenting Influence and Leverage; A Guide to Measuring Advocacy and Policy

  39. For more information… Literacy Powerline: www.literacypowerline.com • Apter & O’Connor Associates: www.apteroconnor.com • dianne@apteroconnor.com • cynthia@apteroconnor.com • 315-427-5747

  40. Research vs. Evaluation Research seeks to prove: • Investigator-controlled • Authoritative • Scientific method - isolate/control variables • Limited number of sources - accuracy • Facts - descriptions, associations, effects Evaluation seeks to improve: • Stakeholder-controlled • Collaborative • Incorporates variables - accounts for circumstances • Multiple sources - triangulation • Values - quality, value, importance
