

Maximizing Impact: Helping Funders Get Smart About Evaluation
Kien S. Lee, Association for the Study and Development of Community
438 N. Frederick Avenue, Suite 315, Gaithersburg, MD 20877
(301) 519-0722, ext. 108, kien@capablecommunity.com
Maine Philanthropy Center, October 8, 2008





Presentation Transcript


  1. Maximizing Impact: Helping Funders Get Smart About Evaluation. Kien S. Lee, Association for the Study and Development of Community, 438 N. Frederick Avenue, Suite 315, Gaithersburg, MD 20877, (301) 519-0722, ext. 108, kien@capablecommunity.com. Maine Philanthropy Center, October 8, 2008

  2. “I have to rewrite the evaluator’s report because it is too jargony and not useful. After reading it, I still have no idea how we are doing.” • “The evaluator told us that our program was not ready for evaluation.” • “I’m told an internal evaluation is not objective.” • “The funder had unrealistic expectations. There is no way we can measure these outcomes given the constraints.” • “We submit all this information to the funder; I’m not sure anyone actually reads it.” • “I want to evaluate a public education campaign. How much will it cost?”

  3. American Evaluation Association Guiding Principles (www.eval.org) • Systematic Inquiry • Competence • Integrity/Honesty • Respect for People • Responsibilities for General and Public Welfare

  4. Useful Terms to Know • Evaluability Assessments • Prospective Evaluations • Retrospective Evaluations • Formative, Process, and Implementation Evaluations • Summative, Outcome, and Impact Evaluations • Meta-Evaluations

  5. Readiness • Clear and realistic goals, objectives, and strategies • Clear understanding of how strategies will lead to anticipated outcomes • Capacity to collect the data needed • Commitment to learning • Resources to hire an evaluator

  6. Internal or External Evaluator? • Purpose • Roles in relation to other parts of the organization • Knowledge and skills • Existing systems • Resources

  7. What Do You Look For in an Evaluator? • Knowledge of Evaluation • Philosophical and Methodological Orientations • Commitment to Professional Development • Cultural Competency • Special Skills and Experiences Required for the Evaluation

  8. Multiple Hats Worn by an Evaluator • Conflict Resolution Manager • Learning Facilitator • Advocate for Social Justice • Evaluator • Educator

  9. What if you can’t afford a full-fledged evaluation? • Identify and monitor only those outcomes and indicators that are critical • Engage an external evaluator to review your logic model, measures, and/or instruments • Engage an external evaluator to double-check the quality of data and analyses • Engage an external evaluator to develop a monitoring system and train staff

  10. Ask the following questions over and over again: • What do I want to know? • Who else needs to be involved? • What are the strengths and limitations? • How will the findings be used? • When will the findings be needed? • What is the best way to communicate the findings?

  11. What do I want to know from the evaluation? • Merit or worth? • Ways to improve our work? • Compliance of grantees? • Contribute to the field? • Inform policies? See Patton, M. (1997). Utilization-Focused Evaluation. Thousand Oaks, CA: SAGE Publications; Mark, M., Henry, G., & Julnes, G. (2001). Evaluation: An Integrated Framework for Understanding, Guiding, and Improving Policies and Programs. San Francisco, CA: Jossey-Bass.

  12. Who else needs to be involved? Internal stakeholders: • Board and Executive Leadership • Staff. External stakeholders: • Grantees • Community Leaders • Partners

  13. What are the strengths and limitations of the evaluation? • What are the theories and assumptions underlying the evaluation? • What are the pros and cons of the evaluation design? • What can I say or not say about the findings and their implications?

  14. How do I plan to use the findings? • What decisions do I need to make based on the results? • What type of changes am I prepared to make? • What changes, if any, can be shared with some people and not with others?

  15. When do I need the findings? • What is the timeframe for making decisions? • When do my supervisor, board members, community leaders, policymakers, or others want to know about the evaluation findings?

  16. What is the best way to communicate the findings? Where will the evaluation findings be shared? • Staff meetings, press conferences, board meetings, public hearings, etc. Who is the most effective messenger for the findings? • The evaluator, you, your supervisor, or someone else

  17. Examples of different forms of communicating evaluation findings: • Short, written communications • Reports • Executive summaries • Newsletters, bulletins, briefs, and brochures • Verbal presentations • Video presentations • Posters • Working sessions • Photography • Cartoons • Poetry • Drama. See Preskill, H., Torres, R., & Piontek, M. (2004). Evaluation Strategies for Communication and Reporting. Thousand Oaks, CA: SAGE Publications.

  18. Logic Modeling • A graphic illustration of your theory of change • Makes explicit the underlying assumptions about the change process • Specifies connections between inputs, strategies, activities, immediate outcomes, intermediate outcomes, and long-term outcomes • Provides a process for reflection and improvement

  19. The logic model in non-jargon terms! • Provides the story line for your grantmaking and evaluation • Pushes you to ask and answer the question, “If you do A, then what do you expect to happen? If your answer is B, tell me why you think B will occur.” • A picture that lays out your investment hypothesis

  20. Assessing the Foundation’s Effectiveness and Impact • Start with the overall theory of change and logic model for the entire foundation • Then develop logic models for each program priority area • Then develop logic models for each program or initiative within a priority area • Check to ensure that each logic model supports the one “above” and “below” it
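The alignment check in the last bullet of slide 20 can be expressed as a simple data check. The sketch below is purely illustrative: the program names, outcome phrases, and the `misaligned` helper are invented for demonstration, not part of the original presentation.

```python
# Hypothetical sketch: verify that each program-level logic model
# supports the foundation-level model "above" it, by checking that every
# program's long-term outcomes appear among the foundation's outcomes.
# All names and outcomes below are invented for illustration.
foundation_model = {
    "intermediate_outcomes": {"stronger nonprofits", "broader civic participation"},
    "long_term_outcomes": {"healthier communities"},
}
program_models = {
    "Capacity-Building Program": {"long_term_outcomes": {"stronger nonprofits"}},
    "Civic Engagement Program": {"long_term_outcomes": {"broader civic participation"}},
}

def misaligned(foundation, programs):
    """Return, per program, any outcomes that feed nothing in the foundation model."""
    supported = foundation["intermediate_outcomes"] | foundation["long_term_outcomes"]
    return {name: model["long_term_outcomes"] - supported
            for name, model in programs.items()
            if model["long_term_outcomes"] - supported}

print(misaligned(foundation_model, program_models))  # {} means fully aligned
```

The same check works top-down as well: a foundation outcome that no program model feeds is a gap worth discussing in a reflection meeting.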

  21. Example of a Simple Logic Model: Inputs → Strategy #1 Activities and Strategy #2 Activities → Immediate Outcomes → Intermediate Outcomes → Long-term Outcomes, all operating within the surrounding Contextual Conditions

  22. Indicators and Benchmarks: road signs that you are moving in the right direction • Show the change or progress over a defined time period • Are directly related to your anticipated outcome • Can be a number or statistic that increases or decreases, or a first-time practice or procedure that marks a significant change

  23. In an Initiative with Multiple Grantees and Sites… • Develop uniform indicators across grantees that demonstrate the funder’s effectiveness • Establish these uniform indicators before grants are awarded • Build data requirements into the progress-reporting template • Provide technical assistance to grantees to track both the uniform indicators and indicators specific to their effort
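As a hypothetical illustration of slide 23 (the grantee names and figures below are invented), a shared progress-report template makes rolling up a uniform indicator across grantees mechanical:

```python
# Hypothetical sketch: roll up one uniform indicator ("residents served")
# reported by multiple grantees through a shared progress-report template.
# Grantee names and all figures are invented for illustration.
reports = [
    {"grantee": "Grantee A", "baseline": 120, "current": 180},
    {"grantee": "Grantee B", "baseline": 300, "current": 330},
    {"grantee": "Grantee C", "baseline": 75,  "current": 150},
]

def percent_change(baseline, current):
    """Change on the indicator relative to its baseline, as a percentage."""
    return 100.0 * (current - baseline) / baseline

# Per-grantee progress on the uniform indicator.
for r in reports:
    print(f'{r["grantee"]}: {percent_change(r["baseline"], r["current"]):+.0f}%')

# Initiative-wide progress, which speaks to the funder's effectiveness.
total_baseline = sum(r["baseline"] for r in reports)
total_current = sum(r["current"] for r in reports)
print(f"Initiative-wide: {percent_change(total_baseline, total_current):+.0f}%")
```

The key design point from the slide is that this roll-up only works because the indicator and its baseline were defined before the grants were awarded and collected on the same template.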

  24. Measuring Context • Rival explanations or counterfactuals • Qualitative description (converging evidence) • Statistical methods (multiple regression, hierarchical linear modeling)
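The multiple-regression option on slide 24 can be sketched minimally. The following is an illustrative simulation, not a real dataset: the outcome, the contextual covariate (unemployment rate), and all effect sizes are made up to show how regression separates a program effect from a contextual condition.

```python
import numpy as np

# Hypothetical illustration: estimate a program's association with an
# outcome while controlling for one contextual condition (here, a local
# unemployment rate). All numbers are simulated for demonstration.
rng = np.random.default_rng(0)
n = 200
in_program = rng.integers(0, 2, size=n)          # 1 = participated
unemployment = rng.normal(6.0, 1.5, size=n)      # contextual covariate
# Simulated outcome: program adds ~2 points; context subtracts ~0.5/point.
outcome = 50 + 2.0 * in_program - 0.5 * unemployment + rng.normal(0, 1, size=n)

# Multiple regression via ordinary least squares: design matrix with an
# intercept column, solved with numpy's least-squares routine.
X = np.column_stack([np.ones(n), in_program, unemployment])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
intercept, program_effect, context_effect = coef
print(f"estimated program effect: {program_effect:.2f}")  # near the simulated 2.0
print(f"estimated context effect: {context_effect:.2f}")  # near the simulated -0.5
```

Hierarchical linear modeling, the slide's other suggestion, extends this idea when grantees are nested within sites and context varies at more than one level.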

  25. How can we best learn from evaluation? • Conduct reflection meetings regularly • Be prepared for different scenarios • Set up appropriate structures and processes for engaging all the stakeholders (be creative!) • Communicate mid-course adjustments that resulted from learning • Identify promising practices and learning assets

  26. Why do some evaluation reports sit on the shelf? • The evaluation purpose is not connected to learning • The report is too technical and jargon-filled • The report does not address the learning questions • The analysis is weak, confusing, or inappropriate • The evaluation report is too late to be useful • The findings are deemed too controversial to discuss openly

  27. What should the evaluation report look like? Things to consider based on audience needs: • Knowledge of technical language • Format • Length • Style • Number and type of reports (e.g., interim and final; internal and external)

  28. Characteristics of a Useful Evaluation Report • Flows like the logic model • Jargon-free, well-written • Findings first, evidence second • Uses a variety of illustrations • Answers key questions • Provides recommendations for mid-course adjustments or future improvements • Includes an executive summary

  29. Don’t forget: part of the reflection should focus on the evaluation itself! • What would you have done differently given data and resource constraints? • How could the reflection and learning process be improved? • Did we give appropriate feedback to the evaluator?

  30. Key Take-aways • Determine the purpose of your evaluation and the best evaluator for the job • Engage the appropriate stakeholders early on • Plan for the reporting, communication of the findings, and learning process at the beginning of the evaluation • Assess the context for presenting the findings • Know and apply AEA’s guiding principles • Hire an external evaluator for certain tasks • Anything else?
