
Logic Models: How to Develop, Link to M&E and Adapt



Presentation Transcript


  1. Logic Models: How to Develop, Link to M&E and Adapt. Lesli Hoey, PhD Candidate, Cornell Department of City and Regional Planning. Evaluating Int'l Development Projects: One-Day Skills Building Workshop on M&E, Cornell International Institute for Food and Agriculture Development, November 5, 2011

  2. Outline • How to develop a logic model • Using logic models to design M&E • M&E across program phases • Linear vs. complex interventions

  3. Developing a Logic Model. Step 1: Purpose and use. Why are you developing a logic model? Who will use it, and how? Step 2: Involve others. Who should participate in creating the logic model? Step 3: Set the boundaries for the logic model. What will the logic model depict: a single, focused endeavor; a comprehensive initiative; a collaborative process? What level of detail is needed? Step 4: Understand the situation. What is the situation giving rise to the intervention? What do we know about the problem, audience, and context? Adapted from: Taylor-Powell and Henert, 2008
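To make the anatomy of a logic model concrete, here is a minimal Python sketch of the components the steps above produce. It is an illustration only, not part of the workshop materials; the field names and the water-quality entries are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A logic model as a simple record: resources feed activities,
    activities produce outputs, outputs lead to chained outcomes."""
    purpose: str                                     # Step 1: why, and for whom
    inputs: list[str] = field(default_factory=list)  # resources
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    short_term_outcomes: list[str] = field(default_factory=list)
    mid_term_outcomes: list[str] = field(default_factory=list)
    long_term_outcomes: list[str] = field(default_factory=list)
    assumptions: list[str] = field(default_factory=list)       # Step 4: context
    external_factors: list[str] = field(default_factory=list)  # Step 4: context

# Hypothetical example, loosely echoing the water quality project on slide 7.
model = LogicModel(
    purpose="Plan and evaluate a community water-quality program",
    inputs=["staff", "funding", "lab access"],
    activities=["train well owners", "test water samples"],
    outputs=["200 owners trained", "500 samples tested"],
    short_term_outcomes=["owners know contamination risks"],
    mid_term_outcomes=["owners treat or seal unsafe wells"],
    long_term_outcomes=["reduced waterborne illness"],
    assumptions=["owners will act on test results"],
    external_factors=["drought", "local regulations"],
)
```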

  4. Process Options • Whole group: everyone identifies resources, activities, participants and outcomes on post-it notes arranged on a wall; check for "if-then" relationships, edit duplicates, identify gaps, etc. (a minimal sketch of this check appears after this list). • Subgroups: small subgroups develop their own logic model of the program, and the whole group merges these into one. • Outcomes-first: participants bring a list of program outcomes; sort them into short- and long-term outcomes by target group; edit duplicates, identify gaps, etc.; discuss assumptions about the chain of outcomes and external factors; then link resources and activities. • Distance: use web-based systems, e-mail or other distance methods. • Subcommittee: a subcommittee creates the model and reviews it with others. Adapted from: Taylor-Powell and Henert, 2008
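The "if-then" check in the whole-group option can be stated precisely: every element should lead somewhere, and anything that leads nowhere is either a final outcome or a gap to discuss. A small sketch, with hypothetical links:

```python
# Hypothetical "if-then" links from the post-it exercise; each pair reads
# "if X happens, then Y should follow".
links = [
    ("train well owners", "200 owners trained"),
    ("200 owners trained", "owners know contamination risks"),
    ("owners know contamination risks", "owners treat unsafe wells"),
    ("test water samples", "500 samples tested"),  # no onward link: a gap?
]

sources = {cause for cause, effect in links}
effects = {effect for cause, effect in links}

print("starting points (inputs/activities):", sources - effects)
print("final outcomes or gaps to discuss:", effects - sources)
```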

  5. Logic Models & Evaluation • Helps us match the evaluation to the program. • Helps us know what and when to measure: are you interested in process and/or outcomes? • Helps us focus on key, important information: where will you spend limited evaluation resources? What do we really need to know? Source: Taylor-Powell and Henert, 2008

  6. Types of Evaluation Mapped Across the Logic Model. Needs/asset assessment: what are the characteristics, needs, and priorities of the target population? What are potential barriers/facilitators? What is most appropriate to do? Process evaluation: how is the program implemented? Are activities delivered as intended (fidelity of implementation)? Are participants being reached as intended? What are participant reactions? Outcome evaluation: to what extent are desired changes occurring? Are goals met? Who is benefiting or not benefiting, and how? What seems to work, and what does not? What are unintended outcomes? Impact evaluation: to what extent can changes be attributed to the program? What are the net effects? What are the final consequences? Is the program worth the resources it costs? Source: Taylor-Powell and Henert, 2008
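The slide's mapping can be summarized as a lookup from logic-model stage to evaluation type. The sketch below restates it as data; the stage labels are my shorthand, not taken from the slide.

```python
# Evaluation types keyed by the logic-model stage they interrogate.
# Questions are condensed from the slide above.
evaluation_map = {
    "situation": ("needs/asset assessment",
                  "What are the needs, priorities, barriers, facilitators?"),
    "activities/outputs": ("process evaluation",
                  "Are activities delivered and participants reached as intended?"),
    "outcomes": ("outcome evaluation",
                 "Are desired changes occurring? For whom? Any unintended outcomes?"),
    "impact": ("impact evaluation",
               "Can changes be attributed to the program, and is it worth the cost?"),
}

for stage, (etype, question) in evaluation_map.items():
    print(f"{stage:>18}: {etype} -- {question}")
```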

  7. Water Quality Project Example. [The slide showed a table pairing formative evaluation questions and summative evaluation questions with indicators for a water quality project.] Source: Taylor-Powell, 2002
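Although the table's contents are not recoverable here, an M&E matrix row of this kind typically ties a question to an indicator, a data source, and a collection frequency. The entries below are invented for illustration only, not taken from the original slide.

```python
from dataclasses import dataclass

@dataclass
class MatrixRow:
    """One row of a hypothetical M&E matrix for a water quality project."""
    question: str     # evaluation question
    phase: str        # "formative" or "summative"
    indicator: str    # what gets measured
    data_source: str  # where the measurement comes from
    frequency: str    # how often it is collected

rows = [
    MatrixRow("Are training sessions reaching well owners as planned?",
              "formative", "number of owners attending per session",
              "attendance sheets", "per session"),
    MatrixRow("Did well water quality improve after the program?",
              "summative", "% of tested wells meeting the bacterial standard",
              "lab test results", "annually"),
]
```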

  8. Program Phases and Evaluation (the emphasis shifts from formative toward summative as a program matures). Initiation: need dynamic, flexible, rapid feedback about implementation and process. Includes monitoring, post-only feedback, unstructured observation, and sharing of implementation experiences. Mostly qualitative. Development: focus on observation, assessment of change in key outcomes, and emerging consistency. Includes pre-post differences. Qualitative or quantitative. Mature: when a program is routinized and stable, compare outcomes with expectations, with performance in alternative programs, or with sites with no program. Includes experimental and quasi-experimental designs, and more structured, comparative qualitative approaches. Dissemination: focused on transferability, generalizability, or external validity. Measure consistency of outcomes across different settings, populations, or program variations. Source: Trochim, 2007
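For the development phase, "pre-post differences" can be as simple as the mean change in a paired measure. A minimal sketch with invented scores:

```python
from statistics import mean, stdev

# Hypothetical paired knowledge-test scores, before and after training.
pre  = [52, 48, 60, 55, 45, 58, 50, 62]
post = [61, 57, 66, 63, 50, 70, 58, 71]

changes = [after - before for before, after in zip(pre, post)]
print(f"mean pre-post change: {mean(changes):.1f} points")
print(f"std. dev. of change:  {stdev(changes):.1f} points")
# In the mature phase, this simple contrast would give way to an
# experimental or quasi-experimental design with a comparison group.
```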

  9. Three Ways of Conceptualizing and Mapping Theories of Change: (1) linear, Newtonian causality; (2) interdependent systems relationships; (3) complex nonlinear dynamics. Source: Patton, 2008

  10. Interdependent Systems Relationships. [The slide showed a diagram in which outputs from several departments (Dept 1 through Dept 4) feed jointly into shared short-term, mid-term, and long-term outcomes.] Adapted from Chapel, 2006, in Taylor-Powell and Henert, 2008

  11. Complex, Non-Linear Intervention. [The slide showed a diagram of interdependent factors feeding into effective advocacy: strong, high-capacity coalitions; timely, opportunistic lobbying and judicial engagement; national/grassroots coordination; a solid knowledge and research base; a disciplined, focused message and effective communications; and collaborating funders/strategic funding.] Source: Patton, 2008

  12. Conditions that challenge traditional model-testing evaluation (and call for adaptive management) • High innovation • Ongoing development • High uncertainty • Dynamic, rapid change • Emergence (difficult to plan and predict) • Systems change • Interdependence. Adapted from: Patton, 2008

  13. Ideal Type Evaluation Models. Adapted from: Patton, 2008

  14. Useful Resources. See the CIIFAD website for evaluation institutes and WMU (Western Michigan University). Visit the University of Wisconsin-Extension website. Look at these books: Bamberger, M., Rugh, J. and L. Mabry. 2011 (2nd ed.). RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints. Los Angeles: Sage. Patton, M.Q. 2008 (4th ed.). Utilization-Focused Evaluation. Los Angeles: Sage. Patton, M.Q. 2011. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: Guilford Press. Williams, B. and I. Imam. 2006. Systems Concepts in Evaluation: An Expert Anthology. Point Reyes, CA: EdgePress/AEA. World Bank. 2006. Conducting Quality Impact Evaluations Under Budget, Time and Data Constraints. Washington, DC: Author.

  15. References Cited. Patton, M.Q. 2008. "Evaluating the complex: Getting to maybe." PowerPoint presented in Oslo, Norway. Available online: aidontheedge.files.wordpress.com/2009/09/patton_oslo.ppt. Taylor-Powell, E. and E. Henert. 2008. "Developing a logic model: Teaching and training guide." Madison: University of Wisconsin-Extension. Trochim, W. 2007. "Evolutionary perspectives on evaluation: Theoretical and practical implications." Paper presented at the Colorado Evaluation Network.
