Logic Model: A Program Performance Framework

Presentation Transcript


  1. Logic Model: A Program Performance Framework Madison Workshop December 5, 2001 Ellen Taylor-Powell, Ph.D. Evaluation Specialist, UW-Extension, Cooperative Extension

  2. A New Era • What gets measured gets done • If you don’t measure results, you can’t tell success from failure • If you can’t see success, you can’t reward it • If you can’t reward success, you’re probably rewarding failure

  3. If you can’t see success, you can’t learn from it • If you can’t recognize failure, you can’t correct it • If you can demonstrate results, you can win public support Osborne and Gaebler, 1992 in MQ Patton, 1997:14

  4. Logic Model is… • Picture of a program • Graphic representation of the program “theory” or “action” – what it invests, what it does and what results • Logical chain of if-then relationships; if x occurs, then y will occur • Core of program planning and evaluation

  5. LOGIC • Reasonable; to be expected • MODEL • Represents reality, isn't reality

  6. Why Logic Models--Why the Hype? • Shows the difference between what we do and the impact we are having • Provides a common vocabulary • Focuses on quality and continuous improvement

  7. Logic Model: Origins • Private Sector: Total quality management • Public Sector: GPRA, performance budgeting • Non-Profit Sector: Outcomes measurement, e.g., United Way • International Arena: Results Framework of USAID, etc. • Evaluators: Evaluability assessment, Bennett hierarchy

  8. Logic Model SITUATION → INPUTS → OUTPUTS → OUTCOMES

  9. Everyday Logic Model HEADACHE → Get pills → Take pills → Feel better

  10. An Extension Example: Business Counseling Extension invests time and resources; a variety of educational activities are provided to business owners who participate; these owners gain knowledge and change practices, resulting in improved business performance

  11. LOGIC MODEL: Program Performance Framework
  SITUATION →
  INPUTS (What we invest): Staff, Volunteers, Time, Money, Materials, Equipment, Technology, Partners
  OUTPUTS - Activities (What we do): Workshops, Meetings, Counseling, Facilitation, Assessments, Product development, Media work, Recruitment, Training
  OUTPUTS - Participation (Who we reach): Participants, Customers, Citizens
  OUTCOMES-IMPACT - Short term (Learning: what the short-term results are): Awareness, Knowledge, Attitudes, Skills, Opinions, Aspirations, Motivations, Reactions
  OUTCOMES-IMPACT - Medium term (Action: what the medium-term results are): Behavior, Practice, Decisions, Policies, Social action
  OUTCOMES-IMPACT - Long-term (Conditions: what the ultimate impact is): Social, Economic, Civic, Environmental
  Plus: ASSUMPTIONS (1-4) and ENVIRONMENT (influential factors)

  12. Logical Linkages: A Series of If-Then Relationships (INPUTS → OUTPUT → OUTCOMES) IF the program invests time & money, THEN a resource inventory can be developed; IF the inventory is developed, THEN families will know what is available; IF families know what is available, THEN families will access services; IF families access services, THEN families will have their needs met
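As a hypothetical sketch (stage labels and wording taken from the slide), the if-then chain can be represented as an ordered list of stages and rendered mechanically, one IF-THEN sentence per adjacent pair:

```python
def if_then_statements(chain):
    """Render each adjacent pair of logic-model stages as an IF-THEN sentence."""
    return [
        f"IF {antecedent} THEN {consequent}"
        for (_, antecedent), (_, consequent) in zip(chain, chain[1:])
    ]

# The family-services chain from the slide, in order.
chain = [
    ("INPUTS", "program invests time & money"),
    ("OUTPUT", "resource inventory can be developed"),
    ("OUTCOME, short", "families will know what is available"),
    ("OUTCOME, medium", "families will access services"),
    ("OUTCOME, long-term", "families will have their needs met"),
]

for statement in if_then_statements(chain):
    print(statement)
```

Five stages yield four links, which matches the four IF-THEN pairs on the slide.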

  13. Activity (p 2-12) Completing the causal linkages

  14. LOGIC MODEL: Program Performance INPUTS (program investments: what we invest) → OUTPUTS (Activities: what we do; Participation: who we reach) → OUTCOMES (Short, Medium, Long-term: what results) SO WHAT??

  15. Outcomes vs. Activities BE OUTCOME DRIVEN, NOT ACTIVITY DRIVEN

  16. Activity (p 2-13) Program Performance Levels

  17. Situation • Situational analysis • Need • Asset • Problem analysis • Priority setting • Engaging others

  18. INPUTS: Staff, Money, Partners → OUTPUTS: Design parent ed curriculum; Provide 6 training sessions; Targeted parents attend → OUTCOMES: Parents increase knowledge of child dev; Parents learn new ways to discipline; Parents use improved parenting skills; Reduced rates of child abuse & neglect

  19. Activity (p 2-24) Which are the Outcomes-Impacts?

  20. Chain of Outcomes (SHORT → MEDIUM → LONG-TERM)
  • Seniors increase knowledge of food contamination risks → Practice safe cooling of food; food preparation guidelines → Lowered incidence of food-borne illness
  • Participants increase knowledge and skills in financial management → Establish financial goals, use spending plan → Reduced debt and increased savings
  • Community increases understanding of childcare needs → Residents and employers discuss options and implement a plan → Child care needs are met
  • Empty inner-city parking lot converted to community garden → Youth and adults learn gardening skills, nutrition, food preparation and mgt. → Money saved, nutrition improved, residents enjoy greater sense of community

  21. Focus of Outcomes
  • Individual (child, parent, client, resident): Child is prepared to enter school; teen uses savings/spending plan
  • Group (family, team, community group): Community group has inclusive membership policy; family increases its savings
  • Agency, organization: Communications are more open; agency adopted smoke-free policy
  • System: Family-serving agencies share resources
  • Community: Shared community responsibility has increased; youth are valued as contributing members

  22. How far out the outcome chain do we go? • What is logical? • What is realistic? • What is meaningful?

  23. WHICH OUTCOMES??? INPUTS: Staff, Money, Partners, Research → OUTPUTS: Develop parent ed curriculum; Deliver 6 interactive sessions; Targeted parents attend → OUTCOMES: Parents increase knowledge of child dev; Parents learn new ways to discipline; Parents use improved parenting skills; Reduced rates of child abuse & neglect

  24. Outcome of Interest? • Inherently valued outcome (Mohr, 1995): the higher-level outcome is immaterial, or we are willing to assume that a higher outcome will also be attained if we achieve the outcome of interest • Participant-valued outcome: participants experience a change or benefit that makes a real difference to them (United Way, 1999)

  25. Assumptions • Beliefs about the program • the participants • the way the program will operate • how resources, staff will be engaged • the theory of action

  26. Assumptions, cont. • Faulty assumptions are often the reason for poor results • Check and test assumptions • Identify potential barriers for each ‘if-then’ sequence

  27. Environment–Influential Factors • Extension program does not exist in a vacuum • Context of the program • politics, family circumstances, cultural milieu, demographics, economics, values, biophysical environment, policies, services • What affects the program over which you have little control?

  28. What does a logic model look like? • Graphic display of boxes and arrows • Any shape possible • Circular, dynamic • Relationships, linkages • Level of detail • simple • complex • Multiple models

  29. Compare examples

  30. Logic Model: Limitations • Represents reality, isn't reality • Programs are not linear • Focuses on expected outcomes • Challenge of causal attribution • Many factors influence outcomes • Doesn't address: Are we doing the right thing?

  31. Benefits • Brings detail to broad goals • Shows the 'chain of events' that links inputs to results • Builds understanding and consensus • Identifies gaps in logic and uncertain assumptions • Signals what to evaluate and when • Summarizes a complex program to communicate with externals

  32. Building a Logic Model • New program • Existing program • Team; organization • Involvement of others • Keep it dynamic

  33. Logic Model: WORKSHEET • Program Title • Situation/Problem • INPUTS • OUTPUTS (Activities, Participation) • OUTCOMES-IMPACT (Short, Medium, Long-term) • ASSUMPTIONS

  34. Check Your Logic Model • Are the outcomes really outcomes? • Is the longest-term outcome • meaningful? • logical? • realistic? • Are the connections between inputs, outputs, and outcomes clear and reasonable? • Does it represent research and best practice? • Does it represent the program’s purpose; response to the situation?

  35. PLANNING and EVALUATION both span the model: INPUTS (programmatic investments) → OUTPUTS (Activities, Participation) → OUTCOMES (Short, Medium, Long term)

  36. Providing Leadership for Program Evaluation: Where does evaluation fit? EVALUATION: What do you want to know? How will you know it?
  INPUTS: Staff, Money, Partners → OUTPUTS: Design parent ed curriculum (quality of curriculum); Provide 6 training sessions (# of sessions delivered); Targeted parents attend (# parents attending/session; which parents; % of parents) → OUTCOMES: Parents increase knowledge of child dev; Parents learn new ways to discipline (increase in knowledge/skill: post-session survey); Parents use improved parenting skills (actual use: follow-up phone interview); Reduced rates of child abuse & neglect (decrease in rates: agency records)

  37. Evaluation Plan • What do you want to know? • Indicators: how will you know it? • Source of information • Method to collect info • Schedule: when/where
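One way to keep those five planning columns together is a row per evaluation question; a minimal sketch, with illustrative row contents drawn from the parent-education example used elsewhere in the deck:

```python
# Each evaluation-plan row answers the five planning columns on the slide.
COLUMNS = ("question", "indicator", "source", "method", "schedule")

plan = [
    {
        "question": "Did parents increase knowledge of child development?",
        "indicator": "#, % of parents who increased knowledge",
        "source": "Parents",
        "method": "End-of-session survey",
        "schedule": "After each session",
    },
    {
        "question": "Are parents using improved parenting skills?",
        "indicator": "#, % of parents using improved skills",
        "source": "Parents",
        "method": "Follow-up phone interview",
        "schedule": "Six months after the program",
    },
]

def plan_is_complete(rows):
    """True when every row fills in all five columns."""
    return all(all(row.get(col) for col in COLUMNS) for row in rows)

print(plan_is_complete(plan))
```

A completeness check like this makes gaps visible before data collection starts: a row missing its method or schedule fails the check.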

  38. Evaluation Questions – What do you want to know?

  39. Indicators - How will you know it? • The evidence or measures that indicate what you wish to know or see: • often multiple indicators are necessary • may be quantitative or qualitative • culturally appropriate

  40. Logic model with indicators for Outputs and Outcomes
  • Output: Program implemented → Number of workshops held
  • Output: Targeted farmers attend → Number and percent of farmers attending
  • Outcome: Farmers learn → Number and percent who learned content
  • Outcome: Farmers practice new techniques → Number and percent who practice the recommendations
  • Outcome: Farm profitability increases → Number and percent showing farm profits increase; amount of increase

  41. INDICATORS: Examples (How would I know it?) OUTCOME → INDICATORS
  • Increased youth-adult partnerships → #, % of Boards with youth participation
  • Reduction in N and P application rates → #, % acres managed according to BMP guidelines; quality of conservation plan implementation
  • Improved family financial management → #, % with savings goal set; #, % with debt reduction goal set; #, % using spending plan; #, % maintaining emergency fund

  42. Evaluating Your Program: Indicators
  INPUTS: Staff, Money, Partners → OUTPUTS: Design parent ed curriculum (quality of curriculum); Provide 6 training sessions (# sessions fully delivered); Targeted parents attend (# and % parents who attended each session) → OUTCOMES: Parents increase knowledge of child dev (#, % parents who increased knowledge); Parents learn new ways to discipline (#, % parents who learned new ways); Parents use improved parenting skills (#, % parents using improved skills, specify skills); Reduced rates of child abuse & neglect (decrease in rates of abuse & neglect among these parents)
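The "#, %" indicators above are simply counts and proportions over participant records; a minimal sketch, with hypothetical field names and made-up follow-up data:

```python
def count_and_percent(records, predicate):
    """Return (count, percent) of records satisfying an indicator predicate."""
    n = sum(1 for r in records if predicate(r))
    pct = 100.0 * n / len(records) if records else 0.0
    return n, round(pct, 1)

# Hypothetical follow-up data for six parents in the training program.
parents = [
    {"sessions_attended": 6, "knowledge_gain": True, "uses_new_skills": True},
    {"sessions_attended": 5, "knowledge_gain": True, "uses_new_skills": False},
    {"sessions_attended": 6, "knowledge_gain": True, "uses_new_skills": True},
    {"sessions_attended": 2, "knowledge_gain": False, "uses_new_skills": False},
    {"sessions_attended": 6, "knowledge_gain": True, "uses_new_skills": True},
    {"sessions_attended": 4, "knowledge_gain": True, "uses_new_skills": True},
]

print(count_and_percent(parents, lambda p: p["knowledge_gain"]))   # short-term indicator
print(count_and_percent(parents, lambda p: p["uses_new_skills"]))  # medium-term indicator
```

The same helper serves any "#, %" indicator; only the predicate changes (e.g. `lambda p: p["sessions_attended"] >= 4` for an attendance output).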

  43. Source and Method of Data Collection • Source of information: participants, parents, teachers, key informants • Method of collecting the information: survey, interview, observation, end-of-program questionnaire, focus group, records

  44. Evaluation Plan, cont. • How will the data be analyzed and interpreted? • How will results be shared? To whom, how, when? • Who will do what, when, with what resources?

  45. Finally… • View evaluation as learning - integrate it into the way we work • Build evaluation in up front • Ask 'tough questions' • Make measurement meaningful • Be accountable to the highest professional standards
