A Tool to Support the Development & Evaluation of State Unit of Aging Programs & Projects - PowerPoint PPT Presentation

Presentation Transcript

Logic Modeling:

A Tool to Support the Development & Evaluation of State Unit of Aging Programs & Projects

John A. McLaughlin

[email protected]


My Aim Today

  • Orient you to a different way to think about conceptualizing and telling the performance story of your State Unit on Aging (SUA) programs and projects

  • Provide a simple tool for creating a functional picture of how your SUA works to achieve its aims

  • Offer some helpful hints for framing a useful performance measurement and evaluation strategy for your SUA.


Beliefs

  • Social Advocacy

    • Client/customer focus

      • The right to be part of a well run program

  • Program Staff Advocacy

    • Managing for Results

      • Nobody gets it right the first time out!


Themes You’ll Hear Today

  • GOOD MANAGEMENT

    • Relevance

    • Quality

    • Performance

  • Connections

  • Evidence


More Words

  • Goals -- Impacts

  • Objectives

    • Outcome -- changes

      • Short-term (proximal)

      • Intermediate (distal)

    • Supporting

      • Resources

      • Activities

      • Outputs: productivity and reach


PERFORMANCE MANAGEMENT TOOLS

PERFORMANCE MANAGEMENT

Performance management includes activities to ensure that goals are consistently being met in an effective and efficient manner. Performance management tools include logic models, performance measurement and program evaluation.

Performance Measurement

Helps you understand what level of performance is achieved by the program/project.

Program Evaluation

Helps you understand and explain why you’re seeing the program/project results.

Logic Model

Tool/framework that helps identify the program/project resources, activities, outputs, customers, and outcomes.


Logic Models as Recipes

Recipes have 3 essential components!

A good cook follows the recipe – program staff would do well to create & follow their recipe for success!


Logic Models as Maps

If you were going on a trip, what would be the first question you need to answer?

Then, what tool would you need?


Recipes & Maps are used for:

  • Planning

  • Communicating

  • Performance Measurement and Evaluation


The Logic Model


Level I Logic Model

RESOURCES / INPUTS: The ingredients you need to implement your program!

YOUR PROGRAM: What you do to achieve your long-term aims!

RESULTS / IMPACT: Why you are in business!


Level II Logic Model

Contextual Influences

Resources → Activities → Outputs → Customers → 1st Order Outcome → 2nd Order Outcome → Impact

Program’s Sphere of Influence

HOW / WHY


Understanding the Sphere of Influence

  • Ask your team to estimate their level of confidence that their program will lead to each outcome in the logic model.

    • The Strategic Impact

    • The Intermediate Outcomes

    • The Short-term

  • Identify Performance Partners!


Complex Effects Chain

Partners

Transparency

Shared Common Outcomes


Elements of Logic Models

  • Resources / Inputs: Programmatic investments available to support the program.

  • Objectives / Activities: Things you do – activities you plan to conduct in your program.

  • Outputs: Product or service delivery/implementation targets you aim to produce.

  • Customer: User of the products/services. Target audience the program is designed to reach.

  • Outcomes: Changes or benefits resulting from activities and outputs.

  • Outcome Structure

    • Short-term (K, S, A) – Changes in learning, knowledge, attitude, skills, understanding

    • Intermediate (Behavior) – Changes in behavior, practice or decisions

    • Long-term (Condition) – Changes in condition

  • External Influences: Factors that will influence change in the affected community.
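The element definitions above can be captured in a simple data structure. The following is an illustrative sketch only, not part of the presentation; all class and field names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch: one way to hold the logic model elements defined
# above. Names are hypothetical, not from any SUA system.
@dataclass
class Outcome:
    description: str
    term: str  # "short-term" (K, S, A), "intermediate" (behavior), or "long-term" (condition)

@dataclass
class LogicModel:
    resources: List[str] = field(default_factory=list)            # programmatic investments
    activities: List[str] = field(default_factory=list)           # things you do
    outputs: List[str] = field(default_factory=list)              # delivery/implementation targets
    customers: List[str] = field(default_factory=list)            # users of the products/services
    outcomes: List[Outcome] = field(default_factory=list)         # changes or benefits
    external_influences: List[str] = field(default_factory=list)  # factors influencing change

model = LogicModel(
    resources=["Staff", "Budget"],
    activities=["Training"],
    outputs=["Participants trained"],
    customers=["Community residents"],
    outcomes=[Outcome("Increased volunteering", "intermediate")],
    external_influences=["Demographic changes"],
)
```

Writing the elements down this way makes the later step of drawing connections between them a mechanical exercise.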


Outputs & Outcomes


Outputs & Outcomes

OUTCOME

OUTPUT


Volunteers

  • If the program is addressing a situation of low volunteer involvement in community affairs and the purpose of the program is to increase volunteering among community residents as a part of a larger community development initiative, then increased numbers of residents volunteering in community life would be an outcome. The outcome is expressed as a behavioral change.


Number or type of participants who attend; number of clients served.

  • If the purpose of the program is to increase use of a service by an underserved group, then numbers using the service would be an outcome. The outcome is not numbers attending or served; the outcome is expressed as use that indicates behavioral change.


Participant Satisfaction

  • For our purposes in education and outreach programming, client satisfaction may be necessary but is not sufficient. A participant may be satisfied with various aspects of the program (professionalism of staff, location, facility, timeliness, responsiveness of service, etc) but this does not mean that the person learned, benefited or his/her condition improved.


Training, Research, Producing

  • These are Outputs. They may be essential aspects that are necessary and make it possible for a group or community to change. But they do not represent benefits or changes in participants, and so are not outcomes. They lead to, or result in, outcomes, but in and of themselves they are outputs.


Steps in the Logic Model Process

  • Establish a stakeholder work group and collect documents.

  • Define the problem and context for the program or project.

  • Define the elements of the program in a table.

  • Develop a diagram and text describing logical relationships.

  • Verify the Logic Model with INTERNAL / EXTERNAL stakeholders.

  • Then use the Logic Model to identify and confirm performance measures, and in planning, conducting and reporting performance measurement and evaluation.


Step 1: Establish work group & collect documents & information.

  • Convene / consult a work group

    • provides different perspectives and knowledge

    • attempts agreement on program performance expectations

  • Review sources of program or project documentation

    • Strategic and operational plans

    • Budget requests

    • Current metrics

    • Past evaluations

  • Conduct interviews of appropriate staff


Step 2: Define problem program addresses & context.

The Context: Drivers of Success & Constraints on Success

Factors leading to the Problem: 1, 2, 3* (*your niche)

The Problem the Program Addresses → The Program


Step 3: Define elements of program or project in a table.

Table columns: Resources/Inputs | Activities | Outputs | Customers Reached | Short-term Outcomes (Change in Attitude) | Intermediate Outcomes (Change in Behavior) | Long-term Outcomes (Change in Condition)

External Influences:

HOW → WHO → WHAT & WHY


Step 4: Develop a diagram & text describing logical relationships.

1.0 We use these resources…
2.0 For these activities…
3.0 To produce these outputs…
4.0 So that the customers can change in these ways…
5.0 Which leads to these outcomes…
6.0 Leading to these results!

Draw arrows to indicate/link causal relationships between logic model elements.

Work from both directions (right-to-left and left-to-right).


Two Important Rules to Follow

For every action identified in the Logic Model, there must be an output that connects to an outcome through a specific customer.

OR

An action must produce an output that becomes a key input to another activity.

THINK CONNECTIONS!
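The two rules above can be checked mechanically once the connections are written down. This is a hypothetical sketch; the structure and all names are assumptions for illustration, not from the presentation.

```python
# Hypothetical check of the two rules above. Each activity maps to an
# output, plus either a (customer, outcome) pair or another activity
# that consumes the output as a key input. All names are illustrative.
links = {
    "Training": {"output": "Builders trained",
                 "customer": "Developers/Builders",
                 "outcome": "Changed building practice"},
    "Materials Development": {"output": "Materials ready",
                              "feeds_activity": "Training"},
}

def follows_rules(activity: str) -> bool:
    link = links[activity]
    # Rule 1: the output connects to an outcome through a specific customer.
    rule1 = "customer" in link and "outcome" in link
    # Rule 2: the output becomes a key input to another activity.
    rule2 = link.get("feeds_activity") in links
    return rule1 or rule2

# Every activity must satisfy one rule or the other.
assert all(follows_rules(a) for a in links)
```

An activity that satisfies neither rule is a dangling action: effort in the model with no path to a customer, outcome, or downstream activity.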


Logic Modeling Exercise 1

Brief application of logic modeling using a United Way example


Logic Modeling Exercise

GOAL: Provide an opportunity for participants to apply the principles and practices of logic modeling in an interactive setting.

INSTRUCTIONS:

  • The group will be given a set of index cards that contain words or statements that answer the list of questions on the next slide.

  • As a group, review the questions and use the index cards to map out the logic of our case study program on your flip chart paper.

  • When the cards are placed/glued on the paper in the correct order, draw lines connecting the cards to show the logic relationships.

  • When you have completed your logic model, the cards will be ordered so that they describe the program logic and its underlying assumptions (boxes and connecting arrows).

  • Check your logic using if, then and how, why statements. When you have completed this exercise be prepared to report out to the larger group. REMEMBER THE RULES!


Questions to Guide Modeling

  • What are the essential resources we need to implement the program?

  • What programs / activities do we have to implement with these people to achieve our results?

  • What are the outputs of our programs?

  • Who / what do we need to reach to achieve these results?

  • What are the short-term and intermediate changes that will enable us to realize our strategic results?

  • What are the strategic results / long-term environmental outcomes we are aiming for?

  • What external influences to the program context do we have to be aware of?


Worksheet: Simple Logic Model Diagram

Resources → Activities → Outputs (representative) → Target Audience → Short-term Outcomes → Intermediate Outcomes → Long-Term Outcomes

EXTERNAL INFLUENCES


“Z” Logic: Supplier-Customer Relationship

Unpacking supports more focused Performance Measurement & thus more useful evaluation, as well as better understanding & communication about how the “Program” is supposed to work!



LEVEL I LOGIC MODEL

Resources: EPA, State, Local, Private
Program: Improving Water Quality Training Program
Impact: Clean / Safe / Swimmable / Fishable Water

LEVEL II LOGIC MODEL

Resources: EPA, State, Local, Private
Activities: Materials Development, Recruitment, Training, Technical Assistance, Website
Customers: Developers / Builders
Short-term Outcomes: Increased awareness of harmful effects; Increased awareness of new technologies and incentives
Intermediate Outcomes: Developers & builders acquire new technologies & change practice
Long-term Outcomes: Reduction in NPS pollutants in waterways; More Fish; Clean Beaches; Healthier Wetlands
Impact: Clean / Safe / Swimmable / Fishable Water


LEVEL III LOGIC MODEL

Resources: EPA, State, Local, Private
Activities: Materials Development, Recruitment, Training, Technical Assistance, Website
Outputs: Materials Ready; Trainees Ready; Developers/Builders Trained; Trainees Receive TA; Trainees/Others aware of/using new information
Short-term Outcomes: Increased awareness of harmful effects; Increased awareness of new technologies and incentives
Intermediate Outcomes: Developers and builders acquire new technologies and change practice
Long-term Outcomes: Reduction in NPS pollutants in waterways
Impact: Healthier Wetlands; More Fish; Clean Beaches; Clean / Safe / Swimmable / Fishable Water



Strategic Plan Check: Goals

  • Is the goal statement outcome oriented?

  • Does it specify the expected strategic change / impact for a specific target group (older persons & disabled)?

  • What evidence is available that this impact / change is important (relevance)? Are there existing needs data?

  • What specific roles, if any, do partners (internal & external) play in the success of this impact?

  • Are there missing Goals to enable the mission / vision to be realized?

  • What concerns you most about this goal, right now?


Strategic Plan Check: Objectives

  • Is the objective outcome oriented?

  • Does it clearly specify the anticipated change for a specific target group & why they need to be changed?

  • Does the change relate to the goal? Will success with this objective lead to success with the goal? (QUALITY)

  • What evidence is available that this change is important? Are there existing needs data?

  • What specific roles, if any, do partners (internal & external) play in the success of this objective?

  • Are there missing Objectives to enable the goal to be realized?

  • What concerns you most about this objective, right now?


Strategic Plan Check: Strategies

  • Is there a reasonable degree of confidence that the strategy will result in achievement of a specific outcome for a specific group?

  • What evidence is available that this strategy is the right strategy – in comparison to others – to achieve the outcome that is specified?

  • What specific roles, if any, do partners (internal & external) play in the success of this strategy?

  • Considering the strategy you’ve adopted, do you have sufficient resources on hand or available to actualize the strategy?

  • Are there missing strategies to enable the objectives to be realized?

  • What concerns you most about this strategy, right now?


Scenario Checking: What if’s!

  • Select several external forces & imagine related changes which might influence the SUA, e.g., change in regulations, demographic changes, etc. Scanning the environment for key characteristics often suggests potential changes that might affect the SUA, as does sharing the plan with stakeholders!

  • For each change in a force, discuss 3 different future SUA scenarios (including best case, worst case, & reasonable case) which might arise with the SUA as a result of each change. Reviewing the worst-case scenario often provokes strong motivation to change the SUA – forming partnerships, changing strategy.

  • Conduct likelihood / Impact assessment on each external influence.


Scenario Checking: What if’s!

  • Select the most likely external changes to affect the SUA over the next 3-5 years, and identify the most reasonable strategies the SUA can undertake to respond to each change. Suggest what the SUA might do, or potential strategies, in each of the 3 scenarios to respond to each change.

  • This process should be repeated for each element of the Logic Model

    • Program structure – Resources, Activities, Outputs

    • Outcome structure – Short-term, Intermediate, Strategic

  • REMEMBER – “NOBODY GETS IT RIGHT THE FIRST TIME OUT!”


Logic Modeling Exercise 2

Brief application of logic modeling focusing on a typical SUA program


Logic Modeling Exercise

GOAL: Provide an opportunity for participants to apply the principles & practices of logic modeling in an interactive setting.

INSTRUCTIONS:

  • Participants will identify 1 SUA program (e.g., community awareness, home delivered or congregate meals, education) to Logic Model as it operates currently.

  • Group will construct a Level I & Level II Logic Model.

  • After constructing the Models and checking using if, then and how, why questions, the participants should discuss how they might tweak the Model to address Choice.

  • Participants will be prepared to present their Models to the whole group.


Benefits of Logic Modeling

  • Communicates the performance story of the program or project.

  • Focuses attention on the most important connections between actions and results.

  • Builds a common understanding among staff and with stakeholders.

  • Helps staff “manage for results” and informs program design.

  • Finds “gaps” in the logic of a program and works to resolve them.


Logic Modeling Benefits

Kellogg, 1998


The Real Value

  • Most of the value in a logic model is in the process of creating, validating, and modifying the model … The clarity in thinking that occurs from building the model and the depth and breadth of those involved are critical to the overall success of the process as well as the program.

Adapted from W.K. Kellogg Foundation Handbook, 1998




Orientations for Performance Measurement & Evaluation

PERFORMANCE MEASUREMENT

  • Accountability, description

    • What objectives/outcomes have been accomplished at what levels?

      PROGRAM EVALUATION

  • Learning, Program Improvement, Defense

    • What factors, internally and/or externally influenced my performance? (Retrospective)

    • What effect will this level of performance have on future performance if I don’t do something? (Prospective)

    • What roles (+/-) did context play in my performance?


Key Questions Grantees Need to Answer About Their Programs

  • What am I doing, with whom, to whom/what? (effort)

  • How well am I doing it? (quality)

    • Customer Feedback

    • Peer Review for Technical Quality

    • User Review for Social Validity

  • Is anybody (anything) better off? (effect)

    • Short-term

    • Long-term

  • What role, if any, did my program play in the results?

  • What role, if any, did the context play?

  • Were there any unintended outcomes?

  • What will happen if I don’t do something?

Performance Measurement

Program Evaluation


Hierarchy of Performance Measurement Data

Matching Levels of Performance Information: Program Logic Elements & the Performance Measurement Hierarchy

7. End results: Measures of impact on overall problem, ultimate goals, side effects, social and economic consequences
6. Practice and behavior change: Measures of adoption of new practices and behavior over time
5. Knowledge, attitude, and skill changes: Measures of individual and group changes in knowledge, attitude, and skills
4. Reactions: What participants and clients say about the program; satisfaction; interest; strengths; weaknesses
3. Participation: The characteristics of program participants and clients; numbers, nature of involvement; background
2. Activities: Implementation data on what the program actually offers or does
1. Resources: Resources expended; number and types of staff involved; time extended


Two Questions

  • What is the right Outcome?

    • Short-term

    • Intermediate

    • Strategic

  • Am I getting at the right Outcome, the right way?

    • Efficiency

    • Effectiveness




Defending Your Impact Claim

  • Did we observe a change in the anticipated outcome(s) as seen in your performance measures?

  • Can we connect any element of our program (what we did) to that change using your performance measures?

  • Are there any rival explanations (usually in the context)?


Definitions:

Performance Measurement:

The ongoing monitoring & reporting of program progress & accomplishments, using pre-selected performance measures.

Performance measure – a metric used to gauge program or project performance.

Indicators – measures, usually quantitative, that provide information on program performance and evidence of a change in the “state or condition” in the system.


Definitions:

Program Evaluation:

A systematic study that uses measurement & analysis to answer specific questions about how well a program is working to achieve its outcomes & why.


Comparing the STANDARD / DESIRED LEVEL OF PROGRAM PERFORMANCE with the ACTUAL LEVEL OF PROGRAM PERFORMANCE yields a + DISCREPANCY (actual exceeds the standard) or a - DISCREPANCY (actual falls short of the standard).


Example

  • Standard: 95% of targeted community-based treatment facilities will adopt BMPs by June 2006.

  • Performance: 65% of targeted community-based treatment facilities adopt BMPs by June 2006.

  • Managers’ Question: Should we act and if so, what should we do?

  • Prospective Evaluation Question: What effect will this observed level of performance have on predicted longer-term impacts?

  • Retrospective Evaluation Question: What programmatic or contextual factors influenced the observed level of performance?
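The standard-versus-actual comparison in this example is simple arithmetic. A minimal sketch, using only the figures from the slide:

```python
# Discrepancy check for the example above: adoption of BMPs by targeted
# community-based treatment facilities (figures from the slide).
standard = 0.95  # desired: 95% adoption by June 2006
actual = 0.65    # observed: 65% adoption

discrepancy = actual - standard          # negative => performance fell short
assert round(discrepancy, 2) == -0.30    # a negative (-) discrepancy

# A negative discrepancy triggers the managers' question: should we act,
# and if so, what should we do?
if discrepancy < 0:
    print(f"Performance fell short of the standard by {abs(round(discrepancy, 2)):.0%}")
```

The sign of the discrepancy then steers which evaluation question (prospective or retrospective) is most pressing.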


The Logic Model & Evaluation

Resources/Inputs → Activities → Outputs → Customers → Short-term outcome → Intermediate outcome → Longer term outcome (STRATEGIC AIM)

PROGRAM → RESULTS FROM PROGRAM

HOW / WHY

EXTERNAL CONDITIONS INFLUENCING PERFORMANCE (+/-)


Assessing Strength of Evaluation Design for Impact

  • Is the population representing the counterfactual equivalent in all pertinent respects to the program population before that population is exposed to the intervention? (selection bias)

  • Is the intervention the only force that could cause systematic differences between the 2 populations once exposure begins?

  • Is the full force of the intervention applied to the program population, and is none applied to the counterfactual?

    • Implementation evaluation

    • Independence


In the End, Logic Models:

Enable planners to:

  • Develop a more convincing, plausible argument RE how their program is supposed to work to achieve their outcomes & communicate this to funding agencies & other stakeholders.

  • Focus their PM/PE on the right elements of performance to enable program improvement & the estimation of causal relationships between & among elements.

  • Be better positioned to present & defend their claims about their program performance to external stakeholders.

