
Measuring Program Results for Abstinence Education Grantees

Vijaya ChannahSorah, Ph.D.

Independent Consultant, Results Management



  • What do we mean by measuring results?

    > Measuring Outputs and Outcomes

    > Program Evaluation

    > Logic Modeling (as a tool)

  • The purpose is to improve program outcomes and to achieve our vision . . . to give our youth knowledge and skills to live a rich, grounded life . . .

Why are Results Important?

  • Enable you to know what works well (in what populations, under which conditions)

  • Enable you to take corrective actions

  • Other states/projects can implement effective program elements and test them in their own area

Training Approach


Walk systematically through:

  • Performance Measurement (outputs, outcomes, efficiency, etc.)

  • Program Evaluation

  • Logic Modeling

  • Tying Everything Together

Performance Measurement


  • Definitions: outputs, outcomes, etc.

  • Setting targets

  • How different from research / program evaluation

Performance Measurement: Definitions

  • Outcomes: ultimate purpose of the program

  • Outputs: intermediate results of activities

  • Activities: your processes

  • Inputs: resources

  • Efficiency: outcomes or outputs over costs (usually)

Performance Measurement: Definitions / Examples

Outcome example:

  • Decrease the rate of births to unmarried teenage girls ages 15 to 19. (2002 baseline was 35.4%)

Output example:

  • Number of teachers trained in abstinence education (by 200X).

  • Activities: Developing curriculum, conducting training, etc.

  • Inputs: Funds, x number of abstinence experts, ideas, etc.

  • Efficiency: Number of teachers trained over training dollars (a PROCESS Efficiency Measure).
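The process efficiency measure above is simply an output divided by the resources it consumed. A minimal sketch, with hypothetical figures (120 teachers, a $60,000 training budget) that are not from the presentation:

```python
# Process efficiency: an output (teachers trained) divided by the resources
# consumed (training dollars). All figures below are hypothetical.
def process_efficiency(teachers_trained: int, training_dollars: float) -> float:
    """Teachers trained per training dollar spent."""
    if training_dollars <= 0:
        raise ValueError("training_dollars must be positive")
    return teachers_trained / training_dollars

# 120 teachers trained on a $60,000 budget -> 0.002 teachers per dollar,
# i.e., $500 per teacher trained.
print(process_efficiency(120, 60_000))
```

Tracking this ratio across reporting periods shows whether the training process is becoming more or less efficient over time.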

Challenges in Performance Measurement

  • Defining “efficiency” for human services programs (e.g., abstinence education)

  • Defining outcomes (e.g., family violence prevention)

  • Timely, reliable data

Performance Measurement / Program Evaluation: Definitions

Difference between performance measurement & program evaluation

Performance measurement shows:

  • Trends over time

  • Comparison between actual and desired results/outcomes

Program evaluation shows:

  • Outcomes relative to what they would have been in absence of the program

  • Program’s causal contribution to the observed outcome

Program Evaluation


  • Definition:

    Program evaluations are systematic, scientifically based studies conducted to assess the impact of a program.

    A program evaluation typically applies scientific techniques to determine how confident we can be that a particular OUTCOME was CAUSED BY the intervention(s).

    Evaluations examine the achievement of program objectives in the context of environmental and external factors (and take them into account).

Program Evaluation

Basic Approaches:

  • Randomized controlled trials


    O1 X O2

    - - - - - - - - - - - -

    O1 O2


    Random assignment to control and experimental groups
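Random assignment can be sketched as a shuffle-and-split, a minimal illustration rather than the procedure an evaluator would necessarily use:

```python
import random

# Random assignment: shuffle the participant pool, then split it in half.
# A fixed seed makes the assignment reproducible for auditing.
def random_assignment(participants, seed=None):
    pool = list(participants)
    random.Random(seed).shuffle(pool)
    half = len(pool) // 2
    return pool[:half], pool[half:]  # (control group, experimental group)

control, experimental = random_assignment(range(100), seed=42)
assert len(control) == len(experimental) == 50
assert set(control).isdisjoint(experimental)
```

Because assignment is random, the two groups should be comparable on average, which is what licenses the causal comparison O1 X O2 versus O1 O2.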

Program Evaluation

Basic Approaches:

  • Quasi experimental design


    O1 X O2

    - - - - - - - - - - - -

    O1 O2


    No random assignment to control and experimental groups

Program Evaluation

Basic Approaches:

  • Longitudinal quasi experimental design


    O1 X O2 X O3 X O4


    Observations and treatments (interventions) over time

Program Evaluation

Basic Approaches:

  • Single-group pre-/post-design


    O1 X O2


    No comparison/control group. Not recommended. Conclusions about causality will be uncertain.
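The value of the comparison group in the two-group designs above can be shown with simple arithmetic. All scores here are hypothetical: the effect estimate is the experimental group's pre-to-post change minus the control group's change (a difference-in-differences):

```python
# Effect estimate for a two-group pre/post design (O1 X O2 over O1 O2):
# the experimental group's change minus the control group's change.
def effect_estimate(exp_pre, exp_post, ctl_pre, ctl_post):
    return (exp_post - exp_pre) - (ctl_post - ctl_pre)

# The experimental group improved 10 points, but the control group improved
# 4 on its own, so only 6 points are attributable to the intervention.
print(effect_estimate(50, 60, 50, 54))  # 6
```

In a single-group design there is no control change to subtract, which is exactly why its causal conclusions remain uncertain.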

Program Evaluation

Key things to look for (and discuss with evaluator):

  • Type and rigor of study design

  • Hypotheses addressed?

Program Evaluation

Key things to look for (continued):

  • Timing between pre- and post-tests (if applicable)

  • Data collected & methods of collection

  • Frequency of data collection

  • Demographics, external factors collected?

  • Internal/external validity of study

Program Evaluation

Using milestones/interim results to take corrective actions

  • Ensure evaluation, data gathering, and outcomes measurement allow for interim reporting

  • Find out about the data lag (and implications)

  • Determine how interim results can be interpreted (will you know what to change about the program elements/activities?)

Program Evaluation


  • Consider evaluating relatively small chunks of program aspects (you may not be able to conduct an evaluation of the entire project)

  • Build use of interim results into your workplan

Program Evaluation

Reminders (continued):

  • Tap into partnerships: are existing data available (or is information available) that might be useful?

  • Check in “early and often” with the third-party evaluator; stay in close contact

  • Design the formats for evaluation reports (interim and final) at the start

Program Evaluation

Awareness of analysis techniques:

  • Multiple regression

  • Non-response analysis

  • Correlation (is NOT causality!)
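To make the last point concrete: Pearson's r is easy to compute, but a large r by itself establishes association, not causation. The function below is the standard textbook formula, not code from the presentation:

```python
from statistics import mean

# Pearson correlation coefficient for two equal-length samples.
def pearson_r(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]      # ys moves in lockstep with xs
print(pearson_r(xs, ys))   # 1.0 -- perfect correlation, still not causality
```

Only a design with a comparison group (or an analysis that controls for confounders, such as multiple regression) supports causal claims.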

Program Evaluation

Feedback loop

  • Learn from the performance and program evaluation process and results

  • Incorporate program evaluation into the logic model (discussed next)

Logic Models


A diagram, chart, or picture of all the major elements of your entire project

The idea is to make it user-friendly to you and your group, and applicable to your purpose

You will get basic building blocks here, and can then tailor the logic model to your needs and preferred ways of thinking

Overview of Logic Models

  • A logic model tracks how we get from our challenges to our solutions and desired long-term outcomes …











Logic Model Defined

What is a Logic Model?

  • A graphic representation of a program.

  • It shows what the program is designed to accomplish, including the services it delivers, expected results of those services, and the linkages between services and program goals.

Logic Models

…can go both ways

X ---------> Y (forward, e.g., how can we make a raft float?)

X <--------- Y (backward, e.g., why did the raft sink?)

Logic Model Uses

Use for program:

  • Design

  • Budgeting

  • Implementation

  • Evaluation

  • Communication

  • Marketing

  • Workplanning

    So logic models can be used to help plan and manage the whole program

Logic Model Construction Process

Brainstorming the Draft Model:

  • Establish the scope and context

  • Determine challenge(s), outcomes, inputs, activities, outputs, and measurement components

Logic Model Construction Process (continued)

  • Create model draft

  • Express relationships among/between key components

  • Determine evaluation needs/points using dotted line arrows (solid arrows show known relationships)

Logic Model Construction Process

Brainstorming during Model Development:

  • Use flipcharts and colored post-its

  • Create model draft step by step

  • No idea is “wrong”

  • Think creatively!

Logic Model Development

1. Identify CHALLENGE/Social ill: What do we want to improve in the population?

  • Out-of-wedlock births

  • Relationships before marriage

  • Diseases (STDs)

    Challenges are often expressed as statements of fact, based on empirical data/statistics.

    There may be multiple challenges addressed by a single program.

Logic Model Development

2. Identify LONG-TERM OUTCOMES: The ultimate end goals of your program, how your service population will look after your interventions have taken place

  • Decrease out-of-wedlock births

  • Increase proportion of abstinent youths

  • Decrease preventable disease (STDs)

    Long-term outcomes/goals (as well as all other outcomes) are usually expressed as changes: you will use words such as “improved,” “increased,” “decreased,” etc.

    Ultimate long-term outcomes/goals are sometimes “pie in the sky” or utopian.

Logic Model Development

The remainder of the logic model elaborates how we get from the CHALLENGE to the LONG-TERM OUTCOMES.











Logic Model Development

3. Identify Inputs/Resources (personnel, funds, laws/regulations, creative ideas, etc.)

4. Identify Activities:

  • Train the trainer

  • Delivering abstinence education in [churches, community centers, schools]

Logic Model Development

5. Identify Outputs, such as number of trainers trained, number of training courses developed and administered.

6. Identify/develop Key Outcomes and Measures:

  • Decrease the rate of births to unmarried teenage girls ages 15-19 (35.4% in 2002) Target: 35% in 2003

  • Decrease the proportion of youth ages 15-19 who have engaged in sexual intercourse (46.7% in 2003) Target: 45.5% for 2004

    … and set Targets
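Checking an actual result against its target can be sketched as below. The threshold rule is an assumption for illustration; the figures echo the 35% target for 2003 above:

```python
# For a "decrease" measure, the actual value must come in at or below target;
# for an "increase" measure, at or above. This threshold rule is illustrative.
def met_target(actual: float, target: float, decrease: bool = True) -> bool:
    return actual <= target if decrease else actual >= target

print(met_target(34.8, 35.0))  # True: a 34.8% rate beats the 35% target
print(met_target(35.6, 35.0))  # False: target missed; consider corrective action
```

Running this check each reporting period is the "comparison between actual and desired results" that performance measurement provides.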

Logic Model Development

7. Determine where program evaluation/research needs to take place (and depict arrows accordingly):

- - - - - - > = plausible causal relationship (or desired effects/results)

---------- = known causal relationship (based on scientific research/program evaluation)
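One way to make this arrow convention concrete is to store the logic model as a directed graph whose edges carry a line style. The model contents below are illustrative, loosely drawn from earlier slides, not a prescribed template:

```python
# Logic model as a directed graph: solid edges are known causal relationships,
# dotted edges are plausible ones that still need program evaluation.
links = [
    ("train the trainer", "teachers trained", "solid"),          # known
    ("teachers trained", "decrease teen birth rate", "dotted"),  # to evaluate
]

# The evaluation plan falls out of the model: evaluate every dotted edge.
evaluation_needed = [(src, dst) for src, dst, style in links if style == "dotted"]
print(evaluation_needed)  # [('teachers trained', 'decrease teen birth rate')]
```

As evaluations confirm a relationship, its edge changes from dotted to solid, which is one way the logic model stays a living document.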

Logic Model Development


  • Logic modeling is a continuous (not static) process

  • Incorporate activities/processes into more detailed project workplans

  • Do not restrict your thinking - jot down ideas for the workplan or other areas, items, etc. as you think of them while brainstorming

  • Use the abstinence-until-marriage (Abstinence Education) logic model template as a starting point, and to assist thinking...

Logic Model Development

Complementary Tools:

  • Workplans

  • Strategic plans

  • Flowcharts

  • Process diagrams

  • Related logic models

  • Performance budgets containing “global” Agency performance measure information

LINKING – How does all this relate?

  • Performance measurement & reporting (represented in the boxes of the logic model): focus is on outcomes/ultimate results

  • Program evaluation (represented in the arrows of the logic model): focus is on impacts

  • Logic Modeling (gives a clear picture of what outcome measures are needed, what program evaluation is needed, etc.)

In Conclusion

  • Need to focus on producing results

  • We have lots of information! How do we assemble it to help us manage?

  • Use logic modeling to track and improve performance (at federal, state, and grantee levels)

Questions / Discussion

  • Questions on any aspect of measuring program results?

  • Is it clear how all aspects relate to one another and how logic modeling can be used as a powerful tool?

Go Forth, Produce Results, and Measure! . . . for our youth

Questions & Answers

  • You may submit questions pertaining to today’s webcast until 5:00 p.m. EDT, Wednesday, June 14, 2006, to the following address:

    [email protected]

  • Answers will be posted as soon as they are available.
