
Impact Evaluation: the broad and lasting effects


Presentation Transcript


  1. Impact Evaluation: the broad and lasting effects

  2. Will the magic last? • Even if an innovative educational program, a “magic bullet”, has “worked”, there is no guarantee it will continue to “work” or even be used in the future. • What kinds of evidence are needed to know that a Program makes a long-term impact?

  3. Will the magic last? • Long term follow-up with the participants • Strength of the effects • Administrative stability / turnover • Administrative support / demands • Resources to continue the Program at all • Resources to continue the Program in its most powerful form

  4. CIPPI Model • Context • Input • Process • Product • Impact • The goal is to make INFORMED judgments about Program merit or value.

  5. The Nature of Evaluation Tasks • The goal is to make INFORMED judgments about Program merit or value. • Making judgments requires a thorough understanding of questions such as: • How long do the effects last? • What caused the effects in the first place? • Can the Program continue?

  6. Impact Evaluation • The central focus of an Impact Evaluation extends beyond Product Evaluation (an emphasis on the accomplishment of specific, short-term objectives) to the long-term Impact of the Program. • If the Product Evaluation demonstrated that the Program is meeting its objectives, Impact Evaluation may focus on higher-order questions.

  7. Higher Order Questions • Lasting Impact • Breadth and Depth of the Impact • Causal Inferences • Attributing Change to Training Programs • Community Impact • Sustainability

  8. Lasting Impact • Is it reasonable to expect the Program effects to persist past Program participation? • Are there resources available to include follow-up in the evaluation plan? • This kind of study may take place after short-term studies have shown Program effects.

  9. Lasting Impact • Have the participants been followed after completion of the Program? • How long do the effects of the Program last? • What does the Program expect / believe / claim will happen for participants in the long run?

  10. Breadth and Depth of the Impact • Are Program participants able to show the benefits of participation in multiple settings, or across various aspects of their lives? • For example, if children who have participated in the Program exhibit more positive social behaviors at school, do they do so at home as well?

  11. Causal Inferences • What causal inferences can be made about the effects of the Program from the Product Evaluation evidence? • Can the results of the Product Evaluation be attributed to the Program and only the Program? • Have the results been replicated across service delivery cycles?

  12. Causal Inferences • A controlled, experimental, comparative study is necessary if the stakeholders are interested in causality. • Cost considerations • Design feasibility considerations – control group, random assignment, etc.
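
  For readers who want a concrete picture of the design slide 12 describes, here is a minimal, hypothetical sketch of a randomized two-group comparison: participants are randomly assigned to the Program or a control group, and the effect is estimated as the difference in group means. All names, data, and numbers below are illustrative placeholders, not part of the original presentation.

```python
# Hypothetical sketch of a controlled, comparative study with random assignment.
import random
from statistics import mean

random.seed(42)

# Hypothetical pool of participant IDs for one service delivery cycle.
participants = list(range(40))
random.shuffle(participants)

# Random assignment: the first half to the Program (treatment), the rest to control.
treatment = set(participants[:20])

def simulated_outcome(pid: int) -> float:
    """Placeholder outcome score; a real evaluation would use measured data."""
    base = random.gauss(50, 10)                   # background variation
    return base + (5 if pid in treatment else 0)  # assumed Program effect of +5

scores = {pid: simulated_outcome(pid) for pid in participants}
treat_scores = [s for pid, s in scores.items() if pid in treatment]
control_scores = [s for pid, s in scores.items() if pid not in treatment]

# With random assignment, the difference in group means can be attributed
# to the Program rather than to pre-existing differences between the groups.
print("treatment mean:  ", round(mean(treat_scores), 1))
print("control mean:    ", round(mean(control_scores), 1))
print("estimated effect:", round(mean(treat_scores) - mean(control_scores), 1))
```

  A real study would also face the considerations the slide lists: the cost of recruiting and retaining a genuine control group, and replicating the comparison across service delivery cycles.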

  13. Change Due to Training • If the evaluation focuses on the results of training or professional development, connecting changes and effects back to the training becomes complex and difficult. • Why?

  14. Change Due to Training • Administrative support • Administrative expectations for implementation • Technical assistance and follow-up • Interactions between the new and innovative strategies from training and what was already in place • Implementation fidelity

  15. Change Due to Training • If the stakeholders want causal inferences made about the effects of training, all four of Kirkpatrick’s levels of evaluation need to be used. • Reaction • Learning • Behavior • Results

  16. Community Impact • Is the Program reaching all of the targeted population? • Has the Program brought about any positive changes in the community context within which it operates? • Does the community know about / trust the Program and its benefits?

  17. Community Impact • To what extent is the Program effective in communicating with the targeted population about Program effects and opportunities to participate? • How effective are the recruitment, marketing, and reporting strategies of the Program?

  18. Sustainability • Is the Program sustainable? • What resources will be needed to continue, expand, and strengthen the Program? • Can the Program be replicated in another service delivery cycle? • Can the Program be transported to other settings?

  19. Evaluation Strategies • The same methods as Product Evaluation • Tracking Program participants beyond the treatment • Obtain informed consent to track at the beginning • Multiple Informants – parent surveys, etc. • Experimental methods • Kirkpatrick model
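
  As an illustration of the "tracking Program participants beyond the treatment" strategy, the hypothetical sketch below compares outcomes recorded at Program exit with outcomes at a later follow-up wave, the kind of evidence that bears on Lasting Impact. The records and field names are invented for the example.

```python
# Hypothetical follow-up tracking: does the effect observed at Program exit persist?
from statistics import mean

# Each record: a participant's outcome score at Program exit and at a 12-month follow-up.
follow_up_records = [
    {"id": "p01", "exit_score": 72, "followup_score": 70},
    {"id": "p02", "exit_score": 65, "followup_score": 66},
    {"id": "p03", "exit_score": 80, "followup_score": 74},
    {"id": "p04", "exit_score": 58, "followup_score": 57},
]

exit_mean = mean(r["exit_score"] for r in follow_up_records)
followup_mean = mean(r["followup_score"] for r in follow_up_records)

# A simple retention summary: how much of the exit-level outcome is still
# observed at follow-up.
print(f"mean at exit:      {exit_mean:.1f}")
print(f"mean at follow-up: {followup_mean:.1f}")
print(f"change:            {followup_mean - exit_mean:+.1f}")
```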

  20. Summative Conclusions • Make a summary judgment about what Program outcomes can be expected to last. • Help the Program revisit its mission, goals, and objectives. In light of the cumulative weight of the evaluation evidence, are there any suggested revisions?
