
CIL-Net & SILC-Net Evaluation Overview


Presentation Transcript


  1. CIL-Net & SILC-Net Evaluation Overview, December 18, 2007

  2. Role of evaluators • Facilitate an evaluation framework using established evaluation standards and principles: • Utilization-Focused Evaluation (Michael Quinn Patton, 2002) • Program Outcomes Model (United Way of America, 1998)

  3. Evaluation approach • The method is for the partners (ILRU, NCIL, APRIL, RSA, and consultants) to understand and have ownership of the evaluation. • The challenge is conducting a high-quality evaluation and applying its findings in the real world.

  4. Some evaluation terms • Inputs • Activities • Outputs • Outcomes

  5. Inputs → Activities → Outputs → Outcomes (Program Outcome Model, United Way of America Outcome Measurement Initiative, 1998)

  6. Inputs • Inputs are resources, constraints, and information • Resources: staff (e.g., ILRU, NCIL, APRIL, consultants, trainers), funding, participants, technology • Constraints: contracts • Information: feedback from participants through evaluation

  7. Activities • Activities are the processes or actions that produce the training or technical assistance (TA) • Conducting training (onsite, online, webcast) • Providing technical assistance • Sharing information

  8. Outputs • Outputs are products of the activities • Number of participants • Number of training sessions • Number of webcasts • Number of technical assistance interactions

  9. Outcomes • Outcomes are the benefits of the training and technical assistance to a person or group of people. • Examples include: • Greater knowledge of IL philosophy • Increased management capacity in CILs • Changed practice in nursing home transition

  10. The Program Outcome Model applied to CIL-Net & SILC-Net:
  • Inputs: resources (staff, money, technology); constraints (RSA requirements); information (feedback)
  • Activities: conducting training; providing technical assistance; sharing information
  • Outputs: # trainings; # materials distributed; # technical assistance interactions; # participants
  • Outcomes: new knowledge; increased skills; changed attitudes or values; modified behavior; improved condition; changed status
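  To make the four-stage structure of slide 10 concrete, here is a minimal sketch, not part of the original presentation, that represents the model as a simple data structure; the class name LogicModel and its field names are illustrative assumptions, populated with the values from the slide.

    # Hypothetical sketch: the Program Outcome Model as a plain data
    # structure. Class and field names are assumptions for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class LogicModel:
        inputs: list = field(default_factory=list)      # resources, constraints, information
        activities: list = field(default_factory=list)  # processes that produce training/TA
        outputs: list = field(default_factory=list)     # countable products of the activities
        outcomes: list = field(default_factory=list)    # benefits to participants

    model = LogicModel(
        inputs=["staff", "money", "technology", "RSA requirements", "feedback"],
        activities=["conducting training", "providing technical assistance",
                    "sharing information"],
        outputs=["# trainings", "# materials distributed",
                 "# technical assistance interactions", "# participants"],
        outcomes=["new knowledge", "increased skills", "changed attitudes or values",
                  "modified behavior", "improved condition", "changed status"],
    )

    # Reading the fields in order mirrors the left-to-right flow of the
    # model: inputs feed activities, activities yield outputs, and
    # outputs lead to outcomes.
    for stage in ("inputs", "activities", "outputs", "outcomes"):
        print(f"{stage}: {getattr(model, stage)}")

  Keeping the stages as separate fields makes the distinction the slides emphasize, counting outputs versus judging outcomes, explicit in the data.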

  11. On linking activities to outcomes • It is impossible to establish causality in any final sense when dealing with the complexities of real programs, where activities and outcomes are never pure and uncontaminated by extraneous factors. • It is helpful, however, to think about the relationships and to develop causal models that are testable. • We cannot provide definitive answers, but we can arrive at a reasonable estimate of the likelihood that particular activities have had an effect.

  12. Assumptions of a utilization-focused evaluation approach • The partners must: • Understand the evaluation process and findings • Be actively involved in developing the process • Use evaluation tools • Learn how to use the results to improve performance every step of the way.

  13. The first step in a utilization-focused evaluation process is to identify and organize the relevant decision makers and users of the evaluation information.

  14. Relevant decision makers and information users are the people who: • Want the information, • Are able and willing to use the information, and • Care about seeing the evaluation results.

  15. The second step is identifying and focusing the relevant evaluation questions.

  16. Relevant evaluation questions • To facilitate the framing of evaluation questions in complex programs, it is often helpful to approach goals clarification at three levels: • The overall mission of the program • The goals of specific programmatic units • The specific objectives that define client outcomes

  17. Criteria for utilization-focused evaluation questions • It is possible to bring data to bear on the question. • There is more than one possible answer to the question (i.e., the answer is not predetermined by the phrasing of the question). • The identified decision makers and users of information: • want information to help answer the question, • need information to help answer the question, • want to answer the question for themselves, not just for someone else, • care about the answer to the question, and • can indicate how the answer to the question will be used, i.e., can specify the relevance of an answer for future action.
