Evaluation 101: A Workshop

Presentation Transcript


  1. Evaluation 101: A Workshop Overview, Concepts and Application

  2. Today: • Review some definitions • Talk about practical concepts for AETC evaluation • Discuss a useful framework • Apply these concepts and the framework to your programs • Identify other evaluation resources

  3. Definition: Rossi, Freeman and Lipsey • “Program evaluation is the use of social research procedures to systematically investigate the effectiveness of … programs.”

  4. Concepts: Rossi et al.'s five program domains • Program evaluation typically involves assessment of one or more of five program domains: • The need for the program • Design of the program • Program implementation and service delivery • Program impact or outcomes • Program efficiency

  5. Concepts: Chelimsky's Perspectives • Accountability: Information collected for decision makers (emphasis: outcomes) • Developmental: Information collected to improve performance (emphasis: process measures) • Knowledge: Information collected in the interest of generating understanding and explanation (emphasis: processes and outcomes).

  6. Scriven and context • Evaluation has two arms: • Data gathering • Contextualizing results

  7. Some definitions… • Process evaluation: • Addresses questions about how well the program is functioning • Is useful for diagnosing outcomes • Is critical in quality improvement

  8. Some definitions… • Key questions in process evaluation: • Who is served? • What activities or services are provided? • Where is the program held? • When and how long?

  9. Some definitions… • Outcome evaluation: • Gauges the extent to which a program produces the intended improvements in the problems it addresses • Addresses effectiveness, goal attainment and unintended outcomes • Is critical in quality improvement

  10. Some definitions… • Key questions in outcome evaluation: • To what degree did the desired change(s) occur? • Outcomes can be initial, intermediate or longer-term • Outcomes can be measured at the patient, provider, organization or system level.

  11. Some definitions… • Impact is sometimes used synonymously with “outcome.” • Impact is perhaps better defined as a longer-term outcome. For clinical training programs, impacts may include improved patient outcomes.

  12. Some definitions… • Indicators or measures are the observable and measurable data used to track a program’s progress in achieving its goals. • Monitoring (program or outcome monitoring, for example) refers to ongoing measurement activity.

  13. Some definitions… • CQI/QM (continuous quality improvement/quality management): • HAB’s (HRSA HIV/AIDS Bureau) definition of quality: The degree to which a health or social service meets or exceeds established professional standards and user expectations

  14. Some tools for planning… • United Way’s Outcomes Manual • HRSA’s Quality Improvement Handbook • W.K. Kellogg Foundation Evaluation Handbook

  15. Some methods… • Identify some Quantitative Methods • Identify some Qualitative Methods • Which is best, qual or quant?

  16. A framework for AETC evaluation: Kirkpatrick • Measure Reaction • Measure Learning • Measure Behavior • Measure Results • Identify some ways to measure each in an AETC training setting

  17. Application to Your Program: • Identify Program Goals • For each goal: • Identify Process Objectives • Identify Outcome Objectives • For each objective: • Identify Indicators • Identify Data Source • Plan Data Collection • Plan Data Analysis

  18. Exercise:

  19. Recommended Reading • Definitely: • Kirkpatrick • Cruise the American Journal of Evaluation (the journal of the American Evaluation Association) • If you have time: • Chelimsky • Rossi • If you want a good read: • Hunt

  20. Handouts • Table 1.1 from Chapter 1 in Chelimsky: The Coming Transformations in Evaluation • Exhibit 2-P: An Australian Team’s Ten-Step Approach to Program Evaluation (p. 75 in Rossi et al.) • Chapters 1-3 in Kirkpatrick • American Evaluation Association Guiding Principles for Evaluators • Isenberg’s CQI Program Handout
