
Extension Program Evaluation




Presentation Transcript


  1. Extension Program Evaluation Michigan Planning and Reporting System (MI PRS) Winter 2011 Training

  2. Part of Planning Process • Evaluation is an upfront activity in the design or planning phase of a program. • Evaluation is not an after-program activity.

  3. Why Outcomes? Today, in a time of continued reduction in government funding, Extension professionals are challenged more than ever before to document outcomes of programs and address stakeholder demands for accountability.

  4. Review of Part of Bennett's Hierarchy As one moves up the hierarchy, the evidence of program impact gets stronger. From top to bottom, the levels are: • End Results • Practice Change • KASA (Knowledge, Attitudes, Skills, Aspirations) • Reactions • People Involvement • Activities • Resources

  5. Collecting impact data on programs is costly and time-consuming, and it requires skill. (But not impossible!) • Extension professionals are expected to evaluate a minimum of one program a year at the impact level.

  6. Example • A pre/post measure can assess short-term outcomes on knowledge, attitudes, skills, and aspirations (motivation to change). • A plan for participant follow-up is required to assess behavior or practice change.
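For illustration, a pre/post comparison like the one above could be analyzed with a paired t-test. The sketch below is a minimal Python example with hypothetical scores; the slides do not prescribe any particular tool.

    # Minimal pre/post sketch: hypothetical knowledge scores for the same
    # eight participants before and after a program.
    from scipy import stats

    pre = [4, 5, 3, 6, 4, 5, 2, 5]    # scores before the program
    post = [7, 8, 5, 8, 6, 7, 5, 8]   # scores after, same participants

    # Paired t-test: did mean knowledge change from pre to post?
    t_stat, p_value = stats.ttest_rel(post, pre)
    mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
    print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")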

  7. Plan early • Plan early for the cost, time, skills (data collection, analysis, interpretation), and resources needed to evaluate an Extension program. • Work with Institute teams/groups.

  8. Evaluating programs at the lower levels (inputs, participation, collaboration, activities, and reactions) may require little effort and expense. This is process evaluation.

  9. Process Evaluation • Process evaluation, also called formative evaluation, helps program staff assess ongoing programs in order to improve implementation. • Examples: program fidelity, reaching target audiences

  10. Outcome Evaluation • Documenting impact or community-level outcomes requires skills in questionnaire development, data collection and analysis, and interpretation and reporting.

  11. Summative evaluation, also called impact or outcomes evaluation, may require understanding of evaluation designs, data collection at multiple points, and sophisticated statistical analyses such as Analysis of Covariance and the use of covariates.
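As a sketch only (the slide names Analysis of Covariance as a technique but no software), an ANCOVA could be run in Python with statsmodels, entering the pre-test score as a covariate when comparing program and comparison groups. All data below are hypothetical.

    # Hypothetical ANCOVA sketch: adjust post-test differences between a
    # program group and a comparison group for pre-test scores.
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    data = pd.DataFrame({
        "post":  [7, 8, 5, 8, 6, 5, 4, 6, 5, 4],
        "pre":   [4, 5, 3, 6, 4, 4, 3, 5, 4, 3],
        "group": ["program"] * 5 + ["comparison"] * 5,
    })

    # Outcome ~ group effect + pre-test covariate
    model = smf.ols("post ~ C(group) + pre", data=data).fit()
    print(anova_lm(model, typ=2))  # Type II ANCOVA table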

  12. A Framework for Linking Costs and Program Outcomes Using Bennett's Hierarchy

  13. Professional Development • Plans for professional development are captured in MI PRS; consider building skills in evaluation. • Work with Institute work teams to develop program evaluation plans that fit with logic models.

  14. To make an Evaluation Plan: • Decide if the program is ready for formative/process or summative/outcome evaluation. • Link program objectives to evaluation questions that address community outcomes.

  15. To make an Evaluation Plan, Cont. • Identify key indicators for evaluation (make sure they are measurable and relevant). • Consider evaluation costs (follow-up techniques and comparison groups used in summative designs are more expensive). • Develop a cost matrix.
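The slides do not specify the form of the cost matrix; as one hypothetical sketch, it might cross Bennett's hierarchy levels with estimated evaluation costs, in the spirit of the cost-and-outcomes framework cited later. All figures below are invented for illustration.

    # Hypothetical cost matrix: rows are Bennett's hierarchy levels targeted
    # by the evaluation, columns are illustrative cost estimates.
    import pandas as pd

    cost_matrix = pd.DataFrame(
        {
            "data_collection_$": [100, 150, 400, 1200],
            "analysis_$":        [50, 100, 300, 800],
            "staff_hours":       [2, 4, 12, 40],
        },
        index=["activities", "reactions", "KASA change", "practice change"],
    )
    print(cost_matrix)

Higher levels of the hierarchy generally carry higher data-collection and follow-up costs, consistent with the note above that follow-up techniques and comparison groups make summative designs more expensive.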

  16. Tracking program and project processes and outputs, as well as outcomes, will require data collection and analysis systems outside of MI PRS. • Link program costs and the cost of evaluation to outcomes.

  17. Conclusion In the end, evaluation questions that address the “so what” issue are connected to outcomes and costs, and ultimately justify the value of Extension programs to the public good.

  18. Key Reference Radhakrishna, R., & Bowne, C. (2010). Viewing Bennett’s hierarchy from a different lens: Implications for Extension program evaluation. Journal of Extension, 48(6). Retrieved January 24, 2011, from http://www.joe.org/joe/2010december/tt1.php

  19. MSUE Resources • Organizational Development webpage • Planning, Evaluation, and Reporting section

  20. Evaluation Resources will Grow!

  21. Other Extension materials on evaluation, with future MSU-specific resources to be released in 2011

  22. MSU Evaluation Specialist • Assists work teams in developing logic model objectives and evaluation strategies • Consults on evaluation designs • Provides guidance on data analysis and selecting measures • Develops and delivers educational programs related to Extension program evaluation • Facilitates evaluation plan development or brainstorming for Institute work teams

  23. Organizational Development team member • Dr. Cheryl Peters, Evaluation Specialist • Statewide coverage • cpeters@anr.msu.edu • 989-734-2168 (Presque Isle) • 989-734-4116 (fax) • Campus Office: Room 11, Agriculture Hall • Campus Phone: 517-432-7605 • Skype: cpeters.msue

  24. MI PRS Resources
