
Module 5: Evaluating training effectiveness




  1. Module 5: Evaluating training effectiveness MOA – FAO – TCP Workshop on Managing Training Institutions Beijing, 11 July 2012

  2. Objectives (I) • To understand a major international model for evaluating training effectiveness.

  3. Objectives (II) • To review national and international examples of how training institutions evaluate training effectiveness.

  4. Objectives (III) • To work in small groups to design evaluation activities relating to actual modules or courses of instruction from the FFRC and HHRRC.

  5. The Kirkpatrick framework • Named for Donald Kirkpatrick, who developed this approach to evaluation in the 1950s. • Still the most common framework for the evaluation of training in North America.

  6. Four levels

  7. 1. Reaction • Frequently referred to as happy face evaluation, this level measures participant reaction to, and satisfaction with, the program and the learning environment. • If trainees are not satisfied with their training, the likelihood of learning is reduced.

  8. 2. Learning • Changes in knowledge, skills, and / or attitudes. • Learning is the direct goal of training – without it, the likelihood of behaviour change and impact is reduced.

  9. 3. Behaviour change • This level determines whether changes in behaviour have occurred as a result of the program. • Behaviour change may fail to occur even when trainees were satisfied with, and learned from, a training program.

  10. 4. Results • This level looks at the final results that occurred because the participants attended the program. • Results are "the bottom line" or impact of the program.

  11. Evaluation beyond Kirkpatrick • The Kirkpatrick framework helps us to evaluate the core questions of training: did the trainees learn anything, and did that learning make any difference to their performance and their world? • There are other pertinent questions to ask when evaluating training programs.

  12. Other evaluation questions • Were resources invested in the training program used efficiently to produce the desired outcomes? • Were major stakeholders to the training program satisfied with its process and outcomes?

  13. Other evaluation questions • Was the training program developed and delivered in a manner which could be sustained over time? • What is the relationship between the training program and the strategic direction of the training organization?

  14. Stages of evaluation • Planning for evaluation. • Formative (process) evaluation. • Summative (terminal / impact) evaluation.

  15. Methods of evaluation • Review of documents and records associated with a training program. • Observation of program delivery by evaluators and experts.

  16. Methods of evaluation • Surveys and interviews with trainees or informants able to provide feedback regarding the trainees.

  17. Methods of evaluation • Focus groups with trainees or stakeholders. • Examinations or assignments to test the knowledge and skills of trainees (sometimes done as pre-test and post-test).
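The pre-test / post-test method mentioned in slide 17 can be sketched as a simple learning-gain calculation. This is an illustrative example only: the function name, scores, and scoring scale below are invented for the sketch and are not part of the workshop materials.

```python
# Hypothetical sketch: measuring learning (Kirkpatrick level 2)
# by comparing each trainee's pre-test and post-test scores.

def average_gain(pre_scores, post_scores):
    """Mean score improvement across trainees (post minus pre)."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Invented scores out of 100, one pair per trainee.
pre = [55, 62, 48, 70]
post = [75, 80, 66, 85]

print(average_gain(pre, post))  # mean improvement: 17.75
```

A positive mean gain suggests learning occurred, though in practice an evaluator would also look at the spread of individual gains, not just the average.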

  18. Example of evaluation work • Standard, online course evaluation from Continuing Education at the University of Calgary. • http://fluidsurveys.com/surveys/reg/scott-s-test-june-14-2012/ • See the Training Manual for • Background • Survey questions, procedures, and management

  19. Example of evaluation work • Systematic Program Review (SPR) from Continuing Education at the University of Calgary. • See the Training Manual for • Background • SPR criteria, methods, and indicators

  20. Example of evaluation work • Canadian Agriculture Lifelong Learning (CALL) program • See the Training Manual for • Background • Detailed information on the four levels

  21. Example of evaluation work • JICA evaluation of Chinese interns / students receiving agricultural training in Japan. • See the Training Manual for summary.

  22. Thank you. Time for questions and discussion.
