
Super*Vision Training Program


Presentation Transcript


  1. Super*Vision Training Program Evaluation Methodology Jing Wan

  2. Components • Introduction • Logic model • Evaluation methods (design, data collection, analysis) • Communication & dissemination

  3. Introduction • Evaluation questions: formative evaluation, summative evaluation • Data collection methods & measures: formative evaluation, summative evaluation (qualitative + quantitative measures) • Communication & dissemination

  4. Program Analysis • Overview of logic model

  5. Program Analysis

  6. Program Analysis • Inputs: training structure & set-up, training process, training content, trainer, trainee, supportive environment

  7. Program Analysis • Outputs & outcomes: program products & activities; changes in trainees’ knowledge, skills, and abilities (KSA); improved workforce & organizational performance

  8. Evaluation Methods • Evaluation questions • Evaluation design • Data collection methods

  9. Evaluation questions • Related to inputs: Is the training content relevant to the trainees’ job assignments? How satisfied are the trainees with their involvement in the training sessions? • Related to outputs: Does the training program provide the planned assignments?

  10. EQ • Related to outcomes/impacts: Did the trainees experience any changes in their KSA and job performance? What are the effects of the training program on each department of DHSMV (e.g., improved work efficiency, enhanced networking, and partnerships)?

  11. Evaluation Design • Formative (process) evaluation design: how well the training program matches the theory behind its design, and what the program actually does in practice. • Summative (outcome) evaluation design: how the training program is related to changes in the trainees, and whether the program causes the desired changes.

  12. Data collection methods • Formative (process) evaluation, quantitative measure: What to measure: trainees’ training experience How to measure: post-training survey From whom collected: trainees How to analyze: statistical analysis of the quantitative data
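The statistical analysis of the post-training survey is not specified on the slide; as a minimal sketch, assuming 5-point Likert items keyed to the input-related evaluation questions (the item wording and responses below are hypothetical), descriptive statistics per item could look like:

```python
# Hypothetical post-training survey data: 5-point Likert responses
# (1 = strongly disagree, 5 = strongly agree). Item texts are assumptions,
# loosely based on the input-related evaluation questions.
from statistics import mean, stdev

responses = {
    "Content was relevant to my job assignment": [4, 5, 3, 4, 5, 4],
    "I was satisfied with my involvement":       [3, 4, 4, 5, 4, 3],
}

# Report mean, standard deviation, and sample size for each item.
for item, scores in responses.items():
    print(f"{item}: mean={mean(scores):.2f}, sd={stdev(scores):.2f}, n={len(scores)}")
```

A real analysis would likely also break results down by department or session, but the slide deck does not specify that level of detail.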

  13. Example 1

  14. Example 2

  15. Data collection cont. • Formative (process) evaluation, qualitative measure: What to measure: whether the training meets the trainees’ expectations How to measure: post-training survey (open-ended items) From whom collected: trainees How to analyze: content analysis of the qualitative data
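The slides name content analysis but not a coding scheme. A minimal sketch of one common approach, keyword-based coding of open-ended answers, is below; the codebook categories, phrases, and responses are all hypothetical:

```python
# Minimal keyword-coding pass for open-ended survey answers.
# Codebook and answers are illustrative assumptions, not from the slides.
from collections import Counter

codebook = {
    "met expectations":     ["as expected", "what i hoped", "met my expectation"],
    "wanted more practice": ["more practice", "hands-on", "exercises"],
}

answers = [
    "The session was what I hoped for.",
    "Good overall, but I wanted more practice and hands-on exercises.",
    "It met my expectations.",
]

# Count each answer at most once per code, regardless of how many
# matching phrases it contains.
counts = Counter()
for answer in answers:
    text = answer.lower()
    for code, phrases in codebook.items():
        if any(p in text for p in phrases):
            counts[code] += 1

print(dict(counts))  # → {'met expectations': 2, 'wanted more practice': 1}
```

In practice content analysis usually involves human coders and inter-rater checks; a script like this only automates the first, mechanical pass.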

  16. Example 3

  17. Data collection cont. • Summative (outcome) evaluation, qualitative & quantitative measures: What to measure: short-term outcomes How to measure: follow-up questionnaire & telephone interviews From whom collected: trainees & subordinates How to analyze: statistical analysis & content analysis
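For the quantitative side of the summative evaluation, one plausible reading (not stated on the slides) is a pre/post comparison of trainees’ self-rated KSA from the follow-up questionnaire. A sketch with hypothetical ratings:

```python
# Hypothetical pre- and post-training self-ratings of KSA on a 5-point
# scale, paired by trainee. The numbers are illustrative assumptions.
from statistics import mean

pre  = [2, 3, 2, 3, 2, 3, 2]
post = [4, 4, 3, 5, 3, 4, 4]

# Paired difference per trainee, then the average change.
diffs = [b - a for a, b in zip(pre, post)]
print(f"mean change = {mean(diffs):.2f} points on a 5-point scale")
```

A fuller analysis would test whether the change is statistically significant (e.g., a paired t-test) and triangulate it against the interview findings, but the slides stop at naming the analysis types.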

  18. Example 4

  19. Example 5

  20. Communication & Dissemination • Communication challenges: general evaluation anxiety (results can affect decisions); disengagement; management operating style (resistance to change, rapid staff turnover, dysfunctional information-sharing system)

  21. CD cont. • Communication methods Written reporting • Final evaluation report • Progress reports Verbal presentations • Debriefing meetings, which bring stakeholders together to present key evaluation findings, recommendations, and other evaluation components.

  22. ECP

  23. ECP cont.

  24. EMP

  25. EMP cont.
