
Designing the Model for Evaluation

  1. Designing the Model for Evaluation. "Without measurement, there is no possibility of continuous improvement." Model: Svenson and Rinderer, 1992.

  2. OBJECTIVES: 1. Introduce Svenson. 2. Review Evaluation Types, Traditions, and Perceptions. 3. Learn the Role of Evaluation in the Strategic Planning Process (according to Svenson). 4. Discuss Svenson’s Nine-Step Results-Measurement Model. 5. Determine the Applications of Svenson’s Model to Evaluation Traditions and Perceptions.

  3. Who is Svenson? • Co-author (with Monica Rinderer) of: The Training and Development Strategic Plan Workbook • More than twenty-five years' experience in training • Author of one of the five models for evaluation

  4. Evaluation Traditions: 1. Scientific: using control of quantitative variables. 2. Systems: using flow of order (i.e., sequential modules). 3. Qualitative: based on value. 4. Eclectic: collected from various sources.

  5. Evaluation Perceptions (Viewpoints): 1. Discrepancy: comparison to known benchmarks. 2. Democratic: changes made mostly by a governing board. 3. Analytical: based on empirical data (taken over time). 4. Diagnostic: research-based examination of a program by sections. 5. Other.

  6. Strategic Planning Process Phases (Svenson and Rinderer, 1992). [Diagram; labels:] Strategic Vision & Goals • Alternative Strategies & Resource Requirements • Organization • Management • Administrative Strategies • Implementation • Organization of Training • Governance & Advisory Structure • Measuring Results • Supervisor & Management Support • Finance & Accounting • Primary Results • Secondary Results • Results Data Sources • System Design Process

  7. Svenson’s Nine-Step Model for Measuring Results 1. Identify Decisions. 2. Define Results to be Measured. 3. Identify Required Data. 4. Define Data Sources and Measurement Means. 5. Specify Reports. 6. Specify Storage/Retrieval Database. 7. Design Overall Information Flow. 8. Design Reports-Handling Administrative System. 9. Evaluate Cost of Evaluation Compared to Benefits.

  8. Svenson’s Results Measurement Model for Evaluation (continued)

  9. Svenson’s Results Measurement Model for Evaluation (continued)

  10. Measuring Results 1. Primary Training Results 2. Secondary Results 3. Results Data Sources 4. System Design Process

  11. Primary Training Results • Training needs met. • Organizational benefits from training needs met. • Cost of training needs met. • Quality/effectiveness of training provided. • Productivity of training resources. • Training needs not met. • Organizational cost of training needs not met.

  12. Secondary Results • Needs analysis & curriculum architecture process results. • Instructional materials development results. • Instructional delivery results. • Training facilities utilization.

  13. Company Annual Training Results (This Year / Last Year / Deviation) • Number of Employees: 3730 / 3925 / -5.0% • Hourly: 3000 / 3200 / -6.25% • Professional: 350 / 325 / +7.7% • Management: 380 / 400 / -5.0% • Number of Trainings per Employee • Hourly: 5.1 / 4.6 / +10.9% • Professional: 10.3 / 7.2 / +43.1% • Management: 7.4 / 6.5 / +13.8%
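The deviation column above is the year-over-year percentage change, (this year - last year) / last year. A minimal sketch of that calculation, using figures from the slide (the function name is ours, not Svenson's):

```python
def deviation(this_year, last_year):
    """Year-over-year percentage change, relative to last year."""
    return 100 * (this_year - last_year) / last_year

# Figures taken from the slide above.
print(round(deviation(3000, 3200), 2))  # Hourly headcount: -6.25
print(round(deviation(10.3, 7.2), 1))   # Professional trainings: 43.1
```

Note that a decrease (e.g., headcount falling from 3925 to 3730) yields a negative deviation, which is why the sign matters in the table.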

  14. Results Data Sources • Registration and scheduling data. • Personnel data. • Accounting data. • Trainee feedback questionnaire. • Instructor feedback questionnaire. • Supervisor feedback questionnaire. • Mastery test results. • Evaluation team data.

  15. “It’s Results That Matter” George Bergeron, ALCOA RPD President

  16. Evaluation (Review) 1. Types: General, Summative, Formative. 2. Traditions: Scientific, Systems, Qualitative, Eclectic. 3. Perceptions (Viewpoints): Discrepancy, Democratic, Analytical, Diagnostic, Other.

  17. Questions from Svenson Presentation 1. What evaluation type, tradition, and perception does Svenson’s system fit into best and why?

  18. “Statistical” Perception: The “Other” Evaluation Perception. Employing the principles of statistics, this perception deals with the collection, analysis, interpretation, and presentation of masses of data from the training program’s results sources.
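As a small illustration of this perception, the sketch below summarizes a set of trainee-feedback scores with standard descriptive statistics. The scores here are invented for illustration, not data from the presentation:

```python
import statistics

# Hypothetical 1-5 trainee feedback scores for one course (invented data).
scores = [4, 5, 3, 4, 4, 5, 2, 4, 3, 5]

summary = {
    "n": len(scores),
    "mean": statistics.mean(scores),
    "median": statistics.median(scores),
    "stdev": round(statistics.stdev(scores), 2),
}
print(summary)  # → {'n': 10, 'mean': 3.9, 'median': 4.0, 'stdev': 0.99}
```

In practice, the same summarization would be repeated for each of Svenson's results data sources (trainee, instructor, and supervisor questionnaires, mastery tests) before interpretation and reporting.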

  19. Questions from Svenson Presentation 2. Give an example of how Svenson’s model could work in your industry.

  20. Questions from Svenson Presentation 3. Differentiate Svenson’s model from any of the other models of evaluation.

  21. References: Svenson, R., & Rinderer, M. (1992). The Training and Development Strategic Plan Workbook. Englewood Cliffs, NJ: Prentice-Hall. Rae, L. (1993). Evaluating Trainer Effectiveness. Homewood, IL: Business One Irwin.
