PROGRAM EVALUATION



  1. PROGRAM EVALUATION Produced by Dr. James J. Kirk Professor of HRD Western Carolina University

  2. What You Will Learn • The definition of “program evaluation” • Various terms used by program evaluators • Common reasons for conducting program evaluations • Selected types of evaluations

  3. WHAT IS EVALUATION? Determining the worth of something (e.g., an employee orientation program).

  4. WHAT CONSTITUTES A PROGRAM? Planned learning activities (e.g., an employee orientation program).

  5. NEEDS ASSESSMENTS VS. TRAINING EVALUATIONS A needs assessment identifies the gaps a program should address; a program evaluation judges the worth of the program itself.

  6. FORMAL EVALUATION VS. INFORMAL EVALUATION Formal evaluations are planned and systematic; informal evaluations rely on casual observation and impressions.

  7. SUMMATIVE EVALUATION Judges the end result of a program.

  8. FORMATIVE EVALUATION Examines the steps taken to achieve the desired end result while the program is still under way.

  9. THE EVALUATION OBJECT That which is evaluated.

  10. EVALUATION OBJECTS 1. Orientation Program 2. Management Training 3. Safety Program 4. Mentoring Program 5. Cross Training 6. Drug Education 7. Job Rotation 8. Team Building

  11. DESCRIPTION OF OBJECTS 1. Who is involved? 2. Why does it exist? 3. What are its parts or functional elements? 4. When does it take place? 5. Where does it exist?

  12. EXTERNAL EVALUATIONS Conducted by someone from outside the organization.

  13. INTERNAL EVALUATIONS Conducted by someone from inside the organization.

  14. WHY EVALUATE?

  15. PURPOSES OF AN EVALUATION

  16. GOAL-FREE EVALUATIONS Journalistic-style evaluations conducted without reference to the program’s stated goals.

  17. THE EVALUATION AUDIENCE

  18. SOME COMMON AUDIENCES • MANAGERS • OTHER PROFESSIONALS • PARTICIPANTS • TRAINERS • GOVT. OFFICIALS • PROGRAM PLANNERS • VENDORS

  19. TYPES OF AUDIENCES • PARTICIPANTS • STAKEHOLDERS • CLIENTS

  20. AUDIENCE CHARACTERISTICS 1. Age, Sex, Race 2. Occupation 3. Education/Training Background 4. Values 5. Knowledge of Evaluation 6. Special Concerns 7. Special Interests 8. Hidden Agendas

  21. STAKEHOLDERS CAN ADVERSELY IMPACT AN EVALUATION

  22. REDUCE ANY NEGATIVE IMPACT FROM STAKEHOLDERS

  23. EVALUATION CONSTRAINTS • MONEY • TIME • EXPERTISE • POLITICS

  24. TIMES WHEN AN EVALUATION MAY NOT BE APPROPRIATE • During a crisis • During political unrest • Right after a program has begun • When the evaluation may cost more than the program

  25. EVALUATION MYTHS 1. My CEO does not require evaluation, so why should I do it? 2. Measuring progress toward objectives is an adequate evaluation strategy. 3. There are too many variables affecting the behavior change for me to evaluate the impact of training.

  26. EVALUATION MYTHS 4. I can’t measure the results of my training. 5. I don’t need to justify my existence; I have a proven track record. 6. I don’t know what information to collect. 7. Measurement is only effective in the production and financial areas.

  27. EVALUATION MYTHS 8. If I can’t calculate the return on investment, then it is useless to evaluate the program. 9. Evaluation will probably cost too much. 10. Evaluation will lead to criticism. 11. The emphasis should be the same in all organizations.

  28. SELLING AN EVALUATION STUDY TO TOP MANAGEMENT?

  29. EVALUATION APPROACHES

  30. CLASSIFIED BY 1. Research design 2. Type of data 3. Way data is collected 4. Who does the evaluating 5. Who uses the info 6. Type of questions asked 7. Scope of the evaluation

  31. OBJECTIVES-ORIENTED APPROACH • PROPONENTS: Tyler, Provus, Metfessel & Michael, etc. • PURPOSE: Determine the extent to which objectives are achieved. • DISTINGUISHING CHARACTERISTICS: Specifies measurable objectives and compares them with performance. • PAST USES: Curriculum development, needs assessment, etc. • CONCEPTUAL CONTRIBUTIONS: Pre-/post-performance measurements. • CRITERIA USED: Measurable objectives, reliability, and validity. • BENEFITS: Simple; sets objectives. • LIMITATIONS: Reductionistic, linear.
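
Slide 31 names pre-/post-performance measurement as this approach’s core move: state a measurable objective, then compare measured performance against it. As a concrete illustration only, here is a minimal sketch of such a comparison; the scores, the target gain, and the function names are hypothetical, not from the presentation:

```python
# Illustrative pre-/post-test comparison for an objectives-oriented
# evaluation. All data and names below are hypothetical.

def mean(scores):
    """Average of a list of test scores."""
    return sum(scores) / len(scores)

def evaluate_objective(pre_scores, post_scores, target_gain):
    """Compare the mean pre-to-post gain against a stated objective."""
    gain = round(mean(post_scores) - mean(pre_scores), 2)
    return {"mean_gain": gain, "objective_met": gain >= target_gain}

# Hypothetical data: trainee test scores before and after a program,
# with a stated objective of at least a 15-point average improvement.
pre = [52, 61, 47, 58, 66]
post = [71, 78, 60, 74, 82]
print(evaluate_objective(pre, post, target_gain=15))
# -> {'mean_gain': 16.2, 'objective_met': True}
```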

  32. MANAGEMENT-ORIENTED APPROACH • PROPONENTS: Stufflebeam, Alkin & Provus. • PURPOSE: Provides information for decision-making. • DISTINGUISHING CHARACTERISTICS: Evaluates all stages of program development. • PAST USES: Accountability, program planning. • CONCEPTUAL CONTRIBUTIONS: Identifies and evaluates needs and objectives. • CRITERIA USED: Utility, propriety, and technical soundness. • BENEFITS: Comprehensive and sensitive to leadership. • LIMITATIONS: Expensive; focuses on production.

  33. CONSUMER-ORIENTED APPROACH • PROPONENTS: Scriven, Komoski. • PURPOSE: Provides information for educational purchases, etc. • DISTINGUISHING CHARACTERISTICS: Uses criterion checklists to analyze products. • PAST USES: Consumer reports. • CONCEPTUAL CONTRIBUTIONS: Provides criteria for evaluating educational products. • CRITERIA USED: Objective criteria to draw conclusions and make recommendations. • BENEFITS: Provides information on cost, consumer needs, and product developers. • LIMITATIONS: Cost; not open to cross-examination.

  34. EXPERTISE-ORIENTED APPROACH • PROPONENTS: Eisner, accreditation groups. • PURPOSE: Professional judgments. • DISTINGUISHING CHARACTERISTICS: Judgment based upon individual knowledge and experience. • PAST USES: Self-study, accreditation, criticism. • CONCEPTUAL CONTRIBUTIONS: Legitimizes subjective criticism. • CRITERIA USED: Qualified expert judgment. • BENEFITS: Capitalizes on human expertise. • LIMITATIONS: Personal bias, overuse of intuition.

  35. ADVERSARY-ORIENTED APPROACH • PROPONENTS: Wolf, Owens, Levine & Kourilsky. • PURPOSE: Exposes a program’s strengths and weaknesses. • DISTINGUISHING CHARACTERISTICS: Airs opposing viewpoints in public hearings. • PAST USES: Examining controversial programs and issues. • CONCEPTUAL CONTRIBUTIONS: Uses forensic/judicial public hearings; clarifies issues. • CRITERIA USED: Balance; openness to the public. • BENEFITS: Aims at closure/resolution; audience impact. • LIMITATIONS: Fallible arbiters/judges, cost, time involved.

  36. NATURALISTIC OR PARTICIPANT-ORIENTED APPROACH • PROPONENTS: Stake, Patton, Guba & Lincoln, etc. • PURPOSE: Exposes the complexities of educational activity. • DISTINGUISHING CHARACTERISTICS: Multiple realities, inductive logic, and discovery. • PAST USES: Ethnographies of operating programs. • CONCEPTUAL CONTRIBUTIONS: Emergent evaluation designs; criteria for judging naturalistic inquiry. • CRITERIA USED: Credibility. • BENEFITS: Focuses on describing, judging, and understanding. • LIMITATIONS: Nondirective, atypical, may not reach closure.

  37. RESULTS-ORIENTED APPROACH The focus is on Kirkpatrick’s fourth level of evaluation: “results.”

  38. WHAT IS THE BEST APPROACH?

  39. SELECTION CRITERIA 1. Purpose of the evaluation 2. Expertise of the evaluator 3. Evaluation audience 4. Time 5. Money 6. Scope 7. Help available

  40. HRD EVALUATION MODELS • Xerox • IBM • AT&T/Bell • Saratoga Inst. • CIPP • CIRO • Kirkpatrick

  41. AT&T/BELL • Reaction outcomes • Capability outcomes • Applications outcomes • Worth outcomes

  42. IBM • Reaction • Testing • Applications • Business

  43. XEROX • Entry capability • End-of-course performance • Mastery job performance • Organizational performance

  44. CIPP (Phi Delta Kappa) • Context evaluation • Input evaluation • Process evaluation • Product evaluation

  45. SARATOGA INSTITUTE • Trainee satisfaction • Learning change • Behavior change • Organization change

  46. CIRO (Warr, Bird, Rackham) • Context evaluation • Input evaluation • Reaction evaluation • Outcome evaluation

  47. KIRKPATRICK • Reaction • Learning • Behavior • Results
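
Kirkpatrick’s four levels are the most widely used backbone for recording training-evaluation results. As an illustration only, here is a minimal sketch of a record structure keyed to the four levels; the field names, metrics, and sample values are assumptions, not from the presentation:

```python
# Hypothetical record keyed to Kirkpatrick's four levels of evaluation.
from dataclasses import dataclass

@dataclass
class KirkpatrickRecord:
    program: str
    reaction: float   # Level 1: e.g., mean satisfaction rating (1-5)
    learning: float   # Level 2: e.g., post-test score (%)
    behavior: float   # Level 3: e.g., % of taught skills observed on the job
    results: float    # Level 4: e.g., % change in a targeted business metric

# Hypothetical values for an orientation-program evaluation.
record = KirkpatrickRecord(
    program="Employee Orientation",
    reaction=4.2, learning=85.0, behavior=60.0, results=8.5,
)
print(record)
```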

  48. COMMON EVALUATION LEVELS • Participant’s Reaction • Participant’s Increased Knowledge
