
Extension Program Evaluation



Presentation Transcript


  1. Extension Program Evaluation Dale A. Moore VMTRC School of Veterinary Medicine UC Davis

  2. What’s the purpose of evaluation? • Improve the program • Improve teaching • Make a difference • Scholarly pursuit • Get the higher-ups off your back • Meet promotion requirements

  3. Goals of This Presentation • Discuss “traditional” program evaluation • Help you focus your evaluations • Learn the different levels of evaluation • Help you construct evaluation questions • Provide you a tool for data analysis • Discuss educational program evaluation using Learning Stage Theory

  4. Pre-Test • Take out the Pre-Test • Take 5 minutes to answer the questions • Turn them in to Pat

  5. Who will use the results? • The public • Your extension division • Colleagues • You and program planning staff … • All your stakeholders

  6. Who will use the results? Your Audience

  7. Do I have the resources to conduct this evaluation project? • Time • Money • People • Expertise • Or – are the data already available?

  8. Levels of Evaluation • Perceptions of the program or course • Competence with new skills, knowledge, abilities or new attitudes • Individual performance – a change in behavior within their environment • Industry outcomes

  9. What are your outcomes?

  10. Mechanisms for Evaluation • Direct observation • Audience response systems • Surveys • Telephone or face-to-face interviews • Focus groups • Expert or peer-review (formative evaluations) • Testimonials • Industry data

  11. Audience Response System

  12. Surveys • Who is the audience? • How are you going to deliver it? • Are you going to randomly choose possible responders? • What percent response do you need to make it meaningful? • Pre- and Post-program delivery to address immediate changes in knowledge/attitudes?
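As a rough aid for the sampling questions above, here is a minimal Python sketch (all defaults are illustrative assumptions, not recommendations) that estimates how many surveys to distribute, combining the standard sample-size formula for a proportion with an expected response rate:

```python
import math

def surveys_to_send(margin_of_error=0.05, confidence_z=1.96,
                    expected_p=0.5, response_rate=0.30):
    """Estimate how many surveys to distribute.

    Uses the standard sample-size formula for a proportion,
    n = z^2 * p * (1 - p) / e^2, then inflates the result to
    account for non-response. All defaults are illustrative.
    """
    n_needed = ((confidence_z ** 2) * expected_p * (1 - expected_p)
                / margin_of_error ** 2)
    return math.ceil(n_needed / response_rate)

# e.g., 95% confidence, +/-5% margin of error, 30% expected response rate:
print(surveys_to_send())  # -> 1281
```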

  13. Pre-Test – Post-Test

  14. Pre-Test – Post-Test • Advantages: • Get attention at beginning of the program • Provide some level of anxiety so that they “look” for the answers during the presentation • Gives instructors some measure of learning needs / areas upon which to focus • Identify short-term changes in knowledge / attitudes

  15. Pre-Test – Post-Test • Disadvantages: • Takes time away from the program • Need to be able to quickly “grade” and analyze your data • Need to be able to make adjustments to the program if needed • Only addresses short-term learning and not behavior change • Adults may not enjoy being graded
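Because quick grading and analysis is one of the hurdles listed above, a minimal Python sketch of scoring matched pre-/post-tests may help; the answer key and responses below are invented placeholders:

```python
# Score matched pre-/post-tests and summarize per-learner gains.
# The answer key and responses below are invented placeholders.
ANSWER_KEY = ["b", "a", "d", "c", "a"]

def score(responses):
    """Count answers that match the key."""
    return sum(given == correct
               for given, correct in zip(responses, ANSWER_KEY))

# One row per participant: (pre-test answers, post-test answers)
participants = [
    (["b", "c", "d", "c", "b"], ["b", "a", "d", "c", "a"]),
    (["a", "a", "d", "b", "a"], ["b", "a", "d", "c", "a"]),
    (["b", "a", "a", "c", "a"], ["b", "a", "d", "a", "a"]),
]

gains = [score(post) - score(pre) for pre, post in participants]
print("per-learner gains:", gains)            # -> [2, 2, 0]
print("mean gain:", sum(gains) / len(gains))  # -> 1.33...
```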

  16. What kind of questions? • Reaction • Learning • Behavior • Results?

  17. What kind of questions/data if the outcome is behavior change?

  18. Questions about… • Satisfaction with different aspects of the program • Speaker effectiveness • New knowledge or skills • A change in attitude • A “commitment to change” • New program ideas

  19. Question Types • Open-ended questions • Closed-ended questions: • Set of responses • Likert scale • Numeric responses • Categorical responses • What are the advantages and disadvantages of each kind of question?

  20. Data Analysis • How do you go about analyzing responses to different kinds of questions? • Comparison of means: Student’s t-test • Comparison of proportions • Categorical data analysis: chi-square • Non-parametric tests such as the sign test • Multivariate analyses * EpiCalc2000 – free program from the Web; URL in handout
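To make the listed tests concrete, here is a minimal Python sketch using scipy (one free alternative to a dedicated tool like EpiCalc2000); all the data are invented for illustration:

```python
import numpy as np
from scipy import stats

# Invented paired pre/post knowledge scores on a 0-10 scale
pre = np.array([4, 5, 3, 6, 5, 4, 7, 5])
post = np.array([6, 7, 5, 6, 8, 5, 9, 7])

# Comparison of means on paired data: Student's t-test (paired form)
t, p = stats.ttest_rel(post, pre)
print(f"paired t-test: t={t:.2f}, p={p:.4f}")

# Categorical data analysis: chi-square on a 2x2 table
# (e.g., practice adopted yes/no, by program attended yes/no)
table = np.array([[30, 10], [18, 22]])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi-square: chi2={chi2:.2f}, p={p:.4f}")

# Non-parametric sign test: binomial test on the direction of change
diffs = post - pre
improved = int((diffs > 0).sum())
changed = int((diffs != 0).sum())
result = stats.binomtest(improved, changed, p=0.5)
print(f"sign test: {improved}/{changed} improved, p={result.pvalue:.4f}")
```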

  21. Documenting the Impacts of Continuing Education Dale Moore and Hank Slotnick

  22. Do you sometimes feel that, despite… • the results of your needs assessments, and • your best intentions to create an appealing program… • those who need to attend the program don’t? • the minds of some who come aren’t open?

  23. Objectives • Provide the theoretical framework for documenting change • Demonstrate applications of the theory • Demonstrate a new method of representing change data

  24. Background on the Problem • Key meta-analytic findings on CME programs, consistent with adult learning theory: • Simple provision of information is often insufficient. • Active learning appears to be more effective. • Multiple exposures predispose toward success. • Most traditional CME does not change physician behavior. Why not?

  25. Behavioral Change Theories: Prochaska, Mezirow & Slotnick • Suggest that change occurs in stages • In each stage the learner has different needs • To become interested in the idea of change: • the learner’s mind must be opened • the learner must feel a need to attend to something: something has happened to unsettle them, or they have an immediate problem to solve

  26. Stages of Change • Unaware or unimpressed (precontemplation) • Interested, but… (contemplation) • Choices: decisions (preparation) • Action • Fine tuning & maintaining change (maintenance) Adapted from: Prochaska et al., 1992

  27. Mezirow: Theory of Perspective Transformation • Disorienting dilemma • Self-examination with feelings of dissonance or discomfort • Critical reassessment of assumptions • Recognition of discontent, and that others have changed • Exploration of options for new ways of seeing/behaving • Planning a course of action, gathering resources • Acquisition of knowledge and skills • Provisional trying of the new perspective/behavior • Building of competence and self-confidence in the new activity • Integration of the new perspective/behavior into one’s life

  28. Traditional CE Efforts • Potential for mismatch between the intervention and learner readiness • Over-emphasis on encouraging action • Asks learners to take the leap because we say so! • Presumes learners’ minds are open

  29. Documenting the Outcomes • A taxonomy of outcomes: • Awareness of problem (Aware) • Knowledge/skill changes and practice changes (Change) • Improved herd outcomes (Improve) Slotnick 2001

  30. Taxonomy of Outcomes • Taxonomy based on stages in learning episodes: • Evaluation: learner decides whether to learn the solution to the problem at hand • Learning: learner gains the required skills and knowledge • Gaining experience: learner becomes familiar with the new skills, knowledge & problems [Diagram pairing outcomes with stages: Aware–Evaluation, Change–Learn, Improve–Gain Experience]

  31. Documenting Change • Hypothesis: It should be possible to document changes in the prevalence of learners at different stages in response to an educational program.

  32. How to Document Outcomes • Present clinical/practical vignettes. • Ask respondents to indicate their ‘learning status’ for each vignette/problem. • Vignette example: Mary was a 58-year-old woman who had been in good health generally. She did not exercise regularly and was about 20 pounds over her ideal weight, but her diet was relatively low in fat, she did not smoke, and she had no history of diabetes. She had been taking estrogen and progesterone hormone replacement therapy for 6 years. Her recent blood lipid panel results were as follows: (blah blah blah). She was taking hydrochlorothiazide to treat hypertension, and her mother had experienced an MI at 60 years of age. Mary was seen at the emergency department for a prolonged episode of left shoulder and upper back pain which had begun after dinner that evening.

  33. Responses • Evaluation: “I’m comfortable that I can handle the problem in the vignette; all I might be interested in is information so I can decide when next to update.” • Learning: “I need to update myself in the skills and knowledge needed to address this problem.” • Gaining experience: “I’ve recently updated my skills and knowledge; all I might be interested in is hearing about others’ experiences with similar problems.”
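To make the tallying concrete, here is a minimal Python sketch (with invented responses) that computes the prevalence of each learning stage for one vignette; these three proportions are what the triangular graph later in the talk displays:

```python
from collections import Counter

STAGES = ("evaluation", "learning", "gaining_experience")

# Invented responses to a single vignette, one per learner
responses = ["learning", "evaluation", "learning", "gaining_experience",
             "learning", "evaluation", "learning", "learning"]

counts = Counter(responses)
n = len(responses)
prevalence = {stage: counts[stage] / n for stage in STAGES}
print(prevalence)
# -> {'evaluation': 0.25, 'learning': 0.625, 'gaining_experience': 0.125}
```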

  34. How to Document Outcomes • The AMEE (Slotnick) study • Goal: Introduce the idea of estimating the prevalence of people at different stages of learning in a lecture setting (90 minutes) • Measurement: Practice vignettes were examined pre- and post-lecture, some based on issues addressed in the lecture and some not • Analysis: Expected stage-to-stage movement for the first set, but not the second
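One plausible way to test for the expected stage-to-stage movement is a chi-square comparison of pre- and post-lecture stage counts, sketched below with invented counts; this is an illustrative assumption (it treats the two samples as independent), not necessarily the analysis the AMEE study used:

```python
from scipy.stats import chi2_contingency

# Invented counts of learners at each stage, pre vs. post lecture
# columns: evaluation, learning, gaining_experience
pre_counts = [12, 30, 8]
post_counts = [25, 12, 13]

chi2, p, dof, _ = chi2_contingency([pre_counts, post_counts])
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")  # dof = 2
```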

  35. The Triangular Graph • Each side of the triangle represents one of the three possible responses.

  36.–38. [Figures: triangular graphs with vertices labeled Evaluation, Learning, and Gaining Experience; 50% gridlines divide the triangle into regions where each response dominates (Eval Dominated, Learning Domin., Gain Exp.).]
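For readers who want to reproduce such a graph, one rough approach (an assumption, not the authors’ actual plotting method) is to map each (Evaluation, Learning, Gaining Experience) proportion triple to 2-D barycentric coordinates, as in this matplotlib sketch with invented pre/post points:

```python
import matplotlib.pyplot as plt

# Triangle vertices: Evaluation (top), Learning (bottom right),
# Gaining Experience (bottom left)
EVAL, LEARN, GAIN = (0.5, 3 ** 0.5 / 2), (1.0, 0.0), (0.0, 0.0)

def to_xy(e, l, g):
    """Map a proportion triple (e + l + g = 1) to 2-D coordinates."""
    return (e * EVAL[0] + l * LEARN[0] + g * GAIN[0],
            e * EVAL[1] + l * LEARN[1] + g * GAIN[1])

# Invented pre/post stage prevalences for one vignette
pre = to_xy(0.25, 0.625, 0.125)
post = to_xy(0.50, 0.25, 0.25)

fig, ax = plt.subplots()
ax.plot(*zip(EVAL, LEARN, GAIN, EVAL), color="black")  # triangle outline
ax.annotate("", xy=post, xytext=pre,
            arrowprops=dict(arrowstyle="->"))  # movement pre -> post
for xy, label in [(EVAL, "Evaluation"), (LEARN, "Learning"),
                  (GAIN, "Gaining Experience")]:
    ax.annotate(label, xy, ha="center")
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```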

  39. How to Document Outcomes • The AMEE study • 3 vignettes where no changes were expected, e.g.: “The task force chair needs a list of physicians who might be called on in an emergency. She asks you to locate such a list.” [Figure: responses plotted on the Evaluation–Learning–Gaining Experience triangle]

  40. How to Document Outcomes • The AMEE study • 4 vignettes where changes were expected, e.g.: “You and two others are to put together a pencil-and-paper survey questionnaire to determine what doctors know about preventing, diagnosing, and managing anthrax and smallpox.” [Figure: responses plotted on the Evaluation–Learning–Gaining Experience triangle]

  41. Dairy Veterinarian Study

  42. Dairy Informatics Vignettes • Developed 8 practice vignettes. An example: “A 500-cow dairy client is going to expand his operation to add 500 more cows. To help support his application for a loan, the bank has asked him for information on the productivity of the dairy herd, including milk production and herd replacements. The bank would like information for the last 3 years of the dairy’s operation.”

  43.–44. Moore’s Study [Figures: triangular graphs of dairy veterinarians’ responses, with vertices Evaluation, Learning, and Gaining Experience.]

  45. Conclusions • Evaluates group changes dependent on the stage of learning • Documents the changes • Provides a parsimonious graphical display of change • Statistical methods are currently in development

  46. Post-Test • Take out the Post-Test • Take 5 minutes to complete • Turn in to Pat

  47. What did we learn? • There are different levels of program evaluation • Clearly defined program objectives lead to easier evaluations • There are a number of mechanisms for program evaluation • Learning stage theory helps us evaluate WHERE the learner is with regard to making a change

  48. Thank you for your attention
