
Program Evaluation Session 1

Presentation Transcript


  1. Program Evaluation, Session 1. Karen V. Mann, PhD, Professor, Division of Medical Education, Dalhousie University. Harvard Macy Program for Health Professions Educators, January 20, 2009

  2. Objectives • To review elements of program evaluation • To consider frameworks for planning and conducting program evaluation • To apply them to individual scholars’ projects

  3. Evaluation and Assessment as Part of a System: a cycle linking Program Development, Program Implementation, Program Evaluation, and Reflection and Analysis

  4. Approaches to Evaluation • Student-oriented approaches • Program-oriented approaches • Institution-oriented approaches • Stakeholder-oriented approaches Wilkes & Bligh, 1999.

  5. What Are We Evaluating? • The curriculum on paper (planned or formal) • The curriculum in action (taught) • The curriculum as students experience it (learned) Fowell et al., 1999

  6. Approaches to Evaluation • Kirkpatrick levels of outcomes • Objectives model • Program Logic model • Experimental evaluation
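
As an aside not drawn from the slides themselves, the program logic model mentioned above is usually sketched as a chain from inputs to activities to outputs to outcomes. Below is a minimal, hypothetical sketch of that chain as a simple data structure; all of the entries are invented for illustration.

```python
# Minimal, hypothetical sketch of a program logic model:
# a chain from inputs through activities and outputs to outcomes.
# All entries below are invented for illustration only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class LogicModel:
    inputs: List[str] = field(default_factory=list)      # resources invested
    activities: List[str] = field(default_factory=list)  # what the program does
    outputs: List[str] = field(default_factory=list)     # direct, countable products
    outcomes: List[str] = field(default_factory=list)    # short- and long-term changes


teaching_workshop = LogicModel(
    inputs=["faculty time", "teaching space", "funding"],
    activities=["monthly resident-as-teacher workshops"],
    outputs=["number of residents trained", "sessions delivered"],
    outcomes=["improved teaching evaluations", "better learner performance"],
)
print(teaching_workshop)
```

Laying a program out this way makes explicit which links in the chain an evaluation is actually testing.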

  7. Kirkpatrick’s Levels of Outcomes. Kirkpatrick’s hierarchy of levels of evaluation; the complexity of behavioural change increases as evaluation of an intervention ascends the hierarchy. • Evaluation of results (transfer or impact on society) • Evaluation of behaviour (transfer of learning to the workplace) • Evaluation of learning (knowledge or skills acquired) • Evaluation of reaction (satisfaction or happiness) Kirkpatrick, 1967. In Hutchison, 1999.

  8. Kirkpatrick’s Levels of Outcomes (Adapted) • 4b. Benefits to learners/patients • 4a. Change in organizational practice • 3. Behavioural change • 2a. Acquisition of knowledge, skills • 2b. Modification of attitudes, perceptions • 1. Evaluation of reaction. From A Critical Review of Evaluations of Health Professions Education (Freeth, Barr et al., 2002)
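
Purely as an illustration, and not part of the original presentation, the adapted hierarchy above can be written as a simple lookup so that a project's planned outcome measures can be grouped by the level they address. The level labels follow the slide; the example measures are hypothetical.

```python
# Illustrative sketch only: group a project's planned outcome measures by the
# adapted Kirkpatrick level they address. Level labels follow the slide above;
# the example measures are hypothetical.
KIRKPATRICK_LEVELS = {
    "1":  "Evaluation of reaction",
    "2a": "Acquisition of knowledge, skills",
    "2b": "Modification of attitudes, perceptions",
    "3":  "Behavioural change",
    "4a": "Change in organizational practice",
    "4b": "Benefits to learners/patients",
}


def group_by_level(measures):
    """measures: iterable of (level, description) pairs -> dict keyed by level."""
    grouped = {level: [] for level in KIRKPATRICK_LEVELS}
    for level, description in measures:
        grouped[level].append(description)
    return grouped


planned_measures = [
    ("1", "End-of-session satisfaction survey"),
    ("2a", "Pre/post knowledge test"),
    ("3", "Chart audit of documented practice change at 6 months"),
]
for level, items in group_by_level(planned_measures).items():
    print(f"{level} ({KIRKPATRICK_LEVELS[level]}): {items}")
```

A listing like this makes it easy to see which levels a project's evaluation plan covers and which are untouched, which is the point of the "Application to your project" exercises later in the session.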

  9. Key Questions in Program Evaluation • Why is the evaluation needed? • What is the focus of the evaluation? • Who will be affected by the results? • Who should receive the results? • How will the evaluation results be used?

  10. Application to Your Project. With a partner, briefly outline and discuss your project in relation to the above questions.

  11. Formative Evaluation Focus on improvement • Information-sharing • Mid-course corrections • Ongoing monitoring • Program activities • Short-term outputs

  12. Summative Evaluation Focus on demonstrating results • Intermediate/long-term outcomes • Demonstrating value

  13. Program objectives • Before identifying objectives, determine the broad focus of the evaluation: 1. Is it learning? - individual and/or group? 2. Is it behaviour change? – individual, group, and/or organization? 3. Is it processes of teaching/ learning/ change? - at individual, group, organization levels?

  14. Aspects of Evaluation • Process evaluation measures • Outcome evaluation measures • Methods of evaluation

  15. Application to your project • What outcomes do you wish to evaluate? • Where do they fit at each of the Kirkpatrick levels? • Are there additional outcomes you can/ should evaluate?

  16. Application to Your Project. With a partner, discuss your own project regarding: • Methods of evaluation/data collection approaches (using the attached table) • Timing of measures

  17. Application to Your Project What are the factors that will influence the evaluation you conduct?

  18. Factors Influencing Evaluation • Purpose of the evaluation • Priorities for information • Resources required • Limitations of information • Preparation required • Potential barriers • Potential enabling factors

  19. Ethical Considerations in Evaluation

  20. Principles of Evaluation • Comprehensive and ongoing evaluation is desirable • Users of evaluation information must be identified • Communication mechanisms must be built in • Feedback loops must be identified • Structure must be present to use information, to address problems • All aspects of the program should be systematically evaluated

  21. Contemporary Views Regarding Program Evaluation • Many different research strategies may be employed • The information gained should be useful to decision-makers • Stakeholder involvement increases the likelihood that findings will be used

  22. Contemporary Views Regarding Program Evaluation (cont’d) • Questions should inform program planning and activities • Programs are not static • Evaluation is important to program accountability • Evaluation should assist in understanding how a program produces the effects seen

  23. References • Bennett J (2003). Evaluation methods in research. Continuum Research Methods Series. London, UK: Continuum. • Bond SL, Boyd SE, Montgomery DL (1997). Taking stock: a practical guide to evaluating your own programs. Chapel Hill, NC: Horizon Research, Inc. • Creswell J (2002). Research design: qualitative, quantitative and mixed methods approaches. Thousand Oaks, CA: Sage. • Des Marchais JE, Bordage G (1998). Sustaining curricular change at Sherbrooke through external, formative program evaluations. Acad Med 73:494-503. • Fowell SL, Southgate LJ, Bligh JG (1999). Evaluating assessment: the missing link? Med Educ 33:276-281.

  24. References (cont’d.) • Freeth D, Hammick M, Koppel I, Reeves S, Barr H (2002). A critical review of evaluations of interprofessional education. The Interprofessional Education Joint Evaluation Team. London: Learning and Teaching Support Network for Health Sciences and Practice. • Henry GT (2002). Choosing criteria to judge program success. Evaluation 8(2):182-204. • Kern DE, Thomas PA, Howard DM, Bass EB (1998). Curriculum development in medical education: a six-step approach. Step 6: evaluation and feedback. Baltimore, MD: Johns Hopkins University Press, pp. 70-98. • Kirkpatrick DL (1994). Evaluating training programs: the four levels. San Francisco, CA: Berrett-Koehler.

  25. References (cont’d.) • Knox AB (2002). Evaluation for continuing education: a comprehensive guide to success. San Francisco, CA: John Wiley and Sons. • Musick DW (2006). A conceptual model for program evaluation in graduate medical education. Acad Med 81(8):759-765. • Norman G, Keane D, Oppenheimer L (2008). Compliance of medical students with voluntary use of personal data assistants for clerkship assessments. Teach Learn Med 20(4):295-301. • Parlett M, Hamilton D (1976). Evaluation as illumination: a new approach to the study of innovative programs. In Glass GV (ed.), Evaluation studies review annual, vol. 1. Beverly Hills, CA: Sage.

  26. References (cont’d.) • Reissdorff EJ, Hayes OW, Carlson DJ, Walter GL (2001). Assessing the new general competencies for residency education: a model from an emergency medicine program. Acad Med 76(7):753-757. • Stake R (1986). Evaluating educational programs. In Hopkins D (ed.), Inservice training and educational development. London: Croom Helm. • Stufflebeam D (2000). The CIPP model for evaluation (Chapter 16). In Stufflebeam D, Madaus GF, Kellaghan T (eds.), Evaluation models, 2nd ed. Boston: Kluwer Academic. • Weiss C (1998). Evaluation, 2nd ed. Upper Saddle River, NJ: Prentice Hall. • Wilkes M, Bligh J (1999). Evaluating educational interventions. BMJ 318:1269-1272.

  27. References: Web-Based Resources • Program Development and Evaluation, University of Wisconsin-Extension. This website offers a range of downloadable documents addressing many aspects of program evaluation. http://www.uwex.edu/ces/pdande/evaluation/evaldocs.html

  28. References: Web-Based Resources (cont’d.) • W.K. Kellogg Foundation • Logic Model Development Guide • Evaluation Guide http://www.wkkf.org

  29. Examples of Program Evaluation • Alford DP, Bridden C, Jackson AH, Saitz R, Amodeo M, Barnes HN, Samet JH (2008). Promoting substance use education among generalist physicians: an evaluation of the Chief Resident Immersion Training (CRIT) Program. J Gen Intern Med, Oct 21 (Epub ahead of print). • Antepohl W, Domeij E, Forsberg P, Ludvigsson J (2003). A follow-up of medical graduates of a problem-based learning curriculum. Med Educ 37(2):155-162. • Frattatelli LC, Kasuya R (2003). Implementation and evaluation of a training program to improve resident teaching skills. Am J Obstet Gynecol 189(3):670-673. • Gaba ND, Blatt B, Macri CJ, Greenburg L (2007). Improving teaching skills in obstetrics and gynecology residents: evaluation of a residents-as-teachers program. Am J Obstet Gynecol 196(1):e1-7. • Kumagai AK, White CB, Ross PT, Perlman RL, Fantone JC (2008). The impact of facilitation of small-group discussions of psychosocial topics in medicine on faculty growth and development. Acad Med 83(10):976-981.
