
Assessment, Feedback and Evaluation


Presentation Transcript


  1. Assessment, Feedback and Evaluation Vinod Patel & John Morrissey

  2. Learning outcomes By the end of this session you will be able to: • Define assessment, feedback and evaluation • Discuss how these are related and how they differ • Discuss the application of each in clinical education • Begin to apply them in practice

  3. Lesson Plan • Definitions • Assessment: theory & practice • Tea break • Feedback • Evaluation: theory & practice • Questions and close

  4. Definitions? • Assessment? • Feedback? • Evaluation?

  5. Assessment: definition “The processes and instruments applied to measure the learner’s achievements, normally after they have worked through a learning programme of one sort or another” Mohanna K et al (2004) Teaching Made Easy – a manual for health professionals

  6. Feedback: definition “Specific information about the comparison between a trainee’s observed performance and a standard, given with the intent to improve the trainee’s performance” Van de Ridder JM et al (2008) Med Educ 42(2): 189

  7. Evaluation: definition “A systematic approach to the collection, analysis and interpretation of information about any aspect of the conceptualisation, design, implementation and utility of educational programmes” Mohanna K et al (2004) Teaching Made Easy – a manual for health professionals

  8. Part 1 Assessment

  9. In this section: • Purposes of assessment • Miller’s pyramid • The utility function

  10. Why assess?

  11. Why assess? 1 of 2 • To inform students of strengths and weaknesses. • To ensure adequate progress has been made before students move to the next level. • To provide certification of a standard of performance.

  12. Why assess? 2 of 2 • To indicate to students which parts of the curriculum are considered important. • To select for a course or career. • To motivate students in their studies. • To measure the effectiveness of teaching and to identify weaknesses in the curriculum.

  13. Summative • Formative

  14. Clinical Education: Assessment Methods • Written assessments • Observed clinical practice • Others: • Vivas • Portfolios • …

  15. How a skill is acquired • Cognitive phase • Fixative phase • Practice • Feedback • Autonomous phase Fitts P & Posner M (1967) Human Performance

  16. Miller’s pyramid • Does • Shows how • Knows how • Knows Miller GE (1990) Acad Med (Suppl) 65: S63

  17. Miller’s pyramid mapped to assessment methods • Does: clinical work observed (ACAT, CbD, mini-CEX, OSLER) • Shows how: OSCE • Knows how: short answer/reasoning written exams • Knows: MCQ Miller GE (1990) Acad Med (Suppl) 65: S63

  18. Question: How can we tell whether these tests are any good or not? Answer: We do the maths.

  19. Utility function • U = Utility • R = Reliability • V = Validity • E = Educational impact • A = Acceptability • C = Cost • w = weight attached to each component U = wR·R × wV·V × wE·E × wA·A × wC·C Van der Vleuten CPM (1996) Advances in Health Sciences Education 1: 41-67.
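  The multiplicative form matters: a weak score on any one criterion drags down the whole product. A minimal Python sketch of the weighted model on the slide, with scores and weights invented purely for illustration:

```python
# Illustrative sketch of van der Vleuten's weighted multiplicative utility model.
# Scores (0-1) and weights below are made up; they are not from the source.

def utility(scores: dict, weights: dict) -> float:
    """Multiply each criterion score by its weight and take the product."""
    u = 1.0
    for criterion, score in scores.items():
        u *= weights[criterion] * score
    return u

scores  = {"reliability": 0.8, "validity": 0.7, "educational_impact": 0.9,
           "acceptability": 0.8, "cost": 0.6}
weights = {"reliability": 1.0, "validity": 1.0, "educational_impact": 0.8,
           "acceptability": 0.7, "cost": 0.5}

print(f"U = {utility(scores, weights):.3f}")
```

  Because the terms are multiplied rather than added, a score of zero on any single criterion drives U to zero: no amount of reliability can rescue an instrument that is unacceptable or unaffordable.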

  20. The Assessment Pentagram • Validity • Reliability • Acceptability • Feasibility • Educational Impact

  21. Validity & reliability • Validity: the extent to which the competence that the test claims to measure is actually being measured. • Reliability: the extent to which a test yields reproducible results. Schuwirth & van der Vleuten (2006) How to design a useful test: the principles of assessment

  22. Validity: another definition “The degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of inferences and actions based on test scores or other modes of assessment.” Messick (1994) Educational Researcher 23: 13

  23. Some causes of low validity • Vague or misleading instructions to candidates. • Inappropriate or overcomplicated wording. • Too few test items. • Insufficient time. • Inappropriate content. • Items too easy or too difficult. McAleer (2005) Choosing Assessment Instruments

  24. Some causes of low reliability • Inadequate sampling. • Lack of objectivity in scoring. • Environmental factors. • Processing errors. • Classification errors. • Generalisation errors. • Examiner bias. McAleer (2005) Choosing Assessment Instruments
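  Inadequate sampling is often the biggest culprit in practice: reliability rises as more items, cases or examiners are sampled. The Spearman-Brown prophecy formula gives a feel for the effect; the sketch below uses illustrative numbers only.

```python
def spearman_brown(reliability: float, length_factor: float) -> float:
    """Predicted reliability when a test is lengthened by `length_factor`
    with comparable items (Spearman-Brown prophecy formula)."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# A short paper with reliability 0.55, tripled in length with similar items:
print(round(spearman_brown(0.55, 3), 2))   # 0.79
```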

  25. Types of validity • Face • Predictive • Concurrent • Content • Construct

  26. “The examination fairly and accurately assessed my ability”

  27. “The examination fairly and accurately assessed the candidates’ ability”

  28. Problem: Appearances can be deceptive.

  29. Types of reliability • Test-retest • Equivalent forms • Split-half • Interrater and intrarater
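  Interrater reliability, for example, is commonly estimated with a chance-corrected agreement statistic such as Cohen’s kappa. A self-contained sketch follows; the examiner grades are invented for illustration.

```python
from collections import Counter

def cohens_kappa(rater_a: list, rater_b: list) -> float:
    """Chance-corrected agreement between two raters grading the same candidates."""
    n = len(rater_a)
    # Observed agreement: proportion of candidates given the same grade.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal grade frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(freq_a) | set(freq_b))
    return (p_o - p_e) / (1 - p_e)

# Two examiners grading the same ten OSCE stations (hypothetical data):
examiner_1 = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
examiner_2 = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass", "pass", "pass"]
print(f"kappa = {cohens_kappa(examiner_1, examiner_2):.2f}")  # kappa = 0.47
```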

  30. The Assessment Pentagram • Validity • Reliability • Acceptability • Feasibility • Educational Impact

  31. Utility function • U = Utility • R = Reliability • V = Validity • E = Educational impact • A = Acceptability • C = Cost • w = weight attached to each component U = wR·R × wV·V × wE·E × wA·A × wC·C Van der Vleuten CPM (1996) Advances in Health Sciences Education 1: 41-67.

  32. Miller’s pyramid • Does • Shows how • Knows how • Knows Miller GE (1990) Acad Med (Suppl) 65: S63

  33. Miller’s pyramid mapped to assessment methods • Does: clinical work observed (ACAT, CbD, mini-CEX, OSLER) • Shows how: OSCE • Knows how: short answer/reasoning written exams • Knows: MCQ Miller GE (1990) Acad Med (Suppl) 65: S63

  34. FY Workplace Assessment • Mini-CEX (from USA): mini Clinical Evaluation Exercise • DOPS (developed by RCP): Direct Observation of Procedural Skills • CBD (based on GMC performance procedures): Case-based Discussion • MSF (from industry): Multi-Source Feedback Carr (2006) Postgrad Med J 82: 576

  35. Practical Exercise

  36. Educational interventions • How will we assess? • How will we give feedback? • How will we evaluate?

  37. Educational interventions • Communication skills for cancer specialists • 2nd year medical speciality training • Medical humanities SSM for medical students • Masters-level pharmacology module • Procedural skills for medical students • Clinical Officers: ETATMBA

  38. Communication skills

  39. Speciality training

  40. Medical humanities SSM

  41. M-level pharmacology module

  42. Procedural skills

  43. Clinical Officers

  44. The ideal assessment instrument: • Totally valid. • Perfectly reliable. • Entirely feasible. • Wholly acceptable. • Huge educational impact.

  45. Part 2 Feedback

  46. Feedback: definition “Specific information about the comparison between a trainee’s observed performance and a standard, given with the intent to improve the trainee’s performance” Van de Ridder JM et al (2008) Med Educ 42(2): 189
