Standard setting for clinical assessments

Presentation Transcript


  1. The Third International Conference on Medical Education in the Sudan Standard setting for clinical assessments Katharine Boursicot, BSc, MBBS, MRCOG, MAHPE Reader in Medical Education Deputy Head of the Centre for Medical and Healthcare Education St George’s, University of London

  2. WHAT are we testing in clinical assessments? • Clinical competence • What is it?

  3. A popular modern model: elements of competence • Knowledge • factual • applied: clinical reasoning • Skills • communication • clinical • Attitudes • professional behaviour Tomorrow’s Doctors, GMC 2003

  4. Another popular medical model of competence • Miller’s pyramid, from base to apex: Knows (cognition ~ knowledge) → Knows how → Shows how → Does (behaviour ~ skills/attitudes) • Professional authenticity increases towards the top of the pyramid Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.

  5. Assessment of competence • A review of developments over the last 40 years

  6. (Pyramid level: Knows) • 1960: the National Board of Medical Examiners in the USA introduced the MCQ • MCQs conquered the world • Dissatisfaction due to the limitations of MCQs

  7. (Pyramid level: Knows how) • 1965: introduction of the PMP – Patient Management Problem

  8. [Diagram: a Patient Management Problem – a clinical scenario branching into many possible actions]

  9. (Pyramid level: Knows how) • 1965: introduction of the PMP – Patient Management Problem • Well-constructed SBA-format MCQs can test the application of knowledge very effectively

  10. (Pyramid level: Shows how) • 1975: introduction of the Objective Structured Clinical Examination (OSCE) • OSCEs are conquering the world

  11. (Pyramid level: Does) • >2000: emerging new methods • WBAs – Workplace-Based Assessments • Mini Clinical Evaluation Exercise (mini-CEX) • Direct Observation of Procedural Skills (DOPS) • OSATS • Masked standardised patients • Video assessment • Patient reports • Peer reports • Clinical work samples • ………

  12. Mini-CEX (Norcini, 1995) • Short observation (15-20 minutes) and evaluation of clinical performance in practice, using generic evaluation forms completed by different examiners (cf. http://www.abim.org/minicex/)

  13. Example of mini-CEX form

  14. DOPS – Direct Observation of Procedural Skills

  15. OSATS – Objective Structured Assessment of Technical Skills

  16. WBAs – Workplace-Based Assessments • All based on the principle of an assessor observing a student/trainee in a workplace or practice setting

  17. The past 40 years: climbing the pyramid • Does – performance assessment in vivo: mini-CEX, DOPS, OSATS, ….. • Shows how – performance assessment in vitro: OSCEs • Knows how – (clinical) context-based tests: SBA, EMQ, MEQ….. • Knows – factual tests: SBA-type MCQs…..

  18. Standard setting – why bother? • To assure standards • At graduation from medical school • For licensing • For a postgraduate (membership) degree • For progression from one grade to the next • For recertification

  19. At graduation from medical school • To award a medical degree to students who meet the University’s standards (University interest) • To distinguish between the competent and the insufficiently competent (Public interest) • To certify that graduates are suitable for provisional registration (Regulatory/licensing body interest) • To ensure graduates are fit to undertake F1 posts (employer interest)

  20. Definition of Standards • A standard is a statement about whether an examination performance is good enough for a particular purpose • a particular score that serves as the boundary between passing and failing • the numerical answer to the question “How much is enough?”

  21. Standard setting • All methods described in the literature are based on ways of translating expert (clinical) judgement into a score

  22. ‘Classical’ standard setting methods • For written test items: • Angoff’s method • Ebel’s method • For OSCEs: • Borderline group method • Regression-based method
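The slide names Angoff’s method without describing it; as a concrete illustration, the following is a minimal sketch in Python with hypothetical numbers. In the classic Angoff procedure, each judge estimates, item by item, the probability that a borderline candidate would answer correctly, and the cut score is the mean across judges of each judge’s summed estimates.

```python
# Minimal sketch of Angoff's method for a written test (hypothetical data).
# Rows are judges, columns are test items; each value is a judge's estimate
# of the probability that a *borderline* candidate answers the item correctly.
judged_probabilities = [
    [0.60, 0.45, 0.80, 0.70, 0.55],
    [0.55, 0.50, 0.75, 0.65, 0.60],
    [0.65, 0.40, 0.85, 0.75, 0.50],
]

# Each judge's expected total score for a borderline candidate.
judge_totals = [sum(row) for row in judged_probabilities]

# The cut score is the mean of the judges' totals.
cut_score = sum(judge_totals) / len(judge_totals)

print(f"Angoff cut score: {cut_score:.2f} out of {len(judged_probabilities[0])}")
```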

  23. Performance-based standard setting methods • Borderline group method • Contrasting groups method • Regression-based standard method Kramer A, Muijtjens A, Jansen K, Düsman H, Tan L, van der Vleuten C. Comparison of a rational and an empirical standard setting procedure for an OSCE. Med Educ 2003; 37 (2): 132. Kaufman DM, Mann KV, Muijtjens AMM, van der Vleuten CPM. A comparison of standard-setting procedures for an OSCE in undergraduate medical education. Acad Med 2000; 75: 267-271.

  24. The examiner’s role in standard setting • Uses the examiner’s clinical expertise to judge the candidate’s performance • The examiner allocates a global judgement based on the candidate’s performance at that station: Pass / Borderline / Fail • Remember the level of the examination

  25. Borderline group method [Diagram: an example station checklist (items 1-7, with a total score) and a Pass / Fail / Borderline global rating; the test score distribution with the borderline candidates’ score distribution highlighted; the passing score marked on the distribution]
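To make the borderline group method concrete, here is a minimal sketch with hypothetical data: the passing score is taken as the mean checklist score of the candidates whose examiners gave a borderline global rating.

```python
# Minimal sketch of the borderline group method (hypothetical data).
from statistics import mean

# (checklist_score, global_rating) for each candidate at one station.
results = [
    (18, "pass"), (12, "borderline"), (9, "fail"),
    (14, "borderline"), (20, "pass"), (11, "borderline"),
]

# The passing score is the mean checklist score of the borderline group.
borderline_scores = [score for score, rating in results if rating == "borderline"]
passing_score = mean(borderline_scores)

print(f"Passing score (mean of the borderline group): {passing_score:.1f}")
```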

  26. Contrasting groups method [Diagram: an example station checklist with Pass / Borderline / Fail global ratings; the score distributions of the failing and passing groups plotted as two overlapping curves, with the passing score marked where the distributions overlap]
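A minimal sketch of the contrasting groups method, again with hypothetical data. The cut point lies where the passing and failing groups’ score distributions intersect; this sketch approximates that as the score minimising the number of misclassified candidates.

```python
# Minimal sketch of the contrasting groups method (hypothetical data).

# (checklist_score, global_judgement) for each candidate.
results = [
    (8, "fail"), (10, "fail"), (11, "fail"), (12, "fail"),
    (13, "pass"), (14, "pass"), (16, "pass"), (19, "pass"),
]

def misclassified(cut):
    """Failing candidates at or above the cut, plus passing candidates below it."""
    return sum(1 for score, judgement in results
               if (judgement == "fail" and score >= cut)
               or (judgement == "pass" and score < cut))

# Choose the candidate score that minimises misclassification.
passing_score = min(sorted({score for score, _ in results}), key=misclassified)

print(f"Passing score (minimal misclassification): {passing_score}")
```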

  27. Regression-based standard [Diagram: checklist scores plotted against the overall rating (1 = Clear fail, 2 = Borderline, 3 = Clear pass, 4 = Excellent, 5 = Outstanding), with a regression line through the points; X marks the passing score, the checklist score corresponding to the borderline rating]
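A minimal sketch of the regression-based method, with hypothetical data: checklist scores are regressed on the examiners’ overall ratings, and the passing score is read off the regression line at the borderline rating (2).

```python
# Minimal sketch of the regression-based standard (hypothetical data).
import numpy as np

ratings = np.array([1, 2, 2, 3, 3, 4, 4, 5])            # overall rating per candidate
checklist = np.array([7, 10, 11, 14, 15, 17, 18, 20])   # matching checklist scores

# Fit a least-squares line: checklist score as a function of overall rating.
slope, intercept = np.polyfit(ratings, checklist, 1)

# The passing score is the predicted checklist score at "borderline" (rating 2).
BORDERLINE = 2
passing_score = slope * BORDERLINE + intercept

print(f"Passing score at the borderline rating: {passing_score:.1f}")
```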

  28. Workplace-Based Assessment tools • No gold-standard standard-setting method!

  29. Standard setting • Standards are based on informed judgements about examinees’ performances against a social or educational construct, e.g. • competent practitioner • suitable level of specialist knowledge/skills

  30. Standard setting for Workplace-Based Assessment tools • Based on descriptors for a particular level of training • Information gathering relying on descriptive and qualitative judgemental information • Descriptors agreed by consensus/panel of clinical experts • Purpose of WBA tools: formative rather than summative: feedback

  31. Feedback • Giving feedback to enhance learning involves some form of judgement by the feedback giver on the knowledge and performance of the recipient • It is a very powerful tool!

  32. WBAs and feedback • Underlying principle of WBA tools is FEEDBACK from • Teacher/supervisor • Peers/team members • Other professionals • Patients

  33. Conclusions • It’s not easy to set standards for Workplace-Based Assessments (in the ‘classic’ sense) • Expert professional judgement is required • Wide sampling from different sources: range of tools, contexts, cases and assessors • Feedback to the trainee

  34. Thank you
