
Competency and Performance: What Should We Measure and How

Golden Jubilee, College of Physicians and Surgeons of Pakistan, Karachi, Pakistan, 10th November 2012. Zubair Amin, Assoc Professor, Dept of Pediatrics; Senior Consultant, Dept of Neonatology; National University of Singapore, National University Hospital. paeza@nus.edu.sg


Presentation Transcript


  1. Golden Jubilee, College of Physicians and Surgeons of Pakistan, Karachi, Pakistan, 10th November 2012. Zubair Amin, Assoc Professor, Dept of Pediatrics; Senior Consultant, Dept of Neonatology; National University of Singapore, National University Hospital. paeza@nus.edu.sg. Competency and Performance: What Should We Measure and How

  2. Agenda of the Talk • Is there a difference between competency and performance? • Why do we need to assess competency and performance? • How should we assess performance in our own context? What models and methods are available for performance measurement?

  3. Is there a difference between competency and performance?

  4. Areas of Confusion • To perform: “To portray a role or demonstrate a skill before an audience” (Nelson’s Canadian Dictionary. Toronto: Nelson 1997; cited in Hodges B; Medical Education. 2003) • “Performance-based assessment” • Clinical examinations: Short cases, long cases • OSCE

  5. Areas of Confusion [Figure: Miller's pyramid of clinical assessment] Miller G. The assessment of clinical skills/competence/performance. Acad Med. 1990.

  6. Areas of Confusion [Figure: Miller's pyramid, continued] Miller G. The assessment of clinical skills/competence/performance. Acad Med. 1990.

  7. Competency versus Performance • Competency: • What doctors “can do” in controlled representations of professional practice • What a doctor is capable of doing • Performance: • What doctors “do” in actual professional practice • Assessment of day-to-day practice undertaken in the working environment. Rethans et al. Medical Education. 2002

  8. [Figure] Adapted from Lambert Schuwirth

  9. Challenges for Performance Assessment • Isolating outcomes that are directly attributable to a doctor • In complex healthcare delivery, patient outcomes are not solely attributable to an individual • Variability in the complexity of patients • Case complexity varies; more competent doctors tend to see more complex patients with multiple co-morbidities • “Lake Wobegon Effect” • “Where all the women are strong, all the men are good looking, and all the children are above average.” • Conventional global ratings by supervisors are hopelessly unreliable (Streiner 1995; Gray 1996) • No universally accepted norms for standards of care • Numbers are approximations and guesswork

  10. Why do we need to assess competency and performance? Given that performance assessment is more challenging, does it add any value beyond the assessment of competency?

  11. Hidden Incompetency Hodges. Medical Teacher, 2006

  12. Need for Competency and Performance Measurement • Differences exist between what doctors can do in high-stakes controlled environments and what they actually practice in real-life working environments. • Reported correlations between competency (testing conditions) and performance (real practice) range from negative to moderate to high. • (Rethans J-J et al. Med Ed. 2002) • Performance assessment can have a unique developmental role.

  13. Relationship between Competency and Performance • Set-up: family medicine practice • Four representative cases: tension headache, acute diarrhea, shoulder pain, and NIDDM • Standard domains: history, physical examination, guidance and advice, treatment, and return visit • Actions scored as obligatory, intermediate, or superfluous. Rethans J-J, et al. BMJ. 1991;303:1377-80.

  14. Rethans J-J, et al. BMJ. 1991;303:1377-80.

  15. We are more thorough and express more compassion under testing conditions • We play by the rules! • We spend less time with patients in actual practice than we do in tests • We take shortcuts! • We are more efficient in real practice • We use experience to guide clinical decision-making!

  16. Assessment of Clinical Competency Using OSCE • Participants: Students, Residents, and Physicians • Naturalistic instruction: “Do what you normally do. This is not an examination. We are trying to understand how you function under normal conditions.” • OSCE instruction: “This is an OSCE station in which your questions and interactions are recorded in detail on a checklist.” • Hodges et al. OSCE checklists do not capture increasing levels of expertise. Acad Med. 1999

  17. Assessment of Clinical Competency in OSCE with Checklist Hodges et al. Acad Med. 1999

  18. Assessment of Clinical Competency in OSCE with Global Scores

  19. Students are more thorough during their examination than expert clinicians • Many of us would fail the clinical examinations that we create for our students! • On ‘subjective’ global ratings, expert clinicians do better than students • Checklists do not capture many attributes that we associate with clinical expertise!

  20. Developmental Aspects of Performance Assessment • Typical examination set-ups rarely provide opportunities for feedback and improvement • Practice-based performance assessments provide an unparalleled opportunity for direct observation and feedback

  21. Lack of Observation and Feedback • 82% of residents were engaged in only one directly observed clinical encounter in their first year of training. (Dey et al, 1990) • 80% of PG trainees never or only infrequently received feedback based on directly observed performance. (Isaacson et al, 1995)

  22. Efficacy of Observation and Feedback • 500 meta-analyses, 1,800 studies, and 25 million students on classroom education • The effect size of classroom education as an intervention is 0.40 • 12 meta-analyses of the effect of feedback • The effect size is 0.79 • Feedback is one of the four most powerful influences on achievement • Hattie et al 1999, quoted in Norcini and Burch 2007
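For context, the effect sizes quoted above are standardized mean differences (Cohen's d). A minimal sketch of the usual two-group definition, assuming group means \(\bar{X}_1\) and \(\bar{X}_2\) with sample sizes \(n_1, n_2\) and standard deviations \(s_1, s_2\) (the 0.40 and 0.79 figures are meta-analytic averages of many such values):

\[
d = \frac{\bar{X}_1 - \bar{X}_2}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
\]

Read this way, an effect size of 0.79 means the average learner who receives feedback ends up roughly 0.8 standard deviations above the average of the comparison group, about double the 0.40 benchmark quoted for a typical classroom intervention.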

  23. Developmental Aspect of Performance Based Assessment • “Lack of assessment and feedback, based on observation of performance in the workplace, is one of the most serious deficiencies in current medical education practice.” • Norcini and Burch, Workplace Based Assessment as an Educational Tool. AMEE Guide 31. 2007. • “It is useful to consider feedback as part of an ongoing programme of assessment and instruction rather than a separate educational entity.” • Norcini and Burch, Workplace Based Assessment as an Educational Tool. AMEE Guide 31. 2007.

  24. Competency and Performance Assessment

  25. How should we assess performance in our own context? What is a practical way of assessing performance?

  26. • Screening program (all) with moderate rigor • Good performers (majority) with minimal rigor • Poor performers (minority) with maximum rigor. Rethans et al. Med Ed. 2002

  27. • Screening program (all): comprehensive, periodic review; the most feasible performance and competency assessments • Good performers (majority): choice, reflection, continuous quality improvement • Poor performers (minority): diagnostic tests, remediation, rehabilitation. Rethans et al. Med Ed. 2002

  28. • Methods of data collection: clinical records, administrative data, diaries/logs/reflections, observations • Domains assessed: outcomes of care, process of care, practice volume. Norcini JJ. Work-based Assessment. In: ABC of Teaching and Learning. BMJ. 2003.

  29. Outcomes of Care: Examples • Mortality and morbidity • Critical complications rate • Disease outcomes

  30. Process of Care: Examples • Adherence to or deviance from protocols • Screening rate • Preventive care provided by the doctor

  31. Practice Volume • Nature of the patients seen • Nature of the procedures performed • Profile of the patients. Quality of care is directly related to the number of patients seen.

  32. Tools Available for Performance Assessment • Naturally occurring data sets: patient outcomes, profile of practice, administrative tools • Observation of clinical activities: mini-clinical evaluation exercise (mini-CEX) and direct observation of procedural skills (DOPS) • Observations and feedback from peers and patients: 360-degree assessment, multi-source feedback (MSF) • Direct interactions with patients: incognito standardized patients

  33. [Figure] Data sources (clinical records, administrative data, diaries/logs, observations) capture outcomes of care, process of care, and practice volume; together these build a portfolio & practice profile that is interpreted through experts' judgments.

  34. General Principles in Performance Assessment • Have a higher tolerance for more subjective, expert judgments • Always take into account the unique contextual variables of the practice • Focus on holistic profiling rather than on individual instruments • Emphasize the feedback and developmental aspects of performance assessment

  35. “I have suffered on several occasions from inefficient or ungentlemanly residents foisted upon me by the competitive examination plan and I would here enter my warmest protest against it.” (William Osler)

  36. Thank You!
