
Assessment – Why and How?


Presentation Transcript


  1. Assessment – Why and How? Stephanie B. Jones, MD Associate Professor, Harvard Medical School Vice Chair for Education Department of Anesthesia, Critical Care and Pain Medicine Beth Israel Deaconess Medical Center

  2. Problems • No one teaches us how to assess • We judge as we were judged • Ideal vs reality • Keeping up with a moving target (ACGME) • Faculty time and energy • We work in a time-based system • Should residency be truly competency-based?

  3. Assessment – Why and How? • Definitions • Including the difference between feedback and assessment • Tools • Global evaluations, checklists, 360°, portfolios • Limitations and Questions

  4. Assessment – a definition “…process of collecting, synthesizing, and interpreting information to aid decision-making.” Decisions may include: • successful completion of a rotation • promotion • remediation Airasian PW. Classroom Assessment, 3rd ed. 1997

  5. Feedback vs Assessment • Feedback → formative evaluation • Provide information for improvement • Directly from the source • Assessment → summative evaluation • How well a goal has been met • Judgment after the fact • Reality → overlap Ende J. JAMA 1983;250:777-781

  6. Feedback and Assessment • Can use the same tools for both • Allows the learner to practice and know the goals • End result shouldn’t be a surprise Duffy et al. Acad Med 2004;79:495-507 Van der Vleuten & Schuwirth. Med Educ 2005;39:309-317

  7. Feedback and Assessment • Fundamentals of Laparoscopic Surgery (FLS) • MCQ exam • 5 skill stations • BIDMC: PGY-4 surgery residents must pass to advance to the PGY-5 year

  8. Barriers to feedback • Faculty aren’t taught how to give feedback effectively • Rosenblatt & Schartel 1999: only 20% of programs offered formal training • Training needs reinforcement • Residents often not directly observed • Impact on faculty evaluations • Time and money

  9. Assessment • ACGME • Core competencies • Patient care • Medical knowledge • Practice-based learning and improvement • Interpersonal and communication skills • Professionalism • Systems-based practice • ABA • Certifies that the graduate has “demonstrated sufficient professional ability to practice competently and independently in the field of Anesthesiology”

  10. Competency “…the ability to handle a complex professional task by integrating the relevant cognitive, psychomotor, and affective skills.” Van der Vleuten & Schuwirth. Med Educ 2005;39:309-317

  11. An analogy • “Diagnosing” whether a resident should be promoted, graduated, etc. • Just as with a complex patient, you may need multiple tests and opinions to make the “diagnosis” Joyce B. ACGME

  12. What should happen • Design an assessment system • Collection of assessment tools • Decide who does evaluations • Decide what will be evaluated • Evaluation schedule

  13. What does happen • Continue to use existing system • Open ACGME toolbox • New tools • More data • Are the tools reliable and valid? • Do we ever use the extra data? http://www.acgme.org/Outcome/assess/Toolbox.pdf

  14. Definitions • Validity – Does the assessment measure what it intends to measure? • Reliability – Are scores from the assessment reproducible (consistent)? (See the example below.)
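A standard way to quantify reliability (an illustration, not from the slides) is an inter-rater agreement statistic such as Cohen's kappa for two raters scoring the same residents:

    \kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement between the raters and p_e is the agreement expected by chance alone. A kappa near 1 means scores are highly reproducible; a kappa near 0 means agreement is no better than chance.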

  15. Choosing assessment tools • Valid data • Reliable data • Feasible • External validity • Provide valuable information • Some compromise will be involved Lynch and Swing, ACGME www.acgme.org/outcome

  16. The assessment system • Consistent with program objectives • Objectives are representative • You can’t assess everything • Multiple tools • Multiple observations • Looking for patterns • Doesn’t mean you need to add more questions Lynch and Swing, ACGME www.acgme.org/outcome

  17. The assessment system • Multiple observers • Improves reliability • Assessed according to pre-specified criteria • Goals and objectives • Faculty training • Fair Lynch and Swing, ACGME www.acgme.org/outcome
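The slide's claim that multiple observers improve reliability can be made concrete with the Spearman-Brown prophecy formula (a standard psychometric result, not cited on the slide): if a single rater's scores have reliability \rho, the average of n independent raters has reliability

    \rho_n = \frac{n \rho}{1 + (n - 1)\rho}

For example, averaging four raters whose individual reliability is 0.5 gives \rho_4 = (4)(0.5) / (1 + 3 \cdot 0.5) = 0.8.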

  18. Summing up… “A good assessment programme will incorporate several competency elements and multiple sources of information to evaluate those competencies on multiple occasions using credible standards.” Van der Vleuten & Schuwirth. Med Educ 2005;39:309-317

  19. Some challenges • Setting “passing” criteria for qualitative information • Mixed messages • Differentiating performance in a testing situation versus performing with real patients Holmboe ES. Acad Med 2004;79:16-22

  20. Global evaluations • The “old” standard • Usual end-of-rotation evaluation • More useful with behavior-based descriptions or anchors. Example anchor – Professionalism and Honesty: Residents must demonstrate a commitment to carrying out professional responsibilities and adherence to ethical principles; demonstrate respect, compassion, integrity, and responsiveness to the needs of patients; and demonstrate a commitment to confidentiality of patient information, informed consent, and departmental policies and guidelines. Rating scale: Unsatisfactory, Below Expectations, Good, Above Expectations, Excellent, N/A

  21. Global evaluations • Useful in the context of summative assessment • Williams et al, SIU – 3-item global evaluation • Clinical performance • Professional behavior • Overall performance in comparison to peers Williams et al. Surgery 2005;137:141-7

  22. Global evaluations • Can be used for more specific feedback/assessment • Doyle et al, British Columbia • Technical skills in OR, GRITS • Vassiliou et al, McGill • Assessment of laparoscopic skills, GOALS Doyle et al. Am J Surg 2007; 193:551-5 Vassiliou et al. Am J Surg 2005;190:107-113

  23. Global rating index for technical skills “GRITS”

  24. “GRITS”

  25. Halo effect? • Global evaluations clearly subject to the “halo effect” • Vogt et al, University of TN • Gyn surgical skills • Videotaped “hands only” and “waist up” views • Scores differed between the two views (in both directions) Vogt et al. Am J Obstet Gynecol 2003;189:688-91.

  26. Checklists • Simulation • Scavone et al, Northwestern • Simulated general anesthesia for cesarean section • CA-3 vs CA-1 residents: 150 vs 128 points • Murray et al, Washington University • Series of studies on acute skills performance • Multiple scenarios tested • Senior residents scored best; performance varied with scenario Scavone et al. Anesthesiology 2006;105:260-6 Murray et al. Anesth Analg 2005;101:1127-34

  27. Checklists • Standardized patients • OSCE • Observations of skills • Preanesthesia consult • Machine checkout De Oliveira Filho and Schonhorst. Anesth Analg 2004;99:62-9

  28. 360° evaluation • Derived from business world • Multisource evaluation • Different perspectives • Lends credibility • Can/should include self-evaluation • Time-sensitive

  29. 360° evaluation • Resident position in hierarchy not fixed • Change rotations • Change attendings • Change types of rotations • But can still create action plans based upon results • PBLI and SBP (practice-based learning and improvement; systems-based practice) Massagli et al. Am J Phys Med Rehabil 2007;86:845-52

  30. 360° evaluation • Opportunity to include patient feedback • More “real” than standardized patients? • Overcomes limitation of observer not knowing how patient really feels • Staff evaluation of interpersonal and communication skills often based upon interactions with staff, not patients. Duffy et al. Acad Med 2004;79:495-507

  31. 360° evaluation • Worth the trouble? • Brinkman et al • Pediatrics • Added parents and nurses • Better feedback for communication skills and professionalism • Weigelt et al • Surgery, trauma/critical care rotation • Added RNs, NPs, ICU fellows, chief resident, trauma nurse clinicians • No change in ratings with added groups Arch Pediatr Adolesc Med 2007;161:103-4; Curr Surg 2004;61:616-26

  32. 360° evaluation • Self-evaluation • Adult learning theory • Curriculum should be “learner-centric” • Have to “know what you don’t know” • Often poor correlation between self-assessment and external measures • 360° allows opportunity to reconcile conflict • Not “how good am I?”, but “how can I get better?” Schneider et al. Am J Surg 2008;195:16-19.

  33. Portfolios • Requires reflection and self-assessment • Skills needed for life-long learning • But… • Needs a mentor who can facilitate, or it just becomes a bunch of stuff in a folder • If used for summative assessment, must be very clear about requirements

  34. Structured portfolio Holmboe et al. Am J Med 2006;119:708-714

  35. Structured portfolio Holmboe et al. Am J Med 2006;119:708-714

  36. Remaining questions • Are residents truly “adult learners”? • What is the best way to assess the assessments? • Does any of this really improve outcomes in a time-limited residency? • How can we assess residents after graduation?
