OSCEs

  1. OSCEs Kieran Walsh

  2. OSCE • Means “objective structured clinical examination” • Assesses competence • Developed in the light of the limitations of traditional assessment methods

  3. The long case • One hour with the patient • Full history and examination • Not observed • Interrogated for 20 minutes afterwards • Bad cop … worse cop • Taken back to the patient

  4. Long case – holistic assessment, BUT • Unreliable • Examiner bias, examiner stringency and unstructured questioning mean little agreement between examiners • Some easy patients, some hard ones • Some co-operative patients, some not • Case specificity • In the NHS, when do you ever get an hour with a patient? (poor validity) • Not a test of communication skills

  5. Objective Structured Long Examination Record (OSLER) • 30 minutes with the patient • Observed • Prior agreement on what is to be examined • All candidates assessed on the same items • Case difficulty taken into account • Communication skills assessed • Two examiners

  6. OSLER • More reliable than the long case • One study showed it to be highly reliable • But that required 10 cases – at 30 minutes and two examiners per case, that is 300 minutes of testing and 20 examiner slots per candidate • Questionable feasibility, cost and acceptability

  7. Short cases • 3–6 cases • “Have a look at this person and tell us what you think…” • A few minutes per station

  8. Short cases – fine, BUT • Different students saw different patients • Cases differed in complexity • Halo effect • Not structured • Examiners asked whatever they wanted • Communication with the patient was “incidental”

  9. OSCEs • Clinical skill – history, exam, procedure • Marking structured and determined in advance • Time limit • Checklist/global rating scale • Real patient/actor • Every candidate has the same test

  10. OSCEs – reliable • Less dependent on any one examiner’s foibles (there are lots of examiners) • Less dependent on any one patient’s foibles (there are lots of patients) • Structured marking • More stations … more reliable • Wider sampling – clinical and communication skills • Which is better: more stations with one examiner per station, or fewer stations with two examiners per station? (See the sketch below.)
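The examiner-versus-station trade-off can be explored with a generalizability-theory style projection (a D-study). The sketch below is purely illustrative: the variance components are assumed values rather than data from any real OSCE, and the function name is hypothetical. The usual finding is that case-to-case variation (case specificity) is much larger than examiner variation, so spending examiner time on extra stations buys more reliability than doubling up examiners on fewer stations.

```python
# Illustrative D-study projection for an OSCE circuit.
# The variance components below are assumed for demonstration; real values
# would come from a generalizability (G) study of actual OSCE data.

def g_coefficient(var_candidate, var_station, var_examiner, n_stations, n_examiners):
    """Generalizability coefficient for a candidate x station x examiner design."""
    error = var_station / n_stations + var_examiner / (n_stations * n_examiners)
    return var_candidate / (var_candidate + error)

# Assumed components: case specificity dominates, examiner effects are smaller.
var_p, var_s, var_e = 1.0, 4.0, 1.0

# Same total examiner time: 20 stations x 1 examiner vs 10 stations x 2 examiners.
print(g_coefficient(var_p, var_s, var_e, n_stations=20, n_examiners=1))  # ~0.80
print(g_coefficient(var_p, var_s, var_e, n_stations=10, n_examiners=2))  # ~0.69
```

With these assumed components the 20-station design wins: doubling examiners only shrinks the (small) examiner term, while adding stations shrinks the (large) case-specificity term as well.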

  11. OSCEs – valid • Content validity – how well the sampling of skills matches the learning outcomes of the course • Construct validity – people who perform well on the test have better clinical skills than those who do not • The length of each station should be “authentic”

  12. OSCEs – educational impact • Checklist – candidates learn to remember the steps in a checklist • Global rating scale – encourages a more holistic approach • Both can be combined • If the OSCE is formative, give feedback

  13. OSCEs – cost • Planning • Examiners • Real patients • Actors • Manikins • Admin staff • Technical staff • Facilities • Materials • Catering • Collating and processing results • Recruitment • Training • Cost-effective? Only if used for the correct purpose

  14. OSCEs – acceptability • Perceived as fair by examiners and examinees • Have become widespread

  15. OSCE design – blueprinting • Map the assessment to the curriculum • Adequate sampling • Feasibility – real patients, actors, manikins (see the illustrative blueprint below)
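A blueprint is essentially a grid mapping each station to the curriculum outcome and skill type it samples, so gaps and over-sampling can be seen at a glance. The stations and domains below are invented for illustration and are not part of the original slides.

```python
# Hypothetical OSCE blueprint: map each station to the curriculum domain and
# skill type it samples, then check coverage (content is illustrative only).

blueprint = {
    "Station 1": {"domain": "Cardiovascular",  "skill": "examination"},
    "Station 2": {"domain": "Respiratory",     "skill": "history"},
    "Station 3": {"domain": "Endocrine",       "skill": "explanation / communication"},
    "Station 4": {"domain": "Musculoskeletal", "skill": "procedure"},
}

domains_required = {"Cardiovascular", "Respiratory", "Endocrine",
                    "Musculoskeletal", "Neurology"}

covered = {station["domain"] for station in blueprint.values()}
print("Domains not yet sampled:", domains_required - covered)  # {'Neurology'}
```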

  16. OSCE design – station development • In advance • Trial it • Construct statement • Instructions for candidate • Instructions for examiner • List of equipment • Personnel (real patient or …) • Semi-scripted scenarios • Marking schedule (global rating scale/checklist/both)

  17. OSCE design – assessing • Process skills • Content skills • Clinical management

  18. OSCE design – piloting • Return to slide 15 and go through again

  19. OSCE design – simulated patients • Consistency – reliability • Training • Briefing • Debriefing • Database of actors • Scenarios in advance • Practice with each other and with examiner

  20. OSCE design – examiners • Training • Consulted

  21. OSCE design – real patients • Consistency – must give the same history each time • Can fall sick • Develop new signs / lose old ones • Can get tired (10 students/day)

  22. Practical considerations • Single rooms, or a hall with partitions • One rest station for every 40 minutes of assessment • Rooms for candidates, patients and examiners to rest • Floor plan • Ideally each station has the same duration (see the timing sketch below) • Signs • Floor map • Stopwatch and bell • Catering • Transport • Pay • Acknowledgement
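As a rough illustration of the circuit arithmetic implied above: candidates rotate on the bell, so a circuit of S stations processes S candidates at once. The parameters below (station count, station duration, changeover time, one rest station per 40 minutes of assessment) are assumptions made for this sketch, not figures from the slides.

```python
# Rough OSCE circuit timing sketch; all parameters are assumed, illustrative values.

def circuit_schedule(n_stations, station_min, changeover_min, rest_every_min=40):
    """Return (candidates per circuit, minutes per circuit, rest stations added).

    One candidate starts at each station and everyone rotates on the bell,
    so a full circuit processes n_stations candidates.  One rest station is
    assumed for roughly every `rest_every_min` minutes of assessment.
    """
    assessment_min = n_stations * station_min
    rest_stations = assessment_min // rest_every_min
    total_min = (n_stations + rest_stations) * (station_min + changeover_min)
    return n_stations, total_min, rest_stations

print(circuit_schedule(n_stations=12, station_min=5, changeover_min=1))
# (12, 78, 1): 12 candidates per circuit, ~78 minutes, 1 rest station
```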

  23. OSCE – standard setting – establishing the pass mark • Most standard-setting tools were developed for MCQs • All are complex and time-consuming • None is ideal • The “borderline group” method is emerging (see the sketch below)
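The “borderline group” method can be sketched in a few lines: examiners give every candidate both a checklist score and a global rating, and the station’s pass mark is the mean checklist score of the candidates rated borderline. The candidate data below are invented purely for illustration.

```python
# Illustrative borderline-group standard setting for one OSCE station.
# Candidate data are invented; in practice each candidate has a checklist
# score plus the examiner's global rating (e.g. fail / borderline / pass).

from statistics import mean

candidates = [
    # (checklist score out of 20, global rating)
    (18, "pass"), (16, "pass"), (14, "borderline"), (12, "borderline"),
    (13, "borderline"), (9, "fail"), (15, "pass"), (11, "borderline"),
    (8, "fail"), (17, "pass"),
]

borderline_scores = [score for score, rating in candidates if rating == "borderline"]
pass_mark = mean(borderline_scores)
print(f"Station pass mark: {pass_mark:.1f} / 20")  # Station pass mark: 12.5 / 20
```

Station-level pass marks are then combined across the circuit to give the exam pass mark; the related borderline regression method instead regresses checklist scores on global ratings and reads off the predicted score at the borderline point.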

  24. OSCEs • Should we run OSCE assessments ourselves? • Should we run OSCE preparation courses? • What is the difference between these two in practical terms? • Should we film OSCEs? • Should we do podcast OSCEs? • Are remote OSCEs possible (e.g. via Skype), and could we be an enabler?
