

Presentation Transcript


  1. OSCE Kathy Boursicot Train the Trainer Assessment Workshop October 29, 2003 Hong Kong International Consortium

  2. OSCE • Format • Purpose • Advantages • Writing principles • Training observers • Scoring considerations

  3. What is an OSCE? • Objective • Structured • Clinical • Examination Harden RM and Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education, 1979, Vol 13: 41-54

  4. OSCE test design • Observed stations with clinician examiners

  5. Varieties of OSCEs • Traditional OSCE • SP-based test • Station couplets • Integral consultations Stations may be patient-based, a clinical task, or a written task

  6. OSCE • Format • Purpose • Advantages • Writing principles • Training observers • Scoring considerations

  7. Professional authenticity • Simple model of competence (from top to bottom): Does • Shows how • Knows how • Knows Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67

  8. Testing formats mapped to the model • Does: professional practice • Shows how (behaviour ~ attitudes/skills): OSCEs • Knows how: EMQs, SEQs • Knows (cognition ~ knowledge): MCQs

  9. OSCE-Objective • All the candidates are presented with the same test • Specific skill modalities are tested at each station • History taking • Explanation • Clinical examination • Procedures

  10. OSCE - Structured • The marking scheme for each station is structured • Structured interaction between examiner and student

  11. OSCE – Clinical Examination • Test of performance of clinical skills • candidates have to demonstrate their skills, not just describe the theory

  12. OSCE • Format • Purpose • Advantages • Writing principles • Training observers • Scoring considerations

  13. Characteristics of assessment instruments • Utility = Reliability × Validity × Educational impact × Acceptability × Feasibility Van der Vleuten, C. The assessment of professional competence: developments, research and practical implications. Advances in Health Science Education, 1996, Vol 1: 41-67

  14. Test characteristics • Reliability of a test / measure • reproducibility of scores across raters, questions, cases, occasions • capability to differentiate consistently between good & poor students

  15. Sampling [Figure: multiple test samples drawn from the domain of interest]

  16. Reliability • Competencies are highly domain-specific • Broad sampling is required to obtain adequate reliability • across content, i.e., range of cases/situations • across other potential factors that cause error variance, e.g., testing time, number of cases, examiners, patients, settings, facilities
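The effect of broader sampling on reliability can be sketched with the Spearman-Brown prophecy formula, a standard psychometric result; the single-station reliability of 0.25 here is an assumed illustrative value, not a figure from the presentation:

```python
# Spearman-Brown sketch: how projected reliability grows as the test
# samples more stations. r is an assumed reliability of one station.

def spearman_brown(r: float, k: float) -> float:
    """Projected reliability of a test lengthened by factor k."""
    return k * r / (1 + (k - 1) * r)

r_single = 0.25  # assumed single-station reliability (illustrative)
for n in (1, 5, 10, 20):
    print(f"{n:2d} stations: reliability = {spearman_brown(r_single, n):.2f}")
```

With these numbers, reliability rises from 0.25 with one station to about 0.87 with twenty, which is why OSCEs use many short stations rather than a few long cases.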

  17. Test characteristics • Validity of a test / measure • The content is deemed appropriate by the relevant experts • The test measures the characteristic (e.g. knowledge, skills) that it is intended to measure • The performance of a particular task predicts future performance

  18. Test characteristics • Validity of a test / measure

  19. OSCE • Format • Purpose • Advantages • Writing principles • Training observers • Scoring considerations

  20. Advantages of using OSCEs in clinical assessment • Careful specification of content = Validity • Observation of wider sample of activities = Reliability • Structured interaction between examiner & student • Structured marking schedule • Each student has to perform the same tasks = Acceptability

  21. OSCE • Format • Purpose • Advantages • Writing principles • Training observers • Scoring considerations

  22. OSCE Station Writing

  23. How to start • Decide what tasks you • want to • can • should test in an OSCE format • OSCEs test performance, not knowledge

  24. Constructive alignment • Need to know the learning objectives of your course / programme • Map these across : • Subject areas • Knowledge areas • Skill areas

  25. Blueprinting • Content of the assessment should align with the learning objectives of the course • Blueprinting • allows mapping of test items to specific learning outcomes • ensures adequate sampling across subject area and skill domains
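A blueprint can be represented as a simple mapping from stations to subject and skill areas. The sketch below, with station names and categories invented for illustration, checks that every skill domain is sampled:

```python
# Hypothetical OSCE blueprint: each station is mapped to a subject area
# and a skill domain, and we check sampling coverage across skills.
# All station names and categories are invented for this example.

stations = [
    ("Chest pain history", "cardiovascular", "history taking"),
    ("BP measurement", "cardiovascular", "clinical examination"),
    ("Inhaler technique", "respiratory", "explanation"),
    ("Venepuncture", "haematology", "procedures"),
]

skills = {"history taking", "explanation", "clinical examination", "procedures"}

covered = {skill for _, _, skill in stations}
missing = skills - covered
print("Skill domains not yet sampled:", missing or "none")
# prints: Skill domains not yet sampled: none
```

The same check could be run across subject areas, giving a quick way to spot gaps in sampling before the examination is finalised.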

  26. OSCE blueprint: systems-based

  27. OSCE blueprint: discipline-based

  28. Key features of success in designing OSCEs • Feasibility • Congruence

  29. Feasibility • Is it a reasonable task to expect the candidates to perform? • Can the task be examined at an OSCE station? • Can the task be performed in the time allowed?

  30. Feasibility • Is it a reasonable task to expect the candidates to perform? Is it authentic? • Can the task be examined at an OSCE station? • Match clinical situations as closely as possible • Some tasks may require simulated patients • Some tasks may require manikins • Some tasks simply cannot be examined in this format

  31. Feasibility • Can task be performed in time allowed? • Pilot the stations to see if they are feasible • Check equipment /helpers/practicalities

  32. Congruence • Is it testing what you want it to test? • Station construct: describe what station is testing

  33. Congruence • Ensure that all parts of station coordinate • Candidate instructions • Marking schedule • Examiner instructions • Simulated patient instructions • Equipment

  34. Station construct • This station tests the candidate's ability to …………………………

  35. Candidate instructions • State circumstances: e.g. outpatient clinic, ward, A & E, GP surgery • Specify the task required of the candidate: e.g. take a history, perform a neurological examination of the legs, explain a diagnosis • Specify tasks NOT required • Instruct on summing up: e.g. tell the patient, tell the examiner

  36. Examiner instructions • Copy of candidate instructions • Specific instructions appropriate to the task: • e.g., do not prompt, explicit prompts, managing equipment

  37. Simulated patient instructions • Give as much detail as possible so they can be consistent • try to leave as little as possible for them to ad lib! • Give enough information to enable them to answer questions consistently • Be specific about affect in each role • Specify patient demographics • i.e., gender, age, ethnicity, social class, etc.

  38. Marking schedule • Ensure marks are allocated for tasks the candidates are asked to perform • Decide relative importance of diagnosis vs process (history taking, examination) • Separate checklist for process skills

  39. Equipment • Be detailed • Think of • Chairs + table / couch / bench • Manikins - specify • Medical equipment • Stethoscope, ophthalmoscope, sphyg, suturing materials, etc

  40. Designing stations • Use your blueprint • Be clear what you are testing: define the construct • Check for congruence • Pilot for feasibility

  41. OSCE • Format • Purpose • Advantages • Writing principles • Training observers • Scoring considerations

  42. Training observers • Understand the principles of OSCEs • Enhance inter-rater consistency

  43. Techniques • Examiners must train together • Videos • ‘live’ stations • Discussion of marking inconsistencies

  44. Training observers • General training • Station-specific training

  45. OSCE • Format • Purpose • Advantages • Writing principles • Training observers • Scoring considerations

  46. Scoring considerations • Global vs checklist scoring • Weighting • Standard setting

  47. Checklist scoring • Advantages • Helps the examiner know what the station setters are looking for • Helps the examiner be objective • Facilitates the use of non-expert examiners • Disadvantages • Can just reward process/thoroughness • May not sufficiently reward the excellent candidate • Ignores the examiner's expertise

  48. Global scoring • Advantages • Utilises the expertise of the examiners • They are in a position to make a (global) judgement about the performance • Disadvantages • Examiners have to be expert examiners, i.e. trained • Examiners must be familiar with expected standards for the level of the test

  49. Weighting • In a checklist, some items may be weighted more than others • More complicated scoring system • Makes no difference to very good & very bad candidates • Can enhance discrimination at the cut score
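A weighted checklist can be sketched as follows; the items, weights, and candidate responses are invented for illustration:

```python
# Weighted checklist scoring sketch: each item carries a weight, and the
# station score is the weighted fraction of items the candidate completed.
# Items, weights, and the pass/fail pattern are invented for illustration.

items = [
    # (item, weight, completed by candidate)
    ("washes hands", 1, True),
    ("elicits presenting complaint", 3, True),
    ("asks about red-flag symptoms", 3, False),
    ("summarises to patient", 2, True),
]

achieved = sum(w for _, w, done in items if done)
maximum = sum(w for _, w, _ in items)
score = achieved / maximum
print(f"Station score: {achieved}/{maximum} = {score:.0%}")
# prints: Station score: 6/9 = 67%
```

Because every candidate gains or loses the same weighted items, weighting mainly shifts scores near the cut point, which matches the slide's observation that it makes little difference to very good or very bad candidates.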

  50. Standard setting • No perfect method! • Should be a criterion-referenced method, e.g. Angoff, Ebel, etc. • But are these suitable for performance-based tests?
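As a rough illustration of the Angoff approach named above (all judge estimates invented), each judge rates the probability that a borderline candidate would complete each checklist item, and the cut score is the mean of the judges' totals:

```python
# Minimal Angoff sketch for one station. Each judge estimates, per
# checklist item, the probability that a borderline candidate would
# complete it; the cut score is the mean of the judges' summed
# estimates. All numbers are invented for illustration.

judge_estimates = [
    [0.9, 0.6, 0.5, 0.8],  # judge 1, four checklist items
    [0.8, 0.7, 0.4, 0.9],  # judge 2
    [1.0, 0.5, 0.6, 0.7],  # judge 3
]

per_judge_totals = [sum(judge) for judge in judge_estimates]
cut_score = sum(per_judge_totals) / len(per_judge_totals)
print(f"Angoff cut score: {cut_score:.2f} of {len(judge_estimates[0])} items")
# prints: Angoff cut score: 2.80 of 4 items
```

Whether such item-level probability judgements transfer cleanly to observed performance tests is exactly the open question the slide raises.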
