
Do your OSCE marks accurately portray your students’ competency? A. de Villiers, E. Archer

A. de Villiers, E. Archer
Stellenbosch University

CONTEXT
Objective Structured Clinical Examination (OSCE) examiner training is widely used to address some of the reliability and validity issues that accompany this assessment tool. An OSCE skills course was developed and implemented at the Stellenbosch Faculty of Health Sciences, and its influence on participants (clinicians) was evaluated.

METHOD
Participants attended the OSCE skills course, which included theoretical sessions (on topics such as standard-setting, examiner influence and assessment instruments) as well as two staged OSCEs, one at the beginning and one at the end of the course. During each OSCE, every participant examined a student role-player performing a technical skill while being video recorded. Participants’ behaviour during, and assessment results from, the two OSCEs were evaluated, together with participants’ feedback on the course and group interviews with the student role-players.

RESULTS
There was a significant improvement in inter-rater reliability (see Figure I), as well as a slight decrease in inappropriate examiner behaviour, such as teaching and prompting, during the second OSCE. Furthermore, overall feedback from participants and the perceptions of the student role-players were positive.

[Figure I: Participants’ rating (A) against Standard (Rating). Scatterplot with Time=Before and Time=After panels; fitted line A = 19.446 + 0.6314x.]

An interesting finding was that participants who had previously undergone Health Sciences Education training generally deviated less from the standard than those who had not (p=0.01); see Figure II.

[Figure II: The effect of Health Sciences Education on participants’ rating. Box-and-whisker plot of Difference From Standard (mean, mean±SE, mean±1.96*SE) for participants with and without Health Sciences Education training.]

CONCLUSION
In this study, examiner conduct and inter-rater reliability were positively influenced by the following interventions: examiner briefing; involving examiners in constructing the assessment instruments; and examiners viewing (on DVD) and reflecting on their own assessment behaviour. This study proposes that developing and implementing an OSCE skills course is a worthwhile endeavour for improving the validity and reliability of the OSCE as an assessment tool.

Contact: adeledev@sun.ac.za
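For readers who want a feel for the two analyses behind the figures, the sketch below shows how checks of this kind could be reproduced. It is a minimal illustration under stated assumptions: the data values are invented, and the choice of SciPy routines (an ordinary least-squares line for Figure I, Welch’s t-test for Figure II) stands in for whatever the authors actually ran; the poster’s figures reference a STATISTICA data file, not this code.

```python
# Minimal, hypothetical sketch of the two checks shown in the figures.
# All data values, variable names and test choices are illustrative
# assumptions; they are not the authors' data or analysis code.
import numpy as np
from scipy import stats

# --- Figure I style check: examiner marks regressed on the standard ---
standard = np.array([35, 42, 50, 55, 61, 68, 74, 80, 86])  # reference marks
examiner = np.array([41, 46, 51, 54, 58, 62, 66, 70, 73])  # one participant's marks

# Ordinary least squares: examiner = intercept + slope * standard.
# A slope near 1 and an intercept near 0 indicate close agreement
# with the standard across the mark range.
fit = stats.linregress(standard, examiner)
print(f"examiner = {fit.intercept:.3f} + {fit.slope:.4f} * standard "
      f"(r = {fit.rvalue:.3f})")

# --- Figure II style check: deviation from standard by prior training ---
dev_hse = np.array([-4, 2, 5, -1, 3, 0])      # with Health Sciences Education
dev_no_hse = np.array([-12, 18, 9, -15, 22])  # without

# Welch's two-sample t-test (unequal variances) on the deviations.
t, p = stats.ttest_ind(dev_hse, dev_no_hse, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
```

With real scores, the same two calls would yield the fitted line reported in Figure I and a p-value comparable to the p=0.01 group difference reported for Figure II.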
