
nMRCGP Workplace-based Assessment



Presentation Transcript


  1. nMRCGP Workplace-based Assessment, April 2007

  2. Workplace-based assessment “The evaluation of a doctor’s progress over time in their performance in those areas of professional practice best tested in the workplace.”

  3. Some principles of assessment • Validity • Reliability • Educational impact • Acceptability • Feasibility

  4. Lessons learnt • Need to build up a whole picture • No single method is a panacea • Content important, not format • Less structure, more sampling • Focus on a programme of assessment, not individual methods. Van der Vleuten C & Schuwirth L. Assessing professional competence: from methods to programmes. Medical Education 2005; 39: 309-317

  5. Why workplace-based assessment? • Tests something important and different from other components • Reconnects assessment with learning • Has high educational impact • Valid and reliable • In keeping with PMETB guidance

  6. The WPBA framework • Competency-based • Developmental • Evidential • Locally assessed progress monitoring • Triangulated • Nested within an “e-portfolio” • Applies over the entire training envelope

  7. What is being assessed?

  8. Competency-based • 12 competency areas • Best tested in the workplace setting • Developmental progression for each competency area • Competency demonstrated “when ready”

  9. The 12 competency areas 1. Communication and consulting skills 2. Practising holistically 3. Data gathering and interpretation 4. Making a diagnosis/making decisions 5. Clinical management 6. Managing complexity and promoting health 7. Primary care administration and IMT 8. Working with colleagues and in teams 9. Community orientation 10. Maintaining performance, learning and teaching 11. Maintaining an ethical approach to practice 12. Fitness to practise

  10. Developmental progression • Insufficient evidence → Needs further development → Competent → Excellent

  11. Evidential • Notion of multiple sampling • From multiple perspectives • Tool-box of “approved” methods • Sufficiency of evidence defined

  12. Local assessment of progress • Assessed by clinical supervisor in hospital or general practice setting • Regular reviews at 6-month intervals by trainer/educational supervisor • Review all the assessment information gathered • Judge progress against competency areas • Provide developmental feedback

  13. Gathering the evidence about the learner’s developmental progress

  14. Evidence from • Specified RCGP tools and… • Naturally occurring information

  15. Specified tools CBD (case based discussion) COT (consultation observation tool) PSQ (patient satisfaction questionnaire) MSF (multi-source feedback) mini-CEX (clinical evaluation exercise) DOPS (direct observation of procedural skills)

  16. Case-based discussion • Structured oral interview • Designed to assess professional judgement • Across a range of competency areas • Starting point is the written record of cases selected by the trainee • Will be used in general practice and hospital settings

  17. COT • Tool to assess consultation skills • Based on MRCGP consulting skills criteria • Can be assessed using video or direct observation in general practice settings

  18. Mini-CEX • Used instead of COT in hospital settings

  19. DOPS • For assessing relevant technical skills during GP training: • Cervical cytology • Complex or intimate examinations (e.g. rectal, pelvic, breast) • Minor surgical skills • Similar to F2 DOPS

  20. MSF • Assessment of clinical ability and professional behaviour • ST1: rated by 5 clinical colleagues on 2 occasions • ST3: rated by 5 clinical and 5 non-clinical colleagues on 2 occasions • Simple web-based tool • Is able to discriminate between doctors BUT • Needs skill of trainer in giving feedback

  21. PSQ • Measures consultation and relational empathy (CARE) • 30 consecutive consultations in GP setting • Can differentiate between doctors BUT • Needs skill of trainer in giving feedback

  22. Naturally occurring evidence • From direct observation during training • “tagged” against appropriate competency headings • Other practice-based activities • Clinical supervisor’s reports (CSR)

  23. Making sense of it all at the 6-month reviews

  24. Workplace-based assessment ST1 (two 6-month periods, then Deanery panel) • First interim review based on evidence: 3 x mini-CEX or COT*, 3 x CBD, 1 x MSF, DOPS**, clinical supervisors' report** • Second interim review based on evidence: 3 x mini-CEX or COT*, 3 x CBD, 1 x MSF, 1 x PSQ*, DOPS**, clinical supervisors' report** (* if GP post; ** if appropriate)

  25. Workplace-based assessment ST2 (two 6-month periods, then Deanery panel) • First interim review based on evidence: 3 x mini-CEX or COT*, 3 x CBD, DOPS**, clinical supervisors' report** • Second interim review based on evidence: 3 x mini-CEX or COT*, 3 x CBD, DOPS**, clinical supervisors' report** (* if GP post; ** if appropriate)

  26. Workplace-based assessment ST3 (reviews at 6 months and month 34, then Deanery panel) • Interim review based on evidence: 6 x COT, 6 x CBD, 1 x MSF, DOPS**, clinical supervisor's report** • Final review based on evidence: 6 x COT, 6 x CBD, 1 x MSF, DOPS**, PSQ (* if hospital post; ** if appropriate)

  27. Moderation by Deanery Panels

  28. Deanery Panels • Chaired by nominee of GP Director • ‘Gold Guide’ compliant • Membership: • Nominated Chair • Experienced GP trainer/Programme Director • RCGP assessor • Lay member

  29. ST1/ST2 • Deanery panel review of e-portfolios of all trainees • Face-to-face review with trainees if a problem with progress is identified • Annual assessment outcome (AAO) report generated • Logged within e-portfolio

  30. The Final Judgement • Recommendation of satisfactory completion of WPBA by educational supervisor to Deanery Director • Based on attainment of competence in each of 12 competency areas • Deanery panel review of e-portfolio +/- face to face interview • e-portfolio “sign off” or recommendation for further training

  31. Quality control • Deanery: specification, training, calibration, moderation • National: sampling of ETR, verification and audit, appeals process

  32. The End
