
Faculty development




Presentation Transcript


  1. Faculty development: workplace-based assessments

  2. Aims • Rehearse the purpose of WBA • Review the tools • Discuss the feedback process and its potential benefits • Outline the requirements for WBAs • Allow opportunity for practice and discussion

  3. Faculty development strategy • Produce materials • Train the trainers • Cascade training • Faculty development is the single most important thing to improve training

  4. Faculty observation / rating skills • Ratings based mostly on perceived knowledge and personality • Little evidence of direct observation • Tendency to score high • Poor inter-rater reliability (Gray, Thompson, Haber, Grant, etc.)

  5. Approaches to assessor training • Behavioral Observation Training – training to observe and look for actions • Performance Dimension Training – familiarise faculty with the elements of competency and agree definitions • Frame of Reference Training – having defined the performance dimensions, agree those clinical situations which might discriminate between different levels of trainee • Direct Observation of Competence Training – a combination of the above using live practice

  6. Training by direct observation Shown to: • Increase trainers' satisfaction with the process • Increase comfort in observation • Change rating behaviour at 8 months • Increase accuracy in identifying unsatisfactory performance (Holmboe)

  7. Your experiences • When was WBA good? • For you • For the trainee • What made it good? • What training have you already had? • What are the outstanding issues you want addressed?

  8. Purpose of Assessment? • To aid learning through constructive feedback: • Assessment for Learning (formative) • Done frequently e.g. WBA • To check knowledge or skill has been learned: • Assessment of Learning (summative) • Done infrequently e.g. Exams

  9. Who is interested in the results? • To distinguish between the competent and the insufficiently competent (public/NHS interest) • To ensure trainees are fit to progress with their training (Deanery/Faculty interest) • To ensure that trainees are performing satisfactorily within the precepts of the licensing body (GMC interest) • To show achievement (trainee interest)

  10. Miller's pyramid [pyramid diagram mapping each level to an assessment method] • Does – WBA • Shows how – OSCE • Knows how – SAQ/viva • Knows – MCQ (The College of Emergency Medicine)

  11. In other words… Assessment can be effective in improving performance if: • It looks at the whole of professional behaviour • It looks at performance, not just competence • It encourages practice (to make perfect) • It focuses on continuous improvement – not a final snapshot

  12. The subtext of WBA • Encourages reflection by trainee • Should support personal development plans for trainees – and therefore avoid exam disaster • Should enhance trainer-trainee contact • Should allow the trainer to calibrate themselves • It may provide CPD for the assessor

  13. Essential elements of beneficial WBAs • Culture of support – it is not an exam • Recorded, and that record reviewed • Encouraging – identifying learning opportunities • Includes constructive, thoughtful feedback • There is an expectation of completion • Standardisation and trajectories – low scores are the norm at the beginning of training • Explicit end points and standards • Open, transparent "scoring" • No retrospective recording, and the trainee must have been present

  14. What to assess – the Emergency Physician's skills

  15. Invisible elements of practice that influence the outcomes of assessment • The significance of context • The doctor's professional values • The kinds of knowledge used • The clinical/critical thinking • The professional judgement exercised • The therapeutic relationship developed • The learning by reflection

  16. The tools • Mini-CEX • DOPS • CbD • MSF • ACAT-EM • Leadership tool (Lynsey Flowerdew)

  17.–21. What to assess? [image-only slides showing example assessment forms; slides 19–21 show CbD examples]

  22. Assessment theory • Assessment drives learning • Formative vs summative • Tools fit for purpose: • Reliable • Valid • Feasible • Educational • Acceptable • Assessment reflects the syllabus • Structure is reliable – i.e. multiple sampling (fatigue); see the note below
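An illustrative aside, not from the original slides: the case for multiple sampling can be made with the classical Spearman–Brown prophecy formula, where \(\rho_1\) is the reliability of a single observed encounter and \(n\) is the number of independent encounters sampled:

\[ \rho_n = \frac{n\,\rho_1}{1 + (n - 1)\,\rho_1} \]

For example, if a single encounter has a modest reliability of \(\rho_1 = 0.3\), eight encounters across different cases and assessors give \(\rho_8 = (8 \times 0.3)/(1 + 7 \times 0.3) \approx 0.77\), which is why no single WBA should carry summative weight on its own.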

  23. Formative • Term coined in 1967 • A bi-directional process • Allows you to modify your training • Allows you to set trainee goals • Furnishes the trainee with lifelong skills • Motivates trainees to learn • Assessment for learning

  24. Summative • Reflects learning up to that point • Provides information on the ‘product’ • Needs well designed evaluation tools • Needs to have objectivity • Needs to have multiple components • Assessment of learning

  25. Reliability of individual assessment • Familiarity with the tool • Observational skills • Knowledge of standard

  26. Validity of assessments • Content ✓ • Face? – Hawthorne effect • Predictive? – too early to tell

  27. Feasibility • Core trainees – 1 per 2.5 weeks (1 per week once holidays etc. are counted) • Higher trainees – 1 per 3 weeks • Feasibility therefore depends on your total number of trainees (a rough worked sketch follows)
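A rough worked sketch (assumed figures, not from the slides): if a trainee must complete \(n\) WBAs over a placement of \(W\) calendar weeks, the headline rate is one per \(W/n\) weeks; annual leave, nights and study leave reduce the weeks actually attended to some \(W_a < W\), so the rate that must be sustained in practice is higher:

\[ \text{interval in practice} = \frac{W_a}{n} < \frac{W}{n} = \text{headline interval} \]

This is how one per 2.5 weeks on paper approaches one per attended week, and the total load on the faculty is roughly \(n\) multiplied by the number of trainees in the department.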

  28. Mini-CEX • Purpose – the clinical encounter: focus on clinical skills – history taking, examination and analysis of information to come to a diagnosis • Reliability – shown to correlate with performance in senior doctors in the USA • Problems – standards, commitment to use, conflict between summative and formative use, repetitiveness, rarely translates into an action plan • Solutions – enhanced feedback, use it to plan developments, use it to enhance contact

  29. DOPS • Purpose – to identify problems with procedural skills • Reliability – variable • Problems – tick-box exercise; undervalued – most people rated "OK"; not enough procedures to go round; no allowance for the complexity of the procedure • Solutions – experts to do this assessment, define the standards more clearly

  30. CbD • Purpose – to explore the thinking behind decisions; to review the content of a case and its outcomes, safety, use of investigations etc. • Reliability – good at identifying cognitive problems • Problems – depends on the trainer: requires preparation and lateral thinking; can be reduced to a tick-box exercise • Solutions – understand the purpose

  31. MSF • Purpose – to "hold a mirror up": provide the trainee with an opportunity to focus on behaviour • Reliability – strongest evidence base; most valued; shown to predict those with problems and failing trainees • Problems – lack of participation, difficulty feeding back • Solutions – practice, culture

  32. ACAT-EM • Purpose – collect ongoing evidence, interactions with many people, opportunity for feedback on non-technical skills (NTS) • Reliability – unknown • Problems – time consuming, uncertainty over how to conduct it, standard not clear • Solutions – training, practice, feedback

  33. Feedback • "Feedback is frequently reported to be too general or too late to be helpful" (Westberg et al., 2001) • Only 8% of Mini-CEX encounters resulted in documented constructive feedback (Holmboe, 2004)

  34. Why don't assessors give feedback? • Focus on the assessment process • Lack of skill/training • Documentation design • Lack of understanding of the role of feedback (Holmboe, 2004)

  35. What is 'good' feedback? • Specific • Timely • Balanced • Based on observed facts • Non-judgemental • Promotes reflection • Results in an action plan (Norcini and Burch, 2007)

  36. Feedback • Feed forward • Process of reflection • Pendleton's rules – reflect on what went well, then on what could be done better (trainee/assessor) • Use descriptors/domains • Construct an action plan • Return to the evidence after a period of time – evidence that actions have been completed

  37. Challenges • Time pressure • Space / privacy • Breaking bad news • Trainee defensiveness • Trainer ambivalence • Halo effect

  38. Questions

  39. Summary • Multiple tools – must be familiar with them • Feedback and action plans are crucial • WBA is assessment for learning • Demands time and energy but is rewarding (and an eye-opener)
