
Clinical Assessment: Dr. H. Javdani, Assistant Professor of Psychiatry, QUMS



Presentation Transcript


  1. Clinical Assessment. Dr. H. Javdani, Assistant Professor of Psychiatry, QUMS

  2. Introduction

  3. ارزیابی: assessment • ارزشیابی: evaluation

  4. Important points in any assessment: • Importance • Sample • Validity • Reliability

  5. Important points in any assessment: • Objectivity • Feasibility • Acceptability • Limitations

  6. Domains of learning: • Cognitive: knowledge, comprehension, application, analysis, synthesis, evaluation (judging) • Psychomotor • Affective

  7. Miller’s pyramid

  8. Miller’s pyramid

  9. Bloom’s taxonomy:

  10. Performance-based assessments: • OSCE • Mini CEX • DOPS • PBA • MSF

  11. OSCE: Objective Structured Clinical Examination

  12. OSCE: • A performance-based test that allows standardized assessment of clinical skills • Every candidate is assessed on the same skills in the same areas

  13. OSCE: • Increases the number of skills to be evaluated • Increases the number of raters • Scoring with clear & objective criteria

  14. OSCE: • Fun! • Needs practice! • You will not pass just by reading books! • More students fail than with MCQs!

  15. Structure of the OSCE: • Candidates rotate through the stations, performing a series of clinical tasks • Stations use SPs, mannequins, or simulations • Raters assess performance using checklists

  16. Structure of the OSCE: • Stations are built on a blueprint to ensure adequate sampling of the content domain & process skills • Stations evaluate both clinical knowledge & communication or psychomotor skills
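
To make the station rotation in slides 15-16 concrete, here is a minimal illustrative sketch (not part of the original slides) of a round-robin schedule in which every candidate visits every station exactly once; station and candidate names are hypothetical.

```python
# Illustrative sketch: round-robin rotation of candidates through OSCE stations.
# Station and candidate names are hypothetical examples.
stations = ["History taking", "Physical exam", "Data interpretation",
            "Counseling", "Breaking bad news"]
candidates = ["Candidate A", "Candidate B", "Candidate C",
              "Candidate D", "Candidate E"]

# In each time slot every station is occupied by exactly one candidate;
# candidates shift one station forward after each slot.
for slot in range(len(stations)):
    print(f"Time slot {slot + 1}:")
    for i, candidate in enumerate(candidates):
        station = stations[(i + slot) % len(stations)]
        print(f"  {candidate} -> {station}")
```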

  17. Advantages of the OSCE: • Assesses at the “shows how” level • Tests complex skills without any supervision • Standardized • Objective

  18. Limitations of the OSCE: • Small sample of content • Time-consuming • Needs trained item writers & raters • Needs trained SPs • Needs practice

  19. Why OSCE? • Measures communication skills • Integrates communication skills with clinical knowledge • Patient safety • Standardized assessment • Uniquely capable of evaluating clinical skills in a simulated real-world environment

  20. Why not OSCE? • Not efficient for assessing knowledge alone • Expense • Time-consuming • Feasibility concerns

  21. The OSCE rates: • High: objectivity, acceptability • Moderate to high: validity, reliability • Low: feasibility

  22. What does the OSCE assess? • History-taking skills • Performance of the physical exam • Interpretation of data & complementary exams • Counseling • Dealing with particular situations (e.g., giving bad news, dealing with anger, etc.)

  23. Validity (response process): • Training of SPs • Rater training • SP & rater practice before the exam • Increasing objectivity (yes/no checklists only)

  24. Principles for use of the OSCE: • Performance-based assessment must be matched with performance-based teaching • “Format anxiety” must be addressed • Standardization of simulations must be achieved

  25. Development of the OSCE: • Blueprinting • Station development • Case writing • Case review & validation • Standard setting • Piloting • Assembly of the OSCE

  26. Blueprinting: • Defining the content domain & process skills to be assessed • Based upon outcomes, expectations, or standards • Level & range: complexity of the problem & of the performance
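
As a rough illustration of blueprinting (an assumption about how such a grid might be organized, not the presenter's own template), the sketch below tags each planned station with one content domain and one process skill and flags empty cells in the domain-by-skill grid; all names are hypothetical.

```python
# Illustrative sketch: an OSCE blueprint as a content-domain x process-skill grid,
# used to check that planned stations sample the domain adequately.
content_domains = ["Cardiology", "Psychiatry", "Endocrinology"]
process_skills = ["History taking", "Physical exam", "Counseling"]

# Each planned station is tagged with one domain and one process skill.
planned_stations = [
    {"title": "Chest pain",        "domain": "Cardiology",    "skill": "History taking"},
    {"title": "Depressed patient", "domain": "Psychiatry",    "skill": "Counseling"},
    {"title": "Thyroid exam",      "domain": "Endocrinology", "skill": "Physical exam"},
]

# Count how often each (domain, skill) cell is covered; empty cells reveal gaps.
coverage = {(d, s): 0 for d in content_domains for s in process_skills}
for st in planned_stations:
    coverage[(st["domain"], st["skill"])] += 1

for (domain, skill), n in coverage.items():
    flag = "" if n else "  <-- gap, add a station"
    print(f"{domain:14s} x {skill:15s}: {n}{flag}")
```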

  27. SP?! • Standardized patient • A person trained to portray a specific clinical problem • Participates in evaluating the candidate’s communication skills

  28. How to develop an OSCE? • Consider the relevant objectives • Write them down • Is the OSCE a suitable format for evaluating the objectives? • Dedicate each station to only one objective

  29. How to develop an OSCE? • Write a scenario for each station • Develop a suitable rating tool for each station • Write instructions for the examinee • Write instructions for the standardized patient

  30. How to develop an OSCE? • Make the simulation as close to the real situation as possible • Hold several practice runs • Conduct a pilot study

  31. How to develop an OSCE? • Run the OSCE • Provide suitable feedback

  32. Checklists: • Number of items: 8-25 • Items must be clear & dichotomous • Weighting of items • Killer items
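
A minimal sketch of how such a checklist could be scored, assuming dichotomous items, simple integer weights, and killer items that fail the station outright when missed (the items, weights, and pass rule are illustrative assumptions, not the presenter's scheme):

```python
# Illustrative sketch: scoring a station checklist with yes/no items,
# per-item weights, and "killer" items that fail the station when missed.
checklist = [
    # (item, weight, is_killer)
    ("Introduces self and confirms patient identity", 1, False),
    ("Asks about onset and character of pain",        2, False),
    ("Checks for drug allergies before prescribing",  2, True),
    ("Explains the management plan to the patient",   1, False),
]

def score_station(responses):
    """responses: list of booleans, one per checklist item (True = done)."""
    total = sum(w for _, w, _ in checklist)
    earned = 0
    for (_item, weight, is_killer), done in zip(checklist, responses):
        if done:
            earned += weight
        elif is_killer:
            return 0.0  # missing a killer item fails the station outright
    return 100.0 * earned / total

print(score_station([True, True, True, False]))  # 83.3...
print(score_station([True, True, False, True]))  # 0.0 (killer item missed)
```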

  33. Mini CEX: Mini Clinical Evaluation Exercise

  34. Mini CEX: • The assessor observes a trainee with a patient in any setting: • Out-patient clinic • In-patient service • Emergency department • The trainee performs a focused task

  35. Mini CEX: • The assessor rates the trainee along several dimensions on a form • Feedback is given • Multiple encounters are expected • Intended to be short & routine • Takes 15-20 minutes, with 5-7 minutes of feedback

  36. Clinical skills evaluated: • Medical interview • Physical examination • Informed decision-making or counseling • Clinical judgment or reasoning

  37. Mini CEX forms: • Medical interviewing & history taking • Physical examination skills • Humanistic qualities / professionalism (shows respect, compassion, empathy; establishes trust; attends to the patient’s needs for comfort, modesty, confidentiality, information)

  38. Mini CEX forms: • Clinical judgment (selectively orders tests & considers risks/benefits) • Counseling skills • Overall clinical competence
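
Below is a minimal sketch of a Mini CEX encounter record built from the dimensions listed on slides 37-38; the 1-9 rating scale is a common convention assumed here, not something stated in the slides.

```python
# Illustrative sketch: one Mini CEX encounter, with a rating per dimension
# and free-text feedback; multiple encounters per trainee are expected.
from dataclasses import dataclass, field
from statistics import mean

DIMENSIONS = [
    "Medical interviewing & history taking",
    "Physical examination skills",
    "Humanistic qualities / professionalism",
    "Clinical judgment",
    "Counseling skills",
    "Overall clinical competence",
]

@dataclass
class MiniCexEncounter:
    trainee: str
    setting: str                                  # e.g. "out-patient clinic"
    ratings: dict = field(default_factory=dict)   # dimension -> rating (1-9 assumed)
    feedback: str = ""

    def summary(self):
        return mean(self.ratings.values())

encounter = MiniCexEncounter(
    trainee="Trainee X",
    setting="emergency department",
    ratings={d: 7 for d in DIMENSIONS},
    feedback="Focused history; review the risk/benefit discussion.",
)
print(encounter.summary())  # 7
```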
