Assessment of Knowledge and Performance


1. Assessment of Knowledge and Performance
John Littlefield, PhD
University of Texas Health Science Center at San Antonio

2. Goals: Assessment of Knowledge and Performance
1. Clarify 2 distinct uses for assessments of knowledge and performance
2. Define 3 aspects of validity for all knowledge and performance assessment methods
3. Compare and contrast 3 techniques for assessing clinical knowledge and performance
4. Identify poorly written multiple choice test items and write a key features test item
5. Describe 3 options for scoring OSCE performance
6. Describe three elements of a clinical performance assessment system
7. Critique a clinical performance assessment system that you use

3. Agenda: Assessment of Knowledge and Performance
Exercise: Warm-up for assessing clinical knowledge and performance
Presentation: Quality assurance when assessing clinical knowledge and performance
Exercise: Take, then critique, a multiple choice test
Presentation: Key features test items
Exercise: Write several key features test items
Presentation: Widening the lens on SP assessment
Exercise: Strengths & weaknesses of a clinical performance assessment system that you use
Presentation: Improving clinical performance assessment systems
Exercise: Critique your clinical performance assessment system

4. Recall a student/resident whose clinical performance made you uneasy
1. Was the student/resident aware of your concern? Yes / No
2. What action did you take?
   a. Talk with faculty colleagues about your concerns: Yes / No
   b. Write a candid performance assessment and send it to the clerkship/residency director: Yes / No
3. Did any administrative action occur related to your concern? Yes / No
4. Do you think the performance assessments in your clerkship/residency files reflect candid faculty performance appraisals? Yes / No

5. What concerns do you have about clinical knowledge and performance assessment?
Smart but not professional
Doesn't have technical skills
Heterogeneity of evaluator skills (fairness / accuracy)
How to motivate evaluators
Options for remediation
How to validate the exam
Are oral exams really worth it?
How many evaluations are needed before making a decision?

6. Example: learning to drive vs. taking a driving test. Faculty often want to conduct both types of evaluation at the same time.

7. Validity of Knowledge & Performance Assessments
1. Content: Does the assessment method measure a representative cross section of student/resident competencies?
2. Reliability of scores: Does the student/resident perform at about the same level across 5 to 7 different patients / case problems? Does the student receive similar ratings from different faculty?
3. Latent process: Does the context surrounding the assessment evoke the domain of cognitive processing used by a clinician?
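The reliability question (do scores hold up across 5 to 7 case problems?) is commonly quantified with an internal-consistency coefficient such as Cronbach's alpha. A minimal sketch, using made-up illustration scores rather than data from the presentation:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Internal-consistency reliability of case-problem scores.

    scores: one row per student; each row holds that student's
    score (0-1) on each of k case problems.
    """
    k = len(scores[0])                                  # number of case problems
    case_vars = [pvariance(col) for col in zip(*scores)]
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(case_vars) / total_var)

# Five hypothetical students scored on five case problems.
scores = [
    [0.8, 0.7, 0.9, 0.8, 0.7],
    [0.5, 0.4, 0.6, 0.5, 0.5],
    [0.9, 0.9, 0.8, 0.9, 1.0],
    [0.3, 0.4, 0.2, 0.3, 0.4],
    [0.6, 0.6, 0.7, 0.5, 0.6],
]
print(round(cronbach_alpha(scores), 2))  # consistent performance across cases gives a high alpha
```

A low alpha on real data would signal the concern raised above: a student's score depends more on which cases were drawn than on the student's ability.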

8. Content of a Clinical Performance Assessment Which clinical competencies are addressed by the performance assessment? How thoroughly will you assess each competency? How will you score performance?

9. Reliability of Physician Performance Scores on Multiple Patients

10. Latent Process Aspect of Validity: Four Levels of Performance Assessment *

11. Compare and Contrast Three Assessment Techniques (Multiple choice exam, OSCE, Global ratings)

                                  M.C.E.   OSCE   Global rtgs.
1. Content                         +++      ++        +
2. Reliability
   across 5 to 7 case problems     +++      ++        +
   agreement among raters          +++      ++        +
3. Latent process                   +       ++       +++

+ = adequate   ++ = good   +++ = excellent

12. Interim Summary of Session
Session thus far:
Two uses of knowledge and performance assessments: formative and summative
Validity of all knowledge and performance assessment techniques
Compare and contrast 3 assessment techniques
Coming up:
Take and critique a 14-item multiple choice exam
Presentation on key features items

13. How are Multiple Choice Items Selected for an Exam?

14. Sample Exam Blueprint based on Clinical Problems

15. Key Features of a Clinical Problem [1]
Definition: Critical steps that must be taken to identify and manage a patient's problem
Focuses on a step in which examinees are likely to make an error
Is a difficult aspect of identifying and managing the problem
Example: For a pregnant woman experiencing third-trimester bleeding with no abdominal pain, the physician should:
generate placenta previa as the leading diagnosis
avoid performing a pelvic examination (may cause bleeding)
avoid discharging her from the clinic or emergency room
order coagulation tests and a cross-match

16. Test Items based on a Clinical Problem and its Key Features

17. Scoring the Placenta Previa Clinical Problem
Key Feature 1: To receive one point, must list placenta previa or one of the following synonyms: marginal placenta or low placental insertion
Key Features 2-4: Receive 1/3 point for listing each of the following: 1. Avoid performing a pelvic exam, 2. Avoid discharging from clinic, 3. Order coagulation tests and cross-match
Total score for the problem: Add the scores for items 1 and 2 and divide by 2 (range: 0 - 1)
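The scoring rule above is simple arithmetic. A minimal sketch of it in Python; the exact-match normalization and the short action labels are simplifications for illustration (a real scoring key would match free-text answers against the listed synonyms):

```python
def score_key_feature_1(answer):
    """1 point if the leading diagnosis is placenta previa or an accepted synonym."""
    synonyms = {"placenta previa", "marginal placenta", "low placental insertion"}
    return 1.0 if answer.strip().lower() in synonyms else 0.0

def score_key_features_2_4(actions):
    """1/3 point for each of the three required management actions listed."""
    required = {"avoid pelvic exam", "avoid discharge", "order coagulation tests and cross-match"}
    return len(required & set(actions)) / 3

def total_score(answer, actions):
    """Average of the two item scores; ranges 0 to 1."""
    return (score_key_feature_1(answer) + score_key_features_2_4(actions)) / 2

# Examinee names the diagnosis but lists only two of the three actions:
print(total_score("Placenta previa",
                  ["avoid pelvic exam", "order coagulation tests and cross-match"]))
# item 1 = 1.0, items 2-4 = 2/3, total = (1.0 + 2/3) / 2 = 5/6
```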

18. Steps to Develop Key Features Problems [1]
1. Assemble a problem-writing group
2. Select a problem and define its key features
   Usually chosen from an exam blueprint
   Think of several real cases of the problem in practice
   What are the essential steps in resolving this problem (must be done)?
   Typical decisions or actions: elicit hx., interpret symptoms, make dx.
   Define qualifiers such as urgency or decision-making priority
3. Select a case scenario
4. Select question formats
5. Specify the number of required answers
6. Prepare a scoring key
7. Pilot test the problems

19. Interim Summary of Session
Session thus far:
Two uses of knowledge and performance assessments: formative and summative
Validity of all assessment techniques
Compare and contrast three assessment techniques
Take and critique a 14-item multiple choice exam
Write a key features item
Coming up:
Scoring performance on an SP exam

20. Schematic Diagram of a 9 Station OSCE

21. OSCE Stations: Standardized Patient or Simulation

22. Scoring OSCE Performance
Traditional scoring of SP assessment focuses on numerical data, typically from checklists
Checklist scoring may not accurately assess the clinical performance quality of residents and expert clinicians [1]
Dimensions of the SP exam [2]:
basic science knowledge (organize the information)
physical exam skills (memory of routines)
establishing a human connection
role of the student (appear knowledgeable)
existential dimension of the human encounter (balance one's own beliefs with the patient's)
Clinical competence: a mixture of knowledge and feeling, information processing and intuition

23. Interim Summary of Session
Session thus far:
Two uses of knowledge and performance assessments: formative and summative
Validity of all assessment techniques
Compare and contrast three assessment techniques
Take and critique a 14-item multiple choice exam
Write a key features test item
Use global ratings and narrative comments when scoring OSCE performance
Coming up:
Improving clinical performance assessment systems

24. Bubble Diagram of a Resident Performance Assessment System
1. Organizational infrastructure
   a. Enlisted the Department Chair's strong support as a first step
   b. Alerted departmental administrators about attending faculty who were consistently delinquent in completing forms
   c. Revised the form to list common behaviorally specific statements that could be marked
   d. Established computer databases to store data and make it easily available to the program director
2. Program director
   a. Brainstormed strengths and weaknesses of the current performance assessment system
   b. Synthesized performance assessment data before administrative reviews to facilitate decision making
3. Individual evaluators
   a. Conducted training programs on how completed forms affect administrative decision making about residents
   b. Gave positive feedback to attending faculty who assigned candid numeric ratings and wrote insightful narrative comments

25. Diagnostic Checklist for Clinical Performance Assessment System

26. Three-Year Study to Improve the Quality of Resident Performance Assessment Data
1. What median percentage of each resident's rotations returned one or more completed forms?
2. How precise were the scores marked on the returned forms?
3. What median percentage of each resident's rotations returned one or more forms with behaviorally specific written comments?

27. Results of the Study

28. Making Evaluation Decisions

29. Goals: Assessment of Knowledge & Performance
1. Clarify 2 distinct uses for assessments of knowledge and performance
2. Define 3 aspects of validity for all knowledge and performance assessment methods
3. Compare and contrast 3 techniques for assessing clinical knowledge and performance
4. Identify poorly written multiple choice test items and write a key features test item
5. Describe 3 options for scoring OSCE performance
6. Describe three elements of a clinical performance assessment system
7. Critique a clinical performance assessment system that you use
