
Fundamentals of Assessment and Grading


Presentation Transcript


  1. Fundamentals of Assessment and Grading APGO Clerkship Directors’ School. Alice Chuang, MD, Department of Obstetrics and Gynecology, University of North Carolina-Chapel Hill, Chapel Hill, NC. AOE Basic Teaching Skills Curriculum, April 16, 12:00 PM, Bondurant G010

  2. Neither I nor my spouse has any financial interests to disclose related to this talk.

  3. Objectives • Understand reliability and validity • Contrast formative and summative evaluation • Compare and contrast norm-referenced and criterion-referenced assessments • Improve the delivery of feedback • Understand the NBME exam • Be familiar with different testing formats, their uses, and their limitations

  4. Terminology • Validity: Are we measuring what we think we’re measuring? • Content: Does the instrument measure the depth and breadth of the content of the course? Does it inadvertently measure something else? • Construct: Do the evaluation criteria or grading construct allow for true measurement of the knowledge, skills, or attitudes taught in the course? Is any part of the grading construct irrelevant? • Criterion: Does the outcome correlate with true competencies? Does it relate to important current or future events? Is the assessment relevant to future performance? http://pareonline.net/getvn.asp?v=7&n=10

  5. Examples • Validity • Content: a summative ob/gyn test that covered only obstetrics (the gynecology half of the course goes unmeasured) • Construct: you allow students to use their textbook for a knowledge-based multiple-choice test of foundational information on prenatal care (the test then measures look-up skill rather than the knowledge taught) • Criterion: New Coke v. Old Coke (sip tests predicted a preference that did not hold up in real-world purchasing)

  6. Terminology • Reliability: Are our measurements consistent? The score should be the same no matter when it was taken, who scored it, or when it was scored. • Interrater reliability: Is a student’s score consistent between evaluators? • Intrarater reliability: Is a student’s score consistent with the same rater even if rated under different circumstances? • Scoring rubric: standardized method of grading to increase interrater and intrarater reliability http://pareonline.net/getvn.asp?v=7&n=10
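
To make these terms concrete, here is a minimal sketch (hypothetical Python, with invented rubric scores) of two common agreement statistics for a pair of raters: raw percent agreement and Cohen's kappa, which corrects for the agreement two raters would reach by chance. A scoring rubric is precisely an attempt to push these numbers up.

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Fraction of students to whom both raters gave the same rubric score."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for chance: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: the probability that raters assigning categories at
    # these marginal rates would agree by luck alone.
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Invented rubric scores (1-5) from two faculty raters on ten students.
rater_a = [4, 3, 5, 2, 4, 3, 5, 4, 2, 3]
rater_b = [4, 3, 4, 2, 4, 2, 5, 4, 3, 3]

print(f"percent agreement = {percent_agreement(rater_a, rater_b):.2f}")  # 0.70
print(f"Cohen's kappa     = {cohens_kappa(rater_a, rater_b):.2f}")
```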

  7. Examples: In general, if you repeat the same assessment, will you get the same answer? Interrater: three individuals are asked to go to the beach and count the seagulls they see from 6-7 AM, and they come up with 200, 800, and 1200 (poor interrater reliability). Intrarater: a particular food critic always gives low scores for food quality if the server is female (the same food earns different scores depending on an irrelevant circumstance: poor intrarater reliability).

  8. Examples: Show Choir Audition Rubric

  9. Formative v. summative assessments Formative: ongoing assessment, designed to help improve the educational program as well as learner progress. Summative: designed to evaluate a student’s overall performance at the end of an educational phase and to evaluate the effectiveness of teaching. http://fcit.usf.edu/assessment/basic/basica.html

  10. Examples Formative: a short, pass/fail multiple-choice exam written in house; answers are reviewed with the class at the end of the testing session. Summative: the NBME exam.

  11. Formative v. summative assessments • ED30: The directors of all courses and clerkships must design and implement a system of formative and summative evaluation of student achievement in each course and clerkship. Those responsible for the evaluation of student performance should understand the uses and limitations of various test formats, the purposes and benefits of criterion-referenced vs. norm-referenced grading, reliability and validity issues, formative vs. summative assessment, etc.

  12. Formative v. summative assessments • ED31: Each student should be evaluated early enough during a unit of study to allow time for remediation. • ED32: Narrative descriptions of student performance and of non-cognitive achievement should be included as part of evaluations in all required courses and clerkships where teacher-student interaction permits this form of assessment.

  13. Formative v. summative assessments

  14. Characteristics of feedback Effective feedback is: • given with the goal of improvement • timely • honest • respectful • clear • issue-specific • objective • supportive • motivating • action-oriented • solution-oriented Destructive feedback is: • unhelpful • accusatory • personal • judgmental • subjective It also: • undermines the self-esteem of the receiver • leaves the issue unresolved • leaves the receiver unsure how to proceed http://www.expressyourselftosuccess.com/the-importance-of-providing-constructive-feedback/

  15. Feedback…from APGO/CREOG 2011 A fill-in-the-blank script for delivering feedback: “When you…” “You give the impression…” “I would stop…” “I would recommend…instead.”

  16. Norm-referenced v. criterion-referenced assessments • Norm-referenced • Purpose is to classify students in order of achievement from low to high • Allows comparisons among students • May not give accurate information regarding students’ abilities • Half of the students should score above the midpoint and the other half below it (Ricketts C. A plea for the proper use of criterion-referenced tests in medical assessment. Med Educ, Vol 43, Issue 12.)

  17. Norm-referenced v. criterion-referenced assessments • Criterion-referenced • Purpose is to evaluate students’ knowledge and skills against a pre-determined performance level • Gives information about a student’s achievement of specific objectives • It should be possible for every student to earn a passing score (Ricketts C. A plea for the proper use of criterion-referenced tests in medical assessment. Med Educ, Vol 43, Issue 12.)

  18. Example Norm-referenced: soccer tryouts in which 11 players are chosen out of 40. Criterion-referenced: the test for a driver’s license. (A sketch contrasting the two grading schemes follows.)
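
As a minimal illustration of the contrast (names and scores invented), the sketch below grades one class twice: norm-referenced, where honors is capped at the top third of the class by rank, and criterion-referenced, where anyone who clears a pre-set standard passes.

```python
# Invented end-of-clerkship exam scores.
scores = {"Ana": 88, "Ben": 74, "Cam": 91, "Dee": 67, "Eli": 82, "Fay": 79}

# Norm-referenced: students are ordered against each other, and honors is
# capped at the top third of the class regardless of the absolute scores.
ranked = sorted(scores, key=scores.get, reverse=True)
n_honors = len(ranked) // 3
norm_grades = {name: ("Honors" if rank < n_honors else "Pass")
               for rank, name in enumerate(ranked)}

# Criterion-referenced: students are compared to a pre-determined standard,
# so it is possible for every student to pass (or for every student to fail).
PASSING_STANDARD = 75
criterion_grades = {name: ("Pass" if score >= PASSING_STANDARD else "Fail")
                    for name, score in scores.items()}

for name in ranked:
    print(f"{name}: {scores[name]}  norm={norm_grades[name]}  "
          f"criterion={criterion_grades[name]}")
```

Note that Ben (74) passes under the norm-referenced scheme but fails the criterion-referenced one; the two schemes answer different questions.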

  19. Norm-referenced v. criterion-referenced assessments Decide up front whether your assessment should be norm-referenced or criterion-referenced, and design it with that purpose in mind. Most assessments in medical education are criterion-referenced. Norm-referenced tests should emphasize variability (spreading examinees apart); criterion-referenced tests should emphasize accurate coverage of the tested material.

  20. NBME • Exams • Developed by committees and content experts • Same protocol used to build Step 1 and Step 2 • In general, subject exams are provided to: • all 130 LCME-accredited medical schools in the US • 8 Canadian medical schools • 8 osteopathic medical schools • 22 international medical schools

  21. NBME Scaled to have a mean of 70 and an SD of 8, based on 9000 first-time test takers from 80+ schools who took the exam as an end-of-clerkship exam in 1993-94. Scores do not reflect the percentage of questions answered correctly.
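
The scaling is a linear transformation of the raw score through its z-score. A sketch of the arithmetic follows; the raw-score mean and SD here are invented placeholders, since the true values belong to the norming data and are not given on this slide.

```python
# NBME subject-exam scaled scores: mean 70, SD 8 on the 1993-94 norming cohort.
SCALED_MEAN, SCALED_SD = 70, 8

# Placeholder raw-score statistics for the norming cohort (invented values).
RAW_MEAN, RAW_SD = 62.0, 10.5

def scaled_score(raw):
    """Standardize the raw score, then re-express it on the mean-70/SD-8 scale."""
    z = (raw - RAW_MEAN) / RAW_SD
    return SCALED_MEAN + SCALED_SD * z

# A raw score one SD above the norming mean maps to 78, which is why a scaled
# score cannot be read as a percentage of questions answered correctly.
print(scaled_score(72.5))  # 78.0
```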

  22. NBME: What do those scores mean? A score of 60 in the fourth quarter means that 2% of the examinees in the fourth quarter scored 60 or below!

  23. NBME: Academic purpose for exam

  24. NBME: Weight given to the subject exam

  25. NBME 2008 Clerkship Survey Results

  26. NBME • 2004 and 2009 surveys of performance guidelines across clerkships • Recommend setting an absolute rather than a relative standard for performance • Angoff procedure: item-based; judges estimate the proportion of minimally proficient examinees who would answer each question correctly (see the sketch below) • Hofstee method: judges set the minimum and maximum acceptable passing scores and failure rates, which are then plotted against the exam’s actual score and failure-rate distribution
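
As a rough sketch of the Angoff arithmetic (all ratings invented): each judge estimates, item by item, the probability that a minimally proficient examinee answers correctly; a judge's estimates sum to that judge's implied passing score, and the panel's cut score is the average across judges.

```python
# Invented Angoff ratings: each row holds one judge's per-item estimates of the
# probability that a minimally proficient examinee answers that item correctly.
angoff_ratings = [
    [0.70, 0.55, 0.80, 0.60, 0.65],  # judge 1
    [0.75, 0.50, 0.85, 0.55, 0.70],  # judge 2
    [0.65, 0.60, 0.75, 0.65, 0.60],  # judge 3
]

# Sum each judge's estimates, then average the sums across the panel.
judge_sums = [sum(row) for row in angoff_ratings]
cut_score = sum(judge_sums) / len(judge_sums)

n_items = len(angoff_ratings[0])
print(f"cut score = {cut_score:.2f} of {n_items} items "
      f"({100 * cut_score / n_items:.0f}% correct to pass)")  # 3.30, 66%
```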

  27. NBME

  28. Testing Formats • Multiple choice exam (MCQ) • Objective structured clinical examination (OSCE) • Oral examination • Direct observation • Simulation • Standardized patient • Patient/procedure log • Medical record reviews • Written essay questions (Casey et al. To the point: reviews in medical education – the Objective Structured Clinical Examination. AJOG, Jan 2009.)

  29. Testing format: MCQ • Use distractors that could plausibly represent the correct answer • Use a question format, not a complete-the-statement format • Emphasize higher-level thinking, not strict memorization • Keep option length consistent within a question • Balance the placement of the correct answer • Use correct grammar • Avoid clues to the correct answer • Highly reliable and valid for assessing knowledge http://testing.byu.edu/info/handbooks/14%20Rules%20for%20Writing%20Multiple-Choice%20Questions.pdf

  30. Testing format: OSCE • Examinees rotate through a circuit of stations (5-10 minutes each) • One-on-one examination (with an examiner or a trained or simulated patient) • List of criteria for successful completion of each station • Each station tests a specific skill or competency • Good for examining higher-order skills, clinical and technical • Requires a large amount of resources
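
The circuit structure lends itself to a simple rotation schedule. Below is a minimal sketch (station names and group size are invented): with as many examinees as stations, a circular shift sends every examinee through every station exactly once.

```python
# Invented OSCE stations and examinees (equal counts make the circuit exact).
stations = ["History taking", "Pelvic model exam", "Counseling (SP)",
            "Suturing", "Ultrasound interpretation"]
examinees = ["Student A", "Student B", "Student C", "Student D", "Student E"]

n = len(stations)
for rnd in range(n):
    print(f"Round {rnd + 1} (5-10 minutes):")
    for i, examinee in enumerate(examinees):
        # Each round, every examinee advances one position around the circuit.
        print(f"  {examinee} -> {stations[(i + rnd) % n]}")
```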

  31. Testing format: Oral Exam • Portfolio-based: similar to the case-based portion of the Oral Boards • Poor inter-rater and intra-rater reliability • Scores are higher when graded live versus on video • Teaching students how to do better on oral exams does not improve scores • Practicing oral exams does improve scores • A mock public oral exam improves performance • Limitations • Halo effect (the grade reflects not only performance on the exam but also previous experience) • Subconscious consensus grading: examiners take subconscious cues from each other (Burch & Seggie, 2008; Kearney et al, 2002; Burchard et al, 2007; Jacobsohn et al, 2006)

  32. Testing format: Oral Exam Is an oral exam justified? Is there an advantage? Does the material lend itself to open questioning? How will communication skills and the delivery of information be graded? Will only content be graded? Is the examiner experienced? Will he/she skew grades in any way? How will you prepare students for the exam? Is there enough time to examine every student adequately? How much prompting/assistance is allowed during the oral examination? How much time will you allow for “thinking”? How will you ensure consistency in these areas for all examinees? http://www.cord.edu/faculty/ulnessd/oral/MCarlson/questions.html

  33. Testing format: Direct observation • Formalized criteria • Various observers • True-to-life clinical setting (versus simulated) • Numerical scores • Comment-anchored • Improve reliability with multiple perspectives • Consider 360° evaluation (including self, patient, and other staff members)

  34. Testing format

  35. General rules of thumb Be sure your assessment: • Provides reliable data • Provides valid data • Provides valuable data • Is feasible • Can be incorporated into the systems in place (hospital, clinic, curriculum, etc.) • Is consistent with course objectives • Utilizes multiple instruments, multiple assessors, and multiple points of assessment • Aligns with pre-specified criteria • Is fair (Lynch and Swing. Key Considerations for Selecting Assessment Instruments and Implementing Assessment Systems. ACGME.)

  36. References
Bond, Linda A. (1996). Norm- and criterion-referenced testing. Practical Assessment, Research & Evaluation, 5(2). Accessed at http://pareonline.net/getvn.asp?v=5&n=2
Burch VC, Seggie JL. Use of a structured interview to assess portfolio-based learning. Med Educ 2008; 42: 894-900.
Burchard K, et al. Is it live or is it Memorex? Student oral examinations and the use of video for additional scoring. Am J Surg 2007; 193: 233-236.
Casey et al. To the point: reviews in medical education – the Objective Structured Clinical Examination. AJOG, Jan 2009.
Jacobsohn E, Kock PA, Avidan M. Poor inter-rater reliability on mock anesthesia oral examinations. 2006.
Kearney RA, et al. The inter-rater and intra-rater reliability of a new Canadian oral examination format in anesthesia is fair to good. Can J Anesth 2002; 49(3): 232-236.
Lynch and Swing. Key Considerations for Selecting Assessment Instruments and Implementing Assessment Systems. ACGME.
Metheny WP, Espey EL, Bienstock J, et al. To the point: medical education reviews – evaluation in context: assessing learners, teachers, and training programs. Am J Obstet Gynecol 2005; 192(1): 34-37.
Moskal, Barbara M. & Jon A. Leydens (2000). Scoring rubric development: validity and reliability. Practical Assessment, Research & Evaluation, 7(10). Retrieved December 29, 2009 from http://PAREonline.net/getvn.asp?v=7&n=10
Ricketts C. A plea for the proper use of criterion-referenced tests in medical assessment. Med Educ, Vol 43, Issue 12.

  37. References
14 Rules for Writing Multiple Choice Questions. Brigham Young University 2001 Annual Conference. Accessed at http://testing.byu.edu/info/handbooks/14%20Rules%20for%20Writing%20Multiple-Choice%20Questions.pdf
Formative vs. Summative Assessments. Classroom Assessment. Accessed at http://fcit.usf.edu/assessment/basic/basica.html
NBME 2008 Clinical Clerkship Director Survey Results. Accessed at https://portal.nbme.org/web/medschools/home?p_p_id=62_INSTANCE_dOGM&p_p_action=0&p_p_state=maximized&p_p_mode=view&p_p_col_id=column-1&p_p_col_count=1&_62_INSTANCE_dOGM_struts_action=%2Fjournal_articles%2Fview&_62_INSTANCE_dOGM_keywords=&_62_INSTANCE_dOGM_advancedSearch=false&_62_INSTANCE_dOGM_andOperator=true&_62_INSTANCE_dOGM_groupId=1172&_62_INSTANCE_dOGM_searchArticleId=&_62_INSTANCE_dOGM_version=1.0&_62_INSTANCE_dOGM_name=&_62_INSTANCE_dOGM_description=&_62_INSTANCE_dOGM_content=&_62_INSTANCE_dOGM_type=&_62_INSTANCE_dOGM_structureId=&_62_INSTANCE_dOGM_templateId=&_62_INSTANCE_dOGM_status=approved&_62_INSTANCE_dOGM_articleId=817480
Objective Structured Clinical Examination. Wikipedia. Accessed at http://en.wikipedia.org/wiki/Objective_structured_clinical_examination
Reliability and Validity. Classroom Assessment. Accessed at http://fcit.usf.edu/assessment/basic/basicc.html
Talk about teaching: Significant issues in Oral Examinations. Contributed by Meryl Carlson, Concordia College, Moorhead, MN. Accessed at http://www.cord.edu/faculty/ulnessd/oral/MCarlson/questions.html
