
Assessment in Education


Presentation Transcript


  1. Assessment in Education
  Patricia O’Sullivan, Office of Educational Development, UAMS

  2. Last Month
  Dr. Allen described a situation in which an individual asked her how to get to Freeway Medical Center. How would you determine how successful Dr. Allen was?

  3. Objectives
  • Provide terminology and principles fundamental to assessment
  • Introduce various assessment approaches
  • Review tips for a common assessment: Multiple Choice Questions
  • Review elements for a less common assessment: Portfolio

  4. What is Assessment?
  • Appraising performance through measurement to make an evaluation
  • For what purpose?
    • Formative
    • Summative

  5. What Assessments are Used at UAMS?

  6. What Makes a Good Summative Assessment?
  • Fair and practical
  • Based on appropriate criteria that are shared with the learners
  • Valid
  • Reliable

  7. Validity
  • Key to the interpretability and relevance of scores
  • Do the scores have a plausible meaning?
  • What are the implications of interpreting the scores?
  • What is the utility of the scores?
  • What are the actual social consequences?

  8. Validity Evidence for Proposed Interpretation
  • Content/Construct evidence
    • Content matches the objectives & teaching
    • Score is shown to be meaningful
  • Criterion/Predictor evidence
    • Score relates to future outcomes

  9. Reliability
  • Assessment data must be reproducible
  • Written tests
    • Internal consistency: Cronbach’s alpha or KR-20 (values range from 0 to 1; see the sketch below)
    • Test-retest: a correlation ranging from 0 to 1
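  The coefficients on slide 9 are straightforward to compute. Below is a minimal sketch, not part of the original deck: the score matrix is hypothetical, and the function implements the standard Cronbach’s alpha formula, which equals KR-20 when items are scored dichotomously (0/1).

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an examinees x items score matrix.
    For dichotomous (0/1) items this equals KR-20."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 examinees x 4 dichotomous items
data = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
print(f"KR-20 / alpha = {cronbach_alpha(data):.2f}")  # 0.80 for this data
```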

  10. Reliability (cont.)
  • Rater-based assessments
    • Interrater consistency or agreement (generally a correlation coefficient or an intraclass correlation; see the sketch below)
  • OSCE & performance assessments
    • Generalizability theory
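  To make the distinction between consistency and agreement on slide 10 concrete, here is a minimal sketch using hypothetical ratings from two raters. Consistency asks whether the raters rank-order performances similarly (a correlation); agreement asks whether they assign the same scores.

```python
import numpy as np

# Hypothetical: two raters scoring the same 6 performances on a 1-9 scale
rater_a = np.array([7, 5, 8, 6, 4, 9])
rater_b = np.array([6, 5, 8, 7, 4, 8])

# Interrater consistency: Pearson correlation (do raters rank-order alike?)
r = np.corrcoef(rater_a, rater_b)[0, 1]

# Interrater agreement: exact-match proportion (do raters give the same score?)
agreement = np.mean(rater_a == rater_b)

print(f"consistency r = {r:.2f}, exact agreement = {agreement:.2f}")
```

  Note how the two can diverge: these raters agree exactly only half the time, yet their scores correlate highly because they rank the performances almost identically.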

  11. Improve Validity
  • Match the assessment with the objectives
  • Increase the sample of objectives and content on the assessment
  • Review the blueprint (or produce one)
  • Use test methods appropriate to the objectives
  • Ensure test security

  12. Improve Reliability
  • Clear questions
  • Appropriate time allotted
  • Simple, clear, and unambiguous instructions
  • High-quality scoring
  • Increase the number of observations or questions

  13. Norm or Criterion Referenced
  • Norm-referenced (relative standard)
    • Use the results of the assessment to set the standards (e.g., the proportion to receive each grade); performance is judged by comparison with others
  • Criterion-referenced (absolute standard)
    • Learners must achieve some minimal standard of competency
  The sketch below contrasts the two approaches on the same set of scores.
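  This is a minimal illustration, not part of the original deck: the exam percentages are hypothetical, and the cutoffs (70% for the criterion-referenced standard, the class median for the norm-referenced one) are arbitrary choices for the example.

```python
import numpy as np

scores = np.array([62, 71, 55, 88, 74, 91, 67, 80])  # hypothetical exam percentages

# Criterion-referenced: absolute standard, e.g. pass at >= 70 regardless of peers
passes_criterion = scores >= 70

# Norm-referenced: relative standard, e.g. pass only those above the class median
cutoff = np.median(scores)
passes_norm = scores > cutoff

print("criterion pass:", passes_criterion.sum(), "of", scores.size)  # 5 of 8
print("norm pass:     ", passes_norm.sum(), "of", scores.size)       # 4 of 8
```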

  14. Standard Setting Procedures
  • Judgments based on:
    • Holistic impressions of the exam or item pool
    • The content of individual test items
    • Examinees’ test performance

  15. Angoff Standard Setting Procedure
  • Judges are instructed to think of a group of “minimally acceptable” persons
  • For each item, estimate the proportion of this group that will answer the item correctly
  • Sum the proportions to get the minimum passing score for a single judge
  • Average across judges
  The sketch below walks through these steps on hypothetical ratings.
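  The Angoff procedure on slide 15 is essentially a sum-then-average calculation. Here is a minimal sketch with hypothetical ratings from three judges on a five-item exam; the numbers are invented for illustration.

```python
import numpy as np

# Hypothetical Angoff ratings: each row is one judge, each column one item.
# Entry = judge's estimate of the proportion of "minimally acceptable"
# examinees who would answer that item correctly.
ratings = np.array([
    [0.60, 0.75, 0.40, 0.85, 0.55],  # judge 1
    [0.65, 0.70, 0.45, 0.80, 0.50],  # judge 2
    [0.55, 0.80, 0.35, 0.90, 0.60],  # judge 3
])

per_judge_cutoff = ratings.sum(axis=1)   # minimum passing score per judge
passing_score = per_judge_cutoff.mean()  # averaged across judges

print("per-judge cutoffs:", per_judge_cutoff)
print(f"Angoff passing score: {passing_score:.2f} of {ratings.shape[1]} items")
```

  For these ratings the cutoffs are 3.15, 3.10, and 3.20, so the averaged passing score is 3.15 of 5 items, i.e. 63%.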

  16. Assessment Methods
  • ACGME Toolbox
  • Additional methods
    • Other written assessments
      • Essay
      • Short answer and computation
      • Patient management problems
      • Modified essay questions
    • Create a game
    • Self-assessment

  17. Matching Assessment Method and Objectives
  • Examine the ACGME matrix

  18. Alternative Testing Modes
  • Take-home tests
  • Open-book tests
  • Group exams
  • Paired testing

  19. Reducing Test Anxiety
  • Make the first exam relatively easier
  • Give more than one exam
  • Give advice on how to study
  • Encourage study groups
  • Give a diagnostic test early

  20. Reducing Test Anxiety (cont.)
  • Before a test, explain the format
  • Provide sample items
  • Provide a pool of final items with the syllabus

  21. Technology and Assessment
  • Computer-based testing
  • Use of images and video clips
  • Computer Adaptive Testing (CAT); a minimal illustration follows this slide
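  Computer Adaptive Testing, mentioned on slide 21, selects each next item based on the examinee’s running ability estimate. The sketch below is a deliberately simplified illustration, not a production algorithm: it assumes a Rasch-style response model, a small hypothetical item bank, and a crude shrinking-step ability update in place of a proper maximum-likelihood estimate.

```python
import math
import random

# Hypothetical item bank: each number is an item's difficulty on a logit scale.
ITEM_DIFFICULTIES = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]

def run_cat(true_ability: float, n_items: int = 5) -> float:
    """Simulate a short adaptive test and return the ability estimate."""
    theta = 0.0                       # start the estimate at average ability
    remaining = list(ITEM_DIFFICULTIES)
    for step in range(1, n_items + 1):
        # Adaptive step: administer the unused item closest to the estimate.
        item = min(remaining, key=lambda b: abs(b - theta))
        remaining.remove(item)
        # Simulate the response under a Rasch model: correct with
        # probability 1 / (1 + exp(-(ability - difficulty))).
        p_correct = 1.0 / (1.0 + math.exp(-(true_ability - item)))
        correct = random.random() < p_correct
        # Crude update: move toward the response with a shrinking step size.
        theta += (0.8 / step) * (1.0 if correct else -1.0)
    return theta

print(f"estimated ability: {run_cat(true_ability=1.0):.2f}")
```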

  22. Take a Test

  23. Multiple Choice Questions: Writing Tips
  • http://www.nbme.org/about/itemwriting.asp
  • Simple suggestions:
    • Design for one best answer
    • Place your hand over the stem. Can you guess the answer?
    • Use a template
    • Attend to the “graveyard” of test items (slide 25)

  24. Template Examples
  • A (patient description) is unable to (functional disability). Which of the following is most likely to have been injured?
  • Following (procedure), a (patient description) develops (symptoms and signs). Laboratory findings show (findings). Which of the following is the most likely cause?

  25. Graveyard
  • B-type: matching a heading with words or phrases (a heading can be used more than once)
  • D-type: complex matching; a two-step process
  • K-type: a stem with four options, where the choices are:
    A = 1, 2, 3 only; B = 1, 3 only; C = 2, 4 only; D = 4 only; E = all are correct

  26. Portfolios

  27. (Diagram: potential portfolio contents, such as case records, conferences, evaluations, and logs)

  28. Portfolios
  • Definition: a purposeful collection of student work that exhibits to the student (and/or others) the student’s efforts, progress, or achievement in (a) given area(s)

  29. Selection, Criteria, Reflection
  • The collection must include student participation in:
    • Selection of portfolio content
    • The criteria for selection
    • The criteria for judging merit
  • It must also include evidence of student reflection

  30. Portfolios
  • For learning: showing progress, showing effort, showing remediation
  • For assessment: showing competence

  31. (Diagram: sample portfolio entries, such as presentations, logs, case records, and evaluations)

  32. Criteria (worksheet with blanks for learners to fill in)

  33. Criteria

  34. Portfolio Reflection
  • Why the entry was selected
  • How it reflects the learner’s skills and abilities

  35. In Using Portfolios, the Learner Engages in Assessment
  • Making choices
  • Critically self-examining
  • Explaining what they are doing

  36. Well-structured Portfolio
  • Menu
  • Selection criteria
  • Scoring rubric
  • Benchmarks
  • Trained raters

  37. Determine Reliability and Validity
  • Reliability: generalizability theory
    • Studies found that six entries with two raters were needed
  • Validity: evidence from a number of studies, including
    • Resident performance across years
    • Resident performance in different areas
    • Resident perception
    • Resident performance when the program made an intervention
    • Correlation of portfolio performance with other areas

  38. 10-Step Approach
  1. Define the purpose
  2. Determine the competencies to be assessed
  3. Select portfolio materials
  4. Develop an evaluation system
  5. Select and train examiners

  39. 10-Step Approach (cont.)
  6. Plan the examination process
  7. Orient learners
  8. Develop guidelines for decisions
  9. Establish reliability and validity evidence
  10. Design evaluation procedures
