
You Have to Know the Score: Best Practices and Issues for Scoring and Reporting Subscores


Presentation Transcript


  1. You Have to Know the Score: Best Practices and Issues for Scoring and Reporting Subscores

  2. Topics Scoring and reporting in support of the purpose and structure of licensure/credentialing examinations. Scoring and reporting best practices. Addressing common challenges and pitfalls that licensure and certification programs encounter.

  3. Learning Objectives 1-2 Understand the relationship of scoring and reporting to the goals of the certification/licensure program. Understand the importance of scoring design and how design strategies should be linked to intended scores and subscores.

  4. Learning Objectives 3-4 Understand the basics of measurement error, including less precision for extreme scores. Understand how to evaluate the added value of providing subscores.
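
  To make the "less precision for extreme scores" point concrete, here is a minimal Python sketch, assuming hypothetical 2PL IRT item parameters: the standard error of an ability estimate is 1/sqrt(test information), and because test information peaks near the middle of the scale, the standard error grows toward the extremes.

```python
import math

# Hypothetical 2PL item parameters (discrimination a, difficulty b) for a
# short illustrative test; an operational exam would use calibrated values.
ITEMS = [(1.2, -1.0), (0.9, -0.5), (1.1, 0.0), (1.0, 0.5), (1.3, 1.0)]

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def standard_error(theta):
    """Standard error of the ability estimate: 1 / sqrt(total test information)."""
    info = sum(item_information(theta, a, b) for a, b in ITEMS)
    return 1.0 / math.sqrt(info)

# Precision is highest near the middle of the scale and lowest at the extremes.
for theta in (-3.0, -1.5, 0.0, 1.5, 3.0):
    print(f"theta = {theta:+.1f}  SE = {standard_error(theta):.2f}")
```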

  5. Learning Objectives 5-6 Understand pros and cons of score reporting options, such as providing scaled scores, providing numerical scores to all or only failing candidates, and providing subscores. Understand the language of score interpretations that can be provided to candidates, including illustrations.

  6. Guidance from Standards ANSI/ISO/IEC 17024. National Commission for Certifying Agencies (NCCA). Standards for Educational and Psychological Testing (AERA, APA, and NCME).

  7. ANSI/ISO/IEC 17024 “Each score reported that is associated with a pass-fail standard should be based on sufficient data...to attain high levels of reliability (internal consistency and decision-related) and validity. Subscores or partial scores…should be linked to the content specification or other document that provides the content structure for the test. If scaled scores are reported, candidates should have access to an interpretive explanation.”

  8. NCCA Standard 19 “The certification program must employ and document sound psychometric procedures for scoring, interpreting, and reporting examination results.”

  9. NCCA Standard 19 additional info “All candidates must be provided with information on their overall performance on an examination.” “Failing candidates must be provided information about their performance in relation to the passing standard. If the program provides feedback to candidates such as domain level information, candidates must be provided guidance about limitations in interpreting and using that feedback.”

  10. NCCA Standard 19 details “The certification program should provide candidates with an explanation of the types of scores reported, appropriate uses and misuses of reported score information….”

  11. NCCA Standard 19 details continued “If domain level information has low reliability, programs are advised against reporting it to candidates and other stakeholders. When domain level or other specific feedback is given to candidates, the certification program should give estimates of its precision and/or other guidance.”

  12. Joint Standards (AERA, APA, NCME) “In many instances, different test takers…receive equivalent forms that have been shown to yield comparable scores, or alternative test forms where scores are adjusted to make them comparable.” (p. 111)

  13. Joint Standards 6.10 “When score information is released, those responsible for testing programs should provide interpretations appropriate to the audience. The interpretations should describe in simple language what the test covers, what scores represent, the precision/reliability of the scores, and how scores are intended to be used.” (p. 119)

  14. Summary Standards provide guidance to best practices for scoring and reporting related to licensure and certification examinations. At the ground level, organizations must consider their available options and implement their own scoring methods and reporting formats.

  15. Speaker Contact Information George T. Gray, EdD, Independent Testing and Measurement Consultant, gtrumongray@gmail.com

  16. Measurement Issues In Score Reporting

  17. Score Reports Must • Provide meaningful information aligned to the intended use of scores • Guide end-users on the interpretation of scores to avoid unintended uses • Allow for consistent interpretation of performance across forms and years • Be understandable to all intended audiences

  18. Meeting Measurement Expectations • Make score reporting plans early in the test design or redesign process • Think through options and evaluate against the criteria taken from the Standards, given the intended use of scores • Develop the test in a way that will provide the information needed to support score reporting

  19. [Figure: score scale from 0 to Max, with the passing point marking the focus of exam content]

  20. What level of information should be reported? • Test-level • Domain-level • Same for all examinees? • More information for failing examinees?

  21. [Figure: score scales from 0 to Max for Domain X, Domain Y, and Domain Z, each shown against the passing point]

  22. How will this information be reported? Scores: • Raw scores (number correct, percent) • Scaled scores • Performance levels
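
  As an illustration of the scaled-score option, a minimal Python sketch of a linear raw-to-scaled conversion anchored at the passing point; the raw cut of 120, scaled cut of 75, and slope are hypothetical values, and an operational program would derive them through equating so the reported passing score stays fixed across forms.

```python
def scale_score(raw, raw_cut=120, scaled_cut=75, slope=0.5):
    """Linear raw-to-scaled conversion anchored at the passing point.

    raw_cut, scaled_cut, and slope are hypothetical; an operational program
    would derive them from equating so the scaled cut stays fixed across forms.
    """
    return round(scaled_cut + slope * (raw - raw_cut))

# A candidate exactly at the raw cut receives the scaled passing score.
print(scale_score(120))  # 75
print(scale_score(110))  # 70 -- below the passing score
print(scale_score(140))  # 85
```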

  23. Test Level Reporting Options [Figure: sample score scales illustrating a pass/fail result reported as a raw score, a percent score, and a scaled score relative to the passing point]

  24. Domain Level Reporting Options All the same options as test-level: - Raw score - Percent score - Scaled score - Pass/Fail Other options - Normative: in relation to all/failing examinees - Graphical - Score band

  25. How should examinees be guided to interpret score reports correctly? • Purpose of the test • Reliability/measurement error • Written guidance

  26. Speaker Contact Information Susan Davis-Becker, Ph.D., Partner – ACS Ventures LLC, sdavisbecker@acsventures.com

  27. Reporting Results and Feedback • Different strategies and approaches are currently used by testing agencies • Approaches are based on: • Content considerations • Psychometric considerations • Design and format considerations • Target audience

  28. Types of Test Results Provided Results and Score Standing or Ranking • Results: reported as Pass/Fail • Standing: an indication of how far away the examinee is from the pass/fail cut score • Ranking: an individual examinee’s performance in comparison to other failing candidates, e.g., a quartile or decile breakdown and the individual’s standing
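
  A small Python sketch of the quartile-ranking idea, computing one failing candidate's standing against all failing candidates on the same form; the scores and the F1–F4 labels are illustrative (loosely modeled on the PEBC example on the next slide), not any program's operational rule.

```python
def quartile_label(score, failing_scores):
    """Quartile label (F1-F4) for one failing candidate, computed against all
    failing candidates on the same form.  Labels and method are illustrative,
    not any program's operational rule."""
    pct = sum(s <= score for s in failing_scores) / len(failing_scores)
    if pct <= 0.25:
        return "F1"   # lowest quarter of failing candidates
    if pct <= 0.50:
        return "F2"
    if pct <= 0.75:
        return "F3"
    return "F4"       # quarter closest to the passing standard

# Hypothetical scores for the failing group on one form.
failing = [88, 92, 95, 97, 99, 101, 103, 105]
print(quartile_label(99, failing))  # F3 with these made-up scores
```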

  29. PEBC MCQ Pass/Fail Feedback: Quartile Ranking [Sample report: ranking in overall performance in comparison to other failing candidates; Your Score: F3]

  30. Types of Feedback Provided Domain Level Performance (including Subscores): • Subject area, competency area or task type • Options: • Pass/Fail performance levels within the domain • Domain scores (raw, %, scaled) • Results can be represented by: • Numerical score or by categories of performance

  31. Types of Feedback Provided Domain Level Performance (including Subscores): • Challenges with each option: • Interpretation • Reliability/confidence • Domain interdependence or overlap

  32. Pharmacist MCQ Blueprint and Feedback (PEBC)

  33. Pharmacist OSCE Feedback Global Ratings

  34. Pharmacist OSCE Competency Feedback

  35. National Board of Medical Examiners

  36. MCC Part I Score Profile

  37. Approaches in Subscore Reporting Best approach: • Report standardized subscores with confidence intervals or score bands rather than simple raw scores or percent scores • Subscores must be reliable, with data to support the development of these scales and their measurement precision • Report subscores relative to a performance standard or a consistent reference group average
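
  A minimal Python sketch of the score-band idea, assuming a classical-test-theory standard error of measurement (SD × sqrt(1 − reliability)) and hypothetical reference-group values; the subscore is standardized against the reference group and reported with a ± 1 SEM band.

```python
import math

def score_band(observed, ref_mean, ref_sd, reliability, z=1.0):
    """Standardize a domain subscore against a reference group and attach a
    +/- z * SEM band.  All inputs here are hypothetical; operational values
    come from the reference group and from reliability analyses."""
    sem = ref_sd * math.sqrt(1.0 - reliability)          # classical SEM
    standardized = (observed - ref_mean) / ref_sd
    lower = (observed - z * sem - ref_mean) / ref_sd
    upper = (observed + z * sem - ref_mean) / ref_sd
    return standardized, lower, upper

std, band_lo, band_hi = score_band(observed=34, ref_mean=30, ref_sd=5, reliability=0.70)
print(f"standardized subscore: {std:+.2f}, band: {band_lo:+.2f} to {band_hi:+.2f}")
```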

  38. Approaches in Subscore Reporting • Example: the National Board of Medical Examiners reports score band profiles which indicate the extent to which a physician’s knowledge is above or below average compared to a consistent reference group • Alternate approach: the Certified Public Accountant licensure exam reports subscore information as either weaker, comparable, or stronger compared to a reference group of just-passing candidates

  39. Approaches in Subscore Reporting • A different approach is to report augmented (combined) subscores or weighted average subscores under certain conditions: • if sections can be meaningfully combined and distinguished from other subscores • if information about the reliability of such composites can be provided
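
  A simplified Python sketch in the spirit of augmented subscores: a weighted average that shrinks a noisy domain score toward the more reliable total score, with the weight tied (for illustration only) to the domain reliability; operational augmentation would estimate weights from the full covariance structure and report the reliability of the composite alongside it.

```python
def augmented_subscore(domain_score, total_score, domain_reliability):
    """Weighted average that shrinks a noisy domain score toward the more
    reliable total score.  The weight is tied to the domain reliability for
    illustration only; operational augmentation would estimate weights from
    the full covariance structure and report the composite's reliability."""
    w = domain_reliability                  # weight on the observed domain score
    return w * domain_score + (1.0 - w) * total_score

# Hypothetical percent-correct scores: a domain score of 60 with reliability
# 0.55 is pulled toward the candidate's overall score of 72.
print(augmented_subscore(domain_score=60.0, total_score=72.0, domain_reliability=0.55))
# -> 65.4
```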

  40. Design and Layout of Result Reports • Goodman and Hambleton studied the design and layout of educational score reports (2004) • Identified five general areas of weakness that apply to certification reports: • Information provided was either excessive or lacking • Lack of information about the precision of test scores (measurement error)

  41. Design and Layout of Result Reports • Presence of unnecessary psychometric, statistical or technical jargon • Lack of definition and interpretation of key terms (need to understand terminology) • Cluttered reports (lack of space and too much information presented) - difficult to read and interpret

  42. Design and Layout of Result Reports • Layout should be clean and simple, with key results prominent or highlighted • Careful consideration of language in reports is critical • Use clear and concise language • Simplify and streamline communications • Avoid jargon and highly technical terms • Use illustrations: standalone or comparative information

  43. Conclusions • Many strategies for reporting candidate feedback • Important to decide: • What approaches align with the purpose of the test and with candidates’ needs? • What approaches facilitate understanding? • What approaches are most useful? • What approaches provide reliable feedback that avoids misinterpretation?

  44. Conclusions • Test purpose, design, results and score reports need to be aligned – plan with the end in mind • Test reports should include appropriate information in a format and language that is clear • Information is needed about: • Content/competencies assessed (blueprint) • Meaning of scores (and subscores/levels) • Precision of scores (and subscores/levels) • Proper use and interpretation including limitations

  45. Speaker Contact Information John Pugsley, B.Sc. Phm., ACPR, Pharm.D. Registrar-Treasurer – PEBC jpugsley@pebc.ca
