
Panel 4 Testing Integrity Practices and Procedures for Online and Computer-based Assessments


Presentation Transcript


  1. Panel 4: Testing Integrity Practices and Procedures for Online and Computer-based Assessments
  Panelists:
  Wayne Camara: College Board
  John Fremer: Caveon Test Security
  Wes Bruce: Indiana Department of Education
  Tony Alpert: SMARTER Balanced Assessment Consortium

  2. Testing Integrity Practices and Procedures for Online and Computer-based Assessments Wayne J. Camara College Board

  3. CBT vs. Paper
  • Online testing offers numerous advantages over paper-and-pencil (P&P) testing, including features that can improve test security.
  • As with all assessments, the intended purpose and potential consequences suggest the types of threats to test integrity to focus on.
  • Threats to all assessments:
    • item exposure
    • candidate authenticity
    • data transmission and storage
    • proctor and personnel integrity
    • system integrity (preventing interruptions and irregularities)

  4. Assessment Purposes and Threats to Testing Integrity
  • Cheating increases with student age, bandwidth, and distance (Rowe, 2004).
  • Summative assessments: different threats emerge for different intended uses of scores:
    • School and district accountability
    • Student rewards (endorsed diploma, entry into a college credit-bearing course)
    • Teacher and educator accountability (financial incentives or penalties, disciplinary actions)
    • Student barriers (graduation, retention, mandatory developmental programs, college remediation courses)

  5. Testing Integrity: Unique risks with CBT

  6. Processes/policies that could mitigate risks to the integrity of CBT test results
  Processes and policies must be tailored to the types of risks or threats to test integrity that are anticipated, based on the intended use, stakes, and consequences for schools, students, and educators.
  Reduce the risk of item exposure. Extended testing windows with the same form present the biggest security threat when tests are used for high stakes:
  • More robust item banks and spiraling (a sketch of form spiraling follows this slide)
  • Use of multistage adaptive models
  • Linear forms require more forms for the same testing window, or single use
  Reuse of items operationally, for equating, or for pretesting:
  • Reused scenarios, simulations, or extended performance tasks are more easily captured and hence have less validity when exposed for any length of time.
  • Limit disclosure and reuse over several years.
  • Limit reuse of performance tasks (an extended multi-year window without release, or develop hundreds of tasks to pool from).
  • Limit retesting: use different forms/item pools.
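A minimal sketch of form spiraling, assuming a hypothetical seating roster and form labels (an illustration only, not any state's or consortium's operational assignment scheme):

```python
# Minimal sketch of form spiraling: rotate forms down a seating roster so
# adjacent students never receive the same form. Roster and form labels
# are hypothetical; operational spiraling is handled by the test vendor.

def spiral_forms(roster, forms):
    """Assign forms in rotation down the seating order."""
    if len(forms) < 2:
        raise ValueError("spiraling needs at least two distinct forms")
    return {student: forms[i % len(forms)] for i, student in enumerate(roster)}

roster = ["s01", "s02", "s03", "s04", "s05"]   # seating order
for student, form in spiral_forms(roster, ["A", "B", "C"]).items():
    print(student, "-> Form", form)
```

Rotating forms down the seating order guarantees that no two adjacent students share a form, which is the property the copying-prevention argument relies on.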

  7. Recommendations: Processes/policies that could mitigate risks to the integrity of CBT test results
  Administration and Scoring
  • Reduce the opportunity for cheating; send the message that cheating is not tolerated.
  • Classroom teachers should not administer tests to students in their own classes: there is simply too much temptation.
  • Proctors should have no stake in the outcome, or risk collusion.
  • The environment should preclude copying responses from adjacently seated students (spiraling, different forms, or a physical obstruction); document seating and proctors.
  • Mandatory training of proctors and administrators handling test materials; verify understanding of appropriate test procedures and the consequences of unauthorized procedures.
  • Students read and sign a statement such as an honor code or integrity policy.
  • Prohibit all handheld electronic devices (smartphones, calculators).
  • Employ a variety of item formats and constructed-response tasks to reduce the ease of cheating.
  • Impose conditions on retest opportunities; beware of students unplugging equipment to restart or retest.

  8. Recommendations: Processes/policies that could mitigate risks to the integrity of CBT test results
  Technology
  • Prepare for the unexpected; it will occur.
  • Ensure students cannot access web resources outside the system.
  • Encrypt items and data and store them on a secure server (not desktops).
  • Paper forms use different item banks, with a chain of custody established.
  • Audit social networks, school preparation materials, blogs.
  • Ensure high system reliability: outages, interruptions, and irregularities require candidates to stop and start, retest, or complete paper forms.
  • Guard against 'sniffers' that decipher and read items/responses, and against attempts to have test administrators disclose passwords (McClure et al., 2001).
  • Disable network capabilities and printers. Conduct formal web crawling before and after testing.
  • Use intrusion detection software to detect attacks early.
  • Back up the grade book or roster in case of attack or changes.
  Statistical
  • Checks on aberrance rates and retest or score volatility statistics (individual, site): does the data conform to test response models?
  • Check for irregular latencies and response patterns (a latency sketch follows this slide).
  • High/low aberrance scores, cheating index, thresholds (Impara et al., 2005).
  • Distance assessments: when online performance exceeds traditional tests, include some traditional assessments (Rowe, 2004).
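A minimal sketch of one such latency check, assuming toy data and an invented 10-second plausibility floor (operational aberrance indices such as those cited above are far more sophisticated):

```python
# Hypothetical sketch of a rapid-guessing screen: flag examinees whose
# median item latency falls below a plausibility floor. The data and the
# 10-second floor are illustrative, not operational thresholds.
from statistics import median

latencies = {                       # seconds per item (toy data)
    "s01": [42, 55, 61, 48, 50],
    "s02": [40, 52, 58, 45, 47],
    "s03": [4, 5, 6, 5, 4],         # far too fast to have read the items
}

def flag_rapid_guessers(latencies, floor_seconds=10):
    """Flag examinees whose median item latency is below the floor."""
    return [s for s, times in latencies.items()
            if median(times) < floor_seconds]

print(flag_rapid_guessers(latencies))   # -> ['s03']
```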

  9. National Council on Measurement in Education (NCME) Draft Guidelines on Testing & Data Integrity
  • Data integrity is a shared ethical and professional requirement.
  • Develop and implement a comprehensive data integrity policy, and explain why it is important.
  • Tailor the policy to the use of the test.
  • Training for all levels, with examples of unacceptable behaviors (nondisclosure, confidentiality, participation forms).
  • Proactive prevention: eliminate opportunities.
  • Comprehensive data collection and maintenance.
  • Comprehensive policies for reporting cheating, security breaches, and suspicious activities (database & investigations).
  • Biometrics, data forensics, statistical patterns, etc.

  10. Thank you Wayne Camara, wcamara@collegeboard.org

  11. NCES Sponsored Symposium on Testing Integrity February 28, 2012 Dr. John Fremer President Caveon Consulting Services

  12. State Assessments in Transition The Perfect Test Security Storm

  13. State assessments face an impending Perfect Test Security Storm:
  • mandated assessments tied to federal funding
  • teacher evaluations tied to test scores
  • use of state tests as a graduation requirement
  • more students/teachers admitting to cheating on tests
  • cheating techniques becoming more sophisticated
  • CBT test windows increasing test item exposure

  14. CBT will reduce some test security risks.
  Other test security risks will remain.
  Some risks will actually increase.

  15. CBT will reduce some test security risks:
  • lost or stolen test books
  • unauthorized access to tests
  • copying during testing
  • tampering with answer sheets

  16. Other test security risks will remain:
  • pre-knowledge of exam content
  • assisting during an exam
  • stealing/memorizing test questions
  • collusion among test takers
  • technology-assisted cheating

  17. Some risks will actually increase:
  • accessing secure data during transmission
  • exposure of items for extended periods
  • pre-knowledge later in testing windows
  • stealing items for an underground market
  • reduced funds allocated to test security due to increased development costs

  18. 21st Century Solutions
  Advances in the detection of security anomalies and investigative data forensics, enabled through CBT, provide sophisticated means to heighten security.
  Available detection technologies and techniques should be incorporated as routine, standard practice.

  19. 21st Century Solutions (cont.)
  Economies of scale and experience will make these security safeguards:
  • cost-effective
  • affordable
  • easy to understand

  20. Advanced Security Analysis and Detection Techniques for CBT
  • Unusual Gains Analysis
  • Occurrence of Perfect Scores
  • Similarity of Responses (a sketch follows this slide)
  • Response Pattern Aberrance Analysis
  • Answer Changing Analyses
  • Response Time Analyses
  • Web Monitoring
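A minimal, hypothetical sketch of one technique from this list, similarity of responses: it counts identical incorrect answers shared by each pair of examinees, since matching wrong answers is more suggestive of copying than matching right ones. The answer strings, key, and flagging threshold are invented; operational similarity indices are considerably more rigorous.

```python
# Hypothetical sketch of a response-similarity screen: count identical
# *incorrect* answers shared by each pair of examinees. Answer strings,
# scoring key, and threshold are illustrative only.
from itertools import combinations

key = "ABCDA"                               # scoring key (toy data)
responses = {
    "s01": "ABCDA",                         # all correct
    "s02": "CBDDA",
    "s03": "CBDDA",                         # identical errors to s02
}

def shared_wrong(a, b, key):
    """Count items where two examinees gave the same incorrect answer."""
    return sum(1 for x, y, k in zip(a, b, key) if x == y != k)

for p, q in combinations(responses, 2):
    n = shared_wrong(responses[p], responses[q], key)
    if n >= 2:                              # toy flagging threshold
        print(f"review pair {p},{q}: {n} identical wrong answers")
```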

  21. Ten Recommendations Moving Forward
  1. Acknowledge the seriousness of security issues.
  2. Expect cheating and plan to be proactive.
  3. Use multiple detection methods and forensic statistics.
  4. Minimize testing windows.
  5. Strengthen the chain of custody.

  22. Ten Recommendations Moving Forward
  6. Increase the emphasis on security training.
  7. Allocate adequate resources for test security.
  8. Pilot techniques for detection of cheating.
  9. Continue to learn from others.
  10. Monitor new advances in anomaly detection and prevention (e.g., the "Epidemiological Model").

  23. State Assessments in Transition The Perfect Test Security Storm

  24. NCES Sponsored Symposium on Testing Integrity February 28, 2012 Dr. John Fremer President Caveon Consulting Services

  25. Transitioning Testing Integrity from Paper to Computer Wes Bruce Indiana Department of Education

  26. Be thoughtful about the transition
  • The move from paper to CBT is usually phased: by grade, content, or school.
  • So be thoughtful about how you will transition the measures of test integrity.
  • You want specific strategies for online testing: some are the same as for paper, some complementary, some unique.
  • But the field must feel that there is a single system in place: combined reporting (KISS).

  27. CBT Security is Different
  • Leverage the differences of CBT: vastly more data is available on every student.
  • Your challenge is to determine how much of that you can turn into useful "information": what will you systematically use, and what will be in your "back pocket"?
    • Time spent per item
    • Time spent per "session"
    • The "system" time of each response
    • The actual order in which students answer test items
    • The "real" pattern of item response changes
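A hedged sketch of turning such raw data into "information": the code below derives time spent per item and the actual answering order from a simplified event log. The (seconds, item, action) format is invented for illustration, not any delivery system's actual log schema.

```python
# Hypothetical sketch: derive time-per-item and the actual answering order
# from a simplified CBT event log. The (seconds, item, action) format is
# invented; real delivery systems log far more detail.

events = [                      # (seconds into session, item id, action)
    (0,   "q1", "view"),
    (35,  "q1", "answer"),
    (35,  "q3", "view"),        # student skipped ahead to item 3
    (90,  "q3", "answer"),
    (90,  "q2", "view"),
    (130, "q2", "answer"),
]

time_on_item, answer_order, view_started = {}, [], {}
for t, item, action in events:
    if action == "view":
        view_started[item] = t
    elif action == "answer":
        time_on_item[item] = t - view_started[item]
        answer_order.append(item)

print(time_on_item)     # {'q1': 35, 'q3': 55, 'q2': 40}
print(answer_order)     # ['q1', 'q3', 'q2'] -- actual order, not booklet order
```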

  28. CBT Security is not Unique
  • Many of the metrics that we use with paper are equally valid for CBT:
    • Score change metrics (school and student; part-to-whole)
    • Analysis of items correct vs. item difficulty (school, class, and student)
    • Perfect Score Reports
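A minimal sketch of the items-correct vs. item-difficulty idea, assuming invented item p-values and response patterns; it is a crude stand-in for formal person-fit statistics.

```python
# Hypothetical sketch of an items-correct vs. item-difficulty check: an
# examinee who misses easy items yet answers hard items correctly merits
# a second look. p-values (proportion correct) and responses are toy data.

p_values = {"q1": 0.90, "q2": 0.85, "q3": 0.20, "q4": 0.15}
scores = {
    "s01": {"q1": 1, "q2": 1, "q3": 0, "q4": 0},   # expected pattern
    "s02": {"q1": 0, "q2": 0, "q3": 1, "q4": 1},   # inverted pattern
}

def inversions(score, p_values):
    """Count easy-wrong / hard-right item pairs for one examinee."""
    return sum(1 for easy in p_values for hard in p_values
               if p_values[easy] > p_values[hard]
               and score[easy] == 0 and score[hard] == 1)

for s, score in scores.items():
    print(s, inversions(score, p_values))   # s01 -> 0, s02 -> 4
```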

  29. Illustration – Erasure Analysis
  Paper (generic):
  • Scanners detect when (if) multiple responses have been selected for a single item.
  • If one is "darker," it is treated as the final "answer"; the lighter response is flagged as an "erasure."
  • In "erasure analysis," logic and statistics are applied to these multiple "marks."
  • If the lighter mark is "wrong" and the darker is "right," the item is flagged as W to R.
  • If a student, class, or school exceeds a threshold value (4 SD), they are flagged/flogged.
  • Anybody take statistics in college?

  30. Illustration – Erasure Analysis 2
  Concerns:
  • We do not "know" what the actual pattern of student responses was: W-R or R-W-R?
  • We do not know when the "change" was made or how long the student took to make that change.
  CBT can provide more information for analysis:
  • Potential for fewer false positives; you can identify "true" W-R changes (a sketch follows slide 31).
  • Can factor in other dimensions (i.e., filter on "when").
  • But it is still statistical and subject to the same limitations.

  31. Illustration – Erasure Analysis 3
  • We provide a single combined "Erasure Analysis" for schools (even though there are no "erasures" on CBT):
    • "Identical" fields for paper and CBT
    • Same "flagging" criteria for both
    • Same expectations for investigation and reporting
  • We are trying to make these exceptions easy to understand and communicate: think about the context and the cognitive load.
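A minimal sketch of the combined W-to-R analysis the last three slides describe, assuming an invented response-change log, answer key, and group; the 4 SD criterion comes from slide 29.

```python
# Hypothetical sketch of a combined "erasure analysis" for CBT: count
# wrong-to-right (W-to-R) answer changes per student from a response-change
# log, then flag counts more than 4 SDs above the group mean (the paper
# criterion from slide 29). Log format, key, and data are invented.
from statistics import mean, stdev

key = {"q1": "A", "q2": "B", "q3": "C"}
changes = {                     # per student: (item, old answer, new answer)
    "s01": [("q1", "B", "C")],                                    # W to W
    "s02": [("q1", "B", "A"), ("q2", "C", "B"), ("q3", "D", "C")],
    "s03": [],
}

def w_to_r(events, key):
    """Count changes from a wrong answer to the keyed (right) answer."""
    return sum(1 for item, old, new in events
               if old != key[item] and new == key[item])

counts = {s: w_to_r(ev, key) for s, ev in changes.items()}
mu, sd = mean(counts.values()), stdev(counts.values())
flagged = [s for s, c in counts.items() if sd and (c - mu) / sd > 4]
print(counts, flagged)   # {'s01': 0, 's02': 3, 's03': 0} [] -- toy group too small to exceed 4 SD
```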

  32. CBT Security is not Omnipotent
  • Be careful: the evidence is still inferential.
  • It may provide "stronger" or additional evidence, but it "proves" nothing.
  • You may know "what," but you still do not know who or how.
  • Investigations still matter.
  • The press loves a scandal, and CBT can help you create an even bigger one.

  33. Wes Bruce wbruce@doe.in.gov

  34. Secure Testing on Computers Testing Integrity Symposium Tony Alpert – Smarter Balanced Assessment Consortium (SBAC)

  35. State Supports as Prerequisite
  • Model rigorous implementation by making sure the system works as described.
  • Establish a culture of security within the Department and across the state.
  • Establish policies that address:
    • the larger network of adults involved in CBT vs. paper
    • the additional complexities of logistics
    • the additional complexities of new item types

  36. State Supports as Prerequisite (cont.)
  • Delineate minimum training requirements based on roles and responsibilities.
  • Provide practice versions of the applications early enough.
  • Establish help-desk supports consistent with longer testing windows.
  • Conduct user acceptance testing in the schools.

  37. Local Supports as Partner
  • Be aware of which adults can be in the secure testing environment.
  • Use the opportunities for sample tests/applications.
  • Provide clear expectations for which individuals must attend trainings.
  • Provide a clear path for identification and resolution of problems.

  38. Local Supports as Partner (cont.)
  • Be aware that CBT can be overwhelming for new teachers and substitutes.
  • Don't expose Secure Student Identifiers.
  • Provide a clear path for identification and resolution of problems.
