
Is Your Data Management System Flexible for Quality Control Activities?

This article discusses the implementation of visual editing standards in a data management system to ensure high-quality, accurate cancer data. It also reviews the history of the standards, introduces a sampling plan for visual editing, and presents a recoding audit module, along with the issues and considerations involved in implementing these changes.


Presentation Transcript


  1. Is Your Data Management System Flexible for Quality Control Activities? Winny Roshala, CTR • Data Standards and Quality Control Unit • NAACCR: June 13-19, 2009, San Diego, CA

  2. CCR Visual Editing Standards: Accuracy Rates • Implemented January 1, 2000 with 100% visual editing on 13 data items • Automated software was developed to calculate accuracy rates • Accuracy Rate Standard: 97%

  3. CCR Visual Editing Standards: Purpose • Assure high quality data for analysis • Provide consistency in the visual editing process • Quantify the accuracy of cancer data from cancer reporting facilities • Standardize accuracy rates • Standardize format for reporting rates to registrars/facilities

  4. CCR Visual Editing Discrepancies • Defined as the quality or state of being discrepant, i.e., disagreeing or being at variance • A discrepancy arises when a more appropriate code should have been selected for a data item based on the submitted documentation

  5. CCR Visual Editing Discrepancies • Discrepancies are counted prior to cases being linked or consolidated • Each data item is considered one potential discrepancy with the following exceptions: • Site/subsite • LNs Positive/Examined • Site Specific Factor fields

  6. CCR Visual Editing Standards: Calculation of Accuracy Rates • Percent Discrepant: the number of discrepancies divided by the product of the number of abstracts and the number of data items, expressed as a percentage • Accuracy Rate: 100% less the percent discrepant (see the worked sketch below)
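A worked sketch of this calculation in Python (the function name and example figures are our own illustration, not CCR code): with the product of abstracts and data items as the denominator, 25 discrepancies across 100 abstracts edited on 17 data items yields an accuracy rate of about 98.5%, above the 97% standard.

    def accuracy_rate(discrepancies: int, abstracts: int, data_items: int) -> float:
        """Accuracy rate per the CCR visual editing standard:
        100% minus the percent discrepant."""
        percent_discrepant = discrepancies / (abstracts * data_items) * 100
        return 100.0 - percent_discrepant

    # Example (illustrative figures): 25 discrepancies across 100 abstracts,
    # each edited on 17 potential discrepancies (the slide 11 list, with
    # starred items each counted once).
    print(f"{accuracy_rate(25, 100, 17):.2f}%")  # 98.53%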

  7. CCR Visual Editing Standards: Historical Perspective • In December 2005, in order to reduce a backlog, admissions from abstractors with an accuracy rate of 99% were no longer visually edited • This “push through” represented approximately 64% of admissions

  8. CCR Visual Editing Standards: Historical Perspective • Due to budget cuts, this percentage was reduced further: abstractors with an accuracy rate of 98% were added to those whose admissions were no longer visually edited

  9. CCR Visual Editing Standards: Historical Perspective • In February 2008, due to a further reduction in state funding, the CCR changed its approach to reducing the proportion of cancer registry abstracts that are visually edited • Instead of focusing on individual abstractors, the CCR moved to random sampling of cases for visual editing, reducing the rate from 100 percent to 40 percent

  10. CCR Visual Editing Standards • Quality for the remaining 60% of abstracts will be monitored by targeted visual editing and through recoding and reabstracting audits • Hospital registrars continue to receive monthly Discrepancy Reports

  11. Visually Edited Data Items • County of Residence at Diagnosis • Sex • Race • Spanish/Hispanic Origin • Date of Diagnosis • Diagnostic Confirmation • Site/Subsite* • Laterality (Only paired sites listed in Volume I) • Histology • Grade • CS Tumor Size • CS Extension • CS Lymph Nodes • Number of Regional Nodes Positive/Examined* • CS Metastasis at Diagnosis • CS Site Specific Factors 1-6* • Class of Case * Counted as one discrepancy

  12. VISUAL EDITING SAMPLING PLAN • Run the edits against the admission • Set a flag indicating whether any edit errors exist • Check whether the admission qualifies for required review and set a flag indicating true/false • Check whether the site is one of the sites that require 100% visual editing and set a flag indicating true/false

  13. The following sites had a high discrepancy rate and will continue to undergo 100% visual editing: • Lip • Nasal Cavity & Middle Ear • Accessory Sinuses • Thymus • Heart, Mediastinum, and Pleura • Retroperitoneum and Peritoneum • Adrenal Glands • Other Endocrine Glands • Other Ill-Defined Sites • Unknown Primary Site

  14. VISUAL EDITING SAMPLING PLAN • If the site is determined not to require 100% VE, use the system function Random to generate a number between 0 and 99 • Set the VE Required flag to true if the number generated is 37 or less; otherwise set it to false • Once all of the checks are complete, if all flags are set to false, the admission bypasses Visual Editing (see the sketch below)
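A minimal sketch of the sampling decision described on slides 12-14, assuming illustrative names (Eureka's actual implementation is not shown in the slides):

    import random

    # Sites with historically high discrepancy rates (slide 13); these
    # always receive 100% visual editing.
    ALWAYS_EDIT_SITES = {
        "Lip", "Nasal Cavity & Middle Ear", "Accessory Sinuses", "Thymus",
        "Heart, Mediastinum, and Pleura", "Retroperitoneum and Peritoneum",
        "Adrenal Glands", "Other Endocrine Glands",
        "Other Ill-Defined Sites", "Unknown Primary Site",
    }

    def requires_visual_editing(has_edit_errors: bool,
                                requires_review: bool,
                                site: str) -> bool:
        """Return True if the admission is routed to visual editing."""
        if has_edit_errors or requires_review:
            return True
        if site in ALWAYS_EDIT_SITES:
            return True
        # Random sample: a draw of 0-99 selects the case when it is 37 or
        # less, i.e. a 38% sample (approximating the 40% target on slide 9).
        # The admission bypasses visual editing only when every check above
        # is false and the draw exceeds 37.
        return random.randint(0, 99) <= 37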

  15. Issues to Consider • Percent of cases randomly selected for visual editing • List of sites which require 100% visual editing • New review tasks that need to be added to the database

  16. Issues to Consider • Programming changes may require little time; however, deploying those changes may require a full build • Deployment may be delayed to comply with a scheduled release

  17. What About the 60% of Cases Bypassing Visual Editing? Eureka Recoding Audit Module (RAM)

  18. RAM Features • Ability to select data items and text fields • Accessibility • Audit current data • Automatically sends cases from the primary auditor to the secondary auditor • Generates reports • Multi-purpose tool

  19. Audit Sample Request

  20. Audit Sample Request

  21. Primary Auditor Recoding Screen

  22. Reconciliation Screen

  23. RAM Disposition Report

  24. Summary • With diminishing resources, changes in the CCR visual editing practices were necessary • Due to the flexibility of our data management system, Eureka, we can quickly refine the sampling plan for visual editing as needed • This has allowed us to redirect resources while carefully monitoring the quality of our data

  25. Summary • The development of Eureka RAM has enabled us to focus on the cases bypassing the visual editing process • RAM is instrumental in quickly identifying problem areas in coding and instruction • Training efforts can be mobilized earlier • Future uses for RAM may include training new staff, targeted visual editing, and special studies

  26. Acknowledgements • Nancy Schlag, CCR Operations Section Chief • Andrew Sutliff, Eureka Programmer Analyst • Kyle Ziegler, Audit Coordinator, Quality Control Specialist • Vic Belen, Administrative Assistant

  27. Contact Information • Winny Roshala, BA, CTR • California Cancer Registry • 1825 Bell St., Suite 102 • Sacramento, CA 95825 • Phone: (916) 779-0313 • Email: wroshala@ccr.ca.gov

  28. Thank You!
