
Data Quality Toolbox for Registrars


Presentation Transcript


  1. Data Quality Toolbox for Registrars MCSS Workshop December 9, 2003 Elaine Collins

  2. Quality Data Toolbox • Artisan – Registrar • Medium – Computerized data • Raw Materials – Medical information • Shaping tools – Knowledge, skills • Directions – Standards • Measuring tools – Editing “tools” • Final Product – Cancer record • Goodness – Match to standards

  3. Quality Data - Goodness • Accurate • Consistent • Complete • Timely • Maintain shape across transformation and transmission

  4. Measuring Tools • Reabstracting studies • Structured queries and visual review • Text editing • EDITS • MCSS routine review

  5. Exercises • MCSS reabstracting study – 2003 • Sites: Breast, Corpus uteri, Lung, Melanoma, Testis, Soft tissue sarcoma • 2000 diagnosis year • 12 facilities • Review of reported data – Structured query • Review of reported data – Text editing

  6. Reabstracting Studies • Compares original medical record with reported cancer record • Considered the “gold standard” • Labor-intensive; all records used at initial abstracting may not be available; biased by reabstractor’s training and skills
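
A reabstracting study boils down to a field-by-field comparison of the reabstracted case with the case as originally reported, summarized as an agreement rate per data item. A minimal Python sketch; the case IDs, field names, and codes are invented for illustration and do not reflect any MCSS format:

```python
# Hypothetical sketch: compare reabstracted records with originally reported
# records and compute a per-field agreement rate. Data are illustrative only.

FIELDS = ["primary_site", "histology", "laterality", "summary_stage"]

original = {
    "0001": {"primary_site": "C509", "histology": "8500", "laterality": "1", "summary_stage": "1"},
    "0002": {"primary_site": "C341", "histology": "8070", "laterality": "2", "summary_stage": "2"},
}

reabstracted = {
    "0001": {"primary_site": "C509", "histology": "8500", "laterality": "2", "summary_stage": "1"},
    "0002": {"primary_site": "C341", "histology": "8070", "laterality": "2", "summary_stage": "7"},
}

def agreement_rates(original, reabstracted, fields):
    """Return the fraction of matched cases that agree on each field."""
    rates = {}
    shared = original.keys() & reabstracted.keys()
    for field in fields:
        agree = sum(1 for case in shared
                    if original[case][field] == reabstracted[case][field])
        rates[field] = agree / len(shared) if shared else float("nan")
    return rates

for field, rate in agreement_rates(original, reabstracted, FIELDS).items():
    print(f"{field}: {rate:.0%} agreement")
```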

  7. Structured Queries • Compares coding across series of records sorted by selected characteristics • Useful for finding pattern discrepancies across many records • Manual process; some comparisons may be converted to automated edits
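
One way to picture a structured query: sort or group the reported cases on a few selected characteristics and tabulate the combinations so that pattern discrepancies stand out for visual review. A minimal Python sketch with invented fields and codes, not an MCSS query specification:

```python
# Hypothetical sketch of a structured query: group cases by selected
# characteristics (here primary site and histology) and tabulate the
# combinations so unusual patterns surface for manual review.

from collections import Counter

cases = [
    {"site": "C509", "histology": "8500", "behavior": "3"},
    {"site": "C509", "histology": "8500", "behavior": "3"},
    {"site": "C509", "histology": "8070", "behavior": "3"},  # uncommon combination
    {"site": "C619", "histology": "8140", "behavior": "3"},
]

combos = Counter((c["site"], c["histology"]) for c in cases)

# Print the series sorted so reviewers see each site's histologies together;
# a combination seen only once is flagged for visual review.
for (site, hist), n in sorted(combos.items()):
    flag = "  <-- review" if n == 1 else ""
    print(f"site={site} histology={hist} cases={n}{flag}")
```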

  8. Text Editing • Compares text with coded values for individual records • Useful for immediately identifying coding problems • Manual process; most effective on completion of each individual case
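
In code terms, text editing means reading the coded value alongside the abstract text and flagging obvious disagreements for the registrar to resolve as each case is completed. A minimal Python sketch, assuming a simple keyword check on an invented case; the terms and field names are illustrative, not coding rules:

```python
# Hypothetical sketch of text editing: compare the free-text portion of an
# abstract with its coded values and flag apparent conflicts.

LATERALITY_TERMS = {"1": ["right"], "2": ["left"]}  # illustrative keyword lists

def text_edit(case):
    """Return warnings where the coded value is not supported by the text."""
    warnings = []
    text = case["path_text"].lower()
    coded = case["laterality"]
    expected = LATERALITY_TERMS.get(coded, [])
    if expected and not any(term in text for term in expected):
        warnings.append(
            f"laterality code {coded} not supported by text: {case['path_text']!r}"
        )
    return warnings

case = {"path_text": "Left breast, invasive ductal carcinoma", "laterality": "1"}
for w in text_edit(case):
    print("WARNING:", w)
```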

  9. EDITS • Checks range validity for many fields, comparability of few fields for individual records • Automated process, can be applied on completion of each record or on preparation of batch report; warnings and over-rides are alternatives to failures • Expansion of interfield edits requires careful logic
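
A rough sketch of what an automated edit does: a range (valid-code) check on a single field, an interfield comparison, and a distinction between hard failures and over-ridable warnings. The codes, field names, and rules below are illustrative assumptions, not the actual EDITS metafile logic:

```python
# Hypothetical sketch of automated edits: one range check and one interfield
# check, reported as errors or over-ridable warnings.

from datetime import date

VALID_LATERALITY = {"0", "1", "2", "3", "4", "9"}  # illustrative code set

def run_edits(record):
    failures = []
    # Range edit: the field value must come from the allowed code set.
    if record["laterality"] not in VALID_LATERALITY:
        failures.append(("ERROR", "laterality is not a valid code"))
    # Interfield edit: diagnosis date should not follow first treatment date.
    if record["date_of_diagnosis"] > record["date_first_treatment"]:
        failures.append(
            ("WARNING", "diagnosis date after first treatment date (over-ride allowed)")
        )
    return failures

record = {
    "laterality": "7",
    "date_of_diagnosis": date(2000, 6, 15),
    "date_first_treatment": date(2000, 5, 1),
}
for level, message in run_edits(record):
    print(level, "-", message)
```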

  10. Edits Analysis • Edits to be included in MCSS Set • Edits in Hospital/Staging Edit Sets – C edits are included in confidential data set • No Text Edits displayed • Criteria • Valid codes/dates • Alpha/numeric • Timing • Interfield comparisons • Absolute conditions

  11. MCSS Review • Requests values for missing or unknown data; resolves conflicts between data items from multiple facilities and between data items updated by a single facility • Allows incorporation of information from multiple facilities • Review for a limited number of conditions

  12. Cancer Registrar – Resource for Quality Data [Diagram: the registrar at the center of data sources (Medical Record, Physician, Patient, Facility System, Facility Staff, Committees), standards (ICD-O, COC, AJCC, SEER, NAACCR), and data users (Central Registry, Other Registries, Protocols, Quality Monitors, NCDB, CDC, Cancer Control, Cancer Research, Public)]

  13. Data Inputs • Patient data from facility systems • Medical record reports and notes • Pathology reports • Staging forms • Communication with physician offices • Communication with other registries • Communication with patients

  14. Process Inputs • Registrar training, knowledge, skills • Coding standards – ICD-O-3, COC, AJCC, SEER, NAACCR • Interpretations of standards – I&R, SEER Inquiry, Ask NAACCR • Medical literature – printed and online • Registry software data implementations

  15. Sources of Error • Patient data from facility systems • Medical record reports and notes • Pathology reports • Staging forms • Communication with physician offices • Communication with other registries • Communication with patients

  16. Sources of Error • Registrar training, knowledge, skills • Coding standards – ICD-O-3, COC, AJCC, SEER, NAACCR • Interpretations of standards – I&R, SEER Inquiry, Ask NAACCR • Medical literature – printed and online • Registry software data implementations

  17. Types of Errors • Missing/conflicting data • Shared data errors • Timing/coding errors • Standards and interpretations – ambiguities, omissions, confusions, contradictions • Discrepancies among local registry practice, central registry practice, and national standards

  18. Software Implementations • Discrepancies between implementations and national standards • Lack of registrar knowledge/training on correspondence between registry and exported data • Logic errors in matching registry data to reporting formats • Conversion errors
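
As an illustration of where logic and conversion errors can creep in, the sketch below maps a registry record onto a hypothetical fixed-width reporting layout; the field names, widths, and order are invented, but a field mapped to the wrong column or padded to the wrong width is the kind of mismatch this slide describes:

```python
# Hypothetical sketch of matching registry fields to a fixed-width reporting
# layout. Widths and positions are invented; a real export follows the
# applicable record layout standard.

LAYOUT = [            # (field, width) in the order the hypothetical layout expects
    ("patient_id", 8),
    ("primary_site", 4),
    ("histology", 4),
    ("laterality", 1),
]

def export_record(registry_record):
    """Build one fixed-width line; raise if a value does not fit its column."""
    pieces = []
    for field, width in LAYOUT:
        value = str(registry_record.get(field, "")).ljust(width)
        if len(value) > width:
            raise ValueError(f"{field} value {value!r} exceeds width {width}")
        pieces.append(value)
    return "".join(pieces)

print(repr(export_record(
    {"patient_id": "12345", "primary_site": "C509", "histology": "8500", "laterality": "1"}
)))
```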

  19. AJCC Staging Dilemma • Are pathologic nodes required for pathologic stage grouping? • How do Minnesota registrars answer this question?

  20. Clinical/Pathologic Staging in Study

  21. Collaborative Staging • Provides specific rules for coding known vs unknown staging elements • Accommodates “best” stage for AJCC stage assignment

  22. AHIMA 75th Annual Conference, October 2003, Minneapolis: Coming Events • Data mining • ICD-10-CM • SNOMED • Natural language processing

  23. AHIMA 75th Annual Conference, October 2003, Minneapolis: Challenges • What is our professional purpose? • How do we envision ourselves as professionals?

  24. Foundation for Quality Data • Registrar’s commitment to registry purpose • Registrar’s knowledge, understanding of cancer data • Registrar’s management of communication technologies • Registrar’s advocacy for data use

  25. SUMMARY • Consistent recording and reporting of quality cancer data requires commitment. • Routine and regular review of data patterns facilitates data knowledge and quality. • Passing EDITS assists but does not ensure data quality. • Data standards change; use the manuals. • Welcome Collaborative Stage.
