
DQ Review List/Statement Completeness




  1. DQ Review List/Statement Completeness • Question B.5.a. (DQ Statement question 1.a.) In the reporting month (include only B*** and FBN* accounts): • a) What percentage of appointments were closed in meeting your “End of Day” processing requirements, “Every appointment – Every day”? (B.5.(b)) Source is BDQAS. Formula: Number of closed appointments ÷ Total appointments for the month • > 97% = Green; > 80% and < 97% = Yellow; < 80% = Red
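The B.5.a metric and its color thresholds can be sketched as a small calculation (a hedged illustration; the function name and the threshold parameters are assumptions, not part of BDQAS):

```python
def completeness_rating(closed, total, green=97.0, yellow=80.0):
    """Grade appointment-closure completeness per the B.5.a thresholds:
    a percentage above `green` is Green, above `yellow` is Yellow, else Red."""
    pct = 100.0 * closed / total
    if pct > green:
        rating = "Green"
    elif pct > yellow:
        rating = "Yellow"
    else:
        rating = "Red"
    return pct, rating

# Hypothetical month: 4,900 of 5,000 appointments closed on time
pct, rating = completeness_rating(4900, 5000)  # 98.0, "Green"
```

The same shape applies to the later timeliness and accuracy questions, which differ only in the Green cut-off (95% instead of 97%).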

  2. DQ Review List/Statement BDQAS • https://bdqas.brooks.af.mil/index2.htm • Biometric Data Quality Assurance Service (BDQAS) is a source for many DQ Statement questions • Updated on the 10th or 11th for non-EAS data • EAS data on BDQAS is updated between the 16th and the 20th • If the EAS transmission did not occur on time, questions that are applicable to EAS will need to be manually calculated and annotated on the Statement as such

  3. Select Data Metrics

  4. Select Data Quality Statement Reports

  5. Select the command for your MTF

  6. Select your MTF and then the data month. These are the questions and percentages for each question BDQAS pulls. Note: Manual procedures are in the DQ MUG if needed.

  7. DQ Review List/Statement Timeliness • Question B.6. (DQ Statement question 2.) In accordance with legal and medical coding practices, have all of the following occurred: • B.6.a (DQ Statement question 2.a.) What percentage of Outpatient Encounters, other than APVs, have been coded within 3 business days of the encounter? Source is BDQAS • B.6.b (DQ Statement question 2.b.) What percentage of APVs have been coded within 15 days of the encounter? Source is BDQAS • B.6.c (DQ Statement question 2.c.) What percentage of Inpatient records have been coded within 30 days after discharge? Source: run the inpatient timeliness ad hoc report found on BDQAS • > 95% = Green; > 80% and < 95% = Yellow; < 80% = Red

  8. DQ Review List/Statement Validation and Reconciliation • Question C.1. (DQ Statement questions 3 series) Medical Expense and Performance Reporting System for Fixed Military Medical and Dental Treatment Facilities Manual (MEPRS Manual), DoD 6010.13-M, dated April 7, 2008, paragraph C3.3.4, requires report reconciliation. • C.1.a (DQ Statement question 3.a.) Was the monthly MEPRS/EAS financial reconciliation process completed, validated, and approved prior to the monthly MEPRS transmission? Source is MEPRS Manager and RMO Office • C.1.c. (DQ Statement question 3.c.) Were the data load status, outlier/variance, WWR-EAS IV, and allocations tabs in the current MEWACS document reviewed and explanations provided for flagged data anomalies? Source is MEPRS Manager (note: need to answer C.1.c.1 through C.1.c.4) • Yes = Green; No = Red (comments required). Do not use N/A

  9. DQ Review List/Statement MEWACS • Proactively identify, investigate, and resolve MEPRS data anomalies in a timely, systematic manner • Data Quality Statement question 3.b. • Data that is identified as erroneous should be fixed and retransmitted • MEWACS is normally updated around the 16th of each month • TMA centrally tracks site “hits” by base; compare outliers to hits • AFMOA MEPRS uses Vector Check to help identify outliers before they become outliers on MEWACS

  10. Click on MEWACS and then MEWACS Online

  11. Click on Click here to launch MEWACS Online!

  12. Select Data Load Status, then Summary Outliers, then WWR/EAS IV Outliers, and then Allocation Test

  13. Data Load Status

  14. Summary Outliers

  15. WWR/EAS IV

  16. Allocation Test

  17. AFMOA MEPRS Dashboard • Launched MEPRS Dashboard Oct 2009 • Objectives: identify variance, evaluate processes, provide training • Measures: 20 key data points with supporting detailed reports • Controls: 1 or 2 standard deviations; upper/lower control limits • Visibility: resides on Vector Check – enterprise-wide access https://vc.afms.mil/AFMOA/SGA/SGAR/SGAR_MEPRS/default.aspx

  18. AFMOA MEPRS Dashboard – Nellis AFB, Nov 2009 (dashboard screenshot: data points annotated as Validated, Error, or Research)

  19. AFMOA MEPRS Dashboard – Nellis AFB, Apr 2010 (dashboard screenshot: data points annotated as Corrected, Validated, Research, or Corrections Pending)

  20. DQ Review List/Statement Validation and Reconciliation • Question C.1.e. & f. (DQ Statement question 3.c) Continued… New questions on timecards submitted by close of business the Monday after the end of the pay period • C.1.e. (DQ Statement question 3.c.) For DMHRSi, what is the percentage of timecards submitted by the suspense date? Source is MEPRS Manager. Formula: Number of Timecards Submitted On-time ÷ Total Number of Timecards for an MTF • C.1.f (DQ Statement question 3.c.) For DMHRSi, what is the percentage of timecards approved by the suspense date? Source is MEPRS Manager. Formula: Number of Timecards Approved On-time ÷ Total Number of Timecards for an MTF • 100% = Green; < 100% = Red
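Because C.1.e and C.1.f divide by the same total, the approved percentage can never exceed the submitted one. A minimal sketch (function and variable names are illustrative assumptions, not from the DQ Statement itself):

```python
def timecard_percentages(submitted_on_time, approved_on_time, total):
    """C.1.e / C.1.f: both percentages use the total number of timecards
    the MTF should have submitted as the denominator."""
    # A timecard must be submitted before it can be approved,
    # so C.1.f can never be greater than C.1.e.
    assert approved_on_time <= submitted_on_time <= total
    submitted_pct = 100.0 * submitted_on_time / total
    approved_pct = 100.0 * approved_on_time / total
    return submitted_pct, approved_pct

# Hypothetical pay period: 200 timecards due, 200 submitted, 195 approved on time
submitted, approved = timecard_percentages(200, 195, 200)  # 100.0, 97.5
# Grading: 100% = Green, anything less = Red, so C.1.f here would be Red
```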

  21. DQ Review List/Statement Comments • Questions C.1.e. and C.1.f.: comments are required if under 100% • Question C.1.f. • Not only about submission, but also approval • If the percentage here is less than the submitted percentage, then you need to explain what the problem is with the approval process • Cannot be greater than C.1.e. • This percentage is not calculated as the percentage of submitted timecards that were approved; it is calculated out of the entire number of timecards that should have been submitted. • NOTE: TUG will require comments for all yellow/red thresholds

  22. DQ Review List/Statement CHCS Duplicate Patients • Question C.2. (DQ Statement question 10.) CHCS software used during the reporting month to identify duplicate patient registration records. (C.2.a) • a) What was the number of potential duplicate records in the reporting month? (NOTE: Only Host sites report up.) Source is Internal Process. Run the CHCS standard report – “Potential Duplicate Patient Search”. • Report run = Green; report not run = Red (comments required) • Only use N/A if your MTF is not a CHCS Host Site

  23. Patient Duplicate Reporting • For CHCS/AHLTA hosts only, what was the number of potential duplicate records in the data month for all MTFs under the host? Run the CHCS standard report – “Potential Duplicate Patient Search” • Report all potential duplicates regardless of service! • Even if you are not a ‘parent’ but someone uses your platform, your facility needs to report all the potential duplicates on your host • It is understood that running the CHCS Potential Duplicate Patient Report will give the total on the host server and individual MTFs can’t be shredded out by DMIS ID • However, the report will show who registered the patient so there is a way to identify who entered the duplicates incorrectly

  24. Patient Duplicate Reporting • Do you have a process to reduce the number of duplicate records? • Potential duplicate patient records can be minimized by performing DEERS validation checks. • Has your MTF determined how to correct the duplicate appointments/encounters and avoid the errors in the future? • Have trouble tickets been filed with MHS Helpdesk for duplicate records in CHCS/AHLTA that cannot be resolved at the MTF level? • List all sites being reported (including host) by DMIS ID and DMIS facility name in the comments section

  25. Patient Duplicate Reporting • DISCLAIMER: We know this is not catching all duplicate patients. Do not use this to gauge the health of the patient file on your CHCS platform. We would recommend occasionally running the “ALL” report and the Registration report. However, for DQ reporting purposes, the Registration report number is what should be on the Statement. • Just because DQ is asking for the Potential Duplicate Patient Report does not exclude a facility from running the required monthly PIT Error Discrepancy Report and working those errors separately. These are two different requirements and two different problems. • You might see some crossover where the same patients are on both reports, but this is normal

  26. DQ Review List/Statement Compliance • Question C.3. (DQ Statement series 4 questions.) Compliance with TMA or Service-level guidance for timely submission of data (C.3.). • C.3.a. (DQ Statement question 4.a.) MEPRS/EAS (45 days) Source is MEPRS Manager/MEWACS • C.3.b. (DQ Statement question 4.b.) SIDR/CHCS (5th Duty Day of the month) Source is BDQAS • C.3.c. (DQ Statement question 4.c.) WWR/CHCS (10th Calendar Day of the Following Month) Source is BDQAS • For C.3.a.–C.3.c.: Yes = Green; No = Red (comments required) • C.3.d. (DQ Statement question 4.d.) SADR/ADM (Daily) Source is BDQAS • > 95% = Green; > 80% and < 95% = Yellow; < 80% = Red

  27. DQ Review List/Statement Coding Accuracy Calculation • Use the following formulas for Q5b–d (Internal Process), 6b–d (Audit Tool), 7b–c (Audit Tool): • ICD-9: Number of correct ICD-9 codes ÷ Total number of ICD-9 codes • E&M: Number of correct E&M codes ÷ Total number of E&M codes • CPT: Number of correct CPT codes ÷ Total number of CPT codes
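Each formula above is the same ratio applied to a different code family. A sketch with made-up audit counts (the sample sizes and results are illustrative only, not from any actual audit):

```python
def coding_accuracy(correct, total):
    """Accuracy for one code family: correct codes / total codes, as a %."""
    return 100.0 * correct / total

# Hypothetical audit sample of 200 codes per family
audit = {"ICD-9": (188, 200), "E&M": (190, 200), "CPT": (161, 200)}
rates = {family: coding_accuracy(c, t) for family, (c, t) in audit.items()}
# rates -> {"ICD-9": 94.0, "E&M": 95.0, "CPT": 80.5}
```

Against the > 95% Green / > 80% Yellow thresholds, these hypothetical figures would grade ICD-9 and E&M as Yellow and CPT as Yellow just above the Red line.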

  28. AFMS Coding In Transition • Diverse MAJCOM and MTF contract solutions • Inconsistent coding • Staffing models • Workload requirements • Training practices • Performance/DQ • Funding sources with variable reliability • Self-audits are common and not objective • Fragmented communication between MTF, MAJCOM, and HQ AFMS • Suboptimal compliance with AMA/DoD/AFMS guidance • AFAA audit conclusion: “Medical coding effectiveness required significant improvement in all areas”

  29. AFAA: AFMS Coding Findings • “MTF personnel were inappropriately allowed to use the same contractor for both coders and auditors, creating a significant conflict of interest” • “All 9 MTFs had coding error rates of 50% or higher” • “36% of encounters contained diagnoses coding errors, increasing the potential for subsequent providers to incorrectly treat or delay patient treatment” • “Medical personnel did not identify specific system and coding training needs, develop a reliable plan to address providers’ needs, or adequately track provider training”

  30. AFAA Recommendations agreed to by AF/SG • “Create a centralized AFMS outpatient coding contract to place a pre-determined number of coding support personnel at every MTF” • “Require independent auditors accomplish AFMS audits” • “Establish and implement an Air Force-wide training plan to educate and train providers on coding requirements to include DoD, Air Force, and AMA standards” • “Implement procedures to track providers’ coding performance, to identify providers’ coding strengths and weaknesses, and provide individualized training to correct deficiencies”

  31. Coding Initiative Objectives • Robust support is paramount • Standardization is attainable and pragmatic but will not succeed without leadership endorsement • Dispel the myth that “more coding is better” – new focus on coding only what is needed • Utilize IM/IT systems to their fullest potential • Target training based upon identified errors – ongoing provider training is vital! • Endorse continued evaluation for dynamic enhancements (address lessons learned) • Incorporate billing and coding into one contract

  32. Central Coding Contract – Why • Centralization of coding and auditing contracts was driven by AFMS, AFAA, and AF/SG concerns • Centralization uniformly aligns the coding and auditing assets across all CONUS MTFs • Centralization answers the requirement to eliminate biased self-auditing practices • Provider workload driven by this change is small • Coding 100% of encounters is expensive and does not provide better data quality • The centralized concept saves AFMS dollars and provides higher-fidelity coding

  33. AFMS Coding Contract Way Ahead • Centralized Coding Contract • Outpatient coding support for CONUS MTFs • All ER and APV (procedure visit) encounters coded • All Billables/TPCs will be coded • An additional 10% of outpatient visits coded • Strong emphasis on Coding Trainers • Provide general and targeted instruction • The 100% coding model was not chosen because evidence does not show it improves DQ, and it is high cost ($40M+) • Centralized Auditing Contract • Remote cell apart from MTFs; utilize CAT/CARS • Audit, track, report, and communicate training needs • Develop a similar vehicle for true OCONUS sites

  34. AFMS Coding Contract Way Ahead (cont.) • AFMS Centralized Coding and Auditing Contracts • Two distinct contracts • Answers the 6 AFAA-identified deficiencies • Standardizes workload and resources for all MTFs • Objectively audits coding at every MTF • Supports provider education and training needs • Coordinates management through AFMOA • Meets urgent contract expiration deadlines for 36 MTFs • $12.6M uniform solution vs. $20.5M disparate model • Maintaining the status quo would not address AFMS needs or AFAA recommendations

  35. AFMS Coding Contract Conclusion • Centralization of coding and auditing contracts was driven by AFMS, AFAA, and AF/SG concerns • Centralization uniformly aligns the coding and auditing assets across all CONUS MTFs • Centralization answers the requirement to eliminate biased self-auditing practices • Provider workload driven by this change is small • Coding 100% of encounters is expensive and does not provide better data quality • The centralized concept saves AFMS dollars and provides higher-fidelity coding

  36. DQ Review List/Statement Compliance • Question C.5. (Data Quality Statement 5 series questions) Outcome of monthly inpatient coding audit • C.5.c) Percentage of inpatient records whose assigned DRG codes were correct? • C.5.f) Inpatient Professional Services Rounds encounters E & M codes audited and deemed correct? • C.5.g) Inpatient Professional Services Rounds encounters ICD-9 codes audited and deemed correct? • C.5.h) Inpatient Professional Services Rounds encounters CPT codes audited and deemed correct? • > 95% = Green; > 80% and < 95% = Yellow; < 80% = Red

  37. DQ Review List/Statement Availability/Accuracy • Question C.5. Inpatient Records (continued) • C.5.i) What percentage of completed and current (signed within the past 12 months) DD Forms 2569 (TPC Insurance Info) are available for audit? (How the patient answered is only relevant to answering “Question 6f”) • The DD Forms 2569 need to be available and current at the time of the audit to be in compliance with the UBO program. • > 95% = Green; > 80% and < 95% = Yellow; < 80% = Red • Options for filing DD Form 2569: • Maintain a hardcopy DD Form 2569 in the medical record • Scan the DD Form 2569 and store it electronically • Store the hardcopy DD Form 2569 in the MTF RMO/Business/TPC Office

  38. AFMS TPC Central Contract Transition

  39. DQ Review List/Statement Availability/Accuracy • Question C.5. Inpatient Records (continued) • C.5.j) What percentage of available, current, and complete DD Forms 2569 are verified to be correct in the Patient Insurance Information (PII) module in CHCS? Internal Process based on Question 6e. Does not apply to OCONUS bases. • > 95% = Green; > 80% and < 95% = Yellow; < 80% = Red

  40. DQ Review List/Statement Availability/Accuracy • Question C.6. (Data Quality Statement series 6 questions) Outpatient Records • C.6.a) Is the documentation of the encounter selected to be audited available? Documentation includes documentation in the medical record, loose (hard copy) documentation, or an electronic record of the encounter in AHLTA. (Denominator equals sample size.) • C.6.b) What is the percentage of E & M codes deemed correct? (E & M code must comply with current DoD guidance.) • C.6.c) What is the percentage of ICD-9 codes deemed correct? • C.6.d) What is the percentage of CPT codes deemed correct? (CPT code must comply with current DoD guidance.) Source for a, b, c, d is Audit Tool • > 95% = Green; > 80% and < 95% = Yellow; < 80% = Red

  41. DQ Review List/Statement Availability/Accuracy • Question C.6. Outpatient Records (continued) • C.6.e) What percentage of completed and current (signed within the past 12 months) DD Forms 2569 (TPC Insurance Info) are available for audit? • Audit Tool Generated/Internal Process (this metric only measures whether or not a DD Form 2569 was collected/current in the record at the time of the encounter). The DD Forms 2569 need to be available and current at the time of the audit to be in compliance with the UBO program. • C.6.f) What percentage of available, current, and complete DD Forms 2569 are verified to be correct in the Patient Insurance Information (PII) module in CHCS? Internal Process based on Question 6e. Does not apply to OCONUS bases. • > 95% = Green; > 80% and < 95% = Yellow; < 80% = Red

  42. DQ Review List/Statement Availability/Accuracy • Question C.7. Ambulatory Procedure Visits (Data Quality Statement series 7 questions) (C.7.a,b,c,d,e) • Questions C.7.a,b,c,d,e are the same as Questions C.6.a,c,d,e,f • > 95% = Green; > 80% and < 95% = Yellow; < 80% = Red

  43. DQ Review List/Statement Completeness • Question C.8. (Data Quality Statement series 8 questions) Comparison of reported workload data. • C.8.a) # SADR Encounters (count only) ÷ # WWR Visits. Source is BDQAS • C.8.b) # SIDR Dispositions ÷ # WWR Dispositions. Source is BDQAS • C.8.c) # EAS Visits ÷ # WWR Visits. Source is BDQAS • C.8.d) # EAS Dispositions ÷ # WWR Dispositions. Source is BDQAS • C.8.e) # of Inpatient Professional Services Rounds SADR encounters (FCC=A***) ÷ Sum of WWR (Total Bed Days + Total Dispositions). Note: FY10 goal is 80% (will be graded red and green only). Source is Monthly Statistical Report (Internal Process) • > 95% = Green; > 80% and < 95% = Yellow; < 80% = Red
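Each C.8 comparison is a ratio of two independently reported counts, expressed as a percentage; agreement close to 100% suggests both systems captured the same workload. A hedged sketch with hypothetical counts (names and numbers are illustrative only):

```python
def workload_ratio(numerator, denominator):
    """C.8-style comparison of two reported workload counts,
    e.g. SADR encounters vs. WWR visits, as a percentage."""
    return 100.0 * numerator / denominator

# Hypothetical monthly counts for one MTF
sadr_encounters, wwr_visits = 11_820, 12_000
c8a = workload_ratio(sadr_encounters, wwr_visits)  # 98.5 -> Green (> 95%)
```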

  44. DQ Review List/Statement AHLTA Use • Question E.4.i. (Data Quality Statement question 9) System Design, Development, Operations, and Education/Training. • a. # of AHLTA SADR encounters ÷ # of Total SADR encounters (ALL SADR encounters, including APV and ER). Source is BDQAS. Note: This question is to gauge the use of AHLTA at our MTFs. It is understood that not all clinical modules are deployed in the current version of AHLTA. • > 80% = Green; < 80% = Red

  45. DQ Statement Awareness • Question 11. I am aware of data quality issues identified by the completed DQ Statement and DQMC Review List and, when needed, have incorporated monitoring mechanisms and have taken corrective actions to improve the data from my facility. (Electronic Signature Authorized) • “Yes” = Green; “No” = Red (comments required) • Do not use N/A

  46. Electronic DQ (eDQ) Review List and Statement • Automates DQ Review List and Statement production at the MTF • Eliminates repetitive consolidation at various higher HQ levels • Will enable all involved to spend more time correcting DQ, improving processes, and enhancing decision-making • Will be housed on Vector Check • Way Ahead (no firm ECD, but it’s coming): • Prototype almost complete • AFMOA leadership recently funded it to completion • Deploy at test sites, collect feedback, adjust, then deploy AF-wide • Design/implement performance metrics

  47. eDQ Access via Vector Check

  48. eDQ Review List Main Page

  49. eDQ Review List Sample View

  50. eDQ Rejection Sample
