
Introduction to Encounter Data Validation


Presentation Transcript


  1. Introduction to Encounter Data Validation Presenter: Thomas Miller, MA Executive Director, Research and Analysis Team 1

  2. Welcome • About me • Rules for engagement • Presentation overview • The importance of encounter data • Trends in Federal policy • CMS protocols • Florida EDV study 2

  3. Objectives 3

  4. 4

  5. Importance of Encounter Data • Accurate and complete data are critical to success of managed care programs • Essential for overall management and oversight of Florida’s Medicaid program • Ability to monitor and improve quality of care • Establish performance measures • Generate accurate and reliable reports • Obtain utilization and cost information 5

  6. Importance of Encounter Data 6

  7. Importance of Encounter Data • Used by MCOs and the State for many purposes • Performance measure development and calculation • Performance improvement measurement • Focused studies/quality activities • Rate-setting • Compliance monitoring • Provider practice patterns 7

  8. Key Trends • Importance of Federal and State monitoring • Development of core measurement sets • Medicare versus Medicaid • Health care reform • Holding health care accountable • Data, not anecdotes 8

  9. Key Trends in the News 9

  10. Key Trends • Findings from a recent article in Medicare and Medicaid Research Review, Assessing the Usability of MAX 2008 Encounter Data for Comprehensive Managed Care • Objective: Assess availability, completeness, quality, and usability of encounter data • Results: High rates for reporting by key encounter data types • Conclusions: Completeness and quality of encounter data were high 10

  11. Objectives 11

  12. 12

  13. Objectives 13

  14. EQR Protocol • Developed and refined with the maturation of the External Quality Review program 14

  15. EQR Protocol • Specific guideline for External Quality Review Organizations (EQRO) to use when assessing completeness and accuracy of encounter data. • Data submitted by Managed Care Organizations (MCO) to the State 15

  16. EQR Protocol • State establishes standards for encounter data • State must establish the following standards: • Definition of “encounter” • Types of encounters • Data accuracy and completeness • Objective standards for data comparison 16
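To make the slide concrete, the sketch below shows one way the State-defined standards could be captured as a machine-checkable configuration. This is not part of the EQR protocol; every field name, encounter type, and threshold is an illustrative assumption.

```python
# Hypothetical sketch only: the protocol requires the State to define these
# standards, but the structure, field names, and thresholds below are invented
# for illustration and are not Florida's actual standards.
from dataclasses import dataclass

@dataclass
class EncounterStandard:
    encounter_type: str            # e.g., "professional", "dental", "pharmacy"
    required_fields: list[str]     # data elements that must be populated
    completeness_threshold: float  # minimum acceptable proportion present
    accuracy_threshold: float      # minimum acceptable proportion matching source records

STANDARDS = [
    EncounterStandard(
        encounter_type="professional",
        required_fields=["recipient_id", "provider_id", "date_of_service",
                         "primary_diagnosis", "procedure_code"],
        completeness_threshold=0.98,
        accuracy_threshold=0.95,
    ),
]
```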

  17. EQR Protocol • Five core activities • Review state requirements • Review MCO’s capability • Analyze electronic encounter data • Review of medical records • Submission of findings and recommendations 17

  18. EQR Protocol • Attachment A: Encounter Data Tables • Table 2: Data Element Validity Requirements 18

  19. EQR Protocol • Five core activities • Review state requirements • Develop understanding of State-specific policies and procedures for collecting and submitting encounter data • Identify data exchange protocols and layouts • Evaluate encounter data system interchange flows, including system edits and submission timelines • Review existing encounter data quality activities, requirements, and performance standards 19

  20. EQR Protocol • Five key activities, continued • Review MCO’s capability • Develop, conduct, and review MCO’s Information System Capabilities Assessment • Identification of IS vulnerabilities • Key findings address: • Data processing and procedures • Claims/encounter processing and system demonstration • Enrollment • Key informant interviews 20

  21. EQR Protocol • Five key activities, continued • Analyze electronic encounter data • STEP 1 - Develop data quality test plan to determine: • Magnitude and type of missing encounter data • Overall data quality issues • MCO data submission issues 21

  22. EQR Protocol • Five key activities, continued • Analyze electronic encounter data • STEP 2 - Verify integrity of encounter data • Macro-level analysis • Encounter file completeness and reasonableness • Volume and utilization by encounter type and service setting • Internal field consistency • General field completeness and validity 22
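As one illustration of the macro-level checks in Step 2, the sketch below tabulates volume by encounter type and service setting and flags a simple internal-consistency violation. It assumes a pandas DataFrame with hypothetical column names; the protocol does not prescribe a particular implementation.

```python
# Illustrative macro-level checks, assuming a pandas DataFrame of encounters
# with hypothetical column names; not a prescribed implementation of Step 2.
import pandas as pd

def volume_by_type(df: pd.DataFrame) -> pd.Series:
    """Encounter volume by encounter type and service setting, for reasonableness review."""
    return df.groupby(["encounter_type", "service_setting"]).size()

def internal_consistency_flags(df: pd.DataFrame) -> pd.DataFrame:
    """Return records whose fields contradict each other (one assumed example rule)."""
    discharge = pd.to_datetime(df["discharge_date"], errors="coerce")
    admission = pd.to_datetime(df["admission_date"], errors="coerce")
    return df[discharge < admission]
```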

  23. EQR Protocol • Five key activities, continued • Analyze electronic encounter data • STEP 3 – Generate and Review Analytic Reports • Micro-level analysis • Encounter record completeness and reasonableness • Follows similar analysis as outlined in Step 2 • Analyzing volume/consistency by time, provider, service type 23
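The micro-level reporting in Step 3 can be pictured as simple groupings over the encounter file. The sketch below, with assumed column names, produces the kind of volume-over-time view described here; it is an example, not the protocol's specified report.

```python
# Example only: encounter volume by service month, encounter type, and provider,
# using assumed column names.
import pandas as pd

def volume_report(encounters: pd.DataFrame) -> pd.DataFrame:
    df = encounters.copy()
    df["service_month"] = pd.to_datetime(df["date_of_service"]).dt.to_period("M")
    return (df.groupby(["service_month", "encounter_type", "provider_id"])
              .size()
              .rename("encounter_count")
              .reset_index())
```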

  24. EQR Protocol • Five key activities, continued • Analyze electronic encounter data • STEP 4 – Compare findings to state-identified standards • Identification of appropriate benchmark population 24

  25. EQR Protocol • Five key activities, continued • Review of medical records • Verification of the accuracy of coding • Protocol assumptions • STEP 1 – Determine sampling for medical record review • Identify valid sample size • Encounter- vs. recipient-based samples 25
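One common way to arrive at a valid sample size is the standard formula for estimating a proportion with a finite population correction. The protocol leaves the exact method and confidence parameters to the State and its EQRO, so the defaults below are only illustrative.

```python
# Illustrative sample-size calculation for a proportion estimate with finite
# population correction; parameter values are examples, not protocol requirements.
import math

def sample_size(population: int, margin_of_error: float = 0.05,
                confidence_z: float = 1.96, p: float = 0.5) -> int:
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# e.g., sample_size(25_000) -> 379 sampled encounters (or recipients,
# depending on whether an encounter- or recipient-based sample is chosen)
```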

  26. EQR Protocol • Five key activities, continued • Review of medical records • STEP 2 – Obtain and review medical records and document findings • Procurement efficiencies • Abstraction staff and training • Categorization of errors by level, type, and source • Procurement tracking and abstraction tools 26
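The categorization of errors mentioned above can be represented as a small record structure. The levels, types, and sources below are hypothetical placeholders, not the study's actual error taxonomy.

```python
# Hypothetical abstraction-finding record; the category values are placeholders.
from dataclasses import dataclass
from enum import Enum

class ErrorLevel(Enum):
    RECORD = "record"   # e.g., encounter not supported by the medical record
    FIELD = "field"     # e.g., a coded value differs from the documentation

@dataclass
class AbstractionError:
    encounter_id: str
    level: ErrorLevel
    error_type: str     # e.g., "diagnosis code mismatch"
    source: str         # e.g., "MCO submission", "provider documentation"
```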

  27. EQR Protocol • Five key activities, continued • Submission of findings • Narrative report summarizing findings from Activities 1-4 • Actionable recommendations for overall encounter data quality improvement 27

  28. Questions? Proto what? 28

  29. Objectives 29

  30. 30

  31. Objectives 31

  32. SFY 2013-2014 Encounter Data Validation (EDV) Study Agency for Health Care Administration Validation of Encounter Data 32

  33. SFY 2013-2014 Encounter Data Validation (EDV) Study • Review proposed encounter data validation process • Submitted as part of EQR RFP response • Will be conducted in alignment with CMS’ EQR Protocol 4 • Evaluates the accuracy and completeness of encounter data submitted to AHCA by capitated health plans 33

  34. SFY 2013-2014 Encounter Data Validation (EDV) Study • Background • Experience • Core competency evaluating data • Information system reviews • Comparative analyses of MCO and State Medicaid data • Medical/clinical record review • Methodology is constructed to provide an effective validation of the quality of data maintained by State agencies within resource requirements 34

  35. SFY 2013-2014 Encounter Data Validation (EDV) Study 35

  36. SFY 2013-2014 Encounter Data Validation (EDV) Study • Four key steps for conducting successful evaluations • Project implementation • Study design • Data collection & analysis • Reporting & recommendations 36

  37. SFY 2013-2014 Encounter Data Validation (EDV) Study • Project Implementation • Kick-off meeting with AHCA • Initiated during contract implementation period • Review and define overall scope of project • Discuss anticipated timelines • Define evaluation parameters • Number of MCOs included • Data requirements and limitations • Implementation procedures to validate AHCA’s encounter data • Kick-off meeting with participating MCOs • Description of project and finalized study methodology • Expectations for MCO involvement 37

  38. SFY 2013-2014 Encounter Data Validation (EDV) Study • Study design • Prepare draft methodology including: • Study objectives and research questions • Data sources and collection procedures • Measurement methodology • Analytic methods • Timeline • Review and approval of methodology by AHCA • Development of a detailed analysis plan or technical companion document to the methodology 38

  39. SFY 2013-2014 Encounter Data Validation (EDV) Study • Data collection and analysis • Information systems review • Scope to be defined in collaboration with AHCA • Identify key encounter data policies and procedures • Selection of key evaluation fields, service groups, and encounter types • Identification of existing/proposed standards • Review of processes affecting data quality • Expected to be limited in scope • Focused on building contextual knowledge of systems to facilitate development of effective and actionable recommendations 39

  40. SFY 2013-2014 Encounter Data Validation (EDV) Study • Data collection and analysis • Information systems review, continued • Request for supplemental documents • Encounter data submission process • Previous studies conducted by AHCA • Documentation will be used to assess encounter data quality • Use of the NCQA® Roadmap where appropriate 40

  41. SFY 2013-2014 Encounter Data Validation (EDV) Study • Data collection and analysis, continued • Encounter data source files • Review of State encounter data file layouts • Prepare data requirements documents • Receive, process, and load encounter data • Final status encounters from the Florida Medicaid Management Information System and Decision Support System (FMMIS/DSS) • Final status claims/encounters from MCO adjudication systems • Includes all claim/service types—i.e., inpatient/outpatient, physician visits, dental, and pharmaceutical 41
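Receiving, processing, and loading the extracts might look like the sketch below. The delimiter, file names, and column list are assumptions made for illustration; the actual FMMIS/DSS and MCO extract layouts would be specified in the data requirements documents.

```python
# Assumed pipe-delimited extract layout; every file name and column here is
# hypothetical and would really come from the data requirements documents.
import pandas as pd

COLUMNS = ["icn", "recipient_id", "provider_id", "date_of_service",
           "encounter_type", "primary_diagnosis", "procedure_code", "paid_amount"]

def load_extract(path: str) -> pd.DataFrame:
    df = pd.read_csv(path, sep="|", names=COLUMNS, dtype=str, header=None)
    df["date_of_service"] = pd.to_datetime(df["date_of_service"], errors="coerce")
    return df

# state_encounters = load_extract("fmmis_final_status.txt")   # hypothetical file
# mco_encounters = load_extract("mco_adjudicated.txt")        # hypothetical file
```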

  42. SFY 2013-2014 Encounter Data Validation (EDV) Study • Data collection and analysis • Comparative data analysis of State and MCO encounter data • Evaluates the extent to which encounters submitted by MCOs to AHCA are accurate, complete, and reasonable • Preliminary file review • Ensures files are sufficient for processing • Involves the basic checks • Percentage present • Percentage valid • Percentage valid values 42
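The basic checks listed above could be summarized per field as in the sketch below. The allowed-value rules passed in are assumed inputs, not AHCA's actual edit specifications.

```python
# Example per-field summary of the basic checks: percent present and percent of
# populated values falling within an allowed set.
import pandas as pd

def preliminary_review(df: pd.DataFrame, valid_values: dict[str, set]) -> pd.DataFrame:
    rows = []
    for col in df.columns:
        series = df[col]
        pct_present = series.notna().mean() * 100
        present = series.dropna()
        if col in valid_values and len(present):
            pct_valid = present.isin(valid_values[col]).mean() * 100
        else:
            pct_valid = float("nan")
        rows.append({"field": col,
                     "pct_present": round(pct_present, 2),
                     "pct_valid_values": round(pct_valid, 2)})
    return pd.DataFrame(rows)
```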

  43. SFY 2013-2014 Encounter Data Validation (EDV) Study • Data collection and analysis, continued • Comparison: State data to MCO data • Indicators to measure degree of completeness and accuracy for each encounter type • Overall record matching—percentage of state encounters present in MCO files • Field-level matching—percentage of state encounters with exact value match in MCO file for each select data element • Standard fields include: date of service, recipient ID, provider ID, primary diagnosis, procedure code(s), and payment fields 43
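The two indicators described above reduce to a key match and a field-by-field comparison. The sketch below assumes both extracts share a claim key (here called icn) and the listed standard fields; all names are illustrative.

```python
# Illustrative match-rate calculations, assuming a shared claim key ("icn") and
# hypothetical standard field names in both extracts.
import pandas as pd

MATCH_FIELDS = ["date_of_service", "recipient_id", "provider_id",
                "primary_diagnosis", "procedure_code", "paid_amount"]

def record_match_rate(state: pd.DataFrame, mco: pd.DataFrame, key: str = "icn") -> float:
    """Percent of state encounters present in the MCO file."""
    return state[key].isin(mco[key]).mean() * 100

def field_match_rates(state: pd.DataFrame, mco: pd.DataFrame, key: str = "icn") -> pd.Series:
    """For matched records, percent with an exact value match in each standard field."""
    merged = state.merge(mco, on=key, suffixes=("_state", "_mco"))
    return pd.Series({
        f: (merged[f + "_state"] == merged[f + "_mco"]).mean() * 100
        for f in MATCH_FIELDS
    })
```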

  44. SFY 2013-2014 Encounter Data Validation (EDV) Study 44

  45. SFY 2013-2014 Encounter Data Validation (EDV) Study 45

  46. Phew… Questions? 46

  47. SFY 2013-2014 Encounter Data Validation (EDV) Study • Data collection and analysis, continued • Medical record review • Represents the “gold standard” • Evaluation of service-level accuracy and completeness • Proposed methodology • Only include MCOs operational as of January 2013 • EQRO Contract Years 1, 2, and 3 (7/1/2013-6/30/2016): review one-third of selected plans each year • EQRO Contract Years 4 and 5 (7/1/2016-6/30/2018): review one-half of selected plans each year • Procure and abstract 25 percent of all sampled records each quarter • Minimum 50 cases reviewed per plan • Target professional, dental, and pharmacy encounters 47

  48. SFY 2013-2014 Encounter Data Validation (EDV) Study • Data collection and analysis, continued • Medical record review • Sample selection methodology • To generate the list of randomly selected encounters for medical record review, HSAG proposes using the data files from the comparative analyses • A two-stage stratified sampling design is used to ensure: • A member’s record is selected only once • The number of encounters included in the final sample covers all encounter types and is proportional to the total distribution of encounters 48

  49. SFY 2013-2014 Encounter Data Validation (EDV) Study • Data collection and analysis, continued • Medical record review • Sample selection methodology • Identify all users by encounter type per MCO • Determine the required sample size for each encounter type based on total distribution of users • Randomly select users from each encounter type based on required sample size • Identify all encounters associated with applicable encounter types for the selected users • Final sample will consist of 50 cases randomly selected from applicable encounter types per MCO per year, or 1,200 cases for 1/3 of all MCOs being reviewed per year • For each encounter type, HSAG will define specific data elements for validation 49
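Read procedurally, the two-stage design above could be sketched as follows. The column names, proportional allocation, and per-type minimum are assumptions made for illustration; HSAG's actual allocation rules govern the real study.

```python
# Sketch of the two-stage selection: stratify users by encounter type, draw users
# proportionally, then pull their encounters. Column names and the allocation
# rule are illustrative assumptions.
import pandas as pd

def two_stage_sample(encounters: pd.DataFrame, total_cases: int = 50,
                     seed: int = 42) -> pd.DataFrame:
    # Stage 1: identify users per encounter type and allocate the sample
    # proportionally to the distribution of users across types.
    users = encounters[["recipient_id", "encounter_type"]].drop_duplicates()
    allocation = (users["encounter_type"].value_counts(normalize=True)
                  * total_cases).round().astype(int).clip(lower=1)
    sampled = []
    for etype, n in allocation.items():
        pool = users[users["encounter_type"] == etype]
        sampled.append(pool.sample(n=min(n, len(pool)), random_state=seed))
    selected_users = pd.concat(sampled)
    # Stage 2: pull the encounters of the applicable types for the selected users,
    # so each member's record enters the sample only once per stratum.
    return encounters.merge(selected_users, on=["recipient_id", "encounter_type"])
```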

  50. SFY 2013-2014 Encounter Data Validation (EDV) Study • Data collection and analysis, continued • Medical record review • Procurement of selected sample records • General Process • Once sample is selected, each MCO to receive list of its study cases • HSAG will match selected date of service for each sampled member with rendering provider • MCOs will procure and submit identified medical records to HSAG for review 50
