An Update on Routine Provider Monitoring
NC Health Information Management Association Behavioral Health Conference
Mary T. Tripp, Policy Unit Leader, Accountability Team
June 11, 2014
Division of Mental Health, Developmental Disabilities and Substance Abuse Services


Presentation Transcript


  1. An Update on Routine Provider Monitoring NC Health Information Management Association Behavioral Health Conference Mary T. Tripp, Policy Unit Leader Accountability Team June 11, 2014 Division of Mental Health, Developmental Disabilities and Substance Abuse Services

  2. Focus of this Workshop • The Impetus for Streamlining Provider Monitoring • What’s New or Different? • Overview of the Process • Internal Quality Assurance and the Records Professional • Accomplishments to Date • Continued Collaboration

  3. Streamlining Provider Monitoring

  4. Streamlining Provider Monitoring • What happened to Gold Star Monitoring? • Waiver Expansion • Continuous Quality Improvement • Reduce Administrative Burden on Providers and LME-MCOs per SL 2009-451 (SB 202) • Business Practices Subcommittee of the LME-MCO & Provider Standardization Committee

  5. We heard you!!

  6. DHHS-LME/MCO-Provider Collaboration Workgroup • DHHS • Division of Health Service Regulation (DHSR) • Division of Medical Assistance (DMA) • Division of Mental Health, Developmental Disabilities and Substance Abuse Services (DMH/DD/SAS)

  7. DHHS-LME/MCO-Provider Collaboration Workgroup • Stakeholder Groups • NC Council of Community Programs • LME/MCOs • Business Practices Subcommittee of the LME-MCO & Provider Standardization Committee • Benchmarks • NC Providers Council (NCPC) • Professional Association Council (PAC)

  8. DHHS-LME/MCO-Provider Collaboration Workgroup

  9. [Diagram of the collaboration workgroup: Individuals & Families surrounded by the NC Council, Benchmarks, PAC, DMH/DD/SAS, DMA, NC Providers Council, DHSR, and LME-MCOs]

  10. Quality Providers → Quality Services → Best Possible Outcomes for Individuals and Families

  11. What’s New or Different? • Gold Star Provider Monitoring → NC Provider Monitoring Process

  12. What’s New or Different? • Frequency of Routine Monitoring • Routine monitoring occurs on a 2-year cycle rather than annually.

  13. What’s New or Different? • Scoring and Weighting • There have been changes in the scoring and weighting of some of the items on the review tools.

  14. What’s New or Different? • Multiple Sample Sizes and Sample Selection Processes • Incident Reporting • Restrictive Interventions • Complaints • Funds Management • Medication Management

  15. What’s New or Different? • Plans of Correction • Plans of correction are designed to address systemic issues and trends rather than isolated non-compliance issues.

  16. What’s New or Different? • Frequency of Unlicensed AFL Review • Annual reviews for Unlicensed AFLs under the Innovations Waiver • Unlicensed AFLs that are not under the Innovations Waiver are reviewed every 2 years.

  17. What’s New or Different? • Threshold for Passing the Routine Review • The overall passing score has been increased from 75% to 85%. • Thresholds have been set for passing each section of the routine review for agencies.
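As a rough illustration of how the two threshold rules interact, the Python sketch below checks both the 85% overall score and per-section thresholds. The section names and per-section values here are made up for the example; the actual thresholds are set by the review process.

```python
OVERALL_THRESHOLD = 0.85  # overall passing score, raised from 75%

def passes_review(section_scores, section_thresholds):
    """section_scores maps section -> (points_earned, points_possible).
    A review passes only if the overall percentage meets 85% AND
    every listed section meets its own threshold."""
    earned = sum(e for e, _ in section_scores.values())
    possible = sum(p for _, p in section_scores.values())
    if possible == 0 or earned / possible < OVERALL_THRESHOLD:
        return False
    return all(
        section_scores[s][0] / section_scores[s][1] >= threshold
        for s, threshold in section_thresholds.items()
    )

# Illustrative scores: 17/20 overall = 85%, both sections at/above threshold
scores = {"rights": (9, 10), "incidents": (8, 10)}
thresholds = {"rights": 0.85, "incidents": 0.75}
print(passes_review(scores, thresholds))  # True
```

The point of the sketch is that a strong overall score no longer guarantees a pass: a single section falling below its own threshold fails the review.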

  18. What’s New or Different? • Health, Safety & Compliance Review • This site review is done when the service is first initiated (at implementation) or when the program moves to a new location. • This review is not required if the service is located at a site that is licensed by DHSR.

  19. What’s New or Different? • Names of Tools Simplified • Health & Safety Review Tool for Unlicensed AFLs → Unlicensed AFL Review Tool • Health, Safety & Compliance Review Tool → Health, Safety and Compliance Review Tool for Initial Reviews

  20. Overview of Routine Monitoring • Includes: • Routine Review • Post-payment Review • State-funded and Medicaid-funded services

  21. Overview of Routine Monitoring • Sample Selection: Random Sampling

  22. Overview of Routine Monitoring • Versatility of the Review Tools • Any of the tools can be used in whole or in part to conduct program integrity activities or for targeted monitoring or investigations of incidents, complaints, or quality of care concerns.

  23. Overview of Routine Monitoring • The routine review tools are used with two provider types: • Licensed Independent Practitioners • Provider Agencies

  24. Overview of Routine Monitoring • Licensed Independent Practitioners • LIP Office Site Review Tool • Service Plan Checklist • LIP Routine Review Tool • LIP Post-Payment Review Tool

  25. Overview of Routine Monitoring • Licensed Independent Practitioners • LIP Office Site Review Tool • Service Plan Checklist • An on-site review is conducted before the LME-MCO contracts with the provider to assess compliance with state and federal requirements (e.g., confidentiality, HIPAA, and ADA). • A mock record review is conducted to determine the extent to which technical assistance is needed in order for the LIP to meet the records and documentation requirements for publicly funded services. • These tools are used with Licensed Independent Practitioners (LIPs) in a solo practice or with LIPs in a group practice who bill under the same provider number. • The sample size for a LIP in a solo practice, or for LIPs who share the same office space where each LIP has their own contract and provider number with the LME-MCO, is ten (10) service events. • The sample size for LIPs in a group practice who bill under the same provider number is thirty (30) service events. • The Routine Agency Tool and the generic post-payment review tool are used to monitor LIPs who are employed by an agency that provides other services besides basic benefit/outpatient services (e.g., CABHAs).

  26. Overview of Routine Monitoring • LIP Routine Review Tool • This tool is used with LIPs who provide outpatient treatment services or basic benefit services only: • in a solo practice or • in a group practice who bill under the same provider number.

  27. Overview of Routine Monitoring • LIP Routine Review Tool • Areas Reviewed: • Rights Notification • Coordination of Care • Service Availability • Storage of Records

  28. Overview of Routine Monitoring • Licensed Independent Practitioners • Routine Review Tool and Post-Payment Review Tool • The sample for both the routine review tool and the post-payment tool for LIPs is based on paid claims (i.e., service events for which the LIP has billed and been reimbursed).

  29. Overview of Routine Monitoring • Sample Size for LIP Review* • Solo practice: N = 10 service events • Group practice billing under the same provider number: N = 30 service events * Applies to the Routine Review and the Post-Payment Review

  30. Overview of Routine Monitoring • Provider Agencies • Routine Review Tool • Post-Payment Review Tools • Unlicensed AFL Review Tool • Health, Safety & Compliance Review Tool

  31. Overview of Routine Monitoring • Routine Review Tool for Agencies • This tool is used to monitor unlicensed services and services licensed under GS §122C that are not surveyed by DHSR-MHL on an annual basis. • This tool is used to monitor LIPs who are employed by an agency that provides other services besides basic benefit/outpatient services (e.g., CABHAs).

  32. Overview of Routine Monitoring • Provider Agency Review Tool • Basic Areas Reviewed: • Rights Notification • Incidents • Restrictive Interventions • Complaints • Coordination of Care • Service Availability

  33. Overview of Routine Monitoring • Provider Agency Review Tool • The following areas are reviewed when the agency provides any of the following services to the individuals in its program: • Protection of Property • Management of Funds • Medication Administration

  34. Overview of Routine Monitoring • Post-Payment Review Tools for Provider Agencies • Innovations Waiver • Opioid Treatment • Diagnostic Assessment • Residential Services • Day Treatment • Psychiatric Residential Treatment Facilities (PRTF) • Generic PPR Tool

  35. Overview of Routine Monitoring • Post-Payment Review Tools for Provider Agencies • The selection of the PPR tool is determined by the specific service(s) included in the review sample. • Staff qualifications worksheets are provided to assist the reviewer in determining whether the person who provided the services meets the training or educational requirements according to the service definition.

  36. Overview of Routine Monitoring • Post-Payment Review Tools for Provider Agencies • The generic post-payment review tool is used to monitor LIPs employed by an agency that provides other services besides basic benefit/outpatient services (e.g., CABHAs). • The qualifications of the LIP are determined on the basis of the LIP’s licensure unless the service provided by the LIP has specific training or educational requirements.

  37. Overview of Routine Monitoring • Sample Selection for the Agency Review • The sample is randomly selected from: • Paid claims • Level I incidents • Level II & III incidents • Restrictive Interventions • Complaints • Funds Management • Medication Administration

  38. Overview of Routine Monitoring • Sample Size for the Routine Agency Review • Rights Notification; Coordination of Care; Service Availability; Post-Payment Review: N = 30 • Incidents; Restrictive Interventions; Complaints: N = 10 • Funds Management; Medication Administration: N = 5
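The per-domain sample sizes above amount to a small lookup table paired with random selection. The Python sketch below (the domain keys and claim IDs are illustrative, not official identifiers) also falls back to reviewing the whole population when fewer records exist than the target N:

```python
import random

# Target sample sizes per review domain, as listed on the slide above
SAMPLE_SIZES = {
    "rights_notification": 30,
    "coordination_of_care": 30,
    "service_availability": 30,
    "post_payment_review": 30,
    "incidents": 10,
    "restrictive_interventions": 10,
    "complaints": 10,
    "funds_management": 5,
    "medication_administration": 5,
}

def draw_sample(domain, population):
    """Randomly select the required number of records for a domain,
    or the whole population if it is smaller than the target N."""
    n = SAMPLE_SIZES[domain]
    return random.sample(population, min(n, len(population)))

claims = [f"claim-{i}" for i in range(200)]
print(len(draw_sample("post_payment_review", claims)))  # 30
```

The `min(n, len(population))` guard matters in practice: a small agency may have had fewer than ten incidents in the review period, so the sample is simply everything available.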

  39. Some Monitoring Process Points

  40. Selection of the Review Period • How is the timeframe from which the sample is drawn determined?

  41. Selection of the Review Period • When the service event is based on paid claims: • Count back 6 months from the scheduled on-site visit; the sample window spans the next ~3 months (~90 days), leaving time for claims to be fully adjudicated. • Example: The on-site is scheduled for May 1. The sample is drawn from randomly selected claims paid between December 1 and February 28.
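One way to compute the paid-claims window is sketched below in Python. The `gap` and `span` parameters are assumptions chosen to reproduce the slide's worked example (on-site May 1 → claims paid December 1 through February 28), not official policy values:

```python
from datetime import date, timedelta

def _month_shift(d, months):
    """First day of the month `months` after (or before, if negative) d's month."""
    total = d.year * 12 + (d.month - 1) + months
    return date(total // 12, total % 12 + 1, 1)

def review_window(onsite, span=3, gap=2):
    """Paid-claims window: `span` full calendar months, ending `gap`
    full months before the on-site month so claims can fully adjudicate."""
    end = _month_shift(onsite, -gap) - timedelta(days=1)
    start = _month_shift(onsite, -(gap + span))
    return start, end

# Slide's example: on-site May 1, 2014 -> Dec 1, 2013 through Feb 28, 2014
print(review_window(date(2014, 5, 1)))
```

Working in whole calendar months (rather than a flat 90-day offset) keeps the window aligned with billing cycles and matches the example's month-boundary dates.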

  42. Selection of the Review Period • The sample selected for the following items can go back up to 1 year in order to obtain an adequate sample: • Incidents • Restrictive Interventions • Complaints • Funds Management • Medication Review

  43. Scheduling the On-Site Review • Providers will be notified in writing 21-28 calendar days prior to the date of the review.

  44. The Records Needed for the Review • No less than 5 business days prior to the date of the review, providers will be notified of the specific service records needed during the review.
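Because the 5-day notice requirement is in business days rather than calendar days, the arithmetic is easy to get wrong near weekends. This Python sketch counts Monday through Friday only and ignores holidays (an assumption, since the slide does not say how holidays are treated):

```python
from datetime import date, timedelta

def business_days_before(review_date, n):
    """Latest date that is at least n business days (Mon-Fri) before
    review_date, ignoring holidays -- illustrative, not official policy."""
    cur = review_date
    remaining = n
    while remaining > 0:
        cur -= timedelta(days=1)
        if cur.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return cur

# Review on Wednesday June 11, 2014 -> notice due by Wednesday June 4, 2014
print(business_days_before(date(2014, 6, 11), 5))
```

Note that five business days before a mid-week review spans a full calendar week, so providers effectively have about seven calendar days to assemble the requested records.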

  45. Internal Quality Assurance • Make sure all the documentation that is needed to score the items on the tools is organized and easily accessible.

  46. Internal Quality Assurance • Implement a system for routinely monitoring staff knowledge of and compliance with the requirements for service provision and documentation.

  47. Internal Quality Assurance • The tools can be used to conduct periodic self-assessments to identify and remediate compliance issues.

  48. Internal Quality Assurance • Make sure incidents are classified appropriately. • Keep all documentation of administrative reviews and follow-up activities in response to incidents, complaints, and investigations in the same file for easy retrieval during an audit.

  49. Internal Quality Assurance • When requirements change, make sure all staff are informed of the change.

  50. Internal Quality Assurance • Make sure new procedures and requirements are being instituted promptly and consistently as of the effective date of the policy change.
