
Performance Improvement Projects (PIPs) Technical Assistance for Florida Medicaid NHDPs



Presentation Transcript


  1. WELCOME to the PIP Technical Assistance Training for Florida NHDPs. We will begin shortly. Please place your phone on mute, unless you are speaking. Thank you.

  2. Performance Improvement Projects (PIPs) Technical Assistance for Florida Medicaid NHDPs. August 22, 2007. Christi Melendez, RN, PIP Review Team Project Leader

  3. Presentation Outline • Purpose • Review of PIP Activities II through X • Review of the PIP submission process for the 2007-2008 validation cycle • Questions and Answers

  4. PURPOSE • To provide technical assistance, with examples, for Activities receiving an overall score of Partially Met or Not Met for the 2006-2007 validation cycle. • To review the PIP submission process for the 2007-2008 validation cycle.

  5. Activity Two: The Study Question HSAG Evaluation Criteria • The study question stated the problem to be studied in simple terms. • Was answerable. *In general, the question should take the form: Does doing X result in Y?

  6. Examples of Study Questions • Do targeted interventions increase the number of members completing an Advance Directive within the first 30 days of enrollment? • Will member interventions increase the rate of members who receive a flu vaccine? • Will targeted interventions decrease the rate of missed personal care aide visits?

  7. Activity Three: Selected Study Indicators HSAG Evaluation Criteria • Was well defined, objective, and measurable. • Was based on practice guidelines, with sources identified. If no practice guidelines were available for the topic, please specify. • Aligned with the study question(s) and allowed for the study question(s) to be answered.

  8. Activity Three: Selected Study Indicators HSAG Evaluation Criteria (cont.) • Measured changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives. • There were data available to be collected on each indicator. • Included the basis for how each indicator was developed.

  9. Activity Three: Study Indicator EXAMPLE

  10. Activity Four: The Study Population HSAG Evaluation Criteria • Was accurately and completely defined. • Included requirements for the length of a member's enrollment in the NHDP. If enrollment is not applicable, this should be stated in the PIP text. • Captured all members to whom the study question applies. • Included ICD-9 codes and procedure codes (if applicable).

  11. Activity Four: The Study Population Example: All members 65 years of age or older who were continuously enrolled in the NHDP for at least 30 days during 1/1/06-12/31/06.

  12. Activity Five: Sampling Techniques HSAG Evaluation Criteria • The true or estimated frequency of occurrence was provided and considered in the sampling technique. • Sample size was specified. • Confidence level was specified. • Acceptable margin of error was specified.

  13. Activity Five: Sampling Techniques HSAG Evaluation Criteria (cont.) • The sampling technique ensured a representative sample of the eligible population. • Sampling techniques were in accordance with generally accepted principles of research design and statistical analysis. Valid sampling techniques that can be replicated using the reported sampling parameters should be used for all study indicators.
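The sampling parameters listed above (estimated frequency of occurrence, confidence level, and margin of error) determine the required sample size. The sketch below shows the standard sample-size formula for a proportion; the function name, defaults, and the eligible-population figure of 1,200 are illustrative assumptions, not values from the training.

```python
import math

def sample_size(confidence_z=1.96, margin_of_error=0.05,
                expected_rate=0.5, population=None):
    """Minimum sample size for estimating a proportion.

    confidence_z    : z-score for the confidence level (1.96 for 95%)
    margin_of_error : acceptable margin of error (0.05 = +/- 5 points)
    expected_rate   : true or estimated frequency of occurrence;
                      0.5 is the most conservative choice
    population      : eligible population size, used for the finite
                      population correction (None = very large)
    """
    n = (confidence_z ** 2) * expected_rate * (1 - expected_rate) \
        / margin_of_error ** 2
    if population is not None:
        # finite population correction shrinks n for small populations
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# 95% confidence, +/-5% margin of error, conservative 50% rate
print(sample_size())                 # 385 for a very large population
print(sample_size(population=1200))  # smaller n for a 1,200-member population
```

Reporting these three parameters alongside the resulting sample size is what makes the technique replicable, as the criteria above require.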

  14. Activity Five: Sampling Technique EXAMPLE

  15. Study Implementation Phase, Activity V: Sampling Technique Example

  16. Activity Six: Data Collection HSAG Evaluation Criteria • Data elements to be collected were clearly identified. • The data sources were clearly identified. • A systematic method for data collection was outlined in the PIP documentation. • A timeline included both starting and ending dates for all measurement periods.

  17. Activity Six: Data Collection For Manual Data Collection • The relevant education, experience, and training of all manual data collection staff were described in the PIP text. • The manual data collection tool was included with the PIP submission. • A discussion of the interrater reliability process was documented in the PIP text.

  18. Activity Six: Data Collection HSAG Evaluation Criteria (cont.) • Written instructions for the manual data collection tool were clearly and succinctly written and included in the PIP documentation. • A brief statement about the purpose of the study (overview) was included in the written instructions for the manual data collection tool.

  19. Activity Six: Data Collection For Administrative Data Collection • Documentation should include a systematic process of the steps used to collect data. This can be defined in narrative format or with algorithms/flow charts. • The estimated degree of administrative data completeness should be included along with an explanation of how the percentage of completeness was calculated.

  20. Activity Six: Data Collection

  21. Activity Six: Data Collection

  22. Activity Seven: Improvement Strategies HSAG Evaluation Criteria • A completed causal/barrier analysis, with an explanation of how the intervention(s) relate to the causes/barriers identified through data analysis and quality improvement processes, should be included in the PIP documentation. • System interventions that will have a permanent effect on the outcomes of the PIP should be documented in the text.

  23. Activity Seven: Improvement Strategies HSAG Evaluation Criteria (cont.) • If repeat measures do not yield statistically significant improvements, there should be an explanation of how problem solving and data analysis were performed to identify possible causes. • If quality improvement interventions were successful, it should be documented that the interventions were standardized and monitored.

  24. How to Perform a Causal/Barrier Analysis Determine why an event or condition occurs. • What's the problem? Define what the problem is and why it's a concern. • Determine the significance of the problem. Look at the data to see how the problem impacts your members and/or health plan. • Identify the causes/barriers. Conduct analysis of chart review data, surveys, and focus groups; brainstorm at quality improvement committee meetings; review the literature. • Develop and implement interventions based on the barriers identified.

  25. Causal/Barrier Analysis Methods and Tools • Methods: quality improvement committee; internal task force • Tools: fishbone diagram; process mapping; barrier/intervention table

  26. Barrier/Intervention Table EXAMPLE

  27. Activity Eight: Data Analysis and Interpretation of Study Results HSAG Evaluation Criteria The data analysis: • Was conducted according to the data analysis plan in the study design. • Allowed for generalization of the results to the study population if a sample was selected. • Identified factors that threaten internal or external validity of findings (change in demographic population, acquiring another health plan's members, change in the IS system, change in health plan).

  28. Activity Eight: Data Analysis and Interpretation of Study Results HSAG Evaluation Criteria (cont.) • Included an interpretation of findings. • Was presented in a way that provides accurate, clear, and easily understood information. • Identified initial measurement and remeasurement of study indicators. • Identified statistical differences between initial measurement and remeasurement.

  29. Activity Eight: Data Analysis and Interpretation of Study Results HSAG Evaluation Criteria (cont.) • Identified factors that affect the ability to compare initial measurement with remeasurement (changes to the methodology, change in time periods, seasonality, or a change in vendors). • Included the extent to which the study was successful.

  30. Activity Eight: Data Analysis and Interpretation of Study Results Example: Baseline Interpretation The baseline data collection showed that 14.1 percent of members completed an Advance Directive during the baseline timeframe of 1/1/05-12/31/05.

  31. Activity Eight: Data Analysis and Interpretation of Study Results Example: Remeasurement 1 The rate of members completing an Advance Directive increased from the baseline of 14.1 percent to 21.7 percent in the first remeasurement (1/1/06-12/31/06). This represents a statistically significant increase (p = 0.00167).

  32. Activity Eight: Data Analysis and Interpretation of Study Results Example: Remeasurement 2 The rate increased from 21.7 percent in the first remeasurement to 27.6 percent in the second remeasurement (1/1/07-12/31/07). This increase was not statistically significant (p = 0.0699).
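Comparisons like the ones in the examples above, where a baseline rate is tested against a remeasurement rate, are commonly made with a two-proportion z-test. A minimal stdlib-only sketch follows; the member counts are hypothetical, chosen only so the rates land near the 14.1 and 21.7 percent figures in the example, and are not the plan's actual denominators.

```python
import math

def two_proportion_z_test(success1, n1, success2, n2):
    """Two-sided z-test for the difference between two proportions.

    Returns (z, p_value). Uses the pooled proportion for the
    standard error, appropriate when testing p1 == p2.
    """
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # two-sided p value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: baseline 42/298 (~14.1%) vs. remeasurement 65/300 (~21.7%)
z, p = two_proportion_z_test(42, 298, 65, 300)
print(f"z = {z:.3f}, p = {p:.4f}")
```

Reporting the actual p value, as Activity IX requires, lets the reviewer confirm whether the change clears the chosen significance threshold rather than taking "significant" on faith.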

  33. Activity Eight: Data Analysis and Interpretation of Study Results Overall Analysis: The rate of members completing an Advance Directive increased each year, with a statistically significant increase in the first remeasurement and no decline in performance. The study has been successful in increasing the rate of members completing an Advance Directive and will be continued until the goal is met.

  34. Activity Nine: Assessing for Real Improvement HSAG Evaluation Criteria • The use of the same methodology for baseline and remeasurement should be documented. • If there was a change in methodology, the issue should be discussed in the PIP text with a justification of the needed changes. • Documentation should include how the intervention(s) were successful in affecting system-wide processes or health care outcomes.

  35. Activity Nine: Assessing for Real Improvement HSAG Evaluation Criteria (cont.) • The improvement in performance as a result of the intervention(s) should be documented in the text of the PIP. • The PIP documentation should include calculations and reports on the degree to which the intervention(s) were statistically significant. • The table in Activity IX should be completely filled out for each measurement period. The actual p values should be documented, along with whether or not each value was statistically significant.

  36. Activity Nine: Assessing For Real Improvement Example: Completed Table

  37. Activity Ten: Assessing for Sustained Improvement HSAG Evaluation Criteria * This activity is not assessed until baseline and a minimum of two annual measurements have been completed. • Demonstrated improvement in all the study indicators should be explained in the text of the PIP. • If there was a decline in results, the PIP text should have an explanation of this decline and what follow-up activities are planned.

  38. New PIP submissions • New PIPs are those not submitted for the 2006-2007 validation cycle. • For new PIP submissions, it is important to contact HSAG to obtain the most current updated PIP Summary Form.

  39. How to submit continuing PIPs • On-going PIPs (submitted to HSAG for the 2006-2007 validation cycle). • Highlight, bold, or add text in a different color, and date any new information that is added to the existing PIP Summary Form. • Strikethrough and date any information that no longer applies to the PIP study. • Ensure all Partially Met and Not Met evaluation elements from the previous validation cycle have been addressed in the documentation.

  40. Resources • Frequently asked questions (FAQs) and PIP information - myfloridaeqro.com • NCQA Quality Profiles - http://www.qualityprofiles.org/index.asp • Institute for Healthcare Improvement – www.ihi.org • Center for Healthcare Strategies – www.chcs.org • Health Care Quality Improvement Studies in Managed Care Settings – A Guide for State Medicaid Agencies www.ncqa.org/publications • National Guideline Clearinghouse – www.guidelines.gov • Agency for Healthcare Research and Quality – www.ahrq.gov

  41. Deliverables September 7th: NHDPs notified electronically of submission date with instructions October 5th: Submit PIP studies to HSAG * HSAG will be validating two PIPs per NHDP; one clinical and one nonclinical. If the collaborative PIP is clinical, the other PIP chosen for validation will be nonclinical.

  42. PIP Tips 1. Complete the demographic page before submission. 2. Notify HSAG when the PIP documents are uploaded to the secure ftp site and state the number of documents uploaded. 3. Label ALL attachments and reference them in the body of the PIP study. 4. HSAG does not require personal health information to be submitted. Submit only aggregate results. 5. Document, document, and document!! Go to myfloridaeqro.com for FAQs or contact Cheryl Neel at cneel@hsag.com with any questions.

  43. HSAG Contacts For questions contact: • Cheryl Neel • cneel@hsag.com • 602.745.6201 • Denise Driscoll • ddriscoll@hsag.com • 602.745.6260

  44. Questions and Answers
