Performance Improvement Project Validation Process
Presenter: Christi Melendez, RN, CPHQ, Associate Director, PIP Review Team




Presentation Transcript



Performance Improvement Project Validation Process Outcome Focused Scoring Methodology and Critical Analysis

Presenter:

Christi Melendez, RN, CPHQ

Associate Director, PIP Review Team



CMS PIP Protocol Changes

Activities III, IV, VII, and VIII have been reversed in order.

  • Activity III: Use a Representative and Generalizable Study Population

  • Activity IV: Select the Study Indicator(s)

  • Activity VII: Data Analysis and Interpretation of Results

  • Activity VIII: Improvement Strategies



Activity I: Choose the Study Topic

HSAG Evaluation Elements: The Study Topic

  • Is selected following collection and analysis of data (critical element)

  • Has the potential to affect member health, outcomes of care, functional status, or satisfaction



Activity II: State the Study Question

HSAG Evaluation Elements: The Study Question

  • States the problem to be studied in simple terms and is in the recommended X/Y format (critical element)



Activity III: Identify the Study Population

  • HSAG Evaluation Elements: The Study Population

    • Is accurately and completely defined and captures all members to whom the study question applies (critical element)



Activity IV: Select the Study Indicator

HSAG Evaluation Elements: The Study Indicator

  • Is well-defined, objective, and measures changes in health or functional status, consumer satisfaction, or valid process alternatives (critical element)

  • Includes the basis on which the indicator was adopted, if internally developed

  • Allows for the study question to be answered (critical element)



Activity V: Use Valid Sampling Techniques*

  • HSAG Evaluation Elements: Sampling Techniques

    • Specify the measurement period for the sampling methods used

    • Provide the title of the applicable study indicator

    • Identify the population size

    • Identify the sample size (critical element)

    • Specify the margin of error and confidence level

    • Describe in detail the methods used to select the sample

  • * Activity V is only scored if sampling techniques were used.
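The population size, sample size, margin of error, and confidence level listed above are linked by the standard sample-size formula for estimating a proportion. As an illustration only (the formula and the finite population correction are conventional statistics, not values prescribed by HSAG), a minimal Python sketch:

```python
import math

def sample_size(population: int, margin_of_error: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    """Minimum sample size for estimating a proportion.

    z = 1.96 corresponds to a 95% confidence level; p = 0.5 is the
    most conservative assumed proportion (maximizes the sample size).
    """
    # Sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite population correction for the actual eligible population
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Hypothetical example: 10,000 eligible members, 95% confidence, ±5% margin
print(sample_size(10_000))
```

With these conventional parameters, an eligible population of 10,000 members yields a minimum sample of 370; the correction matters most for small populations, while very large populations approach the familiar 385.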



Activity VI: Define Data Collection

  • HSAG Evaluation Elements: Data Collection

  • The data collection procedures:

    • Identify the data elements to be collected

    • Include a defined and systematic process for collecting baseline and remeasurement data



Activity VI: Define Data Collection

  • HSAG Evaluation Elements: Data Collection

  • The manual data collection procedures:

    • Include the qualifications of staff member(s) collecting manual data

    • Include a manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications (critical element)



Activity VI: Define Data Collection

  • HSAG Evaluation Elements: Data Collection

  • The administrative data collection procedures:

    • Include an estimated degree of administrative data completeness

    • Describe the data analysis plan



Activity VII: Analyze Data and Interpret Study Results

  • HSAG Evaluation Elements: Data Analysis

  • Is conducted according to the data analysis plan in the study design

  • Allows for the generalization of results to the study population if a sample was selected (critical element)

  • Identifies factors that threaten internal or external validity of findings

  • Includes an interpretation of findings

  • * Evaluation Elements 1-5 in Activity VII are scored for PIPs that provide baseline data.



Activity VII: Analyze Data and Interpret Study Results

  • HSAG Evaluation Elements: Interpretation of Study Results

    • Is presented in a way that provides accurate, clear, easily understood information (critical element)

    • Identifies the initial measurement and the remeasurement of study indicators

    • Identifies statistical differences between the initial measurement and the remeasurement

    • Identifies factors that affect the ability to compare the initial measurement with the remeasurement

    • Includes an interpretation of the extent to which the study was successful



Activity VIII: Implementing Interventions and Improvement Strategies

  • HSAG Evaluation Elements: Improvement Strategies

    • Are related to causes/barriers identified through data analysis and quality improvement processes (critical element)

    • Are system changes that are likely to induce permanent change

    • Are revised if the original interventions are not successful

    • Are standardized and monitored if interventions are successful



Activity IX: Real Improvement*

  • HSAG Evaluation Elements: Report Improvement

    • The remeasurement methodology is the same as the baseline methodology

    • There is documented improvement in processes or outcomes of care

    • There is statistical evidence that observed improvement is true improvement over baseline (critical element)

    • The improvement appears to be the result of planned intervention(s)

  • * Activity IX is scored when the PIP has progressed to Remeasurement 1 and will be scored on an annual basis until statistically significant improvement is achieved from baseline to a subsequent remeasurement for all study indicators. Once Evaluation Element 3 receives a Met score, it will remain Met for the duration of the PIP.
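Evaluation Element 3 hinges on statistical evidence that the observed improvement over baseline is real rather than chance variation. For a rate-based study indicator this is commonly assessed with a two-proportion z-test; the sketch below uses hypothetical baseline and remeasurement counts (not HSAG data or a prescribed method) to show the mechanics:

```python
import math

def two_proportion_z(successes_base: int, n_base: int,
                     successes_remeas: int, n_remeas: int) -> float:
    """z statistic comparing a baseline rate with a remeasurement rate."""
    p1 = successes_base / n_base          # baseline rate
    p2 = successes_remeas / n_remeas      # remeasurement rate
    # Pooled proportion under the null hypothesis of no change
    pooled = (successes_base + successes_remeas) / (n_base + n_remeas)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_base + 1 / n_remeas))
    return (p2 - p1) / se

# Hypothetical: baseline 60% (246/410) vs. remeasurement ~68% (279/410)
z = two_proportion_z(246, 410, 279, 410)
print("significant at 95%" if abs(z) > 1.96 else "not significant")
```

A |z| above the 1.96 critical value corresponds to statistical significance at the 95% confidence level; a chi-square test on the same 2x2 table gives an equivalent result.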



Activity X: Sustained Improvement*

  • HSAG Evaluation Elements: Sustained Improvement

    • Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant (critical element)

  • *HSAG will not validate Activity X until statistically significant improvement has been achieved across all study indicators. Once statistically significant improvement is achieved, the MCO will need to document a subsequent remeasurement period showing that it sustained that improvement to receive an overall Met validation status.



PIP Tool Format

Old Tool Format

  • 10 Activities
  • 37 Evaluation Elements
  • Activity III: Study Population
  • Activity IV: Study Indicator(s)
  • Activity VII: Data Analysis
  • Activity VIII: Interventions
  • 12 Critical Elements

New Tool Format

  • 10 Activities
  • 53 Evaluation Elements
  • Activity VII: Interventions
  • Activity VIII: Data Analysis
  • 13 Critical Elements



Outcome Focused PIP Scoring

HSAG Evaluation Tool

  • 37 Evaluation Elements Total

  • 12 Critical Elements (CE)

  • Activity I: 1 CE
  • Activity II: 1 CE
  • Activity III: 1 CE
  • Activity IV: 2 CE
  • Activity V: 1 CE
  • Activity VI: 1 CE
  • Activity VII: 2 CE
  • Activity VIII: 1 CE
  • Activity IX: 1 CE
  • Activity X: 1 CE



Outcome Focused PIP Scoring

Changes

  • Activity VII

    • Evaluation Element 5 is critical

    • MCOs should ensure that data reported in all PIPs are accurate and align with what has been reported in their IDSS.

  • Activity IX

    • Evaluation Elements 3 and 4 have been reversed

    • New criteria for scoring Activity IX

  • Activity X

    • New criteria for scoring Activity X



    Activity IX: Outcome Focused PIP Scoring

    • HSAG Evaluation Elements: Assessing for Real Improvement

      • The remeasurement methodology is the same as the baseline methodology

      • There is documented improvement in processes or outcomes of care

      • There is statistical evidence that observed improvement is true improvement over baseline and across all study indicators

      • The improvement appears to be the result of planned intervention(s)



    Activity X: Outcome Focused PIP Scoring

    • HSAG Evaluation Elements: Assessing for Sustained Improvement

      • Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant across all study indicators



    Outcome Focused PIP Scoring

    • Activity IX

      • Repeated measurement of the indicators demonstrates meaningful change in performance

      • Improvement must be statistically significant for all study indicators to receive an overall Met validation status

      • Is scored on an annual basis until statistically significant improvement over baseline has been achieved for all study indicators

      • Once Evaluation Element 3 receives a Met score, it will remain Met for the duration of the PIP

      • Evaluation elements 3 and 4 are linked



    Outcome Focused PIP Scoring

    • Activity X

      • Repeated measurement of the indicators demonstrates sustained improvement

      • HSAG will not validate Activity X until Evaluation Element 3 of Activity IX is Met

      • Once statistically significant improvement has been achieved for all indicators, the MCO will need to document a subsequent measurement period demonstrating sustained improvement in order to receive a Met in Activity X



    Outcome Focused PIP Rationale

    • Overall Met Validation Status

    • The changes align the actual outcomes of the project with the overall validation status

    • Emphasis on statistically significant, sustained improvement in outcomes



    Critical Analysis

    HSAG will be evaluating whether or not…

    • A current causal/barrier analysis was completed. MCOs should conduct an annual causal/barrier and drill-down analysis in addition to periodic analyses of their most recent data, and should include the updated causal/barrier analysis outcomes in their PIPs.



    Critical Analysis

    HSAG will be evaluating whether or not…

    • Barriers and interventions were relevant to the focus of the study and can impact the study indicator(s) outcomes



    Critical Analysis

    For any intervention implemented, the MCO should have a process in place to evaluate the efficacy of the intervention to determine if it is having the desired effect. This evaluation process should be detailed in the PIP documentation. If the interventions are not having the desired effect, the MCO should discuss how it will be addressing these deficiencies and what changes will be made to its improvement strategies.



    Critical Analysis

    The MCO should ensure that the intervention(s) implemented will impact the study indicator(s) outcomes.

    • Member-focused interventions will not impact a study indicator measuring the quality of service provided by a PCP, such as the WCC HEDIS measure (Childhood Obesity PIP)

    • Interventions focused on educating MCO staff on HEDIS measures will not impact members accessing care and seeking well-child visits



    Critical Analysis

    The MCO should be cognizant of the timing of interventions. Interventions implemented in the last few months of the year will not have been in place long enough to have an impact on the results.



    Questions and Answers

