Performance Improvement Projects Technical Assistance

Performance Improvement Projects Technical Assistance
Nursing Home Diversion Programs
Thursday, March 29, 2007
8:30 a.m. – 10:30 a.m.

Cheryl L. Neel, RN, MPH, CPHQ

Manager, Performance Improvement Projects

David Mabb, MS

Sr. Director, Statistical Evaluation


Presentation Outline

  • PIP Overall Comments

  • Aggregate MCO PIP Findings

  • Aggregate NHDP Specific Findings

  • Technical Assistance with Group Activities

    • Study Design

    • Study Implementation

    • Quality Outcomes Achieved

  • Questions and Answers


Key PIP Strategies

  • Conduct outcome-oriented projects

  • Achieve demonstrable improvement

  • Sustain improvement

  • Correct systemic problems


Validity and Reliability of PIP Results

  • Activity 3 of the CMS Validating Protocol: Evaluating overall validity and reliability of PIP results:

    • Met = Confidence/High confidence in reported PIP results

    • Partially Met = Low confidence in reported PIP results

    • Not Met = Reported PIP results not credible




Aggregate Valid Percent Met

[Chart: aggregate percentage of evaluation elements rated Met, by Activity I through X]


NHDP-Specific Findings

  • 20 PIPs submitted

  • Scores ranged from 17% to 75%

  • Average score was 40%

  • Assessed evaluation elements were scored as Met 40% of the time
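
The scores above are shares of evaluation elements rated Met, computed over only the elements that were actually assessed. A minimal Python sketch of that arithmetic follows; the helper name and the example ratings are hypothetical illustrations, not HSAG's scoring tool.

```python
def percent_met(ratings):
    """Share of assessed elements rated Met, ignoring elements
    rated Not Applicable or Not Assessed."""
    assessed = [r for r in ratings if r not in ("Not Applicable", "Not Assessed")]
    if not assessed:
        return 0.0
    return 100.0 * assessed.count("Met") / len(assessed)

# Hypothetical element ratings for a single PIP submission.
ratings = ["Met", "Partially Met", "Met", "Not Met", "Not Assessed"]
print(f"{percent_met(ratings):.0f}% of assessed elements were Met")  # -> 50%
```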



Study Design

Four Components:

  • Activity I. Selecting an Appropriate Study Topic

  • Activity II. Presenting Clearly Defined, Answerable Study Question(s)

  • Activity III. Documenting Clearly Defined Study Indicator(s)

  • Activity IV. Stating a Correctly Identified Study Population



Activity I. Selecting an Appropriate Study Topic

Results:

  • 71 percent of the six evaluation elements were Met

  • 29 percent were Partially Met or Not Met

  • None of the evaluation elements were Not Applicable or Not Assessed


Activity I: Review the Selected Study Topic

HSAG Evaluation Elements:

  • Reflects high-volume or high-risk conditions (or was selected by the State).

  • Is selected following collection and analysis of data (or was selected by the State).

  • Addresses a broad spectrum of care and services (or was selected by the State).

  • Includes all eligible populations that meet the study criteria.

  • Does not exclude members with special health care needs.

  • Has the potential to affect member health, functional status, or satisfaction.

    Bolded evaluation elements show areas for improvement



Activity II. Presenting Clearly Defined, Answerable Study Question(s)

Results:

  • 28 percent of the two evaluation elements were Met

  • 73 percent were Partially Met or Not Met

  • None of the evaluation elements were Not Applicable or Not Assessed


Activity II: Review the Study Question(s)

HSAG Evaluation Elements:

  • States the problem to be studied in simple terms.

  • Is answerable.

    Bolded evaluation elements show areas for improvement



Activity III. Documenting Clearly Defined Study Indicator(s)

Results:

  • 27 percent of the seven evaluation elements were Met

  • 54 percent were Partially Met or Not Met

  • 19 percent were Not Applicable or Not Assessed


Activity III: Review Selected Study Indicator(s)

HSAG Evaluation Elements:

  • Is well defined, objective, and measurable.

  • Is based on practice guidelines, with sources identified.

  • Allows for the study question to be answered.

  • Measures changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.

  • Has available data that can be collected on each indicator.

  • Is a nationally recognized measure, such as HEDIS®, when appropriate.

  • Includes the basis on which each indicator was adopted, if internally developed.

    Bolded evaluation elements show areas for improvement



Activity IV. Stating a Correctly Identified Study Population

Results:

  • 48 percent of the three evaluation elements were Met

  • 52 percent were Partially Met or Not Met

  • None of the evaluation elements were Not Applicable or Not Assessed


Activity IV: Review the Identified Study Population

HSAG Evaluation Elements:

  • Is accurately and completely defined.

  • Includes requirements for the length of a member’s enrollment in the managed care plan.

  • Captures all members to whom the study question applies.

    Bolded evaluation elements show areas for improvement


Group Activity


Study Implementation

Three Components:

  • Activity V. Valid Sampling Techniques

  • Activity VI. Accurate/Complete Data Collection

  • Activity VII. Appropriate Improvement Strategies



Activity V. Presenting a Valid Sampling Technique

Results:

  • 5 out of the 20 PIP studies used sampling.

  • 9 percent of the six evaluation elements were Met.

  • 16 percent were Partially Met or Not Met.

  • 75 percent of the evaluation elements were Not Applicable or Not Assessed.


Activity V: Review Sampling Methods

* This section is only validated if sampling is used.

HSAG Evaluation Elements:

  • Consider and specify the true or estimated frequency of occurrence. (N=5)

  • Identify the sample size. (N=5)

  • Specify the confidence level to be used. (N=5)

  • Specify the acceptable margin of error. (N=5)

  • Ensure a representative sample of the eligible population. (N=5)

  • Ensure that the sampling techniques are in accordance with generally accepted principles of research design and statistical analysis. (N=5)

    Bolded evaluation elements show areas for improvement
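
Several of the elements above (estimated frequency of occurrence, confidence level, margin of error) feed a standard sample-size calculation for estimating a proportion. A minimal sketch follows, assuming the usual normal-approximation formula with an optional finite-population correction; the example numbers are illustrative assumptions, not values from any submitted PIP.

```python
import math

def sample_size(p, margin, z=1.96, population=None):
    """Required n to estimate a proportion p within +/- margin;
    z = 1.96 corresponds to a 95 percent confidence level."""
    n = (z ** 2) * p * (1 - p) / margin ** 2
    if population is not None:
        # Finite-population correction for a small eligible population.
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# 50% estimated frequency, 95% confidence, +/-5% margin of error,
# eligible population of 2,000 members (all hypothetical).
print(sample_size(p=0.5, margin=0.05, population=2000))  # -> 323
```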


Populations or Samples?

All (the population) or some (a sample)? Generally,

  • Administrative data uses populations

  • Hybrid (chart abstraction) method uses samples identified through administrative data
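
As a concrete illustration of the hybrid method described above, a minimal sketch follows: the eligible population is identified from administrative data, and a random sample is drawn for chart abstraction. The member IDs, population size, and sample size are all hypothetical.

```python
import random

# Eligible population identified from administrative (claims/enrollment) data.
eligible_population = [f"member-{i:04d}" for i in range(1, 2001)]

random.seed(2007)  # fixed seed so the draw is reproducible for auditing
chart_sample = random.sample(eligible_population, k=323)  # n from a sample-size calculation

print(len(chart_sample), chart_sample[:3])
```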



Activity VI. Specifying Accurate/Complete Data Collection

Results:

  • 25 percent of the eleven evaluation elements were Met

  • 66 percent were Partially Met or Not Met

  • 10 percent of the evaluation elements were Not Applicable or Not Assessed


Activity VI: Review Data Collection Procedures

HSAG Evaluation Elements:

  • Clearly defined data elements to be collected.

  • Clearly identified sources of data.

  • A clearly defined and systematic process for collecting data that includes how baseline and remeasurement data will be collected.

  • A timeline for the collection of baseline and remeasurement data.

  • Qualified staff and personnel to collect manual data.

  • A manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications.

    Bolded evaluation elements show areas for improvement


Activity VI: Review Data Collection Procedures (cont.)

HSAG Evaluation Elements:

  • A manual data collection tool that supports interrater reliability.

  • Clear and concise written instructions for completing the manual data collection tool.

  • An overview of the study in the written instructions.

  • Administrative data collection algorithms that show steps in the production of indicators.

  • An estimated degree of automated data completeness (important if using the administrative method).
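
One common way to demonstrate that a manual data collection tool supports interrater reliability is to have two abstractors review the same charts and compute Cohen's kappa on their determinations. A minimal sketch follows; the abstraction results are hypothetical, and kappa is one of several acceptable agreement statistics.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical judgments on the same records."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # raw agreement
    expected = sum(  # agreement expected by chance from each rater's marginals
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Two abstractors' yes/no determinations for eight sampled charts (hypothetical).
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # -> 0.47, moderate agreement
```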



Baseline Data Sources

  • Medical records

  • Administrative claims/encounter data

  • Hybrid

  • HEDIS

  • Survey data

  • MCO program data

  • Other



Activity VII. Documenting the Appropriate Improvement Strategies

Results:

  • 15 percent of the four evaluation elements were Met

  • 18 percent were Partially Met or Not Met

  • 68 percent of the evaluation elements were Not Applicable or Not Assessed


Activity VII: Assess Improvement Strategies

HSAG Evaluation Elements:

  • Related to causes/barriers identified through data analysis and Quality Improvement (QI) processes.

  • System changes that are likely to induce permanent change.

  • Revised if original interventions are not successful.

  • Standardized and monitored if interventions are successful.

    Bolded evaluation elements show areas for improvement


Determining Interventions

Once you know how you are doing at baseline, what interventions will produce meaningful improvement in the target population?


First, Do a Barrier Analysis

What did an analysis of baseline results show?

How can we relate it to system improvement?

  • Opportunities for improvement

  • Determine intervention(s)

  • Identify barriers to reaching improvement


How Were the Intervention(s) Chosen?

  • By reviewing the literature

    • Evidence-based

    • Pros & cons

    • Benefits & costs

  • Develop a list of potential interventions: what is most effective?


Types of Interventions

  • Education

  • Provider performance feedback

  • Reminders & tracking systems

  • Organizational changes

  • Community-level interventions

  • Mass media


Choosing Interventions

  • Balance

    • potential for success with ease of use

    • acceptability to providers & collaborators

    • cost considerations (direct and indirect)

  • Feasibility

    • adequate resources

    • adequate staff and training to ensure a sustainable effort


Physician Interventions: Multifaceted Most Effective

Most effective:

  • real-time reminders

  • outreach/detailing

  • opinion leaders

  • provider profiles

Less effective:

  • educational materials (alone)

  • formal CME programs without enabling or reinforcing strategies


Patient Interventions

  • Educational programs

    • Disease-specific education booklets

    • Lists of questions to ask your physician

    • Organizing materials: flowsheets, charts, reminder cards

    • Screening instruments to detect complications

    • Direct mailing, media ads, websites


Evaluating Interventions

  • Does it target a specific quality indicator?

  • Is it aimed at appropriate stakeholders?

  • Is it directed at a specific process/outcome of care or service?

  • Did the intervention begin after the baseline measurement period?


Interventions Checklist

  • Analyze barriers (root causes)

  • Choose & understand target audience

  • Select interventions based on cost-benefit

  • Track intermediate results

  • Evaluate effectiveness

  • Modify interventions as needed

  • Remeasure


Group Activity


Quality Outcomes Achieved

Three Components:

  • Activity VIII. Presentation of Sufficient Data Analysis and Interpretation

  • Activity IX. Evidence of Real Improvement Achieved

  • Activity X. Data Supporting Sustained Improvement Achieved



Activity VIII. Presentation of Sufficient Data Analysis and Interpretation

Results:

  • 10 percent of the nine evaluation elements were Met

  • 18 percent of the evaluation elements were Partially Met or Not Met

  • 72 percent of the evaluation elements were Not Applicable or Not Assessed


Activity VIII: Review Data Analysis and Interpretation of Study Results

HSAG Evaluation Elements:

  • Is conducted according to the data analysis plan in the study design.

  • Allows for generalization of the results to the study population if a sample was selected.

  • Identifies factors that threaten internal or external validity of findings.

  • Includes an interpretation of findings.

  • Is presented in a way that provides accurate, clear, and easily understood information.


Activity VIII: Review Data Analysis and Interpretation of Study Results (cont.)

HSAG Evaluation Elements:

  • Identifies initial measurement and remeasurement of study indicators.

  • Identifies statistical differences between initial measurement and remeasurement.

  • Identifies factors that affect the ability to compare initial measurement with remeasurement.

  • Includes the extent to which the study was successful.

    Bolded evaluation elements show areas for improvement


Changes in Study Design?

The study design should be the same as at baseline:

  • Data source

  • Data collection methods

  • Data analysis

  • Target population or sample size

  • Sampling methodology

    If there is a change, the rationale must be specified and appropriate.



Activity IX. Evidence of Real Improvement

Results:

  • 5 percent of the four evaluation elements were Met

  • 5 percent were Partially Met or Not Met

  • 90 percent of the evaluation elements were Not Applicable or Not Assessed


Activity IX: Assess the Likelihood that Reported Improvement is “Real” Improvement

HSAG Evaluation Elements:

  • The remeasurement methodology is the same as the baseline methodology.

  • There is documented improvement in processes or outcomes of care.

  • The improvement appears to be the result of intervention(s).

  • There is statistical evidence that observed improvement is true improvement.

    Bolded evaluation elements show areas for improvement


Statistical Significance Testing
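
A two-proportion z-test is one standard way to provide the statistical evidence that observed improvement is true improvement, as called for in Activity IX. A minimal sketch follows, using only the Python standard library; the baseline and remeasurement rates are hypothetical, not study results.

```python
from math import erf, sqrt

def two_proportion_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value comparing baseline (x1/n1)
    with remeasurement (x2/n2) proportions, pooled under H0."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tail via erf
    return z, p_value

# Indicator rate moves from 120/400 (30%) at baseline to 152/400 (38%).
z, p = two_proportion_z(120, 400, 152, 400)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05: unlikely to be chance alone
```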


Activity X. Data Supporting Sustained Improvement Achieved

No evaluation elements were Met for this Activity.

Results:

  • 0 percent of the one evaluation element was Met

  • 10 percent was Partially Met or Not Met

  • 90 percent was Not Applicable or Not Assessed


Activity X: Assess Whether Improvement is Sustained

HSAG Evaluation Elements:

  • Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant.


Quality Outcomes Achieved

[Diagram: timeline from baseline measurement to first-year remeasurement showing demonstrable improvement, then sustained improvement]


Sustained Improvement

  • Modifications in interventions

  • Changes in study design

  • Improvement sustained for 1 year


HSAG Contact Information

Cheryl Neel, RN, MPH, CPHQ

Manager, Performance Improvement Projects

[email protected]

602.745.6201

Denise Driscoll

Administrative Assistant

[email protected]

602.745.6260


Questions and Answers

