Introduction to Improving the Patient Experience Series

Part 2 – March 9, 2011

Measuring the Patient Experience

Tammy Fisher, MPH

Director, Quality & Performance Improvement

San Francisco Health Plan


Agenda

  • Purposes of Measurement

  • Measurement to identify areas for improvement

    • Tools, methodologies, frequency

  • Measurement for testing & implementing changes

    • Data collection strategies, tools, and methodologies

  • Measurement to spread and sustain improvements

    • Tools, methodologies, frequency

  • Lessons Learned from the field

    • San Francisco Health Plan



Applying it to Patient Experience

  • Research

    • Source for changes to try

    • Helps build “will” to try changes

  • Improvement

    • Understand impact of changes quickly

    • Provide rapid feedback – engagement strategy

    • Convince others to try changes

  • Accountability

    • Sustainability- public reporting, pay for performance




Identify Areas and People for Improvement

  • Robust surveys

  • Robust measurement methodologies

  • Review trended results

  • Data at the organization and individual provider level

  • Look at composites strongly correlated with overall ratings of experience

  • Align areas with strategic goals and “organizational or clinic energy”
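The idea of focusing on composites strongly correlated with overall ratings can be sketched in a few lines of Python. All of the response data below is made up for illustration; it is not from any real CAHPS sample:

```python
# Illustrative sketch: rank survey composites by their correlation with
# the overall rating to pick improvement targets. All data is made up.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient for two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One entry per respondent: composite scores (1-4) and overall rating (0-10).
overall = [9, 6, 8, 10, 5, 7, 9, 4]
composites = {
    "communication": [4, 2, 3, 4, 2, 3, 4, 1],
    "access":        [3, 2, 3, 4, 1, 3, 4, 2],
    "office_staff":  [4, 4, 2, 3, 3, 2, 4, 3],
}

# Composites listed from strongest to weakest correlation with the rating.
ranked = sorted(composites, key=lambda c: pearson(composites[c], overall),
                reverse=True)
```

With this toy data, communication and access rank above office staff, the same pattern the deck reports from its own survey results.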




Surveys

  • Clinician Group CAHPS Survey

    • https://www.cahps.ahrq.gov/content/products/CG/PROD_CG_CG40Products.asp?p=1021&s=213

  • PBGH Short PAS Survey

    • PAS website: http://www.cchri.org/programs/programs_pas.html

    • Short PAS survey: http://www.calquality.org/programs/patientexp/documents/Short_Form_Survey_PCP_feb2010.doc

  • Other surveys – Press Ganey and Avatar



    Survey Options



    Robust Methodologies

    • Mail administration

      • 3 waves of mailing (initial mail, postcard reminder, second mail)

    • Telephone administration

      • At least 6 attempts across different days of the week and times of day

    • Mixed mail and telephone administration

      • Boost mail survey response by adding telephone administration
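As a rough sketch, the telephone protocol above (at least six attempts across different days of the week and times of day) might be scheduled as follows. The day and time labels are illustrative, not part of the original materials:

```python
# Illustrative scheduling of the protocol above: up to six phone attempts
# spread across different weekdays and times of day. Labels are made up.
from itertools import cycle, islice

def phone_attempt_plan(max_attempts=6):
    """Pair cycling weekdays with cycling times of day so consecutive
    attempts land on different days and at different times."""
    days = cycle(["Mon", "Tue", "Wed", "Thu", "Fri"])
    times = cycle(["morning", "afternoon", "evening"])
    return [f"{day} {time}"
            for day, time in islice(zip(days, times), max_attempts)]
```

Because five weekdays and three times of day share no common factor, the pairing only repeats after fifteen attempts, so any six consecutive slots are all distinct.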



    Tips

    • Survey

      • Include questions that matter most to consumers

      • Questions that ask about care experience

      • Applicability across heterogeneous populations

      • Demonstrates strong psychometric properties

      • Sufficient response categories (4 point – 6 point scales)

    • Reporting

      • Includes internal and external benchmarks

    • Methodology

      • Appropriate sampling (reduce bias, large samples)

      • Standardized protocols

      • Timeframe- in the last 12 months

    • Frequency

      • Annually




    Purposes of Measurement

    • For Leadership to know if changes have an impact and to build a compelling case to spread changes to others

    • For providers and staff to get rapid feedback on tests of change to understand their progress towards their own aims and to spread to others in the clinic



    Three Key Questions

    • What are we trying to accomplish? (Aim)

    • How will we know that a change is an improvement? (Measure)

    • What changes can we make that will result in an improvement? (Change)



    AIM Statement



    Selected Changes



    PDSA – Rapid Cycle Improvement

    • Plan: questions and predictions (why?); plan to carry out the cycle
    • Do: carry out the plan; document problems and observations; begin data analysis
    • Check/Study: complete data analysis; compare data to predictions; summarize what was learned
    • Act: what changes are to be made? Next cycle?


    Adapted from the Institute for Healthcare Improvement Breakthrough Series College


    Repeated Uses of PDSA Cycle

    Repeated PDSA cycles build from hunches, theories, and ideas through very small-scale tests, follow-up tests, and wide-scale tests of change to implementation of change. Data collected across the cycles identifies the changes that result in improvement.

    Adapted from the IHI Breakthrough Series College



    Evaluate Impact of Changes

    • Data collection strategies/tools specific to changes tested & implemented

    • Methodologies that allow for sequential testing – small samples, less standardization

    • Data given to individuals testing changes

    • Enough data to know a change is an improvement and to convince others to try it

    • Frequent feedback during testing – daily, weekly, collecting data over time

    • Inexpensive methods




    Data Collection Tools

    • Point of service surveys

    • Telephonic surveys

    • Comment cards

    • Patient exit surveys

    • Focus groups

    • Kiosks, via web

    • Feedback from people doing the changes

    • Observation

    • Patient Advisory Boards



    Point of Service

    • Focus on meaningful measures tied to AIM statement

    • Have 4-6 response choices

    • Include enough measures to appropriately evaluate aspect of care

    • Consistent methodology; train staff collecting information

    • Collect “just enough” data

    • Need 15 measurement points for a run chart

    • Data collection can be burdensome!
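The "15 measurement points for a run chart" rule of thumb above can be sketched as a small helper. The weekly percentages below are invented for illustration:

```python
# Sketch of the run-chart rule above: chart each week's top-box
# percentage against the series median and count runs (consecutive
# points on one side of the median). Very few runs, or one long run,
# suggests a real shift rather than noise. Data is invented.
from statistics import median

def runs_about_median(points):
    """Number of runs above/below the median; points on the median are dropped."""
    med = median(points)
    sides = [p > med for p in points if p != med]
    if not sides:
        return 0
    return 1 + sum(prev != cur for prev, cur in zip(sides, sides[1:]))

# 15 weekly "Yes, definitely" percentages collected during testing.
weekly = [62, 60, 65, 61, 58, 63, 66, 64, 67, 70, 69, 72, 71, 74, 73]
```

Here the off-median points form only two runs (a block below the median followed by a block above it), far fewer than chance alone would produce, which on a standard run-chart test signals a shift.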



    Telephonic Surveys

    • More rapid feedback than mailed surveys

    • Typically less expensive

    • Outside vendors do it and provide reports

    • Easy to manipulate data for reporting

    • Less frequent – monthly data at best

    • Literature suggests more bias than mailed surveys (not so important when testing)



    Sample Comment Card

    Comment Card

    We would like to know what you think about your visit with Doctor X.

    □ Yes, Definitely   □ Yes, Somewhat   □ No

    Did Dr. X listen carefully to you?

    Did Dr. X explain things in a way that was easy to understand?

    Is there anything you would like to comment on further?

    Thank you. We are committed to improving the care and services we provide our patients.



    Patient Exit Interviews

    • Rapid feedback on changes tested

    • Not burdensome to collect data

    • Uncover new issues which may go unreported in surveys

    • Requires translation of information into actionable behaviors

    • Providers “see” the feedback

    • Include 3-5 questions, mix of specific measures and open ended questions




    Spreading & Sustaining Improvements

    • Survey

      • Include questions that matter most to consumers

      • Questions that ask about care experience

      • Applicability across heterogeneous populations

      • Demonstrates strong psychometric properties

    • Reporting

      • Comparisons within peer group

    • Methodology

      • Appropriate sampling (reduce bias, large samples)

      • Standardized protocols

      • Risk adjustment

    • Frequency

      • Monthly, Quarterly



    Another Look at Data

    • Medical Group in Los Angeles



    Lessons Learned: San Francisco Health Plan



    Areas for Improvement

    • Provider-patient communication, office staff, & access to care

      • Performed in the lowest quartile

      • PPC and Access strongly correlated with overall ratings of care

      • Office staff support provider-patient communication – Team approach



    Improvement Project

    • AIM: To improve CAHPS scores by achieving the 50th percentile in the following composites by MY 2012:

      • Access to care

      • Provider-patient communication

    • APPROACH

      • Begin with 10 community clinics

      • Spread to most clinics by MY 2011



    Purposes for Measurement

    • For Leadership to know if changes have an impact and to build a compelling case to spread changes to other clinics

    • For Clinics to get rapid feedback on tests of change to understand their progress towards their own aims and to spread to others in the clinic



    Purpose 1 (for Spread): Measures & Approach



    CAHPS Survey Results

    For this provider, there was an 89% “confidence of change” in the 13% improvement for the measure: “Doctor Spends Enough Time with the Patient”
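The slides do not describe how the "confidence of change" figure was computed. One plausible way to produce such a number is a bootstrap comparison of before/after top-box proportions, sketched here with made-up response data; the slide's actual 89% may come from a different method entirely:

```python
# Hypothetical reading of "confidence of change": bootstrap the
# before/after difference in top-box proportion and report how often
# "after" beats "before". The survey data below is invented.
import random

def confidence_of_change(before, after, n_boot=2000, seed=0):
    """Fraction of bootstrap resamples where the after-proportion exceeds
    the before-proportion (responses coded 1 = top box, 0 = otherwise)."""
    rng = random.Random(seed)  # fixed seed keeps the sketch reproducible
    wins = 0
    for _ in range(n_boot):
        b = sum(rng.choices(before, k=len(before))) / len(before)
        a = sum(rng.choices(after, k=len(after))) / len(after)
        if a > b:
            wins += 1
    return wins / n_boot

# "Doctor spends enough time": 1 = "Yes, definitely", 0 = anything else.
before = [1] * 24 + [0] * 16   # 60.0% top box
after  = [1] * 29 + [0] * 11   # 72.5% top box
```

For these invented samples the function returns a value well above one-half, consistent with an improvement that is probable but not yet conclusive.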



    Patient Ratings of their Care

    • Standardized survey instrument based on the Clinician-Group CAHPS visit survey, about 30 questions

    • Administered at the point of care by clinic

      • SFHP provides surveys in 3 languages (English, Spanish, Chinese) and picks up surveys on Friday of each week

    • Defined methodology – all patients, given after the visit

    • Three fielding periods: April 2010, Oct 2010, Jan 2011

    • Each fielding period is 4 weeks

    • Risk adjusted results at the provider level with roll up at clinic level

    • Patient incentives – two movie tickets/survey

    • Extra incentives – up to $500 per clinic



    Clinic/Practice Site Satisfaction

    • Survey instrument based on work by Dartmouth and Tantau & Associates, about 20 questions

    • Administered online by SFHP

      • SFHP sends a link to complete the survey online

      • Anonymous, results can be aggregated by role

    • Five fielding periods: March 2010, June 2010, Sept 2010, Dec 2010, March 2011

    • Each fielding period is 2 weeks

    • Results at the clinic level 2 weeks following the close of the measurement period



    Purpose 2 (for Clinics): Measures & Approach




    Staff & Patient Feedback

    • “During today’s visit, my experience was excellent! Before today my appointments were not that great, but today I noticed an improvement. A big change! Very helpful, thank you.”

    • “During today’s visit, I noticed the staff with a better attitude towards their work, especially at the front desk.”

    • Our staff and patients are loving the electronic patient discharge summary. Patients are saying: “I now have something to reference back to about my visit. It makes it easy for me to remember what I need to do to take care of my health.” “I feel that I am responsible for my health.” “I have a contract with my doctor.”



    Challenges & Lessons Learned

    • Adapted the CAHPS Visit-Based Survey: low reliabilities and less variation due to few response categories

    • Point of care methodology – introduced a lot of bias

    • Incentives were extremely helpful

    • Low literacy patients needed help with the survey

    • Very high scores on survey – switched from mean to proportional scoring

    • Providers trusted “just enough data” to implement change with their patients
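The switch from mean to proportional ("top-box") scoring mentioned above can be illustrated with a small sketch; the provider responses are invented:

```python
# Sketch of the scoring switch described above. When most raw scores are
# high, mean scoring compresses differences between providers, while
# proportional ("top-box") scoring spreads them out. Data is invented.
def mean_score(responses, top=4):
    """Average response expressed as a percent of the maximum score."""
    return 100 * sum(responses) / (len(responses) * top)

def top_box(responses, top=4):
    """Percent of responses in the highest category."""
    return 100 * sum(r == top for r in responses) / len(responses)

provider_a = [4, 4, 4, 3, 4, 4, 3, 4]   # mostly "Yes, definitely"
provider_b = [4, 3, 3, 4, 3, 3, 4, 3]   # mostly "Yes, somewhat"
```

Here provider A leads by about 9 points on mean scoring but by 37.5 points on top-box scoring, which is why proportional scoring better separates providers on a ceiling-heavy survey.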


