
Introduction to Improving the Patient Experience Series


Presentation Transcript


  1. Introduction to Improving the Patient Experience Series Part 2 – March 9, 2011 Measuring the Patient Experience Tammy Fisher, MPH Director, Quality & Performance Improvement San Francisco Health Plan

  2. Agenda • Purposes of Measurement • Measurement to identify areas for improvement • Tools, methodologies, frequency • Measurement for testing & implementing changes • Data collection strategies, tools, and methodologies • Measurement to spread and sustain improvements • Tools, methodologies, frequency • Lessons Learned from the field • San Francisco Health Plan

  3. Purposes of Measurement 3

  4. Applying it to Patient Experience • Research • Source for changes to try • Helps build “will” to try changes • Improvement • Understand impact of changes quickly • Provide rapid feedback – engagement strategy • Convince others to try changes • Accountability • Sustainability – public reporting, pay for performance 4

  5. Measurement Continuum for Improvement 5

  6. Identify Areas and People for Improvement • Robust surveys • Robust measurement methodologies • Review trended results • Data at the organization and individual provider level • Look at composites strongly correlated with overall ratings of experience • Align areas with strategic goals and sources of “organizational or clinic energy” 6

  7. Example of a Priority Matrix for CAHPS Health Plan Survey Results 7

  8. Surveys • Clinician Group CAHPS Survey • https://www.cahps.ahrq.gov/content/products/CG/PROD_CG_CG40Products.asp?p=1021&s=213 • PBGH Short PAS Survey • PAS website: http://www.cchri.org/programs/programs_pas.html • Short PAS survey: http://www.calquality.org/programs/patientexp/documents/Short_Form_Survey_PCP_feb2010.doc • Other surveys – Press Ganey and Avatar 8

  9. Survey Options 9

  10. Robust Methodologies • Mail administration • 3 waves of mailing (initial mail, postcard reminder, second mail) • Telephone administration • At least 6 attempts across different days of the week and times of day • Mixed mail and telephone administration • Boost mail survey response by adding telephone administration 10

  11. Tips • Survey • Include questions that matter most to consumers • Questions that ask about care experience • Applicability across heterogeneous populations • Strong psychometric properties • Sufficient response categories (4- to 6-point scales) • Reporting • Include internal and external benchmarks • Methodology • Appropriate sampling (reduce bias, large samples) • Standardized protocols • Timeframe – in the last 12 months • Frequency • Annually 11

  12. Measurement for quality improvement 12

  13. Purposes of Measurement • For Leadership to know if changes have an impact and to build a compelling case to spread changes to others • For providers and staff to get rapid feedback on tests of change to understand their progress towards their own aims and to spread to others in the clinic 13

  14. Three Key Questions • What are we trying to accomplish? (Aim) • How will we know that a change is an improvement? (Measure) • What changes can we make that will result in an improvement? (Change) 14

  15. AIM Statement 15

  16. Selected Changes 16

  17. PDSA – Rapid Cycle Improvement • Plan: questions & predictions (why?); plan to carry out the cycle • Do: carry out the plan; document problems and observations; begin data analysis • Check/Study: complete data analysis; compare data to predictions; summarize what was learned • Act: what changes are to be made? next cycle? 17 Adapted from the Institute for Healthcare Improvement Breakthrough Series College

  18. Repeated Uses of PDSA Cycle [Ramp diagram: hunches, theories, and ideas feed very small-scale tests, then follow-up tests, wide-scale tests of change, and implementation of change – changes that result in improvement, with data gathered at each cycle] Adapted from the IHI Breakthrough Series College 18

  19. Evaluate Impact of Changes • Data collection strategies/tools specific to changes tested & implemented • Methodologies that allow for sequential testing – small samples, less standardization • Data given to individuals testing changes • Enough data to know a change is an improvement and to convince others to try it • Frequent feedback during testing – daily, weekly, collecting data over time • Inexpensive methods 19

  20. Monthly Telephonic Surveys 21

  21. Data Collection Tools • Point of service surveys • Telephonic surveys • Comment cards • Patient exit surveys • Focus groups • Kiosks, via web • Feedback from people doing the changes • Observation • Patient Advisory Boards 22

  22. Point of Service • Focus on meaningful measures tied to AIM statement • Have 4-6 response choices • Include enough measures to appropriately evaluate aspect of care • Consistent methodology; train staff collecting information • Collect “just enough” data • Need 15 measurement points for a run chart • Data collection can be burdensome! 23
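The 15-measurement-point guidance above refers to run charts: weekly scores are plotted against their median, and a sustained run on one side of the median signals a non-random change. A minimal sketch in Python of one common run-chart rule (six or more consecutive points on the same side of the median indicate a shift); the weekly percentages are illustrative, not from the deck:

```python
from statistics import median

def run_chart_shift(scores, run_len=6):
    """Flag a 'shift': run_len or more consecutive points on one side of the median.

    Points exactly on the median neither extend nor break a run,
    per standard run-chart rules.
    """
    center = median(scores)
    run, side = 0, 0
    for s in scores:
        if s == center:
            continue  # skip points on the centerline
        cur = 1 if s > center else -1
        run = run + 1 if cur == side else 1
        side = cur
        if run >= run_len:
            return True
    return False

# 15 illustrative weekly "Yes, definitely" percentages from a point-of-service survey
weekly = [62, 60, 64, 61, 63, 59, 62, 65, 70, 71, 69, 72, 74, 73, 75]
print(median(weekly), run_chart_shift(weekly))  # 65 True
```

With 15 points the median centerline is stable enough for the run rule to be meaningful, which is why fewer measurement points make the chart hard to interpret.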

  23. Telephonic Surveys • More rapid feedback than mailed surveys • Typically less expensive • Outside vendors do it and provide reports • Easy to manipulate data for reporting • Less frequent – monthly data at best • Literature suggests more bias than mailed surveys (not so important when testing) 24

  24. Sample Comment Card Comment Card We would like to know what you think about your visit with Doctor X. • Did Dr. X listen carefully to you? □ Yes, Definitely □ Yes, Somewhat □ No • Did Dr. X explain things in a way that was easy to understand? □ Yes, Definitely □ Yes, Somewhat □ No • Is there anything you would like to comment on further? Thank you. We are committed to improving the care and services we provide our patients. 25

  25. Patient Exit Interviews • Rapid feedback on changes tested • Not burdensome to collect data • Uncover new issues which may go unreported in surveys • Requires translation of information into actionable behaviors • Providers “see” the feedback • Include 3-5 questions, mix of specific measures and open ended questions 26

  26. Patient Visit Walk-through

  27. Spreading & Sustaining Improvements • Survey • Include questions that matter most to consumers • Questions that ask about care experience • Applicability across heterogeneous populations • Strong psychometric properties • Reporting • Comparisons within peer group • Methodology • Appropriate sampling (reduce bias, large samples) • Standardized protocols • Risk adjustment • Frequency • Monthly, Quarterly 28

  28. Another Look at Data • Medical Group in Los Angeles 29

  29. Lessons Learned: San Francisco Health Plan 30

  30. Areas for Improvement • Provider-patient communication, office staff, & access to care • Performed in the lowest quartile • Provider-patient communication and access strongly correlated with overall ratings of care • Office staff support provider-patient communication – team approach 31

  31. Improvement Project • AIM: To improve CAHPS scores by achieving the 50th percentile in the following composites by MY 2012: • Access to care • Provider-patient communication • APPROACH • Begin with 10 community clinics • Spread to most clinics by MY 2011 32

  32. Purposes for Measurement • For Leadership to know if changes have an impact and to build a compelling case to spread changes to other clinics • For Clinics to get rapid feedback on tests of change to understand their progress towards their own aims and to spread to others in the clinic 33

  33. Purpose 1 (for Spread): Measures & Approach 34

  34. CAHPS Survey Results For this provider, there was an 89% “confidence of change” in the 13% improvement for the measure: “Doctor Spends Enough Time with the Patient” 35
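The slide does not say how the 89% “confidence of change” figure was computed. One plausible way to estimate such a probability is a bootstrap comparison of before/after top-box responses; the sketch below assumes that approach, and the response counts are purely illustrative:

```python
import random

def confidence_of_improvement(before, after, n_boot=4000, seed=1):
    """Bootstrap estimate of the probability that the 'after' top-box rate
    exceeds the 'before' rate (responses coded 1 = top box, 0 = not)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_boot):
        b = sum(rng.choice(before) for _ in before) / len(before)
        a = sum(rng.choice(after) for _ in after) / len(after)
        wins += a > b
    return wins / n_boot

# Illustrative data: 50% top box before, 63% after (a 13-point improvement)
before = [1] * 50 + [0] * 50
after = [1] * 63 + [0] * 37
conf = confidence_of_improvement(before, after)
```

With ~100 responses per period, a 13-point improvement yields a high but not certain confidence of change, which is the kind of nuance this framing conveys better than a bare point estimate.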

  35. Patient Ratings of their Care • Standardized survey instrument based on the Clinician-Group CAHPS visit survey, about 30 questions • Administered at the point of care by clinic • SFHP provides surveys in 3 languages (English, Spanish, Chinese) and picks up surveys on Friday of each week • Defined methodology – all patients, given after the visit • Three fielding periods: April 2010, Oct 2010, Jan 2011 • Each fielding period is 4 weeks • Risk adjusted results at the provider level with roll up at clinic level • Patient incentives – two movie tickets/survey • Extra incentives – up to $500 per clinic 36

  36. Clinic/Practice Site Satisfaction • Survey instrument based on the Dartmouth and Tantau & Associates, about 20 questions • Administered online by SFHP • SFHP sends a link to complete the survey online • Anonymous, results can be aggregated by role • Five fielding periods: March 2010, June 2010, Sept 2010, Dec 2010, March 2011 • Each fielding period is 2 weeks • Results at the clinic level 2 weeks following the close of the measurement period 37

  37. Purpose 2 (for Clinics): Measures & Approach 38

  38. Point of Care Survey 39

  39. Staff & Patient Feedback • “During today’s visit, my experience was excellent! Before today my appointments were not that great, but today, I noticed an improvement – a big change! Very helpful, thank you.” • “During today’s visit, I noticed the staff with a better attitude towards their work, especially at the front desk.” • Our staff and patients are loving the electronic patient discharge summary. The patients are saying: “I now have something to reference back to about my visit. It makes it easy for me to remember what I need to do to take care of my health.” “I feel that I am responsible for my health.” “I have a contract with my doctor.” 40

  40. Challenges & Lessons Learned • Adapted the CAHPS visit-based survey – low reliabilities and less variation with few response categories • Point-of-care methodology introduced a lot of bias • Incentives were extremely helpful • Low-literacy patients needed help with the survey • Very high scores on the survey – switched from mean to proportional scoring • Providers trusted “just enough data” to implement change with their patients 41
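The switch from mean to proportional scoring mentioned above addresses ceiling effects: when most responses cluster at the top of the scale, means barely move, while the proportion choosing the best category (“top-box” scoring) still separates performers. A minimal sketch in Python with illustrative 4-point responses (the providers and numbers are hypothetical):

```python
def mean_score(responses, scale_max=4):
    """Mean scoring: average response, rescaled to 0-100."""
    return 100 * (sum(responses) / len(responses) - 1) / (scale_max - 1)

def top_box_score(responses, scale_max=4):
    """Proportional ('top-box') scoring: percent choosing the best category."""
    return 100 * sum(r == scale_max for r in responses) / len(responses)

# Illustrative 4-point responses (4 = "Yes, definitely") from two providers
provider_a = [4] * 70 + [3] * 28 + [2] * 2
provider_b = [4] * 85 + [3] * 13 + [2] * 2

# Mean scores differ by ~5 points; top-box scores differ by 15 points
print(round(mean_score(provider_a), 1), round(mean_score(provider_b), 1))  # 89.3 94.3
print(top_box_score(provider_a), top_box_score(provider_b))  # 70.0 85.0
```

The same response shift that moves the mean by five points moves the top-box proportion by fifteen, which is why proportional scoring is the more discriminating choice when scores run very high.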
