
Introduction to Improving the Patient Experience Series


Presentation Transcript


  1. Introduction to Improving the Patient Experience Series Part 2 – April 7, 2010 Measuring the Patient Experience Tammy Fisher, MPH Director, Quality & Performance Improvement San Francisco Health Plan

  2. Agenda • Purposes of Measurement • Measurement to identify areas for improvement • Tools, methodologies, frequency • Measurement to evaluate impact of changes • Data collection strategies, tools, and methodologies • Measurement to spread and sustain improvements • Tools, methodologies, frequency • Case Study • San Francisco Health Plan • Providing feedback • Strategies

  3. Purposes for Measurement 3

  4. Applying it to Patient Experience • Improvement • Understand impact of changes • Provide rapid feedback – engagement strategy • Convince others to try changes • Accountability • Diagnostic – identify high leverage areas and people for targeted improvements • Sustainability – public reporting, pay for performance • Research – borrow methods • Build a compelling business case to Leadership 4

  5. Measurement Continuum 5

  6. Identify Areas and People for Improvement • Robust surveys • Robust measurement methodologies • Measure annually • Data at the organization and individual provider level • Look at composites strongly correlated with overall ratings of experience 6
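The composite-to-overall-rating check described on this slide can be illustrated with a short sketch. The data and composite names below are hypothetical, chosen to mirror the CAHPS-style domains used later in this deck; the idea is simply to rank composites by how strongly they track the overall rating.

```python
# Sketch: rank survey composites by correlation with the overall rating.
# Composite names and scores are hypothetical, not actual survey results.
from statistics import correlation  # requires Python 3.10+

overall_rating = [9, 7, 10, 6, 8, 9, 5, 10, 7, 8]  # 0-10 overall rating per respondent

composites = {  # composite score per respondent (e.g., on a 1-4 scale)
    "provider_communication": [4, 3, 4, 2, 3, 4, 2, 4, 3, 3],
    "access_to_care":         [3, 3, 4, 2, 3, 3, 2, 4, 2, 3],
    "office_staff":           [4, 2, 3, 3, 3, 4, 3, 3, 3, 2],
}

# Composites with the strongest correlation are candidates for targeted improvement.
ranked = sorted(
    ((name, correlation(overall_rating, scores)) for name, scores in composites.items()),
    key=lambda pair: abs(pair[1]),
    reverse=True,
)
for name, r in ranked:
    print(f"{name}: r = {r:.2f}")
```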

  7. Validated Surveys • Clinician Group CAHPS Survey • https://www.cahps.ahrq.gov/content/products/CG/PROD_CG_CG40Products.asp?p=1021&s=213 • Clinician Group CAHPS Visit Survey • https://www.cahps.ahrq.gov/content/products/CG/PROD_CG_CG40Products.asp?p=1021&s=213 • PBGH Short PAS Survey • PAS website: http://www.cchri.org/programs/programs_pas.html • Short PAS survey: http://www.calquality.org/programs/patientexp/documents/Short_Form_Survey_PCP_feb2010.doc 7

  8. Survey Options 8

  9. Robust Methodologies • Mail administration • 3 waves of mailing (initial mail, postcard reminder, second mail) • Telephone administration • At least 6 attempts across different days of the week and times of day • Mixed mail and telephone administration • Boost mail survey response by adding telephone administration 9

  10. Tips • Survey • Include questions that matter most to consumers • Questions that ask about care experience • Applicability across heterogeneous populations • Demonstrates strong psychometric properties • Reporting • Includes internal and external benchmarks • Methodology • Appropriate sampling (reduce bias, large samples) • Standardized protocols • Timeframe – in the last 12 months • Frequency • Annually 10
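As a rough illustration of "appropriate sampling (reduce bias, large samples)", here is a minimal sketch of the standard margin-of-error arithmetic for a survey proportion. The margin of error, expected proportion, and response rate used here are assumptions, not figures from the presentation.

```python
# Sketch: approximate sample size for estimating a survey proportion.
# Margin of error, expected proportion, and response rate are assumptions.
import math

def required_sample(margin_of_error=0.05, expected_p=0.5, z=1.96, response_rate=0.35):
    """Return (completed surveys needed, surveys to field) for a 95% confidence interval."""
    n_complete = (z ** 2) * expected_p * (1 - expected_p) / margin_of_error ** 2
    n_fielded = n_complete / response_rate  # inflate for expected non-response
    return math.ceil(n_complete), math.ceil(n_fielded)

completes, fielded = required_sample()
print(f"~{completes} completed surveys needed; field ~{fielded} at a 35% response rate")
```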

  11. Evaluate Impact of Changes • Data collection tool specific to changes tested • Methodologies that allow for sequential testing – small samples, less standardization • Data given to individuals testing changes • Frequent feedback – daily, weekly, monthly • Inexpensive methods 11

  12. Data Collection Tools • Point of service surveys • Telephonic surveys • Comment cards • Patient exit surveys • Focus groups • Kiosks, via web 12

  13. Point of Service • Good for measuring the effect of changes tested • Focus on meaningful measures • Have 4-6 response choices • Include 8-20 measures • Document collection methodology; train staff collecting information • Collect “just enough” data • Have at least 15 completed surveys and 15 measurement points • Easy to develop reports • Data collection is burdensome! 13
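One way to keep the data collection "just enough" is to tabulate point-of-service responses as top-box percentages and hold back results until the minimum of 15 completed surveys is reached. The sketch below assumes the "Yes, definitely / Yes, somewhat / No" scale shown on the sample comment card later in this deck; the question names and responses are made up.

```python
# Sketch: summarize point-of-service responses as "top box" percentages,
# reporting only once at least 15 completed surveys are in hand.
# Question names and responses are illustrative.
from collections import defaultdict

MIN_COMPLETES = 15
TOP_BOX = "Yes, definitely"

surveys = [  # each completed survey: question -> response
    {"listened_carefully": "Yes, definitely", "explained_clearly": "Yes, somewhat"},
    {"listened_carefully": "Yes, definitely", "explained_clearly": "Yes, definitely"},
    # ... more completed surveys ...
]

counts = defaultdict(lambda: {"top": 0, "total": 0})
for survey in surveys:
    for question, response in survey.items():
        counts[question]["total"] += 1
        counts[question]["top"] += (response == TOP_BOX)

for question, c in counts.items():
    if c["total"] < MIN_COMPLETES:
        print(f"{question}: only {c['total']} responses so far, not reported yet")
    else:
        print(f"{question}: {100 * c['top'] / c['total']:.0f}% top box ({c['total']} responses)")
```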

  14. Telephonic Surveys • More rapid feedback than mailed surveys • Typically less expensive • Outside vendors do it and provide reports • Easy to manipulate data for reporting • Less frequent – monthly data at best • Literature suggests more bias than mailed surveys 14

  15. Sample Comment Card • We would like to know what you think about your visit with Doctor X. • Response choices for each question: □ Yes, Definitely □ Yes, Somewhat □ No • Did Dr. X listen carefully to you? • Did Dr. X explain things in a way that was easy to understand? • Is there anything you would like to comment on further? • Thank you. We are committed to improving the care and services we provide our patients. 15

  16. Patient Exit Interviews • Rapid feedback on changes tested • Not burdensome to collect data • Uncover new issues which may go unreported in surveys • Requires translation of information into actionable behaviors • Providers “see” the feedback • Include 3-5 questions, mix of specific measures and open ended questions • Receptionist or non-clinic member obtains feedback (HP or IPA staff) 16

  17. Spreading & Sustaining Improvements • Survey • Include questions that matter most to consumers • Questions that ask about care experience • Applicability across heterogeneous populations • Demonstrates strong psychometric properties • Reporting • Comparisons within peer group • Methodology • Appropriate sampling (reduce bias, large samples) • Standardized protocols • Risk adjustment • Timeframe – most recent visit • Frequency • Quarterly 17

  18. Case study: SFHP 18

  19. Areas for Improvement • Provider-patient communication, office staff, and access to care • Performed in the lowest quartile • Provider-patient communication and access strongly correlated with overall ratings of care • Office staff support provider-patient communication – team approach 19

  20. Start Small, then Scale Up • Pilot phase (3-10 practices, 6-8 months): learn about getting results at your practices, develop physician and staff champions, understand what it takes from the group to support practice changes • Network rollout (6-12 months): design systems and tools to support changes across many sites • Thanks to Chuck Kilo, MD

  21. Improvement Project • AIM: To improve CAHPS scores by achieving the 50th percentile in the following composites by MY 2012: • Access to care • Provider-patient communication • APPROACH • Begin with 10 pilots • Spread to most providers by MY 2011 21

  22. Purposes for Measurement • For Leadership to know if changes have an impact and to build a compelling case to spread changes to other clinics • For Clinics to get rapid feedback on tests of change to understand their progress towards their own aims 22

  23. Purpose 1 (for Leadership) – Measures & Approach 23

  24. Patient Ratings of their Care • Standardized survey instrument based on the Clinician-Group CAHPS visit survey, about 30 questions • Administered at the point of care by clinic • SFHP provides surveys in 3 languages (English, Spanish, Chinese) and picks up surveys on Friday of each week • Defined methodology – all patients, given after the visit • Five fielding periods: April 2010, July 2010, Oct 2010, Jan 2011, April 2011 • Each fielding period is 3 weeks • Risk adjusted results at the provider level with roll up at clinic level • Extra incentives – up to $500 per clinic 24
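The slide does not say which risk-adjustment model SFHP used. As one common approach, a regression-based case-mix adjustment (adjust each respondent's rating for patient characteristics, then average by provider and roll up to the clinic level) might look like the sketch below; the covariates and data are hypothetical.

```python
# Sketch: regression-based case-mix adjustment with hypothetical data.
# Adjusted score = observed rating - rating predicted from patient mix + overall mean.
import numpy as np

# Per-respondent records (all values made up)
providers = np.array(["Dr A", "Dr A", "Dr B", "Dr B", "Dr C", "Dr C"])
clinics   = np.array(["East", "East", "East", "East", "West", "West"])
ratings   = np.array([9.0, 7.0, 8.0, 6.0, 10.0, 9.0])   # 0-10 overall rating
age       = np.array([34.0, 71.0, 45.0, 80.0, 29.0, 55.0])
health    = np.array([4.0, 2.0, 3.0, 1.0, 5.0, 3.0])    # self-rated health, 1-5

# Fit rating ~ intercept + age + self-rated health by ordinary least squares.
X = np.column_stack([np.ones(len(ratings)), age, health])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
adjusted = ratings - X @ coef + ratings.mean()  # observed minus expected, re-centered

# Provider-level results, then rolled up to the clinic level.
for level, labels in (("provider", providers), ("clinic", clinics)):
    for name in np.unique(labels):
        print(f"{level} {name}: adjusted mean = {adjusted[labels == name].mean():.2f}")
```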

  25. Clinic/Practice Site Satisfaction • Survey instrument based on Dartmouth and Tantau & Associates instruments, about 20 questions • Administered online by SFHP • SFHP sends a link to complete the survey online • Anonymous; results can be aggregated by role • Five fielding periods: March 2010, June 2010, Sept 2010, Dec 2010, March 2011 • Each fielding period is 2 weeks • Results at the clinic level 2 weeks following the close of the measurement period 25

  26. Purpose 2 (for Clinics) – Measures & Approach 26

  27. Providing feedback 27

  28. Tips • Provide supportive feedback (non-judgmental) • Include peer comparisons, targets, explanation of measures, show trended data over 2-3 years, identify “actionable behaviors” • Meet 1:1, use peer/clinic group meetings, dashboards, distribute via mail/email/web • Include testimonials from providers and patients – “stories” • Encourage peer-to-peer interactions to follow up with providers 28

  29. How Data is Displayed is Important
  • Pre/post data collection
    + larger samples, can test for statistical significance
    + easy to interpret data
    - may miss an opportunity to intervene – results masked by natural variation
    - can’t measure sustainability
  • Run charts
    - hard to interpret
    - need enough data to establish trends
    + analyze variation and pinpoint when improvement occurred
    + measures process and ability to act on “slippage”
    + frequent feedback over time
    + evaluate sustainability
  • Narrative
    + hear the patient’s voice – see their comments
    + get data quickly
    - hard to identify trends and pinpoint areas for improvement
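For the pre/post option above, "can test for statistical significance" typically means something like a two-proportion z-test on the top-box rate before and after a change. A minimal sketch, with made-up counts:

```python
# Sketch: two-proportion z-test for a pre/post change in "top box" rate.
# Counts below are made up for illustration.
import math

pre_top, pre_n = 120, 200     # pre-change: 60% top box
post_top, post_n = 150, 210   # post-change: ~71% top box

p_pre, p_post = pre_top / pre_n, post_top / post_n
p_pooled = (pre_top + post_top) / (pre_n + post_n)
se = math.sqrt(p_pooled * (1 - p_pooled) * (1 / pre_n + 1 / post_n))
z = (p_post - p_pre) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

print(f"pre {p_pre:.1%} -> post {p_post:.1%}, z = {z:.2f}, p = {p_value:.3f}")
```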

  30. © Pacific Business Group on Health

  31. © Pacific Business Group on Health

  32. Run Chart

  33. Run Chart
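The two run-chart slides above are image-only in this transcript. As a sketch of how such a chart might be produced (weekly top-box percentages plotted against a median reference line, with made-up data):

```python
# Sketch: run chart of weekly "top box" percentages with a median reference line.
# Data points are made up for illustration.
import statistics
import matplotlib.pyplot as plt

weeks = list(range(1, 13))
top_box_pct = [58, 61, 59, 62, 60, 63, 66, 65, 68, 70, 69, 72]

plt.plot(weeks, top_box_pct, marker="o", label="% top box")
plt.axhline(statistics.median(top_box_pct), linestyle="--", label="median")
plt.xlabel("Week of data collection")
plt.ylabel("% 'Yes, definitely'")
plt.title("Run chart: provider communication top-box rate")
plt.legend()
plt.show()
```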
