  1. Measuring and Reporting Patients’ Experiences with Their Doctors Process, Politics and Public Reports in Massachusetts Melinda Karp MHQP Director of Programs June 26, 2006

  2. Today’s Objectives • Provide brief background on MHQP as important context for measurement and reporting efforts • Describe evolution of MHQP’s agenda for measuring and reporting patient experiences, including key methodological questions in moving from research to large-scale implementation • Describe stakeholder perspectives and decision points around key reporting issues

  3. Stakeholders at the MHQP “Table” • Provider Organizations: MA Hospital Association; MA Medical Society; 2 MHQP Physician Council representatives • Government Agencies: MA EOHHS; CMS Region 1 • Employers: Analog Devices • Health Plans: Blue Cross Blue Shield of Massachusetts; Fallon Community Health Plan; Harvard Pilgrim Health Care; Health New England; Tufts Health Plan • Consumers: Exec. Director, Health Care for All; Exec. Director, NE Serve • Academic: Harris Berman, MD, Board Chair

  4. The Evolution of MHQP’s Patient Experience Measurement Agenda • 2002-2003: Demonstration project in partnership with The Health Institute (funded by the Commonwealth and RWJF) • 2004-2005: Development of a viable business model for implementing a statewide patient experience survey • 2005-2006: Fielding and reporting of the statewide survey

  5. “1st Generation” Questions: Moving MD-Level Measurement into Practice • Is there enough performance variability to justify measurement? • What sample size is needed for highly reliable estimate of patients’ experiences with a physician? • How much of the measurement variance is accounted for by physicians as opposed to other elements of the system (practice site, network organization, plan)? • What is the risk of misclassification under varying reporting frameworks?
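The variability question is typically answered with an intraclass correlation (ICC): the share of score variance attributable to physicians rather than to patient-level noise. Below is a minimal sketch of a one-way random-effects ICC estimate; the table layout, column names, and data are illustrative assumptions, not MHQP's.

```python
import numpy as np
import pandas as pd

def physician_icc(df: pd.DataFrame, group: str = "physician",
                  score: str = "score") -> float:
    """One-way random-effects ICC(1): between-physician variance as a share
    of total variance, estimated from ANOVA mean squares."""
    g = df.groupby(group)[score]
    k = len(g)                        # number of physicians
    n_j = g.size()                    # patients per physician
    N = n_j.sum()
    grand_mean = df[score].mean()
    ss_between = (n_j * (g.mean() - grand_mean) ** 2).sum()
    ss_within = ((df[score] - g.transform("mean")) ** 2).sum()
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (N - k)
    n0 = (N - (n_j ** 2).sum() / N) / (k - 1)   # effective group size
    return (ms_between - ms_within) / (ms_between + (n0 - 1) * ms_within)

# Illustrative data: 100 physicians, 40 patients each, true ICC of 0.05.
rng = np.random.default_rng(0)
rows = [(d, eff + rng.normal(0, np.sqrt(0.95)))
        for d, eff in enumerate(rng.normal(0, np.sqrt(0.05), 100))
        for _ in range(40)]
df = pd.DataFrame(rows, columns=["physician", "score"])
print(f"estimated physician-level ICC: {physician_icc(df):.3f}")
```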

  6. Sample Size Requirements for Varying Physician-Level Reliability Thresholds
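A chart like this follows from the Spearman-Brown relationship, Rel = n·ICC / (1 + (n-1)·ICC), inverted to solve for n. A sketch, using illustrative ICC values rather than MHQP's estimates:

```python
import math

def patients_needed(icc: float, target: float) -> int:
    """Smallest n with Spearman-Brown reliability n*icc/(1+(n-1)*icc) >= target."""
    return math.ceil(target * (1 - icc) / (icc * (1 - target)))

for icc in (0.03, 0.05, 0.10):            # plausible physician-level ICCs
    for target in (0.70, 0.80, 0.90):     # reliability thresholds
        print(f"ICC={icc:.2f}, target={target:.2f}: "
              f"{patients_needed(icc, target):>4} patients per physician")
```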

  7. Allocation of Explainable Variance: Doctor-Patient Interactions • Communication • Patient trust • Health promotion • Whole-person orientation • Interpersonal treatment Source: Safran et al. JGIM 2006; 21:13-21.

  8. Allocation of Explainable Variance: Organizational/Structural Features of Care Source: Safran et al. JGIM 2006; 21:13-21.
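An allocation like the one in slides 7-8 can be produced with a nested random-effects model, attributing variance to each level of the system. A sketch using statsmodels on synthetic data; the two-level structure (physicians nested in sites), the variance magnitudes, and the column names are illustrative assumptions, not the model from Safran et al.:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data: 20 sites, 4 physicians per site, 25 patients per physician.
rng = np.random.default_rng(1)
rows = []
for s in range(20):
    site_eff = rng.normal(0, 0.5)
    for p in range(4):
        doc_eff = rng.normal(0, 0.8)
        for _ in range(25):
            rows.append((f"s{s}", f"s{s}-d{p}",
                         site_eff + doc_eff + rng.normal(0, 3.0)))
df = pd.DataFrame(rows, columns=["site", "physician", "score"])

# Random intercept per site, plus a physician variance component nested in site.
model = smf.mixedlm("score ~ 1", df, groups="site",
                    vc_formula={"physician": "0 + C(physician)"})
fit = model.fit()

var_site = float(fit.cov_re.iloc[0, 0])   # between-site variance
var_doc = float(fit.vcomp[0])             # between-physician (within-site) variance
var_resid = fit.scale                     # patient-level residual variance
total = var_site + var_doc + var_resid
print(f"site: {var_site / total:.1%}   physician: {var_doc / total:.1%}   "
      f"residual: {var_resid / total:.1%}")
```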

  9. Risk of Misclassification Source: Safran et al. JGIM 2006; 21:13-21.

  10. Risk of Misclassification Source: Safran et al. JGIM 2006; 21:13-21.
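Misclassification risk can be approximated by simulation: draw a physician's true score, add the sampling noise implied by the measure's reliability, and count how often the observed score is confidently placed on the wrong side of a performance cutpoint. The sketch below uses illustrative parameters (standardized scores, a single cutpoint) rather than the actual reporting framework, but shows how a buffer zone suppresses the error rate:

```python
import numpy as np

def misclassification_rate(reliability: float, cutpoint: float = 0.0,
                           buffer: float = 0.0, trials: int = 200_000,
                           seed: int = 0) -> float:
    """Share of physicians whose observed score lands outside the buffer
    ('zone of uncertainty') yet on the wrong side of the cutpoint."""
    rng = np.random.default_rng(seed)
    true = rng.normal(0.0, 1.0, trials)                  # standardized true scores
    error_sd = np.sqrt((1 - reliability) / reliability)  # noise implied by reliability
    observed = true + rng.normal(0.0, error_sd, trials)
    classified = np.abs(observed - cutpoint) > buffer    # confident classifications
    wrong_side = (observed > cutpoint) != (true > cutpoint)
    return float(np.mean(classified & wrong_side))

for buf in (0.0, 0.25, 0.50):   # buffer width in true-score SD units
    rate = misclassification_rate(reliability=0.70, buffer=buf)
    print(f"buffer={buf:.2f} SD: misclassified {rate:.1%}")
```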

  11. MHQP 2005 Statewide Survey • Physician-level survey format • Site-level sampling to support site-level reporting • Estimated sample sizes required to achieve > 0.70 site-level reliability

  12. Site Reliability Chart: Integration [Chart: sample sizes needed to achieve site-level reliability for the integration-of-care domain]

  13. Setting the Stage for Public Reporting: Key Issues for Physicians • What measures get reported • How measures get reported

  14. Percent of Sites with A-Level Reliability by Measure and Survey-Type

  15. Framework for Public Reporting [Chart: Integration of Care scores displayed against 15th, 50th, and 85th percentile benchmarks]

  16. Summary and Implications • With sufficient sample sizes, the C/G CAHPS approach yields data with MD- and site-level reliability > 0.70 • For site-level reliability, the number of MDs per site influences required sample sizes (sketched below) • Risk of misclassification can be held to < 5% by limiting the number of performance categories and creating a buffer (“zone of uncertainty”) around performance cutpoints • Trade-offs are likely between data quality standards (e.g., acceptable “risk”) and data completeness
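The link between MDs per site and required sample size follows from the reliability of a site mean under a balanced nested design: Rel_site = var_site / (var_site + var_MD/m + var_resid/(m*n)) for m MDs with n sampled patients each. A sketch with illustrative variance components (assumptions, not MHQP's estimates):

```python
def site_reliability(var_site: float, var_md: float, var_resid: float,
                     m_mds: int, n_patients: int) -> float:
    """Reliability of a site mean under a balanced nested design."""
    error = var_md / m_mds + var_resid / (m_mds * n_patients)
    return var_site / (var_site + error)

# Illustrative variance shares: site 3%, physician 5%, patient residual 92%.
for m in (2, 5, 10):            # MDs per site
    for n in (10, 20, 40):      # patients sampled per MD
        r = site_reliability(0.03, 0.05, 0.92, m, n)
        print(f"{m:>2} MDs x {n:>2} patients each: site reliability = {r:.2f}")
```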

  17. The Continuing Evolution… 2006-2007 • Engagement around QI activities • Participation in Commonwealth Fund grant to study highest performing practices • Grant proposal to Physician Foundation to develop and pilot integrated clinical-patient experience QI curriculum • Determining achievable benchmarks • Fielding of Specialist Care Survey in 2006/2007 • Repeat Primary Care Survey in 2007

  18. For more information… Melinda Karp, Director of Programs • mkarp@mhqp.org • 617-972-9056 • www.mhqp.org
