
Adjusted Quartile Rankings for the 2011 UDS Clinical Performance Measures


Presentation Transcript


  1. Adjusted Quartile Rankings for the 2011 UDS Clinical Performance Measures Quyen Ngo-Metzger, MD, MPH, Data Branch Chief, Office of Quality and Data, Bureau of Primary Health Care, HRSA; Leiyu Shi, DrPH, MBA, MPA, Professor and Director, Primary Care Policy Center, Johns Hopkins University Bloomberg School of Public Health. October 16, 2012

  2. BPHC Quality Strategy • Aims: Better Care, Healthy People & Communities, Affordable Care • Delivered through an integrated health system built on access, comprehensive services, and integrated services • Strategy Implementation: Use Data and Information; Partnerships/Collaboration for Quality Improvement • Priorities & Goals: Implementation of QA/QI Systems (all health centers fully implement their QA/QI plans); Adoption and Meaningful Use of EHRs (all health centers implement EHRs across all sites and providers); Patient-Centered Medical Home Recognition (all health centers receive PCMH recognition); Improving Clinical Outcomes (all health centers meet/exceed HP2020 goals on at least one UDS clinical measure); Workforce/Team-Based Care

  3. Background • What is the grantee adjusted quartile? • It ranks a grantee's clinical performance against that of other grantees while accounting for differences in the following grantee characteristics: • Percentage of patients who are uninsured, minority, or members of special populations, and EHR use • Grantees are ranked from quartile 1 (the highest-performing 25% of reporting grantees) to quartile 4 (the lowest-performing 25%)

  4. Background • Why is an adjusted ranking necessary? • Differences among grantees in the types of patients served and in EHR use affect their performance on the clinical measures. • Adjusted rankings give grantees additional information for comparing their performance with other grantees on a level playing field.

  5. Approach • How is the ranking determined? • A multivariate model predicts clinical performance based on a grantee's characteristics. • The difference between the predicted and the actually reported clinical performance is used to determine the ranking. • A grantee should expect a high ranking if it performs above what is predicted for a grantee with similar characteristics.

  6. Analysis Methods • Data • The 2011 UDS dataset was used to perform this analysis with a total sample size of 1,128 grantees. • Dependent Variables • The dependent variables are the 13 clinical performance measures, for example: • Childhood immunization % • Diabetes control HbA1c % • Hypertension control % • Pap test % • Low birth weight %

  7. Analysis Methods • Independent Variables • Grantee characteristics selected for adjustment: • % minority patients (includes Hispanic/Latino and non-Hispanic/Latino: Asian, Black/African American, Native Hawaiian/PI, and American Indian/Alaska Native) • % uninsured patients • % homeless patients • % agricultural worker patients • EHR status (non-EHR is the reference group): grantees are considered EHR users if they report more than 70 observations, and non-EHR if they report 70 or fewer
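A minimal sketch of how the EHR flag could be derived from the reported number of observations; the 70-chart threshold comes from this slide, while the function name is an illustrative assumption:

```python
def ehr_flag(n_observations: int) -> int:
    """1 = treated as an EHR user (more than 70 observations reported), 0 = non-EHR reference group."""
    return 1 if n_observations > 70 else 0
```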

  8. Analysis Methods • Exclusion Criteria • Grantees with missing clinical performance measures • Grantees with fewer than 30 total patients in the universe for the low birth weight and prenatal care measures • For all other clinical measures, grantees with fewer than 30 observations (in both the total patient universe and the EHR/sampled charts) on a clinical performance measure
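A minimal sketch of how these exclusion rules might be applied per grantee and measure; the field names (measure, rate, universe, observations) are illustrative assumptions, not actual UDS variable names:

```python
BIRTH_MEASURES = {"low_birth_weight", "prenatal_care"}

def include_grantee(measure, rate, universe, observations):
    """True if a grantee's record for this measure survives the exclusion criteria."""
    if rate is None:                       # missing clinical performance measure
        return False
    if measure in BIRTH_MEASURES:          # require at least 30 patients in the universe
        return universe >= 30
    # Other measures: at least 30 in both the universe and the EHR/sampled charts
    return universe >= 30 and observations >= 30
```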

  9. Analysis Methods Steps • Use the multivariate model to compute parameter estimates [i.e., the regression coefficients or Betas (β)]. • Compute predicted rates based on the estimates from the multivariate linear model. • Calculate the difference between the actual and predicted rates. • Compute the quartile ranking based on the difference between the actual and predicted rates.
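A minimal end-to-end sketch of these four steps, assuming the grantee characteristics have already been assembled into a matrix X and the actual rates into a vector y; the use of ordinary least squares via NumPy and the variable names are assumptions for illustration, not the exact BPHC implementation:

```python
import numpy as np

# X: n_grantees x 5 matrix of grantee characteristics
#    (% minority, % uninsured, % homeless, % agricultural worker, EHR flag)
# y: actual reported performance rates (percent) for one clinical measure

def adjusted_quartiles(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    # Step 1: fit the multivariate linear model to obtain the parameter estimates (Betas)
    X1 = np.column_stack([np.ones(len(y)), X])          # prepend an intercept column
    betas, *_ = np.linalg.lstsq(X1, y, rcond=None)

    # Step 2: compute predicted rates from the fitted model
    predicted = X1 @ betas

    # Step 3: difference between actual and predicted rates
    diff = y - predicted

    # Step 4: quartile ranking on the differences
    # (quartile 1 = the 25% of grantees performing furthest above prediction)
    q75, q50, q25 = np.percentile(diff, [75, 50, 25])
    return np.select([diff >= q75, diff >= q50, diff >= q25], [1, 2, 3], default=4)
```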

  10. Results • Regression results are used to compute predicted performance rates. • The following two examples demonstrate: • A high ranking for a grantee whose actual performance rate exceeds its predicted performance rate. • A low ranking for a grantee whose predicted performance rate exceeds its actual performance rate. • We will use one measure, Childhood Immunization, to illustrate the adjusted analysis process.

  11. Results Table 1. Linear Regression Results: Childhood Immunization Clinical Performance among Grantees, UDS 2011 • The Beta values from Table 1 are used to estimate predicted values for the clinical performance measure (y = 31.0396 + [0.1968 × %minority] + [0 × %uninsured] + [-0.1092 × %homeless] + [0 × %agricultural worker] + [0 × EHR]).
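The fitted equation can be written as a small helper; the coefficients are taken directly from the equation above (the zero Betas for % uninsured, % agricultural worker, and EHR are kept explicit), while the function and argument names are illustrative assumptions:

```python
def predict_child_immunization(pct_minority, pct_uninsured, pct_homeless,
                               pct_ag_worker, ehr):
    """Predicted childhood immunization rate (%) using the Betas reported on this slide."""
    return (31.0396
            + 0.1968 * pct_minority
            + 0.0    * pct_uninsured
            - 0.1092 * pct_homeless
            + 0.0    * pct_ag_worker
            + 0.0    * (1 if ehr else 0))
```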

  12. Results Example 1. Grantee’s actual clinical performance is higher than its predicted performance • Measure: Childhood Immunization • Actual performance rate: 55.56% (ranked in the 2nd quartile without adjustment) • Adjusted health center characteristics: • % minority: 7.02% • % uninsured: 33.59% • % homeless: 4.16% • % agricultural worker: 0.20% • EHR status: Yes • Predicted performance rate: 31.97% • The difference between the grantee’s actual and predicted performance is calculated. • These differences are then ranked for all grantees and placed into quartiles. • Grantee is now ranked in the 1st quartile after adjustment.

  13. Results Example 2. Grantee’s actual clinical performance is lower than its predicted performance • Measure: Childhood Immunization • Actual performance rate: 45.71% (ranked in the 2nd quartile without adjustment) • Adjusted health center characteristics: • % minority: 97.69% • % uninsured: 38.64% • % homeless: 0.40% • % agricultural worker: 0.01% • EHR status: No • Predicted performance rate: 50.22% • The difference between the grantee’s actual and predicted performance is calculated. • These differences are then ranked for all grantees and placed into quartiles. • Grantee is now ranked in the 3rd quartile after adjustment.
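Plugging both example grantees into the helper sketched after slide 11 reproduces the predicted rates quoted above (rounded to two decimals):

```python
# Example 1: 31.0396 + 0.1968*7.02 - 0.1092*4.16  -> 31.97 (actual 55.56, large positive gap, 1st quartile)
ex1 = predict_child_immunization(7.02, 33.59, 4.16, 0.20, ehr=True)

# Example 2: 31.0396 + 0.1968*97.69 - 0.1092*0.40 -> 50.22 (actual 45.71, negative gap, 3rd quartile)
ex2 = predict_child_immunization(97.69, 38.64, 0.40, 0.01, ehr=False)

print(round(ex1, 2), round(ex2, 2))   # 31.97 50.22
```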

  14. Discussion How should we use and interpret the grantee adjusted quartile? • The adjusted quartile rankings are for informational purposes and do not change or replace the grantee’s reported clinical performance. • Grantees should continue to focus on performance improvement based on their reported clinical performance. • BPHC will use the reported clinical performance to assess performance over time. • Grantees should also use the Uniform Data System Health Center Trend report to set attainable goals for future years based on past performance. • BPHC will evaluate the method for making ranking adjustments and will likely refine it further to improve the adjusted rankings.

  15. Discussion • Further information • The adjusted quartile data can be found in the Electronic Handbook (EHB) in the Health Center Performance Comparison report and online in Individual Health Center Data reports at http://bphc.hrsa.gov/uds/view.aspx?q=rlg&year=2011. • The adjusted quartile ranking description and FAQs can be found at http://bphc.hrsa.gov/healthcenterdatastatistics/reporting/index.html. • Direct further questions to OQDComments@hrsa.gov.
