Office of Quality and Performance


Presentation Transcript


  1. Office of Quality and Performance. SD Fihn, MD, MPH. December 12, 2008.

  2. Office of Quality and Safety • Associate Deputy USH for Quality & Safety (10G): William E. Duncan, MD, PhD, MACP • Special Advisor: Ashish Jha, MD, MPH • VHA Quality and Safety Advisory Committee • Assistant Deputy USH for Quality & Safety: Peter Almenoff, MD, FCCP • Chief Quality & Performance Officer (10Q): Stephan D. Fihn, MD, MPH, FACP • Chief Patient Safety Officer (10Q): James P. Bagian, MD, PE • Quality and Safety Analytics: Office of Quality & Performance, OPES* (Eileen Moran, MS), National Center for Patient Safety, IPEC (Marta Render, MD) • IPEC = Inpatient Evaluation Center; OPES = Office of Productivity, Efficiency & Staffing

  3. OQP Team • Leadership: Joe Francis, Roxane Rusch, Mark Enderle • Programs: Evidence-Based Practice (Carla Cassidy), Performance Measurement (Tammy Czarnecki), Analysis (Steve Wright), C&P (Kate Enchelmayer), Accreditation (Dody Tyler), Pt. Experience (John Elter) • Staff • Collaborations: PCS, 10N, Public Health, ONS, ORD

  4. Figures • Total budget ~$38 million (minus IT) • ~$30 million in contracts: EPRP, SHEP, TJC, CARF, URAC, CPGs, VetPRO • Pending: Protected Peer Review • ~40 employees (61 presently authorized) • Located in Washington DC, Durham, Providence, Iowa, Tucson, Seattle, Detroit, and Fayetteville, AR

  5. Major Functions/Programs • Evidence Based Practice • Performance Measurement • Continuous Improvement and Practice Optimization • Utilization Management • Risk Management • Peer Review • Professional Credentialing and Privileging • Analytic Resources • Accreditation

  6. Evidence Based Practice • Cataloguing clinical evidence • Clinical practice guideline development and adoption with companion tools • Collaborative with Dept. of Defense • Collaborations and partnerships • Health Services Research and Development • Inter-Agency liaisons

  7. Current Clinical Practice Guidelines (14 in process of update): Post Deployment Health • Uncomplicated Pregnancy • Major Depressive Disorder • PTSD • Psychosis • Substance Abuse Disorder • Medically Unexplained Symptoms • Opioid Use in Chronic Pain • Mild TBI • Post Operative Pain • Bio/Chem/Rad/Blast Injury • Tobacco Use Cessation • Obesity • Amputation • Disease Prevention • Heart Failure • Hypertension • Ischemic Heart Disease • Dyslipidemia • Diabetes Mellitus • Pre End Stage Renal Disease • COPD • Stroke Rehabilitation • Acute Stroke Rehabilitation • Dysuria • Asthma (Adult and Pediatric) • GERD • Glaucoma • Erectile Dysfunction • Low Back Pain

  8. Future Course [diagram: evidence feeding decision support, performance measures, appropriateness measures, clinical guidelines, formulary, and clinical reminders into clinical processes] • Expansion of evidence program • Partnerships • Beyond CPGs • Department-wide evidence base • Creation of coordinated "evidence-driven products": CPGs (new formats), clinical pathways, QI initiatives, clinical reminders, decision support

  9. Performance Measurement • Evidence-based measurement • Leverage of electronic data sources • Goal and target setting • Inter-agency collaboration • Initiatives in support of transparency and interoperability • Enhanced performance reporting

  10. VA Perspective • VA early adopter of PMs – initiated in 1996 • Initially manual abstraction of clinical data from randomly selected records (EPRP) • Has evolved to include data from additional sources including Austin, PBM, DSS, VistA • Multiple domains other than clinical • Reliance largely on audit/feedback • Performance contracts with senior leadership • Integral to transformation

  11. Performance Improvement • Prevention Index • 1996: • Influenza & pneumococcal vaccination • Breast, cervical, colorectal, prostate cancer screening • Screening for tobacco and problem alcohol use; counseling for tobacco cessation • 2004: • Added hyperlipidemia screening • Prostate Ca screening → education/counseling • Similar story for most original PMs

  12. Performance Measurement • Current System: 101 Performance Measures (does not include self-report/transformational measures) • Mission critical (ECF plan): 50 clinical, 2 satisfaction & 1 access • Non-mission-critical performance measures: 17 clinical, 27 access, functional status, & 2 non-clinical • 157 Supporting Indicators: 94 clinical, 27 non-clinical, 10 functional status, & 10 access • Performance is >90% on nearly 50% of measures

  13. Limitations of Current PM System • Data for many measures, including HEDIS and ORYX, obtained via EPRP, which is dependent on manual review of medical records • Expensive • Lag between collection and feedback • Relatively small samples of patients are inadequate to compare important subgroups such as sex, ethnic group, mental health patients, disabled, etc. • Introducing new measures is cumbersome • Reliant upon the quality of documentation and the abstractor's ability to accurately identify documentation • Focuses attention on mechanics of measurement rather than improvement • Measurement criteria often change → noncomparable data over time • Aggregate measures difficult to interpret and may lead to inaccurate conclusions about overall performance • Measures lacking in many critical areas such as acute care, population health, and use of community resources • Adequate mechanisms for collection of data sometimes lacking

  14. Framework for adopting PMs [diagram: clinical guidelines feeding performance measures, weighed against reliability & validity of the measure, IT support, cost, opportunities for improvement, clinical burden, magnitude of effect, and relevance to strategic goals]

  15. ProClarity Cube • Dozens of pre-built graphical views for the field • VA staff who have ProClarity Desktop Professional can link to our cube and build their own custom views to support specialized business processes.

  16. 9 Composite Measures (inpatient and outpatient composites; number of component measures in parentheses): AMI (11), CHF (4), Diabetes (7), Ischemic heart disease (3), Prevention (7), Pneumonia (9), Surgical Quality (SCIP) (8), Behavioral health screening (3), Tobacco follow-up (3)
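The slide lists each composite with its number of component measures but does not say how component results are rolled up into one score. A minimal Python sketch of one common approach, the opportunity model (pooling numerators and denominators across components), is below; the component scores and the choice of scoring model are illustrative assumptions, not the VA's documented method.

```python
# Opportunity-model composite: pooled passes over pooled opportunities.
# NOTE: the scoring model and all numbers are illustrative assumptions.

def opportunity_composite(components):
    """components: list of (numerator, denominator) pairs, one per component measure."""
    passed = sum(num for num, _ in components)
    eligible = sum(den for _, den in components)
    return passed / eligible if eligible else None

# Hypothetical 7-component diabetes composite (cf. Diabetes (7) above)
diabetes_components = [(88, 100), (92, 100), (75, 95), (81, 90),
                       (70, 85), (66, 80), (90, 100)]
print(f"Composite score: {opportunity_composite(diabetes_components):.1%}")
```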

  17. [Chart: facility scores plotted against the national mean, flagged as within 2 SD of the mean or greater than 2 SD from the mean]

  18. [Chart: facilities falling 2 SD below the mean]
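The two charts above flag facilities by how far their scores fall from the mean. A short Python sketch of that flagging rule follows; the facility names and scores are fabricated, and treating the listed facilities as the whole population (hence the population standard deviation) is an assumption.

```python
# Flag facilities whose score is more than 2 SD from the mean of all facilities.
# Facility names and scores are fabricated; population SD is an assumption.
from statistics import mean, pstdev

scores = {"Facility A": 0.91, "Facility B": 0.88, "Facility C": 0.93,
          "Facility D": 0.72, "Facility E": 0.90, "Facility F": 0.89}

mu = mean(scores.values())
sd = pstdev(scores.values())

for facility, score in scores.items():
    if abs(score - mu) > 2 * sd:
        status = "greater than 2 SD from the mean"
    else:
        status = "within 2 SD of the mean"
    print(f"{facility}: {score:.2f} ({status})")
```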

  19. Other Composites – Under Discussion • Access measures: • Create groupings for MH, new patient wait time, established patient wait time? • Patient Satisfaction: • Inpatient & outpatient together or separate? • Incorporate satisfaction into disease-specific composites? • Access?

  20. Performance Alerts • Subscriber-driven Performance Alert System designed to notify senior and line managers of possible issues related to performance measure groupings, e.g., Cardiology, Diabetes Mellitus, Surgical Infection Prevention, Mental Health, and so on.
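A hedged sketch of what such a subscriber-driven alert might look like in Python: managers subscribe to measure groupings and are notified when a measure in a grouping falls below a threshold. The subscriber addresses, groupings, threshold, and scores are all hypothetical placeholders, not the fielded VHA system.

```python
# Hypothetical subscriber-driven performance alert check.
# Subscribers, groupings, threshold, and scores are illustrative only.

subscriptions = {
    "visn.manager@example.gov": ["Diabetes Mellitus", "Cardiology"],
    "facility.director@example.gov": ["Surgical Infection Prevention"],
}

def performance_alerts(results, threshold=0.90):
    """results: {grouping: {measure: score}}; yields one alert per low-scoring measure."""
    for subscriber, groupings in subscriptions.items():
        for grouping in groupings:
            for measure, score in results.get(grouping, {}).items():
                if score < threshold:
                    yield subscriber, grouping, measure, score

results = {
    "Diabetes Mellitus": {"HbA1c control": 0.84, "Annual eye exam": 0.93},
    "Cardiology": {"Beta-blocker at discharge": 0.96},
}
for subscriber, grouping, measure, score in performance_alerts(results):
    print(f"Alert to {subscriber}: {grouping} / {measure} at {score:.0%}")
```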

  21. Performance Measurement Planned Actions: • Reestablish commitment to evidence, value to pt care • Systematic, critical review of existing measures • Migration from manual sampling to automated, longitudinal measures on all patients • Resource survey underway • Potential e-measures for ’09: Mental health, diabetes • Development of meaningful patient-centered measures • Longitudinal (cohort) measures: CHF • Incorporating preferences
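To make the planned migration concrete, the sketch below contrasts a measure estimated from a small manually abstracted sample (EPRP-style) with the same measure computed automatically over every eligible patient from structured electronic data. The patient records, field names, and HbA1c threshold are synthetic assumptions for illustration only.

```python
# Sampled (manual-abstraction style) vs. automated all-patient e-measure.
# Data, field names, and the HbA1c threshold are synthetic assumptions.
import random

random.seed(0)
patients = [{"id": i, "last_hba1c": random.uniform(6.0, 11.0)} for i in range(10000)]

def hba1c_controlled(patient, threshold=9.0):
    return patient["last_hba1c"] < threshold

# Manual-abstraction style: estimate from a small random sample of charts
sample = random.sample(patients, 30)
sample_rate = sum(hba1c_controlled(p) for p in sample) / len(sample)

# Automated e-measure: computed over all eligible patients, every period
full_rate = sum(hba1c_controlled(p) for p in patients) / len(patients)

print(f"Sampled estimate: {sample_rate:.1%}   All-patient e-measure: {full_rate:.1%}")
```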

  22. Analysis and Reporting • Current emphasis on data collection and reporting – minimal analysis • >2/3 of OQP budget devoted to data collection • Major effort to automate reporting with dashboards and ProClarity cubes • Insufficient investment in directed, in-depth analyses of quality issues

  23. SHEP: Transition to CAHPS • SHEP is transitioning from the Picker Dimensions of Care to the Consumer Assessment of Healthcare Providers and Systems (CAHPS) • NRC-Picker and CAHPS SHEP are being administered in parallel for the 4th quarter of FY08 to determine: • Correlation of new questions with old dimensions • Continuity of performance measures • Includes assessment of internal consistency, item response theory, construct validity, factor analysis, and calibration
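Two of the psychometric checks named on the slide, internal consistency and the correlation of new scores with old dimensions, can be illustrated with a small Python sketch: Cronbach's alpha for a set of CAHPS items and a Pearson correlation between the resulting composite and the old Picker dimension score. The response data below are fabricated placeholders, not SHEP results.

```python
# Cronbach's alpha for an item set, and Pearson correlation between the new
# composite and the old dimension score. All survey responses are fabricated.
from statistics import mean, pvariance

def cronbach_alpha(item_scores):
    """item_scores: list of per-item response lists, aligned by respondent."""
    k = len(item_scores)
    item_var = sum(pvariance(item) for item in item_scores)
    totals = [sum(vals) for vals in zip(*item_scores)]
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

cahps_items = [[4, 5, 3, 4, 5], [4, 4, 3, 5, 5], [5, 5, 2, 4, 4]]   # 3 items x 5 respondents
picker_dimension = [3.8, 4.6, 2.9, 4.3, 4.7]                        # old dimension scores
cahps_composite = [mean(vals) for vals in zip(*cahps_items)]

print(f"Cronbach's alpha = {cronbach_alpha(cahps_items):.2f}")
print(f"Correlation with Picker dimension = {pearson_r(cahps_composite, picker_dimension):.2f}")
```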

  24. Implications for QuERI • Research expertise/consultation critical • New PMs • Evidence development (ESPs) • QI tools – identification, grading, cataloguing • SHEP – new modules, health status • Analytic support, e.g., surgical complexity, adjusted outcomes • Suggestions?
