
AHRQ State and Regional Demonstration Project Evaluation:

This project aims to improve healthcare communication and efficiency through data exchange among healthcare facilities in a three-county region. The evaluation will assess system usability, costs, and clinical outcomes through qualitative research and a user-satisfaction questionnaire. The project also aims to reduce inpatient admissions, duplicate testing, and expenses in the emergency department (ED) while improving workflow efficiency. Activity-based data collection and use of the data vault as the primary data source will be used to assess the impact on length of stay (LOS) and clinical outcomes.


Presentation Transcript


  1. AHRQ State and Regional Demonstration Project Evaluation: Barbeque, Blues, Beneficial Technology Kevin B. Johnson, MD, MS Associate Professor, Biomedical Informatics, Vanderbilt University Medical Center Nashville, Tennessee

  2. Project Overview

  3. Project Drivers • Incomplete information increases admission rate and ED LOS • Poor communication impacts ED efficiency • Less patient data at the point of care impacts the rate of test ordering • Less patient data at the point of care impacts clinical outcomes

  4. Data Exchange Has HUGE Potential ROI If data is exchanged across all facilities within the three-county region, the overall savings could reach $48.1 million. Notes: 1 – Core healthcare entities include: Baptist Memphis, Le Bonheur Children’s Hospital, Methodist University Hospital, The Regional Medical Center (The MED), Saint Francis Hospital, St. Jude Children’s Research Hospital, Shelby County/Health Loop, UTMG, LabCorp, Memphis Managed Care-TLC, Omnicare

  5. Learn, Collaborate, Design [process diagram]: Get the Model Right → Build the Team → ID the Settings → Implement → Outcomes Research, spanning Qualitative Research, System Implementation and Evaluation, and Qualitative Research phases

  6. Key Aspects of Value Proposition • Qualitative Information • Costs • System usability • System use and utility • Clinical value (patient outcomes) • Dollars saved in care delivery process • Workflow efficiency gains

  7. Qualitative Questions • Usability (focus groups in ED) • 1 month and 1 year after go-live • Barriers to implementing infrastructure (cognitive artifacts) • Evaluated in year 4 • Drivers for adoption (interviews of governing board and ED staff) • Evaluated in year 5

  8. Costs • Personnel • Training • Community Meetings • Sales • Legal agreements • Organizational development • Equipment • Software development • Site-specific customizations and costs

  9. Assessing Usability: The Questionnaire for User Interaction Satisfaction (QUIS) is a tool developed by a multi-disciplinary team of researchers in the Human-Computer Interaction Lab (HCIL) at the University of Maryland at College Park. The QUIS was designed to assess users' subjective satisfaction with specific aspects of the human-computer interface. The QUIS team successfully addressed the reliability and validity problems found in other satisfaction measures, creating a measure that is highly reliable across many types of interfaces.

  10. QUIS Details • Six scales • Eleven interface factors • Screen • Terminology/system feedback • Learning factors • System capabilities • Technical manuals • Internet access • On-line tutorials, multimedia, voice recognition, virtual environments, and software installation
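As a rough illustration of how responses on a QUIS-style survey might be summarized per interface factor (this is not the actual QUIS instrument or its scoring procedure), a minimal Python sketch follows; the factor names, items, and ratings are hypothetical.

```python
# Illustrative sketch (not the actual QUIS scoring procedure): aggregating
# 9-point ratings by interface factor, skipping "not applicable" items.
from collections import defaultdict
from statistics import mean

# Hypothetical responses: (interface factor, item, rating 1-9 or None for N/A)
responses = [
    ("screen", "characters are readable", 8),
    ("screen", "layout is helpful", 7),
    ("terminology/system feedback", "messages are clear", 6),
    ("learning factors", "learning to operate the system", None),  # N/A
    ("system capabilities", "system speed", 5),
]

def factor_scores(responses):
    """Return the mean rating per interface factor, ignoring N/A items."""
    by_factor = defaultdict(list)
    for factor, _item, rating in responses:
        if rating is not None:
            by_factor[factor].append(rating)
    return {factor: mean(vals) for factor, vals in by_factor.items()}

print(factor_scores(responses))
```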

  11. System Usability • Will conduct usability testing of SPL • Vanderbilt as pilot site for face validity and modifying QUIS • Will modify accordingly • Will survey Memphis ED attendings and nursing staff 1 month after go-live and again 6 months later

  12. System Usage and Epidemiology • Help desk use • Provider enrollment • Patient enrollment (RHIO in versus RHIO out) • Usage statistics • Latency • Downtime
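As a hedged sketch of how usage statistics, latency, and downtime could be summarized, the example below assumes a hypothetical access log format; the field names, downtime rule, and values are assumptions, not the project's actual instrumentation.

```python
# Illustrative sketch: summarizing usage, latency, and downtime from a
# hypothetical access log (one record per vault lookup attempt).
from statistics import median

access_log = [
    # (provider_id, patient_in_rhio, latency_ms, system_up)
    ("md01", True, 420, True),
    ("md01", False, 0, True),    # patient opted out of the RHIO
    ("rn07", True, 1350, True),
    ("md02", True, 0, False),    # lookup attempted during downtime
]

completed = [r for r in access_log if r[3]]                # system was up
latencies = [r[2] for r in completed if r[1]]              # RHIO-in patients only

usage_by_provider = {}
for provider, *_ in completed:
    usage_by_provider[provider] = usage_by_provider.get(provider, 0) + 1

downtime_rate = sum(1 for r in access_log if not r[3]) / len(access_log)

print("lookups per provider:", usage_by_provider)
print("median latency (ms):", median(latencies))
print(f"downtime rate: {downtime_rate:.0%}")
```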

  13. Content Quality • Accuracy • Missing data • Categorization errors
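To make the three content-quality measures concrete, here is a minimal sketch of how they might be tallied from a chart-review sample; the record fields, reviewer labels, and values are hypothetical assumptions rather than the project's actual review protocol.

```python
# Illustrative sketch: tallying content-quality measures (accuracy, missing
# data, categorization errors) from a hypothetical chart-review sample.
reviewed_elements = [
    # (field, status) where status is assigned by a chart reviewer
    ("allergy list", "accurate"),
    ("medication list", "missing"),
    ("discharge diagnosis", "miscategorized"),
    ("lab result", "accurate"),
    ("radiology report", "accurate"),
]

total = len(reviewed_elements)
counts = {"accurate": 0, "missing": 0, "miscategorized": 0}
for _field, status in reviewed_elements:
    counts[status] += 1

print(f"accuracy rate:             {counts['accurate'] / total:.0%}")
print(f"missing-data rate:         {counts['missing'] / total:.0%}")
print(f"categorization-error rate: {counts['miscategorized'] / total:.0%}")
```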

  14. Disease-specific Hypotheses • Improved neonatal GBBS management • Improved asthma controller med use • Improved ACE/ARB use in CHF • Improved immunization rates (flu, S. pneumo) • Others?

  15. ED Administrative Outcomes • Reduced inpatient admissions • Decreased duplicate testing (radiology and lab) • Decreased ED expenses • Workflow efficiency • Costs per visit

  16. Workflow change • Activity-based costing • Model construction at Vanderbilt • Model validation in Memphis • Use model to construct activity matrices in EDs under study • Assess how activity matrices change pre-implementation and 1 year post-implementation

  17. Model Construction: Data Collection • Trained observers will document • Key transition points in information flow: • Eliciting prior medical history • Triage and treatment processes • Disposition/discharge from ED • Data Elements • Activity performed • Agent (RN, MD, Clerk, etc.) • Start-Stop times (hh:mm:ss)
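As a minimal sketch of how the observed records described above (activity, agent, start/stop times) could be rolled up into an agent-by-activity matrix of total minutes, the example below uses a hypothetical record layout and made-up times; it is an illustration, not the project's actual costing model. Comparing matrices built before go-live and one year after would show where staff time shifts.

```python
# Illustrative sketch: rolling observed activity records (activity, agent,
# start/stop times) into an agent-by-activity matrix of total minutes.
# Record layout and example values are hypothetical.
from collections import defaultdict
from datetime import datetime

FMT = "%H:%M:%S"
observations = [
    # (activity, agent, start hh:mm:ss, stop hh:mm:ss)
    ("elicit prior medical history", "RN", "08:02:10", "08:07:45"),
    ("triage", "RN", "08:08:00", "08:15:30"),
    ("treatment", "MD", "08:20:00", "08:41:15"),
    ("disposition/discharge", "Clerk", "09:05:00", "09:12:40"),
]

# activity_matrix[agent][activity] = total observed minutes
activity_matrix = defaultdict(lambda: defaultdict(float))
for activity, agent, start, stop in observations:
    elapsed = datetime.strptime(stop, FMT) - datetime.strptime(start, FMT)
    activity_matrix[agent][activity] += elapsed.total_seconds() / 60

for agent, activities in activity_matrix.items():
    for activity, minutes in activities.items():
        print(f"{agent:6s} {activity:30s} {minutes:5.1f} min")
```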

  18. Sample of Activity-Based Data

  19. Activity-Based Estimates (Aggregate)

  20. Data Sources [2×2 table]: outcomes of interest classified by whether the RHIO record was accessed during the study (record accessed vs. no RHIO record accessed) and whether the patient has data in the vaults (patient with data in vaults vs. patient without data in vaults)

  21. Using the Vault as the Primary Data Source for Outcomes • Baseline LOS = LOS of all encounters in the vaults (before go-live) • Change in LOS = LOS of all encounters in the vault whose records were accessed vs. LOS of all encounters in the vault whose records were not accessed
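A minimal sketch of the LOS comparison reconstructed above, assuming a hypothetical per-encounter dataset (the field names and values are illustrative, not study data):

```python
# Illustrative sketch of the vault-based LOS comparison: baseline LOS from
# pre-go-live encounters, then accessed vs. not-accessed LOS after go-live.
from statistics import mean

encounters = [
    # (los_hours, after_go_live, record_accessed)
    (5.2, False, False),
    (6.8, False, False),
    (4.1, True, True),
    (7.3, True, False),
    (3.9, True, True),
]

baseline_los = mean(e[0] for e in encounters if not e[1])
accessed_los = mean(e[0] for e in encounters if e[1] and e[2])
not_accessed_los = mean(e[0] for e in encounters if e[1] and not e[2])

print(f"baseline LOS (pre go-live): {baseline_los:.1f} h")
print(f"post go-live, record accessed: {accessed_los:.1f} h")
print(f"post go-live, record not accessed: {not_accessed_los:.1f} h")
print(f"difference (accessed vs. not accessed): {accessed_los - not_accessed_los:+.1f} h")
```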

  22. Clinical Outcomes Methodology [rollout timeline: stable / on / off] • Pre-post design • Easy to implement • Will not impact rollout or clinic flow • Sensitive to existing trends

  23. Other Approaches [rollout timeline: stable / off] • Assign times of day randomly to downtime status • Assign patients randomly to control group (no data for them) • Assign retrieval events randomly to control (i.e., no-result) retrievals
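A minimal sketch of the first alternative listed above, randomly assigning time-of-day blocks to a simulated downtime (control) status; the block size, control proportion, and seed are arbitrary assumptions for illustration.

```python
# Illustrative sketch: randomly assign hour-of-day blocks to a simulated
# "downtime" (control) status; proportions and seed are arbitrary assumptions.
import random

random.seed(42)  # reproducible assignment for the evaluation period

hours = list(range(24))                         # 24 one-hour blocks per day
control_hours = set(random.sample(hours, k=6))  # ~25% of hours serve as control

def study_arm(encounter_hour: int) -> str:
    """Return which arm an ED encounter falls into based on its arrival hour."""
    return "control (no RHIO data shown)" if encounter_hour in control_hours else "intervention"

for hour in (3, 9, 14, 22):
    print(f"arrival at {hour:02d}:00 -> {study_arm(hour)}")
```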

  24. Covariate Analysis • ED (site) characteristics survey to be completed by ED Administration • Readiness survey to be completed by ED administration and clinical leadership

  25. IRB Approach: Five Approvals • Activity-based costing (approved) • Usability, readiness and demographic survey (letters of cooperation) • Baseline data for administrative measures and activity costing • System content quality • Disease-specific hypotheses

  26. Thanks!
