
Usability and Human Factors

Usability and Human Factors: Electronic Health Records and Usability – Lecture a


Presentation Transcript


  1. Usability and Human Factors Electronic Health Records and Usability Lecture a This material (Comp 15 Unit 6) was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC000003. This material was updated by The University of Texas Health Science Center at Houston under Award Number 90WT0006. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.

  2. Electronic Health Records and Usability: Lecture a – Learning Objectives • Define usability as it pertains to the EHR (Lecture a) • Describe the challenges of EHR design and usability in typical workflow (Lecture a)

  3. Why? • Three reports (AHRQ, HIMSS, NRC) in 2010 found a strong, often direct relationship between EHR usability and: • clinical productivity • error rate • user fatigue • user satisfaction, effectiveness, and efficiency

  4. Usability and EHR Certification • 2014: Certification criteria revised to include safety-enhanced design • 2015: US DHHS recommends following NIST and ISO standards for user-centered design and usability evaluation: • NISTIR 7804, “Technical Evaluation, Testing, and Validation of the Usability of Electronic Health Records” • provided examples of methods that could be employed for UCD, including ISO 9241-11, ISO 13407, ISO 16982, ISO/IEC 62366, ISO 9241-210, and NISTIR 7741.

  5. Why? (Cont’d – 1) • Lack of usability and accessibility will result in: • Lack of trust • Potential abuse • Lessons from electronic voting: • No election has been proven to have been hacked • However, poor usability has altered the outcome of elections • A user’s view of a system is conditioned by experience with its interface Tognazzini, B. (2001).

  6. HIMSS Usability Criteria (2009) 1. Simplicity 2. Naturalness 3. Consistency 4. Minimizing cognitive load 5. Efficient interactions 6. Forgiveness 7. Feedback 8. Effective use of language 9. Effective information presentation 10. Preservation of context HIMSS EHR Usability Task Force (2009).

  7. National Center for Cognitive Informatics General Design Principles for EHRs • Consistency and standards • Visibility of system state • Match between system and world • Minimalist design • Minimize memory load • Informative feedback • Flexibility and efficiency • Good error messages • Prevent errors • Clear closure • Reversible actions • Use the user’s language • Users in control • Help and documentation
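
Heuristic lists like the two above are easiest to apply when they are turned into something concrete a review team can score against. Below is a minimal sketch, in Python, of one way to encode a subset of these principles as a severity-rated checklist; all class and function names are hypothetical illustrations, not part of any standard or vendor API.

```python
from dataclasses import dataclass, field

# A subset of the NCCD/HIMSS-style heuristics named on this slide.
HEURISTICS = [
    "Consistency and standards",
    "Visibility of system state",
    "Minimize memory load",
    "Informative feedback",
    "Prevent errors",
    "Reversible actions",
]

@dataclass
class Finding:
    heuristic: str   # which principle is violated
    screen: str      # where the problem was observed
    severity: int    # 0 = cosmetic ... 4 = usability catastrophe

@dataclass
class HeuristicReview:
    findings: list = field(default_factory=list)

    def add(self, heuristic, screen, severity):
        if heuristic not in HEURISTICS:
            raise ValueError("use a heuristic from the agreed list")
        self.findings.append(Finding(heuristic, screen, severity))

    def worst_first(self):
        # Triage: surface the most severe violations first.
        return sorted(self.findings, key=lambda f: -f.severity)

review = HeuristicReview()
review.add("Prevent errors", "lab results view", 4)
review.add("Minimize memory load", "flowsheet columns", 3)
for f in review.worst_first():
    print(f.severity, f.heuristic, "-", f.screen)
```

Severity-ranked findings of this kind are how heuristic evaluations are conventionally triaged, which is why the slides that follow rate design problems by their potential for patient harm.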

  8. The State of the Art • Egregiously bad design exists in current EHRs • Complicated by: • Vendor contracts forbidding customers to talk about their experience • Inability to publish evidence (e.g. screenshots), which hinders scholarly research

  9. State of the Art (Cont’d – 1) • AHRQ Report on Vendor Practices (2010) • We’re not there yet • Many legacy systems >10 years old (AHRQ, 2010) • Standards are borrowed • Best practices not defined • Expectations unclear • Communication is limited • Formal usability testing rare • Usability is perceived to be overly subjective

  10. AHRQ: Report on Vendor Practices • But… • Users are involved in EHR design/review • Vendors compete on usability [is this something that should be competed on? Not a basic requirement?] • Users demand better products • Plans for formal usability testing are increasing • Vendors are willing to collaborate

  11. State of the Arguments • Some feel clinicians have ‘given up’ due to difficulty of getting things changed • Learned helplessness • Political and power struggle • Administration vs. staff • Vendor vs. users… • Lack of clinician input at design • Or too limited clinician input at all phases, from design to rollout

  12. AHRQ:Report on Vendor Practices (Cont’d – 1) • “The field is competitive so there is little sharing of best practices to the community. The industry should not look towards vendors to create these best practices. Other entities must step up and define [them] and let the industry adapt.” • “Products are picked on the amount of things they do, not how well they do them.” • “There are no standards most of the time, and when there are standards, there is no enforcement of them. The software industry has plenty of guidelines and good best practices, but in HIT, there are none.”

  13. Vendor Testing • A review of 41 vendor reports in 2015 found: • “A lack of adherence to ONC certification requirements and usability testing standards among several widely used EHR products...” • only 22% had used at least the minimum number of participants with clinical backgrounds Ratwani (2015).

  14. The Bad and the Ugly • Examples of egregious usability problems (Silverstein, 2009) • Related data placed far apart, requiring the user to click multiple times (‘clickorrhea’) • e.g. diastolic blood pressure four screens away from systolic • Diagnosis (Dx) lists with rare Dx at top, common Dx at bottom, and hidden terms, leading to incorrect selection (see the sketch below)
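
The diagnosis-list problem has a straightforward remedy: order pick lists by observed selection frequency rather than alphabetically or arbitrarily. The Python sketch below illustrates the idea; the diagnosis names and selection history are hypothetical, and a real EHR would derive the counts from local order data.

```python
from collections import Counter

# Hypothetical selection history (e.g., from past orders at this site).
selection_history = [
    "Hypertension", "Type 2 diabetes", "Hypertension",
    "Acute intermittent porphyria",  # rare: should sink to the bottom
    "Hypertension", "Type 2 diabetes",
]

def ordered_pick_list(candidates, history):
    """Most-frequently-chosen diagnoses first; ties broken alphabetically."""
    counts = Counter(history)
    return sorted(candidates, key=lambda dx: (-counts[dx], dx))

candidates = ["Acute intermittent porphyria", "Hypertension", "Type 2 diabetes"]
print(ordered_pick_list(candidates, selection_history))
# -> ['Hypertension', 'Type 2 diabetes', 'Acute intermittent porphyria']
```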

  15. What is Wrong With This Picture? Silverstein, S. (2009).

  16. What is Wrong With This Picture? (Cont’d – 1) • Note the warning that there are no warnings about abnormal results • “There are no indicator flags” Silverstein, S. (2009).

  17. What is Wrong With This Picture? (Cont’d – 2) • Results section says “negative” and “results final” • Most busy clinicians’ eyes would stop there, especially in the wee hours Silverstein, S. (2009).

  18. What is Wrong With This Picture? (Cont’d – 3) • Addendum to the report that the result is actually positive for MRSA, a drug-resistant infection • No flag on that addition, yet during data entry at the lab, a flag was requested and seen by the reporting technician Silverstein, S. (2009).

  19. What’s Wrong? • Clinician forced to hunt around every result for indications of normalcy or abnormality • Disparity between what is seen at the lab (abnormal addendum is flagged) and the clinician’s view (see the sketch below) • Not fictitious: treatment was delayed >24 hours until someone later noticed the addendum • System is CCHIT certified • On paper, the wrong item could have been crossed out
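
To make the failure mode concrete, here is a minimal Python sketch of the display logic at fault and its fix: a report’s banner must be computed over every part of the report, including addenda, rather than taken from the original result alone. The data shapes are hypothetical, not any vendor’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class ReportPart:
    text: str
    abnormal: bool  # flag set at the lab during data entry

def banner(parts: list[ReportPart]) -> str:
    # The failure the slides describe amounts to reading only the original
    # part, so "negative / results final" hides a later positive addendum.
    # The fix: any abnormal part makes the whole report abnormal.
    if any(p.abnormal for p in parts):
        return "** ABNORMAL - see addendum **"
    return "No indicator flags"

report = [
    ReportPart("Culture negative. Results final.", abnormal=False),
    ReportPart("Addendum: positive for MRSA.", abnormal=True),
]
print(banner(report))  # -> ** ABNORMAL - see addendum **
```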

  20. More of the Bad • Alphabetical problem list: • Convenient for the programmer, not for the doctor • Should be prioritized by clinical importance Silverstein, S. (2009).

  21. More of the Bad (Cont’d – 1) • List auto-populated by the system • Not editable by the clinician • Patient does not have atrial fibrillation (the entry was made by a nurse to speed up an order) • Removing it requires the vendor Silverstein, S. (2009).

  22. More of the Bad (Cont’d – 2) • Multiple diabetes entries (incorrect) • Lack of controlled terminology mapping (see the sketch below) • Useless information: • ‘Medication use, long term’ • Clutters the screen Silverstein, S. (2009).
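
The duplicate-diabetes problem is exactly what controlled terminology mapping is meant to prevent. Below is a minimal Python sketch of the missing step: normalize free-text problem entries to a coded concept before display, so variant spellings collapse to a single problem. The mapping table here is a hypothetical stand-in for a real SNOMED CT or ICD mapping service.

```python
# Hypothetical free-text-to-concept map; a real system would call a
# terminology service rather than hard-code entries.
TERMINOLOGY_MAP = {
    "diabetes mellitus type 2": "44054006",  # SNOMED CT: Type 2 diabetes
    "type 2 diabetes": "44054006",
    "dm2": "44054006",
    "atrial fibrillation": "49436004",
}

def normalized_problem_list(entries):
    """Collapse variant free-text entries onto one concept each."""
    seen = set()
    result = []
    for entry in entries:
        code = TERMINOLOGY_MAP.get(entry.strip().lower())
        key = code or entry.strip().lower()  # unmapped text falls through as-is
        if key not in seen:
            seen.add(key)
            result.append((entry, code))
    return result

entries = ["Type 2 diabetes", "DM2", "Diabetes mellitus type 2"]
print(normalized_problem_list(entries))
# -> [('Type 2 diabetes', '44054006')]  (the duplicates are collapsed)
```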

  23. More of the Bad (Cont’d – 3) • Repetition, extraneous information, lack of focus and clarity • Lack of any symbolic or diagrammatic representations, and general clutter Silverstein, S. (2009).

  24. More of the Bad (Cont’d – 4) Silverstein, S. (2009).

  25. More of the Bad (Cont’d – 5) • Forces user to keep track of columns with finger on screen • Easy to confuse columns Silverstein, S. (2009).

  26. More of the Bad (Cont’d – 6) • Units (e.g. mg/dL) repeated on every line: repetitious and distracting • Lab panel components scattered across the display (see the sketch below) Silverstein, S. (2009).
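
Both complaints are presentation-layer fixes. The Python sketch below groups results by panel and prints a unit only when it changes rather than on every line; the panel memberships and values are hypothetical examples.

```python
from itertools import groupby

results = [  # (panel, analyte, value, unit)
    ("BMP", "Sodium", 138, "mmol/L"),
    ("BMP", "Potassium", 4.1, "mmol/L"),
    ("BMP", "Glucose", 92, "mg/dL"),
    ("CBC", "WBC", 7.2, "10^3/uL"),
    ("CBC", "Hemoglobin", 13.5, "g/dL"),
]

def render(results):
    results = sorted(results, key=lambda r: r[0])  # keep panel components together
    for panel, rows in groupby(results, key=lambda r: r[0]):
        print(panel)
        last_unit = None
        for _, analyte, value, unit in rows:
            shown = unit if unit != last_unit else ""  # print a unit only when it changes
            print(f"  {analyte:<12}{value:>7} {shown}")
            last_unit = unit

render(results)
```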

  27. Electronic Health Records and Usability Summary – Lecture a • State of the Art • AHRQ reports on vendor practices • Examples of how wrong data are entered into EHR systems

  28. Electronic Health Records and Usability References – Lecture a
  References
  2014 Edition Release 2 Electronic Health Record (EHR) Certification Criteria and the ONC HIT Certification Program; Regulatory Flexibilities, Improvements, and Enhanced Health Information Exchange. 79 Federal Register (September 11, 2014). 45 CFR Part 170. Final Rule. Retrieved June 27, 2016, from https://www.gpo.gov/fdsys/pkg/FR-2014-09-11/pdf/2014-21633.pdf
  2015 Edition Health Information Technology (Health IT) Certification Criteria, 2015 Edition Base Electronic Health Record (EHR) Definition, and ONC Health IT Certification Program Modifications. 80 Federal Register (October 16, 2015). 45 CFR Part 170. Pages 62601-62759. Final Rule. Retrieved June 27, 2016, from https://www.federalregister.gov/articles/2015/10/16/2015-25597/2015-edition-health-information-technology-health-it-certification-criteria-2015-edition-base
  “Safety-enhanced design. User-centered design processes must be applied to each capability an EHR technology includes that is specified in the following certification criteria: § 170.314(a)(1), (2), (6) through (8), (16) and (18) through (20) and (b)(3), (4), and (9).” Page 54479.
  HIMSS EHR Usability Task Force (2009). Defining and testing EMR usability: principles and proposed methods of EMR usability evaluation and ratings. Retrieved September 4, 2011, from http://www.himss.org/defining-and-testing-emr-usability-principles-and-proposed-methods-emr-usability-evaluation-and (link updated June 27, 2016)
  McDonnell, C., Werner, K., & Wendel, L. (2010). Electronic Health Record Usability: Vendor Practices and Perspectives. AHRQ Publication No. 09(10)-0091-3-EF. Rockville, MD: Agency for Healthcare Research and Quality.
  National Center for Cognitive Informatics & Decision Making in Healthcare. (n.d.). General Design Principles for EHRs. Retrieved June 27, 2016, from https://sbmi.uth.edu/nccd/ehrusability/design/guidelines/Principles/index.htm
  Ratwani, R. M., Benda, N. C., Hettinger, A. Z., & Fairbanks, R. J. (2015). Electronic health record vendor adherence to usability certification requirements and testing standards. JAMA, 314(10), 1070-1071. Retrieved from http://jama.jamanetwork.com/article.aspx?articleid=2434673&resultClick=3#jld150029r3

  29. Electronic Health Records and Usability References – Lecture a
  References
  Silverstein, S. (2009). Are Health IT Designers, Testers and Purchasers Trying to Harm Patients? Part 2 of a series. Healthcare Renewal Blog, February 22, 2009. Retrieved August 11, 2010, from http://hcrenewal.blogspot.com/2009/02/are-health-it-designers-testers-and.html
  Tognazzini, B. (2001). The Butterfly Ballot: Anatomy of a Disaster. Retrieved from http://www.asktog.com/columns/042ButterflyBallot.html

  30. Electronic Health Records and Usability References – Lecture a (Cont’d – 1)
  Images
  Slides 15-18 and 20-26: Silverstein, S. (2009). Are Health IT Designers, Testers and Purchasers Trying to Harm Patients? Part 2 of a series. Healthcare Renewal Blog, February 22, 2009. Retrieved August 11, 2010, from http://hcrenewal.blogspot.com/2009/02/are-health-it-designers-testers-and.html

  31. Usability and Human Factors: Electronic Health Records and Usability – Lecture a This material was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC000003. This material was updated by The University of Texas Health Science Center at Houston under Award Number 90WT0006.
