
Plenary 2: Human Factors in Healthcare Rollin J. “Terry” Fairbanks, MD MS






Presentation Transcript


  1. Plenary 2: Human Factors in Healthcare Rollin J. “Terry” Fairbanks, MD MS Vice President, Quality & Safety, MedStar Health Professor of Emergency Medicine, Georgetown University Founding Director, National Center for Human Factors in Healthcare Washington DC, USA 23 March 2019

  2. ME Forum 2019 Orientation. As part of our extensive program, and with CPD hours awarded based on actual time spent learning, credit hours are offered based on attendance per session; delegates must attend a minimum of 80% of a session to qualify for the allocated CPD hours. • Less than 80% attendance per session = 0 CPD hours • 80% or higher attendance per session = full allotted CPD hours • Total CPD hours for the forum are awarded based on the sum of CPD hours earned from all individual sessions.

  3. Conflict of Interest The speaker(s) or presenter(s) in this session has/have no conflict of interest or disclosure in relation to this presentation.

  4. Learning Objectives • At the end of this session, participants will be able to: • Define Human Factors and Ergonomics • Explain the Systems Approach to Safety • Describe How Human Factors design impacts safety

  5. Goal: Think differently… to view safety and risk through the lens of safety science. Twitter discussion: #HFsafety @TerryFairbanks

  6. Chart Credit: Modified from L. Leape

  7. The Problem • USA's Institute of Medicine (IOM) report To Err Is Human: 2000 • Government goal: 50% less error in 5 years • Funding, regulations, high focus • 19 years later… DISAPPOINTING IMPROVEMENT • WHY? Focus still on individual performance • Solutions inconsistent with safety science. References: Leape LL, Berwick DM. Five years after To Err Is Human: what have we learned? JAMA 2005;293(19). Wachter RM. The end of the beginning: patient safety five years after 'To Err Is Human'. Health Aff 2004(11). Wachter RM. Patient safety at ten: unmistakable progress, troubling gaps. Health Aff 2010;29(1). Landrigan, Parry, et al. Temporal trends in rates of patient harm resulting from medical care. NEJM 2010;363(22). Shekelle, Pronovost, et al. Advancing the science of patient safety. Ann Intern Med 2011;154(10). Longo, Hewett, Ge, Schubert. The long road to patient safety: a status report on patient safety systems. JAMA 2005;294(22).

  8. What is the Healthcare Industry Doing Wrong? Preoccupation with Human Error Instead of reducing HARM ….Leads to ineffective solutions

  9. "Systems Approach" • Is the goal to "eliminate human error"? NO: human error cannot be eliminated • A futile goal; misdirects resources and focus • Causes a culture of blame and secrecy • The "name, blame, shame, and train" mentality • It is about reducing HARM

  10. Healthcare: a complex adaptive system of humans and technology, a "Sociotechnical System." Unordered: cannot predict cause & effect; cannot be modeled or forecasted. Ordered: constrained; can be reduced to a set of rules. "Adaptive" in that individual and collective behavior changes as a result of experience. • 1. Sardone G, Wong G. Making sense of safety: a complexity-based approach to safety interventions. Proceedings of the Association of Canadian Ergonomists 41st Annual Conference, Kelowna, BC, October 2010. • 2. Snowden D, cognitive-edge.com. • 3. Hollnagel, Woods, Leveson 2006.

  11. Complex Adaptive Systems • WORK AS IMAGINED: how managers believe work is being done (rules) • GAP • WORK AS PERFORMED: everyday work: how work IS being done. Adapted from: Ivan Pupulidy

  12. Human Factors & Ergonomics …discovers and applies scientific data about human behavior & cognition, abilities & limitations, physical traits, and other characteristics …to the design of tools & machines, systems, environments, processes, and jobs for productive, safe, comfortable, and effective human use.

  13. Human Factors & Ergonomics • “We don’t redesign humans; We redesign the system within which humans work”

  14. Cognitive Science (how we think) Industrial and Organizational Psychology (how we collaborate) Work Analysis (how we work now) System Safety Engineering (how we manage risk)

  15. 809M airline passengers/yr…. ...30,000 flights per day Pilots & ATC: 2 errors per hour

  16. Example: Defibrillator Case

  17. Defibrillator Case • VF cardiac arrest • Nurse with patient • Charges unit… clears patient… presses "on" button → machine powers down • 2-3 minute delay in shock

  18. Huh? Medical Professionals: Just don’t make errors

  19. Knowledge-Based: improvisation in unfamiliar environments; no routines or rules available; trial & error. Rule-Based: protocolized behavior; process, procedure; errors: 1. Misapply a good rule 2. Fail to apply a good rule 3. Apply a bad rule. Skill-Based: automated routines; require little conscious attention; errors: slips & lapses. Figure adapted from: Embrey D. Understanding Human Behaviour and Error, Human Reliability Associates. Based on Rasmussen's SRK model of cognitive control, adapted to explain error by Reason (1990, 2008).

  20. Policies, In-services, Signage, Discipline, Training, Vigilance, Mindful Moments, etc. Slips and Lapses: Common

  21. Defibrillator Case #2 • 32-year-old healthy man • Presents to ED with sustained SVT & chest pain • Primary interventions unsuccessful • Synchronized shock @ 50 J → refractory • Try again @ 100 J → VF arrest • 45-minute resuscitation attempt → patient dies • Investigation reveals that "MD failed to put device in SYNC mode for second shock"

  22. Defibrillator Usability Study • Two defibrillator models • SimManTM patient simulator • 50% of participants inadvertently delivered an unsynchronized countershock for SVT • 71% of participants never aware • Fairbanks RJ, Caplan SH, et al. Usability Study of Two Common Defibrillators Reveals Hazards. Annals of Emergency Medicine Oct 2007; 50(4): 424-432.

  23. Vendor Response “the preventative or corrective action is provided in the device labeling” Fairbanks RJ and Wears RL. Hazards With Medical Devices: the Role of Design. Annals of Emergency Medicine Nov 2008; 52(5): 519-521.

  24. Defibrillator Case = COMMON ERROR • Trend found in EMS reporting system • Simulation study (Denmark): 72 physicians; in 5 of 192 defib attempts, the device was turned off; measurable delay in shock • Devices turn off even if charged and ready • Hoyer, Christensen, et al. Annals of Emergency Medicine 2008; 52(5): 512-514. • Fairbanks and Wears. Annals of Emergency Medicine 2008; 52(5): 519-521.

  25. Safety Attitudes "The single greatest impediment to error prevention in the medical industry is that we punish people for making mistakes." --Lucian Leape, testimony to US Congress

  26. Why is a culture of safety so important? Bird, 1969: analysis of 1,753,498 accidents from 297 companies in 21 different industries • 1 serious or major injury • 10 minor injuries • 30 property-damage incidents • 600 incidents with no visible damage or injury. Slide acknowledgment: Robert Panzer, MD

  27. US Airways Non-Reprisal Policy “US Airways will not initiate disciplinary proceedings against any employee who discloses an incident or occurrence involving flight safety…” “This policy excludes events known or suspected to involve criminal activity, substance abuse, controlled substances, alcohol, or intentional falsification.” Safety “Bad Apples”

  28. Airline Safety Approaches • “It is vastly more important to identify the hazards and threats to safety, than to identify and punish an individual for a mistake.” • “We exchange the ability to reprimand an individual for the ability to gain greater knowledge.” • --Jeff Skiles, Miracle on Hudson first officer, • On airline safety philosophy

  29. Just Culture: The Three Behaviors • Normal Error: inadvertent action (slip, lapse, mistake). Response: Support. Manage through changes in processes, procedures, recurrent training, design, and environment. • At-Risk Behavior: a choice; risk not recognized or believed justified. Response: Coach. Manage through removing incentives for at-risk behaviors, creating incentives for healthy behaviors, increasing situational awareness, and re-examining the environment. • Reckless Behavior: conscious disregard of unreasonable risk. Response: Sanction. Manage through remedial action or punitive action. Slide credit: Griffith University. Adapted from: David Marx, Just Culture. Outcome Engineering 2008: www.JustCulture.org. Recommended reading: Just Culture: Balancing Safety and Accountability, Sidney Dekker (2008)

  30. Indiana: 5 nurses $$$$

  31. We See… What We Expect To See. Aoccdrnig to rscheearch at Cmabrigde Uinervtisy, it deosn't mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer be at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit a porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe.

  32. “Skills-Based Error” = Slips and Lapses = “Automaticity” Errors  HUGE OPPORTUNITY 

  33. Glucometer Case… • Patient with hx of poorly controlled BG levels • Admitted to diabetic unit at hospital • Pt appears normal or hyperglycemic • Accu-Chek indicates critically low BG • Misinterpreted by tech and RN as critical high • Pt given repeated doses of insulin • Pt becomes altered; rapid response called • Receives D50, glucagon, & D10 drip • Stays in ICU for 3 days: MAJOR EVENT

  34. Nurse SUSPENDED

  35. One week later… Repeated Incident • Same scenario, different unit • Multiple RNs, NP involved • All misinterpreted critical LO as critical HI. Did the disciplinary response make us safer?

  36. “Critical Low” 0.1% (119/80,000)

  37. How could you miss it?

  38. Procurement: Who determines wording?

  39. MedStar Health's Approach to Safety • Primary Prevention (proactive): design the system for high quality and safety, low risk • Secondary Prevention (proactive): identify and mitigate existing hazards; realities of actual context • Tertiary Prevention (reactive): recover and learn from safety events. "We found literally hundreds of predictive events in our reporting system" –Chief Pharmacy Officer, Clarian Health

  40. What is the impact on the safety culture? Meet Annie: https://www.youtube.com/watch?v=zeldVu-3DpM

  41. 19 years later… Why such disappointing change? • Focus on the INDIVIDUAL • Focus on EVENTS • Focus on OUTCOME • Culture of blame • Lack of a true systems approach: optimize technology & human interaction; focus on design

  42. “Fallibility is part of the human condition; We cannot change the human condition; But we can change the conditions under which people work” --James Reason, PhD

  43. Thank you. Contact Information: Rollin J. (Terry) Fairbanks, MD MS Vice President, Quality & Safety, MedStar Health Professor of Emergency Medicine, Georgetown University Founding Director, National Center for Human Factors Engineering in Healthcare Washington DC USA Terry.Fairbanks@MedStar.Net Twitter: @TerryFairbanks www.MedStarHealth.org www.MedicalHumanFactors.net
