
Presentation Transcript


  1. Please read this before using presentation • This presentation is based on content presented at the 2007 Mines Safety Roadshow held in October 2007 • It is made available for non-commercial use (eg toolbox meetings) subject to the condition that the PowerPoint file is not altered without permission from Resources Safety • Supporting resources, such as brochures and posters, are available from Resources Safety • For resources, information or clarification, please contact: ResourcesSafety@docep.wa.gov.au or visit www.docep.wa.gov.au/ResourcesSafety

  2. Toolbox presentation: Safety culture – part 1 – Integrating human factors and safety management systems (October 2007)

  3. Safety culture toolbox series • Integrating human factors and safety management systems (Author: Bert Boquet, Embry-Riddle Aeronautical University) • What does safety culture mean for mining? • Safety culture in practice in Australian mining

  4. Safety culture: a brief history • Chernobyl, 1986 • International Atomic Energy Agency noted a “Poor Safety Culture” as a factor in the accident.

  5. Safety culture: a brief history • King's Cross underground fire, 1987 Thirty-one people died in the King's Cross fire, which broke out as commuters headed home. Poor safety culture was cited as a factor.

  6. Safety culture: a brief history • Piper Alpha, 1988 The worst-ever offshore petroleum accident, in which 167 people died and a billion-dollar platform was destroyed. Poor safety culture was cited as contributing to this accident.

  7. An estimated 44,000 to 98,000 people in the US die each year as a result of medical error (Institute of Medicine, To Err Is Human, 1999). This includes: • Wrong medications • Too much of a given drug • Surgical error • Failures of infection control • Misdiagnosis • In summary, human error

  8. “There are activities in which the degree of professional skill which must be required is so high, and the potential consequences of the smallest departure from that high standard are so serious, that one failure to perform in accordance with those standards is enough to justify dismissal.” — Lord Denning

  9. “The single greatest impediment to error prevention in the medical industry is that we punish people for making mistakes.” — Dr Lucian Leape, Harvard School of Public Health

  10. “People make errors, which leads to accidents. Accidents lead to deaths. The standard solution is to blame the people involved. If we find out who made the errors and punish them, we solve the problem, right? Wrong. The problem is seldom the fault of an individual; it is the fault of the system. Change the people without changing the system and the problems continue.” — Don Norman

  11. Nature of blame • Operator is seen as having control • Operator makes conscious decisions about how to carry out job • Operator has rules and procedures to follow • Organization has vested interest in blaming operator

  12. Nature of blame cont. • Because people fear being "punished" for errors made on the job, minor events and mistakes go unreported • Furthermore, focusing on active failures in practice absolves the organization from blame (and liability) when accidents occur

  13. What does all of this have to do with safety? • To improve safety, we must make better use of minor human error events • Threat of corporate disciplinary action and regulatory enforcement is a major obstacle to event reporting and investigation • Engineering a sound safety culture is how we go about managing human error • However, nature of human error remains a problem in most systems

  14. Research sponsors: the mechanical-failure loop [flowchart: accident → accident investigation → accident database → database analysis → data-driven research → effective intervention and prevention programs → prevention and mitigation feedback] • FAA, DoD, NASA and airplane manufacturers provide research funding; research programs are needs-based and data-driven, so interventions are very effective • Accident investigation: highly sophisticated techniques and procedures; information is objective and quantifiable; effective at determining why the failure occurred • Accident database: designed around traditional categories; variables are well defined and causally related; organization and structure facilitate access and use • Database analysis: traditional analyses are clearly outlined and readily performed; frequent analyses help identify common mechanical and engineering safety issues • Result: catastrophic failures are infrequent events, and when failures do occur they are often less severe or hazardous due to effective intervention programs. Wiegmann, D. & Shappell, S. (2001). Human error analysis of commercial aviation accidents: Application of the Human Factors Analysis and Classification System (HFACS). Aviation, Space, and Environmental Medicine, 72, 1006–1016.

  15. Research sponsors: the human-error loop [flowchart: accident → accident investigation → accident database → database analysis → fad-driven research → ineffective intervention and prevention programs] • FAA, DoD, NASA and airlines provide funding for safety research programs, but lack of good data leads to research programs based primarily on interests and intuitions; interventions are therefore less effective • Accident investigation: less sophisticated techniques and procedures; information is qualitative and elusive; focus on "what" happened but not "why" it happened • Accident database: not designed around any particular human error framework; variables often ill defined; organization and structure difficult to understand • Database analysis: traditional human factors analyses are onerous due to ill-defined variables and database structures; few analyses have been performed to identify underlying human factors safety issues • Result: errors occur frequently and are the major cause of accidents, yet few safety programs are effective at preventing the occurrence or consequences of these errors. Wiegmann, D. & Shappell, S. (2001), as above.

  16. Systems approach to human error management Perhaps one of the best models for human error within an organization or a system is one proposed by James Reason:

  17.–20. Reason's "Swiss-cheese" model of human error [figure, built up one layer per slide]: inputs and organizational factors → unsafe supervision → preconditions for unsafe acts → unsafe acts → failed or absent defences → accident and injury. Each layer is a defence with holes in it; an accident occurs only when the holes in every layer line up. Adapted from Reason (1990).
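To make the layered-defences idea concrete, here is a minimal simulation sketch. It is not from the presentation: the layer names follow the model above, but the hole probabilities are invented for illustration.

```python
import random

# Layers of defence from Reason's model. Each value is an assumed
# probability that the layer's "hole" is open, i.e. the layer fails
# to stop the event. The numbers are invented for illustration.
LAYERS = {
    "organizational factors": 0.10,
    "unsafe supervision": 0.08,
    "preconditions for unsafe acts": 0.15,
    "unsafe acts": 0.20,
    "defences": 0.05,
}

def becomes_accident(rng: random.Random) -> bool:
    """An initiating event becomes an accident only when the holes
    in every layer line up."""
    return all(rng.random() < p_hole for p_hole in LAYERS.values())

def estimate_accident_rate(trials: int = 1_000_000, seed: int = 1) -> float:
    rng = random.Random(seed)
    return sum(becomes_accident(rng) for _ in range(trials)) / trials

if __name__ == "__main__":
    # Independent layers multiply: 0.10 * 0.08 * 0.15 * 0.20 * 0.05
    # is about 0.0012%, so aligned holes are rare but never impossible.
    print(f"Estimated accident rate: {estimate_accident_rate():.4%}")
```

The sketch makes the model's point: no single layer has to be perfect, but latent failures that quietly widen the holes in the upper layers raise the chance of the holes aligning.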

  21. Applying the cheese • To make full use of the systems approach, one must be willing to look beyond the active failures • Medicine, aviation and air traffic control have all become very skilled at identifying active failures • Not so for latent failures

  22. Practical implications • Study of all commercial aircraft accidents in the US, 1990–2002 • The investigation used the Human Factors Analysis and Classification System (HFACS) to classify both active and latent failures from 1,020 National Transportation Safety Board accident reports • Only 58 organizational failures were identified, most of them relating to operational processes • Most surprisingly, only 46 supervisory failures were identified from the reports, the majority being inadequate supervision
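As a rough illustration of what that classification exercise involves, here is a minimal sketch. The four tiers are the real HFACS tiers, but the report findings are invented; the actual study coded causal factors from 1,020 NTSB reports.

```python
from collections import Counter
from enum import Enum

class HfacsTier(Enum):
    """The four tiers of the HFACS framework (Wiegmann & Shappell)."""
    UNSAFE_ACTS = "unsafe acts"                      # active failures
    PRECONDITIONS = "preconditions for unsafe acts"  # latent
    UNSAFE_SUPERVISION = "unsafe supervision"        # latent
    ORGANIZATIONAL = "organizational influences"     # latent

# Invented example findings: (report id, coded causal factor).
findings = [
    ("NTSB-001", HfacsTier.UNSAFE_ACTS),
    ("NTSB-001", HfacsTier.PRECONDITIONS),
    ("NTSB-002", HfacsTier.UNSAFE_ACTS),
    ("NTSB-003", HfacsTier.UNSAFE_ACTS),
    ("NTSB-003", HfacsTier.UNSAFE_SUPERVISION),
]

tally = Counter(tier for _, tier in findings)
for tier in HfacsTier:
    print(f"{tier.value:>32}: {tally[tier]}")
```

A tally like this is what produced the pattern on the slide: unsafe acts dominate, while supervisory and organizational codes barely appear.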

  23. Where are the latent failures? • On the surface, the foregoing data may suggest that there are relatively few latent failures in US commercial aviation • Another, more plausible, alternative is that their incidence is under-reported • If the case is one of under-reporting, then what?

  24. Error reporting and safety systems • Any safety management system (SMS) is only as good as the quality and quantity of the data (errors) that are reported • Accidents in and of themselves provide little information about the status of an organization's health with respect to safety • Poor error data leads to inconsistent results from interventions
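One practical consequence is that minor events need to be captured in a structured, analysable form, not just free text. Here is a minimal sketch of such a record; the field names are illustrative assumptions, not something prescribed by the presentation.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ErrorReport:
    """A minimal near-miss/error record for an SMS database.
    All field names are illustrative assumptions."""
    reported_on: date
    location: str
    task: str
    description: str                     # what happened
    contributing_factors: list[str] = field(default_factory=list)  # why
    injury: bool = False
    anonymous: bool = True               # lowers the fear-of-blame barrier

report = ErrorReport(
    reported_on=date(2007, 10, 3),
    location="crusher circuit",
    task="routine inspection",
    description="isolation step skipped; noticed before entry",
    contributing_factors=["time pressure", "unclear procedure"],
)
print(report)
```

Structured contributing-factor fields are what make later trend analysis possible; unreported events, or free text alone, support neither prevention nor mitigation.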

  25. Violations

  26. Intervention approaches (philosophies) [matrix figure] • Five avenues for intervention: • Organizational/Administrative: new rules, policies, procedures • Human/Crew: selection, training, incentives • Technology/Engineering: displays, automation, "bend metal" • Task/Mission: scheduling, risk, processes • Environment: facilities, weather, stressors

  27. Intervention approaches (philosophies) cont. [matrix figure] • The five approaches are crossed against the categories of unsafe acts • Errors: decision errors, skill-based errors, perceptual errors • Violations: routine and exceptional • Each cell of the matrix asks whether an intervention of that type can address an unsafe act of that type (see the sketch below)
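Here is a minimal sketch of that matrix as a data structure. The row and column labels come from the slides, while the example cell entries are invented.

```python
# Rows: categories of unsafe acts. Columns: intervention approaches.
UNSAFE_ACTS = [
    "decision errors", "skill-based errors", "perceptual errors",
    "routine violations", "exceptional violations",
]
APPROACHES = [
    "organizational/administrative", "human/crew",
    "technology/engineering", "task/mission", "environment",
]

# Empty matrix: one list of candidate interventions per cell.
hfix = {act: {a: [] for a in APPROACHES} for act in UNSAFE_ACTS}

# Invented example entries.
hfix["skill-based errors"]["human/crew"].append("refresher training")
hfix["perceptual errors"]["technology/engineering"].append(
    "improve cab visibility and display design")
hfix["routine violations"]["organizational/administrative"].append(
    "revise procedures that are routinely worked around")

# Walk the matrix cell by cell and list proposed interventions.
for act, row in hfix.items():
    for approach, items in row.items():
        for item in items:
            print(f"{act} x {approach}: {item}")
```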

  28. Safety management process [process figure, supported by services, science and data] • Hazard identification: HFACS® (Human Factors Analysis and Classification System): field tool, investigator training, analysis • Hazard assessment: identify vulnerabilities • Identify interventions: HFIX® (Human Factors Intervention matriX): focus groups, feasibility, prioritize • Intervention implementation: identify/develop HF programs, HF consulting • Intervention assessment • Monitor
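Read as a continuous loop, the process can be sketched as a simple state machine. The stage names follow the slide; treating "monitor" as feeding back into hazard identification is an assumption about the figure.

```python
# The safety management process as a state machine; the transition
# from "monitor" back to "hazard identification" is assumed.
NEXT_STAGE = {
    "hazard identification (HFACS)": "hazard assessment",
    "hazard assessment": "identify interventions (HFIX)",
    "identify interventions (HFIX)": "intervention implementation",
    "intervention implementation": "intervention assessment",
    "intervention assessment": "monitor",
    "monitor": "hazard identification (HFACS)",  # continuous improvement
}

stage = "hazard identification (HFACS)"
for _ in range(len(NEXT_STAGE) + 1):  # one full cycle, then the loop restarts
    print(stage)
    stage = NEXT_STAGE[stage]
```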

  29. Queries • Albert (Bert) Boquet, Department Chair, Human Factors and Systems, College of Arts and Sciences, Embry-Riddle Aeronautical University, Daytona Beach FL 32114-3900 USA • Tel: +1 386 226 7035 • boque007@erau.edu • www.embryriddle.edu
