
Human Factors Risk Culture



Presentation Transcript


  1. Management of Human Factors Risk in Safety-Critical Industries, Royal Aeronautical Society, 11th May 2006. Human Factors Risk Culture. James Reason, Emeritus Professor, University of Manchester.

  2. Expanding focus of safety concern across industries, 1955–2005 • The focus has widened from equipment failures (hardware and software), to unsafe acts (errors and violations), to system & cultural issues. • 1960s: metal fatigue, Aberfan, Ibrox. • 1970s: Flixborough, Seveso, Tenerife, TMI, Mt Erebus. • 1980s: Chernobyl, Zeebrugge, Bhopal, Piper Alpha, Dryden. • 1990s: Paddington, Long Island, Alabama, Estonia, Eschede. • 2000s: Linate, Überlingen, Columbia.

  3. The importance of culture • Though it has the definitional precision of a cloud: • Only culture can reach all parts of the system. • Only culture can exert a consistent influence, for good or ill.

  4. Culture: Two aspects • Something an organisation is: shared values and beliefs. • Something an organisation has: structures, practices, systems. • Changing practices is easier than changing values and beliefs.

  5. A safe culture: interlocking elements • Just culture • Reporting culture • Learning culture.

  6. Cultural ‘strata’ • GENERATIVE: Respects, anticipates and responds to risks. A just, learning, flexible, adaptive, prepared & informed culture. Strives for resilience. • PROACTIVE: Aware that ‘latent pathogens’ and ‘error traps’ lurk in the system. Seeks to eliminate them beforehand. Listens to ‘sharp enders’. • CALCULATIVE: Systems to manage safety, often in response to external pressures. Data harvested rather than used. ‘By the book’. • REACTIVE: Safety given attention after an event. Concern about adverse publicity. Establishes an incident reporting system. • PATHOLOGICAL: Blame, denial and the blinkered pursuit of excellence (Vulnerable System Syndrome). Financial targets prevail: cheaper/faster.

  7. Some barriers to cultural progression • Tradeoffs • Fixes • Silence • Denial • Blaming.

  8. Culture change: a continuum • Don’t accept the need for change. • Accept need, but don’t know where to go. • Know where to go, but not how to get there. • Know how, but doubt it can be achieved. • Make changes, but they are cosmetic only. • Make changes, but no benefits—model doesn’t align with real world. • Model aligns today, but not tomorrow. • Successful transition—model keeps in step with a changing world.

  9. Contrasting perspectives on the human factor • Person model vs system model • Human-as-hazard vs human-as-hero ‘Reliability (safety) is a dynamic non-event’ (Karl Weick)

  10. Getting the balance right: person model vs system model • Person model taken to the extreme: blame, deny, isolate. • System model taken to the extreme: learned helplessness. • Both extremes have their pitfalls.

  11. On the front line . . . • People at the sharp end have little opportunity to improve the system overall. • We need to make them more risk-aware and ‘error-wise’ – mental skills that will: • Allow them to recognise situations with high error/risk potential. • Improve their ability to detect and recover errors that are made.

  12. Risk-awareness on the front line: Lessons from various industries • Western Mining Corporation: ‘Take time, take charge’. • Thinksafe SAM: S = spot the hazard, A = assess the risk, M = make the changes. • Esso’s ‘Step back five by five’. • Defensive driver training. • Three-bucket assessments.

  13. The 3-bucket model for assessing risky situations • Three buckets: (1) SELF, (2) CONTEXT, (3) TASK.

  14. How the model works • In any given situation, the probability of unsafe act(s) being committed is a function of the amount of bad stuff in all three buckets. • Full buckets do not guarantee an unsafe act, nor do empty ones ensure safety. We are talking probabilities not certainties. • But with foreknowledge we can gauge these levels for any situation and act accordingly. • Don’t go there—challenge assumptions, seek help.
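The slides give no scoring scheme, so the following is only a minimal illustrative sketch of the idea that the likelihood of an unsafe act rises with the combined ‘bad stuff’ in the SELF, CONTEXT and TASK buckets. The 1–3 scale per bucket, the BucketScores and assess_buckets names, and the guidance thresholds are assumptions made for illustration; they are not part of Reason’s model, which deliberately speaks of probabilities rather than fixed cut-offs.

```python
from dataclasses import dataclass


@dataclass
class BucketScores:
    """Illustrative 1-3 'bad stuff' ratings for the three buckets.

    1 = little concern, 3 = serious concern. The numeric scale is an
    assumption for this sketch; the model itself only says to gauge
    the levels with foreknowledge and act accordingly.
    """
    self_state: int   # fatigue, inexperience, recent errors, stress
    context: int      # distractions, handovers, equipment, time pressure
    task: int         # complexity, novelty, known error traps


def assess_buckets(scores: BucketScores) -> str:
    """Turn combined bucket contents into rough guidance.

    Fuller buckets raise the probability of an unsafe act; they do not
    guarantee one, and near-empty buckets do not ensure safety.
    """
    total = scores.self_state + scores.context + scores.task  # range 3..9
    if total >= 7:
        return "High risk: don't go there; challenge assumptions, seek help."
    if total >= 5:
        return "Elevated risk: slow down, double-check, watch for error traps."
    return "Lower risk: proceed, but stay alert; low scores are no guarantee."


# Example: tired operator (3), busy ward or flight deck (2), routine task (1)
print(assess_buckets(BucketScores(self_state=3, context=2, task=1)))
```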

  15. Preaching risk awareness is not enough—needs system back up • Western Mining Corporation • Each day supervisors ask workers for examples of ‘take time take charge’. • What makes this happen is that, at weekly meetings with managers, supervisors provide examples of ‘take time take charge’. • Feedback to original reporters. • A manager at corporate level whose sole task is to make the process work.

  16. Resilience [Diagram: management provides system resilience (collective mindfulness); frontline operators provide local risk awareness (individual mindfulness); together they act as ‘harm absorbers’ for activities at the turbulent interface between the system and the world.]

  17. AB DA Recognition that blaming has no remedial value. Realisation that system model has been applied successfully in other domains (aviation). Reporting identifies error traps. Backlash from management, patients and lawyers. Renewed eagerness to automate fallible people out of the loop. Increased reliance on administrative controls. Reduced autonomy. CD BC Local fixes & workarounds hide system problems from mgt. Attempts to be helpful bypass system defences & barriers. Forgetting to be afraid. Doing too much with too little. Unlike aviation, healthcare is a very personal business. Recognise need for ‘error wisdom’ on the frontline. Surgical excellence = ability to detect and recover errors. Human as hazard Errors & violations D A Systemic factors concealed Systemic factors revealed Reduced variability cycle C B Human as hero Adjustments, compensations & improvisations

  18. Something to aim for? • It is hoped that as an organization learns and matures, variability will diminish. • The tensions and transitions implicit in the cycle will remain, but the perturbations should be less disruptive. • Eventually (one hopes), the person and system models will operate cooperatively rather than competitively. • Enhanced resilience (one hopes) will be an emergent property of this greater harmony.

  19. Summary • In all hazardous industries, systemic and cultural factors have played an increasingly prominent part in the understanding of safety. • It was argued that a balance needs to be struck between the system and person models. • The person model usually means ‘human as hazard’. But there is also ‘human as hero’. • A speculative cycle around the two-sided person and system axes was outlined.
