
Living with High Risk Technologies




  1. Living with High Risk Technologies Charles Perrow, “Normal Accidents”

  2. Technology • First Picture of Water on Mars!

  3. What is a Normal Accident?

  4. Outline • Definitions • Complexity and catastrophe • Looking at systems • Do risks outweigh benefits? • Conclusions

  5. Normal Accident • synonym for "inevitable accidents."

  6. Normal Accidents • Normal accidents in a particular system may be common or rare ("It is normal for us to die, but we only do it once."), but the system's characteristics make it inherently vulnerable to such accidents, hence their description as "normal."

  7. Failures • Discrete Failures • A single, specific, isolated failure is referred to as a "discrete" failure.

  8. Redundant Systems • Redundant sub-systems provide a backup, an alternate way to control a process or accomplish a task, that will work in the event that the primary method fails. This avoids the "single-point" failure modes.
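  The arithmetic behind this slide can be sketched in a few lines. This is my own illustrative example (the probabilities are made up, not from the presentation): if the primary and backup fail independently, the system only fails when both do, which is why redundancy eliminates single-point failure modes.

```python
# Illustrative sketch, not from the slides: failure probability of a task
# with and without an independent redundant backup.

def single_point_failure_prob(p_primary: float) -> float:
    """Without redundancy, the system fails whenever the primary fails."""
    return p_primary

def redundant_failure_prob(p_primary: float, p_backup: float) -> float:
    """With an independent backup, the system fails only if BOTH fail."""
    return p_primary * p_backup

# With hypothetical 1% failure rates, redundancy cuts the system failure
# probability by two orders of magnitude.
print(single_point_failure_prob(0.01))
print(redundant_failure_prob(0.01, 0.01))
```

  The multiplication step only holds when the failures really are independent, which is exactly the assumption the next slide undermines.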

  9. Interactive Complexity • A system in which two or more discrete failures can interact in unexpected ways is described as "interactively complex." In many cases, these unexpected interactions can affect supposedly redundant sub-systems. A sufficiently complex system can be expected to have many such unanticipated failure mode interactions, making it vulnerable to normal accidents.
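  The point about unexpected interactions defeating redundancy can be made concrete with a toy Monte Carlo model. This is my own sketch (the rates and the "common-cause" structure are assumptions, not Perrow's model): a single hidden interaction that disables both the primary and its backup at once can dominate the carefully multiplied independent-failure probability.

```python
import random

# Toy sketch (illustrative assumptions): a shared-cause "interaction" event
# takes out both the primary and the supposedly redundant backup together.

def system_fails(p_independent: float, p_common: float) -> bool:
    if random.random() < p_common:            # hidden interaction hits both
        return True
    primary_fails = random.random() < p_independent
    backup_fails = random.random() < p_independent
    return primary_fails and backup_fails     # independent double failure

def estimate(p_independent: float, p_common: float, trials: int = 100_000) -> float:
    random.seed(0)  # fixed seed so the estimate is reproducible
    return sum(system_fails(p_independent, p_common) for _ in range(trials)) / trials

print(estimate(0.01, 0.0))     # roughly 1e-4: redundancy works as designed
print(estimate(0.01, 0.001))   # roughly 1e-3: one small interaction dominates
```

  Even a 0.1% chance of an unanticipated interaction swamps the benefit of the backup, which is the sense in which complex systems are vulnerable to normal accidents.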

  10. Tight Coupling • The sub-components of a tightly coupled system have prompt and major impacts on each other. If what happens in one part has little impact on another part, or if everything happens slowly (in particular, slowly on the scale of human thinking times), the system is not described as "tightly coupled." Tight coupling also raises the odds that operator intervention will make things worse, since the true nature of the problem may well not be understood correctly.
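  The time-scale argument in this slide can be sketched numerically. The numbers here are my own illustrative assumptions (not from the presentation): what matters is the ratio between how fast a disturbance propagates and how long an operator needs to diagnose it.

```python
# Illustrative sketch: in a tightly coupled system a disturbance reaches
# downstream stages faster than an operator can understand the problem.

def stages_reached(propagation_s: float, diagnosis_s: float, n_stages: int) -> int:
    """How many downstream stages the disturbance hits before diagnosis."""
    return min(n_stages, int(diagnosis_s // propagation_s))

# Hypothetical plant with 10 stages and a 60-second diagnosis time.
tight = stages_reached(propagation_s=2, diagnosis_s=60, n_stages=10)
loose = stages_reached(propagation_s=600, diagnosis_s=60, n_stages=10)
print(tight, loose)  # prints "10 0"
```

  In the tightly coupled case the whole system is affected before anyone understands what is happening, so intervention based on a wrong diagnosis can easily make things worse.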

  11. Normal Accident • Premise: interactive complexity and tight coupling mean unexpected interactions will occur and an accident will result. • Based on the characteristics of the system, not on frequency.

  12. NASA View • NASA nominally works with the theory that accidents can be prevented through good organizational design and management. • Normal accident theory suggests that in complex, tightly coupled systems, accidents are inevitable. • There are many activities underway to strengthen our safety posture. • NASA's new thrust in the analysis of close-calls provides insight into the unplanned and unimaginable. • To defend against normal accidents, we must understand the complex interactions of our programs, analyze close-calls and

  13. Definitions • Complexity – levels of system and organization • Coupling – how closely the systems interact • Redundant pathway – backup system that would prevent accidents • High Risk – event with catastrophic potential

  14. Definitions • Discrete Failures – failures of isolated single systems • Interactive Complexity

  15. Definitions • Systems • Individual components • Interactions • Feedback systems

  16. Questionnaire… • Human error results in most accidents. • Mechanical failure is the highest cause of accidents. • The environment impacts the accident. • Design of the system is the most important prevention. • Procedures are most important.

  17. Answers • Eighty percent of Accidents are caused by human error.

  18. High Risk Systems • Creating systems • Organizations • Sub-organizations • Understanding how they interact? • Understanding the risk?

  19. Systems • Human Interface – complexity/saturation

  20. Three Mile Island • Four distinct failures: • Cooling system • Valves closed • Pilot Operated Relief Valve sticks open • False indicators • These four occurred in 13 seconds

  21. Diagram: Three Mile Island

  22. Three Mile Island • The hydrogen bubble: hydrogen produced from the zirconium • Appeared 33 hours into the accident • Overpressure was ½ the design strength

  23. Errors • Familiar with System • System Design flaws

  24. Risk & Benefits • Benefit of understanding • Reduce dangers – could TMI happen again? • Remove the dangers • Better operator training (Three E's) • More quality control • Effective regulation

  25. High Risk Systems • Operating experience – not sufficient • Construction – pressure to build • Safer designs = less vulnerability? • Defense in depth (nuclear term)

  26. Characteristics • High-Risk Technologies Characteristics (Beyond the toxic, explosive dangers) • Complexity • Coupling

  27. Definitions • Complexity A system in which two or more discrete failures can interact in unexpected ways is described as "interactively complex." In many cases, these unexpected interactions can affect supposedly redundant sub-systems. A sufficiently complex system can be expected to have many such unanticipated failure mode interactions, making it vulnerable to normal accidents.

  28. Coupling • Coupling The sub-components of a tightly coupled system have prompt and major impacts on each other. If what happens in one part has little impact on another part, or if everything happens slowly (in particular, slowly on the scale of human thinking times), the system is not described as "tightly coupled." Tight coupling also raises the odds that operator intervention will make things worse, since the true nature of the problem may well not be understood correctly.

  29. High Complexity • X fails, Y was out of order • Interaction → unexpected outcome • Piper Alpha

  30. High Complexity • X fails, Y fails, Z was out of order • Interaction → unexpected outcome • Bhopal

  31. Learning from Mistakes? • Numerous examples given. • High Risk systems still in use • Still at risk? • How do we evaluate this?

  32. Complexity • Low Complexity – (Linear systems, near linear) • Result: Accident will not spread or be as serious.

  33. High Complexity Systems • Not all Interactions known • Some failure points not identified

  34. Normal Accidents • Why haven’t we had more?

  35. Low Complexity Characteristics • Low complexity (organization) • Additional resources available • Time to spare • Other ways to accomplish task

  36. High Complexity - Organizations • Large organization • Slow for action • Complex Systems • Interconnection • Contradictions

  37. CMM • Definition – Complexity Maturity Model • Reference • Handout

  38. CMM Scoring • One Method

  39. High Complexity - CMM

  40. Coupling Definition: Example:

  41. Coupling • Coupling (High) • Processes happen fast • Can't be turned off • Failed parts can't be isolated • No other way to keep production going safely

  42. High Coupling - decisions • Reluctant to shut down • $ is driver?? • Politics? • Production? • Unable to shut down process • Cost to shut down • Pressure to shut down • Damage to shut down

  43. Cost of Shut Down • $300 million to shut down a nuclear power plant • License good for 40 years only

  44. Coupling • Coupling Results: • Recovery is not possible • Disturbance spreads quickly • Irretrievable Results • Operator Action may make it worse

  45. How it Happens? • Normal Accident: Interactive Complexity and Tight Coupling

  46. High Complexity and Coupling • Examples: • Nuclear Power Plants • Laboratories • Industrial Processes

  47. Complex and Linear Interactions • Diagram: an event disrupts two sub-systems of System 1 simultaneously; the complex interaction is invisible, the linear interaction visible.

  48. Example • Chernobyl • Hot spot was not visible • Graphite rod effects
