
Accident Causes: Theoretical Framework


Presentation Transcript


  1. Accident Causes: Theoretical Framework. Maurino, Reason, Johnston, Lee. Consequences are Dire!

  2. Terminology • Organizational accident • Latent failure • Local trigger • Active failure • Proximal cause • Principal cause • Unsafe acts- errors and violations

  3. Individual or Collective errors • The issue of whether accidents are individually caused or collectively caused revolves around three dimensions: • Moral • Scientific • Practical

  4. Moral Issue- much to be gained by blaming individuals • Easier to pin legal responsibility on individuals- the connection is more direct • The issue is compounded by professionals willing to accept responsibility (the captain, etc.) • Most people highly value personal autonomy- “they should have known better” • We assume big failures result from big mistakes rather than from several small ones • There is emotional satisfaction in blaming someone

  5. The Scientific Dimension- do we stop with the people directly involved or keep going back? • Why stop at organizational roots? Why not go back to the beginning of creation? • The answer should be practical- go back only as far as is needed to change organizational behavior • The peculiar nature of accidents- they initially appear to be the convergence of many failures, but we would see the same in any organization frozen in time- why, then, are accidents rare?

  6. What then about the practical? • Moral issue- favors the individual approach • Scientific issue- undecided • The answer here depends on two factors: • whether latent factors can be identified and stopped prior to an accident • the degree to which improvements can better equip the organization to deal with local failures

  7. What have we learned from complex system failures? • The contribution of human error to technology breakdowns has increased fourfold in 30 years • Failures are not restricted to the sharp end • How do we design a theoretical framework for the origin of organizational accidents?

  8. Step One- building blocks • What do all complex technological systems have in common? • An organization: Fig./Table 1-1 • Processes • Cultures (p. 7)- the common starting point for failure pathways • Local conditions- cockpit/tower- where organizational decisions meet the road • Defenses/safeguards

  9. Local Conditions- errors and violations • Those related to the task and its environment • Those related to people’s mental and physical states • These can both be sub-divided into three groups: error factors, violation factors, and common (to both) factors

  10. Defenses and safeguards • Checklists- redundant technology- human backups (copilot) • Two elements to defenses and safeguards in high-tech equipment: • Automation- increases efficiency by replacing fallible humans • Humans- restore order in the event of an automation foul-up; they must think on their feet in less-than-ideal conditions, which we are not good at

  11. Defenses and Safeguards (cont.) • Classified along two dimensions • Functions served: • Create awareness of hazards • Detect and warn of the presence of hazards • Protect people and the environment • Recover from off-normal conditions and restore the system • Enable victim escape • Contain hazardous materials

  12. (Cont.) • Modes of application: • Engineered safety devices (FMS, GPWS) • Policies, standards/controls • Procedures, instructions, supervision • Training, debriefing, practice • Protective equipment- oxygen masks

  13. Step Two- Active and Latent Failures (Fig. 1.3, p. 13) • Distinguished in two ways: • The length of time it takes a failure to reveal its adverse effects- active failures are immediate, whereas latent failures can lie dormant for years • Who creates them • Active- line personnel: pilots, controllers, mechanics • Latent- managerial/organizational: those separated in time and space from the immediate human-system interface.

  14. Active Failures • Committed by those at the sharp end- usually caught by the system's defenses, but may occur in conjunction with other failures, or in less defended systems, to cause an accident. • Active failures may create gaps in the system- e.g., not having the plane de-iced prior to take-off

  15. Latent Failures • Due to loopholes in defenses which may exist for some time and may combine with active failures to produce a “trajectory of opportunity” for an accident. • Most are discovered after a defense has failed- not necessarily in an accident • Usually revealed retrospectively- the key is to identify them prospectively

  16. Active/Latent (cont.) • They also differ in the basis needed to classify them • Active failures- described in terms of their psychological origins • Latent failures- described in systemic terms

  17. Active failures • Occur at three levels- skill-based, rule-based, and knowledge-based- which are distinguished along two dimensions: • conscious to automatic • routine to problematic (Fig. 1.4) • Combined, these give us an “activity space”

  18. Active Failures (cont.) • Skill-based- highly practiced tasks, little thought, largely automatic • Rule-based- we detect a need for a behavior change and apply a pre-packaged solution, e.g. an emergency checklist • Knowledge-based- when all else fails; very error-prone, especially in an emergency (United 232). (See the sketch below.)
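A minimal sketch (not from the original slides) of how the two dimensions of Fig. 1.4 can be read as an “activity space”: the mode of cognitive control (automatic vs. conscious) and the nature of the situation (routine vs. problematic) together locate performance at the skill-, rule-, or knowledge-based level. The enum names and the performance_level function are assumptions made only for this illustration.

```python
from enum import Enum

class Control(Enum):
    AUTOMATIC = "automatic"      # little or no conscious thought
    CONSCIOUS = "conscious"      # deliberate, effortful reasoning

class Situation(Enum):
    ROUTINE = "routine"              # familiar, expected circumstances
    PROBLEMATIC = "problematic"      # a change or problem has been detected

def performance_level(control: Control, situation: Situation) -> str:
    """Place a moment of activity within the 'activity space' (illustrative mapping)."""
    if situation is Situation.ROUTINE and control is Control.AUTOMATIC:
        return "skill-based"       # highly practiced tasks, largely automatic
    if situation is Situation.PROBLEMATIC and control is Control.AUTOMATIC:
        return "rule-based"        # pre-packaged solution, e.g. an emergency checklist
    return "knowledge-based"       # when all else fails: conscious reasoning, error-prone

# A novel emergency with no applicable procedure, worked out consciously:
print(performance_level(Control.CONSCIOUS, Situation.PROBLEMATIC))  # -> knowledge-based
```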

  19. Errors vs. violations • Errors- the failure of planned actions to achieve their desired consequences • The plan is adequate but the actions deviate from it (a slip)- a failure of execution • The actions conform to the plan but the plan is inappropriate (a mistake)- a failure of formulation

  20. Violations • Deviations from safe operating practices/rules • Deliberate • Erroneous (e.g., speeding without being aware of it) • Deliberate violations are of most interest: the actions were intended, but not necessarily the bad consequences.

  21. Violations vs. errors • Errors are unintended • Errors derive mainly from informational problems (forgetting, inattention, incomplete knowledge); violations derive largely from motivational problems (poor morale, failure to reward compliance and to sanction non-compliance)

  22. (Cont.) • Errors arise in the mind of an individual, whereas violations occur in a social context • Errors can be reduced by improving the quality of information- violations require motivational remedies

  23. Three types of errors and violations • Skill-based slips and lapses: • Attentional slips- failure to monitor the progress of routine actions at some critical point • Memory lapses- forgetfulness; the most common type of active failure • Perceptual errors- misrecognizing some object; we see what we expect to see • Most slips and lapses have minimal consequences (answering “fine” to “hello”), but in the cockpit they can be dire

  24. Rule-based mistakes • Misapplication of good rules- e.g., braking to avoid a deer on an icy road; humans tend to apply solutions to familiar problems on the basis of largely automatic pattern matching • Application of bad rules- learned shortcuts and cut corners; usually circumstances are forgiving and you “get by with it”

  25. Knowledge-based mistakes • Due to: • the limited capacity of working memory • incomplete mental models of the problem • Thinking on one’s feet- confirmation bias (bending the facts to fit a hasty conclusion), over-confidence, similarity bias, and frequency bias

  26. Violations at the skill-based level • Again- corner-cutting promoted by a largely indifferent environment

  27. Violations at the rule-based level • More deliberate than skill-based violations • (pp. 20-21)

  28. Knowledge-based violations • A novel circumstance with no specified procedure • Trainers and procedure writers can only address foreseeable situations • Usually involve the unexpected occurrence of a rare but trained-for situation, or an unlikely combination of individually familiar circumstances

  29. Step #3- Accidental events • Event- a complete or partial penetration of an accident trajectory through the system’s defensive layers • Active and latent failure pathways come together to create complete or partial trajectories of accident opportunity • Local triggers also interact here

  30. Gaps in defenses • Longstanding gaps due to dormant weaknesses • Gaps created knowingly, as during maintenance • Gaps created by active failures • An accident occurs when the holes in the defenses line up (the holes are dynamic- see the sketch below) • What may cause an accident one day may not on another day • Consequences range from a free lesson to a smoking hole; in order to learn we must identify the “organizational pathogens”
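The “holes lining up” idea can be made concrete with a small simulation. The Python sketch below is illustrative only and is not from the source: each defensive layer is modeled as having some momentary probability of containing a hole, and an accident trajectory is complete only when every layer happens to be breached at once. The function name, the probabilities, and the trial count are assumptions chosen for the example.

```python
import random

def trajectory_penetrates(layer_hole_probs, rng=random.random):
    """Return True if a hazard passes through a hole in every defensive layer.

    Each entry in layer_hole_probs is the chance that the corresponding layer
    has a 'hole' (a dormant weakness, a gap opened during maintenance, or an
    active failure) at this particular moment. Because the holes are dynamic,
    the same system may block the hazard one day and let it through another.
    """
    return all(rng() < p for p in layer_hole_probs)

# Illustrative, deliberately pessimistic numbers: five layers of defense,
# each with a hole 20% of the time.
layers = [0.2] * 5
trials = 100_000
accidents = sum(trajectory_penetrates(layers) for _ in range(trials))
print(f"Complete penetrations: {accidents} of {trials} trials")
# Most hazards are stopped somewhere along the way (a "free lesson");
# only the rare alignment of every hole produces the "smoking hole".
```

Even with these deliberately pessimistic numbers, only a few dozen of the 100,000 trials penetrate every layer; the same weaknesses that produce an accident in one trial are blocked in almost all the others, which mirrors the point on the slide above.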

  31. Causal Pathways- Step #4 • Fig. 1.9- accidents have varying characters • Some involve all latent failures- Challenger • Some involve all active failures- possibly EgyptAir 990 • Most involve some combination of both • Less defended organizations tend to have failures along the active pathway, and vice versa (where a single active failure can serve as a trigger)

  32. In closing: • Cicero stated, “To err is human” • Accidents result from a failure of the risk-management system to absorb the consequences of unsafe acts and omissions • Human error is stubborn- sophisticated, discrete solutions to human error will likely lead to more sophisticated sources of error

  33. Closing (cont.): • We humans often judge people’s actions individually rather than as part of a system • This leads to backward reasoning (from the accident), which ultimately finds a stage where the chain could have been broken; thus “pilot (operator) error” becomes an easy out, and we learn little

  34. Summary • P. 28
