SAFE 541 Class 3 Spring 2011


Presentation Transcript


  1. SAFE 541 Class 3 Spring 2011

  2. Objectives
  Students will be able to:
  • List items in an AI (accident investigation) plan
  • List items to include in an AI kit
  • Explain why human error can be a cause or a symptom of a system problem
  • Distinguish between active and latent errors
  • Convince management of the need for identifying symptom errors
  • Begin to identify personal biases
  • Avoid blame as an outcome of AI based on this theory

  3. Accident Investigation Process
  The accident investigation process involves the following steps:
  • Report the accident occurrence to a designated person within the organization
  • Provide first aid and medical care to injured person(s) and prevent further injuries or damage
  • Investigate the accident
  • Identify the causes
  • Report the findings
  • Develop a plan for corrective action
  • Implement the plan
  • Evaluate the effectiveness of the corrective action
  • Make changes for continuous improvement
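As a concrete illustration (not from the slides), the ordered nature of these steps can be sketched in code: the process only works if no step is skipped or taken out of sequence. All names here are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the nine-step process above, tracked as an
# ordered checklist so no step is skipped or taken out of sequence.
STEPS = [
    "Report the occurrence to a designated person",
    "Provide first aid / medical care; prevent further injury or damage",
    "Investigate the accident",
    "Identify the causes",
    "Report the findings",
    "Develop a plan for corrective action",
    "Implement the plan",
    "Evaluate the effectiveness of the corrective action",
    "Make changes for continuous improvement",
]

@dataclass
class Investigation:
    completed: list = field(default_factory=list)

    def complete(self, step: str) -> None:
        # Enforce the prescribed order of the process.
        expected = STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"Out of order: the next step is '{expected}'")
        self.completed.append(step)

inv = Investigation()
inv.complete("Report the occurrence to a designated person")
print(f"{len(inv.completed)}/{len(STEPS)} steps done")
```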

  4. AI Plan: Let's Discuss
  This plan should address:
  • Investigator training (Where?)
  • Investigation kits (What?)
  • The investigation priorities (Which ones?)
  • Gathering of evidence (Who?)
  • Preservation of evidence (Who and how?)

  5. Human Error and Accident Management (HEAM)
  • Offers ways and means to recognize and prevent these behaviors (errors)
  • Provides a means to control and recover from these behaviors when they do occur, and to contain and escape from their adverse consequences
  • A relatively new approach (the last 10 years)
  • Error theory itself is not new (it originally focused on moral decisions)
  • Early work focused only on unsafe acts (cause vs. system)
  • Origins in major disasters: Three Mile Island, aviation accidents, the Challenger disaster
  • Caution: avoid the blame game

  6. Accidents and Human Errors
  One view: human error is the cause of accidents
  • To explain a failure, you look for a failure
  • You must find the person's inaccurate assessments, wrong decisions, and bad judgments
  The competing view: human error is a symptom of trouble deeper inside a system
  • To explain failure, do not try to find where people went wrong
  • Instead, find out how the person's assessments and actions made sense at the time, given the circumstances that surrounded them

  7. Participation Opportunity
  Concept check: are we all on the "same page"?
  • What is your concept of human error?
  • Give examples of human error

  8. Examples
  • Forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness (J. Reason, Western Journal of Medicine, June 2000)
  • Four categories according to James Reason:
    • Slips
    • Lapses
    • Violations
    • Mistakes
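To keep Reason's four categories straight, here is a small illustrative sketch; the one-line glosses and parenthetical examples are standard textbook definitions and my own assumptions, not taken from the slides.

```python
from enum import Enum

# Reason's four categories of unsafe acts. The glosses and examples
# are illustrative assumptions, not taken from the slides.
class UnsafeAct(Enum):
    SLIP = "right plan, wrong execution (e.g., pressing the wrong button)"
    LAPSE = "memory failure (e.g., skipping a checklist step)"
    VIOLATION = "deliberate deviation from a rule or procedure"
    MISTAKE = "wrong plan carried out as intended (e.g., a misdiagnosis)"

for act in UnsafeAct:
    print(f"{act.name.lower()}: {act.value}")
```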

  9. Levels of Human Errors
  • Random versus systemic errors
  • What's the difference?
  • Is one type easier to control than the other?

  10. Human Error Theory (see James Reason)
  • Based on aviation accidents (pilot error)
  • Active (human) error:
    • Cognitive error
    • Distraction
    • Inattention
  • Latent (systemic) error:
    • Inadequate supervision

  11. Active Errors
  • Active errors become highly visible in the evolution of an event.
  • They are also the most obvious occurrences, and the most rapidly identified human contributions to an accident.

  12. Latent Errors
  • The higher in the organization these latent errors are made, the more serious the consequences at the front-line operation.
  • Latent errors of a strategic nature, such as those made when defining company policies, affect safety attitudes and the safety culture of the organization.
  • These are the most serious and dangerous errors to be tackled.
  • The literature also uses the terms "covert failure" and "operationally invisible."
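The "Swiss cheese" model cited in the sources slide makes the active/latent distinction concrete: an accident trajectory must find a hole in every defensive layer. A minimal sketch, with made-up failure probabilities and an independence assumption:

```python
from math import prod

# Swiss-cheese-style sketch: if defensive layers fail independently,
# the probability that an accident penetrates all of them is the
# product of the per-layer failure probabilities. Numbers are assumed.
layers = {
    "engineered safeguards": 0.01,
    "supervision": 0.05,          # a latent weakness lives here
    "procedures and training": 0.02,
    "front-line operator": 0.10,  # active errors occur at the sharp end
}

print(f"P(accident) = {prod(layers.values()):.1e}")  # 1.0e-06

# A latent error (e.g., inadequate supervision) widens one hole for
# every future operation, multiplying the risk across the board.
layers["supervision"] = 0.50
print(f"P(accident, degraded supervision) = {prod(layers.values()):.1e}")  # 1.0e-05
```

This is why the slide calls latent errors the most serious: an active error breaches one layer once, while a latent error degrades a layer for every operation that follows.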

  13. Other Techniques
  • Technique for Human Error Rate Prediction (THERP): a quantitative human reliability analysis method
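A minimal sketch of the THERP idea, using illustrative numbers rather than values from the actual THERP handbook tables: decompose a task into steps, assign each a nominal human error probability (HEP), scale by a performance shaping factor (PSF, e.g., stress or poor labeling), and combine the steps in series.

```python
from math import prod

# THERP-style sketch. HEPs and PSFs below are illustrative assumptions,
# not values from the THERP handbook (NUREG/CR-1278).
steps = [
    # (description, nominal HEP, PSF multiplier)
    ("read the gauge",           0.003, 1.0),
    ("select the correct valve", 0.001, 2.0),  # similar valves nearby
    ("verify valve position",    0.010, 1.0),
]

# P(task succeeds) = product of per-step success probabilities,
# assuming independent steps that must all succeed.
p_success = prod(1.0 - min(hep * psf, 1.0) for _, hep, psf in steps)
print(f"P(task error) = {1.0 - p_success:.4f}")  # about 0.0149
```

Full THERP combines steps through an event tree and models dependency between steps explicitly; the series product above is the simplest special case.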

  14. Participation Opportunity
  Accident investigation process:
  • What are some ways you, as an investigator, can identify human errors as they contribute to the accident sequence?
  • Are human errors the root causes of accidents? Why or why not?
  • What role does your knowledge about human error play in your investigation process?

  15. Questions for probing the reasons for events that appear to be caused by human error
  • Was the possibility of the error known? *
  • Were the potential consequences of the error known? *
  • What about the activity made it prone to the occurrence of the error?
  • What about the situation contributed to the creation of the error?
  • Was there an opportunity to prevent the error prior to its occurrence? *
  • Once the error was committed, was there any way to recover from it? *
  • What about the system sustained the error instead of terminating it?
  • What fed the error and drove it to become a bigger problem?
  • What made the consequences as bad as they were?
  • What (if anything) kept the consequences from being worse?
  * If YES, why did the event proceed beyond this point? If NO, why not?
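The asterisked items carry a built-in follow-up, which makes the checklist easy to encode. A hypothetical sketch of that branch logic, covering a subset of the questions above:

```python
# Hypothetical sketch of the probing-question checklist above. Entries
# with follow_up=True carry the slide's asterisk: whatever the yes/no
# answer, the investigator must explain why.
QUESTIONS = [
    ("Was the possibility of the error known?", True),
    ("Were the potential consequences of the error known?", True),
    ("Was there an opportunity to prevent the error prior to its occurrence?", True),
    ("Once the error was committed, was there any way to recover from it?", True),
    ("What about the system sustained the error instead of terminating it?", False),
]

def probe(question: str, follow_up: bool, answer: str) -> list:
    prompts = [f"{question} -> {answer}"]
    if follow_up:
        # The slide's asterisk rule: YES and NO each demand an explanation.
        prompts.append("Why did the event proceed beyond this point?"
                       if answer == "yes" else "Why not?")
    return prompts

for line in probe(*QUESTIONS[0], answer="yes"):
    print(line)
```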

  16. Human Error Risk Management for Engineering Systems

  17. Conclusions
  Based on tonight's discussion, you should be able to:
  • List items in an AI plan
  • List items to include in an AI kit
  • Explain why human error can be a cause or a symptom of a system problem
  • Distinguish between active and latent errors
  • Convince management of the need for identifying symptom errors
  • Begin to identify personal biases
  • Avoid blame as an outcome of AI based on this theory

  18. Questions & comments

  19. Sources
  • http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1070929/ (J. Reason, Western Journal of Medicine, June 2000)
  • http://www.eurocontrol.int/eec/gallery/content/public/document/eec/report/2006/017_Swiss_Cheese_Model.pdf (EUROCONTROL report on the Swiss Cheese Model, 2006)
