
Applying Error and Threat Management Concepts to Soaring Operations


Presentation Transcript


  1. Applying Error and Threat Management Concepts to Soaring Operations
  Key Dismukes, Chief Scientist for Aerospace Human Factors, NASA Ames Research Center

  2. Most Aviation Accidents Attributed to Pilot Error
  • What does this mean? Lack of skill, vigilance, or conscientiousness? OR inherent vulnerability to error?
  • Substantial body of scientific research on pilot performance, error, and safety
  • Mostly directed at airline operations
  • We lack a database of human factors in soaring accidents (only NTSB and ASRS reports)
  • Many principles from airline and military flight safety can be applied to soaring
  • Will illustrate with soaring examples

  3. Dispense with Fallacies
  Fallacy: Pilots who have accidents lack “the right stuff”
  Truth:
  - Not supported by data
  - With increasing skill, pilots take on greater challenges
  - Experts in all professions make errors
  Fallacy: A flight operation is either “safe” or “unsafe”
  Truth:
  - Every operation has a finite degree of risk
  - Identify and assess risks and develop a plan of action

  4. Line Operations Safety Audits (LOSA)
  • Airline crews typically make one or more errors on most flights
  • In spite of training, skill, and frequent practice
  • Humans perform tasks far beyond the capabilities of computers
  - Work with incomplete and ambiguous data, interpret diverse data sets, project downstream consequences
  • Humans are in airline cockpits to deal with the unexpected
  • The cognitive features that enable unique human abilities also produce vulnerability to characteristic forms of error
  • No matter how skillful and conscientious the pilot, the error rate is never zero

  5. Research on Two Major Domains of Pilot Error
  1) Judgment and decision-making
  2) Prospective memory - remembering to perform intended actions
  • Talk will be technical in places
  • Illustrate with real-life soaring situations
  • Groundwork for practical countermeasures
  • Address mistakes of experts, not novices
  • Level of expertise: private pilot rating in gliders
  • How cognitive processes operate at this level of expertise

  6. Judgment and Decision-Making
  • Most common human factor cited by NTSB
  • But this reflects hindsight bias
  • Experts typically make decisions in ways other than formal analysis
  Formal analysis:
  1) listing and evaluating all relevant aspects of the situation
  2) identifying all relevant options
  3) assessing the pros and cons of each option
  4) assigning a weighted score to each option (a minimal sketch of this follows below)
  • Gary Klein et al. studied firefighter scene commanders: “recognition-primed decision-making”
  - Quickly scanned the scene
  - Identified the situation in terms of prototypes from memory
  - An appropriate solution pops into mind
  • Naturalistic decision-making
  - Demonstrated in the work of professionals in many fields
  - Not something we choose -- it happens automatically as we develop experience
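To make the four formal-analysis steps concrete, here is a minimal sketch of a weighted decision matrix in Python. The options, criteria, and weights (a hypothetical off-field landing choice) are invented for illustration and are not from the talk.

```python
# Minimal weighted decision matrix illustrating the four formal-analysis steps.
# All options, criteria, weights, and ratings below are hypothetical examples.

criteria_weights = {              # step 1: relevant aspects of the situation, with weights
    "field length": 0.4,
    "surface condition": 0.3,
    "wind alignment": 0.2,
    "retrieve access": 0.1,
}

options = {                       # step 2: relevant options, rated 0-10 on each criterion
    "plowed field nearby": {"field length": 8, "surface condition": 4,
                            "wind alignment": 7, "retrieve access": 6},
    "airstrip 10 km away": {"field length": 9, "surface condition": 9,
                            "wind alignment": 5, "retrieve access": 9},
}

def weighted_score(ratings):      # steps 3-4: weigh pros and cons and combine into one score
    return sum(criteria_weights[c] * ratings[c] for c in criteria_weights)

for name, ratings in options.items():
    print(f"{name}: {weighted_score(ratings):.1f}")
```

The point of the sketch is the contrast drawn on the next slide: this kind of scoring is slow, serial, and effortful, which is exactly why it is hard to apply under time pressure.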

  7. Comparison
  Formal problem-solving methods:
  • Slow
  • Serial processing
  • Require substantial mental effort
  • Difficult to use under time pressure or stress
  (But can be used to uncover hidden aspects and downstream consequences)
  Naturalistic decision-making:
  • Automatic
  • Fast
  • Parallel processing
  • Less mental effort
  (But subject to characteristic biases that can cause errors)

  8. Cognitive Biases and Heuristics
  Anchoring and adjustment • Assimilation bias • Availability heuristic • Base rate fallacy • Confirmation bias • Conjunction fallacy • Conservatism • Endowment effect (aka status quo bias) • Escalation of commitment • Estimation overconfidence • Expectation bias • Familiarity bias (similar to assimilation bias) • Favoring causal explanations • Framing effect • Fundamental attribution bias • Future discounting • Omission bias • Prior hypothesis bias • Reasoning by analogy • Recency bias • Representativeness heuristic

  9. Aero-retrieve from Indian Valley Airport: a personal experience

  10. Cognitive Factors Contributing to My Flawed Judgment and Decision
  Representativeness bias
  • Situation framed by matching similar prototypes in memory
  • A fair amount of experience with off-field landings (50-60?)
  • Appeared to be a good match to previous retrieves
  Statistical sampling problem
  • Previous successes may not give the true risk probability (wrong in either direction)
  Recognition-primed decision-making
  • Does not inform the individual of the downstream consequences of novel actions: cocking the tow plane
  • Does not anticipate the consequences of novel combinations of factors: loose dirt, cocking the tow plane, weeds, narrow runway, gullies
  Sunk costs: invested time in the aero-retrieve
  Framing of situation and goal: “make the aero-retrieve work”
  Wishful thinking: the more attractive option (aero-retrieve) may have biased perception of risks

  11. Implications of this Episode
  • Natural decision methods:
  - Fail to alert us to hidden dangers
  - Sometimes bias perception of the degree of risk of options
  • Judith Orasanu: “plan continuation error”
  - Failure to revise the original plan when circumstances change
  - One of the most common forms of error in airline accidents

  12. Ways to Reduce Vulnerability to Decision Biases
  • Be aware of the limitations and biases inherent in normal cognitive processes
  • Explicitly identify your assumptions
  - Example: wind direction for an off-field landing
  - Voicing assumptions may prompt the pilot to search for contrary evidence
  - Share assumptions with other personnel
  • Ask: “What if …?”
  • Ask: “Is anything different today from previous encounters?”
  • If several unusual aspects are present, think how they might interact to produce downstream consequences
  - Individual factors may be benign but combine to produce a threat
  • Always have a back door

  13. Prospective Memory (PM)
  • Remembering to perform intended actions
  • Common in everyday life - in aviation it can be fatal
  • Several major airline accidents
  • Several friends killed or injured because they failed to hook up a control rod during assembly
  • Will discuss prototypical PM situations in soaring
  • Errors most likely in the presence of interruptions, distractions, or concurrent task demands

  14. Prospective Memory: Prototypical Situations
  1. Performing Habitual Tasks from Memory
  • Habitual tasks are deeply encoded as procedural memory
  • Not likely to forget how to perform the task
  • Vulnerable to omitting a step when interrupted or distracted
  • Habitual tasks (e.g., assembling a frequently used sailplane) require minimal mental effort: each step of the task “pops” into mind automatically
  • Executing each step automatically triggers retrieval of the next step from memory
  • Environmental stimuli (e.g., parts of the sailplane) help prompt retrieval of each step
  • Sequential execution of steps and environmental stimuli work together

  15. Prospective Memory: Opportunities for Error While Performing Habitual Tasks
  Example (1): Interruptions while assembling a sailplane
  • ASW 20 horizontal stab is held to the vertical stab by a bolt
  • Interruption after finger-tightening the bolt
  - Breaks the chain of one step triggering the next
  - Removes the environmental stimulus (view of the bolt)
  - Distinct chance you will jump to the next step, forgetting to cinch down the bolt
  Example (2): Landing with the gear retracted
  • Several scenarios, for example:
  - You develop the habit of extending the gear on downwind; the visual scene provides cues that trigger retrieval from memory
  - A straight-in approach removes the normal visual cues
  - Interruptions, distractions, or high workload can have the same effect

  16. Prospective Memory: Prototypical Situations
  2. Intending to Perform a Non-habitual Task that Must Be Deferred
  Example: During assembly you notice the main gear tire is low
  • Decide to fill it with air after pushing to the launch area
  • Get busy with launch preparations and forget to fill the tire
  Why do we forget?
  • The brain has no special mechanisms for these situations
  How do we (sometimes) remember to perform deferred intentions?
  • Depends on noticing environmental cues linked in memory to the intention: cues trigger retrieval from memory
  • A happenstance process
  • Example: you might notice the air bottle by the hangar

  17. Prospective Memory: Prototypical Situations
  3. Interleaving Several Tasks Concurrently
  Example: Entering data or commands in the flight computer
  • Going head-down interrupts the outside visual scan
  • Similar to using a cell phone while driving
  • Mentally engaging tasks fully occupy consciousness (focal attention), momentarily crowding out the outside scan
  • Eyes stay down longer than intended

  18. Ways to Reduce Vulnerability to Prospective Memory Errors
  1) Try to avoid interrupting critical tasks
  • If interrupted, minimize the time your eyes are away from the task
  2) If interrupted, create a salient cue to remind yourself
  • Example: when interrupted during assembly, put your hands in your pockets
  3) Use a checklist for “killer” items
  • Know the difference between a “do” list and a checklist (see the sketch after this list)
  • Checklist for killer items: as short as possible
  • Checklists provide redundancy
  4) When deferring tasks:
  • Create a salient cue
  • Identify explicitly where and when you intend to perform the deferred task
  • Example: put tape on the canopy to remind you about the low tire
  5) When going head-down:
  • Develop/maintain the habit of performing only one step at a time
  • Scan the horizon between each step
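The “do” list versus checklist distinction can be made concrete with a small sketch: a read-do list is read item by item and each action performed as it is read, while a checklist verifies that actions already performed from memory were actually done. The item names below are hypothetical examples, not items from the talk.

```python
# Hypothetical "killer item" list for a pre-takeoff check; names are illustrative only.
KILLER_ITEMS = ["controls connected", "canopy locked", "airbrakes locked", "trim set"]

def read_do(items, perform):
    """'Do' list: read each item and perform it immediately."""
    for item in items:
        perform(item)

def do_verify(items, already_done):
    """Checklist: actions were performed from memory; verify each item and flag omissions."""
    return [item for item in items if item not in already_done]

# Read-do style: perform each item as it is read.
read_do(KILLER_ITEMS, perform=print)

# Checklist style: two items were done from memory; the checklist catches the omissions.
missed = do_verify(KILLER_ITEMS, already_done={"controls connected", "trim set"})
print("Not verified:", missed)   # -> ['canopy locked', 'airbrakes locked']
```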

  19. Error and Threat Management (ETM)
  • Latest development in Crew Resource Management (CRM)
  • CRM development started in the late ’70s (airlines, USAF, and NASA)
  • Accidents caused by poor communication, failing to grasp all aspects of the situation, failing to manage workload effectively, and failing to develop appropriate plans
  • CRM originally focused on preventing errors
  - Workload management, communication, situation awareness, decision-making, leadership/followership, and automation management
  • ETM emphasizes detecting and managing errors and threats
  - Instead of criticizing crews for making mistakes, instructors should reward crews for catching/managing errors
  • ETM is in the early stages of development

  20. Applying ETM to Soaring
  • Robert Sumwalt - US Airways captain and sailplane pilot
  • Extensive experience in aviation safety
  • Collaborates with airlines/ALPA/NASA
  • Remainder of talk combines my ideas with Robert’s on adapting ETM to soaring operations

  21. Principles of ETM
  • Recognize vulnerability to errors
  - Especially decision-making and prospective memory

  22. Principles of ETM
  • Recognize vulnerability to errors
  • Identify threats
  - Three domains of threat:
  (1) Those present on every flight (e.g., rope break)
  (2) Threats present only on a particular flight (e.g., my Indian Valley escapade)
  (3) Threats specific to certain situations (e.g., Carl Herald’s topic today)
  • Before takeoff and before each phase of flight: identify threats, ask “what if…”, and develop a plan
  • Presence of multiple threats is especially hazardous
  - Vulnerability to error goes way up
  - Treat as a red warning flag

  23. Principles of ETM
  • Recognize vulnerability to errors
  • Identify threats
  • Treat interruptions, distractions, and deferred tasks as red warning flags
  - These threats cannot be identified in advance

  24. Principles of ETM
  • Recognize vulnerability to errors
  • Identify threats
  • Treat interruptions, distractions, and deferred tasks as red warning flags
  • Redundancy: develop multiple layers of defense
  - Defenses (countermeasures) are essential
  - No defense is perfect (e.g., checklists)
  - Thus the need for defense in depth
  - Threats and errors are much less likely to penetrate multiple layers (a worked example follows below)
  - Be sure the layers are independent of each other
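As a rough illustration of why independence matters: if each defense independently misses a given error with some probability, the chance the error slips past all of them is the product of the individual miss probabilities. The 10% per-layer miss rate below is an invented number used only to show the arithmetic, not a figure from the talk.

```python
# Probability an error penetrates every defense, assuming the layers are independent.
# The 0.10 per-layer miss probability is purely illustrative.
def residual_risk(miss_probabilities):
    risk = 1.0
    for p in miss_probabilities:
        risk *= p
    return risk

# Four independent layers, each catching 90% of assembly errors:
print(residual_risk([0.10] * 4))   # ~0.0001, roughly 1 in 10,000

# If two layers share the same blind spot, they are not independent and act as one:
print(residual_risk([0.10] * 3))   # effectively three layers, ~0.001
```

This is the reason the slide stresses independence: layers that fail for the same reason (for example, two checks performed by the same distracted person) do not multiply.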

  25. Layers of Defense to Prevent Assembly Errors (adapted from Robert Sumwalt)
  • Wing runner check
  • Self-connecting controls
  • Positive control check
  • Independent critical assembly check
  • Pilot assembles glider

  26. Principles of ETM
  • Recognize vulnerability to errors
  • Identify threats
  • Treat interruptions, distractions, and deferred tasks as red warning flags
  • Redundancy: develop multiple layers of defense
  • Communicate your perceptions of threats and risk to fellow pilots
  - Soaring is mostly single-pilot, but always a team operation
  - Other pilots may not have noticed the threat
  - Help each other remember critical items to perform

  27. Principles of ETM
  • Recognize vulnerability to errors
  • Identify threats
  • Treat interruptions, distractions, and deferred tasks as red warning flags
  • Redundancy: develop multiple layers of defense
  • Communicate your perceptions of threats and risk to fellow pilots
  • Standardize critical procedures
  - Airlines and the military rely on SOPs to reduce errors and accidents
  - Soaring operations are a different situation, but …
  - Each pilot should work out explicit procedures for critical tasks and develop strong habits to standardize their execution
  - Standardizing your procedures reduces vulnerability to forgetting items or performing them incorrectly
  - Use checklists

  28. Integrated Suggestions for Reducing Vulnerability to Error and Managing Threats
  1) Recognize areas of vulnerability to errors
  2) Identify and voice your assumptions
  3) Identify threats
  4) Ask “What if …”
  5) Communicate your perceptions of threats to fellow pilots
  6) Always have a back door
  7) Treat interruptions, distractions, and deferred tasks as red flags:
  • Minimize interruptions during critical tasks
  • Create salient cues as reminders
  8) Break head-down tasks into small steps and interleave with scanning the horizon
  9) Standardize critical procedures
  10) Use checklists
