
Developing Tools for Threat and Error Management. Key Dismukes, Chief Scientist for Aerospace Human Factors, Human Factors Research and Technology Division, NASA-Ames Research Center. 20 August 2003, ALPA Air Safety Week.


Presentation Transcript


  1. Developing Tools for Threat and Error Management Key Dismukes Chief Scientist for Aerospace Human Factors Human Factors Research and Technology Division NASA-Ames Research Center 20 August 2003 ALPA Air Safety Week

  2. The Challenge • Level of safety in modern airline operations a major success story. • ALPA’s historic role • Human factors central • Current operating environment poses new challenges. • Need new tools to maintain/enhance safety.

  3. Threat and Error Management (TEM) • Major shift in philosophy: • Move away from blame for errors. • Train crews to recognize threats and vulnerable situations and to detect errors. • Teach ways to manage threats and errors. • Good concept but only the first step. • Must provide crews with specific tools for TEM. • Sumwalt’s monitoring program is a good example. • What else?

  4. Dispense with Old Fallacies about Crew Errors Fallacy: Pilots who are skillful, conscientious, and vigilant do not make errors. Truth: The best of pilots (and all other experts) make errors. Fallacy: If pilots normally perform a procedure without difficulty, then if an accident results when a crew did not perform that procedure adequately, that crew must have been deficient in some way. Truth: – 100% reliability not possible. – Occurrence of errors by experts somewhat random. – Probability and frequency of errors driven by external factors. • Specific characteristics of tasks performed • Demands on human information processing • Operating environment • Operating norms

  5. Support for TEM Training • Can reduce vulnerability to error by identifying the conditions that affect probability of errors. • NASA Ames Human Factors R&D Division addressing challenges to crew performance: • fatigue - decision-making - use of automation - workload management - etc. • Illustrate with example: concurrent task management • Inadvertent omission of a procedural step often a factor in accidents and incidents.

  6. Flaps Not Set to Takeoff Position Captain’s ASRS Report #425357 (edited) “The aircraft we had for this flight had an inoperative fuel gauge, and an inoperative APU… DFW was in the midst of a departure rush. Since we had to do a crossbleed start, we did not do the After Start checklist immediately... We got clearance to taxi and requested a spot to do the start. We stopped ... as instructed and completed the start. As we were finishing, ground control was giving instructions to aircraft taxiing mentioning us in reference. We heard this, completed the after start checklist, and told ground we were ready to taxi. We were given instructions to hold short of taxiway WJ, give way to opposite direction traffic, then proceed N to taxiway Z to taxiway HY to taxiway Y across the bridge to taxiway J to taxiway EF for takeoff on runway 17R, follow an ACR aircraft. With all the above, taxiing to a new and unfamiliar runway (for this crew) in the dark, we didn't complete the before takeoff checklist. As we were cleared for takeoff and applied power, the takeoff warning horn sounded and we immediately realized that the flaps had not been extended for takeoff. • There were many distractions leading up to this incident which should have been warning signals. • There was a time pressure element; we were running late and knew we had an airplane change in a short ground time in Denver before our next flight. • The APU and the necessity for the crossbleed start precluded the normal flow of calling for flaps and the before takeoff checklist as we taxi from the gate. • I was distracted by calling for taxi, so ground control knew we were ready to move. • Immediately after that call we were given a complicated taxi route with hold short and follow instructions and we were concentrating on finding the taxiways in the dark…”

  7. Research Questions • Major airline accidents have occurred when crews failed to set flaps, turn on pitot heat, or set control trim. • Why would experienced pilots forget a procedural step they normally perform day in and day out? • Why fail to catch omission with checklist?

  8. Jumpseat Observation Study (Loukopoulos, Dismukes, & Barshi) • Reviewed FOMs, observed line operations, analyzed ASRS, NTSB reports. • Discovered disconnect between FOM/training and actual line operations in area of task management.

  9. Depiction of Cockpit Task Management in FOM/Training • Linear: task A → task B → task C in a fixed sequence. • Controllable: tasks are initiated by crew at their discretion. • Predictable: • Information available to crew when needed. • Individuals can communicate as needed. • Overall picture: flight operations are pilot driven and under moment-to-moment control of crew.

  10. Line Observations Reveal a Different Story • Each pilot must juggle several tasks concurrently. • Crews are frequently interrupted. • External demands arrive at unpredictable moments. • Conditions sometimes force task elements to be performed out of normal sequence. • Normal line operations are quite dynamic: • Crews must at times struggle to maintain control of the timing and sequence of their work tasks.

  11. So What? • Pilots become accustomed to concurrent task demands, interruptions, distractions and disruptions. • However these situations substantially increase vulnerability to error, especially omission of critical procedural steps.

  12. ERRORS attributed to concurrent task demands, interruptions, and disruptions (ASRS reports) [Errors arrayed by flight phase: PREFLIGHT > PUSHBACK > TAXI > TAKEOFF > CLIMB > CRUISE > DESCEND > LAND] • Forgot logbook at ramp - kept deferring to check it; distractions; busy with preflight - discovered en route • Skipped over checklist item - fuel pumps deferred during preflight because refueling - engine starvation in flight • Omitted review of charts - distractions - speed violation on departure • Entered wrong weight in FMS - tail strike at takeoff • Improper setting of pressurization during preflight flow - interruptions - cabin altitude warning light in cruise • Omitted flow and checklist items - interruptions; delay; change in departure runway - discovered insufficient fuel at 12,000 ft • Read but did not verify checklist item - distractions - pushback with throttles open, damage to aircraft • Started taxi without clearance - crew discussing taxi instructions - struck pushback tug • Neglected to set flaps - preoccupied with new departure clearance and packs-off operation - aborted takeoff • FO failed to monitor CA - busy with flow; night taxi - taxi in wrong direction • FO failed to monitor CA - runway change; busy reprogramming FMC - taxied past intended taxiway • Omitted setting flaps - busy with delayed engine start; rushed to accept takeoff clearance - aborted takeoff • Failed to verify new clearance - monitoring convective activity on radar - flew wrong heading • Omitted climb checklist - busy copying hold instructions - missed setting altimeter and overshot altitude • Failed to reset bleeds on - complex departure; multiple ATC calls; traffic - altitude warning and O2 mask deployment • Did not notice wind - preoccupied with annunciator light; handling radios - track deviation • Forgot to reset altimeters - distracted by FA in cockpit - TCAS RA and overshot arrival fix • Failed to monitor PF - busy reprogramming FMS; weather changes - go-around • Failed to verify FMC settings - PNF giving IOE to PF; multiple ATC calls; hold instruction - flew pattern in wrong direction • ATC instructions too close to turn fix - busy slowing aircraft; approach checklist; radios - failed to make published turn • Vectored too close - busy catching up with glideslope; not instructed to switch to Tower - landed without clearance • Forgot to switch to Tower at FAF - last minute runway change; busy reconfiguring aircraft - landed without clearance • Unstabilized approach - accepted runway change right before FAF; did not review charts or make callouts - tailstrike • Did not complete checklist - TCAS alerts; parallel runways in use; GPWS alert - did not extend gear for landing • Did not extend gear - checklist interrupted; TCAS alerts; parallel runways in use; GPWS alert - struck ground on go-around

  13. Why So Vulnerable to These Errors? • Ongoing research project. • Very large number of procedural steps must be accomplished in cockpit tasks. • Extensive practice leads to proficiency. • Brain develops memory and control procedures to execute tasks automatically. • Minimum of conscious supervision. • Automatic processing has enormous advantages: efficient, fast, and normally accurate. • Automatic processing also has a major vulnerability: dumb and dutiful.

  14. [Slide graphic: PREFLIGHT Flow (B737-300, as trained) - diagram of the cockpit panel scan sequence (Aft Overhead, Forward Overhead, Mode Control Panel, Captain/Center/First Officer Instruments, Forward Electronic, Control Stand, Aft Electronic, Logbook/Gear Pins), with checklist items marked *; surrounding paperwork items (PAX CT, ATIS, LOAD, FLIGHT PLAN, FUEL, PDC, JEPP) shown as placeholder text.]

  15. Vulnerability of Automatic Processing • If procedural flow is interrupted, chain is broken. • Pause prevents one step from triggering the next. • Initiation of automatic process depends on receiving signal or noticing a cue in the cockpit environment. • If signal does not occur, individual is not prompted to initiate procedure. • High workload and/or rushing prevent conscious supervision of automatic processes. • Exacerbates vulnerability.
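The chain-breaking mechanism described above can be sketched as a toy model (my own illustration, not from the presentation): each completed step serves as the cue for the next, so an interruption that severs the chain silently drops the pending step unless a deliberate reminder cue was created first.

```python
def execute_procedure(steps, interrupt_after=None, reminder_posted=False):
    """Toy model of cue-triggered ("automatic") step execution.

    Each completed step cues the next. If the crew is interrupted right
    after `interrupt_after`, the cue chain is broken: on resuming, they
    pick up at the step beyond the one that was pending, unless a
    conspicuous reminder cue was posted before the interruption.
    """
    completed = []
    i = 0
    while i < len(steps):
        completed.append(steps[i])
        if steps[i] == interrupt_after and not reminder_posted:
            i += 2  # broken chain: the pending step is never cued
        else:
            i += 1  # intact chain: this step cues the next
    return completed

# Hypothetical three-step fragment of a pre-takeoff sequence:
steps = ["engine start", "set flaps", "before-takeoff checklist"]
print(execute_procedure(steps, interrupt_after="engine start"))
# "set flaps" is silently omitted, as in the ASRS report on slide 6
```

The `reminder_posted` flag stands in for the slide 18 advice of creating a conspicuous cue when a task is interrupted or deferred; with it set, all steps complete.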

  16. Desirable to Minimize Disruptions of Normal Procedures • Analyze actual line ops → write procedures to minimize opportunities for disruptions. • Avoid “floating” procedural items allowed to be performed at varying times. • Anchor critical items (e.g., setting takeoff flaps) to a distinct step that cannot be forgotten (e.g., pushback).

  17. Cannot Completely Eliminate Disruption of Procedure • Interruptions, deferred tasks, concurrent task demands are inevitable. • How to reduce vulnerability to errors of omission? • Research will provide specific tools • Interim suggestions

  18. Hints to Reduce Vulnerability to Omitting Procedural Steps • Being aware of vulnerability reduces threat. • Especially vulnerable when head-down, communicating, searching for traffic, or managing abnormals (Dismukes, Young, & Sumwalt, ASRS Directline). • When interrupted or deferring a task: Create conspicuous cue as reminder. • Avoid rushing. • Pause at critical junctures to review. • Schedule / reschedule activities to minimize conflicts. • Treat monitoring as essential task (Sumwalt).

  19. For further information: http://human-factors.arc.nasa.gov/ihs/flightcognition/ This work is supported by NASA’s Aviation Safety Program and by the FAA (AFS-230)
