Cognitive Engineering PSYC 530: Automation and Human Performance
Raja Parasuraman

Overview
Characteristics of Automation
Human Performance in Automated Systems
Designing for Effective Human-Automation Interaction
Automation: Definitions and Characteristics
What is Automation?
“A machine or system that accomplishes (partially or fully) a function that was previously carried out (partially or fully) by a human operator”
Source: Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230-253.
Reasons for the March Towards More Automation
Human Performance in Automated Systems
Vigilance and Monitoring
Trust in Automation
Levels of Automation
HIGH 10. The computer decides everything, acts autonomously, ignoring the human.
9. informs the human only if it, the computer, decides to
8. informs the human only if asked, or
7. executes automatically, then necessarily informs the human, and
6. allows the human a restricted time to veto before automatic execution, or
5. executes that suggestion if the human approves, or
4. suggests one alternative
3. narrows the selection down to a few, or
2. The computer offers a complete set of decision/action alternatives, or
LOW 1. The computer offers no assistance: human takes all decisions and actions.
Source: Sheridan, T. B. (1992). Telerobotics, Automation, and Human Supervisory Control. Cambridge, MA: MIT Press.
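The scale above can be encoded as a simple lookup table, e.g. for labeling which mode a system is operating in. A minimal Python sketch (the dictionary and helper function are illustrative assumptions, not from the lecture):

```python
# Sheridan's (1992) 10-point levels-of-automation scale as a lookup table.
# Descriptions state what "the computer" does; wording follows the slide.
SHERIDAN_LEVELS = {
    1: "offers no assistance: the human takes all decisions and actions",
    2: "offers a complete set of decision/action alternatives",
    3: "narrows the selection down to a few",
    4: "suggests one alternative",
    5: "executes that suggestion if the human approves",
    6: "allows the human a restricted time to veto before automatic execution",
    7: "executes automatically, then necessarily informs the human",
    8: "informs the human only if asked",
    9: "informs the human only if it, the computer, decides to",
    10: "decides everything, acts autonomously, ignoring the human",
}

def describe_level(level: int) -> str:
    """Return a description of what the computer does at a given level."""
    if level not in SHERIDAN_LEVELS:
        raise ValueError(f"level must be between 1 and 10, got {level}")
    return f"Level {level}: the computer {SHERIDAN_LEVELS[level]}"
```

For example, `describe_level(4)` yields "Level 4: the computer suggests one alternative".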
Human-Automation Interaction: Some Empirical Methods
A Field Study?
Automation and Human Performance
Automation can fundamentally change the cognitive demands and responsibilities of a system's human operators, often in ways that designers did not intend or anticipate
Automation and Human Performance: Benefits
Improved precision of performance
Reduced mental workload
Enhanced safety (automated warning systems)
Automation and Human Performance: Potential Costs
Unbalanced mental workload
Loss of situation awareness
Manual skill degradation
Automation: The Double-Edged Sword
Automation often provides clear benefits
Automation can also lead to novel, unanticipated problems and performance costs
Which tasks should be automated and to what level for optimal control, performance, and safety?
Technologists: Automate tasks as fully as technically possible—the ‘technological imperative’
Human factors engineers: Automate to an extent that balances efficiency with safety and ensures a proper role for the human in the resulting system
Automation Can But Does Not Always Reduce Mental Workload
“Clumsy Automation”—Increases mental workload during high task load, reduces it during low task load
“Cognitive Overhead”—Automation is difficult to engage, adjust, or turn off
Sources: Wiener, E. L. (1988). Cockpit automation. In E. L. Wiener & D. C. Nagel (Eds.), Human factors in aviation. San Diego: Academic Press.
Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors, 35(2), 221-242.
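Wiener's "clumsy automation" pattern can be illustrated with a toy workload model: automation relieves workload when task load is low, but its management overhead grows precisely when task load is high. A minimal sketch (the coefficients are arbitrary assumptions for illustration, not empirical values):

```python
def workload(task_load: float, clumsy_automation: bool) -> float:
    """Toy model of 'clumsy automation' (after Wiener, 1988).

    task_load is on [0, 1]. Without automation, workload simply tracks
    task load. With clumsy automation, management overhead grows with
    task load while the relief it provides is largest at low load.
    Coefficients (0.4, 0.3) are arbitrary, chosen only to illustrate
    the crossover pattern.
    """
    if not clumsy_automation:
        return task_load
    overhead = 0.4 * task_load       # engaging/adjusting costs rise with load
    relief = 0.3 * (1 - task_load)   # relief is greatest when load is low
    return task_load + overhead - relief
```

With these numbers, automation lowers workload at low task load but raises it at high task load, which is the signature of the effect.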
Effects of Level of Automation on Situation Awareness
Levels of SA
Level 1: Perception
Level 2: Comprehension
Level 3: Projection
Source: Endsley, M. R., & Kiris, E. O. (1995). The out-of-the-loop performance problem and level of control in automation. Human Factors, 37, 390-398.
Effects of Level of Automation on Operator Situation Awareness
[Figure: Level 2 SA (% correct) by level of automation before automation failure]
Trust Affects Automation Usage
The goal is to achieve calibrated trust that is matched to the situation
Over-trust (Complacency)—Inappropriate use and over-reliance on automation
Under-trust (Distrust)—Disuse or turning off of automation
Sources: Lee, J., & Moray, N. (1992). Trust, control strategies, and allocation of function in human-machine systems. Ergonomics, 35(10), 1243-1270.
Parasuraman, R., Molloy, R., & Singh, I. L. (1993). Performance consequences of automation-induced "complacency." International Journal of Aviation Psychology, 3(1), 1-23.
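One way to make calibration concrete is to compare an operator's subjective trust with the automation's actual reliability: trust well above reliability suggests over-trust, trust well below it suggests under-trust. A hypothetical sketch (the function name, the [0, 1] scales, and the 0.1 margin are assumptions for illustration, not values from the sources above):

```python
def trust_calibration(trust: float, reliability: float,
                      margin: float = 0.1) -> str:
    """Classify trust calibration against automation reliability.

    trust and reliability are assumed to be on [0, 1]; the 0.1 margin
    is an arbitrary illustrative tolerance, not an empirical threshold.
    """
    if trust > reliability + margin:
        return "over-trust (complacency risk)"
    if trust < reliability - margin:
        return "under-trust (disuse risk)"
    return "calibrated"
```

For example, an operator reporting trust of 0.95 in automation that is only 70% reliable would be flagged as over-trusting.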
Automation Trust and Complacency Study
24 Experienced General Aviation Pilots
2 Levels of Difficulty (Single-Task, Multiple-Task)
2 Levels of Automation (Manual, Automated)
Task: Carry out primary flight and fuel management tasks manually, monitor automated engine-systems task
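The study crosses two factors, yielding four conditions. The full set can be enumerated with a short sketch (the condition labels paraphrase the slide; the variable names are assumptions):

```python
from itertools import product

# The 2 (difficulty) x 2 (automation) design described above:
# each condition pairs a difficulty level with an automation condition.
difficulty = ["single-task", "multiple-task"]
automation = ["manual", "automated"]
conditions = list(product(difficulty, automation))

for d, a in conditions:
    print(f"{d} / {a}")
```

This prints the four difficulty/automation pairings each pilot would encounter in a fully crossed design.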
Human Operators Are Poor at Monitoring Automated Systems When They Are Simultaneously Engaged in Other Manual Tasks
Designing for More Effective Human-Automation Interaction
Use Display Integration to Improve the Observability of Automation Behaviors
Engine Indicating and Crew Alerting System (EICAS)
Engine Monitoring and Control System (EMACS)
[Figure: Detection rate (%), illustrating the cost of automation]
Effects of Display Integration on Human-Automation Interaction
Source: Sklar, A., & Sarter, N. (2000). Good vibrations: Tactile feedback in support of human-automation coordination. Human Factors.
Glass Cockpit Simulator
Tactile Feedback System
Roll Mode Transition
Autothrottle Mode Transition
[Figure: Detection rate (%) of uncommanded automation mode transitions]
Evaluative Criteria: Human Performance
Communication and coordination
Trust and complacency
Additional Evaluative Criteria
Production and Operating Costs
Costs of Decision/Action Consequences
Ease of System Integration
Human Factors “Fixes”
HUMAN-MACHINE SYSTEMS APPROACH
Human Factors Science and Engineering