
Mental Models and Loss of Control: Understanding Human-Automation Interaction

This talk explores the dichotomous role of the flight crew, at once a source of variance and the ultimate safety system, and the challenges pilots face in understanding both environment dynamics and automation. It introduces a computational human agent model that simulates situation awareness and mental models to predict human-automation interaction.


Presentation Transcript


  1. Pilot Mental Models and Loss of Control Sebastien Mamessier, Karen Feigh, Amy Pritchett, David Dickson Georgia Institute of Technology

  2. Dichotomous Role of Flight Crew: "Humans are bad!" (a source of variance) vs. "Humans are good!" (the ultimate safety system)

  3. Reality is a Little of Both • Loss of Control & Automation Surprises trace their origin to disparities between human expectations & reality • Expectations are guided by the human mental model: "A mental model of a dynamic system is a relatively enduring and accessible, but limited, internal conceptual representation of an external system whose structure is analogous to the perceived structure of that system" (Doyle and Ford, 1999)

  4. Double Challenge for Humans • Humans need to understand environment & vehicle dynamics • Humans need to understand the automation as well • Combined, this understanding is described as Situation Awareness

  5. Situation Awareness (SA) Extensively described by Endsley (1995)

  6. How to Predict HAI? • Two methods are often used to assess human-automation interaction with new technology • Human in the loop (HITL) experimentation • Simulation of human agents • HITL experimentation has many drawbacks • Expensive due to personnel costs • Inherently limited in the number of combinations that can be investigated • Requires a mature design, working prototypes, and trained pilots • Simulation requires accurate models of human agents • Questions of appropriate fidelity & scope are introduced • Lots of good high-fidelity models exist, but they are not always helpful for engineering applications

  7. Agent Model Requirements Must be able to simulate those cognitive constructs known to capture the symptoms of poor human-automation interaction Situation Awareness & Mental Models

  8. This talk describes a computational human agent model simulating both Situation Awareness & Mental Models

  9. Computational Work Modeling Framework

  10. Computational Work Modeling Cognitive Engineering uses work as unit of analysis of socio-technical systems • Capture the combined activities of both human and automation required by a proposed design in a single simulation platform • We call this platform WMC – work models that compute • Based on thinking of joint human-machine work from the cognitive engineering community of practice • Focus not on detailed explanation of cognition; focus instead on situated behavior – behavior in response to the environment • So, what does WMC look like and how is it different?

  11. Modeling Work Using Actions Note: no mention of agents anywhere!

  12. Modeling the Environment as a Collection of Resources • Environment is modeled as a collection of resources • Resources are modeled at the level of abstraction appropriate to the scenario of interest • Resources are registered with actions which can update (set) them and utilize (get) them • Resources maintain a record of when they were last updated • Actions can specify how much earlier a resource value may have been set and still be valid for use, similar to the construct "Quality of Service" in networking models
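The resource construct described above can be sketched in a few lines. This is a minimal illustration of the idea, not the WMC implementation; the class and parameter names are hypothetical:

```python
class Resource:
    """Environment state variable that records when it was last updated."""
    def __init__(self, name, value=None):
        self.name = name
        self.value = value
        self.last_updated = None  # simulation time of the last set()

    def set(self, value, sim_time):
        self.value = value
        self.last_updated = sim_time

    def get(self, sim_time, max_age=float("inf")):
        """Return the value only if it is fresh enough (a QoS-style check).

        An action specifies max_age: how much earlier the value may have
        been set and still be valid for its use.
        """
        if self.last_updated is None or sim_time - self.last_updated > max_age:
            raise ValueError(f"{self.name} is stale or unset")
        return self.value
```

An action that accepts values up to 5 seconds old would call `altitude.get(sim_time, max_age=5.0)` and handle the stale-value case explicitly.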

  13. The Role of Agents • A little different than traditional agent based simulation • Agents organize and regulate the execution of actions • Perfect agent: executes actions perfectly, in minimal time, without contributing interior dynamics • Human agent: maintains an internal representation of tasks, has limited resources to execute tasks, reorganizes tasks according to its capabilities and priorities • Agents maintain records of their actions & serve to help measure taskload over time
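The perfect-agent/human-agent distinction can be sketched as a single class whose capacity parameter controls how many actions it can execute per step; everything here (names, the one-parameter capacity model) is an illustrative assumption, not the framework's API:

```python
class Agent:
    """Organizes and regulates the execution of actions (illustrative sketch)."""
    def __init__(self, capacity=float("inf")):
        self.capacity = capacity   # actions executable per time step
        self.delayed = []          # actions deferred for lack of resources
        self.taskload = []         # record of (sim_time, pending action count)

    def step(self, sim_time, new_actions):
        pending = self.delayed + list(new_actions)
        self.taskload.append((sim_time, len(pending)))
        n = len(pending) if self.capacity == float("inf") else int(self.capacity)
        executed, self.delayed = pending[:n], pending[n:]
        for action in executed:
            action()               # a perfect agent executes everything at once
        return executed
```

A perfect agent is `Agent()` (infinite capacity, no interior dynamics); a human agent is `Agent(capacity=1)`, which reorganizes its task list and accumulates a delayed queue, and `taskload` gives the taskload-over-time measure.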

  14. Actions During Run Time

  15. Meanwhile, Inside the Human Agent • Our advanced human agents also: • Forget actions that remain too long in the delayed and interrupted lists • Maintain internal representations of resource values

  16. Modeling Pilot Monitoring & Situation Awareness

  17. Modeling Pilot Monitoring & Awareness • During simulation, pilot activities can be based on perfect knowledge of the world … but this is unrealistic • To make realistic simulations we need to account for the pilot’s situation awareness (SA) • Situation Awareness may be • Degraded by poor sensing, workload, distraction… • Enhanced by knowledge & experience

  18. Modeling Monitoring • The primary way pilots assess the value of a specific state variable is to monitor it • Pilot monitoring is modeled with monitoring actions that create and update mental resources • Monitoring actions: add error to the pilot’s assessment (noise, bias, sensitivity thresholds, saturation thresholds, and latency); can be timed to reflect periodic sampling of the environment; reflect “Level 1” situation awareness • Monitoring actions allow pilot agents to: • Detect events • Interpret the state of the world • Make decisions & execute actions based on the representation generated by models of good and bad monitoring
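The error sources listed above (noise, bias, sensitivity and saturation thresholds) can be composed into one sampling function. This is a sketch under assumed parameter names, not the model's actual formulation; latency would additionally delay when the sample lands in the mental resource:

```python
import random

def monitor(true_value, bias=0.0, noise_sd=0.0,
            sensitivity=0.0, saturation=float("inf")):
    """One monitoring sample: the perceived value is the true value
    distorted by a constant bias, Gaussian noise, a sensitivity
    threshold (small values go unnoticed), and saturation clipping."""
    perceived = true_value + bias + random.gauss(0.0, noise_sd)
    if abs(perceived) < sensitivity:           # below the noticeable threshold
        perceived = 0.0
    return max(-saturation, min(saturation, perceived))
```

Each call would update the corresponding mental resource, so the pilot agent's downstream decisions operate on the distorted value, not the true one.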

  19. [Diagram: ATC and aircraft work models running in the WMC simulation core. Resources (AC positions, ATC clearance) are updated and used by actions (ATC checkRadar, ATC giveClearance, AC readBack, AC fly), which agents (Controller, Pilot, AC Automation), each maintaining a mental model, execute over simulation time (t=0, 5, 13, 17).]

  20. Observation Alone is Not Enough • Observation is not the only way pilots assess the states of the system they control • Experts are able to anticipate and mentally simulate both the continuous and discrete dynamics of the system • We need to find a way to model how humans handle both continuous and discrete dynamics • And we need a framework where we can model both of them working together

  21. Modeling Pilot Knowledge & Awareness of Continuous Dynamics

  22. Pilot’s Model-based Estimation • Observation provides point estimates, but what happens in between? • How do humans assess continuously-evolving states of the system they control? • Mental estimation • Allows reduction in monitoring frequency • Provides second source to verify accuracy of primary source • Partially captures “Level 2” SA, comprehension • Experts can be represented as optimal model-based controllers capable of mental estimation* *Kleinman, D., Baron, S., and Levison, W., An optimal control model of human response part I: Theory and validation, Automatica, Vol. 6, No. 6, 1970, pp. 357–369.

  23. Modeling Pilot Knowledge • Pilot knowledge about aircraft dynamics allows mental simulation of the aircraft dynamics (historic foundation in the Optimal Control Model of human behavior) • Mental simulation is updated by monitoring of instruments and the environment (direct and indirect measures) • Mental simulation • will drift from the actual state • must cover open-loop behavior (sans autoflight) • must cover closed-loop behavior with autoflight • For specifics on implementation see: Mamessier, S. & Feigh, K. M., A Computational Approach to Situation Awareness and Mental Models for Continuous Dynamics in Aviation, IEEE Transactions on Human-Machine Systems, under review.
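The drift-and-reanchor behavior can be illustrated numerically. The rates below are hypothetical, and the one-line integrator stands in for the paper's full estimator; the point is only that an open-loop mental belief diverges until a monitoring action corrects it:

```python
def propagate_belief(belief_alt, believed_vs, dt):
    """Mental simulation: forward-integrate the believed altitude with the
    pilot's (possibly wrong) model of the vertical speed, in ft/s."""
    return belief_alt + believed_vs * dt

# Hypothetical numbers: the pilot believes a 1500 ft/min descent,
# while the aircraft actually descends at 1800 ft/min.
belief = actual = 10000.0
for _ in range(6):                        # 6 s of open-loop mental simulation
    belief = propagate_belief(belief, -1500 / 60.0, 1.0)
    actual += -1800 / 60.0 * 1.0
drift = belief - actual                   # belief has drifted 30 ft high
belief = actual                           # a monitoring action re-anchors it
```

In the full model, monitoring would blend (rather than replace) the belief with the noisy observation, but the drift/correction cycle is the same.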

  24. Example: Continuous Dynamics • Work model of a pilot flying a continuous descent approach (CDA) along the LAX RIIVER 2 standard terminal arrival route • Agent models were enhanced with mental belief constructs and mental model capabilities for continuous dynamics • Vertical speed, air speed & altitude • Arrival being flown by the autoflight system with the Lateral Navigation (LNAV) and Vertical Navigation (VNAV) modes engaged

  25. Simulated plot of pilot mental understanding of the aircraft altitude

  26. Modeling Pilot Knowledge & Awareness of Discrete Dynamics

  27. Modeling Discrete Dynamics • Discrete dynamics of autoflight systems can be represented as finite state machines • Different modes define the control behavior applied to the autoflight system, including • Axes being controlled • Targets tracked • Actuators used • Transitions • Can be commanded by either the pilot or the autoflight system • Are subject to many conditionals (only partially observable) • Key aspects to model include: • Knowledge of aircraft behavior within a control mode • Mode transition criteria

  28. Modeling Pilot Knowledge • Pilot knowledge can be described as a finite state machine* • Mirrors the automation's underlying state machine • We represent state transition rules as a Boolean expression • Transition from one autoflight mode to another will occur if the resulting condition is met • Builds on Javaux’s model of mental representation of rules, where each conditional has a weighting *Javaux, D., An Algorithmic Method for Predicting Pilot-Mode Interaction Difficulties, 17th Digital Avionics Systems Conference (DASC), AIAA/IEEE/SAE, Vol. 1, IEEE, 1998, pp. E21–1.
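A weighted transition rule of this kind can be sketched as follows. The class, the 0.5 recall threshold, and the vacuous-truth behavior when every conditional is forgotten are illustrative assumptions, not the paper's exact formulation:

```python
class TransitionRule:
    """A mode transition guarded by conditionals, each carrying a salience
    weight; the pilot only recalls conditionals whose weight is high enough."""
    def __init__(self, source, target, conditionals):
        self.source, self.target = source, target
        self.conditionals = conditionals                 # name -> predicate
        self.weights = {name: 1.0 for name in conditionals}

    def pilot_expects_transition(self, state, threshold=0.5):
        """Evaluate only the conditionals the pilot still remembers.
        If every conditional has been forgotten, the conjunction is
        vacuously true: the pilot's expectation no longer tracks reality."""
        remembered = [pred for name, pred in self.conditionals.items()
                      if self.weights[name] >= threshold]
        return all(pred(state) for pred in remembered)
```

Comparing `pilot_expects_transition(state)` against the automation's actual transition at each step flags the moments where a mode change would surprise the modeled pilot.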

  29. Modeling Pilot Knowledge • Transition rules are not static, but evolve over time, reflecting pilot experience • We model this using Hebbian learning, where each conditional is weighted • After each use of the rule, the conditional weights are updated • Used conditionals are strengthened • Unused ones are weakened • Unused conditionals are slowly devalued until they can be easily forgotten • Workload impacts the likelihood of forgetting a conditional • Javaux showed that this approach can explain why pilots are often surprised by automation
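The Hebbian update described above reduces to a few lines; the learning rate, the [0, 1] clamping, and the function name are assumptions for illustration:

```python
def hebbian_update(weights, used, rate=0.1):
    """After a rule fires, strengthen the conditionals that were exercised
    and decay the rest toward forgetting; weights stay in [0, 1]."""
    for name in weights:
        if name in used:
            weights[name] = min(1.0, weights[name] + rate)
        else:
            weights[name] = max(0.0, weights[name] - rate)
    return weights
```

A workload effect could be layered on by scaling the decay rate with current taskload, so busy periods accelerate forgetting of rarely exercised conditionals.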

  30. Example: a transition rule between the VNAV_PTH and VNAV_SPD modes, built from conditionals (e.g., ALT > MCP_ALT + 10) combined with & and | operators, each conditional carrying a weight.

  31. Integrating Models of Continuous & Discrete Dynamics • The discrete dynamics have a large impact on the continuous dynamics in aircraft because of the explicit and complex modes of control • Mental models of discrete dynamics greatly impact the maintenance of situation awareness & the continuous state estimate • If a pilot fails to monitor an automatic mode transition, future mental beliefs will be updated based on the wrong mental model, decreasing SA significantly • Pilots with incorrect SA are likely to take wrong actions.
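The coupling described above can be shown with a toy example: if the pilot's believed mode differs from the actual one, the mental simulation integrates the wrong continuous dynamics and the belief error grows. The mode-to-vertical-speed mapping and all numbers are hypothetical:

```python
# Illustrative mode-dependent dynamics (ft/s), not the paper's models:
MODE_VS = {"VNAV_PTH": -1200 / 60.0, "VNAV_SPD": -2000 / 60.0}

def propagate(alt, mode, dt):
    """Forward-integrate altitude under the dynamics of the given mode."""
    return alt + MODE_VS[mode] * dt

true_alt = belief_alt = 8000.0
for _ in range(10):  # the automation reverted to VNAV_SPD, unnoticed
    true_alt = propagate(true_alt, "VNAV_SPD", 1.0)     # actual dynamics
    belief_alt = propagate(belief_alt, "VNAV_PTH", 1.0)  # mental simulation
sa_error = belief_alt - true_alt  # grows every second the wrong mode is believed
```

A single missed discrete event thus corrupts the continuous estimate for every subsequent step until a monitoring action catches the discrepancy.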

  32. [Diagram: interaction with situation awareness and continuous dynamics. Monitoring (at a given sampling rate) feeds situation awareness; the mental model drives an estimator with gain K (covariance) and gains G1, G2, G3; mode selection, manual piloting, and decision making feed back to the actual system's discrete dynamics.]

  33. Mode Reversion Example

  34. Conclusions

  35. Relationship to Loss of Control • Pilot monitoring and awareness drive pilots’ awareness of their orientation and energy state; the lack of such awareness has been identified in numerous reviews as a key contributor to loss of control accidents • Pilot knowledge and awareness of continuous dynamics drive pilot manual control behavior, as well as interaction with the autoflight system • Pilot knowledge and learning of discrete dynamics can be used to highlight where pilot mental models of the autoflight system may be faulty or incomplete • The agent model presented here allows us to evaluate new and existing automated systems to better understand human response to them, given the limitations of human cognition

  36. Questions
