
www.aptima.com MA ▪ DC ▪ OH ▪ FL




  1. Human-Centered Engineering Perspectives on Simulation-Based Training Daniel Serfaty Emily Wiese Presented to SimTrans Copenhagen, Denmark, 22 June 2009 www.aptima.com MA ▪ DC ▪ OH ▪ FL © 2009, Aptima, Inc.

  2. Agenda • Introduction to Aptima • Examples of Capabilities in Human-Centered Engineering • Five Emerging Technologies in Simulation-Based Training • Scenario Engineering • Simulation Fidelity • Performance Measurement (with A-Measure toolkit demo) • Cognitive Skills Training • Team Communications Assessment • Discussion © 2009, Aptima, Inc.

  3. What is Human-Centered Engineering? [Diagram: congruence among Social & Organizational Structures, Technology Capabilities, Human Agents, and Mission, Tasks & Work Processes] © 2009, Aptima, Inc.

  4. Aptima, Inc. • Interdisciplinary small business founded in 1995 • Consistent annual growth (40% CAGR) • 100+ staff (80% graduate degrees) • Human-Centered Engineering: analyze and design complex socio-technical systems, combining social science theory with quantitative, computational methods • Serving government and commercial clients: 350+ contracts with the defense industry • Offices: Boston/Woburn, MA (HQ); Washington, DC; Dayton, OH; Ft. Walton Beach, FL © 2009, Aptima, Inc.

  5. Skill Set: Command & Control, Military Training, Leadership, Complex Information Display, National Security Solutions, Medical & Healthcare, Aviation, Emergency Preparedness, Stability & Support Operations, Education, Safety (spanning diverse educational backgrounds and domain expertise) © 2009, Aptima, Inc.

  6. Optimizing Performance in Mission-Critical Environments © 2009, Aptima, Inc.

  7. Examples of Capabilities • Performance Measurement • Socio-Cultural Applications • Organizational Engineering • Training © 2009, Aptima, Inc.

  8.–9. [Image-only slides] © 2009, Aptima, Inc.

  10. Official U.S. Navy photo. Neither the U.S. Navy nor any other component of the Department of Defense has approved, endorsed, or authorized this product [or promotion, or service, or activity]. © 2009, Aptima, Inc.

  11.–20. [Image-only slides] © 2009, Aptima, Inc.

  21. Agenda • Introduction to Aptima • Examples of Capabilities in Human-Centered Engineering • Five Emerging Technologies in Simulation-Based Training • Scenario Engineering • Simulation Fidelity • Performance Measurement (with A-Measure toolkit demo) • Cognitive Skills Training • Team Communications Assessment • Discussion © 2009, Aptima, Inc.

  22. The New Science of Scenario Engineering • BEST: Engineering the Stimulus for Optimal Learning • PRESTO: Optimizing Learning Trajectories Using Constraint-Based Logic • CROSSTAFF: Engineering Training Scenarios from Operational Data • VSG: DDD Visual Scenario Generator Tool Q: How to Optimize Learning on a Given Simulator?

  23. Vision: Learning from Synthetic Experiences [Diagram: competencies, knowledge, and skills define Training Objectives, which shape Training Experiences; Performance Measurement of those experiences (what to measure, data, training opportunities) drives Readiness and selection of the Next Scenario]

  24. BEST: Optimizing Scenario Selection • Assess team performance against a near-optimal solution • Based on that assessment, select training events to optimize the team's learning curve [Diagram: the true state of competencies s(n-1), s(n), s(n+1) evolves across training sessions; the system's belief about the state of competencies X(n-1), X(n), X(n+1) is updated from the team score (observation) and reward at each session, with parameters set by SMEs] Tool: Partially Observable Markov Decision Process (POMDP) Training Model. What sequence of experiences moves a team to a steeper learning curve?
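The POMDP idea on this slide can be illustrated with a toy sketch: maintain a belief over the team's (hidden) competency state, update it from an observed score, and pick the scenario with the highest expected learning gain. The two-state model, pass probabilities, and scenario gains below are invented for illustration and are not Aptima's actual BEST model.

```python
# Toy two-state belief update in the spirit of the POMDP training model.
# All state names, probabilities, and gains are illustrative assumptions.

def update_belief(belief, p_pass_given_state, passed):
    """Bayes update of P(competency state) after observing a pass/fail score."""
    likelihood = {s: (p if passed else 1.0 - p) for s, p in p_pass_given_state.items()}
    unnorm = {s: belief[s] * likelihood[s] for s in belief}
    total = sum(unnorm.values())
    return {s: v / total for s, v in unnorm.items()}

def pick_next_scenario(belief, scenarios):
    """Pick the scenario with the highest expected learning gain under the belief."""
    def expected_gain(scn):
        return sum(belief[s] * scn["gain"][s] for s in belief)
    return max(scenarios, key=expected_gain)

belief = {"novice": 0.5, "proficient": 0.5}
# Assume proficient teams pass an easy vignette 90% of the time, novices 40%.
belief = update_belief(belief, {"novice": 0.4, "proficient": 0.9}, passed=True)
scenarios = [
    {"name": "easy_vignette", "gain": {"novice": 0.3, "proficient": 0.05}},
    {"name": "hard_vignette", "gain": {"novice": 0.1, "proficient": 0.40}},
]
next_scenario = pick_next_scenario(belief, scenarios)
```

After a pass, the belief shifts toward "proficient", so the selector favors the harder vignette: the same mechanism, at full scale, is what steers a team onto a steeper learning curve.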

  25. Experiment Result: 52% Improvement over Best Practices Team Training

  26. Conventional vs. PRESTO-Based Scenarios [Diagram contrasting a conventional scenario's MSEL events (planned, actually occurred, didn't occur) with the PRESTO-based scenario space and its actual events]

  27. Engineering Training Scenarios from Operational Data (CROSSTAFF) [Diagram: (a) original flight path, (b) key events identified, (c) events generalized, (d) scenario envelope generated]
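The panel (a)-(d) pipeline can be sketched in miniature: detect key events in a recorded flight path, then generalize each into an envelope of acceptable variation. Treating sharp heading changes as the "key events" and using fixed thresholds are illustrative assumptions, not the actual CROSSTAFF algorithm.

```python
# Hypothetical sketch of the CROSSTAFF pipeline: operational data in,
# generalized scenario envelope out. Thresholds are invented for illustration.

def key_events(path, turn_threshold_deg=30.0):
    """Panel (b): flag samples where heading changes sharply (a 'turn' event)."""
    events = []
    for i in range(1, len(path)):
        delta = abs(path[i]["heading"] - path[i - 1]["heading"]) % 360
        delta = min(delta, 360 - delta)  # shortest angular difference
        if delta >= turn_threshold_deg:
            events.append({"index": i, "turn_deg": delta})
    return events

def generalize(events, tolerance_deg=10.0):
    """Panels (c)-(d): widen each event into a min/max envelope for scenario reuse."""
    return [{"index": e["index"],
             "turn_min": e["turn_deg"] - tolerance_deg,
             "turn_max": e["turn_deg"] + tolerance_deg} for e in events]

# A recorded flight path reduced to heading samples (illustrative data).
path = [{"heading": h} for h in (90, 92, 91, 135, 134, 180)]
envelope = generalize(key_events(path))
```

The envelope, rather than the exact recorded maneuvers, is what gets replayed in training, so each rerun varies while still exercising the same key events.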

  28. Scenario Engineering Using the DDD* Visual Scenario Generator (VSG) *DDD: Distributed Dynamic Decision-making simulator

  29. Pedagogically-Focused Adaptive Scenarios [Diagram: BEST optimal vignette sequencing and PRESTO/CROSSTAFF training-objective conditions set the scenario initial conditions; during the ongoing training exercise, the performance measurement system (A-Measure) feeds trainee performance measures back to drive exercise events, SAF/instructor advice, and participant advice]

  30. Understanding Simulator Fidelity Requirements 1. There is little guidance and no standard tool for determining the appropriate level of fidelity of training simulators to achieve specified training objectives, maintain trainee acceptance, and fit within budgetary constraints. [Chart: perceived vs. actual training effectiveness and cost as a function of simulator fidelity] 2. There are no standard measures designed to be sensitive enough to detect objective performance differences invoked by varying levels of fidelity. © 2009, Aptima, Inc.

  31. RELATE: A Research-Driven Approach • A systematic approach to establish quantitative, predictive relationships between simulator fidelity and training effectiveness • RELATE fuses: fidelity requirements defined by end-users; existing theory and research about fidelity; and objective performance data from fidelity experiments to develop a predictive, computational model. © 2009, Aptima, Inc.

  32. Model-Based Tool • A computational model-based tool to assist with decisions regarding the acquisition and use of training simulators • Tool can help users: conduct return-on-investment analyses to determine which simulator to develop or acquire; prioritize technology enhancements to improve the effectiveness of existing simulators; develop a strategy for employing both high- and low-fidelity simulators to meet training objectives © 2009, Aptima, Inc.
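The return-on-investment comparison above can be sketched as a toy calculation. Assuming a simple diminishing-returns curve relating fidelity to training effectiveness (a stand-in for RELATE's experimentally derived model, which the slides do not specify), one can compare effectiveness bought per unit cost. All numbers and simulator names below are invented.

```python
import math

# Toy ROI comparison between two hypothetical simulators. The curve and all
# parameters are illustrative assumptions, not RELATE's actual model.

def predicted_effectiveness(fidelity, e_max=1.0, k=3.0):
    """Assumed diminishing-returns curve: more fidelity helps, but less and less."""
    return e_max * (1.0 - math.exp(-k * fidelity))

def roi(simulator):
    """Predicted training effectiveness per million dollars of cost."""
    return predicted_effectiveness(simulator["fidelity"]) / simulator["cost_m_usd"]

low_fi = {"name": "deployable_trainer", "fidelity": 0.4, "cost_m_usd": 2.0}
high_fi = {"name": "full_dome_sim", "fidelity": 0.9, "cost_m_usd": 12.0}
best_value = max((low_fi, high_fi), key=roi)
```

Under these made-up numbers the cheap, lower-fidelity trainer wins on ROI even though the high-fidelity simulator trains better in absolute terms, which is exactly the kind of trade-off the tool is meant to expose.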

  33. PERFORM: Air-Air Combat Fidelity Requirements • Conducted research to examine the visual and cockpit fidelity requirements for training air-to-air combat skills to experienced pilots in F-16 simulators • Developed a decision-support tool to help the Air Force prioritize technology enhancements to improve the effectiveness of deployable simulators • Deployable Tactics Trainer (DTT) • Display for Advanced Research and Technology (DART) © 2007, Aptima, Inc.

  34. PREDICT: Air-Ground Combat Fidelity Requirements • Conducting research to examine the visual fidelity requirements for training air-to-ground combat skills to inexperienced pilots in F-18 simulators • Developing a decision-support tool to help the Navy develop a strategy for employing both high- and low-fidelity simulators to meet training objectives © 2009, Aptima, Inc.

  35. FLEET: Developing and Validating Fidelity-Sensitive Measures • Developing measures of pilot performance that are sensitive enough to detect objective performance differences invoked by varying levels of fidelity • Conducting research to examine the motion fidelity requirements for training air-to-ground combat skills to inexperienced pilots in T-45 simulators © 2009, Aptima, Inc.

  36. Summary • Aptima seeks to identify the simulator fidelity requirements for effective training by: • Developing a systematic approach to establish quantitative, predictive relationships between simulator fidelity and training effectiveness • Building a computational model-based tool to predict the impact of simulator fidelity on training effectiveness • Creating performance measures and measurement tools that can be used to collect better data in simulator fidelity experiments • Employing the proper level of fidelity will ensure better training results and reduce costs by eliminating investments in unnecessary training and technology © 2009, Aptima, Inc.

  37. New Approaches to Measuring Performance • A well-designed measurement system makes simulation-based practice effective training • The right feedback to the right person at the right time leads to better learning • Measurement enables assessment of training effectiveness: are people getting the skills they need? • Guides selection of training environment (live, virtual, constructive) • Facilitates appropriate use of measures • Measurement technology can turn simulators into training machines

  38. You Can't Train What You Can't Measure Why is this hard? • Volume of data: "We recorded everything..." • Lack of meaningful aggregation methods: "325,435 messages were received…the average length was 2.35 minutes" • Interdependence of behaviors at different locations: no one person has the total picture • "Correct" behavior depends on dynamic context: it is hard to reconstruct, even after the fact, where the team went wrong • Hours spent training ≠ proficiency • Real-world experience ≠ proficiency

  39. Performance Measurement Process

  40. Competency-Based Performance Measurement • Competency-based performance measures leverage performance measurement theory in combination with subject matter expert input • Assesses team and individual performance • COMPASS℠ tells us what direction to go with measure development The COMPASS℠ Methodology is a product of Aptima, Inc.
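The "meaningful aggregation" problem raised two slides earlier, and the competency-based answer sketched here, can be illustrated with a weighted rollup: raw simulator measures are combined into competency scores using SME-assigned weights. The competency names, measures, and weights below are invented examples of the general idea, not the COMPASS℠ methodology itself.

```python
# Illustrative weighted rollup of raw measures into competency scores.
# All names and weights are hypothetical; SMEs would supply real ones.

def competency_scores(raw_measures, weight_map):
    """Weighted average of raw measures (each on a 0-1 scale) per competency."""
    scores = {}
    for competency, weights in weight_map.items():
        total_w = sum(weights.values())
        scores[competency] = sum(raw_measures[m] * w for m, w in weights.items()) / total_w
    return scores

raw = {"radio_calls_correct": 0.8, "response_latency_ok": 0.6, "targets_identified": 0.9}
weights = {
    "communication": {"radio_calls_correct": 2.0, "response_latency_ok": 1.0},
    "situation_awareness": {"targets_identified": 1.0, "response_latency_ok": 1.0},
}
scores = competency_scores(raw, weights)
```

The point of the rollup is diagnostic feedback: instead of "325,435 messages were received", the instructor sees a small set of competency scores that say where the team is weak.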

  41. A▪Measure Product Family Turning simulators into training machines

  42. Cognitive Skills Training • Not all skills should be trained the same way: cognitive skills vs. procedural skills • Pedagogical theory: direct instruction vs. constructivism • Most types of training/education employ a direct instruction approach, which can be effective for training procedural skills • Current research suggests that a problem-based approach may be more appropriate for training cognitive skills • What differs is when you tell someone how to do something relative to when they practice doing it • E.g., telling students the formula for density and having them practice (the traditional approach) enforces a plug-and-play understanding with very little transfer; instead, give students a situation and ask them how they would describe density, so they get a sense of the principles involved before you give the formula

  43. The Bransford Model

  44. Aptima’s Balanced Unified Incremental Learning Development (BUILD) Training Approach

  45. Example: How to "Train the Ear" • Air Battle Managers (ABMs) monitor multiple communications (radio) channels at the same time • This is currently an acquired skill developed through experience and on-the-job training; how can we train a novice? • At its core, this is a skill that relies heavily on cognitive skills like memory and attention • Monitoring multiple communications channels requires dedicating limited cognitive resources (memory) to attending to stimuli that must pass a certain threshold (attention) under stressful conditions (stress)

  46. How can these cognitive concepts help ABMs? "Monitoring multiple comms channels requires… dedicating limited cognitive resources (memory)" • Psychological research demonstrates that one can free limited working-memory resources by placing some information in long-term memory • The key to long-term memory storage is automaticity: when I see X, I do Y • The key to training automaticity is repetition: multiple trials

  47. How can these cognitive concepts help ABMs? "Monitoring multiple comms channels requires… dedicating limited cognitive resources (memory) to attending to stimuli that must pass a certain threshold (attention)" • The threshold is often physical (a certain volume, a certain brightness), but could also be semantic (meaningful): the cocktail party effect • Trigger words can break the semantic threshold for attention: when I hear an important word, I attend to it • This is the mechanism for retrieval

  48. From Theory to Application: A Layered Comms Training Approach • Phase I: Trainees are introduced to ABM trigger words and their definitions (begin to store in long-term memory) • Phase II: Trainees recognize trigger words within a stream of communications (develop automaticity and retrieval) • Phase III: Trainees recognize a trigger word in a realistic scenario and respond accordingly (automatically retrieve under stressful conditions)
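A Phase II drill of the kind described above can be sketched as a simple scoring loop: scan a stream of communications for trigger words and compare them against what the trainee flagged. The trigger words and scoring rule are hypothetical illustrations, not actual ABM brevity terms or Aptima's training content.

```python
# Minimal sketch of a Phase II trigger-word recognition drill.
# Trigger words and the hit-rate scoring rule are invented for illustration.

TRIGGER_WORDS = {"bandit", "bingo", "splash"}  # hypothetical examples

def score_drill(comm_stream, trainee_flags):
    """Compare the words a trainee flagged against the trigger words present."""
    heard = {w.strip(".,!").lower() for line in comm_stream for w in line.split()}
    present = heard & TRIGGER_WORDS
    hits = trainee_flags & present
    false_alarms = trainee_flags - present
    return {"hit_rate": len(hits) / len(present) if present else 1.0,
            "false_alarms": len(false_alarms)}

stream = ["Two ship inbound, bandit bearing 270.",
          "Fuel check complete.",
          "Bingo fuel, returning to base."]
result = score_drill(stream, trainee_flags={"bandit", "fuel"})
```

Repeating such drills with rising channel counts and noise is how the layered approach builds the automaticity that Phase III then tests under realistic stress.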

  49. Cognitive Skills Training

  50. CIFTS: Communications Analysis in Operational Environments • Domain: Air and Space Operations Center (AOC) • Numerous centers around the world • Around 100 operators communicating: in the air, on the ground, around the world • Extremely complex operations must be coordinated • Extensive use of chat to coordinate, assign tasks, exchange information • "Communications is at the Heart of Team Performance" © 2009, Aptima, Inc.
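One basic building block of chat analysis in this setting is tallying who talks to whom, which exposes coordination patterns across roughly 100 operators. The log format and operator call signs below are invented examples; CIFTS itself is not specified at this level of detail on the slide.

```python
from collections import Counter

# Illustrative sketch: directed sender->recipient message counts from a chat
# log of (sender, recipient, text) rows. Call signs and format are invented.

def exchange_counts(chat_log):
    """Tally how many messages flowed along each sender->recipient pair."""
    return Counter((sender, recipient) for sender, recipient, _ in chat_log)

log = [
    ("ISR_1", "C2_LEAD", "Tracking target at grid QR 1234."),
    ("C2_LEAD", "STRIKE_3", "Cleared to engage."),
    ("ISR_1", "C2_LEAD", "Target is mobile, updating coordinates."),
]
counts = exchange_counts(log)
busiest_pair, n_messages = counts.most_common(1)[0]
```

Even this crude tally supports the slide's claim: heavy or missing links between roles are directly visible in the communication structure, before any content analysis is applied.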
