Multimodal Information Exchange and Dynamic Adaptation

Presentation Transcript

  1. Multimodal Information Exchange and Dynamic Adaptation • Nadine Sarter, Thomas Ferris, Shameem Hameed (University of Michigan)

  2. Multimodal Adaptive Displays • Future battlefield operations will be highly complex and dynamic, requiring effective information systems and information exchange • Our approach: • Multimodal displays (including vision, audition, and touch) • Context-sensitive hybrid adaptive/adaptable information presentation

  3. Research Activities • Created flexible computer-based simulation platform that supports co-located and remote synchronous collaboration • Used platform for series of studies on • Natural patterns and preferences of modality usage • Preattentive monitoring of mission health • Crossmodal spatial and temporal links in attention

  4. Natural Patterns of Modality Usage • People do not necessarily interact multimodally just because a multimodal interface is made available • Multimodal interaction occurs primarily in the context of spatial tasks and to support complementarity • Users switch modalities mostly to recover from communication breakdowns • In the context of human-human interaction, most modality combinations were sequential in nature (“contrastive functionality”) • Modality usage patterns evolve as a function of team coordination and change in response to factors such as scenario demands, the mission phase, and group dynamics

  5. Crossmodal Links in Attention • Crossmodal spatial and temporal links • The modality and location of one stimulus may facilitate or hinder the processing of a subsequent stimulus in a different modality • These effects manifest only within a certain time interval between the stimuli (SOA, stimulus onset asynchrony) • Related information should therefore be co-located… (the binding “problem”)

  6. Previous XSL Studies • Spartan laboratory environments • Simple and artificial cues and tasks • Do we see these effects in more complex environments with real-world tasks and stimuli? • [Figure: cuing effect as a function of SOA, roughly 100-300 ms; e.g., Spence & Driver, 1997]
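
To make the cuing-effect measure concrete, here is a minimal sketch of how a crossmodal cuing effect at a given SOA can be computed from trial response times: mean uncued response time minus mean cued response time, so that positive values indicate a cuing benefit. The trial records and numbers below are invented for illustration; only the definition of the measure follows from the slides.

```python
# Illustrative cuing-effect calculation; all trial data are made up.
from statistics import mean

trials = [
    # (cue_relation, soa_ms, response_time_ms); cue_relation is None when uncued
    (None,            None, 620), (None,            None, 605), (None,           None, 640),
    ("ipsilateral",    200, 560), ("ipsilateral",    200, 575),
    ("contralateral",  200, 610), ("contralateral",  200, 600),
]

def cuing_effect(trials, relation, soa_ms):
    """Mean uncued RT minus mean cued RT; positive values indicate a cuing benefit."""
    uncued = [rt for rel, _, rt in trials if rel is None]
    cued = [rt for rel, soa, rt in trials if rel == relation and soa == soa_ms]
    return mean(uncued) - mean(cued)

print("Ipsilateral benefit at SOA 200 ms:", round(cuing_effect(trials, "ipsilateral", 200)), "ms")
print("Contralateral benefit at SOA 200 ms:", round(cuing_effect(trials, "contralateral", 200)), "ms")
```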

  7. Experimental Setup • [Simulator station diagram: periscope display, thermal detection system display, FBCB2 shared map display, satellite data uplink display, push-to-talk radio button strapped to index finger, left and right speakers, remote eye-tracking camera, joystick for UAV control, tactors worn on the wrists, tactors strapped to the outsides of the thighs]

  8. Method • 12 cadets and 3 graduates from the University of Michigan Army ROTC program (7 females, 8 males) • Each participant served as the Stryker vehicle commander (VC) of the lead vehicle in a convoy during a simulated night-time rendezvous mission • Throughout the mission, participants were presented with 48 visual, auditory, or tactile targets, either in isolation (‘uncued’ trials, n=24) or preceded (at various SOAs) by an ipsilateral (same-side, n=12) or contralateral (opposite-side, n=12) peripheral cue in a different modality
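
As a rough illustration of the cue/target design described above, the sketch below builds a trial list with 24 uncued, 12 ipsilaterally cued, and 12 contralaterally cued targets, with the cue always in a different modality than the target. The specific SOA values, the counterbalancing scheme, and all function names are assumptions for illustration, not the study’s actual protocol.

```python
# Hypothetical trial-schedule generator; SOA levels and balancing are assumed.
import random

MODALITIES = ["visual", "auditory", "tactile"]
SOAS_MS = [100, 200, 300]          # assumed SOA levels (slide 6 cites ~100-300 ms)
SIDES = ["left", "right"]

def build_trials(seed=42):
    rng = random.Random(seed)
    trials = []

    # 24 uncued trials: target only, spread across modalities and sides
    for i in range(24):
        trials.append({
            "cue": None,
            "target_modality": MODALITIES[i % 3],
            "target_side": SIDES[i % 2],
            "soa_ms": None,
        })

    # 12 ipsilateral + 12 contralateral cued trials: cue in a *different* modality
    for relation, n in (("ipsilateral", 12), ("contralateral", 12)):
        for i in range(n):
            target_mod = MODALITIES[i % 3]
            cue_mod = rng.choice([m for m in MODALITIES if m != target_mod])
            target_side = SIDES[i % 2]
            cue_side = target_side if relation == "ipsilateral" else \
                       ("right" if target_side == "left" else "left")
            trials.append({
                "cue": {"modality": cue_mod, "side": cue_side},
                "target_modality": target_mod,
                "target_side": target_side,
                "soa_ms": SOAS_MS[i % len(SOAS_MS)],
            })

    rng.shuffle(trials)
    return trials

if __name__ == "__main__":
    schedule = build_trials()
    print(len(schedule), "trials;", sum(t["cue"] is None for t in schedule), "uncued")
```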

  9. Main Findings • Confirm that crossmodal spatial links affect performance in more complex settings • Cuing effects were larger, and response times were longer and more variable, than in earlier research • Crossmodal asymmetries: • Ipsilateral crossmodal cuing was beneficial for auditory cuing of visual targets, but not vice versa • Significantly faster responses for contralateral tactile cuing of auditory targets, but not vice versa • Visual-tactile cue-target combinations showed a similar trend favoring contralateral presentations • Faster response times for contralateral presentations may in some cases be the result of inhibition of return (IOR)

  10. Cueing of Visual Targets • [Response-time data figure, annotated “IOR???”]

  12. Context-Sensitive Display Design • Hybrid approach: “delegation” • Combines positive aspects of adaptive (system-initiated) and adaptable (user-controlled, management-by-exception) interfaces • Combines and negotiates among a multitude of drivers related to the operator, the cues, and the environment (see the interaction sketch below)
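
The sketch below illustrates one possible reading of the “delegation” (management-by-exception) idea: the system proposes a display adaptation and the operator may take exception within a short window before it is applied. The class, the veto window, and the example adaptation are hypothetical; the slides do not specify an implementation.

```python
# Hypothetical management-by-exception loop; names and timing are illustrative.
import threading

class DelegationManager:
    def __init__(self, veto_window_s=5.0):
        self.veto_window_s = veto_window_s
        self._veto = threading.Event()

    def operator_veto(self):
        """Called by the operator's UI to reject the pending adaptation."""
        self._veto.set()

    def propose_adaptation(self, description, apply_fn):
        """Announce the adaptation, wait for a possible veto, then apply it."""
        self._veto.clear()
        print(f"Proposed adaptation: {description} "
              f"(override within {self.veto_window_s:.0f} s to keep current display)")
        vetoed = self._veto.wait(timeout=self.veto_window_s)
        if vetoed:
            print("Operator vetoed; keeping current presentation.")
        else:
            apply_fn()
            print("No exception taken; adaptation applied.")

# Example: reroute an alert from the visual to the tactile channel
if __name__ == "__main__":
    mgr = DelegationManager(veto_window_s=2.0)
    mgr.propose_adaptation("route convoy alert to tactile channel",
                           lambda: print(">> alert now presented via wrist tactors"))
```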

  13. Mental Workload • Heart rate • Confounded by physical demands and mental stress • Heart rate variability • Variation in the time interval between consecutive heart beats • Relatively stable index of mental workload • Shows shifts from rest state to task state • Shows varying levels of workload within the task state • Power spectrum analysis [workload figure from Rowe et al., 1998] (see the analysis sketch below)
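
As a minimal sketch of the kind of power-spectrum analysis mentioned above, the code below estimates low-frequency (LF) and high-frequency (HF) power from a series of inter-beat (RR) intervals; the LF/HF ratio is a commonly used workload/arousal index. The band limits, the resampling rate, and the synthetic data are conventional assumptions, not values taken from the slides.

```python
# HRV power-spectrum sketch; band limits and resampling rate are common conventions.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d
from scipy.integrate import trapezoid

def hrv_band_power(rr_intervals_s, fs_resample=4.0):
    """Return (LF, HF) spectral power from a series of RR intervals in seconds."""
    rr = np.asarray(rr_intervals_s, dtype=float)
    t = np.cumsum(rr)                                  # beat times
    grid = np.arange(t[0], t[-1], 1.0 / fs_resample)   # even time grid
    rr_even = interp1d(t, rr, kind="cubic")(grid)      # resample the RR series
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs_resample,
                   nperseg=min(256, len(grid)))
    lf_band = (f >= 0.04) & (f < 0.15)                 # low-frequency band
    hf_band = (f >= 0.15) & (f < 0.40)                 # high-frequency (respiratory) band
    return trapezoid(pxx[lf_band], f[lf_band]), trapezoid(pxx[hf_band], f[hf_band])

if __name__ == "__main__":
    # Synthetic RR intervals: ~75 bpm with a small respiratory modulation
    beats = 300
    rr = (0.8
          + 0.03 * np.sin(2 * np.pi * 0.25 * np.arange(beats) * 0.8)
          + 0.01 * np.random.randn(beats))
    lf, hf = hrv_band_power(rr)
    print(f"LF power: {lf:.6f}  HF power: {hf:.6f}  LF/HF ratio: {lf/hf:.2f}")
```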

  14. Availability/Appropriateness of Modality • Modality availability • Modality may have become temporarily or permanently unavailable due to ambient conditions/events (e.g., explosion/ambush) • Modality appropriateness • Nature and type of information conveyed • Certain modalities more appropriate than others for certain types of information • e.g., spatial information (geographic location) is best conveyed visually

  15. Challenges • Weight assignments for modalities may need to be adjusted • Deadlock arbitration module • Start with “hard” constraints (detectability, availability) • Then choose the modality with the highest appropriateness index • If those are the same, consider the previous cue’s modality and timing (a sketch of this arbitration order follows below)
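
A minimal sketch of the arbitration order listed above, assuming hypothetical field names and thresholds: hard constraints (availability, detectability) are applied first, then the appropriateness index, and the previous cue’s modality and timing break remaining ties.

```python
# Illustrative modality-arbitration sketch; thresholds and fields are assumptions.
import time
from dataclasses import dataclass

@dataclass
class ModalityState:
    name: str
    available: bool        # e.g., audio masked after an explosion or ambush
    detectability: float   # 0..1, under current ambient conditions
    appropriateness: float # 0..1, for the type of information to be conveyed

def arbitrate(candidates, last_cue_modality=None, last_cue_time=None,
              min_detectability=0.5, min_gap_s=1.0):
    # 1) Hard constraints: drop unavailable or poorly detectable modalities
    viable = [m for m in candidates
              if m.available and m.detectability >= min_detectability]
    if not viable:
        return None

    # 2) Prefer the highest appropriateness index
    best = max(m.appropriateness for m in viable)
    top = [m for m in viable if abs(m.appropriateness - best) < 1e-6]
    if len(top) == 1:
        return top[0]

    # 3) Tie-break: avoid the modality just used for a cue (within min_gap_s)
    if last_cue_modality and last_cue_time and time.time() - last_cue_time < min_gap_s:
        alternatives = [m for m in top if m.name != last_cue_modality]
        if alternatives:
            return alternatives[0]
    return top[0]

if __name__ == "__main__":
    states = [ModalityState("visual", True, 0.9, 0.8),
              ModalityState("auditory", False, 0.2, 0.9),   # masked by ambient noise
              ModalityState("tactile", True, 0.8, 0.8)]
    chosen = arbitrate(states, last_cue_modality="visual", last_cue_time=time.time())
    print("Selected modality:", chosen.name if chosen else "none")
```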

  16. Experiment – Final Step • The simulation and hybrid interface have been implemented • The experiment has been designed and is under IRB review; it will be conducted in the fall • It will examine the feasibility and effectiveness of the approach and compare it to “static” information presentation

  17. Tactons • Structured, complex tactile signals that communicate abstract messages (Brewster & Brown, 2004) • [Diagram labels: Braille, Text, Tactons, Visual icons]

  18. Tactile “Cues” vs. Tactons • Tactile “cues”: one or few parameters modulated; literal, critical to maintain an intuitive mapping; very limited information content • Tactons: usually multiple parameters modulated; abstract patterns represent larger concepts or messages; conscious processing required to decode the message • Examples: interruption management, patient monitoring
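
To illustrate the “multiple parameters modulated” idea, the sketch below represents a tacton as a small structured signal with frequency, intensity, rhythm, and body-site parameters, in the spirit of Brewster & Brown (2004). The parameter set, example messages, and values are illustrative assumptions, not the authors’ encoding scheme.

```python
# Hypothetical tacton representation; parameters and messages are illustrative.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Tacton:
    message: str                       # abstract concept the pattern stands for
    frequency_hz: float                # vibration frequency
    intensity: float                   # 0..1 drive amplitude
    rhythm: List[Tuple[float, float]]  # (on_s, off_s) pulse pattern
    body_site: str                     # e.g., "left wrist", "right thigh"

    def schedule(self):
        """Expand the rhythm into (start_s, duration_s) pulses for a tactor driver."""
        t, pulses = 0.0, []
        for on_s, off_s in self.rhythm:
            pulses.append((t, on_s))
            t += on_s + off_s
        return pulses

# Two example tactons distinguished by rhythm, intensity, and body site
CONVOY_HALT = Tacton("convoy halt", 250.0, 0.9, [(0.3, 0.1)] * 3, "left wrist")
NEW_WAYPOINT = Tacton("new waypoint received", 250.0, 0.6, [(0.1, 0.1)] * 5, "right wrist")

if __name__ == "__main__":
    for tacton in (CONVOY_HALT, NEW_WAYPOINT):
        print(tacton.message, "->", tacton.schedule())
```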

  19. Conclusion • A much more complex picture emerges for effective multimodal information presentation • Requiring careful choice of modality pairings, locations, and timing • Calling for context-sensitive presentation of information • The power of tactons is far from fully exploited • Beware of simplistic guidelines (“adapt modalities to user preferences”), but see, for example, Jones and Sarter (2009) for guidance