Visualization and Fusion

Presentation Transcript

  1. Visualization and Fusion Craig Scott, Aisha Page, Olusanya Soyannwo, Hamzat Kassim, Paul Blackmon, Pierre Knight, Francis Dada Engineering Visualization Laboratory

  2. Outline • Overview • Image/Spatial Fusion • Motion detection and object tracking • Synthetic Battlespace • Conclusion

  3. Overview • The objective of this research is to enhance battlespace awareness and move tactical decision making closer to the field soldier by applying image and spatial data fusion concepts.

  4. Seed Demo [Slide shows an architecture diagram: BORG Experimental Architecture & Platform. Labels include heterogeneous data sources (WWW, JCDB, RSS, GIS, content-based imagery DB), data streams, autonomous fusion and navigation, interactive fusion and navigation, advanced interactions and visualizations, content-based messaging, cognitive models and cognitive deficiencies, knowledge engineering, commander/analyst workstations, Tactical Operations Center, soldier HCI, command and control, IMPACT/PASTA/P2P (Sun JXTA), and implementation technologies (C++ DirectShow for Windows AVI video, Java/C++, SQL Server/Oracle, UML), annotated with collaborators (Hendler and VS - UMD, Roy George - Clark Atlanta, Andrew Cowley - PNNL, Monmouth) and team member assignments.]

  5. Issues/Approach • Exploiting Level-of-Detail Perception • Multiple channel architectures • Area-of-interest filters • Subscription-based aggregation • Exploiting Temporal Perception • Render the entity at an accurate location as long as the local user does not interact with it • Distinguish active and passive entities
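
The area-of-interest and active/passive ideas above can be made concrete with a small filter that decides, per entity update, whether to drop, aggregate, or forward it to a viewer. This is only an illustrative sketch: the EntityUpdate/Viewer types, the aoiRadius field, and the routeUpdate policy are hypothetical names, not part of the seed demo.

// Illustrative area-of-interest filter: forward an entity update to a viewer
// only if it falls inside the viewer's AOI radius, and relax the update rate
// for passive (non-interacting) entities. All names are hypothetical.
#include <cmath>
#include <iostream>

struct EntityUpdate {
    int id;
    double x, y, z;      // reported position
    bool active;         // active entities interact with the local user
};

struct Viewer {
    double x, y, z;      // viewer position
    double aoiRadius;    // area-of-interest radius (metres)
};

// Distance-based AOI test.
bool insideAOI(const EntityUpdate& e, const Viewer& v) {
    double dx = e.x - v.x, dy = e.y - v.y, dz = e.z - v.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz) <= v.aoiRadius;
}

// Subscription-style decision: passive entities outside the AOI are dropped;
// passive entities inside the AOI are aggregated (sent less often), while
// active entities are always forwarded at full rate.
enum class Action { Drop, Aggregate, Forward };

Action routeUpdate(const EntityUpdate& e, const Viewer& v) {
    if (!insideAOI(e, v)) return Action::Drop;
    return e.active ? Action::Forward : Action::Aggregate;
}

int main() {
    Viewer v{0.0, 0.0, 0.0, 500.0};
    EntityUpdate tank{42, 120.0, 80.0, 0.0, /*active=*/false};
    std::cout << "action = " << static_cast<int>(routeUpdate(tank, v)) << '\n';
    return 0;
}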

  6. Image/Spatial Data Fusion [Slide shows a layer diagram with labels: Meaning, Reasoning, Combination, Registration, Viewing Volume or Plane, Data Structures, Terrain Layer] • Image/Spatial Data Fusion: combining complete, spatially filled sets of data in 2D or 3D • 2D: optical photos & video, FLIR, SAR • 3D: terrain data, buildings, vehicles, weather models, LADAR
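
As a minimal illustration of the combination step for two layers that are already registered to the same viewing plane, the sketch below blends an optical photo with a rendered terrain layer using OpenCV. The file names, blend weights, and the assumption that registration was done upstream are all illustrative.

// Combine two already-registered 2D layers (e.g. an optical photo and a
// rendered terrain/FLIR layer) by weighted blending. Registration is assumed
// to have been done upstream; file names are placeholders.
#include <opencv2/opencv.hpp>

int main() {
    cv::Mat optical = cv::imread("optical.png");   // placeholder inputs
    cv::Mat terrain = cv::imread("terrain.png");
    if (optical.empty() || terrain.empty()) return 1;

    // Both layers must share the same viewing plane (same size after registration).
    cv::resize(terrain, terrain, optical.size());

    cv::Mat fused;
    cv::addWeighted(optical, 0.6, terrain, 0.4, 0.0, fused);  // simple combination rule

    cv::imwrite("fused.png", fused);
    return 0;
}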

  7. Tracking Studies • Objective • Provide a solution to a soldier's query about unrecognizable objects (tanks, airplanes, people, etc.) through object tracking • Problem • Finding a tracking algorithm efficient enough to display and track the desired object • Implementing different algorithms in a framework that selects the algorithm best suited to the current field operation • Solution • Implement various tracking algorithms under varying environmental conditions • Validate effectiveness through perceptual studies

  8. Motion and Object Tracking Algorithms • Background subtraction • Motion templates • Optical flow • Active contours • Estimators

  9. Background Subtraction (cont’d) • μt = αy + (1 − α)μt−1 • μ – updated background image • t – time (frame) index • α – learning rate constant • Specifies how fast (how responsively) the background model adapts to changes • 0 ≤ α ≤ 1 • y – new observation at time t
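
A sketch of this running-average background model, assuming OpenCV: cv::accumulateWeighted implements μt = αy + (1 − α)μt−1, and a differencing/thresholding step then yields a foreground mask. The camera source and threshold value are illustrative choices.

// Running-average background model: mu_t = alpha*y + (1 - alpha)*mu_{t-1},
// followed by differencing and thresholding to obtain a foreground mask.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap(0);                  // illustrative: default camera
    if (!cap.isOpened()) return 1;

    const double alpha = 0.05;                // learning rate, 0 <= alpha <= 1
    cv::Mat frame, gray, background, diff, mask;

    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        gray.convertTo(gray, CV_32F);

        if (background.empty()) gray.copyTo(background);

        // mu_t = alpha * y + (1 - alpha) * mu_{t-1}
        cv::accumulateWeighted(gray, background, alpha);

        cv::absdiff(gray, background, diff);
        cv::threshold(diff, mask, 25.0, 255.0, cv::THRESH_BINARY);  // illustrative threshold
        mask.convertTo(mask, CV_8U);

        cv::imshow("foreground", mask);
        if (cv::waitKey(30) == 27) break;     // Esc to quit
    }
    return 0;
}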

  10. Active Contour • Active Contours • Curves defined within an image domain • Can move under influence of • Internal forces coming from within the curve itself • External forces computed from the image data • Allows the computer to generate curves that move within images to • locate object boundaries • find other desired features within an image

  11. Active Contour (cont’d) • Energy equation associated with the snake • E = Eint + Eext • Eint – internal energy formed by the snake configuration • Eint = Econt + Ecurv • Econt – contour continuity energy • Minimizing Econt over all the snake points causes the points to become more equidistant • Ecurv – contour curvature energy • The smoother the contour, the lower the curvature energy

  12. Active Contour (cont’d) • Eext is the external energy formed by external forces affecting the snake • Eext = Eimg + Econ • Eimg – image energy • Two variants of image energy are proposed: • Eimg = −I, where I is image intensity • Eimg = −||grad(I)||, so the snake is attracted to image edges • Econ – energy of additional constraints
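
Since the legacy cvSnakeImage routine is no longer part of current OpenCV, the sketch below illustrates one greedy minimization step over E = Econt + Ecurv + Eimg with Eimg = −||grad(I)||. The weights, 3x3 search neighbourhood, input file name, and initial circular contour are illustrative choices, not the presentation's implementation.

// One greedy minimization step of a snake: each point moves to the position in
// its 3x3 neighbourhood that minimizes E = Econt + Ecurv + Eimg, with
// Eimg = -||grad(I)||.
#include <opencv2/opencv.hpp>
#include <cmath>
#include <limits>
#include <vector>

static double dist(const cv::Point& a, const cv::Point& b) {
    return std::hypot(double(a.x - b.x), double(a.y - b.y));
}

// Curvature term: squared magnitude of the discrete second derivative.
static double curvEnergy(const cv::Point& prev, const cv::Point& p, const cv::Point& next) {
    cv::Point c = prev - 2 * p + next;
    return double(c.x) * c.x + double(c.y) * c.y;
}

void greedySnakeStep(std::vector<cv::Point>& snake, const cv::Mat& gradMag,
                     double wCont = 1.0, double wCurv = 1.0, double wImg = 1.5) {
    const int n = int(snake.size());
    double meanDist = 0.0;                               // average spacing between points
    for (int i = 0; i < n; ++i) meanDist += dist(snake[i], snake[(i + n - 1) % n]);
    meanDist /= n;

    for (int i = 0; i < n; ++i) {
        const cv::Point prev = snake[(i + n - 1) % n], next = snake[(i + 1) % n];
        cv::Point best = snake[i];
        double bestE = std::numeric_limits<double>::max();
        for (int dy = -1; dy <= 1; ++dy)
            for (int dx = -1; dx <= 1; ++dx) {
                cv::Point cand(snake[i].x + dx, snake[i].y + dy);
                if (cand.x < 0 || cand.y < 0 || cand.x >= gradMag.cols || cand.y >= gradMag.rows)
                    continue;
                double eCont = std::pow(dist(cand, prev) - meanDist, 2); // continuity
                double eCurv = curvEnergy(prev, cand, next);             // smoothness
                double eImg  = -double(gradMag.at<float>(cand));         // -||grad(I)||
                double e = wCont * eCont + wCurv * eCurv + wImg * eImg;
                if (e < bestE) { bestE = e; best = cand; }
            }
        snake[i] = best;
    }
}

int main() {
    cv::Mat img = cv::imread("frame.png", cv::IMREAD_GRAYSCALE);  // placeholder input
    if (img.empty()) return 1;
    cv::Mat gx, gy, gradMag;
    cv::Sobel(img, gx, CV_32F, 1, 0);
    cv::Sobel(img, gy, CV_32F, 0, 1);
    cv::magnitude(gx, gy, gradMag);                      // external image force field

    std::vector<cv::Point> snake;                        // illustrative initial contour: a circle
    for (int k = 0; k < 40; ++k)
        snake.emplace_back(img.cols / 2 + int(60 * std::cos(k * CV_PI / 20)),
                           img.rows / 2 + int(60 * std::sin(k * CV_PI / 20)));

    for (int iter = 0; iter < 100; ++iter) greedySnakeStep(snake, gradMag);
    return 0;
}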

  13. KLT: Kanade-Lucas-Tomasi Feature Tracker • Problem • Complex changes occur between frames • Good features are located by: • Examining the minimum eigenvalue of each 2x2 gradient matrix • Key components of the feature tracker • Accuracy: relates to the local sub-pixel accuracy attached to tracking • Robustness: relates to the sensitivity of tracking with respect to changes in: • Lighting • Size of image motion • Goal • Find the location of each feature in the second image • Such that the windows in images one and two are similar
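
The minimum-eigenvalue criterion can be illustrated with OpenCV's cornerMinEigenVal, which computes the smaller eigenvalue of the 2x2 gradient (structure) matrix around every pixel. The block size, threshold, and input file name below are illustrative.

// Good-features criterion: compute the minimum eigenvalue of the 2x2 gradient
// (structure) matrix around each pixel and keep the pixels above a threshold.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::Mat img = cv::imread("frame1.png", cv::IMREAD_GRAYSCALE);   // placeholder input
    if (img.empty()) return 1;

    cv::Mat minEig;
    cv::cornerMinEigenVal(img, minEig, /*blockSize=*/3);  // per-pixel minimum eigenvalue

    double maxVal = 0.0;
    cv::minMaxLoc(minEig, nullptr, &maxVal);
    const float thresh = float(0.01 * maxVal);            // illustrative relative threshold

    std::vector<cv::Point> candidates;                    // pixels that are "easy to track"
    for (int y = 0; y < minEig.rows; ++y)
        for (int x = 0; x < minEig.cols; ++x)
            if (minEig.at<float>(y, x) > thresh)
                candidates.emplace_back(x, y);

    std::printf("%zu candidate feature pixels\n", candidates.size());
    return 0;
}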

  14. Computing Image Motion • Residual function: ε = ∫∫W [J(Ax + d) − I(x)]² w(x) dx • J – second image • I – first image • W – given feature window • x – point within the image • w(x) – weighting function • A = 1 + D (where 1 is the 2x2 identity matrix) • d – translation (uniform movement) of the feature window’s center from one frame to the next • D – deformation (change of shape) matrix • Used to determine whether the first and current frames match
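
For the purely translational case (D = 0, so A is the identity), the residual reduces to ε(d) = Σx w(x)[J(x + d) − I(x)]². The sketch below evaluates that sum over a feature window with uniform weighting; the function name, window half-size, and input file names are illustrative.

// Weighted SSD residual over a feature window for the purely translational
// case (D = 0, A = identity): eps(d) = sum_x w(x) * (J(x + d) - I(x))^2.
#include <opencv2/opencv.hpp>
#include <cstdio>

double windowResidual(const cv::Mat& I, const cv::Mat& J,   // CV_32F grayscale frames
                      cv::Point center, cv::Point d, int half = 7) {
    double eps = 0.0;
    for (int dy = -half; dy <= half; ++dy)
        for (int dx = -half; dx <= half; ++dx) {
            cv::Point pI(center.x + dx, center.y + dy);      // point x in the window W
            cv::Point pJ(pI.x + d.x, pI.y + d.y);            // displaced point x + d
            if (pI.x < 0 || pI.y < 0 || pI.x >= I.cols || pI.y >= I.rows ||
                pJ.x < 0 || pJ.y < 0 || pJ.x >= J.cols || pJ.y >= J.rows)
                continue;
            double diff = J.at<float>(pJ) - I.at<float>(pI); // J(x + d) - I(x)
            eps += diff * diff;                              // uniform weighting w(x) = 1
        }
    return eps;                                              // small residual => windows match
}

int main() {
    cv::Mat I = cv::imread("frame1.png", cv::IMREAD_GRAYSCALE);  // placeholder inputs
    cv::Mat J = cv::imread("frame2.png", cv::IMREAD_GRAYSCALE);
    if (I.empty() || J.empty()) return 1;
    I.convertTo(I, CV_32F);
    J.convertTo(J, CV_32F);
    cv::Point feature(I.cols / 2, I.rows / 2);               // illustrative feature location
    std::printf("residual at d=(0,0): %f\n", windowResidual(I, J, feature, {0, 0}));
    return 0;
}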

  15. Feature Selection • A means to select points in image I • Selection maximizes the quality of tracking • Central step of tracking – computation of the optical flow • Optical flow: motion of brightness patterns in the image • At this critical step the minimum eigenvalue must be larger than a threshold • This characterizes pixels that are “easy to track”
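
A sketch of the full select-then-track loop, assuming OpenCV: goodFeaturesToTrack with useHarrisDetector = false applies the minimum-eigenvalue ("easy to track") criterion, and calcOpticalFlowPyrLK computes pyramidal Lucas-Kanade optical flow into the second frame. File names and parameter values are illustrative.

// Select "easy to track" features in frame one via the minimum-eigenvalue
// criterion, then track them into frame two with pyramidal Lucas-Kanade.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::Mat f1 = cv::imread("frame1.png", cv::IMREAD_GRAYSCALE);  // placeholder inputs
    cv::Mat f2 = cv::imread("frame2.png", cv::IMREAD_GRAYSCALE);
    if (f1.empty() || f2.empty()) return 1;

    // Feature selection that maximizes tracking quality: useHarrisDetector=false
    // selects the Shi-Tomasi / KLT minimum-eigenvalue measure.
    std::vector<cv::Point2f> pts1, pts2;
    cv::goodFeaturesToTrack(f1, pts1, /*maxCorners=*/200, /*qualityLevel=*/0.01,
                            /*minDistance=*/8, cv::noArray(), /*blockSize=*/3,
                            /*useHarrisDetector=*/false);

    // Pyramidal LK copes with larger image motions by tracking coarse to fine.
    std::vector<uchar> status;
    std::vector<float> err;
    cv::calcOpticalFlowPyrLK(f1, f2, pts1, pts2, status, err,
                             cv::Size(21, 21), /*maxLevel=*/3);

    size_t tracked = 0;
    for (uchar s : status) if (s) ++tracked;
    std::printf("tracked %zu of %zu features\n", tracked, pts1.size());
    return 0;
}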

  16. Test Scene: Virtual Laboratory for Data Fusion Studies

  17. Tracking Example: Background Subtraction

  18. Battlespace Visualization Battlespace visualization is the process whereby the commander— • Develops a clear understanding of the current state with relation to the environment. • Envisions a desired end state that represents mission accomplishment. • Visualizes the sequence of activity that moves the commander’s force from its current state to the end state. Army Geospatial Guide for Commanders and Planners, TC 5-230, November 2003.

  19. Synthetic Battlespace Research Issues • Perception/Trust • Agent Roles & Interaction • 3D Scene Reconstruction • Automated Feature Extraction • Registration Accuracy • LOD/Bandwidth Tradeoff • “As DoD looks to the future, increasing demands on the warfighter dictate the increased use of simulations in operational situations. Ideally, the simulation power is placed at the immediate disposal of the warfighter so that it can be accessed and employed when needed.”

  20. Conclusion • We have implemented rudimentary prototype object tracking algorithms within a seed demo application to illustrate image/spatial fusion concepts. • The synthetic battlespace concepts require integrating present joint simulation technology with fusion research concepts to articulate a COP (common operational picture) that is easy to understand and react to. • The commercial game market supplies a significant amount of talent and resources (and funding) to fuel this area.
