
Next Generation 4-D Distributed Modeling and Visualization of Battlefield






Presentation Transcript


  1. Next Generation 4-D Distributed Modeling and Visualization of Battlefield. Avideh Zakhor, UC Berkeley

  2. MURI on Mobile Augmented Battlespace Visualization • Participants: • Avideh Zakhor (UC Berkeley) • Bill Ribarsky (Georgia Tech) • Ulrich Neumann (USC) • Pramod Varshney (Syracuse) • Suresh Lodha (UC Santa Cruz)

  3. Battlefield Visualization • A detailed, timely, and accurate picture of the modern battlefield is vital to the military • Many sources of info: • eyewitness accounts, aerial photographs, sonar, Synthetic Aperture Radar (SAR), Multi-Spectral Imaging (MSI), Hyper-Spectral Imaging (HSI), Foliage PENetration (FOPEN) radar, Electro-Optic (EO), Infra-Red (IR), Moving Target Imaging (MTI)

  4. Major Challenges • Disparate/conflicting sources of info must be combined. • Impossible for ONE individual to collect and comprehend it all • Specially trained technicians are needed for each info source • Effectiveness of information combining and fusion is determined by its usability. • Must avoid information overload in presenting the data.

  5. Historical Perspective • Sand box: • box filled with sand shaped to replicate battlespace terrain. • Commanders moved around small physical replicas of battlefield objects to direct the situation.

  6. Historical Perspective • Paper maps and acetate: • As intelligence arrives, technicians use grease pencils to mark new info on acetate. • Commanders draw on the acetate to plan battlefield situations. • Time consuming: several hours to print, distribute and update. • Many opportunities for introducing errors.

  7. Historical Perspective • Joint Maritime Command Information System (JMCIS): • computerized battlespace visualization • two fundamental limitations: • clutter when displaying too much info. • Two-dimensional display: 3D info is lost.

  8. Historical Perspective • Responsive workbench with Dragon: • Inherently 3D. • Workstation renders 3D, back-projected on a horizontal screen. • Users in the viewing area interact with the bench through 3D mouse, pinch gloves, speech recognition. • Stereoscopic display with LCD shutter glasses.

  9. Improvements over WRB/Dragon • Workbench not suitable for mobile soldier with PDA. • Augmented reality can enhance Dragon/WRB: • sensors distributed among the soldiers, can be used both to navigate and to update the visualization database. • Need to deal with uncertainty: represent, compute, and visualize uncertainty information without cluttering. • Time should become the 4th dimension. 4D model construction. Play back/visualize the last 24 hours.

  10. Agile, Mobile, Collaborative Testbed • A networked, collaborative whole Earth terrain database linking workstations, large projected displays, and mobile handheld systems. • Mobile users carry systems with handheld or augmented displays providing 3D terrain visualizations. • Mobile users receive, record, and transmit information about the world. • Users of stationary 3D displays collect and evaluate information provided by mobile users and route data and intelligence back to mobile users. • Collaboration is through annotation of the virtual geo-spatial database.

  11. Architecture: Agile, Mobile, Collaborative Testbed • (diagram) A Central Terrain Database connects over Wireless Terrain Datalinks to Mobile Displays (w/ GPS, bearing, tilt sensors and wireless data), the NAVE, and the Virtual Workbench. • Wireless Collaboration Channels carry Position, Terrain Markup, Route Finding, Weather, Friendly/Foe advisories, etc. • Local databases download and cache data according to bandwidth, movement, and rendering speed of each platform.

  12. Research Agenda • Model construction: • initially constructed by registering sensor imagery to reference imagery, maps, elevation data, etc. • Four-dimensional: time treated on the same footing as space. • Model update: distributed mobile or stationary users contribute to updating the database via their sensors. • Mobile, real-time visualization, interaction, and navigation within the database; augmented reality sensor tracking and registration required. • Uncertainty processing for model construction and update, as well as uncertainty visualization.

  13. Technical Challenges • (diagram) Model Update & Construction • Tracking & Registration • Uncertainty Processing • Uncertainty Visualization • Database Visualization

  14. 3-D and 4-D Model Construction • Develop a framework for fast, automatic, and accurate 3D model construction for objects, scenes, rooms, buildings (interior and exterior), urban areas, and cities. Incorporate the time element (4-D). • Models must be easy to compute, compact to represent, and suitable for insertion in large hierarchical visualization databases, to facilitate high-quality view synthesis and visualization from views that were not necessarily captured during the data collection process. • Strategy: • Fusion of multiple data sources: intensity, range, GPS, panoramic cameras. • Incorporate a priori models, e.g. CAD, DEM, DTED, elevation data, maps
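To make the fusion strategy concrete, here is a minimal, hypothetical sketch (not from the slides) of how a new elevation observation might be merged into a reference DEM: each cell is combined by inverse-variance weighting, so the more certain source dominates. The grids, variances, and the `update_dem` helper are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np

def update_dem(dem, dem_var, obs, obs_var):
    """Fuse a new elevation observation grid into a reference DEM,
    weighting each cell by inverse variance (simple per-cell fusion)."""
    w = dem_var / (dem_var + obs_var)            # weight given to the new data
    fused = dem + w * (obs - dem)                # pull estimate toward observation
    fused_var = dem_var * obs_var / (dem_var + obs_var)  # uncertainty shrinks
    return fused, fused_var

# Reference DEM (variance 4 m^2) fused with a sharper observation (variance 1 m^2)
dem = np.array([[100.0, 102.0]]); dem_var = np.array([[4.0, 4.0]])
obs = np.array([[101.0, 101.0]]); obs_var = np.array([[1.0, 1.0]])
fused, var = update_dem(dem, dem_var, obs, obs_var)
```

The fused variance is always smaller than either input's, which is what lets repeated updates over time (the 4-D element) sharpen the model.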

  15. Geo-registration and Tracking • Develop techniques for unencumbered, wide-area, and real-time tracking for (a) augmentation and (b) visualization. • Strategy: estimate real-time 6-DOF tracking by fusion of multiple data streams with variable uncertainty: GPS, vision, inertial gyros, accelerometers, compass, laser range finders. • A real-time prototype will use the resulting tracking algorithms on an augmented reality PDA to geo-register and navigate the user within the geo-spatial database.
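Fusing data streams "with variable uncertainty" is commonly done with a Kalman-style filter; the 1-D sketch below (an illustration, not the project's actual tracker) shows how a drifting inertial position estimate is corrected by a lower-variance GPS fix. The variances and the `kalman_fuse` helper are assumed for the example.

```python
def kalman_fuse(x, P, z, R, Q=0.01):
    """One predict/update step: fold measurement z (variance R) into
    state estimate x (variance P). 1-D illustration only."""
    P = P + Q                 # predict: uncertainty grows by process noise Q
    K = P / (P + R)           # Kalman gain: weight measurement by confidence
    x = x + K * (z - x)       # update: move estimate toward measurement
    P = (1 - K) * P           # update: uncertainty shrinks
    return x, P

# Drifting inertial estimate (variance 4) corrected by a GPS fix (variance 1)
x, P = 100.0, 4.0
x, P = kalman_fuse(x, P, z=98.0, R=1.0)
```

The same recursion, extended to a 6-DOF state vector, is a standard way to combine GPS, vision, and inertial rates at their different update frequencies.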

  16. Mobile Visualization • VGIS: general framework for global geo-spatial data and visualization. • Hierarchical, scalable data structures with fast access that include (a) time; (b) uncertainty; (c) all varied products. • Dynamic data structures with fast update: real-time data is put in a dynamic cache until the system finds time to integrate it with the online database. • Automated detail management for (a) uncertainty; (b) all visual products. • Intelligent retrieval and visual data mining. • Multi-modal interaction in multiple environments.
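Hierarchical whole-Earth terrain structures of this kind are typically quadtrees, where automated detail management picks a tree level from the viewer's distance. The sketch below is a hypothetical illustration of that idea (the tile size, level cap, and `lod_for_distance` helper are assumptions, not VGIS internals).

```python
def lod_for_distance(dist_m, tile_size_m=40_000_000, max_level=18):
    """Pick a quadtree level so on-screen detail stays roughly constant:
    each level halves the tile size, so deeper levels serve closer viewers."""
    level = 0
    size = tile_size_m
    while level < max_level and size > dist_m:
        size /= 2          # descend one quadtree level
        level += 1
    return level
```

A mobile client would request only the tiles at its computed level, which is how the same database can serve a handheld PDA and a projected workbench at very different bandwidth and rendering budgets.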

  17. Technical Challenges: Uncertainty Computation and Visualization

  18. Uncertainty Processing • Representation and computation issues: • many formalisms: probability, possibility, evidence theory; transformations between them. • Fusion: at data, feature, and decision levels • uncertainty-aware fusion algorithms for dynamic distributed networks; various topologies for fusion networks: serial, parallel, tree network, non-tree feedback network. • decentralized statistical inference algorithms • Time-critical computation and quality-of-service issues: • Data continually arriving, requiring re-computation. • tradeoff between precision and speed of uncertainty computation; fast, yet imprecise answers.
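Decision-level fusion in a parallel topology can be illustrated with the classic Chair-Varshney rule (associated with PI Varshney's work): each sensor sends a binary decision, and the fusion center sums log-likelihood ratios weighted by each sensor's known detection and false-alarm probabilities. The specific numbers below are made up for illustration.

```python
import math

def chair_varshney(decisions, pd, pf, prior=0.5):
    """Fuse binary local decisions u_i in {0, 1} from detectors with known
    detection (pd) and false-alarm (pf) probabilities. Returns 1 ("target
    present") if the summed log-likelihood ratio exceeds the prior threshold."""
    llr = math.log(prior / (1 - prior))
    for u, d, f in zip(decisions, pd, pf):
        if u == 1:
            llr += math.log(d / f)                  # detector said "target"
        else:
            llr += math.log((1 - d) / (1 - f))      # detector said "no target"
    return 1 if llr > 0 else 0

# Three sensors, two voting "target"; the more reliable ones dominate
print(chair_varshney([1, 1, 0], pd=[0.9, 0.8, 0.6], pf=[0.1, 0.2, 0.4]))  # prints 1
```

Because reliable sensors carry larger log-likelihood weights, a vote from a poor detector cannot outweigh agreement between good ones.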

  19. Uncertainty Visualization and Validation • Present uncertainty in an intuitive, uncluttered way • Display devices: screen space, mobility • Modality: vision, audio, haptics • Data types: scalar/vector/tensor, discrete/continuous, static/dynamic • Uncertainty visualization techniques: glyphs, deformation, transparency, texture, superimposing/backgrounding, augmented reality. • Validation with novice and expert users: • Tasks for the mobile battlefield; measure accuracy and speed performance of users; conduct statistical analysis.
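Of the listed techniques, transparency is the simplest to sketch: map an estimate's variance to opacity so confident data renders solid and uncertain data fades rather than cluttering the display. The mapping below, its variance cap, and its opacity floor are illustrative assumptions only.

```python
def uncertainty_to_alpha(var, var_max=10.0):
    """Map estimate variance to display opacity in [0.1, 1.0]:
    zero variance renders fully opaque; high variance fades out."""
    alpha = 1.0 - min(var, var_max) / var_max
    return max(alpha, 0.1)   # floor keeps uncertain data faintly visible
```

The floor reflects a common design choice in uncertainty display: highly uncertain data should look unreliable, but never disappear entirely, since its existence is itself information.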

  20. Proof of Concept: Multi-modal Interactions for PDA • Extend interaction methods for workstations to the PDA • Wired (electromagnetic) and unwired (vision) interaction modes, hand gestures • Visible and infrared sensors added to the PDA • Demonstrate a hybrid vision/inertial tracking PDA • Demonstrate VGIS visualization in a portable display context • Multiple mobile users collaborate on identifying and locating features or targets.

  21. Facilities and Equipment • G. Tech: GVU Future Computing Lab (FCL); Mobile augmented test-bed. • USC: IMSC lab; CGIT lab. • UCSC: VizLab • Syracuse: Sensor Fusion Laboratory (SFL) • Berkeley: Video and Image Processing (VIP) Lab.

  22. Collaboration and Management Issues • Dr. Hassan Foroosh, currently with the University of Maryland, will be the technical liaison amongst the five universities. • Periodic telephone conferences amongst team members to discuss the status of the project. • Live seminars broadcast via video conferencing amongst the five campuses, once every two months. • Annual workshops and retreats for PIs, graduate students, and all other researchers to exchange ideas.

  23. Transitions • Air Force Research Lab (Graniero) • Army Research Lab (Emmerman, Tocarcik) • Navy Research Lab (Rosenblum) • Lawrence Livermore Lab (Ucelton) • Sun Microsystems (Sowizraj) • Intel (Liang) • Planet 9 (Colleen) • Geometrix (Zwern) • Hughes Research Lab (Azuma)

  24. Cross Collaboration

  25. Outline of Talks • 3D model construction for visualization (UC Berkeley) • Geo-registration and tracking for augmentation and visualization (USC) • Mobile visualization in a dynamic, augmented battlespace (Georgia Tech) • Uncertainty processing and information fusion (Syracuse) • Uncertainty visualization and validation (UC Santa Cruz)
