
The MAGIC Project Final Demonstration and Briefing


Presentation Transcript


  1. The MAGIC Project: Final Demonstration and Briefing. Ira Richer, 29 July 1999

  2. MAGIC Scenario

  3. Datasets
  • Globe
  • San Francisco Bay area
  • Menlo Park
  • Ft. Irwin
  • Camp Pendleton (low resolution)
  • Camp Pendleton (high resolution)
  • Laguna Beach
  • Yosemite
  • Colorado
  • Lawrence, KS
  • Ohio
  • Washington, DC
  • Ft. Benning

  4. Background
  • The MAGIC scenario is based on applications such as mission planning and crisis management
  • These applications need to:
    • locate, process, and display real-time and/or archived datasets on-the-fly
    • fuse multiple types of data (e.g., weather, terrain)
    • transfer large volumes of data for interactive visualization
    • support a range of user capabilities (e.g., command center, mobile user)
  • MAGIC focuses on a terrain visualization application that uses geo-referenced datasets

  5. Objectives
  • To develop an information system that provides application users with fast, interactive access to very large volumes of remote data distributed over a wide geographic area
  • To demonstrate the above capability in the context of a 3-D terrain visualization application

  6. MAGIC Participants
  • Corp. for National Research Initiatives
  • USGS/EROS Data Center
  • Lawrence Berkeley National Laboratory
  • Sprint
  • SRI International
  • University of Kansas

  7. Distinguishing Characteristics
  • Distributed information system, designed from the outset to work over a network
    • adaptation to network conditions and to system load
  • No limit on size of viewable datasets
  • No restrictions on location of datasets
  • Fusion of multiple datasets
  • Near real-time visualization of terrain data

  8. MAGIC Components
  • High-speed internetwork
  • Distributed parallel storage system (DPSS)
  • Software for terrain dataset preparation
  • 3-D interactive terrain visualization application: TerraVision

  9. Generic Configuration
  [Diagram: users, repositories, and processors connected to multiple DPSS servers over a high-speed internetwork]

  10. System Operation
  [Diagram: numbered tiles on the user's path of travel are located on the DPSS servers and disks that hold them (Server 1/Server 2, disks D1/D2), then retrieved and transmitted to the user over the internetwork]
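The system-operation slide above can be sketched in code: tiles along the predicted path of travel are looked up in a placement map and grouped by owning server so each server can be asked for its share. This is an illustrative sketch only; the round-robin placement and all function names are assumptions, not the actual DPSS striping scheme or protocol.

```python
# Sketch of tile retrieval along a path of travel (illustrative, not the DPSS protocol).
from collections import namedtuple

Tile = namedtuple("Tile", ["tile_id", "server", "disk"])

def place_tiles(tile_ids, servers=("S1", "S2"), disks=("D1", "D2")):
    """Assign tiles to server/disk pairs (round-robin stand-in for DPSS striping)."""
    pairs = [(s, d) for s in servers for d in disks]
    placement = {}
    for i, tid in enumerate(tile_ids):
        s, d = pairs[i % len(pairs)]
        placement[tid] = Tile(tid, s, d)
    return placement

def tiles_on_path(path, placement):
    """Group the tiles on the user's path of travel by owning server,
    so each server receives one batched request for its tiles."""
    by_server = {}
    for tid in path:
        t = placement[tid]
        by_server.setdefault(t.server, []).append(tid)
    return by_server

placement = place_tiles(range(100))
# Tile numbers taken from the diagram on the slide above.
requests = tiles_on_path([43, 35, 34, 73, 45, 33, 71], placement)
```

Batching by server matches the diagram's flow: each server retrieves only the tiles on its own disks and transmits them to the user in parallel.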

  11. Today’s Demonstration and Briefing
  • 3-D, interactive visualization of large volumes of remote data
  • Platform scalability
  • On-demand data preparation and 3-D flyover
  • Fusion of multiple types of data from multiple sources
  • DPSS fault tolerance

  12. Today’s Configuration
  [Network diagram: four DPSS servers and the sites KU, LBNL, EDC, DARPA, ISI-E, and NASA Goddard interconnected by ESnet, Abilene, GPN, NREN, and ATDnet]

  13. Dataset Characteristics

  14. Data Flow (1)
  [Network diagram: data flow across the four DPSS servers and the sites KU, LBNL, EDC, DARPA, ISI-E, and NASA Goddard, over ESnet, Abilene, GPN, NREN, and ATDnet]

  15. Dataset Characteristics

  16. Platform Scalability
  • SGI Octane (~$50k)
    • OC-3, IRIX, DPSS
    • 60 frame/s, ~80 Mb/s
  • Dual-processor PC with graphics accelerator (~$8k)
    • Fast Ethernet, Windows NT, HTTP
    • 30 frame/s, ~0.5 Mb/s (Internet-limited)
  • Dual-processor PC (~$4k)
    • OC-3, Linux, DPSS
    • 10 frame/s, ~10 Mb/s
  • Laptop (~$3k)
    • Fast Ethernet, Windows 98, HTTP
    • ~4 frame/s, ~0.5 Mb/s (Internet-limited)
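A quick back-of-envelope check on the scalability numbers in the slide above: dividing each platform's network throughput by its frame rate gives the data budget per rendered frame. The figures come straight from the slide; the calculation itself is just an illustration.

```python
# Per-frame network budget = throughput / frame rate (numbers from the slide above).
platforms = {
    "SGI Octane":        (60, 80.0),   # frames/s, Mb/s
    "PC + graphics":     (30, 0.5),
    "Dual-processor PC": (10, 10.0),
    "Laptop":            (4, 0.5),
}

for name, (fps, mbps) in platforms.items():
    print(f"{name}: ~{mbps / fps:.2f} Mb of new data per frame")
```

The spread (roughly 1.3 Mb/frame on the Octane versus ~0.02 Mb/frame on the Internet-limited platforms) shows why the same application can scale from a high-end workstation down to a laptop.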

  17. TerraVision II
  • Composite datasets: data from multiple sources, at different resolutions
  • VRML overlays: wind vectors, buildings, roads; viewable with a standard browser
  • Wind-vector dataset hosted at NAVO
  • Co-registered maps for navigation
  • Pre-defined flight paths for operations support and testing

  18. DPSS Cache Architecture

  19. DPSS Costs (Jan. ‘99)

  20. Monitoring and Measurement
  • Real-time traffic monitor
  • DPSS system monitor
  • NetLogger
    • Logs events at critical points of distributed applications
    • Provides a complete view of system operation
    • Visualization tools allow for real-time monitoring
  • Agent-based management of measurement tools
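The NetLogger idea described above, timestamped events emitted at critical points of a distributed application so that per-host logs can be merged into one end-to-end timeline, can be sketched as follows. This is a minimal illustration of the concept only, not the actual NetLogger API; all names here are made up.

```python
# Minimal sketch of critical-point event logging (illustrative, not the NetLogger API).
import time

def log_event(log, event, **fields):
    """Record one named event with a wall-clock timestamp and arbitrary fields."""
    entry = {"ts": time.time(), "event": event, **fields}
    log.append(entry)
    return entry

log = []
log_event(log, "tile.request.sent", tile_id=43, server="S1")
log_event(log, "tile.request.recv", tile_id=43, server="S1")

# Merging per-host logs by timestamp yields the complete view of system
# operation that the slide describes.
timeline = sorted(log, key=lambda e: e["ts"])
```

In a real deployment the hosts' clocks must be synchronized (e.g., via NTP) for the merged timeline to be meaningful.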

  21. NetLogger Instrumentation

  22. DPSS Capabilities
  • Load balancing
    • Dataset must be replicated
    • Tile requests sent to the server that will provide the best performance, based on current server load and network conditions
  • Automatic TCP tuning
    • Window size determined by throughput and delay information
  • Fault tolerance
    • Automatic detection of a failed server, switchover to alternate servers, and reconnection after recovery
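The "automatic TCP tuning" bullet above sets the window size from throughput and delay information; the standard rule for keeping a long, fast path fully utilized is the bandwidth-delay product. A sketch of that calculation (the example values are illustrative, not MAGIC measurements):

```python
# TCP window sizing from the bandwidth-delay product:
# window (bytes) = throughput (bits/s) * round-trip time (s) / 8

def tcp_window_bytes(throughput_mbps, rtt_ms):
    """Window needed to keep the pipe full between acknowledgments."""
    bits_in_flight = throughput_mbps * 1e6 * (rtt_ms / 1e3)
    return int(bits_in_flight / 8)

# e.g., a ~100 Mb/s path with a 40 ms round-trip time needs a 500 kB window.
window = tcp_window_bytes(100, 40)
```

A window smaller than this leaves the sender idle waiting for acknowledgments, which is why tuning it per-connection matters on high-speed wide-area paths like the ones in this testbed.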

  23. Data Flow (2)
  [Network diagram: data flow across the four DPSS servers and the sites KU, LBNL, EDC, DARPA, ISI-E, and NASA Goddard, over ESnet, Abilene, GPN, NREN, and ATDnet]

  24. Technology Transfer and “Spinoffs”
  • DPSS
    • Earth Systems Grid (climate modeling)
    • Kaiser Permanente project
    • Combustion Corridor (diesel modeling)
    • Matisse (MEMS/Supernet)
  • NetLogger
    • Being used by many network and grid projects

  25. Technology Transfer and “Spinoffs” (concluded)
  • USGS archive
    • Web-based access to digital earth-science data
    • Gateway to the Earth: Ohio Pilot (automatic dissemination of Landsat imagery)
  • TerraVision
    • BADD (dissemination of VRML terrain datasets to the warfighter)
    • Digital Earth (fast indexing of geo-data, multimodal capabilities, and education applications)
    • Netscape plug-in
