
Ocean Sciences Cyberinfrastructure Futures


Presentation Transcript


1. Ocean Sciences Cyberinfrastructure Futures. Report to the ORION Ocean Observatory Workshop, San Juan, Puerto Rico, January 7, 2004. Dr. Larry Smarr, Director, California Institute for Telecommunications and Information Technologies; Harry E. Gruber Professor, Dept. of Computer Science and Engineering, Jacobs School of Engineering, UCSD

2. The Ocean.US DMAC Vision: A Strong Foundation
• Interoperability
• Open, easy access and discovery
• Reliable, sustained, efficient operations
• Effective user feedback
• Open design and standards process
• Preservation of data and products
www.dmac.ocean.us — Source: John Orcutt, SIO

3. Components of CI-Enabled Science & Engineering. NSF Report on Revolutionizing Science and Engineering through Cyberinfrastructure (Atkins Report), www.communitytechnology.org/nsf_ci_report/
The report's diagram links humans, through individual and group interfaces, visualization, collaboration services, and global connectivity, to the physical world and to four classes of resources:
• High-performance computing for modeling, simulation, data processing/mining
• Instruments for observation and characterization
• Facilities for activation, manipulation and construction
• Knowledge management institutions for collection building and curation of data, information, literature, digital objects

4. e-Science: Data-Intensive Science Will Drive Distributed Cyberinfrastructure

5. NASA Earth System Science IT Challenges
• EOSDIS Currently Ingests Nearly 3 Terabytes of Data Each Day; in 2003 It Delivered Over 25 Million Data Products in Response to Over 2.3 Million User Requests, Making It the Largest "e-Science" System in the World
• Earth System Modeling Is a Driving Requirement for High-End Computing, and Will Continue to Be So as Models Increase in Resolution and Are Further Coupled (e.g., Atmosphere-Ocean-Land Processes)
• Other Agencies Are Learning from EOSDIS and Are Moving Beyond. As NASA Lays Out the Evolution of Its Information Infrastructure to Meet Its Earth Science Challenges Over the Next Decade, It Will Again Need to Move to the Leading Edge.
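Those figures imply a substantial sustained data rate. A quick back-of-the-envelope check (simple arithmetic on the numbers quoted in the slide above; nothing here is an additional EOSDIS figure):

```python
# Back-of-the-envelope rates implied by the EOSDIS figures on this slide.
TB = 1e12  # bytes (decimal terabyte)

ingest_bytes_per_day = 3 * TB
seconds_per_day = 24 * 60 * 60

# Sustained ingest bandwidth, in megabits per second.
ingest_mbps = ingest_bytes_per_day * 8 / seconds_per_day / 1e6
print(f"Sustained ingest: {ingest_mbps:.0f} Mbps")  # ~278 Mbps, around the clock

# Average delivery load in 2003: 25 million products for 2.3 million requests.
products_per_request = 25e6 / 2.3e6
print(f"Products per request: {products_per_request:.1f}")  # ~10.9
```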

  6. Components of a Future Global System for Earth Observation

  7. NSF is Funding Research on Wireless Cyberinfrastructure for Ocean Observatories http://roadnet.ucsd.edu/ www.cosis.net/abstracts/EAE03/07668/EAE03-J-07668.pdf

8. The Biomedical Informatics Research Network: A Multi-Scale Brain Imaging Federated Repository. Average File Transfer ~10-50 Mbps. UCSD Is the IT and Telecom Integration Center, Part of the UCSD CRBS (Center for Research on Biological Structure). National Partnership for Advanced Computational Infrastructure
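At the 10-50 Mbps average transfer rates quoted above, moving large imaging volumes between federated sites is slow, which is part of the motivation for the dedicated-lightpath work later in this deck. A quick illustration (the 10 GB volume size is an assumed example, not a BIRN figure):

```python
# Time to move an imaging volume at the transfer rates quoted on this slide.
# The 10 GB volume size is an illustrative assumption, not a BIRN number.
def transfer_minutes(size_gb: float, rate_mbps: float) -> float:
    """Minutes to move size_gb gigabytes at rate_mbps megabits per second."""
    bits = size_gb * 1e9 * 8
    return bits / (rate_mbps * 1e6) / 60

for rate in (10, 50):
    print(f"10 GB at {rate} Mbps: {transfer_minutes(10, rate):.0f} min")
# 10 GB at 10 Mbps: 133 min; at 50 Mbps: 27 min
```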

9. Large Hadron Collider Cyberinfrastructure. Communications of the ACM, Volume 46, Issue 11 (November 2003)

10. From Shared Internet to Dedicated Lightpipes: Enabling the "I" in ORION. www.skio.peachnet.edu/coop/materials/cora_lowres.pdf
• "National Lambda Rail" Partnership Serves Very High-End Experimental and Research Applications
• 4 x 10 Gbps Wavelengths Initially; Capable of 40 x 10 Gbps Wavelengths at Build-Out
Source: Tom West, CEO NLR

11. An International-Scale Set of Dedicated Wavelengths Is Operational over TransLight (NorthernLight, UKLight, CERN)
• European lambdas to US: 8 GEs Amsterdam-Chicago; 8 GEs London-Chicago
• Canadian lambdas to US: 8 GEs Chicago-Canada-NYC; 8 GEs Chicago-Canada-Seattle
• US lambdas to Europe: 4 GEs Chicago-Amsterdam; 3 GEs Chicago-CERN
• European lambdas: 8 GEs Amsterdam-CERN; 2 GEs Prague-Amsterdam; 2 GEs Stockholm-Amsterdam; 8 GEs London-Amsterdam
• TransPAC lambda: 1 GE Chicago-Tokyo
• IEEAF lambdas: 8 GEs NYC-Amsterdam; 8 GEs Seattle-Tokyo
Source: Tom DeFanti, EVL, UIC
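Taking each GE as 1 Gbps, the dedicated capacity listed above can be tallied per group; a minimal sketch:

```python
# Tally of the dedicated gigabit Ethernet (GE) lambdas listed on this slide,
# grouped as the slide groups them; each GE counted as 1 Gbps.
lambdas = {
    "European lambdas to US": 8 + 8,    # Amsterdam-Chicago, London-Chicago
    "Canadian lambdas to US": 8 + 8,    # Chicago-Canada-NYC, Chicago-Canada-Seattle
    "US lambdas to Europe": 4 + 3,      # Chicago-Amsterdam, Chicago-CERN
    "European lambdas": 8 + 2 + 2 + 8,  # Amsterdam-CERN, Prague, Stockholm, London
    "TransPAC lambda": 1,               # Chicago-Tokyo
    "IEEAF lambdas": 8 + 8,             # NYC-Amsterdam, Seattle-Tokyo
}
for group, ges in lambdas.items():
    print(f"{group}: {ges} Gbps")
print(f"Total dedicated capacity: {sum(lambdas.values())} Gbps")  # 76 Gbps
```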

12. The OptIPuter Project: Removing Bandwidth as an Obstacle in Data-Intensive Sciences
• NSF Large Information Technology Research Proposal
• UCSD and UIC Lead Campuses (Larry Smarr, PI)
• USC, UCI, SDSU, NW, Texas A&M Partnering Campuses
• Industrial Partners: IBM, Telcordia/SAIC, Chiaro, Calient
• $13.5 Million Over Five Years
• Optical IP Streams From Lab Clusters to Large Data Objects
Drivers: NIH Biomedical Informatics Research Network; NSF EarthScope
http://ncmir.ucsd.edu/gallery.html
siovizcenter.ucsd.edu/library/gallery/shoot1/index.shtml

13. Removing Barriers to Earth Observing & Simulation
• One Current Barrier: The Low Throughput of Today's Internet. Even Though the Internet2 Backbone Is 10 Gigabits per Second, the Network Is Shared Using the TCP/IP Protocol
• A Remote NASA Earth Observation System User Typically Sees Only 10-50 Mbps Throughput to Campuses Over Abilene From Goddard, Langley, or EROS (May 2003)
• UCSD's SIO to Goddard in May 2003 (ICESAT, CERES Satellite Data): 12.4 Mbps, About 1/1000 of the Available Backbone Speed!
• In Contrast, the OptIPuter Demonstrated 9.3 Gbps Over a 10 Gbps Path From NCSA to SDSC Using Reliable Blast UDP: http://www.evl.uic.edu/cavern/rg/20030817_he
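The "Reliable Blast UDP" result above comes from replacing TCP's congestion-controlled stream with a blast of UDP datagrams, followed by TCP-coordinated retransmission of whatever was lost. A minimal sender-side sketch of that idea, assuming a paired receiver that replies over the TCP channel with a JSON list of missing sequence numbers (the framing, CHUNK size, and round marker are illustrative assumptions, not the actual RBUDP wire format; see the EVL link above):

```python
# Toy sketch of the Reliable Blast UDP idea: blast the payload as numbered
# UDP datagrams, then use a TCP side-channel to learn which datagrams were
# lost and re-blast only those. Socket setup and framing are illustrative.
import json
import socket
import struct

CHUNK = 1400  # payload bytes per datagram, under a typical Ethernet MTU

def blast(data: bytes, udp: socket.socket, tcp: socket.socket, dest) -> None:
    """Send `data` to `dest` RBUDP-style: UDP blast plus TCP loss reports."""
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    reports = tcp.makefile("r")                   # receiver's loss reports
    missing = list(range(len(chunks)))            # first round: everything
    while missing:
        for seq in missing:                       # blast, no per-packet ACKs
            udp.sendto(struct.pack("!I", seq) + chunks[seq], dest)
        tcp.sendall(b"ROUND_DONE\n")              # end-of-round marker
        missing = json.loads(reports.readline())  # seqs still missing
```

On a near-lossless dedicated path, almost everything arrives in the first round, which is how a blast protocol can approach line rate where a shared TCP flow cannot.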

14. Prototyping a Campus-Scale OptIPuter Linking Scalable Linux Clusters: The UCSD OptIPuter Deployment
• Chiaro Estara: 6.4 Tbps Backplane Bandwidth, 20x the Juniper T320's 0.320 Tbps
• Dedicated Fiber Between Sites; ½ Mile to CENIC; 2 Miles (0.01 ms) Across Campus
• Sites: SDSC, SDSC Annex, Preuss High School, JSOE Engineering, CRCA, SOM Medicine, 6th College, Phys. Sci-Keck, Collocation Node, Earth Sciences (SIO)
Source: Phil Papadopoulos, SDSC; Greg Hidley, Cal-(IT)2
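Why does the 0.01 ms campus latency matter? The bandwidth-delay product sets how much data must be in flight to keep a 10 Gbps path full, and hence how large a sender's window or blast buffer must be. A quick comparison (the 60 ms cross-country round-trip time is an illustrative assumption, not a figure from the slide):

```python
# Bandwidth-delay product: bytes that must be in flight to fill a link.
def bdp_bytes(gbps: float, rtt_ms: float) -> float:
    """Bytes in flight for a link of `gbps` Gbps and `rtt_ms` ms round trip."""
    return gbps * 1e9 / 8 * (rtt_ms / 1e3)

print(f"Campus (10 Gbps, 0.01 ms): {bdp_bytes(10, 0.01):,.0f} bytes")  # 12,500
print(f"Cross-country (10 Gbps, 60 ms): {bdp_bytes(10, 60):,.0f} bytes")  # 75,000,000
```

A 12.5 KB buffer fills the campus path, while a cross-country path at the same speed needs tens of megabytes in flight, one reason stock TCP struggles over long fat pipes.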

15. Ultra-Resolution Displays Driven by Graphics Clusters for Ocean Sciences Imaging. Emmi Ito, U. Minnesota; Frank Rack, Joint Oceanographic Institutions; Jason Leigh, EVL UIC

16. How Can We Make Scientific Discovery as Engaging as Video Games?
• GeoWall Linked by Fiber Optics to SIO
• 6-Week Earth Sciences Unit Aligned to State Standards
• Interactive 3D Applications: Underground, Earth Sciences, Geography
Source: Mike Bailey, Rozeanne Steckler, SDSC

17. Further Reading: "The Use of e-Science Grids to Support NSF's Ocean Research Interactive Observatory Networks (ORION)," by Larry Smarr. Available on the ORION Website under Information Papers (IOOP).
