Global Terabit Research Network: Building Global Cyber Infrastructure

Presentation Transcript


  1. Global Terabit Research Network: Building Global Cyber Infrastructure
     Michael A. McRobbie
     Vice President for Information Technology & CIO, Indiana University
     Internet2 Members Meeting, May 2002

  2. Digital Science
     • Science is becoming almost totally digital
     • Data is being generated, collected, processed, analyzed, visualized and stored in digital form
     • Simulations and modeling are being carried out completely digitally
     • Historical and contemporary archives of science are being converted into digital form
     • All this is e-Science

  3. Global e-Science
     • Network-enabled global collaborative research communities (grids) are rapidly forming – each can number in the 1000s
     • These communities are based around a few expensive – often unique – instruments or distributed complexes of sensors that produce terabytes & petabytes of data (high energy physics, astronomy, earth sciences, …)
     • They carry out research on these huge amounts of data using network-connected computation, storage and visualization facilities distributed worldwide
     • All of this is global cyberinfrastructure
     • The digital data of e-Science can be shared with collaborators not just on campus, but across cities, within states, nationally and ultimately internationally
     • e-Science is becoming completely international – it knows no boundaries

  4. Global Cyberinfrastructure Components
     • Huge hierarchical data storage facilities located worldwide
     • Powerful supercomputer arrays located worldwide to analyze data
     • Software to make use of the above to extract information from data
     • Support and management structure for hardware, software and applications
     • Global high-speed research & education (R&E) networks are the critical glue that connects these facilities together and allows the terabytes & petabytes of data to be distributed worldwide
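
     The claim that high-speed R&E networks are the critical glue becomes concrete with a rough transfer-time calculation. The sketch below is not from the slides: the link rates are nominal SONET line rates for OC-48/OC-192-class circuits, and the 80% usable-throughput factor is an illustrative assumption.

     ```python
     # Rough transfer-time estimates for e-Science data sets over R&E backbone links.
     # Illustrative sketch only: the 80% usable-throughput factor is an assumption,
     # and the link rates are nominal line rates, not measured throughput.

     def transfer_time_hours(data_bytes, link_bps, efficiency=0.8):
         """Hours to move data_bytes over a link of link_bps at the given efficiency."""
         return data_bytes * 8 / (link_bps * efficiency) / 3600

     TB, PB = 1e12, 1e15
     links = {
         "OC-48  (~2.5 Gbps)": 2488.32e6,
         "OC-192 (~10 Gbps) ": 9953.28e6,
     }

     for name, bps in links.items():
         print(f"{name}: 1 TB in {transfer_time_hours(TB, bps):5.1f} h, "
               f"1 PB in {transfer_time_hours(PB, bps) / 24:5.1f} days")
     ```

     Even at OC-192 speeds, moving a petabyte takes on the order of weeks, which is why backbone speed matters so much for data-intensive e-Science.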

  5. The Need for a Global Research and Education Network
     • A global R&E network is required to support the true global cyberinfrastructure that will underpin global e-Science
     • However, international connections are very slow compared with regional & national R&E network backbone speeds
     • The global connection effort is not well coordinated – it is dominated by bilateral thinking

  6. Regional, National and Multi-National R&E Networks
     • I-Light as an example of an operational regional network
     • An optical fiber infrastructure owned by Indiana University and Purdue University
     • Connects IU Bloomington, IUPUI and Purdue University West Lafayette
     • Two conduits installed on each route – one with fiber and one empty for future use – 100% in public right of way
     • ~700,000 feet of fiber, 99.9% underground
     • A long-term investment by the State in research infrastructure
     • Provides enough networking capacity for the next 10–20 years between the three main research campuses
     • We believe Indiana is the first state to deploy such a high-performance R&E network
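
     For readers more used to miles or kilometres, a quick conversion of the ~700,000 feet of fiber quoted above (a simple unit conversion, nothing slide-specific):

     ```python
     # Convert the I-Light fiber figure quoted on the slide (~700,000 feet).
     feet = 700_000
     miles = feet / 5280          # 5,280 feet per mile
     km = feet * 0.3048 / 1000    # 0.3048 metres per foot
     print(f"{feet:,} feet ≈ {miles:.0f} miles ≈ {km:.0f} km of fiber")
     # -> 700,000 feet ≈ 133 miles ≈ 213 km
     ```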

  7. Regional, National and Multi-National R&E Networks (cont'd)
     • Several successful national and multi-national advanced high-speed R&E networks have been operational for some time:
     • Internet2 Abilene and US federal nets (e.g. ESnet) in the US; CA*net 3 (Canada); GEANT (Europe)
     • These have OC48 backbones today
     • Moving to OC192 as the next evolution
     • Institutions connect to the backbone at OC12 or greater (a few connections at OC48)
     • Native high-speed IPv4
     • Motivated by the need for quality of service (QoS)
     • Support for IPv6 (but at much lower performance due to router constraints)
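
     The OC speeds cited throughout the deck follow the standard SONET hierarchy, where OC-n carries n x 51.84 Mbps. A minimal sketch of the line rates mentioned here (backbone and connection speeds; framing overhead ignored):

     ```python
     # SONET OC-n line rates: OC-n = n x 51.84 Mbps (framing overhead not subtracted).
     OC1_MBPS = 51.84

     for n in (12, 48, 192, 768):
         rate_mbps = n * OC1_MBPS
         print(f"OC-{n:<3}: {rate_mbps:8.2f} Mbps  (~{rate_mbps / 1000:5.2f} Gbps)")
     ```

     So OC-48 is roughly 2.5 Gbps, OC-192 roughly 10 Gbps, and OC-768 roughly 40 Gbps.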

  8. US/Europe Connectivity

  9. US/Asia-Pacific Connectivity

  10. Americas Connectivity

  11. Requirements for a Global R&E Network
     • Provides a single global backbone inter-connecting global network access points (GNAPs) that provide peering within a country or region
     • Provides global backbone speeds comparable to those of the R&E networks, i.e. OC192 in 2002
     • Allows coordinated global advanced service deployment (e.g. QoS, IPv6, multicast)
     • Is based on stable carrier infrastructure, or on leased or owned fiber or wavelengths
     • Is persistent, based on long-term agreements with carriers, router vendors and optical transmission equipment vendors
     • Is scalable – e.g. OC768 by 2004, multiple wavelengths running striped OC768 by 2005, terabit/sec transmission by 2006
     • Allows GNAPs to connect at OC48 and above, and to scale up as backbone speeds scale up
     • Provides a production service with 24x7x365 management through a global NOC
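
     The scalability bullet implies a specific amount of striping: getting from OC-768 wavelengths (~39.8 Gbps line rate each) to terabit/sec transmission takes roughly 26 striped wavelengths. A sketch of that arithmetic, noting that the slide names no wavelength count and that the 1 Tbps target and use of raw line rate are assumptions:

     ```python
     # How many striped OC-768 wavelengths reach 1 Tbit/s?
     # Sketch only: uses the raw OC-768 line rate and a 1 Tbps target, neither of
     # which the slide states explicitly.
     import math

     oc768_gbps = 768 * 51.84 / 1000      # ~39.81 Gbps per wavelength
     target_gbps = 1000.0                 # 1 Tbit/s
     wavelengths = math.ceil(target_gbps / oc768_gbps)

     print(f"OC-768 line rate      : {oc768_gbps:.2f} Gbps")
     print(f"Wavelengths for 1 Tbps: {wavelengths}")   # -> 26
     ```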

  12. Global Terabit Research Network (GTRN)
     • Announced 18 February 2002 as a production service
     • A partnership to establish a true worldwide next-generation Internet to interconnect national and multinational high-speed R&E networks as a critical part of global cyberinfrastructure
     • Involves the NREN Consortium/DANTE in Europe, and Internet2, Indiana University, CANARIE, STAR TAP/StarLight & Pacific Wave in the United States & Canada
     • Currently connects the major R&E networks in Europe and North America
     • Regionally based (initially Europe & North America; soon Asia Pacific, …)

  13. The Global Terabit Research Network
     • Initially 2 x OC-48 unprotected POS Trans-Atlantic circuits
     • Trans-North American capacity to Seattle (Pacific Wave) via Chicago (StarLight) provided by tunneling over Abilene
     • Run as a single AS (AS21230)
     • Second set of Trans-Atlantic OC-48s planned
     • A set of 2 x OC12s to connect the Asia Pacific to the GTRN
     • Governed and managed internationally
     • NOC services across the globe
     • www.gtren.net
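
     Summing the circuits listed on this slide gives a feel for the initial backbone capacity. The sketch below counts the planned second Trans-Atlantic pair and the Asia-Pacific OC-12s as if they were in service and uses nominal line rates, so it is an upper bound rather than what was lit at announcement time.

     ```python
     # Aggregate capacity of the circuits listed on this slide (nominal line rates).
     # The "planned" entries were not yet in service, so this is an upper bound.
     OC_UNIT_MBPS = 51.84

     circuits = [
         ("Trans-Atlantic, 2 x OC-48 (initial)", 2, 48),
         ("Trans-Atlantic, 2 x OC-48 (planned)", 2, 48),
         ("Asia-Pacific,   2 x OC-12 (planned)", 2, 12),
     ]

     total_gbps = 0.0
     for name, count, oc_level in circuits:
         gbps = count * oc_level * OC_UNIT_MBPS / 1000
         total_gbps += gbps
         print(f"{name:38s} {gbps:6.2f} Gbps")

     print(f"{'Total (nominal)':38s} {total_gbps:6.2f} Gbps")
     ```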

  14. Global Terabit Research Network (cont'd)
     • Additions underway:
     • GTRN AS at STAR TAP/StarLight
     • GTRN AS at the Pacific Northwest GigaPop (PNG)
     • Tunneled capacity across Abilene to connect these points
     • Resulting GTRN topology: Europe and North America; Asia Pacific expected soon
     • Participation in the New York layer-two exchange point (Manhattan Landing)

  15. A Global Partnership
     • Initial Planning Group
       • Fernando Liello (European NREN Consortium)
       • Dai Davies (DANTE)
       • Michael A. McRobbie (Indiana University)
       • Steven Wallace (Indiana University)
       • Doug van Houweling (Internet2)
       • Heather Boyles (Internet2)
     • Participating and Supporting Individuals (Organizations)
       • Bill St. Arnaud (CANARIE/CA*net 3)
       • Tom DeFanti (STAR TAP/StarLight)
       • Ron Johnson (Pacific Wave)

  16. Future GTRN Expansion & Activities
     • Further deployment of GNAPs (e.g. in the Asia Pacific)
     • Extension to Latin America via AMPATH
     • More formal global NOC services (e.g. a GTRN weather map, seamless trouble reporting, etc.)
     • Formal GTRN inauguration in Brussels on May 21

  17. Global Terabit Research Network: Building Global Cyber Infrastructure
     Michael A. McRobbie
     Vice President for Information Technology & CIO, Indiana University
     Internet2 Members Meeting, May 2002
