
International Networking and Cyberinfrastructure



Presentation Transcript


  1. International Networking and Cyberinfrastructure • CISE Reorganization Status • International Networking Solicitation • Cyberinfrastructure. Douglas Gatchell <dgatchell@nsf.gov>, Program Director, International Networking, National Science Foundation, USA

  2. International Networking • New solicitation • Priorities • Enable and enhance communications, collaboration, and cooperation by connecting aggregation points • Enhance connectivity to new regions • Support connections between shared cyberinfrastructure • Cooperate to support domain-specific CI

  3. NSF-Supported International Links • TransPAC • Tokyo-LA, Tokyo-Chicago • Euro-Link • Chicago – Amsterdam/CERN • NaukaNet • Chicago – Moscow & China • AMPATH • Miami – Rio & others

  4. TransPAC

  5. TransLight Lambdas • European lambdas to US: 8 GigEs Amsterdam-Chicago; 8 GigEs London-Chicago • Canadian lambdas to US: 8 GigEs Chicago-Canada-NYC; 8 GigEs Chicago-Canada-Seattle • US lambdas to Europe: 4 GigEs Chicago-Amsterdam; 3 GigEs Chicago-CERN • European lambdas: 8 GigEs Amsterdam-CERN; 2 GigEs Prague-Amsterdam; 2 GigEs Stockholm-Amsterdam; 8 GigEs London-Amsterdam • TransPAC lambda (yellow): 1 GigE Chicago-Tokyo • IEEAF lambdas (blue): 8 GigEs NYC-Amsterdam; 8 GigEs Seattle-Tokyo
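The link inventory above lends itself to a quick tabulation. The sketch below is my own illustration, not NSF software: it encodes the slide's routes as plain data and sums the provisioned GigE capacity per group, assuming each GigE lambda carries roughly 1 Gbps.

```python
# Illustrative sketch only: the TransLight-era links listed on the slide, expressed
# as (group, route, number of GigE lambdas). The tabulation and the "1 GigE ~= 1 Gbps"
# assumption are mine, not part of the original presentation.
from collections import defaultdict

LINKS = [
    ("European lambdas to US", "Amsterdam-Chicago", 8),
    ("European lambdas to US", "London-Chicago", 8),
    ("Canadian lambdas to US", "Chicago-Canada-NYC", 8),
    ("Canadian lambdas to US", "Chicago-Canada-Seattle", 8),
    ("US lambdas to Europe", "Chicago-Amsterdam", 4),
    ("US lambdas to Europe", "Chicago-CERN", 3),
    ("European lambdas", "Amsterdam-CERN", 8),
    ("European lambdas", "Prague-Amsterdam", 2),
    ("European lambdas", "Stockholm-Amsterdam", 2),
    ("European lambdas", "London-Amsterdam", 8),
    ("TransPAC lambda", "Chicago-Tokyo", 1),
    ("IEEAF lambdas", "NYC-Amsterdam", 8),
    ("IEEAF lambdas", "Seattle-Tokyo", 8),
]

def capacity_by_group(links):
    """Sum provisioned GigE capacity (in Gbps) for each group of lambdas."""
    totals = defaultdict(int)
    for group, _route, gige_count in links:
        totals[group] += gige_count  # each GigE lambda counted as ~1 Gbps
    return dict(totals)

if __name__ == "__main__":
    for group, gbps in sorted(capacity_by_group(LINKS).items()):
        print(f"{group}: {gbps} Gbps provisioned")
```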

  6. AMPATH

  7. GLORIAD

  8. Cyberinfrastructure Characteristics • Community-Focused: virtual organizations; distributed; collaborative • Scale and Scope: multidisciplinary; international; supporting data- and compute-intensive applications; high-end to desktop; heterogeneous • Common Technology & Policy Platform(s): interoperability; supports the characteristics above

  9. Evolution of the Computational Infrastructure (timeline figure, 1985–2010): prior computing investments and NSF networking; Supercomputer Centers (SDSC, NCSA, PSC, CTC); PACI (NPACI and Alliance); Terascale (TCS, DTF, ETF); Cyberinfrastructure

  10. Integrated CI System meeting the needs of a community of communities (layered figure) • Applications: Environmental Science, High Energy Physics, Proteomics/Genomics, … • Domain-specific cybertools (software) • Shared cybertools (software) • Distributed resources (computation, communication, storage, etc.) • Also shown in the figure: Development Tools & Libraries, Grid Services & Middleware, Education and Training, Discovery & Innovation, Hardware
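As a minimal illustration of that layered view (mine, not NSF software), the stack can be written down as plain data and walked top-down for a given application community. Where the cross-cutting items such as grid services & middleware and development tools & libraries sit in the stack is assumed for this sketch.

```python
# Illustrative sketch of the slide's layered CI architecture. Layer names and
# examples come from the slide; the placement of cross-cutting items is assumed.
CI_LAYERS = [
    ("Applications",
     ["Environmental Science", "High Energy Physics", "Proteomics/Genomics"]),
    ("Domain-specific cybertools (software)", []),
    ("Shared cybertools (software)",
     ["grid services & middleware", "development tools & libraries"]),
    ("Distributed resources (hardware)",
     ["computation", "communication", "storage"]),
]

def describe_stack(application: str) -> None:
    """Print the layers a given application community relies on, top-down."""
    print(f"{application} depends on:")
    for layer_name, examples in CI_LAYERS[1:]:  # everything below Applications
        detail = f" ({', '.join(examples)})" if examples else ""
        print(f"  - {layer_name}{detail}")

if __name__ == "__main__":
    describe_stack("High Energy Physics")
```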

  11. Cyberinfrastructure consists of … • Computational engines (supercomputers, clusters, workstations, small processors, …) • Mass storage (disk drives, tapes, …) • Networking (including wireless, distributed, ubiquitous) • Digital libraries/databases • Sensors/effectors • Software (operating systems, middleware, domain-specific tools/platforms for building applications) • Services (education, training, consulting, user assistance) All working together in an integrated fashion.

  12. In Ten Years, CI will be… • rich in resources, comprehensive in functionality, and ubiquitous; • easily usable by all scientists and engineers, from students to emeriti; • accessible anywhere, anytime needed by authenticated users; • interoperable, extendable, flexible, tailorable, and robust; • funded by multiple agencies, states, campuses, and organizations; • supported and utilized by educational programs at all levels.

  13. Technical Challenges • Computer Science and Engineering broadly • How to build the components? • Networks, processors, storage devices, sensors, software • How to shape the technical architecture? • Pervasive, many cyberinfrastructures, constantly evolving/changing capabilities • How to customize CI to particular S&E domains?

  14. Cyberinfrastructure Early Adopters • Network for Earthquake Engineering Simulation (NEES) • National Ecological Observatory Network (NEON) • Biomedical Informatics Research Network (BIRN) • Extensible Terascale Facility (ETF)

  15. TeraGrid (ETF) Configuration

  16. Extensible Terascale Facility

  17. 100 Mbps to 100 Million Homes • NSF-Funded Research Project (10/03) - $7.5M • Stanford, Berkeley, CMU, Rice, Fraser Research, Internet2 • Scope • Economics • Technologies (Backbone and Access) • Protocols • Requires a redesign of the access, metropolitan, and backbone networks of the Internet • Applications?

  18. Scaling – Homes • Today: 500 Kbps × 10 million homes • Future: 100 Mbps × 100 million homes (×200 in bandwidth, ×10 in homes) • Scale by a factor of 2000! • 1 million homes connected at 100 Mbps == 100 Tbps!! • At the network core, petabits per second are required
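The slide's scaling figures can be sanity-checked with a quick calculation. The constants below come from the slide; the script itself is only an illustration.

```python
# Quick numeric check of the slide's scaling arithmetic (illustrative only).
KBPS, MBPS, TBPS = 1e3, 1e6, 1e12

today_rate, today_homes = 500 * KBPS, 10e6     # 500 Kbps x 10 million homes
future_rate, future_homes = 100 * MBPS, 100e6  # 100 Mbps x 100 million homes

rate_factor = future_rate / today_rate         # x200 in access bandwidth
homes_factor = future_homes / today_homes      # x10 in connected homes
print(f"Overall scale factor: {rate_factor * homes_factor:.0f}")  # 2000

# 1 million homes at 100 Mbps is already 100 Tbps of aggregate demand.
print(f"1M homes at 100 Mbps: {1e6 * future_rate / TBPS:.0f} Tbps")

# All 100 million homes would push the core into petabits per second.
print(f"100M homes at 100 Mbps: {future_homes * future_rate / 1e15:.0f} Pbps")
```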

  19. NSF & Cyberinfrastructure • Douglas Gatchell • International Networking Program Director • NSF: National Science Foundation • CISE: Directorate for Computer and Information Science and Engineering • SCI: Division of Shared Cyberinfrastructure • dgatchell@nsf.gov • www.cise.nsf.gov
