
Symposium on Knowledge Environments for Science: HENP Collaboration & Internet2

Douglas Van Houweling

President & CEO, Internet2/UCAID

November 26, 2002



Overview

  • High Energy Physics Computing Challenges

  • Internet2 Infrastructure Issues

  • Observations



HENP Computing Challenges

  • Geographical dispersion of people and resources

  • Complexity of the detector and the LHC environment

  • Scale: Tens of Petabytes per year of data

  • Major challenges associated with:

    • Communication and collaboration at a distance

    • Managing globally distributed computing & data resources

    • Cooperative software development and physics analysis

  • 5000+ Physicists

  • 250+ Institutes

  • 60+ Countries



Data Grids

  • Data Grids: New Forms of Distributed Systems

  • Four LHC Experiments

    • ATLAS, CMS, ALICE, LHCb

  • Data Stored:

    • ~40+ Petabytes/year

  • CPU:

    • 0.30+ PetaFLOPS/year

  • LHC Experiments producing Exabytes (1 EB = 10^18 Bytes)

    • 0.1 EB in 2007

    • 1.0 EB by 2012
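As a rough check of what the figures above imply for the network (this arithmetic is mine, not from the slide), storing ~40 PB of new data per year already corresponds to roughly 10 Gb/s of sustained throughput just to move the raw data once. A minimal sketch, assuming decimal (SI) prefixes and continuous running:

```python
# Hedged back-of-the-envelope: sustained rate implied by ~40 PB/year of new data.
PB = 1e15  # bytes, decimal prefixes assumed throughout

data_per_year_bytes = 40 * PB
seconds_per_year = 365 * 24 * 3600          # ~3.15e7 s

avg_rate_gbps = data_per_year_bytes / seconds_per_year * 8 / 1e9
print(f"Average sustained rate: {avg_rate_gbps:.1f} Gb/s")
# -> roughly 10 Gb/s, before any replication factor or headroom for bursts
```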


LHC Data Grid Hierarchy

CERN/Outside Resource Ratio ~1:2; Tier0/(Σ Tier1)/(Σ Tier2) ~1:1:1

[Slide diagram of the tiered data-flow model, summarized below]

  • Experiment / Online System: raw data off the detector at ~PByte/sec; ~100-1500 MBytes/sec into Tier 0

  • Tier 0+1 (CERN): 700k SI95, ~1 PB disk, tape robot, HPSS mass storage; ~2.5 Gbps links to Tier 1

  • Tier 1 centers (FNAL: 200k SI95, 600 TB; IN2P3 Center; INFN Center; RAL Center), each with HPSS; 2.5 Gbps links to Tier 2

  • Tier 2 centers; ~2.5 Gbps links to Tier 3

  • Tier 3 institutes (~0.25 TIPS) with physics data caches; physicists work on analysis "channels", with each institute having ~10 physicists working on one or more channels; 0.1–10 Gbps links to Tier 4

  • Tier 4: workstations
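A minimal sketch of how the fan-out in this hierarchy adds up, using the slide's per-link rates. The center counts (five Tier 1 sites, five Tier 2 centers per Tier 1) are hypothetical assumptions for illustration, not figures from the slide:

```python
# Hedged fan-out arithmetic for the tier model above.
# Link rates come from the slide; the site counts are illustrative assumptions.
tier1_centers = 5            # assumed number of Tier 1 sites fed from CERN
tier1_link_gbps = 2.5        # ~2.5 Gbps per Tier 0 -> Tier 1 link
tier2_per_tier1 = 5          # assumed Tier 2 fan-out per Tier 1
tier2_link_gbps = 2.5        # 2.5 Gbps per Tier 1 -> Tier 2 link

cern_egress_gbps = tier1_centers * tier1_link_gbps
tier2_aggregate_gbps = tier1_centers * tier2_per_tier1 * tier2_link_gbps

print(f"Tier 0 egress if all Tier 1 links run flat out: {cern_egress_gbps} Gb/s")
print(f"Aggregate Tier 1 -> Tier 2 capacity: {tier2_aggregate_gbps} Gb/s")
```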



TransAtlantic BW Reqs

Transatlantic Net WG (HN, L. Price): installed BW, with a maximum link occupancy of 50% assumed

See http://gate.hep.anl.gov/lprice/TAN
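The 50% maximum-occupancy assumption means installed capacity must be at least twice the required average throughput. A small sketch of that sizing rule; the demand figures in the example are illustrative, not values from the working group's table (which is not reproduced in this transcript):

```python
# Installed bandwidth sizing under a maximum-occupancy rule.
# The 0.5 occupancy cap is from the slide; the example demands are assumptions.
def installed_bw(required_gbps: float, max_occupancy: float = 0.5) -> float:
    """Return the link capacity needed so average load stays under the cap."""
    return required_gbps / max_occupancy

print(installed_bw(1.25))   # a 1.25 Gb/s average demand needs a 2.5 Gb/s link
print(installed_bw(5.0))    # a 5 Gb/s average demand needs a 10 Gb/s link
```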



Emerging DataGrid Community

  • Grid Physics Network (GriPhyN)

    • ATLAS, CMS, LIGO, SDSS

  • Access Grid; VRVS: supporting group-based collaboration

  • And others presented at this symposium



Current Grid Challenges

  • Stable High Performance Network Platform

  • Standard Core Middleware

  • Secure Workflow Management and Optimization

  • Maintaining a Global View of Resources and System State

  • Workflow: Strategic Balance of Policy Versus Moment-to-moment Capability to Complete Tasks

  • Handling User-Grid Interactions: Guidelines; Agents

  • Building Higher Level Services, and an Integrated, Scalable User Environment for the Above


DataTAG Project

[Slide diagram of the European side of the testbed: SuperJANET4 (UK), SURFnet (NL), Atrium, VTHD, INRIA (Fr), GARR-B (It), interconnected via GEANT]

  • EU-Solicited Project: CERN, PPARC (UK), Amsterdam (NL), and INFN (IT); and US (DOE/NSF: UIC, NWU and Caltech) partners

  • Main Aims:

    • Ensure maximum interoperability between US and EU Grid Projects

    • Transatlantic Testbed for advanced network research

  • 2.5 Gbps Wavelength Triangle from 7/02; to 10 Gbps Triangle by Early 2003

[Slide diagram of the transatlantic wave triangle: GENEVA, STARLIGHT and New York, with 2.5G and 10G legs, connecting to ABILENE, ESNET, CALREN2 and STAR-TAP]



Infrastructure Issues

  • Network performance & stability

    • Abilene -> 10 gig wavelength

    • End-to-end performance

    • National Light Rail

  • Middleware

    • NSF Middleware Initiative

    • Core middleware – Shibboleth, etc.

  • Application requirements

    • Multicast, IPv6
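One concrete reason end-to-end performance is hard on these paths is the bandwidth-delay product: a sender must keep the whole path full of unacknowledged data. A hedged sketch of the arithmetic; the ~100 ms round-trip time is an assumed, typical transatlantic figure, not a number from the slides:

```python
# Bandwidth-delay product: bytes "in flight" needed to fill a long fat pipe.
def bdp_bytes(bandwidth_gbps: float, rtt_ms: float) -> float:
    return bandwidth_gbps * 1e9 / 8 * (rtt_ms / 1e3)

# A 2.5 Gb/s transatlantic path with ~100 ms RTT (assumed) needs a TCP window
# of roughly 31 MB, far above the default OS settings of the era, which is
# why untuned transfers saw only a few Mb/s end to end.
print(f"{bdp_bytes(2.5, 100) / 1e6:.0f} MB window needed")
```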


National Light Rail Footprint

[Slide map: planned NLR fiber routes, with 15808 terminal, regen or OADM sites at SEA, POR, SAC, SVL, FRE, LAX, SDG, PHO, OLG, DAL, DEN, OGD, KAN, CHI, STR, WAL, NAS, ATL, RAL, PIT, CLE, WDC, NYC, BOS]

NLR

  • Buildout Starts in 2003

  • Initially 4 × 10 Gb Wavelengths

  • To 40 × 10 Gb Waves in the Future

NREN backbones reached 2.5-10 Gbps in 2002 in Europe, Japan and the US. US: transition now to an optical, dark-fiber, multi-wavelength R&E network



Some thoughts…

  • Technology is rapidly progressing

  • We can move more bits, faster and over many types of media

  • Many changes in scientific practice are emerging

    • Difference between data collectors and analyzers

    • Synchronization of many instruments

    • Combination of simulation and observation

    • Shifting focus from instruments to datasets

    • And many more…



HENP Working Group

  • High Energy and Nuclear Physics Working Group

  • Formed working group in late 2001

  • Needed additional focus on the network-intensive aspects of their research

  • Currently over 80 individuals participating



HENP- Experiment Example

  • Large Hadron Collider (2006)

  • Largest superconducting installation in the world

  • Generating multiple petabytes of data per year, gigabytes per second

  • One in a trillion events might lead to a major physics discovery
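The per-second and per-year figures above are roughly consistent: at about a gigabyte per second of recorded data and an effective running time of order 10^7 seconds per year (a common rule of thumb, not stated on the slide), the experiments accumulate on the order of 10 PB/year. A quick hedged check:

```python
# Consistency check between the per-second and per-year data rates.
GB = 1e9
PB = 1e15
rate_bytes_per_s = 1 * GB        # "gigabytes per second" (slide's figure)
live_seconds = 1e7               # assumed effective running time per year

print(rate_bytes_per_s * live_seconds / PB, "PB/year")  # -> 10.0 PB/year
```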



HENP- Applications

  • Remote Collaboration, VRVS

  • Distributed Data Storage

  • Distributed Computation and Databases

  • Dynamic Visualizations



NEESGrid

  • Network for Earthquake Engineering Simulation

  • A “Grid” Project

  • Consists of 10 initial sites across the U.S. addressing the needs of structural, geotechnical and tsunami researchers



NEESGrid- Applications

  • Video as Data

  • Collaboration

  • Remote Instrumentation

  • Distributed Data storage

  • Final goal: simultaneous physical and computational experiments



eVLBI (Astronomy)

  • Electronic Very Long Baseline Interferometry

  • Astronomers combine data from multiple antennas to create a single image that is more accurate than any single antenna could create

  • Requires coordination of multiple physical resources as well as advanced network services



eVLBI- Experiment Example

  • Astronomers collect data about a star from many different Earth-based antennas and send the data to a specialized computer for analysis on a 24x7 basis

  • Unlike physics, VLBI is less concerned with data loss than with long-term stability

  • The end goal is to send data at 1 Gb/s from over 20 antennas located around the globe
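That end goal implies a substantial aggregate flow into the correlator. A small hedged calculation using only the slide's numbers (20 antennas at 1 Gb/s, sustained 24x7); the per-day conversion is mine:

```python
# Aggregate data rate into the eVLBI correlator at the stated end goal.
antennas = 20            # "over 20 antennas" (lower bound from the slide)
per_antenna_gbps = 1.0   # 1 Gb/s per antenna, sustained 24x7

aggregate_gbps = antennas * per_antenna_gbps
per_day_tb = aggregate_gbps / 8 * 86400 / 1e3   # Gb/s -> GB/s -> GB/day -> TB/day

print(f"{aggregate_gbps:.0f} Gb/s aggregate, ~{per_day_tb:.0f} TB/day to the correlator")
```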



eVLBI- Applications

  • Advanced network protocol development

  • Cooperation and participation across international networks

  • Remote instrumentation

  • Real time data analysis allows for flexibility and agility in response to transient astronomical events



www.internet2.edu



Building Global Grids

  • Implications for Society

  • Meeting the challenges of Petabyte-to-Exabyte Grids, and Gigabit-to-Terabit Networks, will transform research in science and engineering

  • These developments could create the first truly global virtual organizations (GVO)

  • If these developments are successful, and deployed widely as standards, this could lead to profound advances in industry, commerce and society at large

    • By changing the relationship between people and “persistent” information in their daily lives

    • Within the next five to ten years

  • Realizing the benefits of these developments for society, and creating a sustainable cycle of innovation, compels us

    • TO CLOSE the DIGITAL DIVIDE



Closing the Digital Divide

  • What HENP and the World Community Can Do

  • Spread the message: ICFA SCIC, IEEAF et al. can help

  • Help identify and highlight specific needs (to Work On)

    • Policy problems; Last Mile problems; etc.

  • Encourage Joint programs [DESY’s Silk project; Japanese links to SE Asia and China; AMPATH to So. America]

    • NSF & @LIS Proposals: US and EU to South America

  • Make direct contacts, arrange discussions with gov’t officials

    • ICFA SCIC is prepared to participate where appropriate

  • Help Start, Get Support for Workshops on Networks & Grids

    • Encourage, help form funded programs

  • Help form Regional support & training groups (requires funding)



Technology, Stewardship

  • Access to and development of leading infrastructures and new classes of information-rich systems carries obligations

    • Stewardship

    • Playing a leading role in making these assets usable by a broad sector of the World Community

  • Examples

    • Develop devices and systems for the disabled, with no discrimination against any area of society

    • Develop standardized toolkits and portals for wide access from schools

    • Encourage joint programs and support from industry

    • Strong education and outreach components in all medium and large research proposals (e.g. NSF)



INTRO

Doug,

Slides 3 to 17 are modified from Harvey’s talk. I am not totally familiar with his stuff.

Slide 18 is blank

Slides 19 to 28 are from my standard slide deck. I know these in depth. Can give detailed talking points.

Feel free to call me on my cell phone if you have questions: 734.730.3300

I will be around all day/evening

- Charles

