
A worldwide collaboration

Ian Bird

LHC Computing Grid Project Leader

LHC Grid Fest

3rd October 2008

The LHC Grid Service

Introduction
  • The LHC Grid Service is a worldwide collaboration between:
    • The 4 LHC experiments,
    • ~140 computer centres that contribute resources, and
    • International grid projects that provide software and services
  • The collaboration is brought together by a MoU that:
    • Commits resources for the coming years
    • Agrees defined levels of service availability and reliability
  • As of today, 33 countries have signed the MoU:
    • CERN (Tier 0) + 11 large Tier 1 sites
    • 130 Tier 2 sites in 60 “federations”
      • Other sites are expected to participate, but without formal commitment


The LHC Computing Challenge
  • Signal/Noise: 10⁻⁹
  • Data volume
    • High rate × large number of channels × 4 experiments
    • 15 PetaBytes of new data each year (see the back-of-envelope sketch below)
  • Compute power
    • Event complexity × number of events × thousands of users
    • 100k of (today's) fastest CPUs
    • 45 PB of disk storage
  • Worldwide analysis & funding
    • Computing funding locally in major regions & countries
    • Efficient analysis everywhere
    • → Grid technology
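As a rough illustration of where a figure of this order comes from, the sketch below multiplies an assumed recorded event rate, event size, live time and number of experiments. The specific inputs (200 Hz, 1.6 MB per event, 10^7 live seconds per year) are illustrative assumptions, not numbers from this talk.

```python
# Back-of-envelope estimate of the annual raw-data volume.
# All inputs below are illustrative assumptions, not official LHC figures.

EVENT_RATE_HZ = 200            # assumed recorded events per second, per experiment
EVENT_SIZE_MB = 1.6            # assumed raw event size in megabytes
LIVE_SECONDS_PER_YEAR = 1e7    # assumed LHC live time per year
N_EXPERIMENTS = 4              # ALICE, ATLAS, CMS, LHCb

raw_mb = EVENT_RATE_HZ * EVENT_SIZE_MB * LIVE_SECONDS_PER_YEAR * N_EXPERIMENTS
raw_pb = raw_mb / 1e9          # 1 PB = 10^9 MB

print(f"Estimated raw data per year: {raw_pb:.0f} PB")
# Roughly 13 PB of raw data; derived and simulated data bring the
# total to the ~15 PB/year quoted above.
```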


Tier 0 at CERN: Acquisition, First Pass Processing, Storage & Distribution

[Diagram: data flows from the experiments into the Tier 0 centre at up to 1.25 GB/sec (ions)]


Tier 0 – Tier 1 – Tier 2

Tier-0 (CERN):
  • Data recording
  • Initial data reconstruction
  • Data distribution

Tier-1 (11 centres):
  • Permanent storage
  • Re-processing
  • Analysis

Tier-2 (~130 centres):
  • Simulation
  • End-user analysis
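The tier roles listed above can be summarised in a small data structure. The sketch below is purely illustrative: the names and fields are assumptions made here for clarity, not part of any actual WLCG software.

```python
# Illustrative summary of the WLCG tier roles described above.
# The structure and names are assumptions for illustration only.

TIERS = {
    "Tier-0": {"centres": 1,   "roles": ["data recording", "initial reconstruction",
                                         "data distribution"],              "feeds": "Tier-1"},
    "Tier-1": {"centres": 11,  "roles": ["permanent storage", "re-processing",
                                         "analysis"],                       "feeds": "Tier-2"},
    "Tier-2": {"centres": 130, "roles": ["simulation", "end-user analysis"], "feeds": None},
}

def describe(tier: str) -> str:
    """One-line summary of a tier's role and where its data flows next."""
    info = TIERS[tier]
    line = f"{tier} ({info['centres']} centre(s)): {', '.join(info['roles'])}"
    if info["feeds"]:
        line += f"; data flows on to {info['feeds']}"
    return line

for name in TIERS:
    print(describe(name))
```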


Evolution of Grids

[Timeline diagram: EU DataGrid, LCG 1, LCG 2, EGEE 1, EGEE 2, EGEE 3 and WLCG in Europe; GriPhyN, iVDGL, PPDG, GRID 3 and OSG in the US; with milestones for Data Challenges, Service Challenges, Cosmics and First physics]


Recent grid use

[Chart: shares of recent grid usage – CERN: 11%, Tier 1: 35%, Tier 2: 54%; ~350k/day]

The grid concept really works – all contributions, large and small, add to the overall effort!

Data transfer out of Tier 0
  • Full experiment rate needed is 650 MB/s (the daily volumes this implies are sketched below)
  • The aim is to sustain twice that, so that Tier 1 sites can shut down and recover
  • Rates far in excess of this have been demonstrated
  • All experiments exceeded the required rates for extended periods, and simultaneously
  • All Tier 1s achieved (or exceeded) their target acceptance rates
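To put these rates in everyday terms, the short sketch below converts the quoted 650 MB/s, and the doubled target, into daily volumes. It is plain arithmetic on the numbers above; no other figures are assumed.

```python
# Convert the Tier-0 export rates quoted above into daily data volumes.

SECONDS_PER_DAY = 86_400
MB_PER_TB = 1e6

required_mb_s = 650              # full experiment rate quoted above
target_mb_s = 2 * required_mb_s  # headroom so Tier 1 sites can shut down and catch up

for label, rate in [("required", required_mb_s), ("target (2x)", target_mb_s)]:
    tb_per_day = rate * SECONDS_PER_DAY / MB_PER_TB
    print(f"{label}: {rate} MB/s  ->  about {tb_per_day:.0f} TB/day out of Tier 0")
# required: ~56 TB/day;  target (2x): ~112 TB/day
```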
Production Grids
  • WLCG relies on a production-quality infrastructure
    • This requires standards of:
      • Availability/reliability (see the downtime sketch below)
      • Performance
      • Manageability
    • It will be used 365 days a year ... (and has been for several years!)
    • Tier 1s must store the data for at least the lifetime of the LHC (around 20 years)
      • Not passive – requires active migration to newer media
  • It is vital that we build a fault-tolerant and reliable system
    • One that can cope with individual sites being down, and recover
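As an illustration of what a service availability commitment means in practice, the sketch below converts availability targets into allowed downtime per month. The 95/98/99% values are example figures, not the actual MoU targets.

```python
# Translate an availability target into allowed downtime per month.
# The targets below are example values, not the actual WLCG MoU figures.

HOURS_PER_MONTH = 30 * 24

def allowed_downtime_hours(availability: float) -> float:
    """Hours per month a site may be down while meeting the given target."""
    return (1.0 - availability) * HOURS_PER_MONTH

for target in (0.95, 0.98, 0.99):
    print(f"{target:.0%} availability -> at most "
          f"{allowed_downtime_hours(target):.1f} hours of downtime per month")
# 95% -> 36.0 h, 98% -> 14.4 h, 99% -> 7.2 h
```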


WLCG depends on two major science grid infrastructures ….

EGEE - Enabling Grids for E-Science

OSG - US Open Science Grid

... as well as many national grid projects

Interoperability and interoperation are vital – significant effort has gone into building the procedures to support them

EGEE

Grid infrastructure project co-funded by the European Commission - now in 2nd phase with 91 partners in 32 countries

  • 240 sites in 45 countries
  • 45,000 CPUs
  • 12 PetaBytes
  • > 5,000 users
  • > 100 VOs
  • > 100,000 jobs/day

Application domains include:
  • Archeology
  • Astronomy
  • Astrophysics
  • Civil Protection
  • Comp. Chemistry
  • Earth Sciences
  • Finance
  • Fusion
  • Geophysics
  • High Energy Physics
  • Life Sciences
  • Multimedia
  • Material Sciences

OSG Project:

Supported by the Department of Energy & the National Science Foundation

  • Access to 45,000 Cores, 6 Petabytes Disk, 15 Petabytes Tape
  • > 15,000 CPU Days/Day
    • ~85% Physics: LHC, Tevatron Run II, LIGO
    • ~15% non-physics: biology, climate, text mining, ...
    • Including ~20% opportunistic use of others' resources
  • Virtual Data Toolkit: common software developed jointly by computer science and application groups, used by OSG and others

[Map: OSG sites across the United States, spanning national laboratories and facilities (ANL, BNL, FNAL, LBL, NERSC, ORNL, SDSC) and dozens of universities]

  • Partnering with:
    • US LHC: Tier-1s, Tier-2s, Tier-3s
    • Campus Grids: Clemson, Wisconsin, Fermilab, Purdue
    • Regional & National Grids: TeraGrid, New York State Grid, EGEE, UK NGS
    • International Collaboration: South America, Central America, Taiwan, Korea, UK.

A worldwide collaboration

Has been in production for several years

Is now being used for real data

Is ready to face the computing challenges as the LHC gets up to full speed

The LHC Grid Service