
LHC Computing Grid Project

This document discusses the requirements, opportunities, and goals of the LHC Computing Grid Project, including data storage, data analysis, cost containment, and the evolution of distributed computing to grids. It also outlines the two phases of the project and the areas of work involved.

Presentation Transcript


  1. LHC Computing Grid Project. GridPP Collaboration Meeting, Edinburgh, November 2001. Les Robertson, CERN IT Division, les.robertson@cern.ch

  2. The Requirements

  3. The Large Hadron Collider Project
     4 detectors: ATLAS, CMS, LHCb, ALICE
     Storage: raw recording rate 0.1-1 GBytes/sec, accumulating at 5-8 PetaBytes/year; 10 PetaBytes of disk
     Processing: 200,000 of today's fastest PCs
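
  (To see how the accumulation figure follows from the recording rate, here is a minimal back-of-envelope sketch. The ~1e7 seconds of effective data-taking per year is a common HEP rule of thumb, and the averaged rates are illustrative, not figures from the slides.)

  ```python
  # Back-of-envelope check of the slide's storage figures. Assumptions (not from
  # the slides): ~1e7 seconds of effective data-taking per year and an averaged
  # recording rate inside the quoted 0.1-1 GB/s range.

  def yearly_volume_pb(rate_gb_per_s, live_seconds=1e7):
      """Raw data accumulated per year, in petabytes."""
      bytes_per_year = rate_gb_per_s * 1e9 * live_seconds
      return bytes_per_year / 1e15

  for rate in (0.5, 0.8):  # illustrative averaged rates, GB/s
      print(f"{rate} GB/s -> {yearly_volume_pb(rate):.0f} PB/year")
  # 0.5 GB/s -> 5 PB/year
  # 0.8 GB/s -> 8 PB/year   (consistent with the quoted 5-8 PB/year)
  ```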

  4. Worldwide distributed computing system
     • Small fraction of the analysis at CERN
     • ESD analysis using 12-20 large regional centres
       • how to use the resources efficiently
       • establishing and maintaining a uniform physics environment
     • Data exchange with tens of smaller regional centres, universities, labs
     Importance of cost containment:
     • components & architecture
     • utilisation efficiency
     • maintenance, capacity evolution
     • personnel & management costs
     • ease of use (usability efficiency)

  5. From Distributed Clusters to Fabrics & Grids

  6. Distributed Computing
     [Diagram: WAN, application servers, mass storage, data cache]
     Distributed computing, 1990s: locally distributed systems
     • Clusters
     • Parallel computers (IBM SP)
     • Advances in local area networks and cluster management techniques → 1,000-way clusters widely available
     Distributed Computing, 2000s:
     • Giant clusters → fabrics
     • New level of automation required
     • Geographically distributed systems → Computational Grids
     Key areas for R&D:
     • Fabric management
     • Grid middleware
     • High-performance networking
     • Grid operation

  7. The MONARC Multi-Tier Model (1999)
     [Diagram: CERN (Tier 0) at the top, linked at 2.5 Gbps; Tier 1 regional centres (IN2P3, RAL, FNAL) connected over 622 Mbps and 155 Mbps links; Tier 2 centres (universities and labs) below them, then department and desktop levels.]
     MONARC report: http://home.cern.ch/~barone/monarc/RCArchitecture.html
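
  (For a sense of why these link capacities matter when exchanging data between tiers, a minimal sketch. The 100 TB dataset size and the 50% usable-capacity factor are illustrative assumptions, not figures from the slides.)

  ```python
  # Illustrative transfer-time estimates over the link capacities shown in the
  # MONARC picture.

  def transfer_days(dataset_tb, link_mbps, efficiency=0.5):
      """Days needed to move `dataset_tb` terabytes over a `link_mbps` link."""
      bits = dataset_tb * 1e12 * 8             # dataset size in bits
      seconds = bits / (link_mbps * 1e6 * efficiency)
      return seconds / 86400

  for link in (155, 622, 2500):                # Mbps, as in the tier diagram
      print(f"{link:>5} Mbps: {transfer_days(100, link):6.1f} days for 100 TB")
  # Roughly 120, 30 and 7 days respectively.
  ```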

  8. The opportunity of Grid technology: the LHC Computing Model 2001 (evolving)
     [Diagram: the LHC Computing Centre built from the CERN Tier 0 centre, Tier 1 centres in the UK, USA, France, Italy, Germany, ..., Tier 2 and Tier 3 centres at labs and universities, physics departments, physics groups and desktops, serving the experiments (ATLAS, CMS, LHCb).]

  9. The Project

  10. The LHC Computing Grid Project: two phases
      Phase 1 (2002-04):
      • Development and prototyping
      • Approved by CERN Council 20 September 2001
      Phase 2 (2005-07):
      • Installation and operation of the full worldwide initial production Grid

  11. The LHC Computing Grid Project: Phase 1 Goals
      • Prepare the LHC computing environment
        • provide the common tools and infrastructure for the physics application software
        • establish the technology for fabric, network and grid management (buy, borrow, or build)
        • develop models for building the Phase 2 Grid
        • validate the technology and models by building progressively more complex Grid prototypes
        • operate a series of data challenges for the experiments
        • maintain reasonable opportunities for the re-use of the results of the project in other fields
      • Deploy a 50% model* production Grid including the committed LHC Regional Centres
      • Produce a Technical Design Report for the full LHC Computing Grid to be built in Phase 2 of the project
      * 50% of the complexity of one of the LHC experiments

  12. Funding of Phase 1 at CERN
      Funding for R&D activities at CERN during 2002-2004 comes partly through special contributions from member and associate states.
      • Major funding (people and materials) from:
        • United Kingdom, as part of PPARC's GridPP project
        • Italy (INFN)
      • Personnel and some materials at CERN also promised by Austria, Belgium, Bulgaria, Czech Republic, France, Germany, Greece, Hungary, Israel, Spain, Switzerland
      • Industrial funding: CERN openlab
      • European Union: DataGrid, DataTAG
      Funded so far: all of the personnel, ~1/3 of the materials

  13. Areas of Work
      Applications Support & Coordination:
      • Application Software Infrastructure (libraries, tools)
      • Object persistency, data models
      • Common Frameworks: Simulation, Analysis, ...
      • Adaptation of Physics Applications to the Grid environment
      • Grid tools, Portals
      Grid Deployment:
      • Data Challenges
      • Integration of the Grid & Physics Environments
      • Regional Centre Coordination
      • Network Planning
      • Grid Operations
      Computing System:
      • Physics Data Management
      • Fabric Management
      • Physics Data Storage
      • LAN Management
      • Wide-area Networking
      • Security
      • Internet Services
      Grid Technology:
      • Grid middleware
      • Scheduling
      • Data Management
      • Monitoring
      • Error Detection & Recovery
      • Standard application services layer

  14. Synchronised with DataGrid Prototypes

  15. Time constraints
      [Timeline, 2001-2006: a continuing R&D programme; prototypes 1, 2 and 3; pilot technology selection and pilot service; system software selection, development and acquisition; hardware selection and acquisition; leading to the 1st production service.]

  16. Organisation

  17. The LHC Computing Grid Project Structure
      [Organisation chart: the LHCC provides reviews and the Common Computing RRB covers resource matters; the project reports to the Project Overview Board; the Software and Computing Committee (SC2) sets requirements and monitors, advised by RTAGs; the Project Manager and the Project Execution Board direct the implementation teams.]

  18. The LHC Computing Grid Project Structure (external relationships)
      [The organisation chart from slide 17, extended with the project's links to the EU DataGrid Project, other Computing Grid projects, other HEP Grid projects, and other labs.]

  19. A few of the Grid Technology Projects
      Data-intensive projects:
      • DataGrid: 21 partners, coordinated by CERN (Fabrizio Gagliardi)
      • CrossGrid: 23 partners, complementary to DataGrid (Michal Turala)
      • DataTAG: funding for transatlantic demonstration Grids (Olivier Martin)
      European national HEP-related projects:
      • GridPP (UK); INFN Grid; Dutch Grid; NorduGrid; Hungarian Grid; ...
      US HEP projects:
      • GriPhyN: NSF funding; HEP applications
      • PPDG (Particle Physics Data Grid): DoE funding
      • iVDGL: international Virtual Data Grid Laboratory
      Global coordination:
      • Global Grid Forum
      • InterGrid: ad hoc HENP Grid coordination (Larry Price)

  20. Grid Technology for the LHC Grid
      • An LHC collaboration needs a usable, coherent computing environment: a Virtual Computing Centre, a worldwide Grid
      • Already, even in the HEP community, there are several Grid technology development projects with similar but different goals
      • And many of these overlap with other communities
      • How do we achieve and maintain compatibility and provide one usable computing system?
        • architecture? APIs? protocols?
        • ... while remaining open to external, industrial solutions
      This will be a significant challenge for the LHC Computing Grid Project.

  21. Project Structure: Project Overview Board
      Chair: CERN Director for Scientific Computing
      Secretary: CERN IT Division Leader
      Membership:
      • Spokespersons of the LHC experiments
      • CERN Director for Colliders
      • Representatives of countries/regions with a Tier-1 centre: France, Germany, Italy, Japan, United Kingdom, United States of America
      • 4 representatives of countries/regions with a Tier-2 centre, from CERN Member States
      In attendance: Project Leader, SC2 Chairperson

  22. Project Structure: Software and Computing Committee (SC2) (preliminary)
      • Sets the requirements
      • Approves the strategy & workplan
      • Monitors progress and adherence to the requirements
      • Gets technical advice from short-lived, focused RTAGs (Requirements & Technology Assessment Groups)
      Chair: to be appointed by the CERN Director General
      Secretary
      Membership:
      • 2 coordinators from each LHC experiment
      • Representative from the CERN EP Division
      • Technical Managers from centres in each region represented in the POB
      • Leader of the CERN Information Technology Division
      • Project Leader
      Invited: POB Chairperson

  23. Project Structure: Project Execution Board
      • Gets agreement on milestones, schedule, resource allocation
      • Manages the progress and direction of the project
      • Ensures conformance with SC2 recommendations
      • Identifies areas for study/resolution by SC2
      Membership (preliminary, POB approval required):
      • Project Management Team:
        • Project Leader
        • Area Coordinators: Applications; Fabric & basic computing systems; Grid technology (from worldwide grid projects); Grid deployment, regional centres, data challenges
      • Empowered representative from each LHC experiment
      • Project architect
      • Resource manager
      • Leaders of major contributing teams
      Constrained to 15-18 members

  24. Startup
      • Collaborations to appoint board members by 12 November
      • Hope to start POB, SC2, PEB meetings in November
      • Kick-off workshop in February
