
GridPP: Executive Summary





  1. GridPP: Executive Summary. Tony Doyle, Collaboration Board

  2. Contents • Executive Summary • What was GridPP1? • What is GridPP2? • Vision • Challenges • LCG • Data Challenges • Issues • Deployment Status (9/1/05) • Tier-1/A, Tier-2, NGS • M/S/N • EGEE Middleware • Applications • Dissemination • What lies ahead? • Beyond GridPP2 • Grid and e-Science Support in 2008 • Executive Summary

  3. What was GridPP1? • A team that built a working prototype grid of significant scale: > 2,000 (9,000) CPUs, > 1,000 (5,000) TB of available storage, > 1,000 (6,000) simultaneous jobs • A complex project where 88% of the milestones were completed and all metrics were within specification • A success: “the achievement of something desired, planned, or attempted”

  4. What is GridPP2? Structures agreed and in place (except LCG phase-2) • 253 Milestones, 112 Monitoring Metrics at present • Must deliver a “Production Grid”: a robust, reliable, resilient, secure, stable service delivered to end-user applications • The Collaboration aims to develop, deploy and operate a very large Production Grid in the UK for use by the worldwide particle physics community

  5. Vision
  • SCALE: GridPP will deliver Grid middleware and hardware infrastructure to enable the construction of a UK Production Grid for the LHC of significant scale.
  • INTEGRATION: The GridPP project is designed to integrate with the existing Particle Physics programme within the UK, thus enabling full use of Grid technology and efficient use of shared resources.
  • DISSEMINATION: The project will disseminate the GridPP deliverables in the multi-disciplinary e-Science environment and will seek to build collaborations with emerging non-PPARC Grid activities both nationally and internationally.
  • UK LHC COMPUTING: The main aim is to provide a computing environment for the UK Particle Physics Community capable of meeting the challenges posed by the unprecedented data, processing and analysis requirements of the LHC experiments.
  • OTHER UK PARTICLE PHYSICS COMPUTING: The process of creating and testing the computing environment for the LHC will naturally support the current and next generation of highly data intensive Particle Physics experiments.
  • EGEE: Grid technology is the framework used to develop the required capability: key components will be developed as part of the EGEE project and elsewhere.
  • LCG: The collaboration builds on the strong computing traditions of the UK at CERN. GridPP will make a strong contribution to the LCG deployment and operations programme.
  • INTEROPERABILITY: The project is integrated with national and international developments from other Grid projects and the GGF in order to ensure a common set of principles, protocols and standards that can support a wide range of applications.
  • INFRASTRUCTURE: Provision is made for a Tier-1 facility at RAL and four Regional Tier-2s, encompassing the collaborating Institutes.
  • OTHER FUNDING: The Tier-1 and Tier-2s will provide a focus for dissemination to the academic and commercial sector and will attract additional funds such that the full programme can be realised.

  6. What are the Grid challenges? • Must: • share data between thousands of scientists with multiple interests • link major (Tier-0 [Tier-1]) and minor (Tier-1 [Tier-2]) computer centres • ensure all data is accessible anywhere, anytime • grow rapidly, yet remain reliable for more than a decade • cope with the different management policies of different centres • ensure data security • be up and running routinely by 2007

  7. What are the Grid challenges? • 1. Software process • 2. Software efficiency • 3. Deployment planning • 4. Link centres • 5. Share data • 6. Manage data • 7. Install software • 8. Analyse data • 9. Accounting • 10. Policies (Data Management, Security and Sharing)

  8. LCG Overview • By 2007: 100,000 CPUs at more than 100 institutes worldwide • Building on complex middleware being developed in advanced Grid technology projects, both in Europe (gLite) and in the USA (VDT) • Prototype went live in September 2003 in 12 countries • Extensively tested by the LHC experiments during summer 2004

  9. Data Challenges • Ongoing: Grid and non-Grid production, with the Grid component now significant • ALICE: 35 CPU years; Phase 1 done, Phase 2 ongoing on LCG • CMS: 75 M events and 150 TB, the first of this year’s Grid data challenges • Entering the Grid production phase..

  10. ATLAS Data Challenge • 7.7 M GEANT4 events and 22 TB, ~150 CPU years so far • UK ~20% of the LCG contribution • Ongoing: (3) Grid production • ATLAS has the largest total computing requirement, and this is a small fraction of what it needs.. • Entering the Grid production phase..

  11. LHCb Data Challenge • Phase 1 completed: 186 M events produced, 424 CPU years (4,000 kSI2k months) • Production rates: 3-5 × 10^6 events/day with LCG in action, 1.8 × 10^6/day with DIRAC alone (LCG was paused and restarted during the challenge) • UK input significant (>1/4 of total) • LCG(UK) resources: Tier-1 7.7%; Tier-2 sites: London 3.9%, South 2.3%, North 1.4% • DIRAC sites: Imperial 2.0%, Liverpool 3.1%, Oxford 0.1%, ScotGrid 5.1% • Entering the Grid production phase..
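As a sanity check on the totals above (424 CPU years spent on 186 M events), the implied average simulation cost per event can be worked out directly; this per-event figure is derived, not quoted on the slide:

```python
# Back-of-envelope check of the LHCb DC'04 figures:
# 424 CPU years were spent producing 186 million events.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year in seconds

cpu_years = 424
events = 186e6

cpu_seconds = cpu_years * SECONDS_PER_YEAR
seconds_per_event = cpu_seconds / events  # average cost of one simulated event

print(f"~{seconds_per_event:.0f} CPU seconds per event")  # roughly 72 s
```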

  12. Paradigm Shift: Transition to Grid… 424 CPU years in total, split month by month between DIRAC and LCG • May: 89%:11%, delivering 11% of DC’04 • Jun: 80%:20%, delivering 25% of DC’04 • Jul: 77%:23%, delivering 22% of DC’04 • Aug: 27%:73%, delivering 42% of DC’04
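The monthly splits above can be combined into an overall Grid share of DC'04. This is an illustrative calculation: reading each ratio as DIRAC:LCG follows the slide's "transition to Grid" framing, and the overall LCG fraction it produces is derived, not quoted:

```python
# Per month: (fraction of DC'04 produced that month, DIRAC share, LCG share),
# using the percentages quoted on the slide.
months = {
    "May": (0.11, 0.89, 0.11),
    "Jun": (0.25, 0.80, 0.20),
    "Jul": (0.22, 0.77, 0.23),
    "Aug": (0.42, 0.27, 0.73),
}

total = sum(frac for frac, _, _ in months.values())
lcg_overall = sum(frac * lcg for frac, _, lcg in months.values())

print(f"months cover {total:.0%} of DC'04; ~{lcg_overall:.0%} of it ran on LCG")
```

The August flip (73% on LCG) is what makes the overall Grid share sizeable despite the Grid-light early months.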

  13. Issues • First large-scale Grid production problems being addressed… at all levels: “LCG-2 Middleware Problems and Requirements for LHC Experiment Data Challenges”, https://edms.cern.ch/file/495809/2.2/LCG2-Limitations_and_Requirements.pdf

  14. Is GridPP a Grid? Foster’s three-point checklist (http://www-fp.mcs.anl.gov/~foster/Articles/WhatIsTheGrid.pdf): • Coordinates resources that are not subject to centralized control? YES. This is why development and maintenance of LCG is important. • …using standard, open, general-purpose protocols and interfaces? YES. VDT (Globus/Condor-G) + EDG/EGEE (gLite) ~meet this requirement. • …to deliver nontrivial qualities of service? YES. LHC experiment data challenges over the summer of 2004 (http://agenda.cern.ch/fullAgenda.php?ida=a042133).

  15. GridPP Deployment Status (9/1/05) • GridPP deployment is part of LCG, currently the largest Grid in the world • The future Grid in the UK is dependent upon LCG releases • Three Grids on a global scale in HEP (similar functionality): • LCG (GridPP): 90 (16) sites, 9,000 (2,029) CPUs • Grid3 [USA]: 29 sites, 2,800 CPUs • NorduGrid: 30 sites, 3,200 CPUs
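The counts above put GridPP's contribution to LCG in perspective; the percentage shares below are derived from the quoted numbers:

```python
# (sites, CPUs) for the three HEP Grids quoted on the slide;
# GridPP's own counts are the parenthesised LCG figures.
grids = {
    "LCG": (90, 9000),
    "Grid3": (29, 2800),
    "NorduGrid": (30, 3200),
}
gridpp_sites, gridpp_cpus = 16, 2029

lcg_sites, lcg_cpus = grids["LCG"]
cpu_share = gridpp_cpus / lcg_cpus    # GridPP fraction of LCG CPUs
site_share = gridpp_sites / lcg_sites # GridPP fraction of LCG sites

print(f"GridPP supplies {cpu_share:.0%} of LCG CPUs and {site_share:.0%} of LCG sites")
```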

  16. UK Tier-1/A Centre: Rutherford Appleton Laboratory • High quality data services • National and international role • UK focus for international Grid development • 1,000 CPUs, 200 TB disk, 60 TB tape (capacity 1 PB) • Grid resource discovery time = 8 hours • [Charts: 2004 disk use and 2004 CPU utilisation]

  17. UK Tier-2 Centres • ScotGrid: Durham, Edinburgh, Glasgow • NorthGrid: Daresbury, Lancaster, Liverpool, Manchester, Sheffield • SouthGrid: Birmingham, Bristol, Cambridge, Oxford, RAL PPD, Warwick • LondonGrid: Brunel, Imperial, QMUL, RHUL, UCL

  18. Level-2 Grid • Sites: Leeds, Manchester, DL, Oxford, RAL • In future will include services to facilitate collaborative (grid) computing: • Authentication (PKI X509) • Job submission/batch service • Resource brokering • Authorisation • Virtual Organisation management • Certificate management • Information service • Data access/integration (SRB/OGSA-DAI/DQPS) • National registry (of registries) • Data replication • Data caching • Grid monitoring • Accounting
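The job submission service listed above was driven, in the EDG/LCG-2 middleware of the era, by JDL (Job Description Language) files handed to the resource broker. A minimal sketch of composing such a description follows; the attribute set is a common JDL subset, and the helper function and executable choice are illustrative, not from the slide:

```python
# Build a minimal JDL (Job Description Language) string of the kind the
# EDG/LCG-2 resource broker consumed. All concrete values are illustrative.
def make_jdl(executable, arguments="", stdout="std.out", stderr="std.err"):
    attrs = {
        "Executable": f'"{executable}"',
        "Arguments": f'"{arguments}"',
        "StdOutput": f'"{stdout}"',
        "StdError": f'"{stderr}"',
        # Files the broker returns to the user after the job finishes.
        "OutputSandbox": f'{{"{stdout}", "{stderr}"}}',
    }
    body = "".join(f"  {key} = {value};\n" for key, value in attrs.items())
    return "[\n" + body + "]\n"

jdl = make_jdl("/bin/hostname")
print(jdl)
```

A file like this would then be submitted with the broker's command-line tools, which match it against site information published in the information service.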

  19. Middleware Development • Network Monitoring • Configuration Management • Grid Data Management • Storage Interfaces • Information Services • Security

  20. Application Development • ATLAS • LHCb • CMS • SAMGrid (FermiLab) • BaBar (SLAC) • QCDGrid • PhenoGrid

  21. More Applications • ZEUS uses LCG: it needs the Grid to respond to increasing demand for MC production, and has run 5 million Geant events on the Grid since August 2004 • QCDGrid, for UKQCD: currently a 4-site data grid managing a few hundred gigabytes of data; key technologies used: Globus Toolkit 2.4, European DataGrid, eXist XML database

  22. Dissemination: much has happened.. more people are reading about it..
  • Wed 5 Jan 2005: LHCb-UK members get up to speed with the Grid
  • Thu 9 Dec 2004: GridPP in Pittsburgh
  • Mon 6 Dec 2004: GridPP website busier than ever
  • Wed 24 Nov 2004: Optorsim 2.0 released
  • Mon 15 Nov 2004: ZEUS produces 5 million Grid events
  • Tue 26 Oct 2004: CERN 50th anniversary reception
  • Mon 18 Oct 2004: GridPP at CHEP'04
  • Mon 4 Oct 2004: LHCb data challenge first phase a success for LCG and UK
  • Mon 4 Oct 2004: Networking in Nottingham - GLIF launch meeting
  • Mon 6 Sep 2004: GridPP going for Gold - website award at AHM
  • Wed 1 Sep 2004: GridPP at the All Hands Meeting
  • Wed 18 Aug 2004: R-GMA included in latest LCG release
  • Tue 27 Jul 2004: LCG2 administrators learn tips and tricks in Oxford
  • Fri 2 Jul 2004: Take me to your (project) leader
  • Fri 25 Jun 2004: ScotGrid's 2nd birthday: ScotGrid clocks up 1 million CPU hours
  • Fri 18 Jun 2004: Meet your production manager
  • Wed 9 Jun 2004: GridPP10 report and photographs
  • Wed 2 Jun 2004: CERN recognizes UK's outstanding contribution to Grid computing
  • Wed 19 May 2004: UK particle physics Grid takes shape
  • Mon 10 May 2004: A new monitoring map for GridPP
  • Tue 4 May 2004: Press reaction to EGEE launch
  • Tue 27 Apr 2004: GridPP at the EGEE launch conference
  • Thu 8 Apr 2004: LCG2 released
  • Thu 8 Apr 2004: University of Warwick joins GridPP
  • Thu 1 Apr 2004: Grid computing steps up a gear: the start of EGEE
  • Mon 22 Mar 2004: EDG gets glowing final review
  • Tue 16 Mar 2004: Grids and Web Services meeting, 23 April, London
  • Fri 27 Feb 2004: GridPP Middleware workshop, March 4-5 2004, UCL
  • Fri 20 Feb 2004: Version 1.0 of the Optorsim grid simulation tool released by EU DataGrid
  • Tue 17 Feb 2004: Summary and photographs of the 9th GridPP Collaboration Meeting
  • Thu 12 Feb 2004: 138,976 hits in December

  23. What lies ahead? Some mountain climbing.. • Annual data storage: 12-14 PetaBytes per year (a CD stack holding 1 year of LHC data would be ~20 km tall; Concorde flies at 15 km) • CPU required: 100 million SPECint2000, i.e. ~100,000 PCs (3 GHz Pentium 4) • We are here (1 km): in production terms, we’ve made base camp • Quantitatively, we’re ~9% of the way there in terms of CPU (9,000 of 100,000) and disk (3 of 12-14*3 years)… • Importance of step-by-step planning: pre-plan your trip, carry an ice axe and crampons and arrange for a guide…
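The "~9% of the way there" estimate can be made explicit from the numbers on this slide; taking ~13 PB/year as the midpoint of the quoted 12-14 PB range is my assumption:

```python
# Progress toward the 2007 LHC computing targets, from the slide's figures.
cpus_now, cpus_needed = 9_000, 100_000   # 3 GHz Pentium 4 equivalents
disk_now_pb = 3                          # PB available today
disk_needed_pb = 13 * 3                  # ~12-14 PB/year over 3 years of data

cpu_fraction = cpus_now / cpus_needed
disk_fraction = disk_now_pb / disk_needed_pb

print(f"CPU: {cpu_fraction:.0%} of target, disk: {disk_fraction:.0%} of target")
```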

  24. Executive Summary (GRIDPP-PMB-40-EXEC) • Introduction: The Grid is a reality • Project Management: A project was/is needed • Resources: Under control • LCG: LCG2 support: SC case being written • Deployment: 16 UK sites are on the Grid • Tier-1/A production, Tier-2 resources: MoUs, planning, deployment, monitoring each underway as part of GridPP2 • M/S/N: Developments estd., R-GMA deployed • EGEE: gLite designed inc. web services • Applications: Interfaces developed, testing phase • Dissemination: Area transformed • (Beyond GridPP2): Initial ideas.. consultation reqd.
