
Beyond GridPP2


Presentation Transcript


  1. Beyond GridPP2
  Tony Doyle, Collaboration Board

  2. Outline
  • Context
  • Experiments + Grid = Discovery?
  • Long Term Grid Challenges
  • Grid Components
  • Resource Planning
  • Middleware Support
  • Experiment Requirements
  • LHC Exploitation Planning Review
  • “The Grid for LHC Exploitation”
  • Timeline and Next Steps

  3. LHC exploitation
  • Physics discovery requires many elements to work.. “The Icemen Cometh”
  • Timeline: 2004-05 “Functional Testing”; 2006-07 “Performance Testing”; 2008-09 “Physics discovery”

  4. What are the Grid challenges?
  1. Software process
  2. Software efficiency
  3. Deployment planning
  4. Link centres
  5. Share data
  6. Manage data
  7. Install software
  8. Analyse data
  9. Accounting
  10. Policies
  Data Management, Security and Sharing

  5. Grid for LHC exploitation
  Layers: I. Experiment Layer; II. Application Middleware; III. Grid Middleware; IV. Facilities and Fabrics
  What areas require support?
  • I: LHC experiments’ computing MoU support via Rolling Grants
  • I: Future experiments adopt the Grid
  • II: Generic application development (reduced application and M/S/N development compared to current)
  • III: M/S/N experts in each of 6 areas
  • III: Production Manager and four Tier-2 coordinators
  • IV: Tier-1 system managers running the Data Centre
  • IV: Annual hardware upgrade
  • IV: Tier-2 system managers (hardware typically non-PPARC)
  • IV: Front-end Tier-2 hardware
  • Reduced GridPP management (see Steve’s slide)

  6. LCG Tier-1 Planning (CPU & Storage)
  Experiment requests are large: e.g. in 2008, CPU ~50 MSi2k and storage ~70 PB! They can be met globally except in 2008. UK expected to contribute ~7% [currently more].
  First LCG Tier-1 Compute Law: CPU:Storage ~ 1 [kSi2k/TB]
  Second LCG Tier-1 Storage Law: Disk:Tape ~ 1
  (The number to remember is.. 1)
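  The two “laws” above are simple ratios, so the implied storage can be sketched directly from a CPU figure. A minimal Python illustration using the slide’s numbers (~50 MSi2k global CPU in 2008, ~7% UK share); the function name and the exact disk/tape split are illustrative assumptions, not an official planning formula:

      # LCG Tier-1 rules of thumb from the slide:
      #   First law:  CPU:Storage ~ 1 kSi2k per TB of total storage
      #   Second law: Disk:Tape   ~ 1 (storage split roughly evenly)
      CPU_PER_TB_KSI2K = 1.0
      DISK_TAPE_RATIO = 1.0

      def tier1_storage_from_cpu(cpu_ksi2k):
          """Estimate total, disk and tape storage (TB) implied by a CPU request."""
          total_tb = cpu_ksi2k / CPU_PER_TB_KSI2K                       # First law
          disk_tb = total_tb * DISK_TAPE_RATIO / (1 + DISK_TAPE_RATIO)  # Second law
          return total_tb, disk_tb, total_tb - disk_tb

      # Slide's 2008 figures: ~50 MSi2k CPU globally, UK share ~7%
      uk_cpu_ksi2k = 50_000 * 0.07
      total, disk, tape = tier1_storage_from_cpu(uk_cpu_ksi2k)
      print(f"UK: ~{uk_cpu_ksi2k:.0f} kSi2k CPU -> ~{total:.0f} TB storage "
            f"(~{disk:.0f} TB disk, ~{tape:.0f} TB tape)")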

  7. LCG Tier-1 Planning (Storage)
  [storage planning chart; not reproduced in the transcript]

  8. LCG Tier-1 Planning
  • 2006: March 2005 detailed planning (bottom-up), v26b
    • [uncertainty on when within 2006 – PPARC now approved ~£996k]
    • PPARC ready to sign MoU
  • 2007-10:
    • March 2005 detailed planning (bottom-up), v26b [old plan]
    • August 2005 minimal Grid (top-down) [input requiring experiments’ support, part of the LHC Exploitation Planning Review..]

  9. LCG Tier-2 Planning
  Zeroth LCG Law: there is no Zeroth law – all is uncertain
  Third LCG Tier-2 Compute Law: Tier-1:Tier-2 CPU ~ 1
  Fifth LCG Tier-2 Storage Law: CPU:Disk ~ 5 [kSi2k/TB]
  • 2006: October 2004 Institute MoU commitments
    • [2005 outturn] requirement currently less than “planned”; need to monitor this planning..
    • PPARC ready to sign MoU
  • 2007-10:
    • 2007 MoU, followed by a pessimistic guess [current plan]
    • August 2005 minimal Grid (top-down) [input requiring experiments’ support, part of the LHC Exploitation Planning Review..]
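  The Tier-2 laws extend the same back-of-envelope arithmetic. A companion sketch to the Tier-1 one above (again hedged: hypothetical function name, illustrative input figure):

      # Tier-2 rules of thumb from the slide:
      #   Third law: Tier-1:Tier-2 CPU ~ 1 (Tier-2 CPU matches Tier-1 CPU)
      #   Fifth law: CPU:Disk ~ 5 kSi2k per TB at the Tier-2s
      def tier2_from_tier1_cpu(tier1_cpu_ksi2k):
          """Estimate Tier-2 CPU (kSi2k) and disk (TB) from the Tier-1 CPU."""
          tier2_cpu = tier1_cpu_ksi2k   # Third law
          tier2_disk = tier2_cpu / 5.0  # Fifth law
          return tier2_cpu, tier2_disk

      cpu, disk = tier2_from_tier1_cpu(3500)  # 3500 kSi2k: illustrative UK Tier-1 figure
      print(f"Tier-2: ~{cpu:.0f} kSi2k CPU, ~{disk:.0f} TB disk")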

  10. EGEE: Enabling Grids for E-science in Europe is “E-Infrastructure”
  • Deliver a 24/7 Grid service to European science
  • Build a consistent, robust and secure Grid network that will attract additional computing resources
  • Continuously improve and maintain the middleware in order to deliver a reliable service to users
  • Attract new users from industry as well as science, and ensure they receive the high standard of training and support they need
  • 100 million euros / 4 years, funded by the EU
  • >400 software engineers + service support
  • 70++ European partners

  11. Phase 2 Overview
  EGEE is the Grid Infrastructure Project in Europe
  • Take the lead in developing roadmaps, white papers, collaborations
  • Organise European flagship events
  • Collaborate with other projects (including CPS)
  • Start date = 1 April 2006
  • UK(+I) partners: CCLRC + Edinburgh + PPARC (+TCD) (n.b. UK e-Science, not only HEP)
    • NeSC: Training, Dissemination & Applications
    • NeSC: Networking
    • CLRC: Grid Operations, Support & Management
    • CLRC: Middleware Engineering (R-GMA)
  • UK phase 2 added partners (T2 coordinators): Glasgow, ICSTM, Manchester, Oxford
    • Funded effort dedicated to deploying regional grids
    • Matched funding required

  12. Middleware Development
  • Network Monitoring
  • Configuration Management
  • Grid Data Management
  • Storage Interfaces
  • Information Services
  • Security

  13. Technical Design Reports (June 2005)
  Computing Technical Design Reports: http://doc.cern.ch/archive/electronic/cern/preprints/lhcc/public/
  • ALICE: lhcc-2005-018.pdf
  • ATLAS: lhcc-2005-022.pdf
  • CMS: lhcc-2005-023.pdf
  • LHCb: lhcc-2005-019.pdf
  • LCG: lhcc-2005-024.pdf
  LCG Baseline Services Group Report: http://cern.ch/LCG/peb/bs/BSReport-v1.0.pdf
  Contains all you (probably) need to know about LHC computing. End of prototype phase.

  14. UK Support for the LHC Experiments
  The basic functionality of the Tier-1 is:
  • ALICE: Reconstruction, Analysis
  • ATLAS: Reconstruction, Skimming, Scheduled Analysis, Calibration
  • CMS: Reconstruction, Skimming, Scheduled Analysis
  • LHCb: Reconstruction, Skimming, Analysis
  The basic functionality of the Tier-2s is:
  • ALICE: Simulation, Analysis
  • ATLAS: Simulation, Calibration, “Chaotic” Analysis
  • CMS: Simulation, “Chaotic” Analysis
  • LHCb: Simulation
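  The division of labour above is effectively a lookup table; a hypothetical Python sketch (workload names copied from the slide, variable names invented) shows one way to encode it:

      # Workloads per experiment at each tier, as listed on the slide
      TIER1 = {
          "ALICE": ["Reconstruction", "Analysis"],
          "ATLAS": ["Reconstruction", "Skimming", "Scheduled Analysis", "Calibration"],
          "CMS":   ["Reconstruction", "Skimming", "Scheduled Analysis"],
          "LHCb":  ["Reconstruction", "Skimming", "Analysis"],
      }
      TIER2 = {
          "ALICE": ["Simulation", "Analysis"],
          "ATLAS": ["Simulation", "Calibration", "Chaotic Analysis"],
          "CMS":   ["Simulation", "Chaotic Analysis"],
          "LHCb":  ["Simulation"],
      }

      # Example query: every experiment simulates at Tier-2, none at Tier-1
      print([exp for exp, work in TIER2.items() if "Simulation" in work])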

  15. PPARC Review
  Funding from September 2007 will be incorporated as part of PPARC’s request for planning input for LHC exploitation from the LHC experiments and GridPP. This input will be considered by a Panel consisting of Prof. G. Lafferty (Chair), Prof. S. Watts and Dr. P. Harris, meeting over the summer to provide input to the Science Committee in the autumn.
  Conclusion: separate e-Science call in March 2006

  16. "Beyond GridPP2 and e-Infrastructure" LHC EXPLOITATION PLANNING REVIEW Input is requested from the UK project spokespersons, for ATLAS and CMS for each of the financial years 2008/9 to 2011/12, and for LHCb, ALICE and GridPP for 2007/8 to 2011/12. Physics programme Please give a brief outline of the planned physics programme. Please also indicate how this planned programme could be enhanced with additional resources. In total this should be no more than 3 sides of A4. The aim is to understand the incremental physics return from increasing resources. Input was based upon PPAP roadmap input E-Science and LCG-2 (26 Oct 2004) and feedback from CB (12 Jan & 7 July 2005) 3 page description: “The Grid for LHC Exploitation” submitted in August 2005 Collaboration Board

  17. Beyond GridPP2..
  • 3-page description: “The Grid for LHC Exploitation”
  • “In order to calculate the minimum amount of resource required at the UK Tier-1 and Tier-2 we have taken the total Tier-1 and Tier-2 requirements of the experiments multiplied by a UK ‘share’.”
  • Experiments should determine the “incremental physics return from increasing resources”.

  18. Tier-1 Requirements
  • Minimal UK Grid – each experiment may wish to increase its share (tape omitted for clarity)

  19. Tier-2 Requirements
  • Initial requirements can be met via SRIF3 (2006-08..)
  • Uncertain beyond this..

  20. Manpower
  • Input requirements for a “minimal” Grid
  • Supports LHC and other experiments
  • Does not include the wider E-Infrastructure (EGEE and beyond)

  21. FTEs Beyond GridPP2
  • Operations support maintained [38 FTEs]
  • Transition between FY08-10
  • Reduced management
  • Application interfaces moving from LHC to new experiments
  • Reduced middleware development; support maintained

  22. Estimated Costs
  • Naïve Full Economic Cost approach: ~£10m p.a.

  23. Cost Breakdown
  Total: £9,393k [breakdown table not reproduced in the transcript]

  24. Grid and e-Science Exploitation Timeline
  • PPARC call assessment (2007-2010): 2004-05
  • PPAP initial input: Oct 2004
  • LHC exploitation review input: Aug 2005
  • CB ratification: Mar 2006
    (GridPP = Grid Support + Tier-1 + 4 x Tier-2)
  • PPARC call: Mar 2006
  • 06Q1 logbooks, Proforma 2 (Group input): Apr 2006
  • Internal panel review: May 2006
  • Draft GridPP proposal: Jun 2006
  • PPARC GridPP Oversight Committee: Jun 2006
  • PPARC close of call (final GridPP proposal): July 2006
  • PPRP assessment: Sep-Dec 2006
  • PPARC (Science Committee) outcome: Mar 2007
  • Institute recruitment/retention: Apr-Aug 2007
  • GridPP continuation phase?: Sep 07-Mar 08
  • Grid for LHC Exploitation phase: Apr 2008-…

  25. Summary
  • LHC exploitation planning underway
  • Long-term planning (2007-10..) is uncertain, but defined in a World context
  • GridPP provided PPARC with planning input for the LHC Exploitation Grid
  • The (full economic) costs involved for even a minimal LHC Computing Grid are significant, and recognised by PPARC
  • The CB should decide one of:
    1. One GridPP response to the PPARC call in March, or
    2. Agree that there will be separate responses for:
       • 1 x Tier-1 resource centre
       • 4 x Tier-2 resource centres
       • 1 x Grid Operations, Middleware Support, Application Interfaces, Management
    3. Dissolve GridPP
  • Overall finances will be known with the PPARC call in March – this will constrain the overall planning
  • The timeline requires internal coordination/planning up to July and responses to PPRP questions during 2006
  • GridPP (or its successor) needs to (continue to) demonstrate its wider significance (in order to enhance PPARC’s funding at a higher level)

  26. Next steps
  If option 3 (dissolve GridPP) is chosen: [details not reproduced in the transcript]

  27. Next steps
  If either option 1 or 2 is chosen, the GridPP internal procedure will follow that from GridPP2 for the relevant components, defining the work areas (see next slide).
  If option 2 is chosen, the resource-centre responses will need to specify:
  • the proposed level of computing resource (keeping in mind the experiment requirements);
  • which experiments are enabled (currently all PP VOs are enabled within each of the 5 resource centres);
  • what resources are allocated to which experiment, and how they are supported in terms of:
    • Reconstruction (expected to be Tier-1 only)
    • Simulation (expected to be Tier-2 only)
    • Analysis (expected to be provided at Tier-1 and Tier-2)
  • the proposed cost

  28. Four Proforma
  • Review of existing posts: existing logbooks (06Q1) will be reviewed to examine the posts’ effectiveness (a requirement of the original grants)
  • Project definition: the project deliverables are to be confirmed or updated by the PMB; this defines the programme of work planned in the ~3-year period [proforma 2, based upon “Grid for LHC exploitation”]
  • Tender: describes how effort at a given institute can meet these project deliverables [proforma 3]
  • Assessment: input from the PMB to the PRSC (Peer Review Selection Committee) [proforma 4]
  See old CB slides (defining GridPP2 work areas): http://www.gridpp.ac.uk/cb/doc/GridPP_CB_031204_TD.ppt

  29. Outcome
  By the end of this process we will be able to:
  • Continue the work started in GridPP2 (by many people)
  • Define the Project Map tasks and metrics
  • Retain staff in recognised areas to contribute to agreed tasks
  • Contribute fully to EGEE and beyond
  • Start to go from Production to Exploitation (and hence analyse real LHC data..)
  The exploitation phase will continue for 10 years, so we should ensure that we define this well..
