LHCb is Beautiful?

Presentation Transcript


  1. LHCb is Beautiful? Glenn Patrick GridPP19, 29 August 2007

  2. In the beginning…

  3. LHCb – GridPP1 Era (May 2002) Empty!

  4. LHCb – GridPP2 Era (Mar 2005) Not Beautiful!

  5. LHCb December 2006 – Getting Pretty! [Image of the detector with its subsystems labelled: VELO, RICH1, Trackers, Magnet, RICH2, Calorimeters, Muon system; the p–p collision point is indicated.]

  6. 2008 – Suddenly Beautiful! [Diagram: B0 and B̄0 mesons and their b and d quark content.] 1000 million B mesons/year. Summer 2008 – Beauty at Last?

  7. Origins of Grid for LHCb… GridPP at NeSC Opening – 25 April 2002. “…and so it is with the Grid?” [Diagram: a job on the CERN testbed Compute Element writes its data to local disk, copies it to the Storage Element (mss) with globus-url-copy, and enters it in the Replica Catalogue via register-local-file/publish; a job at NIKHEF – Amsterdam (rest-of-Grid) then fetches the data from its own Storage Element with replica-get.]
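
In code terms, the 2002 demo boils down to “copy the job’s output to a Storage Element, then register the copy so the rest of the Grid can find it”. A minimal sketch of that pattern, assuming a hypothetical register_replica() helper and illustrative hostnames (globus-url-copy itself is the real Globus transfer tool of the period):

```python
import subprocess

def store_and_register(local_path, se_url, lfn):
    """Copy a local output file to a Storage Element and record a replica.

    globus-url-copy is the standard Globus transfer client of the era;
    the SE URL, LFN and register_replica() helper are illustrative only.
    """
    # Transfer from the worker node's local disk to the SE (mss).
    subprocess.run(
        ["globus-url-copy", f"file://{local_path}", se_url],
        check=True,
    )
    # Publish the new copy so jobs elsewhere on the Grid can find it
    # via the Replica Catalogue (hypothetical client call).
    register_replica(lfn, se_url)

def register_replica(lfn, physical_url):
    # Placeholder for the register-local-file / publish step in the diagram.
    print(f"register {lfn} -> {physical_url}")

if __name__ == "__main__":
    store_and_register(
        "/tmp/job_output.dat",                          # local disk on the CE
        "gsiftp://se.example.cern.ch/lhcb/output.dat",  # illustrative SE URL
        "lfn:/lhcb/test/output.dat",
    )
```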

  8. DIRAC WMS Evolution (2006) – the Pilot Agent model. [Architecture diagram: the user’s Job JDL goes to the DIRAC Job Receiver; the Data Optimizer (checkData, getReplicas from the LFC) and JobDB feed the Task Queue; the Agent Director submits Pilot Jobs through the LCG Resource Brokers (RB) to Compute Elements, watched by the Agent Monitor (checkPilot) and Job Monitor (checkJob); on the worker node the Pilot Agent calls the Matcher, fetches the sandbox and proxy, and the Job Wrapper forks the workload and executes it under glexec, uploading data and putting requests via the VO-box.]
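
The essence of the pilot-agent scheme is late binding: work is matched to a worker node only once a pilot is already running there and asks the central Matcher for a job. A minimal sketch of that loop, with the MatcherClient and all job details purely hypothetical stand-ins for the DIRAC services:

```python
import subprocess
import time

class MatcherClient:
    """Hypothetical stand-in for the central DIRAC Matcher service."""

    def __init__(self):
        # Pretend the central Task Queue holds a single waiting job.
        self._queue = [{"job_id": 42, "executable": "/bin/echo",
                        "args": ["hello from a pilot"]}]

    def request_job(self, ce_description):
        # The real pilot sends a description of its worker node / CE and
        # receives a matched job plus its input sandbox and a limited proxy.
        return self._queue.pop() if self._queue else None

def run_pilot(ce_description):
    """Late binding: the job is only chosen once the pilot is running."""
    matcher = MatcherClient()
    while True:
        job = matcher.request_job(ce_description)
        if job is None:
            break                                  # queue drained, pilot exits
        # Job Wrapper step: fork the real workload on the worker node
        # (the 2006 design runs it under glexec to switch identity).
        subprocess.run([job["executable"], *job["args"]], check=False)
        time.sleep(0)                              # then ask for more work

if __name__ == "__main__":
    run_pilot({"site": "LCG.RAL.uk", "max_cpu_time": 86400})
```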

  9. DIRAC Production & Analysis. DIRAC1: started 19.12.2002; DIRAC3 (data ready): due 2007. GridPP: Gennady Kuznetsov (RAL) – DIRAC Production Tools. [Architecture diagram: user interfaces (GANGA UI, user CLI, production manager, job monitor, BK query web page, FileCatalog browser) sit above the DIRAC services (Job Management Service, BookkeepingSvc, FileCatalogSvc, JobMonitorSvc, InformationSvc, MonitoringSvc, JobAccountingSvc with its AccountingDB); Agents drive the DIRAC resources – DIRAC CEs at the DIRAC sites, the LCG Resource Broker with its CEs, and DIRAC Storage reached via gridftp, bbftp and rfio.]

  10. GANGA: Gaudi ANd Grid Alliance – first ideas, 2001. Pere Mato: LHCb Workshop, Bologna, 15 June 2001. GridPP: Alexander Soroko (Oxford), Karl Harrison (Cambridge), Ulrik Egede (Imperial), Alvin Tan (Birmingham). [Concept diagram: the GANGA GUI sits between the GAUDI program (JobOptions, Algorithms) and the collective & resource Grid services, returning histograms, monitoring and results to the user.]

  11. Ganga Evolution: 2001–2007. Ganga is now experiment-neutral and replaces experiment-specific submission tools. Applications: Executable (generic); LHCb Gauss/Boole/Brunel/DaVinci (simulation/digitisation/reconstruction/analysis); ATLAS AthenaMC (production) and Athena (simulation/digitisation/reconstruction/analysis). Backends: Local, PBS, LSF, OSG PANDA, NorduGrid, LHCb WMS, US-ATLAS WMS.
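
The practical effect of this plug-in design is that the same few lines define a job whatever application or backend it uses; only the backend object changes when moving from a local test to the Grid. A sketch in the style of the Ganga Python interface (run inside the ganga shell, where Job, Executable and the backend classes are pre-defined; treat the exact Grid backend name used here as an assumption for the 2007-era release):

```python
# A sketch in the style of the Ganga Python interface, circa 2007.
# Run inside the ganga shell (e.g. `ganga this_script.py`), where the
# job-building classes are pre-defined -- no imports are needed there.

j = Job()
j.application = Executable(exe='/bin/echo', args=['hello from Ganga'])
j.backend = Local()            # test the job locally first...
j.submit()

j2 = j.copy()                  # ...then send the identical job to the Grid
j2.backend = LCG()             # only the backend object changes (name assumed)
j2.submit()
```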

  12. Ganga 2007: Elegant Beauty? [Screenshots of the Ganga GUI – CERN, September 2005; Cambridge, January 2006; Edinburgh, January 2007 – showing the job builder, scriptor, logical folders, job details, job monitoring and log window.]

  13. Ganga Users – 2007. [Pie chart of users: ATLAS, LHCb, Other.] 806 unique users since 1 Jan 2007; LHCb = 162 unique users.

  14. Ganga by Domain – 2007. [Pie chart of users by internet domain: CERN, Other.]

  15. LHCb “Grid” – circa 2001: the initial LHCb-UK “Testbed” (sites marked as existing or planned). [Map: CERN (pcrd25.cern.ch, lxplus009.cern.ch); RAL CSF (120 Linux CPUs, IBM 3494 tape robot); RAL (PPD); RAL DataGrid Testbed; Liverpool MAP (300 Linux CPUs); Glasgow/Edinburgh “Proto-Tier 2”; Imperial College; Bristol; Oxford; Cambridge; other institutes.]

  16. LHCb Computing Model. Trigger chain: Level-0 hardware at 40 MHz → Level-1 software at 1 MHz → HLT software at 40 kHz → 2 kHz @ 30 kB/event = 60 MB/s to storage.
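
The 60 MB/s figure follows directly from the final trigger rate and the RAW event size; a one-line check of the arithmetic:

```python
# Quick check of the bandwidth figure: rate out of the HLT times event size.
hlt_output_rate_hz = 2_000            # 2 kHz of accepted events
raw_event_size_bytes = 30 * 1_000     # 30 kB/event

bandwidth_bytes_per_s = hlt_output_rate_hz * raw_event_size_bytes
print(f"{bandwidth_bytes_per_s / 1e6:.0f} MB/s")   # -> 60 MB/s
```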

  17. Monte Carlo Simulation 2007. 700M events simulated since May 2006; 1.5M jobs submitted. Record of 9715 simultaneous jobs over 70+ sites on 28 Feb 2007. Raja Nandakumar (RAL).

  18. Reconstruction & Stripping – 2007. [Plot of running jobs at CERN, RAL, NIKHEF, CNAF and IN2P3.] …but it is not so often that we get all Tier 1 centres working together. Peak of 439 jobs.

  19. Data Management – 2007 • Production jobs upload output to their associated Tier 1 SE (i.e. RAL in the UK). • Multiple “failover” SEs and multiple VO boxes are used in case of failure. • Replication done via FTS and a centralised Transfer DB. eScience PhD: Andrew Smith (Edinburgh).
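
The failover scheme above reads naturally as “try the associated Tier 1 SE, otherwise park the file on any failover SE and leave a request in the Transfer DB so FTS moves it home later”. A sketch of that logic, with every name (the SE list, upload_to_se, TransferDB) hypothetical rather than the actual LHCb implementation:

```python
# Illustrative failover upload for a production job's output file.
# All service names and helpers here are hypothetical placeholders.

TIER1_SE = "RAL-disk"                       # the job's associated Tier 1 SE
FAILOVER_SES = ["CERN-failover", "CNAF-failover", "IN2P3-failover"]

def upload_to_se(se_name, local_path):
    """Pretend transfer; simulate the Tier 1 SE being temporarily down."""
    return se_name != TIER1_SE

class TransferDB:
    """Stand-in for the central Transfer DB that drives FTS replication."""
    def add_request(self, source_se, dest_se, local_path):
        print(f"queued FTS transfer {local_path}: {source_se} -> {dest_se}")

def store_output(local_path, transfer_db):
    if upload_to_se(TIER1_SE, local_path):
        return                              # normal case: straight to Tier 1
    for se in FAILOVER_SES:                 # otherwise park it anywhere safe
        if upload_to_se(se, local_path):
            # ...and ask the Transfer DB / FTS to move it home later.
            transfer_db.add_request(se, TIER1_SE, local_path)
            return
    raise RuntimeError("no SE accepted the file")

if __name__ == "__main__":
    store_output("/scratch/00001234_brunel.dst", TransferDB())
```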

  20. Data Transfer – 2007 • RAW data replicated from Tier 0 to one of six Tier 1 sites. • gLite FTS used for T0–T1 replication. • Transfers trigger automated job submission for reconstruction. • Sustained total rate of 40 MB/s required (and achieved). Further DAQ–T0–T1 throughput tests at a 42 MB/s aggregate rate are scheduled for later in 2007.
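
The third bullet describes an event-driven chain: once FTS reports a RAW file safely at a Tier 1, a reconstruction job for that file is submitted there automatically. A minimal polling sketch of the idea, with the FTS status check and the job submission both reduced to hypothetical placeholders:

```python
import time

# Hypothetical placeholders -- the real chain uses gLite FTS status
# queries and DIRAC job submission.

def finished_transfers():
    """Yield (raw_file, tier1_site) pairs whose transfer just completed."""
    yield ("/lhcb/data/run1234/raw_0001.raw", "RAL")

def submit_reconstruction(raw_file, site):
    print(f"submit reconstruction of {raw_file} at {site}")

def watch_transfers(poll_seconds=60, cycles=1):
    for _ in range(cycles):            # bounded here; a daemon in real life
        for raw_file, site in finished_transfers():
            submit_reconstruction(raw_file, site)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    watch_transfers(poll_seconds=0, cycles=1)
```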

  21. Bookkeeping (2007). GridPP: Carmine Cioffi (Oxford). [Architecture diagram: the BookkeepingSvc and other AMGA clients send BookkeepingQuery reads to the BK Service; a servlet running under Tomcat on volhcb01 gives web-browser access; the AMGA server reads and writes the underlying Oracle DB.]
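
Both access routes in the diagram amount to “send a bookkeeping query, get back the matching files”. A sketch of the web route only, with the servlet path and query parameters entirely hypothetical (the host name is the one on the slide; the native AMGA client route is not shown):

```python
import urllib.parse
import urllib.request

# Hypothetical servlet path and query parameters, for illustration only;
# the host name is taken from the slide, the rest is assumed.
BK_URL = "http://volhcb01.cern.ch:8080/bookkeeping/query"

def query_bookkeeping(configuration, event_type):
    """Ask the bookkeeping servlet for the files matching a query."""
    params = urllib.parse.urlencode({
        "configuration": configuration,   # e.g. a production configuration
        "eventType": event_type,          # e.g. an event-type code
    })
    with urllib.request.urlopen(f"{BK_URL}?{params}") as response:
        return response.read().decode()   # list of matching files / metadata

if __name__ == "__main__":
    print(query_bookkeeping("example-configuration", "example-event-type"))
```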

  22. LHCb CPU Use 2005–2007. [Chart of CPU use by country: CERN, UK, France, Italy, Germany, Spain, Switzerland.] Many thanks to: Birmingham, Bristol, Brunel, Cambridge, Durham, Edinburgh, Glasgow, Imperial, Lancaster, Liverpool, Manchester, Oxford, QMUL, RAL, RHUL, Sheffield and all others.

  23. UKI Evolution for LHCb. [Maps: 2004 – the Tier 1; 2007 – the Tier 1 plus the London, NorthGrid, SouthGrid and ScotGrid Tier 2 federations.]

  24. GridPP3: Final Crucial Step(s). [Timeline: 2001–2004, 2004–2007, 2007–2008, then GridPP3 2008–2011 – Beauty!]

  25. Some 2007–2008 Milestones • Sustain DAQ–T0–T1 throughput tests at 40+ MB/s. • Reprocessing (second pass) of data at Tier 1 centres. • Prioritisation of analysis, reconstruction and stripping jobs (all at Tier 1 for LHCb); CASTOR has to work reliably for all service classes! • Ramp-up of hardware resources in the UK. • Alignment: Monte Carlo done with perfectly positioned detectors… reality will be different! • Calibration: Monte Carlo done with “well understood” detectors… reality will be different! The distributed Conditions Database plays a vital role. • Analysis: increasing load from individual users.

  26. The End (and the Start) – GridPP3. [Photo: Lyn Evans, EPS Conference on High Energy Physics, Manchester, 23 July 2007.]
