
ATLAS Status (Part II). For the detector, shut-down and upgrade activities, see the talk by M. Nessi. Fabiola Gianotti, RRB, 12/10/2009, CERN-RRB-2009-103. Outline: collaboration, management, organization; status of Computing and Software; detector commissioning with cosmics data; preparing for physics with early LHC data.


Presentation Transcript


  1. ATLAS Status (Part II). For the detector, shut-down and upgrade activities, see the talk by M. Nessi. Fabiola Gianotti, RRB, 12/10/2009, CERN-RRB-2009-103.
  • Collaboration, management, organization
  • Status of Computing and Software
  • Detector commissioning with cosmics data
  • Preparing for physics with early LHC data
  Fabiola Gianotti, ATLAS RRB, 12-10-2009

  2. Collaboration, Management, Organization

  3. 2885 Scientific participants (1835 contribute to M&O share) 1050 Students 172 Institutions 37 Countries Albany, Alberta, NIKHEF Amsterdam, Ankara, LAPP Annecy, Argonne NL, Arizona, UT Arlington, Athens, NTU Athens, Baku, IFAE Barcelona, Belgrade, Bergen, Berkeley LBL and UC, HU Berlin, Bern, Birmingham, UAN Bogota, Bologna, Bonn, Boston, Brandeis, Brasil Cluster, Bratislava/SAS Kosice, Brookhaven NL, Buenos Aires, Bucharest, Cambridge, Carleton, CERN, Chinese Cluster, Chicago, Chile, Clermont-Ferrand, Columbia, NBI Copenhagen, Cosenza, AGH UST Cracow, IFJ PAN Cracow, SMU Dallas, UT Dallas, DESY, Dortmund, TU Dresden, JINR Dubna, Duke, Edinburgh, Frascati, Freiburg, Geneva, Genoa, Giessen, Glasgow, Göttingen, LPSC Grenoble, Technion Haifa, Hampton, Harvard, Heidelberg, Hiroshima IT, Indiana, Innsbruck, Iowa SU, Iowa, UC Irvine, Istanbul Bogazici, KEK, Kobe, Kyoto, Kyoto UE, Lancaster, UN La Plata, Lecce, Lisbon LIP, Liverpool, Ljubljana, QMW London, RHBNC London, UC London, Lund, UA Madrid, Mainz, Manchester, CPPM Marseille, Massachusetts, MIT, Melbourne, Michigan, Michigan SU, Milano, Minsk NAS, Minsk NCPHEP, Montreal, McGill Montreal, RUPHE Morocco, FIAN Moscow, ITEP Moscow, MEPhI Moscow, MSU Moscow, Munich LMU, MPI Munich, Nagasaki IAS, Nagoya, Naples, New Mexico, New York, Nijmegen, BINP Novosibirsk, Ohio SU, Okayama, Oklahoma, Oklahoma SU, Olomouc, Oregon, LAL Orsay, Osaka, Oslo, Oxford, Paris VI and VII, Pavia, Pennsylvania, Pisa, Pittsburgh, CAS Prague, CU Prague, TU Prague, IHEP Protvino, Regina, Rome I, Rome II, Rome III, Rutherford Appleton Laboratory, DAPNIA Saclay, Santa Cruz UC, Sheffield, Shinshu, Siegen, Simon Fraser Burnaby, SLAC, NPI Petersburg, Stockholm, KTH Stockholm, Stony Brook, Sydney, Sussex, AS Taipei, Tbilisi, Tel Aviv, Thessaloniki, Tokyo ICEPP, Tokyo MU, Tokyo Tech, Toronto, TRIUMF, Tsukuba, Tufts, Udine/ICTP, Uppsala, UI Urbana, Valencia, UBC Vancouver, Victoria, Waseda, Washington, Weizmann Rehovot, FH Wiener Neustadt, Wisconsin, 
Wuppertal, Würzburg, Yale, Yerevan


  5. Collaboration composition since the last RRB
  At its Collaboration Board (CB) meeting on 9 October 2009, the Collaboration took note of the withdrawal of the following two Japanese Institutions:
  • Ritsumeikan University: due to the completion of their expected contribution (G4 development)
  • Hiroshima University: due to the retirement of the remaining active member
  At the same CB, the Collaboration unanimously admitted five new Institutions (Expressions of Interest had been presented at the July CB):
  • Tokyo Institute of Technology (Tokyo Tech), Japan [Inner detector, trigger; upgrade]
  • Waseda University, Japan [SCT detector; upgrade]
  • School of Physics and Astronomy, University of Edinburgh, UK [Core software, Distributed Data Management; upgrade]
  • University of Sussex, Brighton, UK [Trigger, core software; upgrade]
  • University of Iowa, USA [Pixel detector; upgrade]
  Members of the above Institutions have been active in ATLAS for some time through affiliation to other Institutions, contributing successfully to technical high-priority tasks for the experiment.
  The RRB is kindly requested to endorse the admission of these five new Institutions into the ATLAS Collaboration. The total number of Institutions (with voting rights in the CB) increases from 169 to 172.

  6. ATLAS Organization, October 2009. Collaboration Board (Chair: K. Jon-And; Deputy: G. Herten); ATLAS Plenary Meeting; Resources Review Board; CB Chair Advisory Group; Spokesperson (F. Gianotti; Deputies: D. Charlton and A. Lankford); Technical Coordinator (M. Nessi); Resources Coordinator (M. Nordberg). Executive Board: Inner Detector (P. Wells), LAr Calorimeter (I. Wingerter-Seez), Tile Calorimeter (A. Henriques), Muon Instrumentation (L. Pontecorvo), Trigger/DAQ (C. Bee), Run Coordinator (C. Clément), Trigger Coordination (T. Wengler), Computing Coordination (D. Barberis), Data Prep. Coordination (A. Hoecker), Physics Coordination (T. LeCompte), Upgrade SG Coordinator (N. Hessey), PubComm Chair (J. Pilcher), Additional Members (T. Kobayashi, M. Tuts, A. Zaitsev). [Chart label: New since April RRB]

  7. Operation Task sharing (framework approved by CB in 2007)
  • ATLAS operation, from detector to data preparation and world-wide computing, requires 600-700 FTE (note: physics analysis is not an Operation Task)
  • Shared in a fair way across Institutions, proportional to the number of authors: students get favorable treatment, as they are weighted 0.75; new Institutions must contribute more in the first two years (weight factors 1.5 and 1.25)
  • ~60% of these FTE tasks are CERN-based; efforts to reduce this fraction with time
  • ~12% are shifts in the Control Room or on-call; the plan is to increase remote monitoring with time
  • Allocation is made in two steps: shifts are distributed first, then other tasks
  • FTE requirements and FA contributions are reviewed and updated yearly
  [Plot: distribution per Funding Agency in 2008; 100% means covered = expected share. For illustration only: 2008 was the first exercise of the system; the plot demonstrates that the tracking tools are in place]
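The fair-share arithmetic above can be sketched in a few lines. The weights (0.75 for students, 1.5 and 1.25 for a new Institution's first two years) are taken from the slide; the function names, the way the two weights are combined, and the collaboration-wide totals are illustrative assumptions, not the actual ATLAS accounting.

```python
# Toy sketch of the Operation Task (OT) fair-share weighting described above.
# Assumption: the student and new-institution factors multiply; the slide does
# not specify how they combine.

def author_weight(is_student, institution_age_years):
    """Weight of one author in the fair-share calculation."""
    w = 0.75 if is_student else 1.0
    if institution_age_years == 1:
        w *= 1.5   # new institutions contribute more in year 1
    elif institution_age_years == 2:
        w *= 1.25  # ... and somewhat more in year 2
    return w

def expected_fte_share(members, total_fte=650.0, total_weight_all=2000.0):
    """Expected OT share (in FTE) of one institution.

    members: list of (is_student, institution_age_years) tuples
    total_fte: overall ATLAS OT requirement (600-700 FTE on the slide)
    total_weight_all: summed weight over the whole collaboration (assumed)
    """
    w = sum(author_weight(s, a) for s, a in members)
    return total_fte * w / total_weight_all

# Example: an established institution with 4 staff authors and 2 students
share = expected_fte_share([(False, 10)] * 4 + [(True, 10)] * 2)
```

The proportionality to weighted author count is the point: shifts and other tasks are then drawn against this expected share.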

  8. Status of Computing and Software

  9. Computing infrastructure and operation. ATLAS WLCG (Worldwide LHC Computing Grid) computing: ~70 sites, including the CERN Tier-0, 10 Tier-1s and ~40 Tier-2 federations.

  10. [Computing Model diagram: raw data → first-pass reconstruction at Tier-0 → Event Summary Data exported; reprocessing at Tier-1s; simulation at Tier-1s/2s; interactive physics analysis of analysis objects (extracted by physics topic) at Tier-2s]
  Four main operations in the ATLAS Computing Model:
  • First-pass reconstruction of detector data at Tier-0 and data export to Tier-1s/Tier-2s
  • Data re-processing at Tier-1s using updated calibrations
  • Simulation of Monte Carlo samples at Tier-1s and Tier-2s
  • (Distributed) physics analysis at Tier-2s and at more local facilities (Tier-3s)
  The actual Computing Model is much more complex: it includes data organization, placement and deletion strategy, disk space organization, database replication, bookkeeping, etc. The challenging operations (e.g. ~50 PB of data to be moved across the world every year, 10^9 raw events per year to be processed and reprocessed, …) and the Computing Model have been stress-tested and refined over the last years through functional tests and data challenges of increasing functionality, size and realism. ATLAS participated in the STEP09 challenge in June together with the other LHC experiments.

  11. [Plot: number of production jobs, July 2008 - July 2009] Production of MC samples for simulation studies: >30k jobs/day achieved. [Plot: throughput in MB/s vs days, ATLAS, STEP09, June 2009] Data transfer Tier-0 → Tier-1s and Tier-1s → Tier-1s: a 4 GB/s peak rate (mixture of cosmics and simulated data), higher than the nominal LHC rate (1-2 GB/s), sustained over 2 weeks.

  12. Offline Software
  • Great maturity and realism achieved after years of development, optimization and tests with a huge variety of simulated physics samples, test-beam data and cosmics data. In the last year: 1 billion events (cosmics, simulation) reconstructed.
  • Constant effort to improve technical performance (reducing memory, CPU and event size), as it has an impact on computing resources. Significant further optimization is expected after looking at first LHC data.
  • The software release for data taking is being fine-tuned; the strategy to propagate experience from early data quickly into reconstruction and simulation is being finalized.
  [Plot: G4 simulation today; new G4 physics list best reproducing our combined test-beam data]

  13. Software infrastructure, simulation and reconstruction have been extensively exposed to, and optimized with, real cosmics data (detector imperfections, etc.). [Plot: CPU per event in the Spring 2009 reprocessing of cosmics data, compared to the Computing Model target; in some cases, asynchronous cosmics events require more CPU than collision data.] Exercising τ-lepton reconstruction algorithms in cosmic-ray muon events: a τ-identification variable based on the shape of the energy deposition in the calorimeters, ET(ΔR<0.1)/ET(ΔR<0.4), to be used at the LHC to distinguish τ's (narrow) from QCD jets (broader).
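The τ-identification variable quoted above, ET(ΔR<0.1)/ET(ΔR<0.4), can be sketched as a cone-energy ratio. The cell format and the example numbers below are illustrative assumptions, not ATLAS reconstruction code; only the ratio definition comes from the slide.

```python
import math

# Sketch of the calorimeter-based tau-identification variable: the fraction
# of transverse energy within a narrow cone (dR < 0.1) of the cluster axis
# relative to a wider cone (dR < 0.4). Hadronic taus are narrow (ratio close
# to 1), QCD jets are broader (smaller ratio).

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance in eta-phi space, with the phi difference wrapped."""
    dphi = (phi1 - phi2 + math.pi) % (2.0 * math.pi) - math.pi
    return math.hypot(eta1 - eta2, dphi)

def core_energy_fraction(cells, axis_eta, axis_phi):
    """ET(dR<0.1)/ET(dR<0.4) for calorimeter cells given as (et, eta, phi)."""
    et_narrow = sum(et for et, eta, phi in cells
                    if delta_r(eta, phi, axis_eta, axis_phi) < 0.1)
    et_wide = sum(et for et, eta, phi in cells
                  if delta_r(eta, phi, axis_eta, axis_phi) < 0.4)
    return et_narrow / et_wide if et_wide > 0.0 else 0.0

# A tau-like (narrow) deposit: nearly all ET within dR < 0.1 of the axis
tau_like = [(20.0, 0.01, 0.01), (5.0, 0.05, -0.03), (0.5, 0.30, 0.20)]
ratio = core_energy_fraction(tau_like, 0.0, 0.0)  # close to 1
```

A broad QCD-jet-like deposit would spread its cells out to ΔR ~ 0.4 and give a visibly smaller ratio, which is what the cut exploits.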

  14. Detector commissioning and preparation for physics with cosmics data
  • Cosmics data samples with the full detector operational: ~300M events in the September-October 2008 runs; ~100M events in June-July 2009
  • This morning: global cosmics data-taking with the full detector restarted; it will run non-stop through first beams (mid November)
  • O(300) plots approved for talks at conferences; plan to publish ~8 papers before the end of the year
  • The achieved level of detector understanding (performance, alignment, calibration) is far better than expectations for this stage of the experiment

  15. Here: only a few examples from a huge amount of results …

  16. A cosmic muon traversing the whole detector, recorded on 18/10/2008. Rate of cosmics events in ATLAS: 1-700 Hz, depending on the sub-detector size and location.

  17. Level-1 muon trigger: η-φ distribution of the cosmic muon signals recorded by the RPC trigger chambers (RPC hit maps). 8/10/2008: coverage ~70%. After consolidation and repair work, 12/09/2009: coverage >97%.

  18. ATLAS High-Level-Trigger (HLT) farm: 850 PCs (CPU: 2 x quad-core) installed = 35% of the final system. [Plot: each one of these unreadable lines is a chain exercised in the HLT during cosmics data-taking]

  19. Inner Detector
  • Silicon detectors (Pixels, SCT):
  -- achieved alignment precision with cosmics: ~20 μm (ultimate goal 5-10 μm)
  -- alignment stability Oct 2008 - Jun 2009: a few microns
  -- layer hit efficiency: >99.8% in most cases [Plot: SCT hit efficiency vs layer from cosmics tracks, 99-99.8%]
  -- noise occupancy: 10^-10 (Pixels), <10^-4 (SCT)
  [Event display: cosmic-ray shower recorded in the barrel TRT; bubble-chamber-quality tracking plus transition radiation (red points) for particles with high Lorentz γ-factor (typically electrons)]

  20. Momentum resolution with the full Inner Detector (Pixels, SCT, TRT): cosmic muon tracks are split in the center and refit separately → resolutions can be measured directly from data. Note: the 2009 data use the 2008 alignment constants → excellent detector stability.
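The split-track method described above can be sketched with a toy model: each half of the cosmic track gives an independent measurement, so the spread of the upper-lower difference is √2 times the single-track resolution. The Gaussian smearing model and all numbers below are assumptions for illustration, not ATLAS tracking code.

```python
import math
import random

# Toy sketch of the split-track resolution measurement: a cosmic muon
# crossing the whole Inner Detector is split at the perigee into an upper
# and a lower leg, each refit independently; the RMS of the difference of
# the two q/p measurements, divided by sqrt(2), estimates the per-track
# resolution directly from data.

def measure_resolution(true_qoverp, sigma, n=20000, seed=1):
    """Estimate the per-leg q/p resolution from split-track differences."""
    rng = random.Random(seed)
    diffs = [(true_qoverp + rng.gauss(0.0, sigma))
             - (true_qoverp + rng.gauss(0.0, sigma)) for _ in range(n)]
    mean = sum(diffs) / n
    rms = math.sqrt(sum((d - mean) ** 2 for d in diffs) / n)
    return rms / math.sqrt(2.0)  # undo the sqrt(2) from differencing

# With a true smearing of 0.005, the method should recover ~0.005
est = measure_resolution(true_qoverp=0.1, sigma=0.005)
```

The attraction of the method, as the slide notes, is that no Monte Carlo resolution model is needed: both "measurements" come from the same real particle.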

  21. Electromagnetic calorimeter. Barrel uniformity (layer 2): ~1% (limited by statistics), from the fractional difference of the muon signal between data and MC, cell by cell. Energy resolution of the level-1 calorimeter trigger from cosmics muons: the level-1 trigger readout resolution is ~1.5 times worse than the calorimeter readout resolution → good enough to measure energy (with coarse granularity) in regions with dead optical transmitters (OTx) in the calorimeter readout.

  22. Tile hadron calorimeter. Cell by cell: the difference between timing with single beams and with cosmics agrees to better than 2 ns (ATLAS preliminary). Muon signal compared to noise. Radiography of Tile cells (layer 1): muon tracks are extrapolated from the ID and E is measured in the Tile cell around the impact point; cell boundaries are visible.

  23. Muon Spectrometer (cosmics data, field off). To observe a new heavy resonance X → μμ as a "narrow" peak above background, the ATLAS Muon Spectrometer must achieve: E ~ 1 TeV → sagitta ~500 μm; σ(s)/p ~ 10% → sagitta measured to ~50 μm → alignment accuracy of ~30 μm. [Plots: sagitta residuals before alignment, with optical alignment, and after alignment with tracks; σ(s)/p < 10% for E ~ 1 TeV]
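The sagitta numbers above follow from the standard formula s = 0.3·B·L²/(8·p), with s in metres, B in tesla, L in metres and p in GeV. The field integral and lever arm used below are representative values for an air-core toroid, chosen here as illustrative assumptions to reproduce the slide's order of magnitude.

```python
# Back-of-the-envelope check of the sagitta quoted above (~500 um at 1 TeV).
# B = 0.5 T and L = 5 m are assumed representative values, not ATLAS constants.

def sagitta_um(p_gev, b_tesla=0.5, lever_arm_m=5.0):
    """Track sagitta in micrometres for a particle of momentum p_gev (GeV)."""
    s_m = 0.3 * b_tesla * lever_arm_m ** 2 / (8.0 * p_gev)
    return s_m * 1e6

s = sagitta_um(1000.0)  # a few hundred micrometres at 1 TeV

# A ~10% momentum (sagitta) measurement at 1 TeV then requires measuring
# the sagitta to roughly a tenth of its value, i.e. a few tens of
# micrometres, hence the ~30 um chamber-alignment goal on the slide.
required_sagitta_precision_um = 0.10 * s
```

With these inputs the sagitta comes out near 470 μm, consistent with the "~500 μm" on the slide, and 10% of it is ~47 μm, matching the quoted ~50 μm.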

  24. First electrons observed in ATLAS: from ionization by cosmic muons. Event display of one of the electron candidates, recorded on 28/9/2008 (muon chambers, silicon, TRT, calorimeter cells/clusters).

  25. Electrons produced by ionization (δ-rays)
  • Events selected by requiring two reconstructed tracks
  • Loose association required: track - EM cluster with E > 3 GeV
  • Discrimination based mainly on the TRT signal and the E(calorimeter)/p(tracker) ratio
  • 32 out of 36 electron candidates in the "signal" region (events with 2 tracks and TRT ratio > 0.8) have negative charge
  [Plot: shower shape of the 32 e- candidates in the first compartment of the EM calorimeter, data vs simulation]
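The selection summarized above can be sketched as a chain of cuts. The track-record layout, the E/p window, and treating the three requirements as a simple AND are illustrative assumptions; only the E > 3 GeV cluster match, the TRT ratio > 0.8 signal region, and the use of E/p come from the slide.

```python
# Sketch of the delta-ray electron-candidate selection described above.
# Each track is a dict; the field names and the E/p window are hypothetical.

def is_electron_candidate(track, ep_window=(0.8, 1.3)):
    """track: dict with cluster_e (GeV), p (GeV), trt_ratio, charge."""
    if track["cluster_e"] <= 3.0:        # loose track-to-EM-cluster match
        return False
    if track["trt_ratio"] <= 0.8:        # transition-radiation signal region
        return False
    ep = track["cluster_e"] / track["p"]  # E(calorimeter)/p(tracker)
    return ep_window[0] < ep < ep_window[1]

# A plausible candidate: matched cluster, strong TRT signal, E/p near 1
cand = {"cluster_e": 5.0, "p": 4.8, "trt_ratio": 0.9, "charge": -1}
passes = is_electron_candidate(cand)
```

The charge is carried along but not cut on: the 32/36 negative-charge count is the physics result (δ-ray electrons are negative), not a selection criterion.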

  26. Preparing for physics with early LHC data

  27. End-to-end "Analysis Readiness Walkthroughs" (ARW). Goal: be ready to analyze early collision data fast and efficiently.
  • Consider basic analyses to be made with first data: J/ψ and Υ → μμ; minimum-bias; jets; high-pT leptons (W, Z, … later top), photons, etc.
  • For each analysis:
  -- prioritize goals for the Winter 2010 conferences: define a realistic set of results, while leaving room for more ambitious goals/ideas if there are enough time and people
  -- walk through all steps [detector, trigger, calibration, data-quality, reconstruction, MC simulation, … final plot] needed to achieve the planned physics results
  -- make sure all steps are addressed in the right order, are covered by enough people, and that links and interfaces between steps are in place ("vertical integration")
  • The above information is prepared, for each analysis, by a team of ~5 people (from detectors, trigger, data quality, Combined Performance and Physics groups) and presented at dedicated open meetings (1 day per analysis).
  • A "review" panel (with experts from the various experiment activities) makes recommendations and suggestions for improvements.
  • The first two reviews (J/ψ, Υ → μμ; minimum-bias) are completed: very useful, follow-up foreseen. Next to come: electrons, jets.

  28. Analysis roadmap from the J/ψ and Υ → μμ walkthrough

  29. Example of "check-list" from the J/ψ and Υ → μμ walkthrough

  30. Analysis timeline strategy from the J/ψ and Υ → μμ walkthrough


  32. First data samples … [Table: expected number of events in ATLAS for 100 pb^-1 after cuts (preliminary estimate, fast simulation): J/ψ → μμ, W → μν, Z → μμ, tt → μν+X, tt → μν+X inside peak (strong cuts)]
  At √s = 7 TeV with 200-300 pb^-1, the LHC has discovery potential beyond the Tevatron reach for some scenarios (~1 TeV Z′, ~450 GeV SUSY, …) → discoveries are possible in 2010!
  Goals for 2009-2010:
  1) Detector calibration and commissioning with physics processes: minimum-bias, Z → ll, …
  2) "Rediscover" and measure the Standard Model at √s ≥ 7 TeV: W, Z, tt, QCD jets, …
  3) Early discoveries? Potentially accessible: SUSY, Z′, … surprises?

  33. One example of a possible discovery in 2010: Z′ → ll, mass ~ 1 TeV
  • The signal is a (narrow) mass peak above a small and smooth SM background
  • Does not require ultimate EM calorimeter performance
  • Discovery beyond the Tevatron exclusion reach (m ~ 1 TeV) is possible with 200 pb^-1 and √s ≥ 7 TeV → perhaps sometime in 2010?
  Is this a manifestation of new forces or new dimensions? From the angular distribution of the leptons one can disentangle a Z′ (spin 1) from a graviton G (spin 2); this requires more data. [Plot: 14 TeV, Z′χ → ee, Sequential SM]

  34. Conclusions

  35. ATLAS [detector, trigger and data acquisition, data quality, calibration and alignment, data processing and world-wide distribution] is ready for LHC collision data.
  During the Winter 2008-2009 shut-down, all components of the experiment were improved and consolidated. The fraction of non-working channels is at the permille-to-percent level in most cases. The main concern for the first (long) run is the performance and reliability of some components: Inner Detector cooling, CSC ROD, liquid-argon opto-transmitters and LVPS.
  About 400M cosmics events, as well as single-beam data in September 2008, were collected successfully in 2008-2009 with the full detector operational. These data demonstrate better detector performance than expected at this stage. Calibration and alignment accuracies good enough for first physics have already been achieved.
  Software and Computing have been exercised with massive simulations as well as real detector and real (cosmics) data, and have been confronted with the complexity of a world-wide distributed system. The trigger strategy and menus for early data, as well as the preparation for physics, are being finalized.
  The project proceeded within the framework of the accepted 2002 Completion Plan. All resources requested in that framework are needed to cover the costs of the initial detector now installed. A coherent plan for detector consolidation, repairs and upgrade from now to 2015 ("Full Design Luminosity detector") will be presented at the RRB in 2010.

  36. We are "even more ready" to take and analyse data than last year. And we hope that the exciting ATLAS physics program, demonstrated over the years by increasingly realistic detector simulations, will reward 20 years of effort by the international community (R&D, design, prototyping, construction, accurate quality controls and certification, test beams, installation, commissioning, …) to build an experiment of unprecedented technology, complexity and performance. ATLAS is very grateful to all Funding Agencies for their huge contributions to the success of the experiment and their continuous support during more than 15 years.

  37. Back-up

  38. [Backup plot: MS ~60%, NMS ~40%, approximately constant with time]

  39. ATLAS detector overview:
  • Inner Detector (|η|<2.5, B = 2 T): Si pixels and strips (SCT) + Transition Radiation straws. Precise tracking and vertexing, e/π separation (TRT). Momentum resolution: σ/pT ~ 3.4x10^-4 pT (GeV) ⊕ 0.015
  • EM calorimeter: Pb-LAr accordion. e/γ trigger, identification and measurement. E-resolution: ~1% at 100 GeV, 0.5% at 1 TeV
  • HAD calorimetry (|η|<5): segmentation, hermeticity. Tilecal Fe/scintillator (central), Cu/W-LAr (forward). Trigger and measurement of jets and missing ET. E-resolution: σ/E ~ 50%/√E ⊕ 0.03
  • Muon Spectrometer (|η|<2.7): air-core toroids with gas-based chambers. Muon trigger and measurement with momentum resolution < 10% up to E ~ 1 TeV
  Length ~46 m, radius ~12 m, weight ~7000 tons, ~10^8 electronic channels. 3-level trigger reducing the rate from 40 MHz to ~200 Hz.
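The resolution parameterizations above combine their two terms in quadrature (the "⊕"). A minimal sketch evaluating them numerically, using only the numbers quoted on the slide:

```python
import math

# Numerical reading of the resolution formulas above; the two terms of each
# parameterization are added in quadrature, as the "(+)" symbol denotes.

def id_pt_resolution(pt_gev):
    """Inner Detector: sigma/pT ~ 3.4e-4 * pT (GeV) (+) 0.015."""
    return math.hypot(3.4e-4 * pt_gev, 0.015)

def had_e_resolution(e_gev):
    """Hadronic calorimetry: sigma/E ~ 50%/sqrt(E) (+) 0.03."""
    return math.hypot(0.50 / math.sqrt(e_gev), 0.03)

# At 100 GeV the ID curvature term (3.4%) already exceeds the 1.5%
# constant term, while the hadronic stochastic term (5%) still dominates
# over its 3% constant term.
id_frac = id_pt_resolution(100.0)
had_frac = had_e_resolution(100.0)
```

This illustrates the usual crossover behavior: tracker resolution degrades linearly with pT, while calorimeter resolution improves with energy until the constant term takes over.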

  40. Forward detectors: LUCID at 17 m, ZDC at 140 m, ALFA at 240 m.
  • LUCID: Luminosity Cerenkov Integrating Detector (Phase 1 operational since 2008)
  • ZDC: Zero Degree Calorimeter (data taking in 2009)
  • ALFA: Absolute Luminosity For ATLAS (installation in 2010)
  LoI for Forward Proton detectors at 220 and 420 m (AFP): ongoing ATLAS review.

  41. Level-1 Trigger System: three major systems, the Calorimeter Trigger, the Muon Trigger and the Central Trigger Processor (CTP). Other triggers and signals are also integrated by the CTP: minimum-bias and luminosity triggers, beam pick-up. The CTP distributes all timing information.

  42. Dataflow view of the TDAQ (DAQ) infrastructure, from underground to surface: Detectors → Readout Drivers → dedicated optical links → Readout System nodes (Readout Buffers) → Data Network (Data Flow Manager) → Sub-Farm Input → Sub-Farm Output → permanent storage.
