
Commissioning of the ATLAS High Level Trigger


Presentation Transcript


  1. John Baines Commissioning of the ATLAS High Level Trigger

  2. Overview of Talk • ATLAS • LHC Parameters • The ATLAS Trigger • UK & RAL involvement • Commissioning • System Tests • Single Beam • Cosmics • Successes & Lessons Learned • Commissioning in 2009 • Summary. Material taken from conference talks by: S. Farrington, C. Padilla, R. Hauser, F. Winklmeier, W. Wiedenmann, R. Goncalo, A. Ventura

  3. The ATLAS Detector

  4. LHC Parameters Parameters at full luminosity (L = 10^34 cm^-2 s^-1): • Bunch crossing interval: 25 ns (40 MHz) • No. of overlapping events: 23 => interaction rate ~1 GHz • Average no. of particles: 1400 • About 10^8 channels to read out => event size: 1.5 MByte (larger during special runs: >15 MByte) • Tier-0/Reconstruction/Grid/Storage: output limit about 200 Hz / 300 MByte/s • Example signal & background rates: 100 GeV Higgs ~0.1 Hz • SUSY <1 Hz • W ~500 kHz • Z ~80 kHz • Background: inelastic ~1 GHz • jets >1 kHz
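
The quoted "~1 GHz" and the 300 MByte/s output limit follow directly from these numbers; a quick sanity check (Python; all inputs are the values on this slide):

```python
# Back-of-envelope check of the rates quoted above.
bunch_crossing_rate = 40e6   # Hz (25 ns bunch spacing)
overlapping_events = 23      # pile-up interactions per crossing
event_size = 1.5e6           # bytes, nominal event size
output_rate = 200            # Hz, Tier-0 output limit

interaction_rate = bunch_crossing_rate * overlapping_events
print(f"interaction rate ~ {interaction_rate / 1e9:.1f} GHz")  # ~0.9 GHz, i.e. '~1 GHz'

bandwidth = output_rate * event_size
print(f"output bandwidth ~ {bandwidth / 1e6:.0f} MByte/s")     # 300 MByte/s
```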

  5. Trigger Architecture • Level 1 (40 MHz -> 75 kHz, latency 2.5 μs): hardware trigger (FPGA, ASIC) using Calorimeter or Muon information (or TRT fast-OR); identifies Regions of Interest for the HLT • Level 2 (75 kHz -> 2-3 kHz, ~40 ms/event on a 2 GHz CPU): software trigger on commodity PCs; seeded by L1 RoIs; full detector granularity; requests data in the RoI from the Read Out Buffers • Event Filter (2-3 kHz -> 200 Hz, ~4 s/event on a 2 GHz CPU): software trigger on commodity PCs; seeded by L1 & L2; has access to the entire event
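
The architecture is a chain of successively slower but more selective filters, each running only on what the previous level accepted. A minimal sketch in Python (event fields and thresholds are invented; purely illustrative, not ATLAS code):

```python
# Three-level trigger: each level sees only events the previous level kept,
# so the slow stages run at a tiny fraction of the input rate.

def level1(event):
    # Hardware (FPGA/ASIC) in reality; coarse calorimeter/muon information.
    return event["l1_em_et"] > 10.0         # 40 MHz -> 75 kHz

def level2(event):
    # Software; fast algorithms on RoI data requested over the network.
    return event["l2_track_pt"] > 5.0       # 75 kHz -> 2-3 kHz

def event_filter(event):
    # Software; offline-quality algorithms with access to the full event.
    return event["ef_match_quality"] > 0.9  # 2-3 kHz -> 200 Hz

def trigger(event):
    # `and` short-circuits: the first level to reject stops the chain,
    # so later (slower) levels never run on rejected events.
    return level1(event) and level2(event) and event_filter(event)

event = {"l1_em_et": 12.0, "l2_track_pt": 6.5, "ef_match_quality": 0.95}
print(trigger(event))  # True
```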

  6. ATLAS Trigger & DataFlow [diagram of the trigger and dataflow system: Level 2 ~40 ms, Event Filter ~4 s]

  7. ATLAS UK HLT • Institutes: Manchester, Oxford, Royal Holloway, RAL, UCL • RAL: Fred Wickens, Monika Wielers, Dmitry Emeliyanov, Julie Kirk, Bill Scott, John Baines; Student: Rudi Apolle • Areas: Trigger Selection Software, Inner Detector Trigger, Electron/photon Trigger, B-Physics Trigger, Trigger Release Coordination, Trigger Validation, Trigger Hardware & Farms

  8. Level-1 3 sub-systems: • L1-Calorimeters • L1-Muons • Central Trigger Processor (CTP) Signature identification: • e/γ, τ/h, jets, μ • Multiplicities per pT threshold • Isolation criteria • Missing ET, total ET, jet ET CTP: • Receives and synchronizes trigger information • Generates the Level-1 trigger decision (L1A) • Delivers the L1A to the other subdetectors • Sends the Regions of Interest to the Level 2 trigger
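
A toy version of the CTP decision (item names and thresholds are invented for illustration): the Level-1 Accept is an OR of menu items, each a multiplicity requirement at a pT threshold or a cut on a global quantity:

```python
# Hypothetical Level-1 items: each is a multiplicity cut at a pT threshold
# (or a global-quantity cut such as missing ET). Names are made up.
L1_ITEMS = {
    "L1_EM7":  lambda c: c["em_clusters_above_7GeV"] >= 1,
    "L1_2MU6": lambda c: c["muons_above_6GeV"] >= 2,
    "L1_XE30": lambda c: c["missing_et_GeV"] > 30.0,
}

def level1_accept(counts):
    """The L1A is the OR of all (unprescaled) items."""
    return any(passes(counts) for passes in L1_ITEMS.values())

counts = {"em_clusters_above_7GeV": 1, "muons_above_6GeV": 0, "missing_et_GeV": 12.0}
print(level1_accept(counts))  # True: L1_EM7 fires
```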

  9. The HLT Farm Ultimately: 2300 processors (L2+EF) Now: ~1600 processors

  10. Multi-core processors • Resource requirements are multiplied by the number of process instances: • Memory: ~1-1.5 GByte/application • file descriptors • network sockets • Number of controlled applications: ~7k presently, ~20k in the final system
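
The scaling is easy to see with a rough estimate (Python; the application counts are from this slide, and treating every application as needing the full per-application memory is an upper-bound assumption):

```python
# Rough farm-wide memory estimate from per-process requirements.
apps_now, apps_final = 7_000, 20_000
mem_per_app_gb = 1.5  # upper end of the ~1-1.5 GByte/application figure

for label, n_apps in [("present", apps_now), ("final", apps_final)]:
    total_tb = n_apps * mem_per_app_gb / 1024
    print(f"{label}: {n_apps} applications -> ~{total_tb:.0f} TB aggregate memory")
```

File descriptors and network sockets scale the same way, which is why the process count itself becomes a resource to manage on multi-core nodes.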

  11. HLT Framework • Level-2 • HLT selection software runs in the Level-2 Processing Unit (L2PU) • Selection algorithms run in a worker thread • Event Filter (3 kHz -> 200 Hz) • Independent Processing Tasks (PTs) run the selection software on Event Filter (EF) farm nodes • The HLT Event Selection Software is based on the ATLAS Athena offline framework • The HLT framework interfaces the HLT event selection algorithms to the online system • Driven by the run control and data flow software • Event loop managed by the data flow software • Allows HLT algorithms to run unchanged in the trigger and offline environments

  12. HLT Selection Software • LVL2: reduces the rate from up to 75 kHz to 2-3 kHz in an average of 40 ms • Custom algorithms with some offline components • EF: reduces the rate from 2-3 kHz to 200-300 Hz in an average of 4 s • Offline algorithms run from HLT-specific wrappers • HLT: • Processing in Regions of Interest: only ~a few % of the event is processed • At LVL2, data for those few % of the event are requested over the network • Early rejection: stepwise processing to minimize the execution time for rejected events

  13. RoI-based, stepwise processing: e/γ example • A Level-1 Region of Interest (EMROI) is found and its position in the EM calorimeter is passed to Level 2 • Level 2 is seeded by Level 1: fast reconstruction algorithms, reconstruction within the RoI • L2 steps: L2 calorimetry (cluster?), L2 tracking (track?), cluster-track match (match?) • The Event Filter is seeded by Level 2: offline reconstruction algorithms, refined alignment and calibration • EF steps: EF calorimetry, EF tracking (track?), e/γ reconstruction (e/γ OK?) • Event rejection is possible at each step
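
A minimal sketch of this stepwise chain (step names are invented stand-ins for the real algorithms): the chain stops at the first step that fails, which is what keeps the average execution time low for rejected events:

```python
# Stepwise e/gamma chain: run the cheap steps first and stop at the first
# failure, so expensive steps only run on events that are still alive.
L2_STEPS = ["l2_calo_cluster", "l2_track", "l2_cluster_track_match"]
EF_STEPS = ["ef_calo", "ef_track", "ef_egamma_reco"]

def run_chain(roi, steps):
    for step in steps:
        if not roi.get(step, False):  # stand-in for running the algorithm
            return False              # early rejection: later steps never run
    return True

def egamma_chain(roi):
    # The EF re-runs offline-quality algorithms only on L2-accepted RoIs.
    return run_chain(roi, L2_STEPS) and run_chain(roi, EF_STEPS)

roi = {step: True for step in L2_STEPS + EF_STEPS}
print(egamma_chain(roi))  # True
```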

  14. Trigger Menus • The Trigger Menu defines chains of processing steps starting from the LVL1 RoIs • The menu is specified in terms of signatures, e.g. mu6, e10, 2j40_xe30, etc. • Chains can be prescaled at Level-1 or in the HLT • Signatures are assigned to inclusive data streams: egamma, jetTauEtmiss, muons, minbias, LAr and express [table: example electron signatures]
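
A hypothetical, much-simplified menu fragment in the spirit of this slide (all L1 seed names and prescale values are invented): each signature names its seed, a prescale, and the stream it feeds; because the streams are inclusive, one event can enter several of them:

```python
# Toy trigger menu: signature -> (L1 seed, prescale, stream). Illustrative only.
MENU = {
    "e10":       {"l1_seed": "EM7",       "prescale": 1,   "stream": "egamma"},
    "mu6":       {"l1_seed": "MU6",       "prescale": 1,   "stream": "muons"},
    "2j40_xe30": {"l1_seed": "2J20_XE20", "prescale": 1,   "stream": "jetTauEtmiss"},
    "mb_sp":     {"l1_seed": "MBTS_1",    "prescale": 100, "stream": "minbias"},
}

def streams_for(event, counters):
    """Return every stream whose chain fires after prescaling.
    A prescale of N keeps only every Nth candidate of that chain."""
    streams = set()
    for name, chain in MENU.items():
        if not event.get(name, False):  # did the chain's selection pass?
            continue
        counters[name] = counters.get(name, 0) + 1
        if counters[name] % chain["prescale"] == 0:
            streams.add(chain["stream"])
    return streams

counters = {}
print(streams_for({"e10": True, "mb_sp": True}, counters))  # {'egamma'}: mb_sp prescaled away
```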

  15. B-physics Triggers

  16. Trigger Rates & Streams

  17. Commissioning • System tests with simulated & previously recorded cosmic data • Download data to Read Out Buffers • Can test with collision events • Exercise system at max. LVL1 rate • Cosmic tests: • Individual detectors (“slice weeks”) • Combined runs => Expose algorithms to real detector noise, data errors etc. • Beam: • Single beam • Collisions

  18. System Tests with simulated data

  19. Single Beam • Single-beam configuration: protons circulating in the LHC at injection energy • On collision with a collimator, a spray of particles entered the detector • First single-beam event: 10:19, 10/9/2008 [event displays: online and offline]

  20. Level-1 Commissioning in Single Beam • Each trigger component needs to be synchronised with the beam pick-up [plots: trigger timing in bunch crossings relative to the beam pick-up]

  21. Commissioning with Cosmics

  22. Cosmic Event

  23. Differences in cosmic vs. beam running • No beam clock • The muon trigger chambers provide the timing • Phase issues in the read-out of the TRT (straw detector) & Muon Drift Tube chambers • No beam / no IP • Tracks are distributed over d0, z0 • L2 uses dedicated algorithms for fast muon reconstruction (in the MDTs) and fast inner-detector tracking optimized for trajectories pointing towards the beam line • Muons in the HLT • The r-z view could not be fully reconstructed at L2 because the algorithms are designed for pointing tracks and data requests are made in trigger towers pointing to the IP • Possible to relax the pointing requirements to study rejection/efficiency • Timing issues cause a percent-level loss • Tracking • Level-2 algorithms are optimized for tracks from the Interaction Point

  24. Calorimeter in e/γ & τ Triggers Study of the performance of the clustering algorithm in the tau trigger

  25. e/γ Example plot from e/γ FEX algorithms comparing L2 and EF: shower shape in the 2nd EM sampling, Rη = E(3×7)/E(7×7)
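
For concreteness, a small sketch of how such a shower-shape variable can be computed (an illustration, not the ATLAS implementation; it assumes the input is already a 7×7 energy grid in η×φ centred on the cluster):

```python
import numpy as np

def r_eta(cells: np.ndarray) -> float:
    """Reta = E(3x7) / E(7x7): the energy in the central 3x7 cells over the
    full 7x7 cells (eta x phi) of the 2nd EM sampling. `cells` is a 7x7
    array of cell energies centred on the hottest cell."""
    e_7x7 = cells.sum()
    e_3x7 = cells[2:5, :].sum()  # central 3 rows in eta, all 7 columns in phi
    return float(e_3x7 / e_7x7) if e_7x7 > 0 else 0.0

# A narrow (electron-like) shower concentrates its energy centrally -> Reta near 1.
cells = np.zeros((7, 7))
cells[3, 3] = 40.0
cells[2:5, 2:5] += 1.0
print(f"Reta = {r_eta(cells):.2f}")
```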

  26. Muon Trigger [plots: muon trigger resolutions, σ = 17 mrad and σ = 0.007]

  27. Muons in the Tile Calorimeter Δφ between the tile cluster and the ID track

  28. Commissioning the InDet trigger • Want to commission the LVL2 collision algorithms with cosmics • But the speed-optimisation of the Level-2 algorithms means they are inefficient for tracks more than ~5 mm from the nominal beam position • Three strategies: • Use only the small fraction of events that pass close to the I.P. • Loosen the cuts in the pattern recognition (not possible for all algorithms) • Shift the space points (next slide)

  29. Commissioning Level-2 tracking Add an initial step that applies a shift to all the space points, so that the track appears to come from the Interaction Point, as sketched below
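
A toy 2-D version of that shift (it assumes straight-line cosmic tracks and works on (x, y) points in the transverse plane; the real algorithm operates on detector space points):

```python
import numpy as np

def shift_points_to_beam_line(points: np.ndarray) -> np.ndarray:
    """Rigidly translate the space points of one cosmic track so the track
    passes through the nominal beam line (x = y = 0), letting IP-optimised
    pattern recognition run unchanged. `points` is an (N, 2) array of (x, y).
    Cosmics are steep in y, so fit x as a function of y: x = a*y + b."""
    a, b = np.polyfit(points[:, 1], points[:, 0], 1)
    norm = np.hypot(a, 1.0)
    d0 = b / norm                        # transverse impact parameter of the track
    normal = np.array([1.0, -a]) / norm  # unit vector perpendicular to the track
    return points - d0 * normal          # minimal rigid shift that zeroes d0

# Vertical track at x = 3 mm: after the shift it passes through the origin.
track = np.array([[3.0, y] for y in np.linspace(-100.0, 100.0, 5)])
print(shift_points_to_beam_line(track)[:, 0])  # all ~0.0
```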

  30. Level-2 ID Efficiency w.r.t. Tracks reconstructed offline

  31. Cosmics for ID alignment The HLT was used to select events passing through the ID; these were sent to the IDCosmic stream & used for offline alignment

  32. Commissioning with Cosmics • 216 million events • 453 TB of data • 400k files • several streams

  33. Data Streaming

  34. Online Handling of Time-Out Events • Time-out events go to the DEBUG stream • The events are re-processed and streamed as if they had been processed online; the only difference is the file name • The files are registered in the corresponding offline DB and processed normally, producing ESD, AOD, etc., but are kept separate, carrying the "recovered" tag
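
A sketch of the file-name bookkeeping implied above (the naming scheme here is invented; only the idea of a "recovered" tag comes from the slide):

```python
def recovered_filename(original: str) -> str:
    """Insert a 'recovered' tag before the extension, e.g.
    'data08.physics_egamma.RAW' -> 'data08.physics_egamma.recovered.RAW',
    so downstream processing works normally while the files stay identifiable."""
    stem, _, ext = original.rpartition(".")
    return f"{stem}.recovered.{ext}"

print(recovered_filename("data08.physics_egamma.RAW"))
```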

  35. Successes & Lessons learnt • Some highlights: • Trigger ready for First Beam • Single beam events triggered with LVL1 & HLT streaming based on Level-1 • HLT run offline on the CERN Analysis Farm • Trigger including HLT algorithms exercised in cosmic running • ~2 months running, 220 million events • incl. long runs of >2M events • Successfully streamed events incl. IDCosmic stream used for alignment. • Exercised processing of events from the Debug stream • Exercised procedures for evaluating new menus & code fixes on CAF prior to online deployment • Successfully exercised release management in data-taking conditions • deployed patch releases for P1 and HLT

  36. Successes & Lessons learned Improvements for 2009 running: • The ability to change LVL1 prescales during a run was invaluable => infrastructure put in place so that HLT prescales can also be updated during a run • A change of magnetic field required a menu change => algorithms are now able to configure the magnetic field automatically based on the magnet current • Problems with calculating online Level-2 & EF trigger rates • The old system was too susceptible to problems collecting information from the farm nodes • Improvements made in the rate calculation and in the collection of information from the nodes • Removal of detectors from the readout caused errors in the HLT => events in the debug stream • Algorithms now allowed to access a mask saying which detectors are in the run => modified error response • Problems with noisy detectors • Consolidate procedures for making noisy-detector masks available online • Improve monitoring, especially detector & trigger information displayed side-by-side
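
One way such in-run prescale updates can be made coherent (an illustrative sketch, not the ATLAS implementation): key each prescale set by the luminosity block from which it applies, so every node switches at the same well-defined boundary:

```python
# Hypothetical prescale sets, keyed by the first luminosity block they apply to.
PRESCALE_SETS = {
    0:   {"e10": 1, "mu6": 1},  # start of run
    150: {"e10": 2, "mu6": 1},  # rate too high from LB 150: prescale e10 by 2
}

def prescales_for(lumi_block: int) -> dict:
    """Pick the most recent prescale set at or before this luminosity block,
    so all nodes agree on the active set without a run restart."""
    start = max(lb for lb in PRESCALE_SETS if lb <= lumi_block)
    return PRESCALE_SETS[start]

print(prescales_for(200))  # {'e10': 2, 'mu6': 1}
```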

  37. Plans for 2009/10 Luminosity: ~2×10^32 cm^-2 s^-1 Integrated: ~200 pb^-1

  38. Collisions • Cosmics • Cosmics with combined L1 muon triggers • First beam menu: cosmics + beam pick-up trigger • Bunch groups commissioned (requires clock commissioning) • High Level Trigger performs streaming • HLT algorithms run offline • Add the HLT one piece at a time in tagging mode • Switch on HLT rejection after the algorithms are validated online • Full 10^31 menu


  41. Conclusion • The trigger was successfully commissioned in single-beam and cosmic running in Autumn 2008 • The data have been analysed to validate the trigger operation • Improvements have been made in the light of experience from these runs • Eagerly awaiting collisions!!

  42. Backup Slide

  43. High Level Trigger

  44. Level 1 Cosmic Rates
