
Triggering at ATLAS


Presentation Transcript


  1. Triggering at ATLAS
  Talk by Johannes Haller, Uni HH, at the ATLAS-D Meeting, September 2006
  • Trigger Challenge at the LHC
  • Technical Implementation
  • Trigger Strategy, Trigger Menus, Operational Model, Physics Analyses and all that

  2. Physics Goals at the LHC
  [Diagrams: SUSY cascade decay and Higgs production with µ⁺µ⁻ final states]
  EW symmetry breaking? - search for the Higgs boson
  Extensions of the Standard Model? - search for SUSY or other BSM physics
  What else? - top, EW, QCD, B-physics
  • physics events: µ, γ, e, τ, jets, ET,miss
    • high-pT objects (un-pre-scaled)
    • low-pT objects (pre-scaled or in exclusive selection)
  • monitor events
  • calibration events
  "The" trigger question: What events do we need to take? Simple answer?

  3. Event Rates and Multiplicities
  Cross section of p-p collisions:
  • R = event rate
  • L = luminosity = 10³⁴ cm⁻² s⁻¹
  • σ_inel = inelastic cross section = 70 mb
  • N = interactions / bunch crossing
  • Δt = bunch crossing interval = 25 ns
  σ_tot(14 TeV) ≈ 100 mb, σ_inel(14 TeV) ≈ 70 mb
  R = L × σ_inel = 10³⁴ cm⁻² s⁻¹ × 70 mb = 7·10⁸ Hz
  N = R × Δt = 7·10⁸ s⁻¹ × 25·10⁻⁹ s = 17.5
    = 17.5 × 3564 / 2808 (not all bunches filled) = 23 interactions / bunch crossing (pile-up)
  n_ch = charged particles / interaction ≈ 50
  N_ch = charged particles / BC = n_ch × 23 ≈ 1150
  N_tot = all particles / BC = N_ch × 1.5 ≈ 1725
  With every bunch crossing, 23 minimum-bias events with ~1725 particles are produced.
  [Plot: inelastic p-p cross section vs. LHC centre-of-mass energy (GeV)]
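To make the rate arithmetic on this slide easy to reproduce, here is a minimal Python sketch using only the numbers quoted above (luminosity, inelastic cross section, bunch spacing, filled-bunch fraction); the variable names are illustrative.

```python
# Back-of-the-envelope LHC interaction-rate and pile-up numbers from the slide.
lumi = 1e34                    # luminosity [cm^-2 s^-1]
sigma_inel = 70e-27            # inelastic cross section: 70 mb = 70e-27 cm^2
bunch_interval = 25e-9         # bunch crossing interval [s]
filled_fraction = 2808 / 3564  # only 2808 of the 3564 bunch slots are filled

rate = lumi * sigma_inel                   # total interaction rate -> 7e8 Hz
n_per_crossing = rate * bunch_interval     # average interactions per 25 ns slot -> 17.5
pileup = n_per_crossing / filled_fraction  # per *filled* crossing -> ~22 (quoted as 23)

n_charged = 50 * pileup                    # ~50 charged particles per interaction
n_total = 1.5 * n_charged                  # all particles per crossing

print(f"interaction rate: {rate:.1e} Hz")
print(f"pile-up per filled crossing: {pileup:.1f}")
print(f"charged / all particles per crossing: {n_charged:.0f} / {n_total:.0f}")
```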

  4. Looking for Interesting Events
  [Event display: Higgs → ZZ → 2e + 2µ overlaid with 23 minimum-bias events]

  5. Another Constraint: ATLAS Event Size
  Pile-up and adequate precision require small-granularity detectors.
  ATLAS event size: 1.5 MB (140 million channels) → at 40 MHz: 1 PB/sec
  Affordable mass storage: 300 MB/sec
  • storage rate: < 200 Hz → 3 PB/year for offline analysis

  6. The Trigger Challenge
  IA rate: ~1 GHz; BC rate: 40 MHz; storage: ~200 Hz → online rejection: 99.9995% (!) → crucial for physics (!)
  [Plot: cross section / rate vs. ET, from the total interaction rate down to the storage rate, where the discoveries lie]
  • powerful trigger needed:
    • enormous rate reduction
    • retaining the rare events in the very tough LHC environment
  • remember: the 0.000005 must be shared:
    • physics triggers
      • high-pT physics (un-pre-scaled)
      • low-pT physics (pre-scaled, exclusive)
    • technical triggers:
      • monitor triggers
      • calibration triggers
      • …
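A minimal sketch of the bandwidth arithmetic behind slides 5 and 6, using only the figures quoted there (1.5 MB events, 40 MHz crossings, ~200 Hz storage) and the usual 10⁷ s accelerator year; names are illustrative.

```python
# Rejection factor and storage volume implied by the quoted ATLAS numbers.
event_size_mb = 1.5         # event size [MB]
bc_rate_hz = 40e6           # bunch crossing rate
storage_rate_hz = 200.0     # maximum rate written to mass storage
seconds_per_year = 1e7      # canonical accelerator year of running

accept_fraction = storage_rate_hz / bc_rate_hz        # 5e-6 -> rejection of 99.9995 %
storage_bandwidth = storage_rate_hz * event_size_mb   # 300 MB/s
yearly_volume_pb = storage_bandwidth * seconds_per_year / 1e9  # ~3 PB/year

print(f"accepted fraction of crossings: {accept_fraction:.1e}")
print(f"online rejection: {(1 - accept_fraction) * 100:.4f} %")
print(f"storage: {storage_bandwidth:.0f} MB/s, {yearly_volume_pb:.1f} PB/year")
```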

  7. Technical Implementation

  8. ATLAS Trigger: Overview
  3-Level Trigger System:
  • LVL1: decision based on data from calorimeters and muon trigger chambers; synchronous at 40 MHz; bunch crossing identification (hardware, 2.5 µs)
  • LVL2: uses Region-of-Interest data (identified by LVL1, ca. 2% of the event) with full granularity from all detectors (software, ~10 ms)
  • Event Filter: has access to the full event and can perform more refined event reconstruction (software, ~sec.)

  9. LVL1 Trigger Overview
  [Block diagram of the LVL1 system]
  • Calorimeter trigger: Pre-Processor (analogue → ET), Cluster Processor (e/γ, τ/h), Jet / Energy-sum Processor
  • Muon trigger: Muon Barrel Trigger (RPC), Muon End-cap Trigger (TGC), Muon-CTP Interface (MuCTPI)
  • sent to the Central Trigger Processor (CTP): multiplicities of µ for 6 pT thresholds; multiplicities of e/γ, τ/h, jet for 8 pT thresholds each; flags for ΣET, ΣET(jets), ETmiss over thresholds
  • the CTP issues the L1A signal, distributed via the TTC system
  LVL1 latency: 2.5 µs = 100 BC

  10. LVL1 Calorimeter Trigger
  • available thresholds:
    • EM (e/γ): 8 - 16
    • tau/hadron: 0 - 8
    • jets: 8
    • fwd. jets: 8
    • ETsum, ETsum(jets), ETmiss: 4 (each)
  • electronic components (installed in the counting room outside the cavern; heavily FPGA-based): PPM crate, 7 JEMs, 6 CPMs
  • example: e/γ algorithm:
    • goal: good discrimination e/γ ↔ jets
    • identify 2x2 RoI with local ET maximum
    • cluster/isolation cuts on various ET sums
  • output:
    • at 40 MHz: multiplicities for e/γ, jets, τ/had and flags for energy sums to the Central Trigger (CTP)
    • accepted events: position of objects (RoIs) to LVL2 and additional information to DAQ
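As an illustration of the 2x2 local-maximum step described above, here is a hedged Python sketch on a toy grid of trigger-tower ET values; the window logic, thresholds, isolation cut and all names are simplified assumptions, not the actual L1Calo algorithm or firmware.

```python
import numpy as np

def find_em_rois(et, cluster_threshold=10.0, isolation_max=10.0):
    """Toy sliding-window e/gamma finder on a 2D grid of trigger-tower ET (GeV).

    Returns (eta_index, phi_index) of 2x2 windows that are a local ET maximum,
    exceed the cluster threshold and pass a crude isolation cut on the ring around them.
    """
    rois = []
    n_eta, n_phi = et.shape
    for i in range(1, n_eta - 2):
        for j in range(1, n_phi - 2):
            core = et[i:i + 2, j:j + 2].sum()            # 2x2 cluster sum
            if core < cluster_threshold:
                continue
            # local-maximum requirement against the eight neighbouring 2x2 windows
            neighbours = [et[i + di:i + di + 2, j + dj:j + dj + 2].sum()
                          for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
            if core <= max(neighbours):
                continue
            # isolation: energy in the 4x4 ring around the core must stay small
            ring = et[i - 1:i + 3, j - 1:j + 3].sum() - core
            if ring < isolation_max:
                rois.append((i, j))
    return rois

rng = np.random.default_rng(0)
towers = rng.exponential(0.2, size=(16, 16))   # soft background activity
towers[7:9, 7:9] += 8.0                        # inject one electron-like cluster
print(find_em_rois(towers))                    # -> [(7, 7)]
```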

  11. LVL1 Muon Trigger
  algorithm: dedicated muon chambers with good timing resolution for the trigger:
  • barrel |η| < 1.0: Resistive Plate Chambers (RPCs)
  • end-caps 1.0 < |η| < 2.4: Thin Gap Chambers (TGCs)
  • local track finding for LVL1 done on-detector (ASICs)
  • looking for coincidences in the chamber layers
  • the programmable widths of the coincidence windows determine the 6 pT thresholds
  • available thresholds: muon: 6

  12. LVL1 Trigger Decision in the CTP
  CTP (one 9U VME64x crate, FPGA-based, located in USA15): central part of the LVL1 trigger system
  • input signals from the LVL1 systems: 8-16 EM, 0-8 TAU, 8 JET, 8 FWDJET, 4 XE, 4 JE, 4 TE, 6 Muon
  • other external signals, e.g. MB scintillator, …
  • internal signals: 2 random rates, 2 pre-scaled clocks, 8 bunch groups
  • calculation of the trigger decision for up to 256 trigger items, e.g. "XE70+JET70" → raw trigger bits
  • application of veto/dead time and of pre-scale factors → actual trigger bits → L1A
  note: 2 different dead-time settings: trigger groups with "high" and "low" priority will see different luminosities!
  all of these steps need to be taken into account in offline data analysis
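The step from raw to actual trigger bits can be pictured with a small Python sketch; the item names, thresholds, prescale values and the simple counter-based prescale logic are illustrative assumptions, not the real CTP firmware.

```python
# Toy trigger menu: item name -> (condition on the event, prescale factor).
# A prescale of N keeps roughly every N-th event whose raw bit fired.
menu = {
    "XE70+JET70": (lambda ev: ev["xe"] > 70 and ev["jet"] > 70, 1),
    "EM25i":      (lambda ev: ev["em"] > 25, 1),
    "MU6":        (lambda ev: ev["mu"] > 6, 100),   # low threshold, heavily pre-scaled
}
prescale_counters = {item: 0 for item in menu}

def ctp_decision(event, dead_time_veto=False):
    """Return (raw_bits, actual_bits, l1a) for one bunch crossing."""
    raw = {item for item, (condition, _) in menu.items() if condition(event)}
    actual = set()
    if not dead_time_veto:                 # veto / dead time suppresses the whole crossing
        for item in raw:
            prescale_counters[item] += 1
            if prescale_counters[item] % menu[item][1] == 0:
                actual.add(item)
    return raw, actual, bool(actual)       # L1A = OR of all actual trigger bits

event = {"xe": 80.0, "jet": 75.0, "em": 12.0, "mu": 8.0}
print(ctp_decision(event))
```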

  13. LVL1 triggers on (high-)pT objects
  Interface to HLT: RoI Mechanism
  L1Calo and L1Muon send Regions of Interest (RoIs) to LVL2 for e/γ/τ/jet/µ candidates above thresholds
  • LVL2
    • uses the Regions of Interest as "seed" for reconstruction (full granularity)
    • only data in the RoIs are used
    • advantage: the total amount of transferred data is small (~2% of the total event data) and can be dealt with at 75 kHz
  • EF runs after event building, with full access to the event

  14. ATLAS Trigger & DAQ Architecture • LVL2 and EF run in large PC farms on the surface • DAQ and HLT closely coupled • pre-series (corr. ~10% of HLT) HLT HW: DESY, Humboldt

  15. Staging of HLT Components
  part of the HLT farm is deferred due to financial constraints
  throughput per component:
  • max LVL1 rate per L2P: 150 Hz
  • EventBuilder rate per SFI: 40 Hz
  • max EB rate per EFP: 2 Hz
  • physics storage rate per EFP: 0.1 Hz
  • storage rate per storage element (SFO): 60 MB/s = 40 Hz for 1.5 MB events
  • SFOs are non-deferred; allow bandwidth for calibration, debugging, etc.
  • consequences for physics, e.g. in 2007/2008:
    • LVL1 rate: ~40 kHz (cf. design: 75/100 kHz)
    • physics storage: ~80 Hz (cf. design: 200 Hz)

  16. Trigger Strategy

  17. HLT Selection Strategy (Example: Dielectron Trigger)
  fundamental principles:
  1) step-wise processing and decision
    • inexpensive (data, time) algorithms first, complicated algorithms last
  2) seeded reconstruction
    • algorithms use results from previous steps
    • initial seeds for LVL2 are the LVL1 RoIs
    • LVL2 confirms & refines LVL1; EF confirms & refines LVL2
  note: the EF tags accepted events according to the physics selection (→ streams, offline analysis!)
  • ATLAS trigger terminology: trigger chain, trigger signature (called item at LVL1), trigger element

  18. Trigger Chains in parallel
  HLT Steering enables running of trigger chains in parallel without interference.
  Trigger chains are independent:
  • "easy" to calculate trigger efficiencies
  • "easy" to operate the trigger (finding problems, predictable behaviour)
  → scalable system
  ATLAS follows the "early reject" principle (see the sketch below):
  • look at signatures one by one, i.e. do not try to reconstruct the full event upfront; if no signatures are left, reject the event
  • save resources: minimize data transfer and required CPU power
  in principle an N-level trigger system, but only one pre-scale per chain per level (to be discussed if used in the HLT)
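A hedged Python sketch of the step-wise, early-reject chain processing described on slides 17 and 18; the chain definitions, step names and steering loop are simplified illustrations, not the actual ATLAS HLT Steering code.

```python
# Each chain is an ordered list of steps; a step is (name, hypothesis function).
# Cheap steps come first, and a chain is dropped as soon as one step fails ("early reject").
chains = {
    "2e15": [
        ("L2_calo_confirm", lambda ev: sum(et > 15 for et in ev["l1_em_rois"]) >= 2),
        ("L2_track_match",  lambda ev: ev["n_matched_tracks"] >= 2),
        ("EF_full_reco",    lambda ev: ev["n_ef_electrons"] >= 2),
    ],
    "xe70": [
        ("L2_met_refine", lambda ev: ev["met_l2"] > 70),
        ("EF_met_refine", lambda ev: ev["met_ef"] > 70),
    ],
}

def run_steering(event):
    """Process all chains step by step in parallel; return the chains that accepted."""
    active = dict(chains)
    accepted = []
    while active:
        survivors = {}
        for name, steps in active.items():
            step_name, hypothesis = steps[0]
            if not hypothesis(event):
                continue                      # early reject: chain dropped, no more work
            if len(steps) == 1:
                accepted.append(name)         # last step passed -> chain accepts the event
            else:
                survivors[name] = steps[1:]   # chain survives to the next step
        active = survivors
    return accepted

event = {"l1_em_rois": [22.0, 18.0], "n_matched_tracks": 2,
         "n_ef_electrons": 2, "met_l2": 35.0, "met_ef": 30.0}
print(run_steering(event) or "event rejected")
```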

  19. Physics Analysis: the Trigger Part
  Every physics analysis needs dedicated thought about the trigger:
  • the trigger rejects 0.999995 of all events → more or less hard cuts (in the signal region)
  • (each) trigger has an inefficiency that needs to be corrected (turn-on curve)
    • similar to the offline reconstruction efficiency, but with an important difference: no retrospective optimization - "the events are lost forever"
  → trigger optimization (as early as possible)
  → trigger data quality during data-taking is crucial
  [Plots: example of a trigger optimisation and a typical L2Calo turn-on curve]
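To illustrate what a turn-on curve is, the following Python sketch models the pass probability of a threshold trigger by smearing the offline ET with a Gaussian resolution and histogramming the pass fraction; the threshold, resolution and binning are illustrative assumptions, not the real L2Calo behaviour.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

threshold, resolution = 25.0, 3.0              # toy trigger threshold and smearing (GeV)
et_offline = rng.uniform(5.0, 60.0, 200_000)   # offline ET of the probe objects

# The online ET is the offline ET smeared by the trigger resolution; the trigger
# fires when the online ET exceeds the threshold -> an erf-shaped turn-on curve.
et_online = et_offline + rng.normal(0.0, resolution, et_offline.size)
fired = et_online > threshold

bins = np.arange(5.0, 61.0, 2.5)
total, _ = np.histogram(et_offline, bins)
passed, _ = np.histogram(et_offline[fired], bins)
efficiency = passed / np.maximum(total, 1)

centres = 0.5 * (bins[:-1] + bins[1:])
expected = norm.cdf((centres - threshold) / resolution)   # analytic expectation
for c, eff, model in zip(centres, efficiency, expected):
    print(f"ET = {c:5.1f} GeV   measured eff = {eff:.3f}   expected = {model:.3f}")
```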

  20. Physics Analysis: the Trigger Part
  analysis preparation:
  • set up / optimize a trigger for your physics signal
  • define a trigger strategy (based on the available resources): threshold? more exclusive? pre-scaling?
  • convert it to a trigger chain (already existing?)
  • determine rates and efficiencies from MC (iterate if the result is not OK)
  • define a monitoring strategy
    • define the trigger chain to be used for monitoring of your physics trigger (efficiency from data)
    • rates of the monitoring trigger (pre-scales?)
  • integrate this into the overall trigger menu (done by Trigger Coordination for online running)
  during and after data-taking:
  • use the trigger online (take data)
  • monitor trigger quality
  • determine the trigger efficiency (from data)
  • correct your measurement

  21. Trigger Efficiency from Data
  • example: possible monitoring of inclusive lepton triggers (tag & probe, see the sketch below):
    • reconstruct good Z0 candidates offline (triggered by at least one electron trigger)
    • count second electrons fulfilling the trigger
  • other methods:
    • di-object samples (J/ψ, Z0, Z0+jets)
    • minimum bias and pre-scaled low-threshold triggers ("bootstrap")
    • orthogonal selections in the HLT (ID, muon, calo)
    • …
  • note:
    • selection bias to be carefully checked!
    • trigger efficiency may depend on the physics sample (e.g. electrons in W → eν and in top events) → investigate in the physics groups
  [Plots: reconstructed Z0 peak, electron/positron trigger efficiency vs. η, time evolution of the accuracy, total efficiency for muons vs. number of events]
  Studies of this kind are important and are just starting in ATLAS.
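A hedged Python sketch of the tag-and-probe idea on this slide: select good Z0 candidates in which one electron fired the trigger (the tag) and measure how often the other electron (the probe) also fulfils it; the event format and selection cuts are invented for illustration.

```python
def tag_and_probe_efficiency(events, z_window=(80.0, 100.0)):
    """events: list of {'m_ee': float, 'electrons': [{'et': float, 'fired': bool}, ...]} (toy format)."""
    n_probes = n_passed = 0
    for ev in events:
        if not (z_window[0] < ev["m_ee"] < z_window[1]):
            continue                                  # keep good Z0 candidates only
        electrons = ev["electrons"]
        if len(electrons) != 2:
            continue
        for tag_idx in (0, 1):                        # either electron may serve as the tag
            tag, probe = electrons[tag_idx], electrons[1 - tag_idx]
            if not tag["fired"]:
                continue                              # the tag must have fired the trigger
            n_probes += 1
            n_passed += probe["fired"]
    return n_passed / n_probes if n_probes else float("nan")

events = [
    {"m_ee": 91.2, "electrons": [{"et": 40.0, "fired": True}, {"et": 35.0, "fired": True}]},
    {"m_ee": 89.5, "electrons": [{"et": 30.0, "fired": True}, {"et": 22.0, "fired": False}]},
    {"m_ee": 60.0, "electrons": [{"et": 25.0, "fired": True}, {"et": 20.0, "fired": True}]},
]
print(f"probe efficiency: {tag_and_probe_efficiency(events):.2f}")   # -> 0.67
```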

  22. LVL1 Menu (as of today, TDR)
  • general trigger problem:
    • cover as much as possible of the kinematic phase space for physics → low trigger thresholds
    • keep the trigger rate low → high trigger thresholds
    • → the trigger menu is a compromise
  LVL1 rate is dominated by electromagnetic clusters: 78% of the physics triggers
  Note:
  • large uncertainties on the predicted rates
  • study of the global aspects needed: load balancing (e.g. jet triggers)

  23. HLT Menu (as of today, TDR)
  e/γ rate is reduced mainly at LVL2 (full granularity in the RoI)
  Note:
  • large uncertainties on the predicted rates (no data!)
  • these menus give a rough impression of what we will select
  • details of the menu are not yet worked out (pre-scales, monitoring, …)
  • but first examples of realistic trigger menus are needed soon
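As a rough picture of how pre-scales are used to fit a menu into the output budget, here is a hedged Python sketch; the signature names, raw rates and pre-scale values are purely illustrative placeholders, not predicted ATLAS rates, and overlaps between signatures are ignored.

```python
# Illustrative menu: signature -> (rate after the HLT selection in Hz, prescale).
# A prescale of N records only 1/N of the events accepted by that signature.
menu = {
    "e25i":          (40.0, 1),        # un-pre-scaled high-pT physics trigger
    "2e15i":         (5.0, 1),
    "mu20":          (40.0, 1),
    "j160":          (25.0, 1),
    "j20_prescaled": (2000.0, 100),    # low-threshold trigger, pre-scaled
    "minbias":       (10000.0, 1000),  # monitoring / calibration
}
budget_hz = 200.0   # nominal storage rate from the earlier slides

recorded = {name: rate / prescale for name, (rate, prescale) in menu.items()}
total = sum(recorded.values())   # upper bound: overlaps between signatures ignored

for name, rate in sorted(recorded.items(), key=lambda kv: -kv[1]):
    print(f"{name:15s} {rate:8.1f} Hz")
verdict = "within" if total <= budget_hz else "over"
print(f"{'total':15s} {total:8.1f} Hz  ({verdict} the {budget_hz:.0f} Hz budget)")
```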

  24. Towards a more complete Menu
  aim: get concrete examples of more complete and realistic trigger menus for discussion at the next trigger and physics weeks
  • ad-hoc group:
    • has started rethinking the trigger menus
    • invites input from the physics, combined-performance and detector groups
  • study slice-wise:
    • optimization of cuts
    • need distributions of rates, rate vs. efficiency
    • more realism in the algorithms
    • detailed studies of threshold behaviour, noise
    • consequences for the physics reach
  • study of the global aspects:
    • load balancing (e.g. jet trigger balancing)
    • overlap between selections, optimization
  • the important details of the menu:
    • monitoring strategy
    • pre-scaling strategy (dynamic, static)
    • concurrent data-taking (pre-scales) or sequential (i.e. dedicated runs)?
    • time evolution (luminosity, background, etc.): pre-scale changes à la H1/CDF?
    • technical triggers (bunch groups, etc.)
    • …
  • priorities:
    • consolidate work on the menu for 14 TeV and 10³¹ cm⁻²s⁻¹
    • in parallel: limited study for 0.9 TeV and 10²⁹
    • later look at 10³² and above
    • …

  25. Ideas for early Data Taking
  conditions of early data-taking: initial luminosity 10³¹ (10²⁹), bunch spacing 75 ns (~500 ns); BCID not critical, the trigger timing windows can be relaxed
  • trigger commissioning:
    • understanding of LVL1 is crucial at startup
    • first phase:
      • rates are low, DAQ can stand 400 MB/s
      • LVL1 only, HLT "transparent"
      • some pre-scaling needed only for very low thresholds
      • HLT selections studied offline
    • second phase:
      • insert the HLT
      • start with very simple and basic algorithms
  • minimum bias events:
    • important especially at the beginning:
      • crucial for timing-in of the experiment
      • for commissioning of detectors / trigger / offline selection
      • physics: as background (important for 14 TeV) and per se
    • possible implementations:
      • BC LVL1 trigger + selection at LVL2/EF: bias-free at LVL1
      • MBTS trigger at LVL1 + selection in the HLT: some bias at LVL1 (η range; efficiency for MIPs; multiplicity requirements; etc.); needed where interactions per BC << 1

  26. The technical Side: Trigger Configuration
  • TrigConf system under development
  • real data-taking: the trigger menu can change between runs
    • optimization
    • falling luminosity during a fill (pre-scales, cuts)
    • …
  • book-keeping of all settings is crucial
  • the TriggerDB is the central part (serving LVL1, HLT and offline):
    • stores all information for the online selection
    • stores all versions of the trigger settings
    • each version is identified with a unique key, to be stored in the CondDB
  Data analyzers will have to look up the TriggerDB to interpret the trigger result in the events, e.g. to find the settings for their triggers and the corresponding run ranges (a toy illustration follows below).
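To make the lookup idea concrete, here is a Python toy in which one dictionary stands in for the conditions database (run → configuration key) and another for the TriggerDB (key → settings); the structures, keys and values are entirely hypothetical and not the real TriggerDB schema or API.

```python
# Hypothetical stand-ins for the conditions DB and the TriggerDB.
cond_db = {10023: 41, 10024: 41, 10025: 42}   # run number -> configuration key

trigger_db = {
    41: {"lvl1_items": {"EM25i": {"threshold_gev": 25, "prescale": 1}},
         "hlt_chains": {"e25i": {"seed": "EM25i", "prescale": 1}}},
    42: {"lvl1_items": {"EM25i": {"threshold_gev": 25, "prescale": 10}},
         "hlt_chains": {"e25i": {"seed": "EM25i", "prescale": 10}}},
}

def settings_for_run(run, chain):
    """Resolve the trigger settings that were active for a given run and chain."""
    key = cond_db[run]                    # unique configuration key stored per run
    return key, trigger_db[key]["hlt_chains"][chain]

for run in (10023, 10025):
    key, chain_cfg = settings_for_run(run, "e25i")
    print(f"run {run}: config key {key}, e25i settings {chain_cfg}")
```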

  27. The technical Side: Trigger Configuration
  A Java front-end for the TriggerDB is under development: TriggerTool
  • three modes are foreseen:
    • experts: construct consistent menus in the TriggerDB
    • shift crew: choice of predefined options (menus, pre-scale sets)
    • offline user: extract menus to a text file for development, simulation etc.; browse the DB to find the settings of triggers and run ranges

  28. German Contributions
  • Institutes: Heidelberg, Mainz, DESY/Humboldt/HH, (Siegen), (Wuppertal), (MPI)
  • Hardware:
    • L1Calo Preprocessor - Heidelberg
    • L1Calo Jet-Energy Module - Mainz
    • HLT computing racks - DESY, Humboldt
  • Technical software around the trigger:
    • Trigger Configuration - DESY/HH
    • Trigger Monitoring - DESY/Humboldt
  • Simulation, algorithms, performance:
    • CTP Simulation - DESY/HH
    • MB Trigger - DESY/Humboldt
    • Jets, ETmiss - Mainz
    • B-physics - Siegen (planned)
    • B-tagging at LVL2 - Wuppertal (finished)
    • Muons - MPI (planned for SLHC)
  • Trigger strategy:
    • Operation, HLT Steering - Mainz, DESY/HH
    • Combined Trigger Menu - DESY/HH
    • Pre-scaling - Heidelberg, Mainz, DESY/HH

  29. Summary
  • triggering at the LHC is crucial for physics
    • only 0.000005 of the events are selected
    • cuts and efficiencies affect the results
  • each data analyzer must understand the trigger
    • choice of trigger, trigger optimization
    • trigger (in-)efficiency: how to measure it (from data)? how to correct for it?
  • need to develop more complete and realistic trigger menus for (early) data taking
  • German contributions in many areas (HW + SW)
  • very good collaboration!
