
Statusbericht CMS-Trigger (CMS Trigger Status Report)



Presentation Transcript


  1. Statusbericht CMS-Trigger M. Arnold, H. Bergauer, M. Eichberger, J. Erö, V. Ghete, Ch. Hartl, M. Jeitler, G. Kasieczka, K. Kastner, I. Magrans de Abril, I. Mikulec, B. Neuherz, M. Oberegger, H. Rohringer, Th. Schreiner, J. Strauss, F. Teischinger, A. Taurok, Th. Themel, Ph. Wagner, C.-E. Wulz. HEPHY board meeting, Vienna, 24 Nov 2008. Presented by I. Mikulec

  2. [Block diagram of the CMS Level-1 trigger chain] Muon trigger: RPC (Pattern Comparator Trigger), CSC (Local CSC Trigger, CSC Track Finder) and DT (Local DT Trigger, DT Track Finder) deliver 4+4 µ and 4 µ candidates to the Global Muon Trigger, which adds MIP+ISO bits and forwards 4 µ (with MIP/ISO bits). Calorimeter trigger: HF, HCAL and ECAL feed the Regional Calorimeter Trigger and the Global Calorimeter Trigger (e, jets, ET, HT, ETmiss, jet counts). Both chains feed the Global Trigger (responsibility of HEPHY Vienna), controlled by the Trigger Supervisor; 40 MHz pipeline, latency < 3.2 µs, max. 100 kHz L1 Accept rate to the HLT. I. Mikulec

  3. Drift Tube Track Finder

  4. DTTF Developments • PHTF • Firmware upgrade: new version in all boards since May • Extended configuration possibilities • mask all own-sector and neighbour-sector chambers independently • set up delay lines for TF data and CSC exchange data independently • Built-in pattern generators • built-in or loadable patterns • for input data • for internal data • for output data • orbit signal synchronization – patterns generated with a given delay after BC0 • allows synchronizing patterns in several sectors and/or the CSC exchange • allows latency measurements up to the GT • Built-in rate counters • all input rates from the muon chambers counted separately, classified by high/low quality • all output rates classified by quality (all 7 categories) • Clock difference counters • measure the BX difference between the internal clock and the input data • speeds up the synchronization process between front-end and DTTF Slide by J.Erö I. Mikulec
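The clock-difference counters above can be illustrated with a small sketch (this is not the actual DTTF firmware; names and the data layout are invented for the example): estimate the bunch-crossing offset between the internal clock and the input data as the most frequent BX difference, modulo one LHC orbit.

```python
# Illustrative sketch of a BX-offset measurement, as the PHTF
# clock-difference counters are used for during synchronization.
from collections import Counter

ORBIT_LENGTH = 3564  # bunch crossings per LHC orbit

def bx_offset(internal_bx, input_bx):
    """Most frequent (input - internal) BX difference, modulo one orbit."""
    diffs = Counter((i - j) % ORBIT_LENGTH
                    for i, j in zip(input_bx, internal_bx))
    return diffs.most_common(1)[0][0]

# Example: input data arrives 3 BX after the internal clock,
# including a wrap-around at the orbit boundary.
internal = [10, 11, 12, 3563]
received = [13, 14, 15, 2]
assert bx_offset(internal, received) == 3
```

Taking the mode rather than the mean makes the estimate robust against occasional noise hits with a wrong BX.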

  5. DTTF Developments #2 • ETTF • no eta data forwarded from the detector • only a firmware patch to get rough eta values into the GMT • firmware upgrade in progress • similar features as the PHTF (pattern generator, counters, new control scheme) • DCC • firmware fix to eliminate “monster events” • the reason was an undocumented FPGA feature • participates in global runs – readout by the CMS global DAQ system • the DQM group developed the corresponding programs Slide by J.Erö I. Mikulec

  6. Global Muon Trigger

  7. GMT status • HW performed without any problem during all global running periods • HW – emulator comparison: 100% agreement • Control SW (TS GMT cell, cleaned up by Ch. Hartl) – very few crashes • Control GUI (developed by Ph. Wagner in summer 07) heavily used – new developments already used during CRAFT: • LUT configuration and on-the-fly generation, online-offline DB interface (by Th. Themel) • HW monitoring GUI (by G. Kasieczka) • Prompt analysis to ensure synchronization of track finder inputs (see later) • Offline: adapt LUTs for startup conditions (missing parts of RPC) I. Mikulec
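The hardware-emulator comparison quoted above (100% agreement) amounts to a field-by-field check of every candidate. A minimal sketch, assuming a simplified record layout (the field names here are illustrative, not the real GMT data format):

```python
# Compare hardware readout muon candidates with emulator output
# field by field and collect any mismatches for inspection.
def compare_candidates(hw, emu, fields=("pt", "eta", "phi", "quality")):
    """Return a list of (event, field, hw_value, emu_value) mismatches."""
    mismatches = []
    for event, (h, e) in enumerate(zip(hw, emu)):
        for f in fields:
            if h[f] != e[f]:
                mismatches.append((event, f, h[f], e[f]))
    return mismatches

hw = [{"pt": 3, "eta": 1, "phi": 2, "quality": 7}]
emu = [{"pt": 3, "eta": 1, "phi": 5, "quality": 7}]
assert compare_candidates(hw, emu) == [(0, "phi", 2, 5)]
```

An empty mismatch list over a full run corresponds to the "100% agreement" statement.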

  8. GMT LUT configuration panel Developer: Th. Themel I. Mikulec

  9. GMT HW monitoring GUI Developer: G. Kasieczka I. Mikulec

  10. Global Trigger

  11. Global Trigger Hardware Slide by M.Jeitler

  12. Global Trigger & Global Muon Trigger crates • We have one main or “production” 9U VME crate for the Global Trigger and Global Muon Trigger, plus an almost fully equipped spare crate mounted on top of it • For the “conversion boards”, a 6U crate is installed below the 9U crates • Each of the two 9U crates is connected to a separate computer and to the LHC clock • So, in case of problems we can within minutes either replace modules with spares from the spare crate, test the problem in parallel, or switch to the spare crate altogether • A second spare crate has been installed in the lab in building 904 in Prevessin (“CMS electronics integration facility”) where we will also keep extra spare modules (the third module of each kind, where/when available) Slide by M.Jeitler I. Mikulec

  13. HW Performance • During the various test runs and during the short happy life of the LHC this year (September 10-19) our HW systems worked without problems • A few firmware items were being finalized • e.g., “Beam Synchronous Timing (BST)” receiver on the readout board (“GTFE [Global Trigger Front End] board”) • On several occasions, firmware adaptations have been requested by the subsystems that rely on our signals, and Vienna has been fulfilling these requests quickly and to everybody’s satisfaction • Trigger Control System (TCS) • Final Decision Logic (FDL) • Pipelined Synchronized Buffer (PSB) • Global Trigger Logic Board (GTL) Slide by M.Jeitler I. Mikulec

  14. PSB upgrade The fast serial “Infiniband” connection between the “Global Calorimeter Trigger” (built by Imperial College, London) and our input boards (PSBs) is somewhat problematic • with one type of cable (3 meters long) no errors are observed, but other (shorter) cables seem to give rise to rare intermittent errors • there are not enough “good” cables available, and in any case this suggests that the links are operating on the edge of reliability • although this does not appear to be a major problem, CMS has decided to upgrade the system and replace the links with optical links • Imperial College will build the receiver daughter boards and Vienna will modify the PSB mother boards • we consider this a first step towards a Super-LHC upgrade Slide by M.Jeitler I. Mikulec

  15. Ancillary Systems • A PC has been set up at CERN for the trigger menu editor (software by Philipp Wagner, Herbert Bergauer and Vasile Ghete) • this makes it possible to quickly modify the trigger menu and also to produce the corresponding firmware for the Global Trigger Logic board with the Altera compiler (“Quartus”) at CERN • Two new “FedKit” systems and the computers needed for them have been bought and installed to allow better testing of the readout chain • one system is installed in Vienna, the other will be used at CERN Slide by M.Jeitler I. Mikulec

  16. L1 GT Offline Changes • DataFormats/L1GlobalTrigger • Data formats for all GT readout records (data and MC): relatively stable • Added prescale factor indices; restructured EVM record to accommodate variable BST format • L1GT Unpacker/Packer • Adapted to new data formats; streamlined code for better performance • L1 GT Emulator • Included evaluation for new condition types: CASTOR conditions, new GCT HF bit counts and energy sums conditions, technical triggers • streamlined/rewritten code for better time performance: • Release 1_7_X: 43 ms/event, rel. 2_1_X: ~2 ms/event: time target achieved! • L1 GT Analyzer • New modules for emulator vs data comparison, pack-unpack validation • Refined L1 GT trigger report Slide by V.Ghete I. Mikulec
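The pack-unpack validation mentioned above can be sketched as a bit-exact round-trip test: pack candidate fields into a data word, unpack them again, and require identity. The field widths below are invented for the example, not the real GT record layout.

```python
# Hedged illustration of pack-unpack validation with made-up
# field widths: 5-bit pt, 6-bit eta, 8-bit phi in one word.
def pack(pt, eta, phi):
    return (pt & 0x1F) | ((eta & 0x3F) << 5) | ((phi & 0xFF) << 11)

def unpack(word):
    return word & 0x1F, (word >> 5) & 0x3F, (word >> 11) & 0xFF

# Round trip must be bit-exact for all values in range,
# including the edges of each field.
for fields in [(0, 0, 0), (31, 63, 255), (7, 42, 100)]:
    assert unpack(pack(*fields)) == fields
```

Running such a check over recorded data (packing the unpacked content and comparing with the raw words) validates both directions of the converter at once.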

  17. L1 GT Configuration via Event Setup • New conditions included in EventSetup • CASTOR conditions, new GCT HF bit counts and energy sums conditions, correlation conditions; JetCounts changes • Technical triggers • L1 trigger menu in EventSetup • Persistency changes and inclusion of new conditions • Updated L1 GT records • for consistent hardware vs software configuration • Database issues • Extensive discussions online – offline – DB for transfer and synchronization of the L1 GT event setup (O2O) • Very complex due to multiple environments • Solution under implementation Slide by V.Ghete I. Mikulec

  18. L1 GT: HLT/Fast Sim, Other • L1 GT interface with HLT: HLTLevel1GTSeed module • Changes to adapt to new conditions or to new HLT requirements • Support for online running of L1 GT “offline” software • Review of L1 algorithm names & seeding (ongoing) • L1 GT & Fast Simulation • Adapted to various FastSimulation requirements • All offline software packages • migration from text configuration to python configuration • Despite good scripts, also required manual changes • Other ongoing activities (with Thomas T & Gregor R) • Software for GT pattern tests & GT input validation • Documentation, user support, ... Slide by V.Ghete I. Mikulec

  19. L1 Trigger Menus & Software • L1 Trigger Menu: major activity in the last six months • Switch to / write / improve L1 Menu software • (Philipp, Herbert, Christian, Vasile) • L1 Trigger Menu Editor (replaces GTgui; runs in Trigger Supervisor) • VhdlWriter (replaces GTS; runs in CMSSW and Trigger Supervisor) • VhdlWriter and Menu Editor use classes from CMSSW L1 software • L1 menu in DB (uniform CMSSW / hardware implementation) • Prepare L1 menus - working with the Trigger Studies Group • New menus for startup were implemented in time • L1Menu_startup_v* (high luminosity L1 scales) – used online for Sep. 10th • L1Menu_startup2_v* (startup L1 scales) – used online in CRAFT • New menu for MC studies • L1Menu_2008MC_2E30 (luminosities: 2x10^30 – 10^32) • All menus with the new L1 menu software chain: production status Slide by V.Ghete I. Mikulec

  20. Trigger Menu Editor Main developer: Ph. Wagner I. Mikulec

  21. Global Trigger: Level 1 Trigger Menu Database • The Global Trigger combines the decisions of 128 algorithm and 64 technical trigger bits into L1A candidates • The Level 1 Trigger Menu is implemented as firmware and specifies the implementation of all algorithms • The menu can be fine-tuned by changing parameters through VME access – no need to generate new firmware for threshold changes etc. • The L1 Trigger Menu database structure reflects the information contained in a menu • The "Trigger Menu Editor" generates firmware and populates the database • Online software accesses the database to configure the hardware with the menu: is the correct firmware loaded? set / verify thresholds • The loaded menu is translated into an object used by the offline software to configure the emulator • The new advanced schema provides detailed and trackable versioning – important for a flexible interface to GT offline software, HLT and physics analysis users Slide by Ch.Hartl I. Mikulec
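The final combination of algorithm and technical trigger bits into an L1A, with per-bit masks and prescales, can be sketched in a few lines. This is a simplified toy model of the logic described above, not the firmware implementation; class and parameter names are invented.

```python
# Toy model of a final-OR decision: trigger bits are masked,
# prescaled per bit, and OR-ed into one L1A decision per event.
class FinalDecisionLogic:
    def __init__(self, prescales, masks):
        self.prescales = prescales   # bit index -> prescale factor N
        self.masks = masks           # bit index -> True if the bit is enabled
        self.counters = [0] * len(prescales)

    def l1_accept(self, fired_bits):
        """fired_bits: iterable of trigger bit indices set in this event."""
        accept = False
        for b in fired_bits:
            if not self.masks[b]:
                continue                      # masked bit: ignore
            self.counters[b] += 1
            if self.counters[b] % self.prescales[b] == 0:
                accept = True                 # keep every N-th firing
        return accept

fdl = FinalDecisionLogic(prescales=[1, 10], masks=[True, False])
assert fdl.l1_accept([0]) is True    # unprescaled, unmasked bit fires
assert fdl.l1_accept([1]) is False   # masked bit never contributes
```

Decoupling masks and prescales from the firmware, as the slide notes, is exactly what allows such parameters to change without regenerating the menu firmware.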

  22. Global Trigger: Level 1 Trigger Menu Database The L1 Trigger Menu part of the Global Trigger database... Slide by Ch.Hartl I. Mikulec

  23. Trigger Supervisor

  24. Trigger Supervisor Project • Operation and consolidation phase • TS framework: • Version 1.6 compatible with X 6, featuring alarms and an improved GUI • All CMS sub-systems are being regularly operated with the TS • Student supervision: • Continuous supervision of 1 CERN fellow and supervision of 4 summer students • Main concerns: • Limited human resources: TS framework consolidation (~user support), the interconnection test infrastructure and the new configuration service cannot be done due to lack of manpower Slide by I.Magrans I. Mikulec

  25. Trigger Supervisor Project • Project documentation: • PhD thesis by Ildefons Magrans de Abril, “The CMS Trigger Supervisor: Control and Hardware Monitoring System of the CMS Level-1 Trigger at CERN” • CMS Note accepted: Marc & Ildefons Magrans de Abril et al., “Homogeneous User Interface for Expert Control of the Level-1 Trigger” (under final correction) • Marc & Ildefons Magrans de Abril et al., “TS User’s Guide” (continuously updated) Slide by I.Magrans I. Mikulec

  26. Operation and Prompt Analysis

  27. Global Running • May-August: CRUZET (1-4) (Cosmic RUn at Zero Tesla) ~1week/month, ~300M cosmic triggers • 8.9.-19.9.: Beam operations • 15.10-11.11.: CRAFT (Cosmic Run At Four Tesla) ~300M cosmic triggers with Tracker in DAQ • Main goals: • Collect statistics for alignment, calib., synchronization • Exercise 24/7 running • Test Trigger/DAQ interaction, trigger throttling, signal distribution etc. • Heavy load on trigger personnel • Trigger shifts • On call service • Trigger Field Manager • On site assistance I. Mikulec

  28. DTTF in Global Runs • DTTF CRUZET & CRAFT • CRUZET 1, 2 • 36/72 PHTF Boards delivering Triggers (6 Sectors, top 3 + bottom 3)‏ • no DCC readout • CRUZET 3 • 36/72 PHTF Boards delivering Triggers (6 Sectors, top 3 + bottom 3)‏ • with DCC readout • CRUZET 4 • 54/72 PHTF Boards delivering Triggers (9 Sectors) + DCC readout • CRAFT 1 • 60/72 PHTF Boards delivering Triggers (10 Sectors) + DCC readout • CRAFT 2 • 69/72 PHTF Boards delivering Triggers (12 Sectors) + DCC readout • 1 PHTF Board failed after start • 2 PHTF Boards with broken Optical Input Cards Slide by J.Erö I. Mikulec

  29. DTTF in Global Runs • Global runs under Trigger Supervisor control • fully automatic startup and configuration • process refined during the global runs • most troubles caused by incorrect TTC configuration • triggers must not be stopped during “Pause” => missing trigger data • Scripts developed to gain an overview of ALL sectors • “all_bcdiffcheck” gives an overall picture of detector synchronization • “all_input_rates” shows all sectors' rates • “all_l1_countercheck” verifies at run end that all boards received all L1As • Readout delivered data for DT trigger analysis • compared to emulator data • discovered firmware bugs • problems with cosmic hits out of range • new F/W upgrade fixes these Slide by J.Erö I. Mikulec
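A check in the spirit of the "all_l1_countercheck" script mentioned above can be sketched as follows: at run end, every board should report the same number of received L1As, so any board disagreeing with the majority is flagged. Board names, counts and the function name are made up for the example.

```python
# Illustrative end-of-run consistency check: flag boards whose
# L1A count differs from the majority value across all boards.
from collections import Counter

def l1_countercheck(counts):
    """counts: dict board -> received L1A count.
    Returns the boards that disagree with the majority count."""
    majority, _ = Counter(counts.values()).most_common(1)[0]
    return {board: n for board, n in counts.items() if n != majority}

boards = {"phtf_s1": 1000, "phtf_s2": 1000, "phtf_s3": 998}
assert l1_countercheck(boards) == {"phtf_s3": 998}
```

A non-empty result points at boards that missed (or double-counted) L1A signals, e.g. due to a misconfigured TTC link.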

  30. DTTF Monitoring: TS DTTF Monitoring Panel • detector region selection • rate counter panel • pop-up ROOT plots Slide by F.Teischinger I. Mikulec

  31. DTTF Shifter Panel • overview of board status • 6 ROOT histograms represent the counters and make high rates (hot channels) quickly visible DTTF Monitoring: TS DTTF Monitoring Panel Slide by F.Teischinger I. Mikulec

  32. DTTF Analysis results Eta-phi distribution with CSC detector data exchange (the CSC rate is very low!) by Jorge Troconiz, UAM Madrid I. Mikulec

  33. DTTF Analysis Eta – readout vs. simulated Phi – readout vs. simulated by Jorge Troconiz, UAM Madrid I. Mikulec

  34. DTTF Analysis Quality – readout vs. simulated Pt – readout vs. simulated by Jorge Troconiz, UAM Madrid I. Mikulec

  35. Global Trigger: Online Operation • CMS trigger hardware operation is steered through the "Trigger Supervisor" control system during a CMS run • State machine: "halted" > "configured" > "enabled" > "suspended" • "configure" brings the Global Trigger into a well-defined state • the "Cold Start" option allows loading the FPGAs from PROMs • Configuration data are predefined in "database keys" • Frequently changing settings (trigger masks, prescales etc.) are decoupled from the basic configuration • Database interface and operation protocol between online (GT Cell) and offline (GT emulator) software being finalized > "O2O" • https://twiki.cern.ch/twiki/bin/view/CMS/GTRunSettings • Shifter GUI: define GT run settings for the next CMS run Slide by Ch.Hartl I. Mikulec
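The run-control state machine quoted above ("halted" > "configured" > "enabled" > "suspended") can be modelled as a small transition table. This is a toy illustration; the transition command names are assumptions, not the actual Trigger Supervisor API.

```python
# Toy model of a run-control state machine: legal transitions
# are listed explicitly, anything else raises an error.
TRANSITIONS = {
    ("halted", "configure"): "configured",
    ("configured", "enable"): "enabled",
    ("enabled", "suspend"): "suspended",
    ("suspended", "enable"): "enabled",
    ("enabled", "halt"): "halted",
}

class RunControl:
    def __init__(self):
        self.state = "halted"

    def fire(self, command):
        key = (self.state, command)
        if key not in TRANSITIONS:
            raise ValueError(f"illegal transition {command!r} from {self.state!r}")
        self.state = TRANSITIONS[key]
        return self.state

rc = RunControl()
assert rc.fire("configure") == "configured"
assert rc.fire("enable") == "enabled"
```

Rejecting undefined transitions is what guarantees the "well-defined state" the slide attributes to the "configure" step.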

  36. Global Trigger: Monitoring Software • Monitoring of trigger rates and deadtime: • CMS runs are partitioned into luminosity segments (93 seconds) • Global Trigger hardware provides rates per luminosity segment: • raw trigger rate counts: prescaled Global Trigger Logic output rates for 128 "algorithms" and 64 "technical triggers" • L1A rate counts: rates of L1As delivered by the Trigger Control System board (rates after deadtime) • deadtime counts: number of bunch crossings contributing to deadtime (split up into several deadtime reasons) • Trigger Monitoring Online Software • sends monitoring data to the standard monitoring infrastructure • allows subscription to web-based services (e.g. Web Based Monitoring) • provides an updating GUI front-end to the shifter, including history • back-end will store data in the GT monitoring database • https://twiki.cern.ch/twiki/bin/view/CMS/GTTriggerMonitoring • Other Global Trigger hardware monitoring: status of connected detector partitions etc.; more hardware and data monitoring will follow Slide by Ch.Hartl I. Mikulec
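The deadtime bookkeeping above reduces to a simple ratio: deadtime counts divided by the total number of bunch crossings in one luminosity segment. The sketch below assumes a luminosity segment of 2^20 orbits (roughly the 93 seconds quoted above) and 3564 bunch crossings per orbit; the function name is invented.

```python
# Deadtime as a fraction of all bunch crossings in one
# luminosity segment (assumed to be 2**20 LHC orbits).
BX_PER_ORBIT = 3564
LS_ORBITS = 2 ** 20

def deadtime_fraction(dead_bx_count):
    """dead_bx_count: bunch crossings lost to deadtime in one segment."""
    total_bx = BX_PER_ORBIT * LS_ORBITS
    return dead_bx_count / total_bx

assert deadtime_fraction(0) == 0.0
```

Splitting `dead_bx_count` by deadtime reason, as the hardware counters do, then gives a per-reason breakdown of the same fraction.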

  37. Global Trigger: Monitoring Software GUIs Trigger and Deadtime Online Monitoring Detector Status Online Monitoring Slide by Ch.Hartl I. Mikulec

  38. Prompt analysis during Global runs Example of a prompt analysis of rate changes during CRUZET 1 (L1A rates: total, DT, CSC, RPC, ECAL): (1) timestamp problems, (2) RPC noise, (3) RPC HW problems, (4) real increase of cosmics. Track direction change only in one direction (from the shaft); the timestamp offset [s] was traced to a shaft opening maneuver. I. Mikulec

  39. L1 GT: Data versus Emulator • CRUZET Run 43439 – comparison results for algorithms (data vs emulator differences): all differences understood Slide by V.Ghete I. Mikulec

  40. Tag&Probe DT trigger efficiency CRUZET 2, Run 47011 (efficiency in the same BX and within ±1 BX, angular distributions in [deg]). Internal DT trigger synchronization much better in CRUZET 2. Sector 10, wheel -1 missing; sector 4, wheel +1 known to have bad sync. I. Mikulec

  41. Tag&Probe CSC efficiency, Run 47011, CRUZET 2 (±1 BX; maps in x, y [cm]). Tag: DT-triggered DT track. Probe: extrapolate the tag track to the approximate CRUZET 2 endcap position (~ME2) and check whether the CSC triggered within ±1 BX. I. Mikulec
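The tag-and-probe efficiency used in these studies can be sketched with toy data (the function and the event representation are invented for illustration): count how often the probe condition, here a CSC trigger within ±1 BX, is satisfied among tagged events.

```python
# Hedged sketch of a tag-and-probe efficiency: the tag is a
# DT-triggered track, the probe asks for a CSC trigger within
# +/-1 BX of the extrapolated crossing.
def tag_and_probe_efficiency(events):
    """events: list of (tag_ok, probe_bx_diff) tuples,
    probe_bx_diff is None if no matching CSC trigger was found."""
    tags = [e for e in events if e[0]]
    passing = [e for e in tags if e[1] is not None and abs(e[1]) <= 1]
    return len(passing) / len(tags) if tags else 0.0

toy = [(True, 0), (True, 1), (True, None), (False, 0)]
assert abs(tag_and_probe_efficiency(toy) - 2 / 3) < 1e-9
```

Because the tag is defined independently of the probe, the resulting efficiency is unbiased with respect to the system being measured.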

  42. DT-CSC trigger synchronization • CRUZET 2: one endcap closer to the barrel allows checking the DT/CSC timing • DT-CSC timing depends on the direction (time of flight) of the cosmic muons • Maps in y vs z [cm]: DT-CSC in time; CSC first, DT in the next BX; DT first, CSC in the next BX I. Mikulec

  43. Muon reconstruction, absorption Why does the bottom side seem to have lower efficiency? Average probability of at least one track segment being found at the bottom if a track is reconstructed at the top, projected to the earth's surface (tracks extrapolated to the surface in CMS x,z coordinates; x-surface vs z-surface [cm], also restricted to the bottom DT acceptance): shaft muons are softer. I. Mikulec

  44. Sun Sep 7 – first beam splash events, run 61642 (CSC halo and HF triggers; HF afterpulse filtered by trigger rules; BPTX+ BX) • The list of all 17 shot event numbers was posted on hn and elog shortly after the data was recorded • Time stamps of the events were posted for association with BLM data I. Mikulec

  45. Sep 10 - first multiple orbits from Tim's elog orbit signals BPTX I. Mikulec

  46. First beam RF capture First RF capture attempts (phase drift), 11.9. 21:00 – 12.9. 3:00 • Beam Pickup Trigger (BPTX) • CSC beam halo trigger • Large debunched signal in CSC • Lower cosmics rate due to backpressure • Initial mistiming of BPTX wrt. the CSC halo trigger I. Mikulec

  47. DT/RPC trigger synchronization CRAFT Fraction of triggers with DT and RPC within ±1 BX: • 86% of all RPC triggers have a DT trigger within ±1 BX • 48% of all DT triggers have an RPC trigger within ±1 BX • Trigger BX wrt. L1A: perfect coincidence 70%; RPC 1 BX late 10%; DT 1 BX late 15%; flat muons triggered 2 BX early ~1% (suppressed by trigger rules) • Conclusion: • RPC is correctly timed to DT • some spread due to imperfections in the internal synchronization of RPC and DT (see slides 10, 11) I. Mikulec

  48. DT/CSC trigger synchronization CRAFT (CSC trigger phi vs DT trigger phi) • Situation before Oct 21: DT 1 BX earlier / DT 1 BX later than CSC; situation after Oct 21: DT and CSC in the same BX • Conclusion: delay the bottom of the CSC by 1 BX and the top of the CSC by 2 BX • Configuration: coincidences between the top of the CSC and the bottom of the DT almost completely missing! I. Mikulec

  49. Close events: DBX = 8 BX, flat angle wrt. the chamber plane Andrea Venturi I. Mikulec

  50. Retriggering analysis - CRAFT Tracks with a large angle wrt. the chamber normal retrigger (first event vs retriggered event; noise; max 4D angle wrt. the chamber normal; flat tracks; DBX distributions) I. Mikulec
