
Triggering In High Energy Physics



  1. Triggering In High Energy Physics. Gordon Watts, University of Washington, Seattle. NSS 2003, N14-1. Thanks to interactions.org for the pictures!

  2. Outline: Introduction, The Collider Environment, Trigger Design, Hardware Based Triggers, Software Based Triggers, Farms, Conclusions. I am bound to miss some features of the triggering systems in various detectors; I have attempted to look at techniques and then point out examples. Thanks to the many people who helped me with this talk (too numerous to mention). The focus is mostly on ongoing discussions and presentations (RT'03).

  3. Why Trigger?
  • Bandwidth: the raw bandwidth of many detectors is GB/s, and we can't archive that rate! A HEP experiment can write terabytes of data.
  • CPU: reconstruction programs are becoming significant, and farms are 1000s of computers. GRID to the rescue?
  • Physics: is all the data going to be analyzed (L4)?
  • Analysis speed: small samples can be repeatedly studied, faster.
  Eliminate background as early as possible, before it reaches disk & tape and the reconstruction farm, but don't lose physics!

  4. Why Trigger? II
  It's obvious, but still stunning to consider. Approximate rate ladder: full detector readout information at ~7 MHz; ~10 kHz after Level 1; ~1 kHz after Level 2; ~50-70 Hz after Level 3. Much worse at LHC!
  • Assume a 50% duty cycle for the accelerator: 15,768,000 working seconds in a year
  • Event size 250 kB
  • $0.40/GB; the $ figures are for raw events only
  1999: $400k for L3! (A worked example of this arithmetic follows below.)
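
To make the slide's cost point concrete, here is a minimal back-of-the-envelope calculation in Python using the quoted assumptions (50% duty cycle, 250 kB events, $0.40/GB). The accept rates used below are illustrative values read off the rate ladder above, not quoted figures.

```python
# Back-of-the-envelope yearly raw-data volume and media cost for a trigger output
# rate, using the assumptions quoted on the slide. Accept rates are example values.
SECONDS_PER_YEAR = 365 * 24 * 3600            # 31,536,000 seconds
DUTY_CYCLE = 0.50                             # accelerator delivers beam half the time
LIVE_SECONDS = SECONDS_PER_YEAR * DUTY_CYCLE  # 15,768,000 working seconds (as on the slide)

EVENT_SIZE_BYTES = 250e3                      # 250 kB per raw event
COST_PER_GB = 0.40                            # 1999-era storage cost, $/GB

def yearly_volume_and_cost(accept_rate_hz):
    """Raw-data volume (GB) and cost ($) per year for a given trigger accept rate."""
    volume_gb = accept_rate_hz * LIVE_SECONDS * EVENT_SIZE_BYTES / 1e9
    return volume_gb, volume_gb * COST_PER_GB

for rate_hz in (1000.0, 70.0):                # e.g. a Level 2 output vs. a Level 3 output
    gb, dollars = yearly_volume_and_cost(rate_hz)
    print(f"{rate_hz:6.0f} Hz -> {gb/1e3:8.1f} TB/year, ~${dollars:,.0f}/year in raw storage")
```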

  5. Trigger Design
  Stereotyping large-scale HEP experiments: high rate (~40 MHz raw trigger rate), tight DAQ/trigger integration, multilevel triggers, custom hardware, network topologies, and farms of processing nodes behind a hardware trigger on the detector. Trigger flexibility: move the decision out of hardware and into firmware or software as early as possible.
  Small HEP experiments and medical applications: large data, but lower frequency and fewer sources; a smaller CPU or cluster can handle the data directly with a specialized board attached to the detector.

  6. HEP Accelerators: e+e-, ep, pp/p̄p. Source: PDB'02.

  7. HEP Detectors

  8. High Rate Trigger/DAQ
  Data rates are 100 MB/sec; beam crossing rates are MHz. Many times at the edge of technology when first designed, but off-the-shelf by the time they are running!
  • Level 1: hardware based. Physics objects with few associations between them, coarse detector data, deadtimeless.
  • Level 2: hardware to preprocess data (some muon processors, silicon triggers) and software to combine (matches, jet finders, etc.).
  • Level 3: a commodity CPU farm with complete event information available.
  Used by ATLAS, BaBar, Belle, BTeV, CDF, CMS, DØ, the LEP experiments, etc. Even GLAST (the Gamma-ray Large Area Space Telescope) uses this structure: the Si tracker feeds L1 tracking, the calorimeter hardware feeds L1 CAL (1 of 25 towers), an L1 decision triggers full readout into an L2 CPU in each tower and an L3 CPU farm, and upon detection of a gamma-ray burst notification goes out by internet within 5 seconds. (A toy sketch of such a multi-level cascade follows below.)
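
As a toy illustration of the three-level structure just described (coarse, deadtimeless L1 in hardware; L2 preprocess-and-combine; L3 commodity farm with the full event), here is a minimal Python sketch of a filter cascade. The rejection factors are invented so that the rates come out near the ladder on slide 4; they are not any experiment's actual numbers.

```python
# Toy model of a multi-level trigger cascade: each level sees the accept rate of the
# previous one and applies its own rejection. All numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class TriggerLevel:
    name: str
    rejection: float      # fraction of input events rejected (0..1)
    implementation: str

def cascade(input_rate_hz, levels):
    """Propagate an input rate through the trigger levels, printing each accept rate."""
    rate = input_rate_hz
    print(f"{'input':>8}: {rate:12,.0f} Hz")
    for lvl in levels:
        rate *= (1.0 - lvl.rejection)
        print(f"{lvl.name:>8}: {rate:12,.0f} Hz  ({lvl.implementation})")
    return rate

levels = [
    TriggerLevel("Level 1", 0.9986, "hardware, coarse objects, deadtimeless"),
    TriggerLevel("Level 2", 0.90,   "hardware preprocessors + software combine"),
    TriggerLevel("Level 3", 0.93,   "commodity CPU farm, full event"),
]
cascade(7e6, levels)      # ~7 MHz input -> roughly 10 kHz / 1 kHz / 70 Hz
```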

  9. Trends In Big Triggers
  • Commodity electronics: CPU and networking. Commodity electronics enforces a common interface and allows for economies of scale; custom hardware brings a plethora of data formats.
  • Delay the decision as long as possible: more data is available and the decision platform is more sophisticated (bug fixing!).
  • The DAQ typically sits before a farm trigger, and the DAQ is driving deeper into the trigger system (BTeV, ATLAS, CMS...).
  • Hardware sophistication: HLT algorithms in hardware (ASICs, FPGAs); the HLT is getting more sophisticated (Ethernet in an FPGA).

  10. BTeV: The Age Of The Network
  Fast networking moves data from a local environment to a global one. Cheap commodity processors allow for powerful algorithms with large rejection. Cheap DRAM allows for large buffering, and hence latency.

  11. DØ & CDF: Typical BIG Experiment Triggers
  Typical multi-level triggers, taking data for more than 2.5 years. Detector: ~1 million readout channels, event size 250 kB, 12.5 MB/sec written to tape.
  Rate ladder (DØ, with CDF in parentheses): full readout information at ~2.5 MHz; Level 1 output ~5 (20) kHz; Level 2 output ~1 (0.3) kHz; Level 3 output 50-100 Hz.
  L1: deadtimeless, pipelined, hardware. L2 and L3: can cause deadtime, variable-length algorithms. L2: hardware and software. L3: CPU farms.

  12. Triggering Strategy
  Similar, but different approaches driven by the physics. CDF puts more emphasis on B physics: a jet of particles with a displaced track is hard to pick out of the background and too sophisticated to do well at L1, so CDF pumps the extra data (CAL, tracks and their connections, etc.) into L2. DØ cuts harder at L1; its L1 trigger is better at object correlations (CAL, tracks and their connections, etc.).

  13. Tracking at the Tevatron
  • Track finding is difficult: hit searches and loops must be implemented in hardware, so FPGAs are used in the track trigger. DØ uses its scintillating Fiber Tracker (CFT layers plus the forward preshower); CDF uses its open-cell drift chamber.
  • Tracks are found in bins of pT: 1.5, 3, 5, and 10 GeV/c at DØ.
  • Done by equations: Layer 1 Fiber 5 & Layer 2 Fiber 6 & etc. Inefficiencies are just more equations. 80 sectors, 1-2 chips per sector, 16K equations per chip, one equation per possible track. (A software sketch of this equation scheme follows below.)
  • Firmware becomes part of the trigger (versioning). Fast and flexible, since it can be reprogrammed later, but painful for the trigger simulation!
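
As a sketch of the "equations" idea (each possible track is a pre-computed AND of fiber hits, with inefficiencies handled by extra equations), here is an illustrative software emulation in Python. The layer/fiber numbers and pT-bin labels are invented for the example and do not correspond to the real trigger's roads.

```python
# Software emulation of an FPGA "equation" track finder: each equation is a set of
# (layer, fiber) hits that must all be present. In firmware this is pure combinational
# logic; here we just test set membership. All equations below are invented examples.
EQUATIONS = {
    # pT bin (GeV/c) -> list of equations, each a frozenset of required (layer, fiber) hits
    10.0: [frozenset({(1, 5), (2, 6), (3, 6), (4, 7)})],
    5.0:  [frozenset({(1, 5), (2, 7), (3, 8), (4, 10)}),
           # an "inefficiency" variant: the same road with layer 3 allowed to be missing
           frozenset({(1, 5), (2, 7), (4, 10)})],
}

def find_tracks(hits):
    """Return the pT bins whose equations are satisfied by the hit pattern."""
    hit_set = set(hits)
    fired = []
    for pt_bin, equations in EQUATIONS.items():
        if any(eq <= hit_set for eq in equations):   # all required hits present
            fired.append(pt_bin)
    return fired

# Example: a hit pattern that satisfies the second (inefficiency) 5 GeV/c equation.
print(find_tracks([(1, 5), (2, 7), (4, 10), (2, 3)]))   # -> [5.0]
```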

  14. Drift Chamber Tracking
  DØ's Fiber Tracker gives instant position information; there is no drift time to account for. CDF divides drift-chamber hits into two classes, prompt (drift time < 33 ns) and non-prompt (drift time 33-100 ns), and this timing information is used as further input to their FPGAs for greater discrimination (this is not done on every wire). BaBar also uses FPGAs and LUTs to do track finding, with a 2 ns beam crossing time! (A small sketch of the prompt/non-prompt classification follows below.)
  Related: N14-5 The Central Track Trigger System of the D0 Experiment; N36-55 The D0 Central Tracking Trigger; N36-57 New Trigger Processor Algorithm for a Tracking Detector in a Solenoidal Magnetic Field.
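
A minimal Python sketch of the prompt / non-prompt classification described above. The 33 ns and 100 ns boundaries come from the slide; the two-bit encoding and the example drift times are illustrative assumptions.

```python
# Classify drift-chamber hits by drift time into "prompt" (< 33 ns) and
# "non-prompt" (33-100 ns) classes, i.e. the extra timing bits per wire that
# feed the track-finding FPGAs. The bit encoding here is illustrative.
PROMPT_MAX_NS = 33.0
NONPROMPT_MAX_NS = 100.0

def classify_hit(drift_time_ns):
    """Return (prompt_bit, nonprompt_bit) for one wire's measured drift time."""
    if drift_time_ns < PROMPT_MAX_NS:
        return (1, 0)
    if drift_time_ns <= NONPROMPT_MAX_NS:
        return (0, 1)
    return (0, 0)   # out of time: not used by the trigger

# Example: three wires with different drift times.
for t in (12.0, 60.0, 140.0):
    print(t, "ns ->", classify_hit(t))
```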

  15. Trigger Manager
  Performs simple matching between detectors (track-cal), usually for the first time. The decision logic is always programmable, often part of the run configuration via a human-readable trigger list, and may manage more than one trigger level. Each detector's trigger logic delivers its result with a different latency (e.g. 7, 4 or 9 bunch crossings), so the inputs are synchronized to the same BX before the decision logic, which also applies prescales; on an accept, the front-end crates are notified and the readout data moves to the next level. The manager usually contains scalers, keeping track of trigger live-time for luminosity calculations. (A small bookkeeping sketch follows below.)
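
A minimal Python sketch of the bookkeeping a trigger manager does once the per-detector decisions have been synchronized to the same bunch crossing: apply a programmable prescale to each trigger line and keep per-trigger scalers for the live-time and luminosity accounting. The class and field names are assumptions for illustration, not any experiment's actual trigger-list format.

```python
# Sketch of trigger-manager bookkeeping: programmable prescales per trigger line
# plus per-trigger scalers for live-time/luminosity accounting. Names are illustrative.
class TriggerLine:
    def __init__(self, name, prescale):
        self.name = name
        self.prescale = prescale          # keep 1 out of every `prescale` raw accepts
        self.exposed = 0                  # crossings where the line's condition fired
        self.accepted = 0                 # crossings actually passed on after prescale

    def decide(self, condition_fired):
        """Apply the prescale to one (synchronized) bunch crossing's raw decision."""
        if not condition_fired:
            return False
        self.exposed += 1
        if self.exposed % self.prescale == 0:
            self.accepted += 1
            return True
        return False

class TriggerManager:
    def __init__(self, lines):
        self.lines = lines
        self.crossings_seen = 0           # live-time scaler, in bunch crossings

    def process_crossing(self, raw_decisions):
        """raw_decisions: dict of line name -> bool, already aligned to one crossing."""
        self.crossings_seen += 1
        fired = [l.name for l in self.lines if l.decide(raw_decisions.get(l.name, False))]
        return fired                      # non-empty list => send readout to next level

# Example: a high-rate line prescaled by 100 and an unprescaled physics line.
tm = TriggerManager([TriggerLine("min_bias", 100), TriggerLine("em_track_match", 1)])
print(tm.process_crossing({"min_bias": True, "em_track_match": True}))
```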

  16. Level 2 Architecture
  A hybrid of commodity processors and custom hardware: detector preprocessors (CAL, Si, ...) built from custom hardware with control CPUs feed a Global stage, similar to the L1 Global, running on a CPU. A common platform based on an embedded DEC Alpha CPU and common communication lines is used for Run II. (A toy preprocessor-to-global matching example follows below.)
  Related: N29-4 CDF Pulsar Project (CDF Pulsar board upgrade); N29-5 An Impact Parameter Trigger for the DØ Experiment (DØ L2 "b" boards).
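
To illustrate the preprocessor-to-global pattern (preprocessors reduce each detector to a few physics objects, and the global stage combines them in a commodity CPU), here is a hedged Python sketch of the kind of track-to-calorimeter-cluster match a Level 2 global processor might perform. The object fields, thresholds and matching window are invented for the example.

```python
# Sketch of an L2 "global" step: combine preprocessed objects (calorimeter clusters,
# tracks) and form simple correlations. Thresholds and the matching window are
# illustrative, not any experiment's actual trigger terms.
import math

def delta_phi(a, b):
    """Smallest angular difference between two azimuthal angles (radians)."""
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

def l2_electron_candidates(cal_clusters, tracks,
                           min_et=10.0, min_pt=5.0, max_dphi=0.1, max_deta=0.1):
    """Match CAL clusters above threshold to tracks above threshold in eta-phi."""
    candidates = []
    for cl in cal_clusters:                       # cl: dict with 'et', 'eta', 'phi'
        if cl["et"] < min_et:
            continue
        for tr in tracks:                         # tr: dict with 'pt', 'eta', 'phi'
            if tr["pt"] < min_pt:
                continue
            if (abs(cl["eta"] - tr["eta"]) < max_deta and
                    delta_phi(cl["phi"], tr["phi"]) < max_dphi):
                candidates.append((cl, tr))
    return candidates

clusters = [{"et": 22.0, "eta": 0.4, "phi": 1.2}]
tracks   = [{"pt": 15.0, "eta": 0.43, "phi": 1.17}]
print(len(l2_electron_candidates(clusters, tracks)))   # -> 1 matched candidate
```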

  17. High Level Trigger
  • Move the data from the ROCs (~100 read-out crates, ~100-250 MB/sec)
  • Build the event
  • Process the event in physics software (~100 ms - 1 second/event processing time)
  • Deliver accepted events
  Both experiments use networking technology: ATM with concentrators and SCRAMNet for routing (CDF); Gigabit Ethernet with a central switch and Ethernet for routing (DØ). See the Online/DAQ sessions.

  18. CDF Event Builder
  One of the first to use a switch in the DAQ. Data flows over ATM to the event builder machines only when the manager has a destination; traffic shaping and backpressure are provided by the manager.

  19. DØ High Level Trigger
  A network-based DAQ. Related: N36-71 Dataflow in the DZERO Level 3 Trigger/DAQ System; N36-70 The Linux-based Single Board Computer for Front End Crates in the DZERO DAQ System.

  20. Tevatron Trigger Upgrades
  Peak luminosity matters for the Trigger/DAQ. With decreased Tevatron luminosity expectations the planned trigger upgrades are under review, but both experiments still believe they will be required. [Plot: accelerator draft plan for peak luminosity (x10^30 cm^-2 s^-1) versus fiscal year 2003-2010, rising from today's 4.5x10^31 to ~1.6-2.8x10^32.]
  CDF (all under review): COT readout speedup; L1 track trigger; L2 Si; L2 CPUs; L3 network DAQ upgrade.
  DØ: L1 CAL sliding window (ATLAS algorithm); L1 CAL track match; track stereo info used at L1; L2 CPUs; L2 Si trigger improvements. The planned CAL upgrade in DØ goes from 13 racks of '88 electronics to 3 racks of present-day electronics, with better functionality.

  21. ATLAS & CMS
  O(10) larger: L1 output 75 kHz (75 GB/sec), HLT output ~100 Hz, event size 1 MB. Both experiments use a full network DAQ after L1, with almost no custom hardware after L1, but they solve the data rate problem differently: a big switch (CMS) versus regions of interest (ATLAS). The overall approach is similar.

  22. CMS Trigger Architecture
  CMS uses only the muon and calorimeter systems at L1; the tracking is too complex and large. Muon tracking uses both ASICs and FPGAs. The pipeline is on-detector (3.2 µs), and the system must operate at 100 kHz (50 kHz at startup). (Pictured: CMS eISO card. A toy model of such a pipeline buffer follows below.)
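
The "pipeline" here is essentially an on-detector ring buffer deep enough to cover the fixed L1 latency, so that an accept can still retrieve the data for its crossing. Below is a minimal Python sketch of that idea; the depth of 128 cells corresponds to 3.2 µs at a 25 ns bunch spacing, but the data format and interface are invented.

```python
# Minimal model of an on-detector L1 pipeline: a ring buffer that holds every
# crossing's data for the fixed L1 latency, so an accept can still retrieve it.
class L1Pipeline:
    def __init__(self, depth=128):        # 128 crossings x 25 ns = 3.2 us latency budget
        self.depth = depth
        self.buffer = [None] * depth
        self.crossing = 0

    def store(self, crossing_data):
        """Called every bunch crossing with that crossing's front-end data."""
        self.buffer[self.crossing % self.depth] = crossing_data
        self.crossing += 1

    def readout(self, accepted_crossing):
        """Retrieve data for a crossing the L1 trigger accepted, if still buffered."""
        if self.crossing - accepted_crossing > self.depth:
            raise RuntimeError("L1 decision arrived after the pipeline overwrote the data")
        return self.buffer[accepted_crossing % self.depth]

pipe = L1Pipeline()
for bx in range(200):
    pipe.store({"bx": bx, "adc": bx % 7})
print(pipe.readout(150))                  # L1 accept for crossing 150, still within latency
```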

  23. ATLAS Level 1
  A similar design, with a ~2 µs pipeline and a CAL cluster finder at Level 1. Level 2 is a CPU farm (100-150 dual-CPU machines, ~1 ms/event). Tracking at L2 by a CDF-like XFT has been proposed.

  24. Regions of Interest (ROI)
  • A bandwidth/physics compromise: a farm usually reads out the entire detector, while hardware often looks at a single detector; ROI sits in the middle.
  • The farm CPU requests bits of the detector, using the previous trigger's information to decide which regions of the detector it is interested in. Once the event passes the ROI trigger, a complete readout is requested and triggering commences on the full event.
  • Flexible, but not without problems: one must keep a very close watch on the trigger programming, and farm decisions happen out of event order, so the pipelines must be constructed appropriately.
  • Used by ATLAS, HERA-B, BTeV... (A minimal sketch of the ROI flow follows below.)
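
A minimal Python sketch of the ROI flow described above: the farm CPU uses the L1 objects to request only the matching detector fragments, and only if that partial-data decision passes does it request the full event. The readout interface, region granularity and thresholds are assumptions for illustration.

```python
# Sketch of a Region-of-Interest (ROI) trigger step: use L1 information to read out
# only the interesting regions, confirm the object there, then request full readout.
# The `readout` callable and region keys are illustrative assumptions.
def roi_trigger(l1_objects, readout, confirm, full_readout):
    """
    l1_objects:   list of dicts like {"type": "em", "eta": ..., "phi": ...} from L1
    readout(region):     fetch one region's detector fragments (partial readout)
    confirm(obj, data):  refined algorithm on the partial data -> bool
    full_readout():      fetch the complete event for the next trigger stage
    """
    for obj in l1_objects:
        region = (round(obj["eta"], 1), round(obj["phi"], 1))   # coarse region key
        fragment = readout(region)
        if confirm(obj, fragment):
            return full_readout()        # ROI confirmed: now read the whole detector
    return None                          # reject without ever moving the full event

# Toy usage: every region "contains" an energy; confirm if above a threshold.
fragments = {(0.4, 1.2): {"energy": 31.0}, (-1.0, 2.0): {"energy": 3.0}}
event = roi_trigger(
    l1_objects=[{"type": "em", "eta": 0.41, "phi": 1.21}],
    readout=lambda region: fragments.get(region, {"energy": 0.0}),
    confirm=lambda obj, data: data["energy"] > 20.0,
    full_readout=lambda: {"all_crates": "..."},
)
print(event)
```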

  25. CMS High Level Trigger
  No L2 trigger. One second of buffering in hardware (100K x 1 MB), ~700 readout crates (ROCs), 64 event builders, and 1000 HLT farm nodes. Events are built in two stages through 8 planes of 8x8 switches; Myrinet will be used for the first plane, with Gigabit Ethernet under evaluation.

  26. LHC Posters And Papers
  N29-3 Full Crate Test and Production of the CMS Regional Calorimeter Trigger System; N14-2 The First-Level and High-Level Muon Triggers of the CMS Experiment at CERN; N36-68 Beam Test of the ATLAS End-cap Muon Level1 Trigger System; N29-1 ATLAS Level-1 Calorimeter Trigger: Subsystem Tests of a Jet/Energy-sum Processor Module; N14-2 The ATLAS Liquid Argon Calorimeters Readout System; N29-2 Test Beam Results from the ATLAS LVL1 Muon Barrel Trigger and RPC Readout Slice.

  27. BTeV
  Targets Bs mixing and rare decays. 23 million pixels, read into Level 1! 132 ns bunch spacing, ~100 kB/event at the full trigger rate: 800 GB/sec!!
  Level 1:
  • Hit clustering in the pixels (FPGAs)
  • Cluster linking in inner and outer layers (FPGAs)
  • Track finding in the B field (pT > 0.25) (embedded CPU farm)
  • Hard-scatter vertex finding (embedded CPU farm)
  • DCA test for tracks displaced by B decays (embedded CPU farm); see the sketch after this slide
  8 planes of L1 + L2/L3 (round robin).
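
As a sketch of the last step (the DCA test for displaced tracks), here is an illustrative 2D impact-parameter calculation in Python. The cut value and the track representation are assumptions for the example, not BTeV's actual algorithm or parameters.

```python
# Illustrative 2D distance-of-closest-approach (impact parameter) test: tracks whose
# straight-line extrapolation misses the primary vertex by more than a cut are
# flagged as displaced (B-decay candidates). Cut value and track format are invented.
import math

def dca_2d(point, direction, vertex):
    """Distance of closest approach of the line (point + t*direction) to `vertex`."""
    dx, dy = vertex[0] - point[0], vertex[1] - point[1]
    ux, uy = direction
    norm = math.hypot(ux, uy)
    # magnitude of the cross product of (vertex - point) with the unit direction
    return abs(dx * uy - dy * ux) / norm

def displaced_tracks(tracks, primary_vertex, dca_cut_mm=0.1):
    """Return the tracks whose impact parameter w.r.t. the primary vertex exceeds the cut."""
    return [t for t in tracks
            if dca_2d(t["point"], t["direction"], primary_vertex) > dca_cut_mm]

tracks = [
    {"point": (0.0, 0.0), "direction": (1.0, 0.2)},    # consistent with the vertex
    {"point": (0.0, 0.5), "direction": (1.0, 0.0)},    # offset by 0.5 mm in y
]
print(len(displaced_tracks(tracks, primary_vertex=(0.0, 0.0))))   # -> 1 displaced track
```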

  28. BTeV Trigger Design
  800 GB/sec flows over optical links into the L1 farms and Global L1; accepted crossings go to the L2/L3 farms at 78 kHz, and the output is 3.8 kHz at ~200 MB/sec. Buffering for 300k interactions (~300 ms).

  29. BTeV HLT Algorithm
  A large number of high-end boxes (dual 4 GHz). Uses ROI, similar to ATLAS, but L2 and L3 are interleaved on the same box. A tremendous amount of simulation work; the Trigger/DAQ TDR is just out.
  Related: N29-7 Data Flow Analysis of a Highly Parallel Processor for a Level 1 Pixel Trigger; N36-52 Hash Sorter - Firmware Implementation and an Application for the Fermilab BTeV Level 1 Trigger System; N36-61 Failure Analysis in a Highly Parallel Processor for L1 Triggering; N36-65 Real-Time Embedded System Support for the BTeV Level 1 Muon Trigger; N36-66 Pre-prototype of the BTeV Trigger Level 1 Farm Processing Module.

  30. Others Too much out there to be covered in this talk! • ZEUS has added a CPU farm to its custom Hardware L1/L2 trigger • LHCb uses a 3D torus network topology to move events through its farm quickly (1200 CPUs) • FPGAs come with Ethernet Controllers built-in • The Programmable NIC as a switch

  31. Other HEP Talks and Posters
  N29-6 BaBar Level 1 Drift Chamber Trigger Upgrade; N14-4 The Trigger System of the COMPASS Experiment; N36-59 A First Level Vertex Trigger for the Inner Proportional Chamber of the H1 Detector; N36-60 The Trigger System for the New Silicon Vertex Belle Detector SVD 2.0; N36-62 Rapid 3-D Track Reconstruction with the BaBar Trigger Upgrade; N36-64 The ZEUS Global Tracking Trigger; N36-56 Electronics for Pretrigger on Hadrons with High Transverse Momentum for HERA-B Experiment; N36-69 A Pipeline Timing and Amplitude Digitizing Front-End and Trigger System.

  32. Air Shower Observatories
  • Looking for ultra-high-energy cosmic rays (beyond the GZK cutoff).
  • 1600 Auger surface detectors, 1.5 km between each, with 4 communications concentrators in the array. PLCs, ASICs, PowerPC.
  • Low data rate (1200 bps), low power (solar), 10-second pipeline.
  • When a central cell fires (20 Hz), look at the surrounding rings of detectors to decide (PLD). A toy version of this neighbour test follows below.
  Related: N14-6 First Level Surface Detector Trigger in the Pierre Auger Observatory; N36-54 The Trigger System of the ARGO-YBJ Experiment.
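
A hedged Python sketch of the "central cell fires, look at the surrounding rings" idea on a sparse 1.5 km surface array. The ring radii and required multiplicities below are illustrative, not the observatory's actual compact-configuration rules.

```python
# Toy version of an array-level coincidence decision: when one station fires, count
# fired neighbours in the first and second rings around it. Ring radii and required
# multiplicities are illustrative only.
import math

GRID_SPACING_KM = 1.5

def ring_counts(central, fired_positions, spacing=GRID_SPACING_KM):
    """Count other fired stations in the first (~1 spacing) and second (~2 spacings) rings."""
    first = second = 0
    for pos in fired_positions:
        if pos == central:
            continue
        d = math.dist(central, pos)
        if d < 1.5 * spacing:
            first += 1
        elif d < 2.5 * spacing:
            second += 1
    return first, second

def array_trigger(central, fired_positions, need_first=2, need_second=1):
    """Example condition: enough fired companions in the surrounding rings."""
    first, second = ring_counts(central, fired_positions)
    return first >= need_first and (first + second) >= (need_first + need_second)

fired = [(0.0, 0.0), (1.5, 0.0), (0.75, 1.3), (3.0, 0.0)]   # station coordinates in km
print(array_trigger((0.0, 0.0), fired))                      # -> True
```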

  33. Medical Applications: Positron Emission Tomography (PET)
  Decay detection with BGO scintillators. There is no accelerator clock, so the trigger is a scintillator coincidence; a reduced matching window increases resolution.

  34. PET BGO Signal
  A simple threshold doesn't have good enough timing resolution: different-strength signals have different rise times.

  35. PET
  Fit the signal to a BGO pulse model; the amplitude fit and error matrix are done in an FPGA. This allows a significant reduction in the window width (time bins of 9.2 ns). An illustrative fit of this kind is sketched below.
  Related: N36-58 Development of a High Count Rate Readout System Based on a Fast, Linear Transimpedance Amplifier for X-ray Imaging; N36-63 A New Front-End Electronics Design for Silicon Drift Detector.
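
To illustrate the kind of fit meant here (solve a small linear system for pulse amplitude and arrival time from a handful of samples, which maps naturally onto fixed-point matrix arithmetic in an FPGA), below is a hedged numpy sketch using a linearized template fit. The pulse shape, 9.2 ns sampling and noise level are assumptions, not the actual BGO model from the talk.

```python
# Linearized template fit for pulse amplitude A and arrival-time shift t0:
#   s(t) ~ A*p(t - t0) ~ A*p(t) - A*t0*p'(t)   for small t0,
# a linear least-squares problem in (A, A*t0) that an FPGA can evaluate as a fixed
# matrix-vector product. Pulse shape, sampling and noise here are illustrative.
import numpy as np

DT_NS = 9.2                                   # sample spacing (time bin from the slide)
t = np.arange(16) * DT_NS                     # 16 samples of the waveform

def template(t, rise=20.0, decay=60.0):
    """Toy pulse shape with a finite rise and an exponential decay (not the real BGO model)."""
    return np.where(t > 0, (1 - np.exp(-t / rise)) * np.exp(-t / decay), 0.0)

p = template(t)
dp = np.gradient(p, DT_NS)                    # numerical derivative of the template

def fit_pulse(samples):
    """Least-squares fit of samples to [p, -p']; returns (amplitude, t0 in ns)."""
    basis = np.column_stack([p, -dp])         # columns: p(t), -p'(t)
    coeffs, *_ = np.linalg.lstsq(basis, samples, rcond=None)
    amp, amp_t0 = coeffs
    return amp, amp_t0 / amp

# Simulate a pulse with amplitude 500, shifted by 3 ns, plus a little noise.
rng = np.random.default_rng(0)
true_amp, true_t0 = 500.0, 3.0
samples = true_amp * template(t - true_t0) + rng.normal(0, 2.0, t.size)
print(fit_pulse(samples))                     # ~ (500, 3)
```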

  36. Conclusions
  • In HEP the network rules the day: it enables experiments to get localized data off the detector and into a CPU with a global view, and global trigger decisions have more discriminating power.
  • Technology continues to make hardware look more like software: FPGAs increase in complexity, moving complex algorithms further up the trigger chain, following commodity hardware (CPUs, embedded processors, etc.).
  • Where next? Most are adopting Ethernet or its near relatives, with a few important exceptions.
  A great session with lots of good talks and posters. Enjoy!
