
CMS Trigger Electronics



Presentation Transcript


  1. CMS Trigger Electronics • CMS HCAL TriDAS is a joint effort between: • University of Maryland – Drew Baden • Boston University – Jim Rohlf • Princeton University – Chris Tully and Dan Marlow • University of Minnesota – Jeremy Mans • University of Virginia – CMS application pending… • Maryland Personnel • Physicists • Baden – Level 3 WBS manager for HCAL Trigger/DAQ • Jeremy Mans – with us for FY 2004, just left for a new Asst Prof position at Univ of Minnesota • Engineers: • Dr. Tullio Grassi (CMS project funds) – integration and firmware • Bard (HEP) • Department-subsidized electronics instrumentation group as needed • Students • 1 new RA and 3 undergraduates helped with production this summer Drew Baden

  2. Trigger/DAQ Drew Baden

  3. Scope • HCAL Trigger/DAQ • Data flow: FrontEnd → Receiver/Trigger → Concentrator → Farms • “Receiver/Trigger” onward is our responsibility • aka TriDAS = Trigger/Data Acquisition System in CMS parlance • Ongoing effort for the past 6 years • Luminosity • Produce a signal for the accelerator to tune for CMS • Need something that is not dependent upon the CMS trigger running • Use forward Hadron Calorimeter (HF) • Will also be using this to keep track of CMS sensitivity (“Luminosity database”) • Now working with Dan Marlow (Princeton) and Jeremy Mans (UMN) • HF Trigger upgrade • Investigate scheme for running at SLHC (10^35!) • Optimize for weak boson fusion Higgs production using forward tagged jets Drew Baden

  4. TriDAS Overview • Baden is HCAL “Level 3 WBS Manager” for TriDAS • [Diagram: CMS HCAL “Front End” (QIE) → HCAL data fibers → Trigger/DAQ → Level 1 Trigger → Level 2/3 DAQ] • CMS Trigger: Emphasis is on bandwidth and commercial processors • Level 1 • 3 μs latency inside the L1 trigger • 100 kHz average L1 accept rate (1 in 400) • 100 GByte/sec into Level 2 Drew Baden

  5. CMS: Synchronous Pipeline • 25 ns between collisions • Similar to HERA (H1 and ZEUS) at 96 ns • Different from HERA since integration times can span many buckets • → Real-time filtering to associate energy (from >1 bucket) with the “event” concept • Pipeline will not stop (except for a full system reset) • Every 25 ns, always another entry • Errors must be dealt with in the data flow • Can inhibit L1 accepts if needed, but not L1 queries • Level 1 latency is fixed • Every ~100 buckets, a L1 decision on the event collected ~100 buckets ago • All subsystems must have enough buffering to hold this many events • All subsystems must be able to find the event in their pipeline • Level 2 is a processor farm • Full readout, this is the DAQ path • Event building implemented via a massively parallel commercial structure Drew Baden
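The fixed-latency pipeline described on this slide can be modeled in a few lines: every 25 ns tick a new sample enters a ring buffer, and an L1 accept arriving on a given tick refers to the sample pushed ~100 ticks earlier. This is a minimal sketch with illustrative names and depth, not the actual firmware logic:

```python
from collections import deque

L1_LATENCY = 100          # fixed Level-1 latency in bunch crossings (illustrative)

class Pipeline:
    """Minimal model of a fixed-latency synchronous pipeline.

    The pipeline never stops: one sample enters per 25 ns tick, and the
    L1 decision arriving on a tick refers to the sample pushed
    L1_LATENCY ticks earlier (which subsystems must be able to find)."""
    def __init__(self, depth=L1_LATENCY):
        self.buf = deque(maxlen=depth + 1)   # ring buffer deep enough to span the latency

    def tick(self, sample, l1_accept):
        self.buf.append(sample)              # always another entry, every tick
        if len(self.buf) == self.buf.maxlen and l1_accept:
            return self.buf[0]               # the event collected L1_LATENCY ticks ago
        return None

# usage: the accept on tick 100 reads out the event from tick 0
p = Pipeline()
out = None
for t in range(101):
    out = p.tick(("bx", t), l1_accept=(t == 100))
```

The `maxlen` deque drops the oldest entry automatically, which mirrors why buffering must hold at least the full latency's worth of events.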

  6. HCAL Electronics Overview • FRONT-END: Readout Box (RBX) on the detector, behind the shield wall; analog optical signals from HCAL into HPD, then QIE/CCA/GOL FE modules • Fibers to the READ-OUT crate: 40 bits @ 40 MHz (20 bits @ 80 MHz) = 1.6 Gbps • READ-OUT crate: 1 PC interface (SBS), 1 CLK board, 12 HTRs, 2 DCCs, rack CPU • Trigger primitives to the Level 1 Trigger • DAQ output via S-Link: 64 bits @ 25 MHz • Timing via TTC (CERN transmitter) Drew Baden

  7. HCAL VME Crate • VME Bridge module (CAEN) • Configuration and monitoring over VME • Fanout module • Receives TTC stream • Clones and fans out timing signals • Global HCAL synchronization w/RCT • HCAL Receiver & Trigger (HTR) module • FE-fiber input, linearizers, filters… • Maintains pipeline • TP output via SLBs to RCT • DAQ output of raw/TP data to DCC • Spy over VME for monitoring • Data Concentrator Card (DCC) • Inputs from HTRs • Output to DAQ • Generates busy if needed • Spy output via VME • [Diagram: front-end electronics → 1.6 Gb/s fibers → VME crate (Bridge, Fanout, 12 HTRs, DCC); TTC fiber into the Fanout; 10 m copper @ 1.2 Gb/s to the Calorimeter Regional Trigger; DCC output to DAQ] Drew Baden

  8. HTR Principal Functions • Receive HCAL data from front-ends • Synchronize optical links • Data validation and linearization • Form “trigger primitives” and transmit to Level 1 at 40 MHz • Pipeline data, wait for Level 1 accept • Upon receiving L1A: • Zero suppress, format, & transmit raw data to the concentrator (no filtering) • Transmit all trigger primitives along with raw data • Handle DAQ synchronization issues (if any) • Calibration processing and buffering of: • Radioactive source calibration data • Laser/LED calibration data • Support VME data-spy monitoring Drew Baden

  9. HTR Board Description • Digital fiber data from front-end (FE) • 16 fibers per HTR, 3 channels/fiber, 1.6 Gbaud link • Bit Error Rate (BER) requirement < 1 in 10^12 bits • We are shooting for < 1 in 10^15 bits, with optimism • We have had to become experts in synchronous high-speed serial data transmission • Total data rate 160 MByte/s per link = 2.56 GByte/s per board • 220 HTR boards running, total bandwidth into our system is 563 GByte/s • Data is stored in a circular buffer (pipeline) • Data is simultaneously sent to Level 1 for consideration • Trigger primitives are prepared, associated with a crossing, and transmitted via daughterboards • Level 1 latency ~ 100 clock ticks, sets the size of the pipeline • Level 1 accept 1 in 400 (on average) • Data from trigger ±3 time slots is transmitted to DCC for DAQ after zero suppression • All logic and arithmetic implemented in FPGAs • Field Programmable Gate Array, firmware written by Tullio Grassi Drew Baden
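The quoted rates cross-check neatly: a 1.6 Gbaud serial link carries 160 MByte/s of payload if one assumes the usual 8b/10b encoding (10 baud per payload byte, an assumption of ours; the slide only states the rates), and the per-board and system totals follow:

```python
# Back-of-the-envelope check of the HTR bandwidth figures quoted above.
# The 8b/10b-encoding assumption (10 baud per payload byte) is ours.
LINK_BAUD = 1.6e9                 # 1.6 Gbaud per fiber link
LINKS_PER_BOARD = 16              # 16 fibers per HTR
BOARDS = 220                      # HTR boards in the full system

link_rate = LINK_BAUD / 10                  # payload bytes/s -> 160 MByte/s
board_rate = LINKS_PER_BOARD * link_rate    # 2.56 GByte/s per board
system_rate = BOARDS * board_rate           # ~563 GByte/s into the system
```

So the slide's 563 GByte/s system figure is simply 220 boards × 2.56 GByte/s, rounded down.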

  10. HCAL Trigger/Readout (HTR) Board • All I/O on the front panel: fiber digital data in, copper output to L1 and DCC • FPGA logic, fully programmable • [Block diagram: serial optical data → LC receivers → 8 deserializers (recovered clock, 80 MHz reference clock) → async FIFO → Xilinx FPGA; TPG path through 6 SLBs (SYS40/SYS80 clocks); TTC mezzanine (TTCrx, crystal, PLL) providing the TTC 40 MHz clock ×2, RX_BC0/RX_CLK40, and TTC broadcast; one Princeton Fanout Card per VME crate] Drew Baden

  11. HTR Card, Production Version (Rev 4) • [Photo callouts: dual-LC O-to-E receivers, VME stiffeners, TTC mezzanine, deserializers, 6 SLBs, Xilinx XC2V3000-4] Drew Baden

  12. HTR Status • Production goal • 270 Rev 4 HTRs total • Be “Ready for Crates” this fall/winter • CMS has postponed this until Spring 06 • Current status: http://www.physics.umd.edu/hep/HTR/Rev4/checkout_HTRev4.html • PCB manufacture complete, assembly underway • Vendor produces about 20/week • Some parts procurement issues, nothing major • Checkout at Maryland, shipping to CERN • Currently about 200 boards passed checkout, shipped to CERN • Will finish the remainder by Nov 1 this year • We'll have plenty of HTRs to meet near-term work needs for HCAL Drew Baden

  13. HTR Firmware • Tullio Grassi is the sole author • Very mature, well simulated, battle-tested in numerous HCAL test stands and test beams • Items left to be done: • Summing • HB/HE overlap and HF 2x3 (η×φ) • We need the mapping to start this • Zero suppression for DAQ path (to get to 15% occupancy) • Current scheme: put a threshold on the TPG associated with the crossing • Need MC input on how to do this right • Histogram firmware for HCAL sourcing done • New scope introduced in 2003, written and supported by Baden Drew Baden
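The zero-suppression scheme described above (keep a channel's raw time slices only if the trigger primitive associated with the triggered crossing passes a threshold) can be sketched as follows; the channel names, data layout, and threshold value are illustrative, not taken from the actual firmware:

```python
# Hedged sketch of TPG-threshold zero suppression for the DAQ path.
# All names and the threshold value are illustrative assumptions.
TPG_THRESHOLD = 2   # threshold in TPG counts (illustrative)

def zero_suppress(channels):
    """channels: dict mapping channel id -> (tpg, [raw time slices]).

    A channel's raw data survives only if the TPG for the triggered
    crossing is at or above threshold; everything else is dropped,
    which is how the occupancy shipped to the DCC gets reduced."""
    return {ch: raw for ch, (tpg, raw) in channels.items() if tpg >= TPG_THRESHOLD}

kept = zero_suppress({
    "HB-01": (5, [1, 4, 9, 3]),   # above threshold: raw data kept for the DCC
    "HB-02": (0, [0, 1, 0, 0]),   # below threshold: suppressed
})
```

Tuning `TPG_THRESHOLD` is exactly where the slide's "need MC input" comes in: the threshold sets the residual occupancy.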

  14. Firmware • DAQ format evolving • Maryland/Boston/Princeton collaboration • Top-level view: • See http://www.physics.umd.edu/hep/HTR/preprod/PreProdMainFPGA.pdf Drew Baden

  15. LHC Clocking • LEP ring is sensitive to: • Distortions in the large (27 km) circumference • Tidal distortions • Pressure from Lake Geneva • Return currents from DC trains running nearby (the train to Bellegarde) • LHC RF clock keeps 3564 buckets of protons circulating • CMS must remain synchronous with this clock • LEP was concerned about ΔE ~ few MeV; LHC will be concerned with Δf ~ 25 ppm • We have learned to handle this… Drew Baden

  16. Timing Signal Distribution • Trigger Timing Control (TTC) stream (“RX_CLK”) into a global Fanout • Rack-to-rack distribution over CAT 7 to crate Fanouts (ECAL and HCAL VME crates), each feeding its HTRs and DCC • Timing is critical in a synchronous pipeline experiment! Drew Baden

  17. Fanout board • 2 operating modes: Global or Crate • TTC fiber into TTCrx (TTC broadcast); 40 MHz QPLL can run stand-alone • 18 outputs: RX_CLK = 40 MHz, Clk80, BC0 • FPGA selects INT_BC0 (Global mode) or delayed EXT_BC0 (Crate mode, inputs from the Global Fanout: RX_CLK = 40 MHz, RX_BC0, EXT 80 MHz) Drew Baden

  18. Luminosity Drew Baden

  19. HF Luminosity • HF: iron/fiber calorimeter, 3 < η < 5 • Clients: • LHC accelerator needs something to tune on that is independent of whether CMS is running triggers • We need something to put up on a screen in the control room to tell us our luminosity • We need to keep track of our sensitivity in order to be able to measure cross sections • Schemes (there are several…) • Use HF detector, count min-bias overlaps • Optimize for linearity with respect to luminosity Drew Baden
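One standard way to "count min-bias overlaps" is zero counting: if the number of interactions per crossing is Poisson-distributed, then P(0) = exp(−μ), so the mean μ (which is proportional to luminosity) follows from the fraction of empty crossings. This is a sketch of that general technique under a Poisson assumption; the actual CMS/HF algorithm may differ in detail, and the numbers below are illustrative:

```python
import math

def mean_interactions(n_crossings, n_empty):
    """Zero-counting estimate of the mean number of min-bias
    interactions per bunch crossing.

    Poisson statistics give P(0) = exp(-mu), so
    mu = -ln(fraction of empty crossings).  mu is proportional to
    the instantaneous luminosity, and linearity holds as long as
    the empty fraction is measured well (i.e. mu is not too large)."""
    f_empty = n_empty / n_crossings
    return -math.log(f_empty)

# illustrative numbers: an empty fraction of ~e^-2 implies mu ~ 2
mu = mean_interactions(1_000_000, 135_335)
```

The "optimize for linearity" bullet is the flip side of this: at very high μ almost no crossings are empty, and a pure zero count loses sensitivity.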

  20. HF Luminosity Readout Path • 9 HTRs/VME crate • 9 HTRs for HF+ and HF− • Each HTR has 1 output with luminosity info • 100 Mbps raw ethernet packets sent to a router • Router to computer over Gigabit ethernet • Dead time, throttle, etc. info from the GCT sent to the CPU • This computer will feed the LHC, luminosity DB, and other luminosity consumers • [Diagram: HF+ and HF− HTRs → router → CPU; Global Trigger input; outputs to luminosity consumers] • Maryland/Princeton/Virginia Drew Baden

  21. HF Luminosity R&D • Overall responsibility of Dan Marlow… • Additional mezzanine card to sit on an SLB site • Built and tested at Maryland • Firmware makes occupancy and sumET histograms in real time • Joint Maryland (Baden) and Minnesota (Mans) • Send histograms via ethernet to CPU • WORKING!!! • Crunch data on CPU, fan out to LHC, database, etc • Virginia (Hirosky) • Combine with live time/trigger scalers for luminosity database • Princeton (Marlow) • Goal • Have all of the hardware/firmware working in SX5 by summer 06 • Spend the remainder of the time integrating into LHC (easy) and CMS (not easy) Drew Baden

  22. Self-Triggering/ 2006 Slice Tests Drew Baden

  23. 2006 Tests • Magnetic field mapping • Cosmic tests with HCAL and MUON operating synchronously • Operation of a “vertical slice” of CMS • HCAL+ECAL+MUON+“mini tracker”+Solenoid • Add trigger and DAQ and other central systems • HCAL • We will begin preparations for this by implementing a “self-triggering” capability this fall • Joint effort by Jeremy Mans/Drew Baden/Tullio Grassi • Allows us to understand muons in our detector NOW • Uses the same “luminosity” mezzanine card and a majority logic board Drew Baden

  24. “Vertical Slice” • DTs in sector 10 of YB+2, YB+1 and CSCs of YE+1 lower 60° sector provide the principal triggers • TK: dummy tube with alignment disk & cables (elements in the dummy tube to be defined) • HB+ active sectors • EB supermodule(s) Drew Baden

  25. DAQ Software Drew Baden

  26. DAQ Software • In the last year, major work has been completed on the HCAL DAQ and controls software to prepare for operations in 2007. • Updated to new revision of framework (XDAQ) • Connections to the online database • Firmware management and validation • Improved debugging using a web-interface • HCAL DAQ Software is very well advanced and ready for the cosmic challenge • Jeremy Mans (now at UMN) is responsible for 99% of this! Drew Baden

  27. LHC Upgrade/HF Jet Trigger Drew Baden

  28. LHC Upgrade • Add functionality to HCAL • W Boson Fusion (WBF) dominant experimentally accessible rate • Forward jets + central Higgs decay • Tag jets are in HF+HE so HE will need to be included • Higgs id without a tag is very hard • Gluon fusion backgrounds are too high, esp. at 10^35 • Current trigger at high luminosity will be difficult • Depends on the scheme for increasing luminosity, of course… • [Figure: tagged “forward” jets vs η; C. Tully & H. Pi, JetMetPRS Aug 2004] Drew Baden

  29. HF Jet Trigger • Topology: • Each HF HTR receives 12η × 4φ towers • Need 9 HTRs per side for the long fibers • Current trigger gangs 3η × 2φ for jet clustering • New design: • Implement a 4x4 sliding window at the tower level • Add jet isolation for triggering on the “tagged” jet in WBF • Implement in 1 9U VME board, using FPGAs for the DSP’ing Drew Baden
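The 4x4 sliding window amounts to summing every 4x4 block of the η×φ tower grid, with wrap-around in φ (the grid size, energies, and the absence of local-maximum/isolation logic below are illustrative simplifications of what the FPGA design would do):

```python
def sliding_window_jets(towers, window=4):
    """Sum tower energies in every window x window region of an
    eta x phi grid.  Eta does not wrap; phi wraps around.  A real
    trigger would add local-maximum and isolation requirements on
    top of these window sums before sorting jet candidates."""
    n_eta, n_phi = len(towers), len(towers[0])
    sums = []
    for ie in range(n_eta - window + 1):          # eta window positions
        row = []
        for ip in range(n_phi):                   # phi wraps around
            e = sum(towers[ie + je][(ip + jp) % n_phi]
                    for je in range(window) for jp in range(window))
            row.append(e)
        sums.append(row)
    return sums

# illustrative 12 eta x 18 phi grid with a single hot tower
towers = [[0.0] * 18 for _ in range(12)]
towers[5][7] = 50.0
jets = sliding_window_jets(towers)
```

The sliding window removes the boundary bias of the fixed 3η × 2φ ganging: a jet landing between fixed regions is still fully contained in some window position.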

  30. Who, What, When… • HTR Mezzanine card (Maryland) • Transmits data to jet clustering card • Jet Clustering card (Minnesota) • Forms jets, sorts, transmits to calorimeter trigger boards • Simulation and design and…. • Will eventually include Virginia, Princeton, Wisconsin…and other CMS collaborators • Goal • Have something that is working parasitically (at least) during maiden CMS run • Show proof of concept of a design scalable to rest of HCAL and all of ECAL • This should be the prototype of the upgrade to the CMS calorimeter trigger Drew Baden

  31. Conclusions Drew Baden

  32. Schedule of Activities at UMD • 2006 Testbeam • Run with a real ECAL in front – this is our chance for a realistic calibration • Continuing tests of HTR firmware and links • Spring 2006 “Slice tests” • “Cosmic Challenge” • Test the Level 1 trigger path with a “slice” of the trigger using cosmics • Integrate with MUON, ECAL, Tracker • First CMS synchronization • First tests of a “slice” of the DAQ • All HCAL TriDAS production cards involved • Finish Production • Spring 06 beneficial occupancy of USC • Installation of all racks, crates, and cards • Firmware / software / timing / troubleshooting • Continue HF Luminosity development • Continue HF WBF Trigger project with Jeremy Mans Drew Baden

  33. Overall TriDAS Project Cost • Contingency: • Effort: 50% • M&S: 75% • Based on the uncertainty in the requirements, which will certainly change over time. Drew Baden

  34. HCAL TriDAS Summary • Project • ~6% of total HCAL project • Finish up the construction phase now, move to the M&O phase • Production finished by Nov, then Installation/Integration next • TriDAS Progress at UMD • HTR card design successful • Including fab/assembly challenges • Current manpower has proved to be adequate to the task • Integration engineer (Tullio Grassi) has proved to be invaluable to the USCMS/HCAL effort • Hopefully, we can convince him to continue on M&O at CERN, though no longer as a UMD member • Synchronization and clock issue progress • Successful tests at CERN in testbeams, integration tests with the calorimeter trigger, etc • HCAL is far ahead of most subsystems; we will be contributing a lot of expertise now • FPGA programming successful (thanks to Tullio – big responsibility, well done) • Firmware coded/simulated long before the hardware implementation was ready • This allowed us to be ready as soon as the hardware arrived! • Conclusion: HCAL TriDAS project in very good shape all around • We have worked like dogs for 5 years!!!! (and loved [almost] every minute of it!) • HF Luminosity and Trigger projects will keep us busy Drew Baden
