
ALICE DAQ Progress Report Comprehensive Review IV P. Vande Vyvre – CERN/PH



  1. ALICE DAQ Progress Report, Comprehensive Review IV. P. Vande Vyvre – CERN/PH

  2. ALICE DAQ Institutes, Responsibilities, Milestones DAQ Architecture DDL and D-RORC MOOD (Data quality monitoring) DAQ Fabric Data Challenge V Installation, test and commissioning

  3. ALICE DAQ Institutes, Responsibilities, Milestones DAQ Architecture DDL and D-RORC MOOD (Data quality monitoring) DAQ Fabric Data Challenge V Installation, test and commissioning

  4. Institutes & Responsibilities • Birmingham • TRG/DAQ simulation • RORC receiver cards • KFKI-Budapest : • DDL: optical links and RORC receiver cards • Radiation tolerance tests (with Technical University/Budapest and Institute of Nuclear Research (ATOMKI/Debrecen) • CERN: • DATE: DAQ software framework • DAQ fabric • Zagreb: • TRG/DAQ simulation • AFFAIR: performance monitoring package • Split: • TRG/DAQ simulation • Storage • Collaborating institutes: Istanbul (Data quality monitoring)

  5. Milestones and outcome of CR3 • LHCC milestones: • Detector readout with DDL: January 2003 • TDR preparation status: March 2003 • Common TDR submission to LHCC: December 2003 • DDL pre-production for detector test and commissioning: 1Q 2004 • D-RORC pre-production for detector test and commissioning: 1Q 2004 • DAQ reference system in DAQ lab: 4Q 2004 • DAQ systems for surface tests and commissioning: • SXL2 (mounting hall on the surface of Point 2): 1Q 2005 • Si lab (ITS surface test): 2005 • Final DAQ milestones: • Jan 2006: final DAQ system ready with all functionalities, 20% of final performance • Nov 2006: 30% for pp and first HI run • Oct 2008: 100% for second HI run (needs and budget)

  6. ALICE DAQ Institutes, Responsibilities, Milestones DAQ Architecture DDL and D-RORC MOOD (Data quality monitoring) DAQ Fabric Data Challenge V Installation, test and commissioning

  7. Physics requirements
     Pb-Pb beam:
       Trigger        Rate      Max. event size
       Central          20 Hz   86.0 MB
       Minimum bias     20 Hz   20.0 MB
       Dimuon         1600 Hz    0.5 MB
       Dielectron      200 Hz    9.0 MB
     pp beam:
       Minimum bias    100 Hz    2.5 MB
     [Dataflow diagram: detector front-end buffers -> Trigger Levels 0/1 -> readout buffers -> Trigger Level 2 -> Detector Data Link (DDL, 25 GB/s) -> Local Data Concentrators (LDC) -> Event-Building Network (2.50 GB/s) -> Global Data Collectors (GDC) -> storage network (1.25 GB/s) -> Permanent Data Storage (PDS); the High-Level Trigger taps the LDC stream. Running modes: A: DAQ; B: DAQ + HLT analysis; C: DAQ + HLT trigger.]
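To make the table above concrete, the following back-of-envelope sketch (plain C++, not ALICE software) multiplies each quoted trigger rate by its maximum event size to get a per-class worst-case input bandwidth. Note that the 25, 2.50 and 1.25 GB/s figures in the dataflow come from the full trigger/HLT chain and are not reproduced by this simple product.

    // Per-class Pb-Pb input bandwidth = trigger rate x max. event size.
    // Values are copied from the table above; the rest is plain arithmetic.
    #include <cstdio>

    struct TriggerClass { const char* name; double rateHz; double maxEventMB; };

    int main() {
        const TriggerClass pbpb[] = {
            {"Central",     20.0, 86.0},
            {"Min. bias",   20.0, 20.0},
            {"Dimuon",    1600.0,  0.5},
            {"Dielectron", 200.0,  9.0},
        };
        double totalMBps = 0.0;
        for (const auto& t : pbpb) {
            const double bw = t.rateHz * t.maxEventMB;   // MB/s for this class
            std::printf("%-11s %7.1f Hz x %5.1f MB = %8.1f MB/s\n",
                        t.name, t.rateHz, t.maxEventMB, bw);
            totalMBps += bw;
        }
        std::printf("Sum of worst-case products: %.2f GB/s\n", totalMBps / 1000.0);
        return 0;
    }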

  8. DAQ architecture
     [Architecture diagram: FERO front-end electronics feed 262 detector DDLs into 329 D-RORCs hosted by 175 detector LDCs; 10 further DDLs/D-RORCs serve 10 HLT LDCs, and 123 DDLs connect the HLT farm (H-RORC, FEP). The CTP distributes L0, L1a and L2 through the LTUs and TTC, with BUSY back-pressure and rare/all trigger flagging. Event fragments become sub-events on the LDCs and are load-balanced (EDM) over the Event Building Network to 50 GDCs, then written through the storage network to 25 TDS units and on to the PDS as event files; 5 DSS machines provide DAQ services.]

  9. Key building blocks and concepts • Protocol-less push-down strategy • System throttling by X-on/X-Off signals • Detector interface via a standard link (DDL) • Readout • Control and download • Software framework (DATE) • Dataflow (data driven according to TRG generated event tags) • Control (FSM and messages) • Monitoring • Distributed DAQ Fabric • LDC • GDC • DSS
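The X-on/X-off throttling named above can be pictured as a buffer with two water marks: X-off is asserted upstream when occupancy passes the high mark, and X-on is re-asserted once it drains below the low mark. The class below is only an illustrative C++ sketch of that idea; all names are hypothetical, and it is not DATE or DDL firmware code.

    // Illustrative X-on/X-off throttling with hysteresis (hypothetical names).
    #include <cstddef>

    class ThrottledBuffer {
    public:
        ThrottledBuffer(std::size_t high, std::size_t low)
            : high_(high), low_(low) {}

        // New event fragment arrives; a well-behaved sender honours X-off,
        // so push() refuses data while X-off is asserted.
        bool push(std::size_t bytes) {
            if (!xon_) return false;
            used_ += bytes;
            if (used_ >= high_) xon_ = false;   // assert X-off upstream
            return true;
        }

        // Readout drains data towards the event builder.
        void pop(std::size_t bytes) {
            used_ = (bytes > used_) ? 0 : used_ - bytes;
            if (used_ <= low_) xon_ = true;     // re-assert X-on
        }

        bool xon() const { return xon_; }       // state driven onto the link

    private:
        std::size_t high_, low_;
        std::size_t used_ = 0;
        bool xon_ = true;
    };

    int main() {
        ThrottledBuffer b(1000, 200);           // high/low water marks, bytes
        b.push(600); b.push(600);               // second push trips X-off
        b.pop(1100);                            // drained below low mark: X-on
        return b.xon() ? 0 : 1;
    }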

  10. ALICE DAQ Institutes, Responsibilities, Milestones DAQ Architecture DDL and D-RORC MOOD (Data quality monitoring) DAQ Fabric Data Challenge V Installation, test and commissioning

  11. D-RORC Hardware Architecture
     [Block diagram: APEX FPGA with configuration flash and JTAG configuration; CMC interface, connectors P11–P14; two media interfaces (250 MB/s each), one optical; BUSY interface (LVDS); 64-bit/66 MHz PCI/PCI-X host interface, 528 MB/s.]

  12. D-RORC Hardware • D-RORC with plug-in DIU • Read out single DDL channel • Detector integration • D-RORC with integrated DIU ports • 2 DDL channels • Integration with the HLT system

  13. D-RORC and DIU • Testing the integration of the D-RORC and the DIU using the new library (v4.2) • rorc_receive –g 3 ... (internal loopback) • rorc_receive –g 1 ... (DIU loopback) • Test host: PC with single-channel D-RORC, CPU: 2 x Xeon 2400 MHz, kernel: 2.4.20-30.7

  14. D-RORC and DATE • 2 D-RORC cards • Same PCI bus • Data generation: • Front-end emulator cards • Internal data generator

  15. D-RORC and DATE • [Throughput plot: 1 week, 220 x 10^6 events, 98 MB/s sustained]
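As a sanity check of the figures above (220 x 10^6 events in one week at a sustained 98 MB/s), the snippet below derives the implied average event rate (~364 Hz) and average event size (~270 kB). Plain arithmetic only, using just the numbers quoted on the slide.

    // Sanity check of the quoted D-RORC/DATE run figures.
    #include <cstdio>

    int main() {
        const double events  = 220e6;
        const double seconds = 7 * 24 * 3600.0;      // one week
        const double mbps    = 98.0;                 // sustained MB/s
        std::printf("average rate : %.0f Hz\n", events / seconds);
        std::printf("average size : %.0f kB/event\n",
                    mbps * 1e6 * seconds / events / 1e3);
        return 0;
    }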

  16. Data splitter for DAQ/HLT interface • [Setup diagram: an FEE emulator feeds a PC with a twin-channel D-RORC (Intel P3 800 MHz, kernel 2.4.20-24.7, rorc_receive) acting as splitter between the DAQ (detector) branch and the HLT branch, which ends in a PC with a single-channel D-RORC (2 x Xeon 2400 MHz, kernel 2.4.20-30.7, rorc_receive)]

  17. ALICE DAQ Institutes, Responsibilities, Milestones DAQ Architecture DDL and D-RORC MOOD (Data quality monitoring) DAQ Fabric Data Challenge V Installation, test and commissioning

  18. MOOD • Monitoring Of Online Data • Detector Debugger (raw data visualizer) • Written in C/C++ • Based on ROOT • Based on DATE • Data quality monitoring framework for all ALICE detectors
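Since MOOD is based on ROOT and DATE, its basic pattern can be sketched as: pull raw events from the online stream and fill ROOT histograms for the shifter. In the sketch below, nextEvent() is a hypothetical stand-in for the DATE monitoring interface (here it fabricates payloads so the example is self-contained); only the ROOT calls (TH1F, TCanvas) are real API.

    // Minimal sketch of the MOOD pattern: fill ROOT histograms from a
    // stream of raw events. nextEvent() is a hypothetical event source.
    #include <TH1F.h>
    #include <TCanvas.h>
    #include <cstdint>
    #include <vector>

    struct RawEvent { std::vector<std::uint8_t> payload; };

    bool nextEvent(RawEvent& ev) {              // hypothetical event source
        static int n = 0;
        if (++n > 1000) return false;
        ev.payload.assign(1000 + 500 * (n % 100), 0);
        return true;
    }

    void monitorEventSize() {
        TH1F hSize("hSize", "Event size;bytes;events", 100, 0., 60000.);
        RawEvent ev;
        while (nextEvent(ev))
            hSize.Fill(static_cast<double>(ev.payload.size()));
        TCanvas c("c", "MOOD sketch");
        hSize.DrawCopy();                       // copy outlives the local histo
        c.SaveAs("event_size.png");
    }

    int main() { monitorEventSize(); return 0; }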

  19. MOOD: Current Detector Scheme • Detectors currently implemented: • ITS – SDD • TPC sector • HMPID photocathode • Detectors to be implemented: • All detectors individually • Test setups • ALICE as a whole

  20. TPC Sector Test • [Screenshot: tool bar, pad plane, 3D view, charge plane, charge plane 3D, event-size distribution]

  21. HMPID Photocathode (1) • Tabs • Photocathode • 3D View • Size Distribution • Event Dump • Logs • …

  22. HMPID Photocathode (2) • Tabs • Photocathode • 3D View • Size Distribution • Event Dump • Logs • …

  23. ITS - SDD (1) • Tabs • SDD Display • 3D View • Size Distribution • Event Dump • Logs • …

  24. ITS - SDD (2) • Tabs • SDD Display • 3D View • Size Distribution • Event Dump • Logs • …

  25. ALICE DAQ Institutes, Responsibilities, Milestones DAQ Architecture DDL and D-RORC MOOD (Data quality monitoring) DAQ Fabric Data Challenge V Installation, test and commissioning

  26. DAQ Reference System • 1 KVM switch • 4 LDCs (Local Data Concentrators): rackmount PCs of 1U, 2U and 4U height, equipped with 6 D-RORCs and 6 DDLs • 2 GDCs (Global Data Collectors): rackmount PCs, 1U height, equipped with FC cards • 1 FC switch • 2 TDS (Transient Data Storage): rackmount disk arrays, 2U height, IDE and FC disks • standard LAN • 1 GE switch • 1 DSS (DAQ Services Server): rackmount PC, 4U height, hot-swap SCSI disks

  27. Reference Setup – Front • DAQ lab • Detect and address integration issues • System for development and support • L3 rack • 2x LDCs: 4U height, dual Xeon, 2x D-RORC cards, DDL • 2x GB Ethernet switches: 3COM SuperStack 3 • KVM switch: Raritan Paragon UMT2161 • 2x GDCs: 1U height, dual Xeon, QLogic QLA2310F cards • Fibre Channel switch: Brocade SilkWorm 3800 • 2x disk arrays: Infortrend IFT-6330, DotHill SANnet II • DSS: 4U height, quad Xeon, 3x 36 GB SCSI disks

  28. Reference Setup – Rear • L3 rack • Cat5 cables: Ethernet and KVM, RJ45 connectors • Mounting rails (!) • Power distributor • Optical cables: DDL and Fibre Channel, 2 Gbit/s multimode, LC-LC connectors

  29. ALICE DAQ Institutes, Responsibilities, Milestones DAQ Architecture DDL and D-RORC MOOD (Data quality monitoring) DAQ Fabric Data Challenge V Installation, test and commissioning

  30. ADC V Hw Architecture • 32 IA64 HP rx2600 servers: 2 x 1 GHz Itanium-2, 2 GB RAM, Broadcom NetXtreme BCM5701 (tg3), RedHat Advanced Workstation 2.1, 6.4 GB/s to memory, 4.0 GB/s to I/O • ~80 CPU servers: 2 x 2.4 GHz Xeon, 1 GB RAM, Intel 8254EM Gigabit in PCI-X 133 (Intel PRO/1000), CERN Linux 7.3.3 • 4 x 7 disk servers: 2 x 2.0 GHz Xeon, 1 GB RAM, Intel 82544GC • 10 tape servers (4 x GE) • Network: 3COM 4900 edge switches (GE ports, 10GE uplinks), Enterasys E1 OAS (12 x Gbit, 1 x 10 Gbit), Enterasys ER16 backbone (16 slots, 4/8 x Gbit or 1 x 10 Gbit per slot)

  31. Achievements (1) • System size (limited by lack of resources in the LCG testbed) • System scalability • Performance test with ALICE data traffic • ALICE-like traffic • ALICE-like simulated event data used: realistic (sub-)event sizes on tape (ALICE year 1) • DATE load balancing demonstrated and used • Sustained bandwidth to tape not achieved: peak 350 MB/s, sustained 280 MB/s over 1 day • Reached production-quality level only in the last week of the test • IA-64 machines from Openlab successfully integrated in ADC V • Simulated raw data used for the performance test • Data read back from CASTOR and verified
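The DATE load balancing mentioned above can be illustrated by a least-loaded choice among GDCs, as in this C++ sketch. It shows the general idea only; it is not the actual DATE algorithm, and all names are hypothetical.

    // Illustrative event-builder load balancing: assign each new event to
    // the GDC with the fewest bytes in flight.
    #include <algorithm>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Gdc { std::size_t bytesInFlight = 0; };

    std::size_t pickGdc(const std::vector<Gdc>& gdcs) {
        return std::min_element(gdcs.begin(), gdcs.end(),
                   [](const Gdc& a, const Gdc& b) {
                       return a.bytesInFlight < b.bytesInFlight;
                   }) - gdcs.begin();
    }

    int main() {
        std::vector<Gdc> gdcs(4);
        gdcs[0].bytesInFlight = 500;
        gdcs[1].bytesInFlight = 100;            // least loaded
        gdcs[2].bytesInFlight = 900;
        gdcs[3].bytesInFlight = 300;
        std::printf("next event -> GDC %zu\n", pickGdc(gdcs));
        return 0;
    }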

  32. ALICE DC: Scalability

  33. ALICE DC – MSS Bw

  34. Achievements (2) • Network • LDCs and GDCs: stable and scalable, including trunking • Between GDCs and disk servers: • Unreliable • Trunking not scaling as expected • Enterasys module broken and replaced twice • 10 Gbit Ethernet backbone • New generation of NIC cards (Intel PRO/1000) • Broadcom NICs unreliable; replaced by Intel PRO/1000 • Storage • Hardware problems on the disk servers • Several last-minute workarounds needed (scripts for monitoring and reconfiguring)

  35. Performance Goals (1) • [Chart: 650 MB/s]

  36. Performance Goals (2) • [Chart: MB/s to mass storage, 300 MB/s]

  37. Open issues and future goals • CASTOR: • Recovery from a malfunctioning disk server • New stager • Special daemon between CPU and disk servers instead of the standard RFIO daemon, needed to achieve adequate performance; should be merged back into the main development line • DAQ: • Increase performance • Network: • First prototypes of 10 Gbit Ethernet equipment from Enterasys unreliable • Enterasys support not effective in this case • Meeting scheduled with the LCG PEB to present results and address the open issues

  38. ALICE DAQ Institutes, Responsibilities, Milestones DAQ Architecture DDL and D-RORC MOOD (Data quality monitoring) DAQ Fabric Data Challenge V Installation, test and commissioning

  39. DAQ Commissioning (1) • What: • The DAQ itself • How the DAQ will help the commissioning of other systems • Where: • ACR: ALICE Control Room • PX24-CR1: ALICE DAQ counting room located in the access shaft • SXL2: mounting hall on the surface of Point 2 • UX25: experimental underground area • When: • 1Q 2005: all DAQ functionalities, hardware at the final location, 20 DDLs for the readout of 2 TPC sectors in SXL2 + other detectors • Jan 2006: final DAQ system ready with all functionalities, 20% of performance • Nov 2006: 30% of performance • Oct 2008: 100% of performance (needs and budget)

  40. DAQ Commissioning (2) • Tests at the construction site of hardware elements • Verification of the DDLs and D-RORCs • Test of the cards with a test station made of a PC and the DDL test software • Tests at the development sites of software elements: AFFAIR, DATE, DDL software, CASTOR • Combined tests well before installation: test beams, Data Challenges • Standalone tests in the experimental area • DAQ: possibility of injecting data at every stage of the dataflow • Each segment of the dataflow first tested in isolation and then in combination with the other elements • DAQ integration at Point 2: • Integration with ECS • Tests with TRG, HLT, DCS • Detector test and commissioning • Tests with cosmic and pulser triggers • From June 2005: a TTC-based trigger to drive autonomous DDL data sources for global tests of the DAQ involving DDLs • From January 2006: a cosmic or pulser trigger for the commissioning of detectors involving Trigger and DAQ

  41. DAQ Tools for Detector Commissioning • [Diagram: DDL Simulator, a standalone daughter-card, plugged into the detector readout card under test] • Lab test • DDL Simulator: standalone daughter-card • Detector readout with DDL and DATE • Beam test • Detector readout by DDL and DATE • VME for trigger and older electronics (e.g. silicon telescope) • Point 2 • DAQ system at Point 2 in 2004 • Detector test and commissioning • System will evolve in size and performance according to the needs • Concurrent tests of several detectors (~3 in 2004, all in 2006) • Complete capabilities from the start • Control from the ACR or any computer • Other tests • ITS integrated test

  42. DAQ Tools for Detector Commissioning • [Diagram: input data from a pattern generator fed through the SIU (DDL) over optical fibre to the LDC] • Lab test • DDL Simulator: standalone daughter-card • Detector readout with DDL and DATE • Beam test • Detector readout by DDL and DATE • VME for trigger and older electronics (e.g. silicon telescope) • Point 2 • DAQ system at Point 2 in 2004 • Detector test and commissioning • System will evolve in size and performance according to the needs • Concurrent tests of several detectors (~3 in 2004, all in 2006) • Complete capabilities from the start • Control from the ACR or any computer • Other tests • ITS integrated test
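The pattern-generator input shown above lends itself to a simple software analogue: the source emits a reproducible word sequence so the receiving LDC can verify every fragment bit for bit. The following is an illustrative C++ sketch with hypothetical names, not the actual DDL test software.

    // Sketch of the pattern-generator idea used for standalone DDL tests.
    #include <cassert>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    std::vector<std::uint32_t> makePattern(std::uint32_t seed, std::size_t nWords) {
        std::vector<std::uint32_t> words(nWords);
        std::uint32_t w = seed;
        for (auto& out : words) {
            out = w;
            w = w * 1664525u + 1013904223u;     // simple LCG: reproducible
        }
        return words;
    }

    bool verifyPattern(const std::vector<std::uint32_t>& got, std::uint32_t seed) {
        return got == makePattern(seed, got.size());   // receiver-side check
    }

    int main() {
        const auto sent = makePattern(0xCAFEu, 256);   // "transmitter" side
        assert(verifyPattern(sent, 0xCAFEu));          // "receiver" side
        return 0;
    }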

  43. DAQ/TOF integration • [Throughput plot: 4.5 days, 59 x 10^6 events, 86 MB/s]

  44. DAQ Tools for Detector Commissioning • [Beam-test diagram: trigger logic; detector readout via DDL SIU and silicon-telescope readout via DDL into LDCs (PC/Linux, and VME/Linux with RORC/DDL DIU); DATE V4; event-building network; GDC; data storage in the computing centre] • Lab test • DDL Simulator: standalone daughter-card • Detector readout with DDL and DATE • Beam test • Detector readout by DDL and DATE • VME for trigger and older electronics (e.g. silicon telescope) • Point 2 • DAQ system at Point 2 in 2004 • Detector test and commissioning • System will evolve in size and performance according to the needs • Concurrent tests of several detectors (~3 in 2004, all in 2006) • Complete capabilities from the start • Control from the ACR or any computer • Other tests • ITS integrated test

  45. DAQ/Detector integration status

  46. DAQ Tools for Detector Commissioning • Lab test • DDL Simulator: standalone daughter-card • Detector readout with DDL and DATE • Beam test • Detector readout by DDL and DATE • VME for trigger and older electronics (e.g. silicon telescope) • Point 2 • DAQ system at Point 2 in 2004 • Complete capabilities from the start • Detector test and commissioning • System will evolve in size and performance according to the needs • Concurrent tests of several detectors (~3 in 2004, all in 2006) • Control from the ACR or any computer • Other tests • ITS integrated test

  47. DAQ Tools for Detector Commissioning • Lab test • DDL Simulator: standalone daughter-card • Detector readout with DDL and DATE • Beam test • Detector readout by DDL and DATE • VME for trigger and older electronics (e.g. silicon telescope) • Point 2 • DAQ system at Point 2 in 2004 • Complete capabilities from the start • Detector test and commissioning • System will evolve in size and performance according to the needs • Concurrent tests of several detectors (~3 in 2004, all in 2006) • Control from the ACR or any computer • Other tests • ITS integrated test

  48. DAQ for Test & Commissioning • [Site diagram: SR hall, SX hall, SXL hall (ALICE sub-detector assembly), networking, WR1/WR2, ACR; DDLs via a patch panel down the access shaft to PX24/CR1 (DAQ), with LAN links to PX24/CR2 (HLT), PX24/CR3 (DCS) and PX24/CR4 (Misc.); UX25 experimental area] • 1Q 2005: DAQ system at Point 2 • 12 DDLs for the TPC • 8 DDLs for the other detectors • 3 partitions • 2005: similar system in the Si lab

  49. Final DAQ System • [Site diagram as on the previous slide: SR/SX/SXL halls, WR1/WR2, ACR, DDL patch panel in the access shaft, PX24 counting rooms CR1 (DAQ), CR2 (HLT), CR3 (DCS), CR4 (Misc.), UX25 experimental area] • Installation staging (% of final DAQ performance): • 2006: 20% • 2007: 30% • 2008: 100%
