
Data Acquisition :: Status


Presentation Transcript


  1. Data Acquisition :: Status Presentation before the IceCube Science Advisory Committee W. R. Edwards / K. Hanson for DAQ Mar 29, 2006 – Madison, WI

  2. DAQ Capabilities
  • Ideally, the DAQ should be a transparent layer between signal and analysis; users should not have to worry about deadtime, saturation, charge/time resolution, or other detector effects.
  • We must live with the physical detector elements: ice, PMT, …
  • The DAQ hardware (and, downstream, software) is designed to faithfully pass on photon arrival-time information:
    • Waveform digitization at ~300 MHz to 600 MHz
    • Dynamic range of 1 – 10,000 p.e.
    • Low noise background: in an event window of 10 µs, 0.5 noise hits per string
    • Time-stamping of detector 'hits' to a global precision of ~3 ns
    • Low deadtime: dominated by ATWD readout, 30 µs per channel; two ATWDs can be operated in "ping-pong" mode (see the sketch below).
  • One potential problem is the depth of the ATWD digitizer – only 128 samples per channel. For longer pulses one must fall back to the FADC, which is slower and has less dynamic range.
  [Figure: sample ATWD waveform from a DOM showing the high-, medium-, and low-gain channels for a very large pulse; inset shows a typical SPE pulse.]
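The deadtime claim can be illustrated with a toy occupancy model. The sketch below is an assumption for illustration, not the DOM firmware: it simulates Poisson hit arrivals at the nominal 500 Hz DOM rate, with each accepted hit occupying one digitizer for the 30 µs readout. In this toy model a single ATWD loses roughly 1.5% of hits, while the two-ATWD "ping-pong" configuration loses essentially none.

```python
import random

def simulate_ping_pong(rate_hz=500.0, readout_s=30e-6, n_hits=200_000, n_atwd=2, seed=1):
    """Toy Monte Carlo of ATWD occupancy: Poisson hit arrivals, each
    accepted hit ties up one digitizer for `readout_s`. Returns the
    fraction of hits lost because all digitizers were busy."""
    random.seed(seed)
    t = 0.0
    busy_until = [0.0] * n_atwd              # time at which each ATWD frees up
    lost = 0
    for _ in range(n_hits):
        t += random.expovariate(rate_hz)     # next Poisson arrival
        free = [i for i, b in enumerate(busy_until) if b <= t]
        if free:
            busy_until[free[0]] = t + readout_s  # accept hit on a free ATWD
        else:
            lost += 1                            # all digitizers busy: hit lost
    return lost / n_hits

if __name__ == "__main__":
    print("1 ATWD :", simulate_ping_pong(n_atwd=1))  # ~1.5% loss at 500 Hz
    print("2 ATWDs:", simulate_ping_pong(n_atwd=2))  # essentially zero loss
```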

  3. DAQ - Overview
  • DOMs asynchronously collect hits at an approx. 500 Hz rate (an optional LC trigger requirement limits the rate to 5-30 Hz).
  • Hard LC mode – only send up hits with a neighbor-DOM coincidence, up to 4 DOMs distant. This results in some loss of hits.
  • Soft LC mode – hit compression is dictated by the presence of a neighbor hit, but some information is propagated for all hits. No loss of hit information (except possibly in the SP), but it carries a higher overhead on the DAQ to support the additional data rate.
  • Periodic readout into the Hub – the hub sends data packets out on TCP sockets to the SP.
  • Time transformation is done in the SP using RAPCal information (0.2 – 1 Hz rate of TCAL). Local oscillator drift and cable delays are accounted for in real time (see the sketch below)!
  • The SP merges hits on the global timestamp and sends trigger packets to the trigger processors.
  • Trigger processors merge hits from the various SPs and form a trigger based on the requirement, then send a readout request to the EB.
  • The EB reads out data from the SPs and builds the event.
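As a rough illustration of the time-transformation step, the snippet below linearly maps a DOM local-clock timestamp onto the surface master-clock timebase using the two most recent TCAL measurement pairs. The function name and the two-point interpolation are assumptions for illustration only; the actual RAPCal procedure also extracts the cable delay from the round-trip calibration waveforms.

```python
def rapcal_transform(tcal_pairs, t_dom):
    """Convert a DOM local-clock timestamp to surface (master-clock) time
    by linear interpolation between the two most recent TCAL pairs.
    Each pair is (dom_clock, surface_clock) for the same physical instant,
    with cable delay already removed. Hypothetical helper, not the DAQ API."""
    (d0, s0), (d1, s1) = tcal_pairs[-2:]
    slope = (s1 - s0) / (d1 - d0)      # tracks local-oscillator drift
    return s0 + slope * (t_dom - d0)

# Example: two TCALs one second apart; DOM oscillator runs ~50 ppm fast.
tcals = [(0.0, 100.0), (1.00005, 101.0)]
print(rapcal_transform(tcals, 0.500025))   # -> 100.5
```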

  4. DOM Mainboard
  • In the final leg of production – MBs for 70-75 strings will be produced by early next year. No significant changes from 5.0 (the first production article).
  • DOM firmware is 'complete', with some TODOs:
    • DOM mainboard compression – this will reduce hit size by approx. a factor of 3. This is lossless compression – no information is thrown away; the bits are simply packed more efficiently (see the sketch below).
    • IceTop-specific enhancements; requirements not yet well understood.
  • Some worrisome failures observed at Pole this year (1%) – under investigation – not necessarily the MB per se:
    • 2 DOMs drawing higher-than-normal current – they appear to be operating normally, however one also, perhaps not coincidentally, has LC problems.
    • 4 DOMs do not power up.
    • Broken LC – with the Hard LC requirement these DOMs are lost channels; this will change with the Soft LC DAQ.
    • A bad flash sector on one DOM – this DOM is currently out of operation for DAQ but can be brought back.
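To illustrate the idea of lossless hit compression, the sketch below delta-encodes a waveform: successive differences of slowly varying ADC samples are small and so pack into fewer bits than the raw samples. This shows the principle only and is not the actual DOM-MB bit-packing format.

```python
def delta_encode(samples):
    """Lossless delta encoding: store the first sample, then successive
    differences (small for smooth waveforms, hence cheaper to pack)."""
    out = [samples[0]]
    out.extend(b - a for a, b in zip(samples, samples[1:]))
    return out

def delta_decode(deltas):
    """Exact inverse: a running sum reconstructs the original samples."""
    samples = [deltas[0]]
    for d in deltas[1:]:
        samples.append(samples[-1] + d)
    return samples

waveform = [120, 122, 127, 140, 180, 175, 150, 133, 125, 121]  # toy ADC counts
assert delta_decode(delta_encode(waveform)) == waveform        # no information lost
```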

  5. Surface Software
  • DAQ S/W was not delivered in time for last year's deployment. Data were taken on String-21 with TestDAQ, which was the primary DAQ tool for the DOM testing arena.
  • Shortcomings of TestDAQ were:
    • The FPGA image was not optimized for rapid data taking – several kludges to take data meant that the deadtime was large (0.5 ms).
    • The surface component did not trigger in real time: hits were written to disk in 15-min chunks and analyzed by a background process (monolith).
  • However, TestDAQ remains a viable lightweight testing tool that is flexible and easy to deploy, so it will continue to be used for 'odd jobs' such as commissioning and debugging.
  • For this pole season, a de-scoped DAQ delivery (called Plan A-):
    • The original design called for a 'lookback' mode from EB → SP to gather the full hit information; the SP passed only the trigger primitives downstream to the triggers to reduce the bandwidth.
    • In 'Hard' LC mode the network / processors are able to handle transmission of the full hit info; the EB collects all hit information internally and self-references during the event-building process.
  • DAQ code was delivered just-in-time (the final cut of the first release came right before station close). It has been the primary data-taking application since 2/13.

  6. PY05 DAQ Software activities
  • Improve stability of the current software base (see later slide).
  • Implement new features through a series of major releases:
    • Supernova scalers
    • AMANDA integration
    • Hit compression / SLC support
    • Plan A+
    • IceTop-specific features
  • Migration to 64-bit computing platforms and certification of DAQ support for 25 strings next season.
  • Continue to work on components to improve efficiency, reliability, and maintainability for the future operations mode.
  • The primary focus in the near term is on improving the stability of the DAQ.

  7. IceCube Events

  8. Simple Majority Trigger – IceCube 9 strings
  • The SMT is currently set to 8 hits in a time window of 2 µs (with hits in a 10 µs window built into the event). A sketch of the trigger condition follows.
  • We don't have the plot here, but this agrees reasonably well with MC data: a 30% normalization error, but the shape agrees.
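The trigger condition itself is a sliding-window count. The sketch below is a minimal illustration of an 8-fold simple majority trigger over a 2 µs window, assuming time-sorted hits; it is not the IceCube trigger-processor code, and the retrigger holdoff is a simplification.

```python
from collections import deque

def simple_majority_trigger(hit_times, multiplicity=8, window=2e-6):
    """Sliding-window SMT: fires whenever `multiplicity` or more hits fall
    inside a `window`-second interval. `hit_times` must be sorted (seconds).
    Returns the times at which the trigger condition was satisfied."""
    triggers = []
    recent = deque()
    for t in hit_times:
        recent.append(t)
        while recent and t - recent[0] > window:
            recent.popleft()            # drop hits older than the window
        if len(recent) >= multiplicity:
            triggers.append(t)          # 8-fold condition met at this hit
            recent.clear()              # simple holdoff for this sketch
    return triggers

# Example: 8 hits bunched within 1 µs fire the trigger; sparse hits do not.
hits = [i * 1e-4 for i in range(5)] + [1e-3 + i * 1.2e-7 for i in range(8)]
print(simple_majority_trigger(sorted(hits)))
```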

  9. DOM Noise Rates – Deployed Jan 2005

  10. DAQ Livetime – Last 2 Weeks
  • Since 2/13 the DAQ has collected 200 million events (cf. 80 million for the entire 2005 season).
  • DAQ livetime averaged over the period 3/10 to 3/23 was 42%; some small portion of this loss was due to sharing the detector with verification activities.
  • The DAQ run scripts were modified 3/24 to detect run crashes sooner and restart runs more quickly.
  • DAQ livetime averaged over the period since then has improved considerably (77% average livetime immediately following the upgrade).
  • Still short of the target 90% livetime. This estimate is calculated by counting events in a 24-hour period: at the current trigger setting we should see a steady rate of 138 events per second (83 physics events per second), i.e., roughly 11.9 million events per day (see the sketch below).
  • Since 3/24, 30% of runs terminate abnormally before completion. The major contributors to crashes are (1) JVM crashes and (2) stalled splicer queues.
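The event-counting livetime estimate described above is simple to write down. In the sketch below the function name is a hypothetical helper and the event count in the example is illustrative; the 138 Hz steady trigger rate is taken from the slide.

```python
def livetime_fraction(events_in_day, expected_rate_hz=138.0):
    """Livetime estimated by event counting: observed events divided by the
    events a fully live 24-hour day would yield at the steady trigger rate.
    Assumes a constant trigger rate, as stated on the slide."""
    expected_per_day = expected_rate_hz * 86_400   # ~11.9 million events/day
    return events_in_day / expected_per_day

# e.g. an illustrative 9.2 million events in one day -> ~77% livetime
print(f"{livetime_fraction(9.2e6):.0%}")
```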

  11. Conclusions
  • DAQ H/W is in very good shape – there are no indications that the DOM design will fail to meet the science goals for IceCube.
  • Production of DAQ H/W (DOM-MB, DOR, DSB, Hubs, Master Clock) is well underway – nothing to indicate serious problems here. DOM-DOR communication could be better understood, but it currently meets our in-ice requirement of a 1 Mb/s data rate.
  • DAQ S/W had problems with delivery and is still struggling to catch up, but it currently functions for data taking. It is a large system, however, and we need to proceed with measured progress. There is some concern about the maintainability of this software long term.
