5th Workshop on ALICE Installation and Commissioning

Presentation Transcript

  1. 5th Workshop on ALICE Installation and Commissioning, April 1st & 3rd, CERN. Muon Tracking (MCH): Conclusions of the February/March 08 ALICE cosmic run. Christophe.Suire@ipno.in2p3.fr

  2. Status of Hardware during Run. Dimuon Forward Spectrometer (DFS) tracking chambers: 5 stations; each station has 2 detection planes, or chambers; 1 chamber/plane gives an (x,y) coordinate with respect to the primary vertex. [Diagram: spectrometer layout with μ+/μ− tracks and Stations 1-5.] Stations 1 and 2, plus chamber 5, were read out during the cosmic run, and for the first time triggered by the muon trigger.

  3. Status of Hardware during Run
• Dimuon Forward Spectrometer (DFS) tracking chambers:
  • 5 stations; each station has 2 detection planes, or chambers
  • 1 chamber/plane gives an (x,y) coordinate
• During the Feb/Mar cosmic run:
  • everything was installed (except chamber 6, inside the magnet) and local commissioning (St 4 & 5) was being done in parallel
  • 2/5 of the DFS was powered: Stations 1 and 2
  • part of Station 3 (chamber 5) was read out as well (max configuration: 3 LDCs, 11 DDLs)
  • air cooling (St 1 and St 2) is very stable; temperatures look fine
  • still some HV trips! Never seen during previous tests
  • → How sure are we of the gas quality?

  4. Data Taking
• Two periods of data taking:
• ACORDE trigger (~80 Hz): from run 21392 to run 24567
  • very large number of events; should be enough to check clustering algorithms. Tracks have a (too) large angle and offer limited interest
• Muon trigger (~80 mHz): from run 24836 to run 26024
  • the total number of events is around 8840 (∑ over runs requested to be reconstructed)
  • a first analysis shows that shower events are dominant, but there are a few horizontal tracks → very interesting to look at

  5. DAQ
No real DAQ issues: compliance with DAQ achieved. Really stable:
• 1 run (25402) of 10⁷ events with St1+St2 @ 40 MHz
• 2 runs (24021-06) of 2×10⁶ events with St1+St2+Ch5 @ 40 MHz
Readout times (in 0-suppressed mode, after taking pedestal runs, with threshold calculation by a DA included in our rpm, etc. = conditions of real data taking): the readout rate is slower than expected (lab measurements gave busy ~220 μs). Well aware of the problem; it needs time to be investigated since other tasks have higher priority. Goal is to decrease the deadtime (well) below 200 μs. Entry 34175 in the logbook.
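As a back-of-envelope illustration of why the busy time matters (my own sketch, assuming a simple non-paralyzable deadtime model, not a DATE measurement):

```python
# Hypothetical sketch: maximum sustainable trigger rate under a simple
# non-paralyzable deadtime model. The 220 us and 200 us values are the
# lab-measured busy time and the deadtime goal quoted above.

def max_trigger_rate(busy_s: float) -> float:
    """Rate (Hz) at which the readout is busy 100% of the time."""
    return 1.0 / busy_s

print(f"busy 220 us -> {max_trigger_rate(220e-6):.0f} Hz max")  # ~4545 Hz
print(f"busy 200 us -> {max_trigger_rate(200e-6):.0f} Hz max")  # 5000 Hz
```

Either figure is far above the ~80 Hz ACORDE trigger rate, so the concern is headroom for beam conditions rather than the cosmic run itself.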

  6. DCS/ECS/Trigger/HLT
DCS: the finite state machine for low & high voltage works; temperatures and gas flow are monitored (still some improvements to be done: alarm handling and interlocks)
ECS: scripts are ready and working. No issues. + Bridge tested and OK
Trigger: some problems, but is it really the trigger? Clearly some busy problem when a run stops "unexpectedly" at the DAQ level. It happened a lot during the first week (w9), but not a single problem during w10 (in the Alice_multi partition with MTR & V0). (We didn't forget the trigger questionnaire.)
HLT: we have all seen some results! Haven't we?

  7. HLT Monitoring/Analysis
HLT readout on the LDC:
1- subtract pedestals (on the 0-suppressed data)
2- apply a rough cut (all digits > 10 ADC)
3- scan interesting events
4- display the results on the ACR big screen
All plots available at:
http://idas.web.cern.ch/idas/dimuon-collaboration/trigger_tracker
http://www.cern.ch/Indranil.Das/dimuon-collaboration/HLT-OnlineDisplay
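Steps 1 and 2 above can be sketched in a few lines (a hypothetical illustration; the array names and data layout are my own, not the actual HLT structures):

```python
import numpy as np

ADC_CUT = 10  # the sharp 10 ADC cut quoted above

def fired_pads(raw_adc: np.ndarray, pedestals: np.ndarray) -> np.ndarray:
    """Indices of pads whose pedestal-subtracted charge exceeds the cut."""
    signal = raw_adc.astype(float) - pedestals
    return np.flatnonzero(signal > ADC_CUT)

# Toy data: 5 pads, two of them well above pedestal.
raw = np.array([12, 55, 9, 130, 14])
ped = np.array([11, 10, 8, 12, 13])
print(fired_pads(raw, ped))  # -> [1 3]
```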

  8. HLT Monitoring/Analysis
Fired pads isolated with a sharp 10 ADC cut on 0-suppressed and pedestal-subtracted data.

  9. HLT Monitoring/Analysis
A straight line passing through the fired pads points clearly to ACORDE.

  10. HLT Monitoring/Analysis
A straight line passing through the fired pads/strips points towards the center of ALICE (muon trigger decision algorithm).

  11. HLT Monitoring/Analysis

  12. Online Monitoring/Offline
Online monitoring: all with MOOD
• pedestal, noise and gain visualization
• some global histograms (pad occupancy, hit map, errors)
• data from the GDC were monitored during the run in global
New mchview tool:
• not online, but can read all information at the OCDB level
• histograms all quantities at any granularity (from a single pad to a full chamber)

  13. Online Monitoring/Offline
MCHVIEW: analysis of calibration run 23125 for St 1 and 2, using the file from the OCDB (copied from AliEn).
→ The full chain is working, from the ECS calibration script up to analysis.

  14. Online Monitoring/Offline
MOOD: online display of run 24849. Fired pads on chamber 4.

  15. Online Monitoring/Offline
MOOD: online display of run 24849. Fired pads on chamber 3.

  16. Online Monitoring/Offline
MOOD: online display of run 24849. Fired pads on chamber 2.

  17. Online Monitoring/Offline
MOOD: online display of run 24849. Fired pads on chamber 1. Mapping problem now fixed.

  18. Online Monitoring/Offline
ALIEVE: offline display of the same event (126) of run 24849, from reconstructed digits. From offline muon clustering, the following hits are reconstructed:
*************************************************************
* Row  * ch * charge    * size * absPosX   * absPosY   * absPosZ   *
*************************************************************
* 1742 * 0  * 242.06466 * 5    * 11.254493 * -82.11767 * -529.9099 *
* 1743 * 0  * 6055.5385 * 6    * 41.617515 * -74.68550 * -529.9099 *
* 1744 * 1  * 1795.1381 * 10   * 43.26791  * -68.99257 * -548.9899 *
* 1745 * 2  * 495.44882 * 3    * 56.334308 * -31.87123 * -673      *
* 1746 * 2  * 721.78381 * 3    * 56.963699 * -30.87374 * -673      *
* 1752 * 3  * 989.74383 * 9    * 58.672157 * -25.90610 * -692      *
*************************************************************
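As a quick sanity check (my own sketch, not the AliRoot tracking code), one can fit a straight line x(z) through one cluster per chamber from the table above; the sub-cm residuals are consistent with a single near-horizontal track:

```python
import numpy as np

# Hypothetical sketch: least-squares straight-line fit x(z) through one
# cluster per chamber from the table above (rows 1743, 1744, 1746, 1752;
# row 1742 lies well off the line and row 1745 overlaps 1746 on chamber 2).
z = np.array([-529.9099, -548.9899, -673.0, -692.0])       # absPosZ (cm)
x = np.array([41.617515, 43.26791, 56.963699, 58.672157])  # absPosX (cm)

a, b = np.polyfit(z, x, 1)      # fit x(z) = a*z + b
residuals = x - (a * z + b)
print(f"slope = {a:.4f}, max |residual| = {np.abs(residuals).max():.2f} cm")
```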

  19. Online Monitoring/Offline
• Data storage:
  • data are on CASTOR
  • AliRoot reconstruction has been run on a sample of data (by muon "offline shifters"); 13 runs
  • → MUON.digits available on AliEn
• Data quality checks:
  • QA/AMORE is a work in progress - to-do list for QA (Mar 08). To complete the QA classes with:
  From raw data:
  - hit distribution per chamber (TH1 with the chamber number on the horizontal axis)
  - hit distribution per detection element (TH1 with the DE number on the horizontal axis)
  - hit distribution per buspatch (10 TH2, one per chamber, with the buspatch geographical distribution, as in mchview)
  From recpoints:
  - trigger efficiency plots (Diego) - but now in an analysis task
  - check of the trigger code (who?)
  - trigger scaler values (Diego)
  - strip hit distribution (4 TH2, one per chamber, with the strip distribution) (Diego)
  - local board hit distribution (1 TH2 with the local response of each local board) (Diego)
  From ESD:
  - p, pt, y distributions - already in place
  - track multiplicity = number of tracks per event (somehow related to tracking efficiency)
  - track cluster multiplicity distribution = number of clusters per track (somehow related to clustering efficiency)
  - multiplicity of clusters per chamber (10 TH1)
  - multiplicity of clusters per detection element (144 TH1)
  - charge of the cluster per chamber (10 TH1)
  - charge of the cluster per detection element (144 TH1)
  - normalized χ² of the track (somehow related to tracking quality)
  - distance between clusters and track (somehow related to clustering quality and alignment accuracy)
  - number of tracks matching the trigger (somehow related to tracker/trigger efficiency)
NEW: Christian, Philippe P., Diego
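The first raw-data QA item can be illustrated with a minimal sketch (plain NumPy standing in for the ROOT TH1, with a fabricated hit list; not the actual QA class):

```python
import numpy as np

N_CHAMBERS = 10  # the 10 tracking chambers named in the QA list above

def hits_per_chamber(chamber_ids):
    """Hit distribution per chamber: a count per chamber number,
    i.e. the content of the TH1 described above."""
    counts, _ = np.histogram(chamber_ids, bins=np.arange(N_CHAMBERS + 1))
    return counts

# Fabricated example: chamber id of each decoded hit in one event.
hits = [0, 0, 1, 3, 3, 3, 9]
print(hits_per_chamber(hits))  # -> [2 1 0 3 0 0 0 0 0 1]
```

The per-detection-element and per-buspatch items follow the same pattern with a finer binning axis.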

  20. COSMIC Data, Analysis Results
AliEve displays the MUON digits from reconstructed data. For the moment, clusters are reconstructed afterwards:
• see the clustering algorithm at work on real data
• implement trigger selection
• look for horizontal tracks
All pedestal runs (so far) show nominal/expected noise.
Magnet (dipole+L3) effect on measured noise: comparison of runs 23097 - 23355. Relative noise difference: mean = 0.04 ADC and σ = 0.07 ADC (nominal noise is 1.1 ADC), compatible with single-channel noise fluctuations.
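The magnet-on/off comparison can be sketched as follows (synthetic per-channel noise values, chosen only to mimic the quoted numbers; the run labels in the comments refer to the runs above):

```python
import numpy as np

# Hypothetical sketch of the noise comparison: per-channel noise from two
# pedestal runs, summarized by the mean and sigma of the difference. The
# slide quotes mean = 0.04 ADC, sigma = 0.07 ADC, nominal noise 1.1 ADC.
rng = np.random.default_rng(0)
n_channels = 10_000                                            # illustrative
noise_run_a = 1.1 + 0.05 * rng.standard_normal(n_channels)     # e.g. run 23097
noise_run_b = noise_run_a + 0.07 * rng.standard_normal(n_channels)  # run 23355

diff = noise_run_b - noise_run_a
print(f"mean = {diff.mean():.3f} ADC, sigma = {diff.std():.3f} ADC")
```

A σ of this size, with near-zero mean, is what single-channel fluctuations alone would give, supporting the "no magnet effect" conclusion.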

  21. [figure]

  22. [figure]

  23. [figure]

  24. Hardware/Software Status for Next Cosmic Run + Goals
To do for next run:
• some electronics to be fixed on Stations 1 & 2
• some software to be installed (related to the trigger/busy) → already scheduled for w15 (along with DAQ/Trigger tests)
• some work on the interlocks needs to be done (DCS)
Goals:
• pedestal/noise and calibration procedures are ready and tested
• clustering and hit reconstruction seem to work fine
• alignment and tracks: need tracks! → tracking algorithm modified to reconstruct tracks without all the chambers (to be tried on Feb/Mar cosmic data)
During (or before) the next run, St 3+4+5 have to be included in the readout. As soon as a chamber is validated at the "local commissioning" level, it can easily be included in the global DAQ.

  25. Action to Completion
Station 1: ready for beam (dead channels? but <<1%; all HV ok ≡ stable @ 1650 V)
Station 2: ready for beam (dead channels? but <<1%; all HV ok except 1 group/24)
Station 3: finishing the local commissioning
• ch5 is almost done (readout ok, some very localized pedestal issues)
• ch6 is installed (open position to finish ch5); cabling just started
Station 4: finishing the local commissioning
• ch7 outside is used to test the noise. Wiener LV PS modified → nice improvements (noise went down between 1.5 and 3.5 ADC)
• ch7 inside has some problems. Needs to be retested
• ch8 not tested. Readout cabling to be done
Station 5: finishing the local commissioning
• ch9 readout is ok (a few slats have problems). HV to be tested
• ch10 cabled, readout tests starting next week
Still a lot of work to do before the next cosmic run. Dimuon tracking meeting tomorrow afternoon to define goals and strategy.
1- St 3, 4, 5 local commissioning implies no dipole field
2- probably muon trigger only during the next run

  26. Conclusion
• Very good behavior of St1+St2 was achieved during data taking
• Readout (CROCUS) is compliant with DATE, and trigger pattern handling is fine (L1/L2 reject are implemented and tested in the lab)
• 2 LDCs ok; 3 LDCs (when adding chamber 5) was straightforward
• → very confident that the remaining chambers can be included easily
• Software for data taking (DA, ECS scripts, MOOD) is in very good shape
• AMORE/QA are being developed
• Offline reconstruction has been running fine (2 runs with problems in the data transfer → fixed)
• Dimuon "offline shifters" have developed tools and run the reconstruction
• → first results with clusters/single muon triggers
• → working on getting tracks

  27. Muon_trk questions
• One issue related to the debugging of the DA (and maybe other debugging): the online cluster is completely isolated from the outside world. To be more precise, we need:
- a machine where we can play with the data and with the DA program (sources, compiling, debugging) under the same conditions as an LDC.
We have some machines @ CERN (dimlab04, 05, ...) where the code can be installed, but we don't have the data (in the DATE format, as we have online). During the last cosmic run we failed to reproduce, on one of our machines @ CERN, a segmentation fault observed online. Since we didn't run the test with the same data, how can we compare?