
DØ and CDF Detectors/Computing

This presentation provides an overview of the CDF and DØ detectors, their current operations, computing systems, future plans, and collaborations, and highlights progress in data-taking efficiency and performance.



Presentation Transcript


  1. DØ and CDF Detectors/Computing. Bill Lee, Fermilab. DOE Annual Science & Technology Review, July 12-14, 2010.

  2. Acknowledgements My thanks to my CDF colleagues who assisted me with the preparation of this presentation. • Massimo Casarsa • Phil Schlabach • Richard St. Denis

  3. Brief Outline • Overview of Detectors • Current Operations • Computing • Future Operations

  4. Fermilab (site map: "You Are Here")

  5. CDF and DØ Detectors • Muon systems • EM and Had Calorimeters • Solenoid • Tracker • Silicon Vertex Detector

  6. CDF and DØ Collaborations • The DØ Collaboration: 19 countries, 86 institutions, 492 collaborators, 11% FNAL • The CDF Collaboration: 15 countries, 59 institutions, 538 collaborators, 15% FNAL • By region: North America 32 institutions, Europe 19 institutions, Asia 8 institutions

  7. CDF and DØ Collaborations (world map: North America 32 institutions, Europe 19 institutions, Asia 8 institutions) • Congratulations to Spain on their World Cup victory.

  8. DØ Technical Organization
  • Spokespersons
  • Technical Integration Coordinator: G. Ginther (FNAL)
  • Special Projects: M. Johnson
  • Run Coordination: S. Gruenendahl, W. Lee; Electrical Operations: M. Matulik; Mechanical Operations: R. Rucinski
  • Triggermeister; L1 CTT: S. Gruenendahl; Data Quality: A. Jonckheere, N. Khalatyan
  • SMT: N. Parua, S. Youn; L1 Muon/Cal Track: N. Khalatyan; Luminosity Monitor: I. Katsanos, M. Prewitt
  • Fiber Tracker/Preshowers: J. Warchol (Fiber Tracker: J. Warchol; Preshowers: A. Evdokimov)
  • Detectors: Muon: T. Diehl; CFT/PS: M. Corcoran; SMT: Z. Ye; CTT: M. Corcoran; Cal: J. Sekaric, L. Zivkovic; L1CAL: S. Cihangir; Lum: G. Snow; L2: M. Mulhearn
  • Central Muon: A. Ito (PDTs: P. Kasper; Trigger counters: A. Ito)
  • L2STT: D. Boline, V. Parihar
  • Calorimeter: D. Schamberger, S. Dyshkant (Deputy) (L1 Cal: S. Cihangir, D. Edmunds)
  • ICD: L. Sawyer, A. White
  • L3/DAQ: J. BackusMayes, G. Watts
  • Forward Muon: V. Evdokimov (MDT detectors: V. Malyshev; MDT electronics: P. Neustroev; Pixel detectors: S. Kulikov; Pixel electronics: T. Fitzpatrick)
  • Online: W. Lee; Controls: G. Savage; Global Monitoring: E. Cheu, V. Sirotenko; Solenoid: H. Fisk

  9. CDF Operations Organization
  • Detector Operations: Massimo Casarsa, Philip Schlabach
  • Trigger Dataset Working Group: Heather Gerberich, Simone Donati (FNAL)
  • Admin. Support / Safety Coordinator: Dee Hahn
  • Associate Head, Shift Operations: JJ Schmidt
  • Associate Head, Online Systems: Jonathan Lewis
  • Associate Head, Detector Systems: Farrukh Azfar
  • Associate Head, Detector Infrastructure: Del Allspach, Steve Hahn
  • Operations Managers
  • Process Systems: Bill Noe (Leader), Dean Beckner, Cutchlow Cahill, Jim Humbert, Jim Loskot, Bruce Vollmer, Wayne Waldon
  • Data Acquisition: Bill Badgett; TSI/Fred: Jonathan Lewis
  • Silicon: Michelle Stancari, Sebastian Carron
  • Trigger L1/L2: Pierluigi Catastini, Pedro Fernandez
  • Daily/Weekly Ops: Shift Crews (Sci-Co, Aces, Co)
  • CSL: Willis Sakumoto; EVB/L3 Farm: Pasha Murat; L3 Filter: Farrukh Azfar
  • COT: Bob Wagner, Aseet Mukherjee
  • Calorimeter/TOF: Larry Nodulman, Willis Sakumoto
  • Electrical and Mechanical: Dervin Allen (Leader), John Bell, Roberto Davila, Jamie Grado (Bldg. Manager), Lew Morris, George Wyatt
  • Sys. Admin./Database: Comp. Div.; DQM: M. Martinez-Perez
  • Muon Systems: Phil Schlabach, Giovanni Pauletta
  • CLC: Iuri Oksuzian, N. Goldschmidt
  • Monitoring/Validation: Kaori Maeshima, Pasha Murat
  • BSC: Ken Hatakeyama, Jim Lungu
  • Slow Controls: Steve Hahn (Leader), JC Yun

  10. CDF and DØ Collaboration Scientific Operations/Computing Effort (shifts not included in effort totals). Fermilab continues to provide a significant portion of the effort. The ongoing streamlining of detector operations has reduced the required effort without negatively impacting performance.

  11. CDF Operations

  12. CDF Data-Taking Efficiency • Last 12 months: recorded 83% of delivered luminosity, 79% with the full detector. • Run II average: 83% acquired, 73% good with the full detector.

  13. CDF Data-Taking Performance In the last 12 months (including the 2009 shutdown): • Delivered: 2.13 fb-1 • Recorded: 1.77 fb-1 (83%) • With full detector: 1.68 fb-1 (79%)
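The percentages on this slide follow directly from the luminosity figures; a minimal sketch of the arithmetic (the numbers are the slide's values, the helper function is illustrative):

```python
# Data-taking efficiency as quoted on the slide: recorded (or full-detector)
# luminosity divided by delivered luminosity. Numbers are from the slide;
# the function name is ours.

def efficiency_pct(collected_fb: float, delivered_fb: float) -> float:
    """Fraction of delivered luminosity actually collected, in percent."""
    return 100.0 * collected_fb / delivered_fb

delivered = 2.13   # fb^-1 delivered by the Tevatron over the last 12 months
recorded = 1.77    # fb^-1 written to tape
full_det = 1.68    # fb^-1 recorded with the full detector operational

print(f"recorded:      {efficiency_pct(recorded, delivered):.0f}%")   # 83%
print(f"full detector: {efficiency_pct(full_det, delivered):.0f}%")   # 79%
```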

  14. CDF Total Integrated Luminosity in Run II • Delivered: 9.00 fb-1 • Recorded: 7.48 fb-1 • Integrated luminosity with full detector: 6.55 fb-1 (73%) • Depending on run quality requirements, analyses use 6.3-7.2 fb-1.

  15. CDF Shutdown 2009: Highlights Silicon detector: • Cooling system: replaced the elbows on the ISL cooling lines and attached COT face tubing; sealed a few leaks. • Power supply maintenance: replaced aged capacitors in 21 CAEN power supply modules. Drift chamber: • Recovered many channels by replacing blown resistors in the wire readout circuit. Calorimeter: • Plug calorimeter source maintenance. Front-end crate preventative maintenance: replaced fan packs and filters, installed new fuses, cleaned heat exchangers and drip sensors. Replaced 64 nodes of the L3 farm. Tied in the new diesel emergency generator.

  16. CDF Start-up after Shutdowns Luminosity delivered and CDF data-taking efficiency in the first 35 days after the last three shutdowns: • 2007: 11 weeks duration; • 2008: 1 week duration; • 2009: 12 weeks duration.

  17. Operations Improvements at CDF Track fitter upgrade (GigaFitter) in the level-2 track trigger: • More powerful FPGAs: 1 board in place of 16 boards, more compact, easier maintenance; • More memory available: possibility to extend the track acceptance in impact parameter and momentum. Optimization of the trigger bandwidth: • Optimized track trigger selection at level-2 to fill up the bandwidth at low luminosity.

  18. DØ Operations

  19. DØ has recorded >91% of the delivered luminosity over the past year.

  20. Over the past 12 months: • Delivered luminosity: 2.14 fb-1 • Recorded luminosity: 1.98 fb-1

  21. DØ Shutdown 2009: Highlights • Replaced scintillator in the luminosity monitor (new vs. old shown). • Recovered individual silicon HDIs: largest fraction of functioning channels in DØ history. • Repaired liquid nitrogen dewar vacuum leak. • Routine maintenance and power supply recovery; refurbished rack blowers.

  22. DØ start-up after shutdowns • 2007: 11 weeks; • 2008: 1 week; • 2009: 12 weeks.

  23. Operational Improvements at DØ • Reduced the downtime at the beginning and end of stores. • New FPGA programming increased the efficiency of the L1 central track trigger. • Updated trigger lists to address higher peak luminosities. • Enhanced monitoring. • Documentation improvements to facilitate smoother downtime recoveries.

  24. Accessing the Detectors (photo: installation of Run IIb upgrades) On average, CDF and/or DØ access their collision halls 1-2 times per week. • Sometimes there is a high-priority need to access the hall, e.g., one of the detectors has a data quality problem. • Most other accesses are opportunistic: a Tevatron problem or other issue allows access. A few times per year CDF or DØ will need a long access (>6 hours).

  25. Safety at CDF and DØ Safety remains a central aspect of the collider experiments at Fermilab. • Over the past year PPD has not had a DART case. Shutdown safety: • Safety is integrated into all aspects of shutdown activities. • A Job Hazard Analysis is a vital part of planning any shutdown job. • Personnel are reminded to keep safety first.

  26. DØ Computing

  27. DØ Data Reconstruction • Plot includes 125 million events that were processed twice to remove a calorimeter hot cell. • Currently the DØ farms process data within 3-4 days of recording. • This is our minimum allowed processing delay (to accommodate calibrations).

  28. DØ Monte Carlo Production DØ uses farms throughout the world to produce Monte Carlo events. IN2P3 is a dedicated DØ MC site; other sites are Grid sites. Over the last year, the total number of generated Monte Carlo events has almost doubled, to four billion.

  29. DØ Processing Time • Data taken at higher luminosity takes longer to process (higher occupancy/multiplicity). • Average luminosity is not expected to increase greatly.

  30. CDF Computing

  31. CDF Data Collection for the Past Year • 1.9 billion raw events → 2.5 billion reconstructed events → 4.8 billion ntuple events. • Additionally, 1.9 billion Monte Carlo ntuple events were produced. • Almost 1.2 petabytes of data in total.
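As a back-of-the-envelope check of these volumes (the event counts and the ~1.2 PB total are from the slide; the implied average event size is a derived estimate, not a CDF figure):

```python
# Rough average event size implied by the slide's totals: all event tallies
# (raw, reconstructed, ntuple, MC ntuple) against ~1.2 PB of data.

raw, reco, ntuple, mc_ntuple = 1.9e9, 2.5e9, 4.8e9, 1.9e9  # events
total_events = raw + reco + ntuple + mc_ntuple              # 11.1 billion
total_bytes = 1.2e15                                        # ~1.2 PB

avg_kb = total_bytes / total_events / 1e3
print(f"{total_events / 1e9:.1f} billion events, ~{avg_kb:.0f} kB/event")
```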

  32. CDF Monte Carlo Production The North American Grid (NAmGrid) provides Monte Carlo production. • Steady usage of NAmGrid. • Peaks tend to occur before conference periods.

  33. CDFGrid Usage CDFGrid provides the full environment for data handling. • Provides the majority of computing for analyses. • Peaks show over 30k queued jobs, also conference-dependent.

  34. CDF Open Science Grid Usage The Open Science Grid (OSG) has provided CDF over 40 million hours of computing over the last year. The use of the OSG has been fruitful and more resources are expected to come online.

  35. Common Tools DØ and CDF use a variety of tools common to both experiments. • Enstore: the underlying data transport mechanism for moving data from online to tape; the tape silos are a shared responsibility. • SAM: a data storage and retrieval tool. • Grid: the Open Science Grid is used to distribute processing to locations throughout the world. • GlideinWMS: a workload management system that eases submission of computing jobs. The FNAL Computing Division provides common system management: Oracle services, farm management, desktop support, security, data storage management, and more.

  36. Future

  37. DØ 2010 Shutdown Plans This is a 4-week shutdown beginning next week. • Replace luminosity scintillator and ~16 PMTs. • Silicon HDI recovery and other individual channel recovery. • Alignment measurements. • Calibrations. • Trigger framework maintenance. • General maintenance.

  38. CDF Shutdown 2010 Plans Silicon detector: • Cooling system check. • Junction card reseating. • Power supply maintenance: replacement of aged capacitors in power supply modules. Drift chamber: • Replacement of failing resistors in the wire readout circuit. • Low-voltage short repair. General preventative maintenance.

  39. Running in FY 2011 Staffing will continue to need attention. • Control room shifters, detector experts, on-call personnel, algorithms, support groups, and system administrators. • DOE and Fermilab support of visitors and guest scientists for operations/computing continues to be very valuable to the experiments (details in the Kilminster/Verzocchi talks). The Tevatron experiments' computing budgets are 25% below FY10 and half of the experiments' request. This will force the experiments to depend on beyond-warranty CPUs and will not support planned increases in data storage capacity and analysis speed. • The anticipated FY12 computing budget is ~40% lower than FY10 and below what is required to efficiently and reliably support the experiments' computing and analysis power during the active stage of Tevatron data analysis. The experiments expect to maintain the high efficiency of the past year and to keep the delay between acquiring data and its reconstruction to a minimum.

  40. Summary Collider experiments are operating smoothly and efficiently. Offline processing is keeping pace with data collection. Total delivered luminosity should reach 12 fb-1, with around 10 fb-1 recorded, by the end of FY11. • This will require a dedicated effort from the Tevatron and the collider experiments. Challenges to maintaining effective operations: • Increasing pressure on available resources (personnel and computing). • Ageing detectors and infrastructure. CDF and DØ will take advantage of the shutdown to keep up detector maintenance. • Both experiments continue to improve their ability to come out of a shutdown efficiently. The DOE's support facilitates our ability to capitalize on these opportunities. • Fermilab has made and continues to make significant contributions to CDF and DØ.

  41. BACKUP

  42. DØ Processing
