
ATLAS DC2 & Continuous production


Presentation Transcript


  1. ATLAS DC2 & Continuous production ATLAS Physics Plenary 26 February 2004 Gilbert Poulard CERN PH-ATC

  2. Outline
  • Goals
  • Operation; scenario; time scale
  • Resources
  • Production; Grid and tools
  • Role of Tiers
  • Comments on schedule
  • Analysis will be covered in tomorrow’s talk and in the next software week.

  3. DC2: goals
  At this stage the goals include:
  • Full use of Geant4, POOL and the LCG applications
  • Pile-up and digitization in Athena
  • Deployment of the complete Event Data Model and the Detector Description
  • Simulation of the full ATLAS detector and the 2004 combined test beam
  • Test of the calibration and alignment procedures
  • Wide use of the Grid middleware and tools
  • Large-scale physics analysis
  • Computing model studies (document due end 2004)
  • Running as much of the production as possible on LCG-2

  4. DC2 operation
  Consider DC2 as a three-part operation:
  • Part I: production of simulated data (May-June 2004)
    • needs Geant4, digitization and pile-up in Athena, and POOL persistency
    • “minimal” reconstruction, just enough to validate the simulation suite
    • will run on any computing facilities we can get access to around the world
  • Part II: test of Tier-0 operation (July 2004)
    • needs the full reconstruction software following the RTF report design, and the definition of AODs and TAGs
    • (calibration/alignment and) reconstruction will run on the Tier-0 prototype as if the data were coming from the online system (at 10% of the rate)
    • output (ESD+AOD) will be distributed to Tier-1s in real time for analysis
  • Part III: test of distributed analysis on the Grid (August-October 2004)
    • access to event and non-event data from anywhere in the world, both in organized and chaotic ways
    • in parallel: run distributed reconstruction on simulated data

  5. DC2: Scenario & Time scale
  • September 03: Release 7
  • Between Release 7 and Release 8: put in place, understand and validate Geant4, POOL, the LCG applications and the Event Data Model; digitization, pile-up and byte-stream; conversion of DC1 data to POOL; large-scale persistency tests and reconstruction
  • March 17th 04: Release 8 (production); testing and validation; run test-production
  • May 3rd 04: start final validation; start simulation, pile-up and digitization; event mixing; transfer data to CERN
  • July 1st 04 (“DC2”): intensive reconstruction on “Tier-0”; distribution of ESD and AOD; calibration and alignment
  • August 1st 04: start physics analysis; reprocessing

  6. Task Flow for DC2 data
  (Diagram: data-flow chains for the DC2 sample, from event generation to reconstruction.)
  • Event generation (Pythia): events in HepMC format
  • Detector simulation (Geant4): hits + MCTruth
  • Pile-up and digitization: digits (RDO) + MCTruth
  • Byte-stream conversion and event mixing: byte-stream raw digits (mixed events, with and without pile-up)
  • Reconstruction: ESD
  • Persistency: Athena-POOL
  • Volume of data for 10^7 events (table on the slide): physics events, min. bias events, piled-up events, mixed events, and mixed events with pile-up, with quoted volumes of ~2 TB, 5 TB, 18 TB, 24 TB and 75 TB

  7. Scenario for DC2
  For the Tier-0 exercise:
  • Event generation → Detector simulation → Pile-up → Detector response (RDO) → Event mixing → Byte-stream (no MCTruth) → Data transfer → Reconstruction → ESD production → AOD production → Streaming → Distribution of data
  For Physics & Detector studies:
  • Event generation → Detector simulation → Pile-up → Detector response (RDO) → Event mixing → Byte-stream → Data transfer → Reconstruction → ESD production → AOD production
  • Q1: Trigger & filtering in the chain?
  • Q2: ATLFAST?
  • Both are needed for the computing model studies
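A small sketch comparing the two chains as ordered stage lists (the stage names are taken from the slide; the representation itself is only illustrative):

```python
# The two DC2 processing chains from the slide, expressed as ordered stage lists.
# Only the stage names come from the slide; this representation is illustrative.

TIER0_CHAIN = [
    "event generation", "detector simulation", "pile-up",
    "detector response (RDO)", "event mixing", "byte-stream (no MCTruth)",
    "data transfer", "reconstruction", "ESD production", "AOD production",
    "streaming", "distribution of data",
]

PHYSICS_CHAIN = [
    "event generation", "detector simulation", "pile-up",
    "detector response (RDO)", "event mixing", "byte-stream",
    "data transfer", "reconstruction", "ESD production", "AOD production",
]

if __name__ == "__main__":
    # Stages that appear only in the Tier-0 exercise chain.
    extra = [s for s in TIER0_CHAIN if s not in PHYSICS_CHAIN]
    print("Tier-0-only stages:", extra)
```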

  8. Production scenario

  9. DC2 resources

  10. DC2 resources (based on Geant3 numbers)

  11. DC2: Grid & Production tools
  We foresee using:
  • 3 Grid flavours (LCG, Grid3, NorduGrid) plus “batch” systems (LSF, …)
  • An automated production system:
    • new production database (Oracle)
    • supervisor-executer component model
    • Windmill supervisor project
    • executers for each Grid flavour and for LSF
  • A data management system:
    • Don Quijote DMS project, successor of Magda
    • … but uses the Grids’ native catalogs
  • AMI for bookkeeping:
    • moving to web services
    • integrated with POOL
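The supervisor-executer split can be pictured with a minimal sketch like the one below; the class and method names are hypothetical stand-ins for the real Windmill and executer interfaces, which are not detailed in these slides:

```python
# Hypothetical sketch of the supervisor-executer component model: a supervisor
# takes job definitions (here plain dicts standing in for production-DB rows)
# and hands each one to the executer for the target Grid flavour or batch system.
# Class and method names are illustrative, not the real Windmill/DC2 APIs.

class Executer:
    """Base class: one concrete executer per Grid flavour or batch system."""
    def submit(self, job: dict) -> str:
        raise NotImplementedError

class LCGExecuter(Executer):
    def submit(self, job: dict) -> str:
        # Would build a job description and submit via the LCG Resource Broker.
        return f"lcg-{job['id']}"

class LSFExecuter(Executer):
    def submit(self, job: dict) -> str:
        # Would submit to a local LSF batch queue instead of a Grid.
        return f"lsf-{job['id']}"

class Supervisor:
    """Dispatches pending jobs to a single executer and records the handles."""
    def __init__(self, executer: Executer):
        self.executer = executer

    def run_once(self, jobs):
        for job in jobs:
            handle = self.executer.submit(job)
            print(f"job {job['id']} ({job['transformation']}) -> {handle}")

if __name__ == "__main__":
    # Toy job records standing in for entries of the production database.
    jobs = [{"id": 1, "transformation": "G4 simulation"},
            {"id": 2, "transformation": "pile-up + digitization"}]
    Supervisor(LCGExecuter()).run_once(jobs)
```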

  12. ATLAS Production System schema
  (Diagram of the production system components and their connections.)
  • Task = [job]*; Dataset = [partition]*
  • Job description (production database): a task corresponds to a dataset and carries a task transformation definition plus a physics signature (entered with human intervention); a job corresponds to a partition and carries a partition transformation definition, the executable name and the release version; job run info is recorded
  • AMI provides the task/dataset bookkeeping
  • The Data Management System provides location hints at task and job level
  • Supervisors (1-4) dispatch jobs to the executers: US Grid executer (via Chimera), NorduGrid executer, LCG executer (via Resource Brokers), LSF executer (local batch)
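As a rough illustration of the Task = [job]* and Dataset = [partition]* relations above, here is a minimal data-model sketch; all field, dataset and executable names are hypothetical, not the actual production-DB schema:

```python
# Hypothetical sketch of the data model implied by the schema: a task defines
# a dataset and expands into jobs, each producing one partition of that dataset.
# Field names and values are illustrative only.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Job:
    partition: str            # output partition this job produces
    transformation: str       # partition transformation definition
    executable: str           # executable name
    release: str              # software release version

@dataclass
class Task:
    dataset: str              # output dataset the task defines
    transformation: str       # task transformation definition
    physics_signature: str    # physics signature kept in the bookkeeping
    jobs: List[Job] = field(default_factory=list)   # Task = [job]*

if __name__ == "__main__":
    task = Task(dataset="dc2.simul.sampleA", transformation="G4 simulation",
                physics_signature="example signature")
    # Each job fills one partition of the dataset: Dataset = [partition]*
    for i in range(3):
        task.jobs.append(Job(partition=f"dc2.simul.sampleA._{i:05d}",
                             transformation="simul", executable="g4sim.py",
                             release="8.x"))
    print(len(task.jobs), "jobs for dataset", task.dataset)
```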

  13. Tiers in DC2
  • Tier-0
    • 20% of the simulation will be done at CERN
    • all data in ByteStream format (~16 TB) will be copied to CERN
    • reconstruction will be done at CERN (in ~10 days)
    • reconstruction output (ESD) will be exported in 2 copies from Tier-0 (2 x ~5 TB)
    • primary AOD can be produced at CERN (still under discussion)
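A back-of-the-envelope sketch of the throughput these numbers imply, assuming (not stated on the slide) that the ~16 TB byte-stream sample corresponds to the 10^7-event DC2 sample and that reconstruction and export run continuously over the ~10 days:

```python
# Rough rates implied by the Tier-0 numbers above.
# Assumptions: ~16 TB of byte-stream covers the 10^7-event DC2 sample, and
# reconstruction/export run continuously for the ~10 days quoted.

SECONDS = 10 * 24 * 3600          # ~10 days of running
N_EVENTS = 1e7                    # DC2 sample size (from the task-flow slide)
BYTESTREAM_TB = 16                # byte-stream input copied to CERN
ESD_EXPORT_TB = 2 * 5             # two copies of ~5 TB of ESD

event_rate = N_EVENTS / SECONDS                          # ~11.6 events/s
input_rate = BYTESTREAM_TB * 1e12 / SECONDS / 1e6        # ~18.5 MB/s
export_rate = ESD_EXPORT_TB * 1e12 / SECONDS / 1e6       # ~11.6 MB/s

print(f"reconstruction rate  : {event_rate:5.1f} events/s")
print(f"byte-stream input    : {input_rate:5.1f} MB/s")
print(f"ESD export (2 copies): {export_rate:5.1f} MB/s")
```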

  14. Tiers in DC2
  • Tier-1s will have to:
    • host the simulated data produced by them or coming from Tier-2s, plus the ESD (& AOD) coming from Tier-0
    • run reconstruction in parallel to the Tier-0 exercise (~2 months); this will include links to MCTruth
    • produce and host ESD and AOD
    • provide access to the ATLAS V.O. members
  • Tier-2s:
    • run simulation (and other components if they wish)
    • copy (replicate) their data to a Tier-1
  • It would be easier if both Tier-1s and Tier-2s are on the Grid (strongly recommended); ATLAS is committed to LCG
  • All information should be entered into the relevant database and catalog
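A minimal sketch of the Tier-2 to Tier-1 replication step described above, assuming a plain file copy and a toy in-memory catalog; the function, directory and file names are placeholders, not Don Quijote or any real LCG tool:

```python
# Hypothetical sketch: copy one Tier-2 output file to Tier-1 storage and
# register the new replica. In DC2 this bookkeeping lives in the production DB
# and the data management system, not in a Python dict.

import shutil
from pathlib import Path

def replicate_to_tier1(local_file: Path, tier1_dir: Path, catalog: dict) -> Path:
    """Copy a file to the Tier-1 area and record the replica in the catalog."""
    tier1_dir.mkdir(parents=True, exist_ok=True)
    dest = tier1_dir / local_file.name
    shutil.copy2(local_file, dest)                              # the actual transfer
    catalog.setdefault(local_file.name, []).append(str(dest))   # register the replica
    return dest

if __name__ == "__main__":
    catalog: dict = {}
    sample = Path("dc2.simul.sampleA._00001.pool.root")   # hypothetical file name
    sample.write_bytes(b"placeholder")                    # stand-in for real output
    replicate_to_tier1(sample, Path("tier1_storage"), catalog)
    print(catalog)
```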

  15. Comments on schedule
  • The change of schedule has been “driven” by:
    • ATLAS side:
      • the readiness of the software (the combined test beam has the highest priority)
      • the availability of the production tools (the integration with the Grid is not always easy)
    • Grid side:
      • the readiness of LCG (we would prefer to run on the Grid only!)
      • priorities are not defined by ATLAS alone
  • For the Tier-0 exercise, it will be difficult to define a starting date before we have a better idea of how the “pile-up” and “event-mixing” processes work.
