
ATLAS Simulation/Reconstruction Software






  1. ATLAS Simulation/Reconstruction Software Reported by Jim Shank, work done by most US Institutes. DOE/NSF review of LHC Software and Computing Projects Fermilab 27-30 November, 2001

  2. Outline • Activities in all systems (mostly by physicists): • Pixels, TRT, EM Cal, Tile Cal, Muons, Trigger • Well integrated into overall ATLAS computing effort. • In particular, the US core efforts on Athena and DB. • Review of recent activity, by sub-system • Future work: Data Challenges. J. Shank ATLAS Simulation/Recon. SW.

  3. ATLAS Subsystem/Task Matrix • Other US roles: D. Quarrie (LBNL), Chief Architect; P. Nevski (BNL), Geant3 simulation coordinator; H. Ma (BNL), raw data coordinator; C. Tull (LBNL), Eurogrid WP8 liaison

  4. Subdetector SW Activities Summary • Performance/design studies • G3 based simulation • Test beam • Athena integration • Reconstruction development in C++ • G4 based simulation development • G4 physics validation • XML based detector description • Database • Conditions DB • Trigger/DAQ

  5. Pixels-Conditions DB (Berkeley) • Goal: develop a general mechanism for retrieving time-dependent alignment constants from a database and using them in reconstruction • Requires additions to Athena infrastructure • Requires extension of the existing detector description interface • Will prototype using the silicon and pixel detectors as the use case • Misalignments calculated from numbers stored in the "Conditions Database" • Delivered through a general "Time Dependent Conditions Service" in Athena (TCS) • In addition to the event store (TES): • Need a detector store (TDeS) • Need an interface to the conditions DB (TCS) • A prototype TDeS coded by C. Leggett and P. Calafiura (a second instance of the StoreGate Service without object deletion at the end of each event) • Work in progress…
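The core idea of the Time Dependent Conditions Service is an interval-of-validity lookup: each set of constants is valid from its start time until superseded. A minimal sketch of that mechanism in Python (for illustration only; the class and field names here are hypothetical, not the actual Athena TCS API, which is C++):

```python
from bisect import bisect_right

class ConditionsStore:
    """Illustrative interval-of-validity store: each payload (e.g. a set of
    alignment constants) is valid from its start time until the start time
    of the next payload registered."""

    def __init__(self):
        self._starts = []    # sorted validity start times
        self._payloads = []

    def add(self, valid_from, payload):
        # Assumes records are registered in increasing time order.
        self._starts.append(valid_from)
        self._payloads.append(payload)

    def get(self, event_time):
        """Return the conditions payload valid at event_time."""
        i = bisect_right(self._starts, event_time) - 1
        if i < 0:
            raise LookupError("no conditions valid at this time")
        return self._payloads[i]

# Hypothetical alignment constants for one pixel module:
store = ConditionsStore()
store.add(0,    {"dx_mm": 0.00})
store.add(1000, {"dx_mm": 0.05})   # module shifted after time 1000

print(store.get(500)["dx_mm"])     # 0.0
print(store.get(1500)["dx_mm"])    # 0.05
```

Reconstruction code would then ask the store for constants keyed by the event timestamp rather than caching a single fixed geometry.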

  6. TRT (F. Luehring/Indiana et al) • Athena Pile-Up Requirements documentation ATL-SOFT-2001 • GEANT4 code writing • TRT hit and digitization definitions • TRT GEANT3 code • current beampipe + geometry updates • TRT material budget • TRT atlsim example on the grid • GEANT3 → GEANT4 comparisons

  7. LAr Simulation (M. Leltchouk/Nevis et al) • LAr simulation coordination: M. Leltchouk/Nevis • Participation in G4 EM barrel development • LAr EM calorimeter hits (LArEMHit) were implemented in GEANT4 by B. Seligman • The ROOT I/O scheme is used for hit persistency (see http://www.usatlas.bnl.gov/computing/software/db/LArRoot2.html and http://www.usatlas.bnl.gov/computing/software/db/rootio.html ) • Comparisons of GEANT4 simulations with testbeam data and GEANT3 for the ATLAS Liquid Argon Calorimeter were presented at CHEP2001

  8. GEANT 4 LAr Simulation

  9. FCal1 Testbeam Setup in GEANT4 (setup around the cryostat only). Components shown: μ-counter, FCal1 Module 0, FCal2 Module 0, TailCatcher, Argon Excluder, VetoWall, Cryostat, FCal1 electrode pattern, MWPC, iron shield, hole veto.

  10. FCal1: GEANT3/4 Comparisons of Energy Resolution. Plots: relative energy resolution [%] vs. beam energy [GeV] for GEANT3 and GEANT4, with no noise cut and as a function of the noise cut, fit to experimental data. GEANT4 high energy resolution problem??

  11. LAr Reconstruction -- Major Milestones Met • Early design in PASO: Jan. 2000 • Migrate to Athena: May 2000 • LAr Reconstruction used as a test bed for early Athena • First application software to successfully migrate to Athena • 3 working days at LBL • First Common Calorimeter Interfaces Oct. 2000 • QA review of then available components Dec. 2000 • S. Albrand (Grenoble) • Combined Reconstruction (egamma) Jan. 2001 • Process GEANT4 LAr Hits (Root Objects) Mar. 2001 • Lund Release June 2001 • Establishing most of the reconstruction chain • From G3/G4 Hits to Particle Identification

  12. LAr Data Classes • Data Objects proposed/implemented in March 2001 • J. Collot et al.

  13. Comparison to ATRECON

  14. Recent Plots using LAr recon. program

  15. LAr Reconstruction Conclusion • A central framework that is evolving to provide robust support • The reconstruction design has been built over this framework • Much of the ‘Fortran’ code has been migrated. • Validation ongoing, but results are promising. • It now paves the way for work in: • optimizing and developing new algorithms • Physics and Detector performance studies

  16. Tile Calorimeter • Tile Calorimeter DB coordination: T. LeCompte/ANL • Tile Cal reconstruction coordination: F. Merritt/Chicago • Tile Cal XML Detector Description has been improved • Extended barrel completed • non-uniform plate spacing included • Extended barrel can be easily repositioned w.r.t. the barrel • allows studying the effect of the recently introduced gap • Geant4 models have been built both from XML and "by hand" • G4 vs. test beam comparisons just beginning • TileCal reconstruction per se is largely an issue of calibration (converting ADC counts to energy) • calibration DB access is a goal for late FY2002 • TileCal classes have been changed to be more in line with LAr classes • Jet reconstruction classes have been streamlined
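The "convert ADC counts to energy" calibration step amounts to applying per-channel constants fetched from a calibration database. A minimal sketch (Python for illustration; the channel identifiers and constants below are invented, not TileCal's actual calibration scheme):

```python
# Hypothetical per-channel calibration constants: pedestal (ADC counts)
# and gain (MeV per ADC count), as might be fetched from a calibration DB.
CALIB_DB = {
    "tile_ch_001": {"pedestal": 50.0, "gain": 12.5},
    "tile_ch_002": {"pedestal": 48.0, "gain": 12.9},
}

def adc_to_energy(channel_id, adc_counts):
    """Convert a raw ADC reading for one channel to energy in MeV:
    subtract the pedestal, then scale by the channel gain."""
    c = CALIB_DB[channel_id]
    return (adc_counts - c["pedestal"]) * c["gain"]

print(adc_to_energy("tile_ch_001", 130))  # (130 - 50) * 12.5 = 1000.0 MeV
```

The stated FY2002 goal is then to replace the hard-coded constants with live access to the conditions/calibration database.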

  17. Tucson JetRec Working Group and Supporters (Tucson, Arizona, August 20-22, 2001) • Argonne National Lab: Tom LeCompte • Brookhaven National Laboratory(*): Hong Ma, Srini Rajagopalan • TRIUMF: Monika Wielers • University of Arizona: Peter Loch • University of Chicago: Ed Frank, Ambreesh Gupta, Frank Merritt • (*) by phone

  18. Tasklist for the Workshop • Come up with an improved JetRec software design in view of recent suggestions for changes: • definition of basic classes -> review of use cases; • establish the algorithm flow; • first look at the “navigation problem” • First attempt to set up a working group structure within the Jet/Etmiss performance group: • work plans, deliverables and commitments; • reporting to Jet/Etmiss and Software groups; • bi-weekly phone conferences Tuesdays, 17:00 (Geneva time) -> next October 2, 2001!

  19. Algorithmic Flow • Example!! There is NO restriction to Calorimeter Reconstruction Objects or any other specific type in general!

  20. Muon Spectrometer • Boston U (J.Shank), U Michigan, Harvard, BNL + CERN, Italy,.. • Current activity: • Muon database and detector description • Muon DB coordination: S. Goldfarb/UM • XML detector description: MDTs, RPCs, TGCs implemented; full chain to Geant4 implemented • Geometry ID scheme for all subsystems defined and documented • OO muon reconstruction (Moore) development • Integrated into Athena; in repository; in early development • Limited reconstruction in the barrel • Simulation for detector layout optimization • Near term goals: • Extend Moore to barrel, update to emerging reconstruction data model. • Trigger TDR studies: L1->L2 rejection, efficiencies • Calibration DB, trigger DB, ongoing detector description work

  21. Physics Performance Comparison: CSC Doublets vs. Singlets • Endcap Muon System Staging Study, B. Zhou & D. Levin, U. of Michigan • The US DoE/NSF Lehmann Review recommended the US ATLAS Muon Team build 50% of the CSC chambers in the initial phase of the LHC. • Physics studies used single muons, and double- and four-muon final states from low mass Higgs decays. • Conclusion: the US CSC muon staging plan has shown no significant impact on low mass Higgs detection at Day 1 of the LHC physics run. • June 2001: ATLAS management approved the US staging plan.

  22. Investigation of Alignment Criteria in Endcap MDTs (Daniel Levin, University of Michigan) • Impact on efficiency and resolution due to uncertainties in chamber surveying, placement & orientation • Plot: efficiency vs. misalignment (mrad) for the T, S and Z axes (green: rotation about the beam axis) • Criterion: alignment tolerance should be < 0.3 mrad

  23. ATLAS Muon Database Contributions (S. Goldfarb) • Overall Coordination • Management of MuonSpectrometer packages for Event and Detector Description • Reduction of cross-package software dependencies, porting to CMT • New packages for Objectivity DDL • Planning document for Detector Description development • ATL-COM-MUON-2001-021 • Event Model Development • MuonEvent • Completion of transient G3 hit, container classes for MDT, RPC, TGC • Completion of persistent (objectivity) digit, container classes, schema for MDT, RPC, TGC • New Muon Event Model • Commencement of discussions with BNL defining project for Muon Spectrometer • Coordination with SCT/TRT community • Detector Description Development • MuonDetDescr • Completion of transient detector description classes for TGC • Completion of persistent (objectivity) detector description classes, schema for MDT • MuonAGDD • Evaluations of MDT, RPC, TGC descriptions for GEANT4 simulation • Development of “compact” syntax definitions for MDT, RPC, TGC and Barrel Toroid • Completion of XML description, expansion interface for MDT, Barrel Toroid • HEPDD (http://doc.cern.ch/age?a01380) • Hosted, Chaired second annual workshop on Detector Description for HEP at CERN

  24. ATLAS Muon Database Contributions • Descriptions of Barrel Toroid (left) and H8 test beam geometry (below). Both geometries were generated using compact AGDD syntax and both were developed by REU summer students, under the supervision of S. Goldfarb.

  25. ATLAS Muon Database Planning • Data Challenge 0 • Persistent (objectivity) detector description classes, schema for RPC, TGC • Data Challenge 1 • Access to Geometry Version O in Athena from AMDB (Naples + SG) • General Development to Event Model • MuonEvent • New packages for technology-dependent software • Modifications necessary for new geometry implementation • New Muon Event Model • Initial implementation of Muon Digit Container and Identifier Classes (BNL) • Implementation of new identifier scheme (BNL + SG) • General Development to Detector Description • (These plans detailed in document ATL-COM-MUON-2001-021) • MuonDetDescr • Completion and testing of objectivity persistency • New AGDD_DetDescrSource classes to interface MuonDetDescr with AGDD • MuonAGDD • Completion of syntax, XML descriptions, interfaces for RPC, TGC, CSC, inert material • Extensive testing, evaluation of AGDD with G4 Simulation, Moore, Muonbox • HEPDD--Plan to Host/Chair Third Annual Workshop on Detector Description for HEP

  26. Offline Muon Reconstruction (Moore) • Muon Object Oriented Reconstruction (Moore). • Runs in the Athena Framework using the ATLAS CMT • Strategy • Base algorithms on trigger simulation: • Make roads from trigger chambers • MDT Pattern recognition added (see next slides) • Fitting from iPat • Graphics currently using GraXML and ATLANTIS

  27. Pattern Recognition: Track Finding • Diagrams: road finding in the x-y plane and the z-y plane through the inner, middle and outer stations.

  28. Efficiency • A muon track consists of hits from at least 2 stations and is successfully fitted. • Plot: efficiency (%) vs. PT (GeV) at 6, 20, 100, 300 and 1000 GeV, for Muonbox and MOORE. • The efficiency is normalized to all events with the generated muon within |η| < 1 at the event vertex.
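The efficiency definition on this slide (tracks with hits in at least 2 stations and a successful fit, normalized to events whose generated muon lies within |η| < 1) can be written out explicitly. A sketch with invented event records, not the actual MOORE or Muonbox code:

```python
def reconstruction_efficiency(events):
    """Efficiency = events with a reconstructed track (>= 2 stations hit
    and a successful fit) / all events whose generated muon lies within
    |eta| < 1 at the event vertex.
    Each event is a dict with 'gen_eta', 'n_stations', 'fit_ok'."""
    denom = [e for e in events if abs(e["gen_eta"]) < 1.0]
    num = [e for e in denom
           if e["n_stations"] >= 2 and e["fit_ok"]]  # track definition
    return len(num) / len(denom) if denom else 0.0

events = [
    {"gen_eta": 0.2,  "n_stations": 3, "fit_ok": True},
    {"gen_eta": -0.5, "n_stations": 1, "fit_ok": True},   # too few stations
    {"gen_eta": 0.9,  "n_stations": 2, "fit_ok": False},  # fit failed
    {"gen_eta": 1.4,  "n_stations": 3, "fit_ok": True},   # outside |eta| < 1
]
print(reconstruction_efficiency(events))  # 1 reconstructed of 3 in acceptance
```

Note that the out-of-acceptance event is dropped from both numerator and denominator, which is what "normalized to all events with the generated muon within |η| < 1" means.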

  29. Resolution • Plot: PT resolution (%) vs. PT (GeV) at 6, 20, 100, 300 and 1000 GeV. • The resolution is defined as the σ of the Gaussian fit to the PTrec/PTgen distribution.
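The resolution metric on this slide is the width of the PTrec/PTgen ratio distribution. A sketch of that calculation (Python for illustration, with invented values; the slide extracts σ from a Gaussian fit, whereas this sketch uses the sample standard deviation as a simple stand-in for well-behaved distributions):

```python
from math import sqrt

def pt_resolution_percent(pt_rec, pt_gen):
    """Width of the PTrec/PTgen ratio distribution, in percent.
    (Stand-in for the Gaussian-fit sigma used on the slide.)"""
    r = [a / b for a, b in zip(pt_rec, pt_gen)]
    mean = sum(r) / len(r)
    var = sum((x - mean) ** 2 for x in r) / len(r)  # population variance
    return 100.0 * sqrt(var)

# Hypothetical sample: four muons generated at PT = 100 GeV.
res = pt_resolution_percent([98.0, 102.0, 99.0, 101.0], [100.0] * 4)
print(f"{res:.2f} % at PT = 100 GeV")  # ≈ 1.58 %
```

In practice a fit is preferred because reconstruction tails would otherwise inflate the raw standard deviation.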

  30. Pull of 1/PT • Pull = difference between the reconstructed and true values, normalised to the error on the reconstructed value. • Plots: pull distributions at PT = 1000 GeV (σ = 6.176) and PT = 6 GeV (σ = 1.1941). • The error on the 1/PT pull is due to the material.
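The pull quantity defined on this slide is simple to write down. A sketch with invented numbers (not the actual analysis code):

```python
def pull(reconstructed, true, error_on_reconstructed):
    """Pull = (reconstructed - true) / error on the reconstructed value.
    If the fit models the detector correctly, the pull distribution is a
    unit Gaussian; a sigma well above 1, as seen on this slide, signals
    that the fit underestimates its errors (attributed here to material)."""
    return (reconstructed - true) / error_on_reconstructed

# Hypothetical 1/PT example: generated PT = 100 GeV, reconstructed 95 GeV,
# with an assumed error of 0.0004 GeV^-1 on the reconstructed 1/PT.
p = pull(1.0 / 95.0, 1.0 / 100.0, 0.0004)
print(f"pull = {p:.3f}")
```

Accumulating this quantity over many events and fitting a Gaussian to it gives the σ values quoted on the slide.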

  31. Moore Plans • Release the code, documented • Extend Moore to the end-cap regions • Look into using the Level-1 simulation code directly • Need to get the material description • Plan to use Cobra fitting • Explore graphics with Atlantis and continue with GraXML • Implement the current O-Layout • Participate in Data Challenge 1

  32. Trigger/DAQ Offline Software • The ATLAS High Level Trigger (HLT) is mostly a software trigger • LVL2: Optimized algorithms and simple trigger menus • Event Filter: Offline-like algorithms, full event, and latest calibrations • The LVL1 trigger is a hardware trigger and needs special simulation in offline • TDAQ software is similar to other detector software in terms of offline requirements and applications • Full simulation is used in design and optimization of TDAQ system • Offline software is used to monitor performance (rates, efficiency, single component performance) • However, very stringent QC needed; “mission criticality”

  33. T/DAQ Offline Software: Status • LVL1 simulation exists in Athena for the e/γ/τ trigger • Recently, most effort has been in the design of the HLT framework. Main requirement in design: • Use the same software in online and offline environments • Also plan to have a similar framework for LVL2 and EF • Possibly sharing of (some) services and algorithms • Presently evaluating Athena for use as the EF framework • If OK for EF, then consider use at LVL2 • First cycle of design recently finished; now implementing first prototype • Aim for a vertical slice prototype for Spring 2002 • Exploitation for the HLT/DAQ/DCS TDR in late 2002

  34. HLT Offline Software: Design • High-level design stage is finished • Aim is to use the same design for LVL2 and EF • System factorized into work areas: Steering, Algorithms, Data Manager, Event Data Model • Interactions needed (and ongoing) with offline and architecture groups

  35. Validation of Athena for HLT Use • The ATLAS EF will use selection and classification algorithms derived from the offline suite • Offline software performance therefore has a direct impact on EF farm size and cost • The HLT community has started “validation studies” (detailed benchmarking) of Athena, offline algorithms, and event model • The aim is to set metrics for monitoring trends in software performance • It is clear that the software is presently far from adequate • Not fair to judge during development phase • But benchmarking can (and has) helped spur improvements • Feedback during monthly meetings with A-team and regular interactions with developers • Software performance is also important for offline – hope that offline community will continue this work
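A per-event timing metric of the kind such benchmarking tracks across releases can be as simple as timing a fixed workload. A generic sketch (not the actual HLT validation suite; the algorithm here is a trivial stand-in):

```python
import time

def benchmark(func, n_events, *args):
    """Run func over n_events identical inputs and return the mean
    wall-clock time per event, the kind of per-release figure a
    validation suite might record to monitor performance trends."""
    start = time.perf_counter()
    for _ in range(n_events):
        func(*args)
    elapsed = time.perf_counter() - start
    return elapsed / n_events

def dummy_algorithm(data):
    # Stand-in for a selection/classification algorithm on one event.
    return sum(x * x for x in data)

per_event = benchmark(dummy_algorithm, 1000, list(range(100)))
print(f"{per_event * 1e6:.1f} microseconds/event")
```

Tracking this number release by release is what turns benchmarking into a trend metric rather than a one-off measurement.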

  36. Summary • New ATLAS framework, Athena, enthusiastically embraced by broad spectrum of sub-system community. • Many US physicists active in C++ code development • Well integrated into overall ATLAS software effort • Schedule: • DC 0 12/01 • Should have full OO sw ready. • Still some Fortran wrapping (muons) • DC1 02/02 • Large scale simulation/reconstruction. • Some with GEANT4 • Objectivity and Root IO.
