SAS Overview



  1. SAS Overview
  • Overall Data Flow
  • Sim/Recon
  • Gleam
  • Recon rewrites
  • Calibrations
  • Level 1 Pipeline + Mirrors
  • Science Tools

  2. Data Flow
  • Data: recon + MC on disk. Abstract the full-recon output into the L1 DB for analysis.
  • [Data-flow diagram; labels: IOC, MOC, DPF, MC, Recon, Calibs, L1 DB, L2 DB, SSC, Italian mirror, French mirror]
  • Fully automated server, with an RDB for the data catalogue + processing state. Uses SLAC batch CPU and disk farms.
  • Parts of L2 processing also automated.

  3. Instrument Simulations and Reconstruction
  • [Event displays: 3 GeV gamma interaction; instrument data; 3 GeV gamma recon; CAL detail]

  4. Processing Pipeline
  • [Pipeline diagram; labels: IOC, Level 0, batch system, HSM, automated tape archive, Level 1 + diagnostics, WWW]
  • ~50 CPUs; ~50 TB disk by 2010

  5. Sim/Recon Toolset
  • Applications: Root, IDL – analysis; Event Display; Gleam
  • Reconstruction: TkrRecon, CalRecon, AcdRecon – test-beam-era versions; rewrites being planned & executed
  • Simulation package: gismo – on its way out; GEANT4 – on its way in
  • xml – geometry, parameters
  • Root – object I/O
  • Gaudi – code framework
  • IDEs: VC++ – Windows; gnu tools – Linux
  • Utilities: vcmt – Windows gui; glastpack+soap – Linux; CMT – package version management; ssh – secure cvs access; cvs – file version management

  6. Goals for Gleam
  • New flexible geometry: flight.xml + detModel
  • Handle EM1 (1x1), Calib unit (1x4), Flight (4x4); Calib may well include some simulation of the beamline
  • Expect to handle BFEM; maybe BTEM
  • Used in Sim, Recon & Event Display. Analysis?
  • Looking successful so far; I&T group trying it out for EM1
  • Separate Sim, Digi & Recon: separately selectable and configurable via jobOptions
  • Persistent I/O from each phase: Root is the current choice
  • Fully detailed output: object I/O matches the OO coding in sim/recon
  • MC allows for configurable output – the user can specify which volumes are sensitive and PositionHit or IntegratingHit types
  • Separate trees for MC, Digi, Recon: logically connected as one dataset
  • Separate branches for systems (MC, subsystems); can exercise conditional reads by branch (see the sketch below)
  • Expect to have a summary ntuple as well, derivable from the full tree
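A minimal Root macro sketch of what a conditional read by branch looks like. The file name (gleam.root) and the tree/branch names are assumptions for illustration, not the final Gleam output names:

    // readDigi.C -- ROOT macro; run with: root -l readDigi.C
    // Illustrative only: file, tree, and branch names are assumptions.
    void readDigi()
    {
        TFile* f = TFile::Open("gleam.root");       // hypothetical file name
        TTree* digiTree = (TTree*)f->Get("Digi");   // hypothetical tree name

        // Conditional read: disable all branches, then re-enable only the
        // TKR digis, so GetEntry() deserializes just the data we need.
        digiTree->SetBranchStatus("*", 0);
        digiTree->SetBranchStatus("TkrDigi*", 1);

        Int_t nEvents = (Int_t)digiTree->GetEntries();
        for (Int_t i = 0; i < nEvents; ++i) {
            digiTree->GetEntry(i);   // reads only the enabled branches
            // ... inspect TKR digis here ...
        }
        f->Close();
    }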

  7. Output Structure
  • MC Tree: McParticle, McPositionHit, McIntegratingHit
  • Digi Tree: TkrDigi, CalDigi, AcdDigi
  • Recon Tree: TkrRecon, CalRecon, AcdRecon
  • Summary ntuple
  • Need a mechanism to link Recon ↔ Digi ↔ MC (sketched below)
  • Nice if it were external to either pair
  • Could also be used for cross-subsystem relationships
  • Talking to Riccardo/Toby about such a mechanism
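The linking mechanism was still under discussion at this point, so the following is only a sketch of what "external to either pair" could mean: a standalone table relating entries of one tree to entries of another by index, owned by neither class. All names here are hypothetical:

    #include <map>
    #include <vector>
    #include <utility>

    // Hypothetical external relation table: relates entries of one tree to
    // entries of another by index, so neither class knows about the other.
    class RelTable {
    public:
        // Record that digi `digiIdx` came from MC hit `mcIdx`.
        void relate(int digiIdx, int mcIdx) {
            m_rel.insert(std::make_pair(digiIdx, mcIdx));
        }
        // All MC hits related to a given digi (a digi may have several).
        std::vector<int> related(int digiIdx) const {
            std::vector<int> out;
            typedef std::multimap<int,int>::const_iterator It;
            std::pair<It,It> range = m_rel.equal_range(digiIdx);
            for (It it = range.first; it != range.second; ++it)
                out.push_back(it->second);
            return out;
        }
    private:
        std::multimap<int,int> m_rel;   // digi index -> MC hit index
    };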

  8. Gleam Goals (cont’d)
  • New Recon chains per subsystem
  • TKR well underway (subject of this meeting)
  • CAL just getting going again
  • TKR+CAL for low-energy measurements – the TKR is a sampling calorimeter
  • Real clustering at low energies?
  • Revisit leakage corrections, especially off-axis
  • ACD to get a facelift: use the revised propagator to associate tracks to tiles (use the extrapolated error matrix?)
  • Expect such work to be made much easier by the new output structures – access to all the detail we’ll need to understand algorithm performance
  • Gaudi allows for plug ’n’ play operation of algorithms, so we can try out new ones easily (see the skeleton below). Have to make sure the recons are set up to exploit this!
  • Started planning extensive test suites to follow performance as the code evolves
  • Tie to Release Manager – track performance vs. release via a database
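A minimal sketch of what plug ’n’ play means in Gaudi: an algorithm is a class with initialize/execute/finalize hooks, and swapping one in or out is a jobOptions edit rather than a rebuild. The algorithm name and its contents are hypothetical (factory/dll boilerplate omitted):

    // NewClusterAlg.cxx -- skeleton only; name and contents are hypothetical.
    // Gaudi instantiates algorithms from jobOptions, e.g.:
    //   ApplicationMgr.TopAlg += { "NewClusterAlg" };
    #include "GaudiKernel/Algorithm.h"
    #include "GaudiKernel/MsgStream.h"

    class NewClusterAlg : public Algorithm {
    public:
        NewClusterAlg(const std::string& name, ISvcLocator* pSvcLocator)
            : Algorithm(name, pSvcLocator) {}

        StatusCode initialize() {
            MsgStream log(msgSvc(), name());
            log << MSG::INFO << "initializing" << endreq;
            return StatusCode::SUCCESS;
        }
        StatusCode execute() {
            // ... fetch digis from the event store, run the new clustering ...
            return StatusCode::SUCCESS;
        }
        StatusCode finalize() { return StatusCode::SUCCESS; }
    };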

  9. Cal ID
  • [Plot: CAL crystal ID for a hit at x = y = 50 mm; horizontal pitch = 27.84 mm => Tower 10, Col 1; axes: Tower, Column]
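A back-of-envelope check of the column number (a sketch only; the real ID scheme depends on the geometry origin and tower layout, which are not given on the slide):

    // Assumed convention: zero-based column index, x measured from the
    // tower edge; the real CAL geometry may differ.
    #include <cmath>
    #include <cstdio>

    int main() {
        double pitch = 27.84;                  // horizontal crystal pitch, mm
        double x     = 50.0;                   // hit position, mm
        int col = (int)std::floor(x / pitch);  // 50 / 27.84 = 1.796 -> col 1
        std::printf("col = %d\n", col);
        return 0;
    }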

  10. Xtal Threshold
  • Threshold @ 2 MeV = 100 ADC: looks close. Might need more stats to understand the cutoff.
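For scale (assuming a roughly linear response; the slide gives only this one point): 100 ADC at 2 MeV corresponds to a gain of about 100 / 2 = 50 ADC counts per MeV, so the 2 MeV threshold should show up as a cutoff near channel 100 in the ADC spectrum.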

  11. Details of Subsystems Data
  • Root output is imminent
  • MC & CAL already available for Digis
  • TKR should appear within a day or two
  • ACD by end of week (we hope, if Heather has recovered!)
  • We have proto Root macros for accessing the new output files and doing some simple checking (a sketch follows)
  • Have sample userAlg files for generating them (userAlg.cxx, requirements and jobOptions)
  • Can demonstrate them this week
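The proto macros themselves are not reproduced here; as an illustration of the kind of simple checking meant, here is a sketch, with the file and tree names assumed:

    // checkOutput.C -- ROOT macro sketch; run with: root -l checkOutput.C
    // File and tree names below are assumptions for illustration.
    void checkOutput()
    {
        TFile* f = TFile::Open("gleam.root");
        if (!f || f->IsZombie()) { printf("can't open file\n"); return; }

        TTree* digiTree = (TTree*)f->Get("Digi");
        if (!digiTree) { printf("no Digi tree\n"); return; }

        // Quick sanity checks: entry count, branch inventory, first event.
        printf("Digi tree has %ld entries\n", (long)digiTree->GetEntries());
        digiTree->Print();   // lists branches and their sizes
        digiTree->Show(0);   // dumps the contents of the first event
    }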

  12. Calibrations
  • Infrastructure: starting to happen now (Joanne Bogart)
  • MySql db for metadata; xml, Root for storing parameters
  • Prototyping now on TKR hot/dead channels (see the sketch below)
  • CAL interested in getting on board early; first redesign of the calibration classes happening
  • TKR: hot/dead channels (Leon); alignment (Hiro Tajima) – should have substantial progress by end 2002
  • CAL (NRL, Bordeaux?): thresholds, pedestals, gains, light tapers; charge injection; using heavy ions in flight
  • ACD: thresholds, pedestals, gains
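Since the calibration classes were being redesigned at the time, the following is only a sketch of the kind of interface a hot/dead-channel calibration might expose; the names and structure are hypothetical, not the actual prototype:

    #include <set>
    #include <utility>

    // Hypothetical hot/dead channel list for one TKR calibration interval.
    // The real design keeps metadata in MySql and parameters in xml/Root.
    class TkrBadStrips {
    public:
        enum Status { GOOD, HOT, DEAD };

        void flagHot(int tray, int strip)  { m_hot.insert(std::make_pair(tray, strip)); }
        void flagDead(int tray, int strip) { m_dead.insert(std::make_pair(tray, strip)); }

        Status status(int tray, int strip) const {
            std::pair<int,int> ch(tray, strip);
            if (m_hot.count(ch))  return HOT;
            if (m_dead.count(ch)) return DEAD;
            return GOOD;
        }
    private:
        std::set<std::pair<int,int> > m_hot;    // channels firing too often
        std::set<std::pair<int,int> > m_dead;   // channels never firing
    };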

  13. Data Pipeline and Mirrors
  • We are in the process of writing a spec for the pipeline now; Alex Schlessinger will write the server etc.
  • The database is specified already – started from SLD experience
  • Set up for flexibility – data are configured per task, so RealData and MC can both be handled (a sketch of the idea follows)
  • See LAT-TD-0533 for the database description; will augment the db for handling server state
  • We expect to have mirrors in 3 locations:
  • SSC – Level 1 DB (summary ntuple needed for astronomy)
  • Italy – so far we have only had contact with ASI. ?
  • France
  • Don’t have input from LAT mirrors on what data to mirror; the choices are Level 1 DB only; plus Recon; plus raw data
  • Initial druther would be not to distribute raw data, to ensure version control over Recon
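A purely illustrative sketch of the "configure data per task" idea (the real schema is in LAT-TD-0533, which is not reproduced here): each task carries its own dataset definitions and processing state, so real data and MC are just different task configurations run through the same pipeline logic.

    #include <string>
    #include <vector>

    // Hypothetical per-task configuration records; the pipeline logic is
    // shared, and only the task record differs between RealData and MC.
    enum RunState { WAITING, RUNNING, DONE, FAILED };

    struct Dataset {
        std::string name;      // e.g. "digi", "recon", "summary ntuple"
        std::string location;  // where the files land on disk
    };

    struct Task {
        std::string name;                // e.g. "RealData" or "MC"
        std::vector<Dataset> datasets;   // data products this task produces
        RunState state;                  // processing state tracked in the RDB
    };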

  14. Science Tools
  • The Science Support Center is at Goddard
  • Well funded – will probably have ~12 people!
  • Their charter is to give the public access to analysis of GLAST data
  • Our goal is to work with them on a single suite of tools
  • A joint LAT-SSC working group has been in operation for a couple of months now
  • It has been fleshing out initial descriptions of, and requirements on, the tool box: http://www-glast.slac.stanford.edu/ScienceTools/tool_defs
  • Public call for comments on the documents, leading into the Science Tools Workshop @ SLAC, June 12-14, organized by Seth Digel (a Real Astronomer)

  15. Communications
  • Meetings (vrvs): Core – Wed 17:30; General – Thurs 17:30; CAL – Wed 16:30; TKR/DSTF – Fri 17:30 (alternating); DocTF – Tues 17:30
  • All have minutes and slides posted: http://www-glast.slac.stanford.edu/software
  • Mailing lists: General – Softlist; Core – InfraSoftlist; TKR – TkrSoftlist; CAL – CalSoftlist; “bug tracking” – HelpSoftList
  • Majordomo via Stanford
  • ICQ: very useful for a distributed group; Trillian for Windows and GAIM for Linux seem the best clients currently
  • Monthly code reviews
