Fermi Large Area Telescope (LAT) Integration and Test (I&T) Data Experience and Lessons Learned
LSST Camera Workshop, Brookhaven, March 2012
Tony Johnson (for Richard Dubois), Fermi Data Handling Manager (but not involved in I&T)

Presentation Transcript


  1. Fermi Large Area Telescope (LAT) Integration and Test (I&T) Data Experience and Lessons Learned
  LSST Camera Workshop, Brookhaven, March 2012
  Tony Johnson (for Richard Dubois), Fermi Data Handling Manager (but not involved in I&T)
  Most slides from Anders Borgland (Science Verification, Analysis and Calibration manager)

  2. Fermi LAT <> LSST
  • We only dealt with I&T for the LAT
    • not the spacecraft itself (thanks NASA)
  • NASA did have specific requirements for documenting I&T of the LAT

  3. Fermi LAT I&T overview
  • Integration and Test
    • Before we launched Fermi we went through a long Integration & Test phase of the LAT.
    • We had an Engineering Model of one of the towers.
    • When we got the complete towers, we integrated them one by one into the 4x4 grid.
      • We took cosmic-ray data and Van de Graaff test data along the way.
    • In addition we took beam-test data on prototype modules at CERN, SLAC and GSI.
  • In parallel:
    • Intense development of reconstruction/analysis code and Monte Carlo simulations
      • Detector modeling and response
    • Extensive "Data Challenge" program
  • Wherever possible the same software was used for all of these activities:
    • Data formats for raw, reconstructed and "housekeeping" data (see the sketch after this slide)
    • Reconstruction and analysis code
    • Data catalog, automated data processing (pipeline)
    • Electronic logbook
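To make the shared-format idea concrete, here is a minimal Python sketch of a self-describing event header that cosmic runs, beam tests and MC could all write. The magic string, field names and layout are invented for illustration; they are not the actual LAT raw-data format.

```python
# Hypothetical self-describing event header shared by cosmic runs, beam tests
# and MC. The magic string, field names and layout are invented for this
# sketch; they are not the real LAT raw-data format.
import struct

HEADER = struct.Struct("<4sHHIQ")  # magic, format version, source type, run, timestamp

SOURCES = {0: "cosmic", 1: "beam_test", 2: "van_de_graaff", 3: "monte_carlo"}

def write_event_header(f, version, source, run, timestamp_ns):
    """Write a fixed-size header so any later reader can identify the file."""
    f.write(HEADER.pack(b"LATD", version, source, run, timestamp_ns))

def read_event_header(f):
    """Read the header back; the format version tells old readers to bail out."""
    magic, version, source, run, timestamp_ns = HEADER.unpack(f.read(HEADER.size))
    if magic != b"LATD":
        raise ValueError("not a LAT-style data file")
    return {"format_version": version, "source": SOURCES.get(source, "unknown"),
            "run": run, "timestamp_ns": timestamp_ns}
```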

  4. Data Products
  • Complete set of data needs to be recorded and cataloged (see the sketch below)
  • Needs to remain accessible/usable for 10+ years
  [Slide diagram: the products of a test, flowing from DAQ through Reconstruction/Analysis]
  • Housekeeping data: voltages, temperatures, beam conditions, etc.
  • Detector configuration, geometry, etc.
  • Raw data (from the DAQ)
  • Code versions, configuration, log files (the code itself?)
  • Reconstructed data
  • Electronic logbook, formal test report
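A minimal sketch of what a single catalog entry tying these products together might look like; the record structure and field names are assumptions for illustration, not the schema of the actual Fermi data catalog.

```python
# Illustrative per-run catalog record linking the products listed above. These
# field names are assumptions, not the schema of the Fermi data catalog.
from dataclasses import dataclass, field

@dataclass
class TestRunRecord:
    run_id: int
    raw_data_path: str                   # DAQ output
    housekeeping_path: str               # voltages, temperatures, beam conditions
    detector_config: str                 # configuration/geometry identifier
    code_version: str                    # e.g. the release tag used for reconstruction
    config_files: list = field(default_factory=list)
    log_files: list = field(default_factory=list)
    reconstructed_paths: list = field(default_factory=list)
    logbook_entry: str = ""              # link to electronic logbook / test report
```

Keeping the code version and configuration in the same record as the data paths is what lets a run be understood and reprocessed a decade later.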

  5. Data Records – Web Interface

  6. Software/Tools Evolution
  • Reconstruction/analysis code
    • Had to deal with both real data taking and MC simulations.
    • Had to be flexible enough to deal with different detector configurations
      • Preferably via configuration files, with no need to rebuild
      • Fermi allows the detector geometry to be read from an XML file (sketched below)
  • Fermi LAT code:
    • Made up of individual software packages: Cal reconstruction, energy estimation, track finding, etc.
    • Version control using CVS
    • Software release: a collection of a consistent set of software packages
      • Currently >90 packages
      • Called 'GlastRelease'
  • Original idea:
    • GlastRelease:
      • MC oriented
      • Bleeding edge – all development takes place here
    • EngineeringModel:
      • A frozen version of GlastRelease for real data taking
      • Stable: resync to GlastRelease once in a while
      • Bug 'free': bugs fixed in GlastRelease before making it into EngineeringModel
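A minimal sketch of the read-geometry-from-a-file idea, so one executable can handle different detector configurations without a rebuild. The <tower> element and its attributes are an invented schema; the real LAT geometry description is much richer.

```python
# Sketch of run-time geometry loading from XML. The <tower> schema here is
# invented for illustration; it is not the LAT geometry format.
import xml.etree.ElementTree as ET

def load_geometry(path):
    """Build a tower map from an XML geometry file chosen at run time."""
    towers = {}
    for tower in ET.parse(path).getroot().iter("tower"):
        towers[int(tower.get("id"))] = {
            "row": int(tower.get("row")),
            "col": int(tower.get("col")),
            "z_offset_mm": float(tower.get("z_offset_mm", "0")),
        }
    return towers  # e.g. 16 entries for the full 4x4 grid
```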

  7. Real World
  • Data is different from MC:
    • Requirements for processing data are different from those for MC.
    • May have to add new features to process data.
  • Data-taking schedule is different from MC:
    • The data-taking schedule drives when and how fast to add new features.
  • In short:
    • EngineeringModel was quite often more advanced than GlastRelease.
    • They diverged to a large degree, and were often out of sync.
    • It was non-trivial to deal with all this.
  • The I&T team needs to test new software features/bug fixes quickly:
    • You usually need bugs fixed yesterday!
  • You need to give quick feedback to the developers:
    • The more details you can give, the better ...
  • Excellent communication between the I&T team and the software developers is essential.

  8. Lessons Learned
  • Planned well in advance to use common software for I&T and real data taking
    • Different requirements and timescales make this challenging
  • Planned to use the same tools for I&T and real data
    • Evolution of software makes this hard
  • The version of the reconstruction software used for I&T is now very old
    • Current code can still read old data – but for how long?
    • Need to record the code version used, and may need to maintain old versions of the software used for I&T (see the sketch below)
  • Used an electronic logbook, a housekeeping database, a data catalog and a processing pipeline to record processing history
    • In all cases newer versions of these tools have evolved
    • I&T data is still accessible, but only through old/obsolete versions of the tools
    • Need to maintain the old tools, or plan to actively migrate the data over time
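One concrete way to act on the record-the-code-version lesson is to write a small provenance file beside every output. This sketch uses assumed key names and an assumed file-naming convention; it is not the actual pipeline's bookkeeping.

```python
# Sketch of writing a provenance file next to each output, so the exact code
# version can be recovered years later. Keys and the ".provenance.json"
# convention are assumptions, not the real pipeline's scheme.
import datetime
import json
import platform

def write_provenance(output_path, code_release, package_versions, config_files):
    meta = {
        "output": output_path,
        "code_release": code_release,      # e.g. the release tag used
        "packages": package_versions,      # per-package versions within the release
        "configs": config_files,
        "host": platform.node(),
        "processed_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    with open(output_path + ".provenance.json", "w") as f:
        json.dump(meta, f, indent=2)
```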
