
Simulation Commissioning, Validation, Data Quality

LHCb Simulation Day, 23rd & 24th May 2013. Simulation Commissioning, Validation, Data Quality: a brain dump to prompt discussion. Many points are applicable to any LHCb software, but some are simulation-specific. Gloria Corti.


Presentation Transcript


  1. LHCb Simulation Day, 23rd & 24th May 2013. Simulation Commissioning, Validation, Data Quality. A brain dump to prompt discussion; many points are applicable to any LHCb software, but some are simulation-specific. Gloria Corti

  2. What is what? Commissioning, Validation, Data Quality
  • Commissioning: t = before releasing
  • Validation and Data Quality: t = with productions that could become PRODUCTIONS
  • Adopting a new version of Gauss FOR PRODUCTION is a long process with a set of tests at different levels
  • We have tools, but what is the best tool and what should we do?

  3. My definitions: Commissioning
  • Commissioning of the software per se, i.e. making a new version of Gauss compile and run
    • New compilers, new Gaudi version, new (versions of) generators, new Geant4 version
  • For some of these changes the outcome of the application should not change
  • For the geometry, for example, checking for overlaps is part of this (a log-scan sketch follows below)
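To make the overlap check concrete, here is a minimal sketch of scanning a Gauss/Geant4 job log for overlap warnings. The strings matched below are what recent Geant4 versions print when CheckOverlaps() fires, but the exact message format is an assumption to be verified against the Geant4 version in use.

    #!/usr/bin/env python
    """Scan a Gauss/Geant4 log for geometry-overlap warnings.

    A minimal sketch: the strings matched here ("CheckOverlaps" and
    "Overlap is detected") are assumptions about the Geant4 message
    format and should be checked against the version in use.
    """
    import re
    import sys

    OVERLAP_PATTERN = re.compile(r"CheckOverlaps|Overlap is detected")

    def find_overlaps(logfile):
        """Return (line number, line) pairs that mention overlaps."""
        hits = []
        with open(logfile) as f:
            for n, line in enumerate(f, start=1):
                if OVERLAP_PATTERN.search(line):
                    hits.append((n, line.rstrip()))
        return hits

    if __name__ == "__main__":
        hits = find_overlaps(sys.argv[1])
        for n, line in hits:
            print(f"{n}: {line}")
        # A non-zero exit code lets a nightly test flag the build.
        sys.exit(1 if hits else 0)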

  4. Commissioning Gauss (and Boole)
  • … once things that are supposed to work are committed
    • Not always possible
  • Gauss is in the nightlies but is the winner for red squares!
    • Even when it is successful… partly because Gauss relies on random numbers
  • We (the Gauss managers) need to review what we want to test
    • Each test should look at one thing only; for example, just test that a new event type can run
    • We can customize tests
    • Need to disentangle reference tests that should give identical output from those that track evolution (see the sketch below)
  • We have been using many slots recently
    • Support for various Sim05, Sim06, Sim08
    • Exploring new versions of Geant4
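A sketch of the kind of reference test meant here: with a fixed random seed, a few counters extracted from the job log should be identical to a stored reference. The options file name, the reference file and the "Counter:" log format are hypothetical placeholders, not the actual Gauss test harness; the real tests run in the nightlies machinery.

    #!/usr/bin/env python
    """Reference-test sketch: with a fixed random seed the counters
    in the log should match a stored reference exactly.

    The options file, reference file and 'Counter:' format are all
    hypothetical placeholders.
    """
    import re
    import subprocess

    def extract_counters(text):
        """Pull 'Counter: <name> = <value>' lines out of a job log."""
        return dict(re.findall(r"Counter:\s+(\S+)\s+=\s+(\S+)", text))

    def run_job():
        # Hypothetical invocation; a real test would run gaudirun.py
        # with options that fix the random seeds.
        result = subprocess.run(
            ["gaudirun.py", "Gauss-Job.py"],  # placeholder options file
            capture_output=True, text=True, check=True,
        )
        return extract_counters(result.stdout)

    if __name__ == "__main__":
        with open("gauss.ref") as f:  # stored reference counters
            reference = extract_counters(f.read())
        new = run_job()
        diffs = {k: (reference.get(k), v) for k, v in new.items()
                 if reference.get(k) != v}
        for name, (ref, val) in diffs.items():
            print(f"MISMATCH {name}: reference={ref} new={val}")
        raise SystemExit(1 if diffs else 0)

Tests that track evolution rather than exact identity would instead compare the counters within a statistical tolerance.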

  5. My definitions: Validation of simulation
  • Checking that only the expected changes in physics and detector modelling are there, i.e. physics validation
    • Checking output with special productions (see the comparison sketch below)
    • Need at least a few 1000 events
    • Particle-gun productions
    • Productions with different simulation settings of a few 100k events
  • Checking that performance is 'as expected' and 'acceptable', i.e. software validation, and keeping track of its evolution
    • Again, need at least a few 1000 events
  • Check that the whole simulation processing chain works
    • Integration tests with old and new conditions
    • 'Commissioning' with smallish samples, a few 1000 events
  • Validating that the 'final' configuration(s) are not worse than the previous ones
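One way to make "only the expected changes are there" concrete is a statistical comparison of distributions between a reference production and the new one. The sketch below uses a two-sample Kolmogorov-Smirnov test from SciPy; the file names and the variable being compared are assumptions for illustration.

    #!/usr/bin/env python
    """Compare one distribution between a reference and a new
    production with a two-sample KS test.

    The file names and the variable are illustrative; a real
    validation would loop over many monitoring distributions.
    """
    import numpy as np
    from scipy.stats import ks_2samp

    def load_sample(path):
        # Placeholder: one value per line, e.g. a track dE/dx.
        return np.loadtxt(path)

    reference = load_sample("sim06_dedx.txt")  # old production (assumed file)
    candidate = load_sample("sim08_dedx.txt")  # new production (assumed file)

    stat, pvalue = ks_2samp(reference, candidate)
    print(f"KS statistic = {stat:.4f}, p-value = {pvalue:.4f}")

    # With a few 1000 events the test is sensitive to genuine shape
    # changes but not to statistical noise.
    if pvalue < 0.01:
        print("Distributions differ: check whether the change is expected.")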

  6. Recent validation of Sim08
  • In Sim08 we changed a lot in Gauss and in the simulation conditions
    • New version of Pythia6, Pythia8 at production quality, new EvtGen, new Gaudi version, new compilers, new version of Geant4, new hadronic physics list used
  • Started out with building in the nightlies
    • As usual, compilation errors (new Gaudi, new Geant4, new compiler…)
  • Made private test productions for interesting new features and to understand how to use the new functionalities
  • Gauss stand-alone studies
    • To investigate new Pythia8, new EvtGen, new hadronic physics list cross sections, dE/dx for various EM physics lists
  • Central productions for systematic studies of the effect on physics and to verify calibrations
  • Particle-gun studies (Gauss and Gauss+Boole)
    • Check effects of dE/dx on tracking and calorimetry
    • Check effect of hadronic physics on asymmetries (see the sketch below)
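As a sketch of the asymmetry check, the snippet below compares a charge asymmetry A = (N+ − N−)/(N+ + N−) with its binomial error between samples simulated with two hadronic physics lists. The counts are made-up placeholders; in reality they would come from reconstructed samples produced with each list.

    #!/usr/bin/env python
    """Compare a charge asymmetry between two hadronic physics lists.

    The counts below are made-up placeholders for, e.g., pi+ and pi-
    candidates surviving reconstruction in each sample.
    """
    import math

    def asymmetry(n_plus, n_minus):
        """A = (N+ - N-)/(N+ + N-) with its binomial error."""
        n = n_plus + n_minus
        a = (n_plus - n_minus) / n
        err = math.sqrt((1.0 - a * a) / n)
        return a, err

    a_old, e_old = asymmetry(50420, 49580)  # old physics list (assumed)
    a_new, e_new = asymmetry(50910, 49090)  # new physics list (assumed)

    diff = a_new - a_old
    err = math.sqrt(e_old**2 + e_new**2)
    print(f"old A = {a_old:.4f} +- {e_old:.4f}")
    print(f"new A = {a_new:.4f} +- {e_new:.4f}")
    print(f"difference = {diff:.4f} +- {err:.4f} "
          f"({abs(diff)/err:.1f} sigma)")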

  7. Data Quality for MC
  • But still things slip through!
  • Some things can only be caught in production
    • Jobs getting stuck (a watchdog sketch follows below)
    • Whether all reconstruction distributions make sense
  • Need Data Quality for the MC
    • See Marco A.'s slides
  • Investigated in the past how to use Data Quality tools for validation and regression tests
    • Far from ideal for MC, as there is not really a single reference
    • See Ben's slides on tools for a validation test infrastructure
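"Jobs getting stuck" is something a simple watchdog can catch. The sketch below wraps a job in a timeout using Python's subprocess module; the command line is a placeholder, and in real productions the grid middleware plays this role, but the idea is the same: kill and flag any job that does not finish within a time limit.

    #!/usr/bin/env python
    """Watchdog sketch for catching stuck simulation jobs.

    The command line is a placeholder; production systems rely on the
    grid middleware for this, but the principle is identical.
    """
    import subprocess

    TIME_LIMIT = 6 * 3600  # seconds; tune to the expected job length

    def run_with_watchdog(cmd):
        """Run cmd; return (status, note), status in ok/stuck/failed."""
        try:
            result = subprocess.run(cmd, timeout=TIME_LIMIT,
                                    capture_output=True, text=True)
        except subprocess.TimeoutExpired:
            return "stuck", f"no completion within {TIME_LIMIT} s"
        if result.returncode != 0:
            return "failed", f"exit code {result.returncode}"
        return "ok", ""

    if __name__ == "__main__":
        status, note = run_with_watchdog(["gaudirun.py", "Gauss-Job.py"])
        print(status, note)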

  8. LHCbPR tool
