
Automation Strategies for LHC System Tests and Re-Commissioning after LS1

Presentation Transcript


  1. Automation Strategies for LHC System Tests and Re-Commissioning after LS1 Kajetan Fuchsberger TE-MPE LS1 Workshop On behalf of the TE-MPE-MS Software Team: M. Audrain, A. Gorzawski, J. Suchowski, J.-C. Garnier

  2. Content

  3. Content

  4. Activities • Automation for LHC Commissioning + software quality improvements: • AccTesting Framework • Post Mortem Framework • Why during LS1? • AccTesting: Improve automation and reliability of commissioning. • PM: Follow technology changes • Impact, if not done: • AccTesting: Inability to integrate other systems and tests • In general: rotting software

  5. Schedule • Not dependent on the general LS1 schedule, except: • Would like to have a new version ready for the start of LS1 to get feedback. • New AccTesting features must be ready by the beginning of powering (end of April 2014). • PM: To be operational at the end of LS1. • (Timeline figure: "BEFORE" / "WITHIN" / "AFTER" LS1, spanning end of March 2013 to mid-April 2014)

  6. Content

  7. From Alvaro's pages to AccTesting • Currently operating in parallel • During LS1: Decommissioning of "Alvaro's pages"

  8. Database Migration • Current situation (LSA DB): • (-) Circuits only • (-) HWC & sign tests only • (-) One campaign only • Dedicated AccTesting DB (redesign): • (+) Different systems • (+) Different tests • (+) Flexible test plans • (+) Many campaigns (see the hypothetical sketch below) → Before LS1
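To make "flexible test plans" and "many campaigns" concrete, here is a minimal, hypothetical Java sketch of such a data model; all class and field names are invented for illustration and do not reflect the actual AccTesting database schema:

```java
// Hypothetical data model illustrating "flexible test plans" and "many
// campaigns": a campaign groups test plans for arbitrary systems, and each
// test plan is an ordered list of tests. All names are invented for this
// sketch and do not reflect the real AccTesting database design.
import java.util.List;

public class TestPlanSketch {

    record Test(String name, String description) {}

    record TestPlan(String systemName, List<Test> tests) {}

    record Campaign(String name, List<TestPlan> plans) {}

    public static void main(String[] args) {
        Campaign campaign = new Campaign("HWC campaign (example)", List.of(
                new TestPlan("Superconducting circuit (example)", List.of(
                        new Test("PIC2", "Powering interlock test"),
                        new Test("PNO.a1", "Powering test, step a1"))),
                new TestPlan("QPS crate (example)", List.of(
                        new Test("IST.QPS", "Individual system test")))));

        // Unlike the old circuits-only, single-campaign schema, nothing here
        // restricts the system type, the tests, or the number of campaigns.
        campaign.plans().forEach(plan ->
                System.out.println(plan.systemName() + " -> " + plan.tests()));
    }
}
```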

  9. AccTesting GUI • Improvements: • More information on tests and systems (MTF files, issues, history) - see also Arek's presentation • Easy editing of test plans • Multiple campaigns → Within LS1

  10. Analysis Times • PIC2 and PNO.a1: analysis automated (if successful)

  11. Automatic Test Analysis • Analysis logic close to the HWC sequences (Java classes) • (Diagram: the AccTesting Server drives the HWC Sequencer and the PM Powering Server) • HWC Sequencer: execute → ExecutionResult, analyze → AnalysisResult • PM Powering Server: PMEA + automatic analysis modules, analyze → AnalysisResult • (A sketch of this execute/analyze contract follows below) → Within LS1
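The slide names two result types, ExecutionResult and AnalysisResult. A minimal sketch of how such an execute/analyze contract could look in Java is given below; apart from those two type names, the interfaces, enum and values are assumptions made for illustration, not the actual AccTesting or PM API:

```java
// Minimal sketch of the execute/analyze contract suggested by the slide.
// The type names ExecutionResult and AnalysisResult appear on the slide;
// everything else here is invented for illustration.
import java.time.Instant;

public class AutomaticAnalysisSketch {

    enum Verdict { SUCCESS, FAILURE, UNDECIDED }

    record ExecutionResult(String testName, Instant start, Instant end, boolean completed) {}

    record AnalysisResult(String testName, Verdict verdict, String comment) {}

    /** Something that can execute a test, e.g. the HWC sequencer side. */
    interface TestExecutor {
        ExecutionResult execute(String testName);
    }

    /** Something that can analyze an execution, e.g. an automatic PM analysis module. */
    interface AnalysisModule {
        AnalysisResult analyze(ExecutionResult execution);
    }

    public static void main(String[] args) {
        TestExecutor executor = testName ->
                new ExecutionResult(testName, Instant.now(), Instant.now(), true);

        AnalysisModule module = execution -> new AnalysisResult(
                execution.testName(),
                execution.completed() ? Verdict.SUCCESS : Verdict.UNDECIDED,
                "automatic analysis (illustrative only)");

        // A server orchestrating the tests would chain execution and analysis.
        AnalysisResult result = module.analyze(executor.execute("PIC2"));
        System.out.println(result);
    }
}
```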

  12. Reuse of existing PM modules • 600A EE (failing capacitor discharge) • Main Quad EE • 600A QPS Detection Crate (detection of simulated quench signal)

  13. Post Mortem System • Adaptation of some modules + communication with the AccTesting Framework • Work induced by outside changes (operating systems, middleware) • Maintainability & testability improvements • Improvement of PM access from the GPN → Within LS1

  14. BIC Communication Tests • Testing communication between PICs, WICs and FMCMs and the BICs • Migration of existing GUI + dedicated DB tables to the generic framework + DB • Reuse of existing code (a hypothetical sketch of such a check follows below) • Courtesy: I. R. Ramirez → Within LS1
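As a rough, purely illustrative idea of what a generic "client system to BIC" communication check could look like once migrated to the generic framework, here is a hypothetical Java sketch; the interface, methods and names are all invented and are not the existing code referred to on the slide:

```java
// Purely illustrative sketch of a generic "client system to BIC" communication
// check. The interface, methods and names are invented; this is not the
// existing code being migrated.
import java.util.List;

public class BicCommunicationTestSketch {

    /** Abstraction over a link from a client system (PIC, WIC, FMCM) to a BIC. */
    interface BicLink {
        String clientName();
        String bicName();
        boolean sendTestFrameAndAwaitAck();
    }

    record Outcome(String client, String bic, boolean ok) {}

    static List<Outcome> runAll(List<BicLink> links) {
        return links.stream()
                .map(link -> new Outcome(link.clientName(), link.bicName(),
                        link.sendTestFrameAndAwaitAck()))
                .toList();
    }

    public static void main(String[] args) {
        // Dummy link that always acknowledges, just to make the sketch runnable.
        BicLink dummy = new BicLink() {
            public String clientName() { return "FMCM (example)"; }
            public String bicName() { return "BIC (example)"; }
            public boolean sendTestFrameAndAwaitAck() { return true; }
        };
        runAll(List.of(dummy)).forEach(System.out::println);
    }
}
```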

  15. ELQA • Test results + parameters stored in an Oracle DB (EDMSDB) • Imported by the AccTesting server and displayed in the AccTesting GUI (a JDBC-style import sketch follows below) • Courtesy: M. Bednarek, J. Ludwin → Before LS1
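The flow on the slide (ELQA test results and parameters in the Oracle EDMSDB, imported by the AccTesting server) could be sketched with plain JDBC roughly as follows; the connection URL, credentials, table and column names are assumptions for illustration only:

```java
// Hypothetical import of ELQA results from an Oracle database via plain JDBC.
// The connection URL, credentials, table and column names are placeholders;
// the real EDMSDB schema is not shown on the slide. Requires an Oracle JDBC
// driver on the classpath to actually connect.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

public class ElqaImportSketch {

    record ElqaResult(String systemName, String testName, String verdict) {}

    static List<ElqaResult> importResults(String jdbcUrl, String user, String password)
            throws Exception {
        List<ElqaResult> results = new ArrayList<>();
        try (Connection connection = DriverManager.getConnection(jdbcUrl, user, password);
             PreparedStatement statement = connection.prepareStatement(
                     "SELECT system_name, test_name, verdict FROM elqa_results");
             ResultSet rows = statement.executeQuery()) {
            while (rows.next()) {
                results.add(new ElqaResult(
                        rows.getString("system_name"),
                        rows.getString("test_name"),
                        rows.getString("verdict")));
            }
        }
        return results;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder URL and credentials; replace with real connection details.
        importResults("jdbc:oracle:thin:@//dbhost:1521/SERVICE", "user", "password")
                .forEach(System.out::println);
    }
}
```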

  16. IST.QPS • Details: see Arek's presentation → Before LS1

  17. Content

  18. Collaboration with Others • Other MPE Sections: • Discussion & Cooperation, but no dependencies • BE-CO-DO: • Development & Build tools • Continuous Integration & SW Quality monitoring • BE-CO-DA: • Integration of AccTesting with other frameworks • PM java layers • Database design & services • No potential issues so far.

  19. Scrum • Iterative process framework: • Time boxes • Continuous improvements • + Enables us to manage/estimate time! • Sprint: iteration (3 weeks in our case) • User Story: unit of work (use case; feature) • Story Point: complexity of a user story (1 StP ≈ 5 man-days in our case)

  20. Scrum Experience • Team of 5 people, 6 sprints so far • Team velocity: 16 ± 3 StP / sprint • Until end of LS1 (22 April 2014): 22 sprints → 352 ± 66 story points
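The projection on this slide is straightforward arithmetic: 22 remaining sprints times a measured velocity of 16 ± 3 story points per sprint gives 352 ± 66 story points, as the small sketch below reproduces:

```java
// Capacity projection exactly as on the slide:
// 22 remaining sprints * (16 +/- 3) story points per sprint = 352 +/- 66 StP.
public class VelocityProjection {
    public static void main(String[] args) {
        int remainingSprints = 22;  // until end of LS1 (22 April 2014)
        int velocity = 16;          // measured story points per sprint
        int spreadPerSprint = 3;    // observed spread

        int expected = remainingSprints * velocity;        // 352
        int spread = remainingSprints * spreadPerSprint;   // 66
        System.out.printf("Projected capacity: %d +/- %d story points%n", expected, spread);
    }
}
```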

  21. Resources • Total capacity: ~350 StP until the end of LS1 (shared between 9 projects) • AccTesting & PM: 218 StP required (~60%!)

  22. Summary • Before LS1: • DB migration, GUI improvements • ELQA, IST.QPS • Within LS1: • Decommissioning of Alvaro's pages • Additional automatic analysis using the Powering PM server • PM improvements • PIC/WIC/FMCM to BIC communication tests • No showstoppers identified. Efforts hard to estimate → lots of unknowns. • Tight resources: • AccTesting + PM → need ~60% of SW resources • Re-prioritization might be necessary • Soon: small working group on automation • Some parts could be written in collaboration with other people/sections (e.g. analysis modules, verification logic)

  23. Thank you for your Attention! Questions?
