
Acctesting Framework - Motivation, Overview and First Experience




  1. Acctesting Framework - Motivation, Overview and First Experience
  • Kajetan Fuchsberger
  • TE-MPE-TM, 2012-05-10
  • Thanks to: M. Galetzka, V. Baggiolini, R. Gorbonosov, M. Pojer, M. Solfaroli Camillocci, M. Zerlauth

  2. Content

  3. Content

  4. LHC Hardware Commissioning
  • About 7000 tests on magnet circuits.
  • Must be executed every year after the Christmas stop (potentially more after LS1).
  • Workflow:
  • 'Execution': the test sequence is executed on the Hardware Commissioning Sequencer.
  • 'Analysis': the test data (measured signals) is analysed (manually or partly automatically).
  • 'Signing': experts have to 'sign' the test result (if manually analysed).

  5. Motivation I
  • Initial project description: "Create a replacement for the p2n web page, aka Alvaro's pages."
  • PHP, grown over time → hard to maintain.

  6. Motivation II
  • New requirements:
  • Some tests should be allowed to be executed even if the previous analysis was not completed.
  • Avoid starting a test if the system is not ready (e.g. locked).
  • Avoid starting tests on circuits on the same QPS controller.
  • Force tests on QF and QD to start at the same time.
  • … and some more …

  7. Motivation III
  • The HWC legacy system: many interdependent systems:
  • Sequencer
  • DB (central point)
  • Web page
  • Daemon
  • FESA class
  • LabView
  • …
  • → Decision: design a new system that orchestrates the whole process (and can replace several parts).

  8. Content

  9. Overview
  • (Architecture diagram, with components such as:)
  • HWC Sequencer, LHC Sequencer or custom components
  • Magnet circuits, BIC/PIC, …
  • LabView, Powering Server, …

  10. Acctesting Server
  • Orchestrates the whole process: test execution and test analysis.
  • Exclusively reads/persists data in the database.
  • Notifies all the GUIs about changes.
  • Robust design (see the sketch below):
  • Continuously persists relevant data to be able to recover in case of a crash.
  • Gracefully handles unexpected behaviour of execution and analysis components.
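A minimal sketch of the "persist first, then notify" idea behind this robust design. All names here (TestStateStore, GuiNotifier, AcctestingServerSketch) are hypothetical illustrations, not the real Acctesting classes: every state change is written to the database before any GUI is told about it, so a restarted server can rebuild its state from the database alone.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

interface TestStateStore {                    // e.g. backed by the HWC database
    void save(String testId, String state);
}

interface GuiNotifier {                       // e.g. one per connected GUI
    void stateChanged(String testId, String state);
}

final class AcctestingServerSketch {
    private final TestStateStore store;
    private final List<GuiNotifier> guis = new CopyOnWriteArrayList<>();

    AcctestingServerSketch(TestStateStore store) { this.store = store; }

    void register(GuiNotifier gui) { guis.add(gui); }

    /** Persist the new state before notifying, so a crash never loses it. */
    void transition(String testId, String newState) {
        store.save(testId, newState);         // survives a server crash
        for (GuiNotifier gui : guis) {
            try {
                gui.stateChanged(testId, newState);
            } catch (RuntimeException e) {    // a misbehaving GUI must not stop the server
                guis.remove(gui);
            }
        }
    }
}
```

Since the database is the only authoritative store, a crash between the save and the notification costs at most a missed GUI update, never lost state.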

  11. Workflow I
  • 1. The user "expresses his wish" to execute one (or many) tests.

  12. Workflow II
  • 2. The tests go to the execution basket (on the server!).
  • 3. The scheduler (on the server) decides when to start which test(s) (see the sketch below).
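A rough sketch of how such a basket-plus-scheduler split could look; the names (Constraint, SchedulerSketch) and signatures are assumptions for illustration, not the actual implementation. GUIs only enqueue requests; the scheduler re-evaluates the basket against all constraints whenever a test is requested or finishes.

```java
import java.util.*;

interface Constraint {
    /** May 'candidate' run together with the already running tests? */
    boolean allows(String candidate, Set<String> running);
}

final class SchedulerSketch {
    private final Queue<String> basket = new ArrayDeque<>(); // requested tests
    private final Set<String> running = new HashSet<>();
    private final List<Constraint> constraints;

    SchedulerSketch(List<Constraint> constraints) { this.constraints = constraints; }

    synchronized void request(String testId) {   // called from any GUI
        basket.add(testId);
        startWhatIsAllowed();
    }

    synchronized void finished(String testId) {  // called when a test completes
        running.remove(testId);
        startWhatIsAllowed();                    // conditions may be fulfilled now
    }

    private void startWhatIsAllowed() {
        for (Iterator<String> it = basket.iterator(); it.hasNext(); ) {
            String test = it.next();
            boolean allowed = constraints.stream().allMatch(c -> c.allows(test, running));
            if (allowed) {
                it.remove();
                running.add(test);
                System.out.println("starting " + test); // would delegate to a handler
            }
        }
    }
}
```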

  13. Why so complicated?
  • Central scheduling can respect all the conditions, even if requests come from different GUIs.
  • When conditions are fulfilled later, the tests are started automatically (no delays).
  • No need to reserve circuits anymore. → More dynamic.

  14. Test Phases
  • A TestPhase contains one or more tests.
  • Tests within a test phase can be executed in arbitrary order (see the sketch below).
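A small illustration of this containment, with hypothetical names: a phase is just an unordered group of tests that counts as finished once all of them are done.

```java
import java.util.LinkedHashSet;
import java.util.Set;

final class TestPhaseSketch {
    private final String name;
    private final Set<String> tests = new LinkedHashSet<>(); // no ordering within the phase

    TestPhaseSketch(String name) { this.name = name; }

    void add(String testId) { tests.add(testId); }

    /** A phase is finished once every contained test is done. */
    boolean isFinished(Set<String> doneTests) {
        return doneTests.containsAll(tests);
    }
}
```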

  15. Preconditions
  • Before: only an email exchange, e.g. that "Cryo is ready for a certain group of circuits".
  • Hard to track (history).
  • No real constraint on test execution (→ many wrongly started tests in previous years).
  • Now: a 'SignOnlyTest' for each precondition (see the sketch below). → Automated tests.
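One way a 'SignOnlyTest' could be modelled; the names and fields are hypothetical. The point is a test with no execution or analysis step: it simply blocks until an expert signs it, leaving a traceable record where previously there was only an email.

```java
import java.time.Instant;

final class SignOnlyTestSketch {
    private final String description;   // e.g. "Cryo ready for circuit group X"
    private String signedBy;
    private Instant signedAt;

    SignOnlyTestSketch(String description) { this.description = description; }

    void sign(String expert) {          // the only 'step' of this test type
        this.signedBy = expert;
        this.signedAt = Instant.now();
    }

    /** Until signed, the precondition blocks the tests that depend on it. */
    boolean isSigned() { return signedBy != null; }
}
```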

  16. Designed for Extension
  • Extension points on the server:
  • TestStepHandler (handles certain types of SystemTests)
  • Constraints (restrict test execution)
  • LockProvider (PIC, Db, …)
  • SystemInformationProvider (e.g. issues)
  • Extension points in the GUI:
  • TestResultsViewer (e.g. Powering Server)

  17. Test Step Handlers
  • Responsible for executing a specific TestStep (execution, analysis) for a certain type of tests.
  • Each handler is itself responsible for communicating with other systems, if required (see the sketch below).
  • Examples:
  • HwcTestExecutionHandler: communicates with the HWC Sequencer to execute the tests.
  • DaemoneAnalysisHandler: communicates with the LabView system to retrieve analysis results.
  • … Future: PicBicTestExecutionHandler?
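A sketch of what such an extension point could look like; the interface and signatures here are assumptions for illustration, not the real Acctesting API. Each handler declares which test type and step it is responsible for and hides the communication with the external system behind a callback.

```java
interface TestStepHandlerSketch {
    /** Which test type / step combination this handler is responsible for. */
    boolean canHandle(String testType, String step);

    /** Runs the step and reports the outcome back to the server. */
    void handle(String testId, StepCallback callback);
}

interface StepCallback {
    void finished(String testId, boolean success);
    void failed(String testId, Throwable cause);
}

/** Example: delegating execution of HWC tests to the HWC sequencer. */
final class HwcExecutionHandlerSketch implements TestStepHandlerSketch {
    @Override public boolean canHandle(String testType, String step) {
        return "HWC".equals(testType) && "EXECUTION".equals(step);
    }
    @Override public void handle(String testId, StepCallback callback) {
        // would start the sequence on the HWC sequencer here; assumed successful
        callback.finished(testId, true);
    }
}
```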

  18. Constraints
  • A simple extension point which has to decide whether one test is allowed to run together with another one (a simple yes/no decision).
  • Used to formulate requirements like: "Only start one test on one of the four circuits on the same QPS controller."
  • Checked by the scheduler to decide whether a certain test configuration is allowed or not (see the sketch below). → See Michael's presentation.
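As an illustration, the QPS-controller requirement quoted above could be expressed as such a yes/no constraint. The class, method and parameter names are hypothetical; the slides only state that a constraint yields a simple yes/no decision.

```java
import java.util.Map;
import java.util.Set;

final class SameQpsControllerConstraintSketch {
    private final Map<String, String> qpsControllerOfCircuit; // circuit -> QPS controller

    SameQpsControllerConstraintSketch(Map<String, String> mapping) {
        this.qpsControllerOfCircuit = mapping;
    }

    /** May a test on 'candidateCircuit' run while tests on 'runningCircuits' run? */
    boolean allows(String candidateCircuit, Set<String> runningCircuits) {
        String controller = qpsControllerOfCircuit.get(candidateCircuit);
        if (controller == null) return true;   // unknown circuit: no restriction here
        return runningCircuits.stream()
                .noneMatch(c -> controller.equals(qpsControllerOfCircuit.get(c)));
    }
}
```

Because each constraint is a self-contained yes/no check, a scheduler can simply combine all of them with a logical AND for every candidate test configuration.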

  19. Soft Migration
  • A working system had to be in place for the 2012 start-up, but:
  • The old system should still work.
  • The switch between the two systems should be easy.
  • Achieved by using the old Db schema for the new system (some restrictions!) plus some (careful) extensions.
  • → The web page stayed operational.

  20. Content

  21. Some Statistics
  • (Chart: min. 5 tests launched.)
  • 4 big campaigns.

  22. Test Statistics
  • (Charts: percentage of failed tests that timed out; average number of tests per shift.)
  • Test failure rate did not decrease much!? → Efficiency increased. → (Preconditions!)

  23. Time between Tests?
  • Expectation: the average time between test executions on one circuit should decrease!?
  • Not evident!? (Chart: time [h]; data: X-mas 11/12.)
  • In the shadow of analysis time!

  24. Analysis Time
  • (Chart.) PIC2 and PNO.a1 analysis automated (if successful).

  25. HWC Campaign 2012
  • The system was operational from the beginning. No big problems; it never stopped progress.
  • The GUI was very well accepted (lots of positive feedback).
  • Preconditions were useful, at least for tracking. (Users' point of view?)
  • Constraints turned out to be very useful, although sometimes puzzling. They prevented many mistakes; many additional constraints were added during the campaign.
  • After some initial doubt, the automated scheduling was found to be very convenient.

  26. To Improve
  • Missing: editing of test plans! (A lot of Db hacks were necessary.)
  • Constraints should be more dynamic (currently Java classes) … dynamic loading of plugins?
  • Old page still used for statistics and quick lookups. → Missing GUI features.

  27. Next Steps
  • Database migration (the old web page will die).
  • Improvements of the GUI statistics features (summer student).
  • Add a test-plan editor to the GUI.
  • Improve the scheduling algorithm (→ Michael).
  • Integrate other systems:
  • Beam commissioning as SignOnly tests. → Migrate smoothly to partly automated tests?
  • BIC/PIC tests?
  • Others?
  • More automated analysis (Powering Server).

  28. Conclusion
  • The system worked very well in the 2011/12 campaign.
  • Some improvements still needed (test-plan editing, more dynamic constraints, new GUI features).
  • A lot to improve in the analysis part!
  • Important next steps:
  • Db migration.
  • Improvement of scheduling.
  • Improvement of automated analysis.
  • Integration of more systems.

  29. Thank you for your attention!
