
  • Acctesting Framework - Motivation, Overview and First Experience

  • Kajetan Fuchsberger

  • TE-MPE-TM, 2012-05-10

  • Thanks to:

  • M. Galetzka, V. Baggiolini, R. Gorbonosov, M. Pojer,

  • M. Solfaroli Camillocci, M. Zerlauth




LHC Hardware Commissioning

  • About 7000 Tests on Magnet Circuits

  • Must be executed every year after Christmas Stop (Potentially more after LS1)

  • Workflow:

    • ‘Execution’: The test sequence is executed on the Hardware Commissioning Sequencer.

    • ‘Analysis’: The test data (measured signals) is analysed (manually or partly automatically).

    • ‘Signing’: Experts have to ‘sign’ the test result (if manually analysed).
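As a purely illustrative aside (not the framework's actual classes; all names here are hypothetical), the three-step lifecycle above can be modelled in Java roughly like this:

```java
/** Hypothetical sketch of the per-test workflow: execution, then analysis, then (possibly) signing. */
public enum TestStepType {
    EXECUTION,  // the test sequence runs on the HWC sequencer
    ANALYSIS,   // measured signals are analysed, manually or partly automatically
    SIGNING;    // experts sign the result if it was analysed manually

    /** Returns the next step, or null once the test is fully signed. */
    public TestStepType next() {
        switch (this) {
            case EXECUTION: return ANALYSIS;
            case ANALYSIS:  return SIGNING;
            default:        return null;
        }
    }
}
```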


Motivation I

  • Initial Project Description:

  • “Create a replacement for the p2n web page, aka Alvaro’s pages”

  • PHP, grown over time → hard to maintain


Motivation II

  • New Requirements:

  • Some tests should be allowed to be executed even if the previous analysis was not completed.

  • Avoid starting a test if the system is not ready (e.g. locked).

  • Avoid starting tests on circuits on the same QPS controller.

  • Force tests on QF and QD to start at the same time.

  • ….. and some more …


Motivation III

  • The HWC Legacy System:

  • Many interdependent Systems:

    • Sequencer

    • DB (Central Point)

    • web page

    • Daemon

    • Fesa Class

    • LabView

  • Decision: design a new system that orchestrates the whole process (and can replace several of these parts).



Overview

[Architecture diagram with example components: HWC Sequencer, LHC Sequencer or custom components; magnet circuits, BIC/PIC …; LabView, Powering Server …]


Acctesting Server

  • Orchestrates the whole process:

    • Test Execution

    • Test Analysis

  • Exclusively reads/persists data in the database.

  • Notifies all the GUIs about changes.

  • Robust Design:

    • Continuously persists relevant data to be able to recover in case of a crash.

    • Gracefully handles unexpected behaviour of execution and analysis components.
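A minimal sketch of the persist-then-notify idea described above, assuming hypothetical TestStateBroker / TestStateListener types (none of these names come from the real server):

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

/** Hypothetical sketch: the server is the only writer to the DB and pushes every change to the GUIs. */
public final class TestStateBroker {

    /** Listener a GUI would register in order to be told about state changes. */
    public interface TestStateListener {
        void onTestStateChanged(String testId, String newState);
    }

    private final List<TestStateListener> guis = new CopyOnWriteArrayList<>();

    public void register(TestStateListener gui) {
        guis.add(gui);
    }

    /** Persist first (so a crash can be recovered from the DB), then notify all registered GUIs. */
    public void updateState(String testId, String newState) {
        persistToDatabase(testId, newState);
        for (TestStateListener gui : guis) {
            gui.onTestStateChanged(testId, newState);
        }
    }

    private void persistToDatabase(String testId, String newState) {
        // Placeholder for the real persistence layer; in this sketch we only log.
        System.out.printf("persisted %s -> %s%n", testId, newState);
    }
}
```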


Workflow I

  • 1. User “expresses his wish” to execute one (or many) tests.


Workflow II

  • 2. The Tests go to the Execution Basket (Server!)

  • 3. The Scheduler (on the Server) will decide when to start which test(s).
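To make the basket/scheduler split concrete, here is a rough, hypothetical sketch of the server-side loop (not the real scheduler; the Predicate stands in for the constraint and precondition checks described later):

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Predicate;

/** Hypothetical sketch of the execution basket and scheduling loop on the server. */
public final class ExecutionBasket {

    private final Queue<String> requestedTests = new ArrayDeque<>();
    private final Predicate<String> mayStartNow;   // stands in for constraint/precondition checks

    public ExecutionBasket(Predicate<String> mayStartNow) {
        this.mayStartNow = mayStartNow;
    }

    /** Step 2: a GUI request simply drops the test into the basket. */
    public void request(String testId) {
        requestedTests.add(testId);
    }

    /** Step 3: called periodically by the scheduler; starts every test whose conditions are met. */
    public void scheduleRound() {
        for (int i = requestedTests.size(); i > 0; i--) {
            String testId = requestedTests.poll();
            if (mayStartNow.test(testId)) {
                start(testId);
            } else {
                requestedTests.add(testId); // keeps waiting; it will start automatically later
            }
        }
    }

    private void start(String testId) {
        System.out.println("starting " + testId);
    }
}
```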


Why so complicated?

  • Central Scheduling can respect all the conditions, even if requests come from different GUIs.

  • When conditions are fulfilled later, the tests are started automatically (No delays).

  • No need to reserve circuits anymore → more dynamic.


Test Phases

A TestPhase contains one or more tests.

Tests within a test phase can be executed in arbitrary order.
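A possible (purely illustrative) way to model this; the real framework's data model is not shown here:

```java
import java.util.LinkedHashSet;
import java.util.Set;

/** Illustrative only: a test phase groups tests that may run in arbitrary order. */
public final class TestPhase {

    private final String name;
    private final Set<String> tests = new LinkedHashSet<>();

    public TestPhase(String name) {
        this.name = name;
    }

    public void addTest(String testName) {
        tests.add(testName);
    }

    /** No ordering is imposed inside a phase; the scheduler may pick any of these tests next. */
    public Set<String> tests() {
        return Set.copyOf(tests);
    }

    public String name() {
        return name;
    }
}
```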


Preconditions

  • Before: only an email exchange stating, e.g., that “Cryo is ready for a certain group of circuits”.

    • Hard to track (History)

    • No real constraint on test execution (→ many wrongly started tests in previous years)

  • Now:

    • A ‘SignOnlyTest’ for each precondition.

    • Automated tests.
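As a hedged sketch of how a precondition can gate execution once it is modelled as a test of its own (all names here are hypothetical):

```java
import java.util.Map;
import java.util.Set;

/** Illustrative sketch: a precondition modelled as a 'SignOnlyTest' blocks dependent tests until signed. */
public final class PreconditionGate {

    /** Maps each test to the precondition tests that must be signed before it may start. */
    private final Map<String, Set<String>> preconditionsByTest;
    private final Set<String> signedTests;

    public PreconditionGate(Map<String, Set<String>> preconditionsByTest, Set<String> signedTests) {
        this.preconditionsByTest = preconditionsByTest;
        this.signedTests = signedTests;
    }

    /** Unlike the old email-based workflow, this check is enforced and leaves a trace in the DB. */
    public boolean preconditionsFulfilled(String testId) {
        return signedTests.containsAll(preconditionsByTest.getOrDefault(testId, Set.of()));
    }
}
```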


Designed for Extension

  • Extension Points:

  • Server:

    • TestStepHandler (handles certain types of SystemTests)

    • Constraints (Restrict Test Execution)

    • LockProvider (PIC, Db, …)

    • SystemInformationProvider (e.g. Issues)

  • GUI:

    • TestResultsViewer (E.g. Powering Server)
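Purely as an illustration of the plug-in idea for the two server-side extension points that are not detailed on the following slides, one might imagine interfaces of roughly this shape (hypothetical signatures, not the framework's real API):

```java
/** Hypothetical shape of the lock extension point (not the real API). */
public interface LockProvider {
    /** True if the given system (e.g. a circuit locked via PIC or in the DB) is currently locked. */
    boolean isLocked(String systemName);
}

/** Hypothetical shape of the information extension point; each interface would live in its own file. */
interface SystemInformationProvider {
    /** Free-text information about a system (e.g. known issues) to be shown to the user. */
    String informationFor(String systemName);
}
```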


Test Step Handlers

  • Responsible for executing a specific TestStep (execution, analysis) for a certain type of test (a rough sketch follows below).

  • Is itself responsible for communicating with other systems, if required.

  • Examples:

    • HwcTestExecutionHandler: Communicates with HWC Sequencer to execute the tests

    • DaemoneAnalysisHandler: Communicates with the LabView system to retrieve analysis results

    • … Future: PicBicTestExecutionHandler?
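A hedged sketch of what such a handler might look like; the interface shape and method names are invented for illustration (only the handler class names in the list above come from the slide):

```java
/** Hypothetical handler contract: one implementation per (test step, test type) combination. */
public interface TestStepHandler {

    enum StepKind { EXECUTION, ANALYSIS }

    enum StepResult { SUCCESS, FAILURE, TIMED_OUT }

    /** Which kind of step this handler is responsible for. */
    StepKind handledStep();

    /** Performs the step for one test, talking to the external system, and reports the outcome. */
    StepResult handle(String testName);
}

/** Sketch of an execution handler that would delegate to the HWC sequencer (placeholder only). */
final class SketchedHwcExecutionHandler implements TestStepHandler {

    @Override
    public StepKind handledStep() {
        return StepKind.EXECUTION;
    }

    @Override
    public StepResult handle(String testName) {
        // The real HwcTestExecutionHandler communicates with the HWC sequencer here;
        // this sketch only marks where that communication would happen.
        return StepResult.SUCCESS;
    }
}
```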


Constraints

  • A simple extension point that has to decide whether one test is allowed to run together with another one (a simple yes/no decision; see the sketch below).

  • Used to formulate requirements like:“Only start one test on one of the four circuits on the same QPS controller.”

  • Checked by the Scheduler, to decide if a certain test-configuration is allowed or not.

     → See Michael’s presentation.
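As a hedged illustration of the yes/no contract described above (a hypothetical interface; only the QPS-controller rule itself is taken from the slide):

```java
/** Hypothetical constraint contract: a pure yes/no decision about running two tests together. */
public interface Constraint {

    /** Minimal test description used by this sketch; the real framework has richer test objects. */
    record TestInfo(String circuitName, String qpsControllerId) { }

    /** May the candidate test run at the same time as the other (already selected) test? */
    boolean allowedTogether(TestInfo candidate, TestInfo other);
}

/** Example rule from the slide: only one test at a time on the four circuits of one QPS controller. */
final class SameQpsControllerConstraint implements Constraint {

    @Override
    public boolean allowedTogether(TestInfo candidate, TestInfo other) {
        // Two tests may run together only if their circuits sit on different QPS controllers.
        return !candidate.qpsControllerId().equals(other.qpsControllerId());
    }
}
```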


Soft Migration

  • A working system had to be in place for the 2012 start-up, but:

    • Old System should still work.

    • The switch between the two Systems should be easy.

  • Achieved by:

    • Using the old DB schema for the new system (some restrictions!) + some (careful) extensions

    • → The web page stayed operational.



Some Statistics

[Chart: 4 big campaigns; min. 5 tests launched.]


Test Statistics

[Charts: percentage of failed tests that timed out; average number of tests per shift.]

Test failure rate did not decrease much!?

Efficiency increased (→ preconditions!).


Time between Tests?

  • Expectation: the average time between test executions on one circuit should decrease!? Not evident!?

[Chart: time between tests [h] (data: X-mas stop 11/12).]

In the shadow of Analysis Time!


Analysis Time

Analysis of PIC2 and PNO.a1 is automated (if the test is successful).


HWC Campaign 2012

  • System was operational from the beginning. No big problems. Never stopped progress.

  • GUI was very well accepted (lots of positive feedback).

  • Preconditions were useful, at least for tracking. Users’ point of view?

  • Constraints turned out to be very useful although sometimes puzzling. Prevented many mistakes. Many additional constraints were added during the campaign.

  • After some initial doubts, the automated scheduling was found to be very convenient.


To Improve

  • Missing: editing of test plans! (A lot of DB hacks were necessary.)

  • Constraints should be more dynamic (currently Java classes) → dynamic loading of plugins?

  • Old page still used for statistics and quick lookups → missing GUI features.


Next Steps

  • Database Migration (Old web page will die)

  • Improvements of GUI statistic features (Summer Student)

  • Add test-plan editor to GUI.

  • Improve Scheduling Algorithm (→ Michael)

  • Integrate other systems:

    • Beam Commissioning as SignOnly tests → migrate smoothly to partly automated tests?

    • BIC/PIC tests?

    • Others?

  • More Automated Analysis (Powering Server)


Conclusion

  • System worked very well in the 2012 campaign.

  • Some improvements still needed (test-plan editing, more dynamic constraints, new GUI features).

  • A lot to improve in the analysis part!

  • Next Steps important:

    • Db migration.

    • Improvement of scheduling.

    • Improvement of automated analysis.

    • Integrate more systems.