
Software Tools for MPS



  1. Software Tools for MPS Kajetan Fuchsberger MPP Workshop, 2013, Annecy Many thanks for input from M. Zerlauth, J. Wenninger, R. Schmidt, G. Kruk, V. Baggiolini, G. Papotti, D. Jacquet and the TE-MPE-MS Software Team

  2. Software Tools for MPS The solution for everything ;-)

  3. Content

  4. Content

  5. Tests and Procedures • Currently: SharePoint sites • Disadvantages: • Order not enforced (scheduling done ‘on the fly’). • Nothing enforces that the tests are actually done. • No real overview of what was done and what still has to be done.

  6. The AccTesting Framework • Designed based on the experience of Hardware Commissioning. • Successfully used in the HWC campaigns of 2012 and 2013. • Enforces the correct order of tests. • Many additional features: • Automatic scheduling • Constraints • Statistics
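As a rough illustration of how enforcing test order might look, here is a minimal Java sketch; all class and method names are invented for illustration and are not the real AccTesting API:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch: a test may only be started (and signed off)
// once all tests it depends on have been executed.
public class TestSchedule {
    private final Map<String, List<String>> prerequisites = new HashMap<>();
    private final Set<String> done = new HashSet<>();

    public void addTest(String name, String... dependsOn) {
        prerequisites.put(name, Arrays.asList(dependsOn));
    }

    // A test can start only when all its prerequisites are done.
    public boolean canStart(String name) {
        return done.containsAll(prerequisites.getOrDefault(name, Collections.emptyList()));
    }

    public void markDone(String name) {
        if (!canStart(name)) {
            throw new IllegalStateException("Test order violated for " + name);
        }
        done.add(name);
    }
}
```

An automatic scheduler could then simply pick, at each step, any test for which `canStart` is true.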

  7. AccTesting Concept • Tests • Test-Phases • Migration of the MPS commissioning procedures: • 1) Transform all MPS commissioning steps to AccTesting (simple sign-only tests). • 2) Replace them one by one with automated tests.

  8. Barriers • When the commissioning of a system reaches a Barrier Point, it has to wait until all other systems (which share the same barrier) also reach it. • Examples: Ready for Injection, Ready for Powering. • Knowledge of system dependencies required!
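The barrier semantics described above can be sketched in a few lines of Java; the names below are illustrative only, not any existing AccTesting type:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch of a barrier point: a system may only proceed
// past the barrier once every system sharing it has reached it.
public class BarrierPoint {
    private final Set<String> participants = new HashSet<>();
    private final Set<String> arrived = new HashSet<>();

    public BarrierPoint(Set<String> systems) {
        participants.addAll(systems);
    }

    // Called when a system's commissioning reaches the barrier.
    public void reach(String system) {
        if (!participants.contains(system)) {
            throw new IllegalArgumentException("Unknown system: " + system);
        }
        arrived.add(system);
    }

    // Commissioning may continue only when all participants have arrived.
    public boolean mayProceed() {
        return arrived.containsAll(participants);
    }
}
```

Note that building the participant set per barrier is exactly where the system-dependency knowledge mentioned on the slide is needed.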

  9. Interlocks Based on Test Plan? • Examples: • Prevent a circuit from being powered before all its tests are done. • Do not allow injection of beam before all necessary MPS tests are done. • Possibly: enforce tests after changes: • E.g. which tests have to be executed after a change of a QPS card? • Pre-defined test plans or (partially) auto-generated? •  Knowledge about system dependencies required! • What to interlock? • E.g. an additional circuit lock on PIC level which allows tests? • Injection interlock?
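A minimal Java sketch of such a test-plan-driven permit, including invalidation of tests after a hardware change (all names here are hypothetical, not an existing interlock API):

```java
import java.util.Collection;
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch: an interlock that only gives a permit
// (e.g. for powering or injection) once every required test
// in the plan has been signed off.
public class TestPlanInterlock {
    private final Set<String> required = new HashSet<>();
    private final Set<String> passed = new HashSet<>();

    public TestPlanInterlock(Collection<String> requiredTests) {
        required.addAll(requiredTests);
    }

    public void signOff(String test) {
        passed.add(test);
    }

    // Called after a hardware change (e.g. a QPS card swap) to force re-testing.
    public void invalidate(String test) {
        passed.remove(test);
    }

    public boolean permitGiven() {
        return passed.containsAll(required);
    }
}
```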

  10. Automation? • Automate whatever can be automated! (Avoid errors, ensure reproducibility.) • First steps: concentrate on systems with many interlocks of the same type: • BIC/PIC: already a dedicated tool, in the pipeline to be integrated with AccTesting • Collimators • Integrate the BIS connection test? • Do we really have to look at loss maps ‘by eye’? • Could there be a hierarchy check without a reference? • Vacuum • BLMs: • E.g. the latency test (closing collimators): done in Pt. 3 and Pt. 7; could be done for more points if automated. • Others?

  11. System Dependencies & Information • Key concept: use existing sources! • Prototype in place. • Possible cooperation between the TE-MPE-MS software team and BE-CO-DO (S. Jensen, D. Csikos). • System providers: LSA, Layout DB, … • Dependency providers: Layout DB, runtime info (JMS), Java code analysis, … • Information providers: status, issues (Jira), deployments, faults?, … • All feed a System Information Server, which is queried by clients (e.g. AccTesting, Diamon, …).
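The provider-based design above could look roughly as follows in Java; the interface and class names are made up for this sketch and are not the prototype's actual API:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Hypothetical provider interface: each source (Layout DB, JMS runtime
// info, Java code analysis, ...) plugs in its own implementation.
interface DependencyProvider {
    List<String> dependenciesOf(String system);
}

// Hypothetical facade: clients (AccTesting, Diamon, ...) query one
// server, which aggregates the answers of all registered providers.
public class SystemInformationServer {
    private final List<DependencyProvider> providers = new ArrayList<>();

    public void register(DependencyProvider provider) {
        providers.add(provider);
    }

    // Union of the dependencies reported by all registered providers.
    public Set<String> dependenciesOf(String system) {
        Set<String> result = new HashSet<>();
        for (DependencyProvider p : providers) {
            result.addAll(p.dependenciesOf(system));
        }
        return result;
    }
}
```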

  12. Test Analysis • Vision: • Simple description of test expectations (assertions) on signals resulting from tests (Java). • Universal GUI components to visualize problems. • First implementation: • Cooperation between the TE-MPE-MS software team and BE-CO-DA (R. Gorbonosov, A. Jalal). • Used in AccTesting. • Could be used e.g. in future PM modules (maybe also in sequencer tasks and other checks?).
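A minimal sketch of such a test expectation in Java, since the slide names Java as the description language; the class and its methods are invented for illustration, not the actual framework API:

```java
// Hypothetical sketch of a test expectation: an assertion on a signal
// recorded during a test, e.g. "the measured current stays below a limit".
public class SignalAssertion {
    private final String signalName;
    private final double upperLimit;

    public SignalAssertion(String signalName, double upperLimit) {
        this.signalName = signalName;
        this.upperLimit = upperLimit;
    }

    // Evaluates the assertion against the recorded samples of the signal.
    public boolean holdsFor(double[] samples) {
        for (double value : samples) {
            if (value > upperLimit) {
                return false;
            }
        }
        return true;
    }

    // Human-readable form, e.g. for a GUI component visualizing problems.
    public String describe() {
        return signalName + " <= " + upperLimit;
    }
}
```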

  13. Data Analysis • TE-MPE-MS vision: a more general data analysis framework. • Key concept: perform analysis as close as possible to the data! • (Potential) collaboration with BE-CO-DA to optimize resources and avoid duplication of efforts. • First implementation in place. • Sources: Logging DB, Post Mortem, others? • Analysis framework (Hadoop cluster?) • Clients: e.g. AccTesting, Timber, custom analysis, …

  14. Content

  15. Early Detection of Failures • [Diagram: an application GUI (e.g. YASP) sends a trim to the LSA server, which drives the power converters; the PcInterlock server subscribes to and monitors the PCs and, on a not-allowed setting, signals the SIS to dump.]

  16. LSA: The Natural Place to Check • Applications: • YASP • Lumi-scan application • Others… • Checks: • Power converter currents • Collimator settings • BPM readings (actual + predictions from online modelling?) • Others?

  17. LSA Checks - Proposal • Vision: • Have an API like ‘isAllowed(TrimRequest request)’. • First implementation: • Use the ‘TrimPostProcessor’ mechanism of LSA. • Executed after the trim is saved in the DB; on error, LSA will do a rollback. • Incorporation = trim  the same mechanism avoids propagating trims which would otherwise fail, e.g. during the ramp. • Override for MDs.
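The proposed check could be sketched as below. This is only an illustration of the idea: `TrimRequest` and the post-processing hook are simplified stand-ins, not the real LSA `TrimPostProcessor` API; only the `isAllowed(...)` shape is taken from the slide.

```java
// Hypothetical sketch of a trim check in the spirit of the proposal:
// the processor runs after the trim is saved, and throwing from it
// stands in for LSA rolling the trim back on error.
public class TrimLimitCheck {

    // Simplified stand-in for a trim request (not the real LSA type).
    public static class TrimRequest {
        final String parameter;
        final double newValue;

        public TrimRequest(String parameter, double newValue) {
            this.parameter = parameter;
            this.newValue = newValue;
        }
    }

    private final double limit;
    private final boolean mdOverride; // override for MDs

    public TrimLimitCheck(double limit, boolean mdOverride) {
        this.limit = limit;
        this.mdOverride = mdOverride;
    }

    // The API shape proposed on the slide.
    public boolean isAllowed(TrimRequest request) {
        return mdOverride || Math.abs(request.newValue) <= limit;
    }

    // Post-processing hook: throwing here models triggering the rollback
    // of the already-saved trim.
    public void postProcess(TrimRequest request) {
        if (!isAllowed(request)) {
            throw new IllegalStateException("Trim rejected: " + request.parameter);
        }
    }
}
```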

  18. Further Securing LSA • LSA method protection: • Currently only basic protection (sending of timing events, Trim, Drive). • Every method in LSA should be reviewed as to whether it can do harm, and protected appropriately. • Protect reference cycles for PcInterlock/SIS: • First approach: a TrimPostProcessor which evaluates RBAC roles.

  19. Content

  20. Aperture Meter

  21. Aperture Meter - Status • Prototype operational. • Proved to be very useful during MDs. • Follows the operational cycle (optics, BP, time in BP). • Listens to (some) LSA trims. • What is missing? • Improvements of the user interface. • Performance improvements. • Tighter integration with operational changes (collimator offsets after alignment, BPM usage from YASP). • Could be used as another source for LSA trim verification (e.g. large collimator movements, collimator hierarchy).

  22. Summary • MPS commissioning: • Tracking in the AccTesting framework. • Automate! (Step by step.) Candidates? • Preview: two new frameworks: • Tracking of system dependencies • (Test-) data analysis • Early detection of potential problems could be achieved by additional trim checks in LSA. • Aperture Meter: useful prototype, but needs improvements.

  23. Tools, Tools, Tools, Tools, Tools, …? • …Tools do not solve everything! Culture & communication! • (High-quality) software is not for free! • Challenges: • A large part of the software manpower already goes into maintenance. • A lot of ‘grown’ projects, partly written by inexperienced programmers (e.g. students). • How to improve? • Reliability = maintainability = quality (self-explaining code + automated testing, testing, tes…). • Supervision of any SW project from two sides: system expert + SW expert (‘mentoring’). • Who has the overview and can judge what is already out there and into which framework a certain requirement would fit best? Communication! • Who is authorized to define priorities? Resources?

  24. Thank you for your Attention! Small Steps - Big Visions! Questions?

  25. Fail Fast! • Actual BP: • Would a certain trim exceed some limits? • Check before the functional BP: • Collimator-limits prediction • Corrector interlocks vs. LSA settings •  Check at incorporation time.

  26. Single Source for “Operational State” • For certain applications, Beam Mode is not enough! • Needed: a combination of Beam Mode, Beam Process, time within the BP, (optics). • Currently (at least) 2 implementations: • Combination of beam modes and PC state changes (Aperture Meter) • Dedicated timing events (PcInterlock) •  Both not optimal. A reference implementation is required.
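A reference implementation would essentially standardize one value type for this combination. A minimal Java sketch (field names and types are illustrative assumptions, not an existing class):

```java
import java.util.Objects;

// Hypothetical sketch of a single "operational state" value combining
// the quantities listed on the slide. A proper value type with equality
// lets all clients agree on, cache and compare the same state.
public class OperationalState {
    public final String beamMode;          // e.g. "STABLE BEAMS"
    public final String beamProcess;       // e.g. "RAMP"
    public final long millisInBeamProcess; // time within the BP

    public OperationalState(String beamMode, String beamProcess, long millisInBeamProcess) {
        this.beamMode = beamMode;
        this.beamProcess = beamProcess;
        this.millisInBeamProcess = millisInBeamProcess;
    }

    @Override
    public boolean equals(Object other) {
        if (!(other instanceof OperationalState)) {
            return false;
        }
        OperationalState s = (OperationalState) other;
        return millisInBeamProcess == s.millisInBeamProcess
                && beamMode.equals(s.beamMode)
                && beamProcess.equals(s.beamProcess);
    }

    @Override
    public int hashCode() {
        return Objects.hash(beamMode, beamProcess, millisInBeamProcess);
    }
}
```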

  27. Fail Fast! • “A fail-fast system is designed to immediately report […] any failure or condition that is likely to lead to failure.” [Wikipedia]

  28. Data Analysis • Mandate in BE-CO-DA (C. Roderick) to provide data analysis solutions. • Currently in the use-case analysis phase. • Aim: make it as easy as possible for users to specify: • What to produce for an analysis case (e.g. signal names, units, description) • What the inputs are (signals and sources) • Which algorithms should be applied • When the analysis should be performed (event-driven, e.g. per fill / beam mode; or time-interval-driven, e.g. every 15 minutes) • Without the user needing to: • Configure a specific development environment • Care about compilation and deployment • Look elsewhere for signal names etc., i.e. provide a directory of signals and sources • Key concept: perform data analysis as close as possible to the data!

  29. MPS Commissioning • What is missing? • Dependencies between systems (e.g. ELQA for a circuit is ready when the tests are done for all magnets within the circuit). • Global phases? (Injection, Powering, IST)

  30. System Dependencies • Vision: a central instance (server?) to answer questions like: • Which magnets are connected to a certain power converter? (AccTesting – ELQA) • What is the common point for a set of crates? (Diamon) • Which systems potentially have to be re-tested if e.g. a QPS card is changed? (AccTesting) • Which systems are potentially affected by an OS upgrade on a certain host? (SUWG)
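All of the questions above reduce to reachability queries over a dependency graph of accelerator components. A minimal Java sketch of such a service (class, method, and example component names are invented for illustration):

```java
import java.util.ArrayDeque;
import java.util.Collections;
import java.util.Deque;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch of the central dependency service: a directed
// graph over components, queried for everything reachable from one node
// (e.g. all magnets powered by a given power converter).
public class DependencyGraph {
    private final Map<String, Set<String>> edges = new HashMap<>();

    public void connect(String from, String to) {
        edges.computeIfAbsent(from, k -> new HashSet<>()).add(to);
    }

    // All components transitively reachable from the given one.
    public Set<String> reachableFrom(String component) {
        Set<String> seen = new HashSet<>();
        Deque<String> todo = new ArrayDeque<>();
        todo.push(component);
        while (!todo.isEmpty()) {
            for (String next : edges.getOrDefault(todo.pop(), Collections.emptySet())) {
                if (seen.add(next)) {
                    todo.push(next);
                }
            }
        }
        return seen;
    }
}
```

With reversed edges, the same query answers "which systems are affected if this component changes", i.e. the re-test question.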

  31. Interlocks During Functional BPs • In place for orbit correctors. Ready to be extended to other circuits. • Potentially useful for other systems, e.g. collimator movements?
