
Controls & Monitoring Status Update



  1. Controls & Monitoring Status Update – J. Leaver, 05/11/2009

  2. Infrastructure

  3. Infrastructure Issues
  • Control PCs & servers
  • Application management
    • Client-side application launcher
    • Soft IOC run control
  • Configuration Database (CDB)
    • EPICS Interface
    • Ensuring the validity of recorded run parameters
  • Protocols for updating & maintaining software on control PCs (see PH’s talk)
  • Alarm Handler (see PH’s talk)
  • Channel Archiver (see PH’s talk)
  • Remote Access (see PH’s talk)

  4. Control PCs & Servers: Current Status
  [Network diagram showing the control PCs & servers: miceiocpc1, target1ctl, miceioc1, miceioc2, miceioc4, miceioc5, miceecserv, miceserv1, target2, miceopipc1, micecon1]

  5. Control PCs & Servers: Current Status
  • General purpose server PC (miceiocpc1)
  • Currently runs EPICS servers for:
    • FNAL BPMs
    • DATE Status
    • Config. DB User Entry Data
    • CKOV Temp. & Humidity Monitor
    • DSA Neutron Monitor
  • Will also run EPICS servers for:
    • Network Status
    • CKOV + TOF CAEN HV Supplies

  6. Control PCs & Servers: Current Status
  • Target DAQ & Control PC (target1ctl)
  • Currently runs:
    • Target / Beam Loss Monitor DAQ
  • Will run EPICS servers for:
    • Beam Loss Monitor
    • Target Controller

  7. Control PCs & Servers: Current Status
  • Target Drive IOC (vxWorks)
  • EPICS server for Target PSU & extraction motor

  8. Control PCs & Servers: Current Status
  • Beamline Magnets IOC (vxWorks)
  • EPICS server for quadrupoles Q1-9 & dipoles D1-2

  9. Control PCs & Servers: Current Status
  • Decay Solenoid IOC (vxWorks)

  10. Control PCs & Servers: Current Status
  • Linde Refrigerator IOC (PC)

  11. Control PCs & Servers: Current Status
  • ‘EPICS Client’ Server PC (miceecserv)
  • Runs all client-side control & monitoring applications
  • Runs infrastructure services:
    • Alarm Handler
    • Channel Archiver
  • Large wall mount display shows:
    • Alarm Handler panel
    • Log message viewer
  • Display may also be used to show any (non-interactive) panel containing information that must be monitored for the duration of a specific run

  12. Control PCs & Servers: Current Status
  • Gateway / Archiver Web Server PC
  • Runs Channel Access Gateway, providing read-only access to PVs between MICE Network & heplnw17
  • Runs web server enabling read-only access to Channel Archiver data
  • Currently running non-standard OS for control PCs
    • Will reformat after current November/December run period
    • See PH’s talk

  13. Control PCs & Servers: Current Status
  • General purpose Operator Interface PC (miceopipc1)
  • Primary access point for users to interact with control & monitoring panels
  • Essentially a ‘dumb’ X server – runs all applications via SSH from miceecserv

  14. Control PCs & Servers: Current Status
  • Additional general purpose Operator Interface PCs
  • Currently running non-standard OS for control PCs
    • Usable, but not optimally configured
    • Cannot disturb at present – will reformat after current November/December run period
    • See PH’s talk
  • Shall be renamed miceopipc2 & miceopipc3

  15. Application Launcher
  • New application launcher replaces DL Tcl script
  • XML configuration file, easy to add items (see the sketch below)
  • Unlimited subcategory levels
  • Provides real-time display of application status
  • Configurable response to existing application instances
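
  A minimal sketch of what such a configuration file might look like, purely for illustration – the element & attribute names (launcher, category, app, onExisting) are hypothetical, not the actual MICE schema:

      <?xml version="1.0"?>
      <!-- Hypothetical launcher configuration; names are illustrative only -->
      <launcher>
        <category name="Target">
          <category name="Monitoring">
            <!-- 'ignore' policy: any number of monitor instances allowed -->
            <app name="Beam Loss Viewer"
                 command="/opt/mice/bin/beamloss_viewer"
                 onExisting="ignore"/>
          </category>
          <!-- 'inhibit' policy: only one control instance may run -->
          <app name="Target Control Panel"
               command="/opt/mice/bin/target_ctl"
               onExisting="inhibit"/>
        </category>
      </launcher>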

  16. Application Launcher: App Status
  • Application is running
  • Application is running, but was not executed by this launcher
  • Application was previously launched by an external process, but is no longer running (return value unknown)
  • Application quit with an error code & is no longer running
  • Application was killed by a signal & is no longer running
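
  A minimal sketch, in Python, of how the launcher could distinguish these states for processes it started itself (externally launched applications would instead be identified from the system process table, with no return value available):

      from enum import Enum

      class AppStatus(Enum):
          RUNNING    = 'running'
          ERROR_EXIT = 'quit with an error code; no longer running'
          KILLED     = 'killed by a signal; no longer running'
          EXITED_OK  = 'finished cleanly'

      def classify(proc):
          """proc: a subprocess.Popen object started by this launcher."""
          rc = proc.poll()    # None while the child is still running
          if rc is None:
              return AppStatus.RUNNING
          if rc < 0:          # POSIX convention: -N means killed by signal N
              return AppStatus.KILLED
          return AppStatus.ERROR_EXIT if rc else AppStatus.EXITED_OK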

  17. Application Launcher: External App Response
  • Multiple application launchers will be operated simultaneously
    • On miceopipc1-3 (via SSH from miceecserv) & miceecserv itself
  • Need to ensure that shifters using different launchers do not ‘conflict’
  • If operator attempts to execute an application that is already running, launcher has a configurable response (sketched below):
    • Ignore: launch another instance
    • Inhibit: prevent another instance from running
    • Kill: close existing instance & run a new one (e.g. could be used for a ‘master’ override control panel)
  • Typical configuration:
    • Only one instance of each ‘control’ application may run (cannot have multiple users modifying the same parameter!)
    • Unlimited numbers of monitoring panels may be opened
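
  A minimal sketch of the three policies; find_running() (however implemented – e.g. scanning the process table) and the string policy names are assumptions for illustration:

      import subprocess

      def launch(command, policy, find_running):
          existing = find_running(command)   # Popen-like handle, or None
          if existing is not None:
              if policy == 'inhibit':        # prevent a second instance
                  return existing
              if policy == 'kill':           # replace the old instance
                  existing.terminate()
                  existing.wait()
          # policy == 'ignore', or nothing was running: start a new instance
          return subprocess.Popen(command.split())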

  18. Soft IOC Management
  • Application launcher primarily concerned with managing client-side control & monitoring panels running on miceecserv
  • Also need to implement run control for corresponding EPICS servers
    • ‘Hard IOCs’ running on vxWorks (i.e. servers provided by DL) are always ‘on’ → require no routine external intervention
    • ‘Soft IOCs’ running on control PCs (i.e. servers produced within the Collaboration) are executed like any other application → require user control
  • Why can’t soft IOCs just run at system start-up, like any other service?
    • Assumes that servers run perpetually, unattended – not true!
    • Sometimes need to modify configuration files, requiring server restart
    • Servers sometimes crash due to hardware problems, requiring restart
    • May need to turn off or reconfigure hardware – cannot do this while a soft IOC is running
    • Shifters should not have to worry about Linux service management…

  19. Soft IOC Management
  • CmdExServer provides similar functionality to normal application launcher, but with an EPICS interface
  • Each IOC PC runs an instance of the CmdExServer at start-up
  • CmdExServer manages local soft IOCs
  • Client-side ‘remote application launcher’ communicates with all CmdExServers & allows user to start/stop IOCs (see the sketch below)
  [Diagram]
    • miceiocpc1: CmdExServer + FNAL BPM Server, DATE Status Server, Config. DB User Entry Data Server, Network Status Server, etc.
    • micetk1pc: CmdExServer + AFEIIt Server
    • target1ctl: CmdExServer + Beam Loss Monitor Server, Target Controller Server
  • NB: current remote launcher configuration only includes a subset of the servers assigned to miceiocpc1 – others will be added as they become available
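
  A minimal sketch of a CmdExServer-like service, assuming the pcaspy server library; the PV names, record layout & start/stop semantics are illustrative guesses rather than the actual MICE implementation:

      import subprocess
      from pcaspy import SimpleServer, Driver

      PREFIX = 'MICE:IOCPC1:'                # hypothetical PV prefix
      PVDB = {'BPM:RUN': {'type': 'int'}}    # one RUN PV per local soft IOC
      COMMANDS = {'BPM:RUN': ['/opt/mice/bin/fnal_bpm_server']}

      class CmdExDriver(Driver):
          def __init__(self):
              super(CmdExDriver, self).__init__()
              self.procs = {}

          def write(self, reason, value):
              if value:                      # 1 -> start the soft IOC
                  self.procs[reason] = subprocess.Popen(COMMANDS[reason])
              elif reason in self.procs:     # 0 -> stop it
                  self.procs[reason].terminate()
              self.setParam(reason, value)
              return True

      if __name__ == '__main__':
          server = SimpleServer()
          server.createPV(PREFIX, PVDB)
          driver = CmdExDriver()
          while True:
              server.process(0.1)            # service Channel Access requests

  Under this layout, the remote application launcher needs nothing more than a caput to MICE:IOCPC1:BPM:RUN to start or stop that server.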

  20. Configuration Database: EPICS Interface
  • Custom EPICS PV backup & restore client is functionally complete (sketched below)
  • Enables manual backup & restore of set point values
  • Automatically backs up set parameters when DATE signals end of run
  • Currently stores values in local XML file archive
  • Automatic backup files transferred to CDB via SSH/SCP to heplnw17
    • Temporary solution → will be replaced with direct SOAP XML transactions once RAL networking issues resolved
    • Need publicly accessible web server on heplnw17
  • Restoration of parameters from CDB will also be implemented once SOAP transfers are possible
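
  A minimal sketch of the backup path, assuming the pyepics client library; the set point PV names & XML layout are hypothetical:

      from xml.etree import ElementTree as ET
      from epics import caget

      SET_POINTS = ['MICE:Q1:CURRENT:SET',   # illustrative PV names
                    'MICE:D1:CURRENT:SET']

      def backup(filename):
          """Read every set point & archive the values as XML."""
          root = ET.Element('setpoints')
          for pv in SET_POINTS:
              ET.SubElement(root, 'pv', name=pv, value=str(caget(pv)))
          ET.ElementTree(root).write(filename)

      # e.g. called automatically when DATE signals end of run:
      # backup('/data/cdb/run_1234_setpoints.xml')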

  21. Configuration Database: User Entry
  • Not all parameters required for a CDB ‘run’ entry are available through normal EPICS channels
    • i.e. relevant IOCs & integration with the DAQ are not yet complete
    • Currently only beamline magnet currents can be backed up from ‘live’ servers
  • (Quasi) temporary solution (sketched below):
    • Generic EPICS data server hosts PVs for all values missing from existing IOCs, so they can be read by backup/restore client
    • User entry client allows shifter to enter required parameters before initiating a run
  • As future work progresses, unnecessary user entry items will be removed
    • However, shall always require some degree of manual data entry
  [Diagram: User Entry Client → CDB Data Server (miceiocpc1) → Backup / Restore Client]
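
  A minimal sketch of that data flow, assuming pyepics; the PV name is hypothetical:

      from epics import caput, caget

      # User entry client: shifter submits a parameter before the run...
      caput('MICE:CDB:TARGET_DEPTH', 45.2)

      # ...backup/restore client later reads it like any other PV
      depth = caget('MICE:CDB:TARGET_DEPTH')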

  22. Ensuring the Validity of CDB Entries
  • Vital that set point values remain fixed during each standard run
    • If set point value record in CDB does not represent physical state of system for entire run, data are invalid
  • Implement following protocol to ensure invalid runs are correctly identified (sketched below):
    • CDB data server hosts run status PV
    • User entry client automatically sets run status to true when user submits current run parameters
    • At this stage, users should not modify set point values again until run is complete
    • Dedicated monitor IOC checks all set point PVs while DATE is in ‘data taking’ state → sets run status to false if any value changes (to do)
    • Alarm Handler monitors run status → immediately warns that run is invalid if any user modifies a set point value (to do)
    • run status incorporated in CDB run parameters record
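
  A minimal sketch of the planned monitor, assuming pyepics; all PV names & the DATE state string are hypothetical:

      import time
      from epics import PV

      SET_POINTS = ['MICE:Q1:CURRENT:SET',   # illustrative PV names
                    'MICE:D1:CURRENT:SET']
      run_status = PV('MICE:CDB:RUN_STATUS')
      date_state = PV('MICE:DATE:STATE')

      def on_change(pvname=None, value=None, **kw):
          # pyepics invokes callbacks with keyword arguments
          if date_state.get(as_string=True) == 'data taking':
              run_status.put(0)              # mark the run as invalid
              print('%s changed to %s during run' % (pvname, value))

      pvs = [PV(name, callback=on_change) for name in SET_POINTS]
      while True:
          time.sleep(1)                      # keep the monitor alive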

  23. Control & Monitoring Systems

  24. C&M Systems Overview

  25. C&M Systems Developed by Local MICE Community

  26. Target: Controller
  • Target Controller Stage 1 upgrade underway
  • Hardware essentially complete
    • Currently working on Controller firmware (P. Smith)
  • Software nearing completion
    • Hardware driver framework in place
    • Have implemented all EPICS server / client functionality
    • Tested using ‘virtual’ Target Controller device
    • Remaining task: write low-level hardware driver plug-in once firmware is complete (see the sketch below)
  • Stage 1 upgrade – control functionality includes:
    • Set delays
    • Set / monitor Target depth
    • Park / hold, start / stop actuation
    • Monitor hardware status
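
  A minimal sketch of what such a driver plug-in interface might look like; the class & method names are hypothetical, chosen to mirror the Stage 1 control functions listed above (the ‘virtual’ Target Controller & the real hardware driver would both implement it):

      from abc import ABC, abstractmethod

      class TargetDriver(ABC):
          """Interface the EPICS server calls into."""

          @abstractmethod
          def set_delays(self, delays): ...

          @abstractmethod
          def set_depth(self, depth_mm): ...

          @abstractmethod
          def get_depth(self): ...

          @abstractmethod
          def set_actuation(self, running): ...   # park/hold, start/stop

          @abstractmethod
          def hardware_status(self): ...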

  27. Target: Beam Loss
  • Standalone DAQ system upgrades:
    • Final algorithm selected for ‘absolute’ beam loss calculation (thanks to AD for implementation)
    • Standalone event viewer can now follow real-time output of DAQ
  • Target Beam Loss IOC will read local data archive written by DAQ & serve values as PVs
    • Enables integration with Alarm Handler & readout of actuation numbers for CDB run data entries
  • IOC will use same analysis & file access code as standalone DAQ, via C wrapper functions (illustrated below)
  • See PH’s talk
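
  The IOC itself links the C wrappers directly; purely to illustrate the code-reuse idea, here is how the same shared analysis library could be called from Python via ctypes – the library name & function signature are hypothetical:

      import ctypes

      lib = ctypes.CDLL('libbeamloss.so')    # shared DAQ analysis code
      lib.absolute_beam_loss.restype = ctypes.c_double
      lib.absolute_beam_loss.argtypes = [ctypes.c_char_p]

      loss = lib.absolute_beam_loss(b'/data/target/archive/spill_0001.dat')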

  28. DATE Status
  • Need mechanism for reporting current DAQ state via EPICS
    • Required for user feedback, alarm handling & triggering automatic CDB run set point value backups
  • Simple (‘dumb’) data server hosts DATE status PV
  • Client application reads DATE status from DIM server, forwards value to EPICS server (sketched below)
  • Server & display client complete – DATE integration should be complete before end of Collaboration Meeting (JSG…?)
  [Diagram: DATE Client → EPICS Data Server (single ‘status’ PV)]
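
  A minimal sketch of the forwarding client, assuming pyepics on the EPICS side; read_date_status() stands in for the DIM client query & the PV name is hypothetical:

      import time
      from epics import caput

      def read_date_status():
          """Placeholder for the DIM client call that queries DATE."""
          return 'data taking'

      while True:
          caput('MICE:DATE:STATE', read_date_status())  # forward to EPICS
          time.sleep(5)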

  29. Network Status
  • Need to verify that all machines on Control & DAQ networks are functional throughout MICE operation
  • Two types of machine:
    • Generic PC (Linux, Windows)
    • Hard IOC (vxWorks)
  • EPICS Network Status server contains one status PV for each valid MICE IP address
  • Read status: PC (sketched below)
    • SSH into PC (using key files, for security) – verifies network connectivity & PC identity
    • If successful, check list of currently running processes for required services
  • Read status: hard IOC
    • Check that standard internal status PV is accessible, with valid contents
    • e.g. ‘TIME’ PV, served by all MICE ‘hard’ IOCs
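
  A minimal sketch of both checks, assuming pyepics for the Channel Access side; host names, the service list & the PV naming are illustrative assumptions:

      import subprocess
      from epics import caget

      def pc_ok(host, services=('CmdExServer',)):
          """Key-based SSH checks connectivity & identity, then the
          process table is searched for the required services."""
          try:
              out = subprocess.check_output(
                  ['ssh', '-o', 'BatchMode=yes', host, 'ps', '-e'])
          except subprocess.CalledProcessError:
              return False                   # unreachable or key rejected
          return all(s.encode() in out for s in services)

      def hard_ioc_ok(prefix):
          """Hard IOCs all serve a 'TIME' PV; readable => alive."""
          return caget(prefix + ':TIME', timeout=2.0) is not None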

  30. Network Status
  • Server & client are functionally complete
    • Client displays status of all PCs & hard IOCs, scans at user-specified period (with ‘check now’ override)
  • Currently not configured for use on the MICE network…
    • Necessary to compile list of required services for each MICE PC
    • Network specification not yet finalised
    • Need to confirm how status server will connect to PCs on both Control & DAQ networks

  31. Other ‘Local’ C&M Systems
  • TOF
  • CKOV
  • Diffuser
  • DSA Neutron Monitor
  • KL Calorimeter
  • Electron Muon Ranger
  • NB: some of these are covered in PH’s talk; the rest remain unallocated…

  32. Daresbury C&M Status Update – A. Oates

  33. Recent DL Controls Work
  • New ‘IOC controls server’ installed
    • Provides boot/configuration parameters for DL hard IOCs (vxWorks): miceioc1, miceioc2, miceioc4
    • Enabled Online Group to take ownership of old ‘IOC controls server’ (miceserv1) & associated Channel Archiver duties
  • Provided the Online Group with general EPICS assistance
    • EPICS architecture guidance
    • Channel Archiver web server installation
  • Changed both Target Drive systems to incorporate new split rail power supplies
  • Fault diagnosis and correction on Target 2 Drive controls in R78
  • Prepared quotation for Cooling Channel Control System
  • Completed requested changes to Magnet PSU databases

  34. Schedule & Final Notes

  35. Schedule

  36. Schedule

  37. Final Notes
  • Good progress on all fronts
    • Groundwork required to unify MICE control systems & create a user-friendly interface is well underway
    • Alarm Handler, Channel Archiver operational
    • Unified application launchers complete (bad old days of logging into individual PCs & typing commands are over!)
    • CDB interface functional
  • Lots of work still to do!
    • Controls effort is ‘significantly understaffed’ (DAQ & Controls Review, June ’09)
    • We are always operating at full capacity
    • ‘Controls & Monitoring’ is such a broad subject that many unexpected additional requirements crop up all the time…
    • Please be patient!
