
LHC Control System Tutorial for the FNAL LARP group



  1. LHC Control System Tutorial for the FNAL LARP group Jim Patrick November 9, 2005

  2. General Overview • Taken from a number of talks, mostly: • ICALEPCS Conference, October 2005 (33 CERN presentations!) • December 2004 AB/CO “Controls Day” • Plus other miscellaneous talks • Goals: • Give some practical flavor of the system, but not prepare one to write actual code. • Relate concepts to FNAL system. • I am not at all an expert.

  3. Outline • Overview • Timing • Device Model • Front-End Software • Application Software and Development • Services • Security • Summary

  4. The LHC Control System — B. Frammery, for the CERN AB/CO Group, 10.10.2005

  5. Reorganization (diagram): until 2003, the SL Division and the PS Division shared control of the CERN machines (LEP era); since 2003, a single AB Department is responsible for the CERN machines and the LHC.

  6. Strategy • Develop new software and hardware infrastructures • For LHC • To be used & tested on all the new developments • To be spread over all the CERN accelerators at a later stage • Integrate industrial solutions as much as possible Meaning that, meanwhile, the “legacy” controls for LINAC2, the PSB, the PS and the SPS are to be maintained

  7. LHC controls architecture (diagram): three layers linked by TCP/IP communication services. Top: fixed displays plus central and local operator consoles on the public Ethernet network, with application servers (Linux/HP ProLiant), PVSS/Linux SCADA servers and file servers on the CERN Gigabit Ethernet technical network. Middle: timing generation, VME front ends (RT/LynxOS), PC front ends (Linux/LynxOS), cPCI front ends and PLCs. Bottom: WorldFIP segments (1 and 2.5 Mbit/s over optical fiber), PROFIBUS and FIP/IO fieldbuses reaching the LHC machine equipment — beam position monitors, beam loss monitors, beam interlocks, RF systems, quench protection agents, power converters, function generators, the analogue signal system, and actuators and sensors for cryogenics, vacuum, etc.

  8. Software frameworks

  9. The software frameworks (1) • Front-End Software Architecture (FESA): a complete environment for real-time, model-driven control software implemented in C++ for the LynxOS and Linux platforms • Java framework for accelerator controls • Uses J2EE application servers with lightweight containers • Plain Java objects (no EJB beans) • Applications can run (for test) in a 2-tier setup • Unified Java API for Parameter Control (JAPC) to access any kind of parameter • Runs on the Linux platform

  10. The software frameworks (2) • UNified Industrial Control System (UNICOS) • Complete environment for designing, building and programming industry-based control systems for the LHC • For cryogenics, vacuum, environmental controls etc. • Supervision layer: PVSS II (SCADA from ETM) • Cross communication with the accelerator controls framework possible • UNICOS & the Java framework for accelerator controls use the same graphical symbols and color codes

  11. Machine Timing & Sequencing

  12. Overview I: Central Timing. What’s the CBCM? • The Central Beam and Cycle Manager (CBCM) is a collection of hardware and software systems responsible for coordinating and piloting the timing systems of CERN’s accelerators. • In the LHC era, the CBCM will control the Linac-II, Linac-III, PSB, CPS, ADE, LEIR, SPS and LHC timing systems. • The CBCM will also drive the Beam Synchronous Timing (BST) for the LHC. There will be 3 distributions: R1, R2, Experiments.

  13. Hardware II: LHC MTG (diagram). The main MTG drives the LHC MTG over the GMT, synchronizing CPS and SPS telegrams and timings when filling the LHC. The LHC MTG holds preloaded injector sequences and a preloaded LHC sequence, takes in safe parameters (energy and intensity per ring, safe-beam flag, beam-present flag, extraction-permit flag, BIC beam-permit flag) plus external conditions and events, and drives 3 BST cards over a 2.2 Gbit/s optical link with 64 Mb reflective memories. Clocks: the 40.08 MHz bunch clock, Frev ticks at 89 µs, the 40.00 MHz GPS clock, the 1PPS (1 Hz) clock and the basic-period clock.

  14. LHC Timing cable • The LHC telegram (0x14xxyyyy), which will contain at least the following information: • USER: the cycle ID, with values like PILOT, NOMINAL, DUMP, MD… • PARTY1: the particle type in Ring-1, protons/ions from LEIR • PARTY2: the particle type in Ring-2 • ENERGY1: the beam energy in Ring-1 • ENERGY2: the beam energy in Ring-2 • INTEN1: the beam intensity in Ring-1 • INTEN2: the beam intensity in Ring-2 • RING: the next ring to be injected — Ring-1, Ring-2 or NONE • BUNCH: the next target bunch position in the ring, 0..35640 • BATCH: the actual batch number in the ring, 1..12 • BATCHES: the number of CPS batches • MODE: the machine mode — pre-injection, injection, ramping, physics, dump etc. • BPNM: the basic period number from the start of the cycle • COMLN: timing trigger bit patterns calculated by the CBCM to trigger specific actions • STATUS: machine status bits like OK, ABORT, QUENCH • BEAMID: identifies the next beam in all injectors • The SPS telegram • The CPS telegram • The UTC time each second • The LHC 1 kHz events (0x0100xxxx) • The LHC machine events, CTIM X := 0x13xx0000 (LHC machine = 1, event type = 3, CTIM X = F code, payloads = 0000 for LHC events; these can have no “next” lines) • Some CPS & SPS events such as the SPS extraction kicker warning pre-pulse
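The event frames above can be unpacked with a few shifts. A minimal sketch, assuming the layout suggested by the slide — a machine nibble (1 = LHC), an event-type nibble (3 = machine event), an 8-bit event code and a 16-bit payload; the real GMT frame format may differ:

```java
// Hedged sketch of unpacking a 32-bit GMT event frame, assuming the
// field layout implied by the slide (0xMTccpppp). Not the real CTR/Tg8
// driver API; field positions are an illustration only.
class GmtFrame {
    final int machine;  // top nibble: 1 = LHC
    final int type;     // next nibble: 3 = LHC machine event
    final int code;     // 8-bit event code ("CTIM X = F (code)")
    final int payload;  // 16 bits; 0000 for LHC machine events

    GmtFrame(long frame) {
        machine = (int) ((frame >> 28) & 0xF);
        type    = (int) ((frame >> 24) & 0xF);
        code    = (int) ((frame >> 16) & 0xFF);
        payload = (int) (frame & 0xFFFF);
    }
}
```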

  15. Overview IV: Timing Reception — FESA Tim/Tgm library, CTRx, SPS Tg8, PS Tg8 • The CTRx comes in V (VME), I (PCI) and P (PMC) form factors • Down to 1 ns UTC time stamping if an HPTDC is installed, else 25 ns • 50 MHz external clocks • 1PPS, 1 kHz, 10 MHz and 40 MHz internal clocks • Counters are 24-bit • 2048 actions supporting MP and PPM • Telegram and payload handling • Full counter remote control • Fully integrated into FESA and the alarms monitor

  16. Data Management

  17. Databases: the 4 domains of data (diagram) • Physical Equipment (equipment catalogue, serial numbers) • Machine Layout (installed equipment type, optics, powering) • Controls Configuration (computer addresses) • Operational Data (settings, measurements, alarms, logging, post-mortem) • A consistent naming and identification scheme, as defined in the Quality Assurance Plan, ties the domains together

  18. Device/Property Model • A device is a named entity within the control system, which corresponds to a physical device (Beam Position Monitor, Power Converter) or to a virtual controls entity (e.g. transfer line) • The state of a device is accessed via properties and can be read or modified by the get and set access methods (synchronous and asynchronous) • Uses CORBA, hidden from the user by “Controls Middleware” • A property can be monitored (publish/subscribe) • A cycleSelector or a polling period can be specified • Optional on-change mode: the client is notified only when the property has changed (server criteria) • Uses Java Message Service (JMS) “publish/subscribe” technology • Device classes can implement many properties of simple type or few properties of composite type
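As a rough illustration of the get/set/monitor idea — not the actual Controls Middleware API; every name here is invented — a device with on-change notification might look like:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Toy in-memory sketch of the device/property model: a named device whose
// state is exposed as properties read/written via get/set, with a
// publish/subscribe hook that fires only when a value actually changes
// (the "on-change mode" described above).
class Device {
    private final String name;
    private final Map<String, Object> properties = new HashMap<>();
    private final Map<String, List<Consumer<Object>>> listeners = new HashMap<>();

    Device(String name) { this.name = name; }

    Object get(String property) { return properties.get(property); }

    void set(String property, Object value) {
        Object old = properties.put(property, value);
        // on-change mode: notify subscribers only when the value differs
        if (old == null || !old.equals(value)) {
            for (Consumer<Object> l : listeners.getOrDefault(property, List.of())) {
                l.accept(value);
            }
        }
    }

    // monitor(): register a publish/subscribe listener on one property
    void monitor(String property, Consumer<Object> listener) {
        listeners.computeIfAbsent(property, k -> new ArrayList<>()).add(listener);
    }

    String getName() { return name; }
}
```

In the real system the subscription also takes a cycleSelector or polling period, and the transport is CORBA/JMS rather than an in-process callback.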

  19. Device Model • Devices refer to higher level constructs than in ACNET • Devices have properties; may have more than one • All properties are like C-structures, not confined to READING, SETTING etc. as in ACNET • You name the elements (“parameters”) • These have associated datatype, units, dimension, minimum and maximum value etc. • Can be atomic (single element) or composite (multiple elements) • Scaling assumed to be done by front-end • Device structure defined in “MetaProperty” classes for each general type • Formal hierarchical naming scheme • A “Working Set” device is a collection of devices

  20. Beam Current Transformer acquisition as an example of a composite property (standard entries)

  21. Input Form for metadata

  22. Generic Workset Display Program

  23. Generic Control Knob Component

  24. Initialisation Parameters • Any status or value control parameter for a device can be stored as a REFERENCE. This includes arrays for function generators. • This can be done for any of up to 64 virtual machines which configure our accelerators for a particular operation like injecting protons in the LHC. • Particular sets of values can be stored in named ARCHIVES for a virtual machine. Used to set up the machines for a particular operation. • The Directory Service provides interfaces for storing and retrieving REFERENCES and ARCHIVES.
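The REFERENCE/ARCHIVE scheme can be pictured as a store keyed by virtual machine. A toy sketch — class and method names are illustrative, not the real Directory Service interface:

```java
import java.util.HashMap;
import java.util.Map;

// Toy sketch of the initialisation-parameter idea: per virtual machine,
// each parameter has a current REFERENCE value, and the whole set can be
// snapshotted under a named ARCHIVE for later machine setup.
// Hypothetical API; the real Directory Service differs.
class SettingsStore {
    // virtualMachine -> parameter -> reference value
    private final Map<String, Map<String, Double>> references = new HashMap<>();
    // virtualMachine -> archiveName -> snapshot of references
    private final Map<String, Map<String, Map<String, Double>>> archives = new HashMap<>();

    void storeReference(String vm, String parameter, double value) {
        references.computeIfAbsent(vm, k -> new HashMap<>()).put(parameter, value);
    }

    Double getReference(String vm, String parameter) {
        return references.getOrDefault(vm, Map.of()).get(parameter);
    }

    // Save the current references of one virtual machine as a named archive
    void saveArchive(String vm, String archiveName) {
        archives.computeIfAbsent(vm, k -> new HashMap<>())
                .put(archiveName, new HashMap<>(references.getOrDefault(vm, Map.of())));
    }

    Map<String, Double> loadArchive(String vm, String archiveName) {
        return archives.getOrDefault(vm, Map.of()).get(archiveName);
    }
}
```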

  25. JAPC • “Java API for Parameter Control” • Single API for all Java applications to access devices (physical / virtual) • Based on the concept of parameter (device/property) • Unified and simple access to various systems • Hardware (via Controls Middleware – CMW; Including PVSS devices) • Directory Service (descriptions) • SDDS logged data, Simulation • Virtual Parameters in the middle tier • Provides more services to applications • Metadata, descriptions • Groups, Caching, Transactions Common Application Infrastructure – AB/CO Day - Lionel Mestre

  26. JAPC Code example • Counter device named Counter11 with one property with one parameter named Measurement that is an int:
Parameter p = Factory.createParameter("Counter11", "Measurement");
CycleSelector sel = Factory.createCycleSelector(END_OF_CYCLE);
ParameterValue value = p.get(sel);
int counts = value.getInt();
• Code generation facility to make a pseudo-“wide” API • e.g. p.getMeasurement(); • Compile-time check
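The generated “wide” wrapper idea — one typed accessor per parameter, so a misspelled parameter name becomes a compile error instead of a runtime failure — can be sketched as follows. Names are invented; the real generated code targets the JAPC Parameter API rather than a plain map:

```java
import java.util.Map;

// Hedged sketch of what a generated "wide" device wrapper might look like:
// the code generator emits a typed method per parameter declared in the
// device class description. A Map stands in here for the JAPC parameter
// backend; this is an illustration, not the actual generator output.
class Counter11Wrapper {
    private final Map<String, Integer> backing;

    Counter11Wrapper(Map<String, Integer> backing) { this.backing = backing; }

    // typed accessor generated from the metadata: compile-time checked,
    // unlike get("Measurement") with a string key
    int getMeasurement() { return backing.get("Measurement"); }
}
```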

  27. FESA generic services

  28. 0. Outline • Our offering to the equipment specialist: “A comprehensive offering consisting of a model, method, framework, suite of tools and set of utility packages and support services” • Our progress at a glance: “FESA switches from being a project to being an open-ended activity”

  29. 2. Service offering • Model: formal generic model and customization language • Method: workflow formalized as a step-by-step method • Tools: one tool dedicated to each step of the above • Framework: reusable C++ package which can be tailored • Utility packages: “à la carte” interfacing with PLC and timing • Documentation: essentials, plus the tools’ on-line documentation • Training: mostly ad hoc, on-line tutorial • Support: mostly ad hoc • Consulting: long-term goal • Requirements: issue management system

  30. 3. Client needs coverage as of today (diagram of the FESA workflow steps): 1 design equipment software, 2 configure alarms, 3 implement in C++, 4 deploy on FEC (field-bus standards), 5 configure timings, 6 instantiate a hardware configuration, 7 test. http://project-fesa.web.cern.ch/project-fesa/

  31. FESA Development • Framework attempts to automate development • Minimize code that must be written by the developer via automatic generation of code and configuration information • Library support for timer cards and common devices • Generic GUI tools guide one through 4 main phases: • Designing the class structures (internal variables, real-time scheduling, external API etc.) • Deploying the class on a front-end computer • Instantiating 1 or more instances of a deployed class (defining configuration values for internal variables, real-time scheduling etc.) • Testing over the accelerator middleware ICALEPCS 2005

  32. FESA Development

  33. LHC Software Architecture • All accelerators share common characteristics • Create a model that captures those characteristics important for control • Have a common domain model • Have common software components to work with this model • Rationalize software development to reuse and extend the common parts to control all accelerators and transfer lines

  34. Three-tier architecture (diagram). Client tier: applications built on the JAPC API (TH1.5-8O) and the LSA Client API, talking JMS, HTTP (Spring HTTP remoting/proxies) and CORBA IIOP. Business tier (web container): the JAPC remote server (JMS), the LSA Client API implementation, parameter concentration, and the LSA core (TH1.4-8O) — settings, trim, trim history, generation, optics, exploitation, reference — with JAPC data access objects (DAO) over JAPC CMW/RDA and Hibernate/Spring JDBC. Resource tier: devices reached via CORBA IIOP, the datastore via JDBC. Architecture for LHC Controls – iCALEPCS 2005 - Lionel Mestre

  35. Concentration example (diagram): some 4000 Beam Loss Monitors (BLM1…BLM99 per front end) feed a BLMs concentration layer, which publishes and broadcasts the data to logging, fixed displays and four operator consoles.

  36. (diagram) Several operator consoles, each running many applications, share the complex business logic (settings, trim, trim history, generation, optics, exploitation, reference) that sits between them and the devices and datastore.

  37. Extrapolation to beamline settings (diagram): a beamline-control GUI talks over middleware to an app server (container) holding the beamline layout (e.g. H2 = [Tax1, Bend1, Coll3, …]) and the beamline settings (e.g. 150 GeV e-); these map through middleware and data modules (motor, Magea) onto the hardware configuration — Tax1, Bend1, Coll3 and motors Mot3, Mot4, Mot5.

  38. Complexity must be handled • Need for standard services • Service discovery (find where services are) • Remoting (split application among tiers) • Transaction handling (multiple device “sets”) • Database access (object-relational mapping) • Security (who/what/where can access)

  39. One Answer: J2EE + EJB • Infrastructure provides standard services • Widely used in industry • In-house experience • Change of programming model • Intrusive • Forces the use of a container • Forces the use of components • Ties the persistency to the container • Debug with application server on local PC • Deployment hell

  40. Another Answer: J2EE – EJB + Spring Framework • Design for 3 logical tiers • Run 2 or 3 physical tiers • Developers write plain Java • No change in the programming model • Focus on our domain • No time for doing infrastructure
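The “plain Java objects” style can be illustrated with a service that receives its dependency through the constructor, the way a lightweight container such as Spring would wire it: no container-specific interfaces, so the same class runs in a 2-tier test setup or behind an application server. All names below are invented, not the actual LSA classes:

```java
// Sketch of container-free business logic in the Spring style described
// above. TrimService depends only on an interface it receives in its
// constructor; in production a lightweight container injects the real
// CMW-backed gateway, while a test injects a stub. Hypothetical names.
interface DeviceGateway {
    double read(String device, String property);
}

class TrimService {
    private final DeviceGateway gateway;  // injected dependency, no EJB/JNDI lookup

    TrimService(DeviceGateway gateway) { this.gateway = gateway; }

    // Business method usable identically in 2-tier and 3-tier deployments
    double readBack(String device) {
        return gateway.read(device, "Measurement");
    }
}
```

The point of the design is testability: because `DeviceGateway` is a plain interface, the service can be exercised with a one-line stub and no running server.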

  41. Applications • Trim (perform + history browser/revert) • Orbit Steering • Generic Equipment Control • Fixed Displays • SDDS Logger and Viewer • Optics Twiss viewer • Settings Generation • Using the Application Frame

  42. Generic Equipment Control LSA for SACEC

  43. Generic Measurement

  44. Trim

  45. Trim history

  46. Visualization of the settings

  47. Orbit Steering

  48. Fixed Displays

  49. Optics Display

  50. Results and Future Targets • Control of TI8 (October 2004) • Steering of the SPS ring orbit • LEIR controls • SPS start-up • Extraction sequencing TI2/TI8 • LHC sector test
