
APECS 2.0 Upgrading the APEX Control System



Presentation Transcript


1. APECS 2.0: Upgrading the APEX Control System
Dirk Muders, Heiko Hafok, MPIfR, Bonn (Group Meeting, 10.3.2009)
• APECS History
• APECS 2.0 Development
• The Upgrade, Jan. 2009
• Changes compared to APECS 1.1
APECS = Atacama Pathfinder Experiment Control System

2. APECS Origins
• The ALMA Test Interferometer Control Software (TICS) was immediately usable at APEX due to the common hardware interface
• We decided to re-use the ALMA Common Software (ACS) and TICS and to benefit from the large development team (already a dozen people in the year 2000)
• ACS provides the Common Object Request Broker Architecture (CORBA) middleware

3. CORBA in 30 Seconds…
[Diagram: a Client (which does not know the deployment) calls through the ACS ORB (which knows the deployment) into a CORBA Component running inside an ACS Container]
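
As a minimal sketch of what this buys the client: with an ACS PySimpleClient, code only names a component and calls it; the ACS Manager/ORB resolves where it actually runs. The component name "APEX/TEST" and its on() method are hypothetical here.

```python
# Minimal ACS client sketch; "APEX/TEST" and on() are hypothetical.
# The client never states where the component is deployed; the ACS
# Manager/ORB resolves the reference at runtime.
from Acspy.Clients.SimpleClient import PySimpleClient

client = PySimpleClient()
comp = client.getComponentNonSticky("APEX/TEST")  # lookup by name only
comp.on()                                         # remote call, looks local
client.disconnect()
```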

4. TICS
• TICS is used to control the antenna via ACS/CORBA components that set up observing patterns
• These components run under VxWorks to meet the real-time requirements of the Controller Area Network (CAN) bus protocol
• Testing of the ALMA prototypes was performed mainly via low-level Python scripts using those components directly

5. What else was needed for APE(X)CS?
• To use ACS and TICS at APEX, all our devices (instruments, auxiliary hardware, etc.) needed to be represented as CORBA components
• An astronomer-friendly user interface was needed to set up typical sub-mm observing scans
• An online calibration pipeline was required to provide data products for astronomers

6. APECS Component Interface
• In contrast to ALMA, APEX was always supposed to host many different instruments (receivers and spectrometers), both facility and PI
• We therefore decided to define generic high-level instrument interfaces to be used by all devices of the same kind
• This greatly facilitates adding new devices and setting up observations, since the setup code is now also generic (see the sketch below)
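
To illustrate the idea (this is not the real APECS interface definition, just a sketch of the pattern), a generic backend interface could look like this in Python, letting observing setup code drive any spectrometer through the same calls:

```python
# Illustrative sketch only: a generic high-level backend interface.
# Observing setup code can treat every implementation identically.
from abc import ABC, abstractmethod

class GenericBackend(ABC):
    """Interface implemented by all backends of the same kind."""

    @abstractmethod
    def configure(self, n_channels: int, bandwidth_mhz: float) -> None:
        """Set up the spectral configuration."""

    @abstractmethod
    def start(self) -> None:
        """Begin data acquisition."""

    @abstractmethod
    def stop(self) -> None:
        """End data acquisition."""
```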

7. SCPI Interface Level
• The low-level hardware control systems could not use CORBA directly
• Instead we adopted an SCPI (Standard Commands for Programmable Instrumentation) ASCII protocol via sockets to communicate between the CORBA components and the hardware controllers (see the sketch below)
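
A minimal sketch of such an SCPI exchange over a socket (the host, port and command string are hypothetical; the real APEX instrument command set is not shown here):

```python
# Hypothetical SCPI-over-socket query; host, port and command are
# illustrative, not the actual APEX instrument addresses.
import socket

def scpi_query(host, port, command, timeout=5.0):
    """Send one ASCII SCPI command and return the single-line reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((command + "\n").encode("ascii"))
        reply = sock.makefile().readline()
    return reply.strip()

# Example (hypothetical): print(scpi_query("het345-host", 25001, "STATE?"))
```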

8. APECS SCPI Setup
[Diagram: a CORBA Component in an ACS Container, reached via the ACS ORB, talks SCPI either to the real hardware or to simulated hardware (emuEmbSys)]

9. SCPI Communication
• Decoupling CORBA from the hardware controllers proved to be extremely useful because:
  • it isolates APECS from hardware that is developed by many different groups / institutes
  • it allows plugging in simple emulator scripts to simulate a full system of instruments for APECS development without real hardware (see the toy example below)
  • all CORBA components can be created fully automatically without any further manual interaction
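
A toy version of such an emulator script (the port and replies are made up; real emulators implement the instrument's actual SCPI command set):

```python
# Toy SCPI emulator: acknowledges settings and answers queries with a
# canned value, so CORBA components can be tested without hardware.
import socketserver

class ScpiEmulator(socketserver.StreamRequestHandler):
    def handle(self):
        for raw in self.rfile:
            cmd = raw.decode("ascii").strip()
            if cmd.endswith("?"):
                self.wfile.write(b"0\n")      # query: canned reply
            else:
                self.wfile.write(b"OK\n")     # setting: acknowledge

if __name__ == "__main__":
    with socketserver.TCPServer(("", 25001), ScpiEmulator) as server:
        server.serve_forever()                # port 25001 is arbitrary
```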

10. The APECS Pipeline
• All communication between the Observing Engine and devices or other high-level applications is performed via ACS/CORBA

11. APECS 0.1 Installation (09/2003)

12. APECS 0.1 Installation (09/2003)

13. APECS 0.2-1.1
• Following the initial phase, APECS was extensively debugged and extended during the antenna and early instrument commissioning
• We continued to deliver patches and new releases to provide control for more complex instruments and observing modes and to fix bugs
• APECS 1.1 was still based on ACS 2.0.1 and ran on machines with RedHat Linux 7.2 (released in 2001, i.e. stone age!)

14. The APECS Upgrade
• Why upgrade at all?
  • Mainly, the old Linux was causing trouble:
    • Data rates keep increasing and new servers are needed to handle them
    • RH 7.2 can no longer be installed on them
  • In addition, ACS 2.0.1 has a number of known issues that have been cured in later versions
  • The old Linux libraries limit the development of new APECS applications

15. APECS 2.0 Development
• An upgrade was already planned in 2004
• In an effort led by J. Ibsen, ALMA had ported TICS to ACS 3.0/4.0
• But the ongoing APEX commissioning delayed our APECS porting
• Only in 2007 did we find the time to port the APECS and TICS codes to the then-current ACS 5.0

16. The VxWorks Drama
• Unfortunately, we discovered bugs in the ACS property monitoring that required at least ACS 6.0.4
• But that ACS version no longer worked with VxWorks → upgrade canceled in 01/2008
• A joint effort of ESO, UTFSM, the Keck Observatory and Remedy IT in the Netherlands made ACS 7.0.2 work under VxWorks in mid-2008

17. APECS 2.0 Software
• APECS 2.0 is based on ACS 8.0, Scientific Linux 5.2 (ahead of ALMA, which uses SL 4.4!) and VxWorks 6.6
• Using an up-to-date ACS allows us to benefit again from future ALMA developments and bug fixes
• During 2008, most of the TICS and APECS functionality was kept aligned with APECS 1.1
• Some areas are more advanced in APECS 2.0 (e.g. the Calibrator → Heiko’s presentation)

18. APECS 2.0 Hardware

19. APECS 2.0 Deployment
[Diagram: control2 (CORBA services, Observing Engine); abm (CORBA container with the antenna components, connected via CAN bus to the antenna: ACU, PTC, WIU); opt2 (frame grabber component); db2 (database); display2 (FitsWriter, Calibrator, talking SCPI to the instruments); instruments2 (CORBA containers with the instrument components, SCPI parsers); apexdev2 (observing and monitoring clients); everything linked via CORBA]

20. Software Changes
• No changes in the “apecs” CLI; observing scripts will run like before
• Access to CORBA properties via “apexObsUtils” did not change
• Component and method access did change with the new ACS (→ special training session)

21. Operational Changes
• To reduce the dependency on a working microwave link, APECS now runs only at the high site
• Access from Sequitor and from remote is via VNCs
• “stopAllAPECSClients” now stops observer processes too; always use “restartAPECSServers”, which includes stopping the clients. A restart now takes only 6 min.!
• More details in the training session

22. New Network Setup
• In addition to the new APECS, the network has been upgraded to 1(/10) Gbit (reaching 55 MB/s (!) for file transfers) and physically split into 3 subnets at Chajnantor:
  • Control (CORBA, SCPI)
  • Data (TCP streams from the backends to the FitsWriter)
  • Maintenance (everything else: web cams, thin clients, notebooks, etc.)
• Additional “SciOps” subnet for observational machines in Sequitor

23. [Diagram of the network layout:
Chajnantor: control2 (ObsEngine, CORBA services, VNC server), instruments2 (instrument control), display2 (FitsWriter, Calibrator), opt2 (optical camera, terminal to the ABM, pointing telescope), apexdev2 (APECS development, vncserver), apexdb2 (monitoring), the telescope/wobbler ABM, backends and other hardware devices on the control and data networks; web cams and tcserver on the maintenance network.
Sequitor: lastarria (archive, rsync) plus pular, tacora and apexdev as VNC clients connecting via VNC]

24. APECS 2.0 Testing [1]
• During the installation a number of tests were made to verify the performance:
  • Initial drive tests: no strange motions / vibrations
  • Tracking tests at different az/el: like in APECS 1.1
  • Optical pointing runs: agree with previous results
  • Radio scans with SHFI and SZ (calibration, pointing, focus, skydip, on, raster, otf (also holo mode), spiral raster) produced MBFITS and Class data as expected; line profiles and intensities agree

25. APECS 2.0 Testing [2]
• Throughput tests:
  • AFFTS with 28 × 8192 channels @ 10 Hz (8.75 MB/s!) without delays or other problems
  • SZ beam maps (which failed in Nov. 2008) with 141 subscans, OTF, continuous data (i.e. one 25-minute subscan!) without problems
• Data acquisition via the data network was tested with the PBEs and the SZ backend; ABBA/Bridge tests soon, spectrometers later
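
The quoted AFFTS rate is consistent with 32-bit (4-byte) samples per channel, which is an assumption here rather than something stated on the slide:

28 × 8192 channels × 4 bytes × 10 Hz = 9,175,040 bytes/s ≈ 8.75 MB/s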

26. Issues
• APECS 2.0 is good, but not perfect:
  • The ABM NC tasks (az/el & wobbler pos.) crash sometimes; being iterated with B. Jeram (ACS)
  • The online bolometer reduction needs iteration with real data
  • Gildas as of 12/2008 has some bugs; we will need to update to a more recent version soon
  • The cron jobs handling the syslogs need to be adjusted
  • The optView GUI freezes after “movie” mode

27. Conclusions
• APECS 2.0 with the new network performs much better than the previous system
• We are no longer bound to very old hardware and can fulfill the requirements of new instruments
• Observing modes that started failing last year are now possible again
• New observing modes with higher data rates can now be used
• Future developments are made easier by the new Linux and libraries

28. [No text content]

29. APECS Design
• APECS is designed as a pipeline system starting with a scan description (“scan object”) and eventually leading to data products
• The pipeline is coordinated by the central “Observing Engine”
• Most APECS applications are written in Python, but they use a lot of compiled libraries to speed up computations and transactions
• The astronomer interface is an IPython shell

30. Multi-Beam FITS (MBFITS)
• The lack of a good format to store array receiver data from single-dish radio telescopes led to the development of the MBFITS raw data format
• MBFITS stores the instrument and telescope data in a hierarchy of FITS files
• MBFITS is now being used at the APEX, Effelsberg and IRAM 30m telescopes

31. APECS 0.1
• The first version of APECS was installed in Chile in September 2003
• Much of the time was spent on hardware installations (network, racks, servers)
• We had to fight initial problems with failing pressurized hard disk boxes and missing infrastructure

32. The APECS Pipeline
• For each scan, the Observing Engine accepts a Scan Object from a user CLI and then:
  • sets up the receivers (tuning; amplifier calibration)
  • configures the backends
  • sets up auxiliary devices such as synthesizers, IF processors or the wobbler
  • tells the telescope to move in the requested pattern
  • starts and stops the data acquisition
  • asks the Calibrator to process the raw data to produce the final data product or result
(A sketch of this sequence follows below.)
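
In code, the per-scan sequence looks roughly like this (a hypothetical sketch; the real Observing Engine interfaces differ, only the order of operations is taken from the slide):

```python
# Hypothetical sketch of the per-scan pipeline; all names are illustrative.
def execute_scan(scan, receivers, backends, aux_devices, telescope, calibrator):
    for rx in receivers:
        rx.tune(scan.frequency)            # tuning, amplifier calibration
    for be in backends:
        be.configure(scan.backend_setup)   # backend configuration
    for dev in aux_devices:
        dev.setup(scan)                    # synthesizers, IF processors, wobbler
    telescope.start_pattern(scan.pattern)  # requested observing pattern
    for be in backends:
        be.start()                         # start data acquisition
    telescope.wait_until_done()
    for be in backends:
        be.stop()                          # stop data acquisition
    return calibrator.process(scan)        # final data product / result
```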

33. APECS 2.0 Software (ctd.)
• Basic operator / observer interfaces unchanged
• Many improvements, such as:
  • jlog with automatic filter loading
  • New Qt GUI behavior (e.g. Calibrator Client)
  • New Qt widgets available
  • New Python (2.5)
  • New libraries (e.g. SciPy, NumPy)
  • New Gildas (12/2008), which fixes weighting errors when combining spectra

34. APECS' Astronomer CLI [screenshot]

35. Software Changes [1]
• Names of some variables have changed:
  • APEXROOT → APECSROOT
  • APEXCONFIG → APECSCONFIG
  • APEXSYSLOGS → APECSSYSLOGS
• CORBA component name separator is now “/” instead of “:”, e.g. “APEX/RADIOMETER/RESULTS”
• Component access (needs the above syntax):
  • apexObsUtils.sc.get_device → apexObsUtils.sc.getComponentNonSticky
(See the sketch below.)
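
In an observing session the change looks roughly like this (“HET345” is taken from the examples on the next slide; the exact call signatures are sketched from the renaming above, not verified):

```python
# APECS 1.1 (old ACS, ":" separator):
het = apexObsUtils.sc.get_device("APEX:HET345")

# APECS 2.0 (new ACS, "/" separator):
het = apexObsUtils.sc.getComponentNonSticky("APEX/HET345")
```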

36. Software Changes [2]
• Property access is compatible with APECS 1.1:
  • apexObsUtils.getMCPoint with an arbitrary separator chosen from “/”, “:” and “.”, e.g. apexObsUtils.getMCPoint('APEX:HET345.state') or apexObsUtils.getMCPoint('APEX.HET345/state')
• List of components:
  • apexObsUtils.sc.COBs_available → apexObsUtils.sc.availableComponents
• Enums in MonitorQuery are now strings instead of integers (e.g. SHUTTER_OPEN)

37. Software Changes [3]
• Web scripts now need to run on “opt2” to fetch data from the DB2 (e.g. for the weather page)
• The Python wrapper for Gnuplot has been removed from SciPy; use Pylab / matplotlib instead
• Observer account administration is now on “apexdev2”

38. Operational Changes [1]
• Separate AMD maps to access /apexdata in Sequitor (lastarria:/apex_archive) and at Chajnantor (display2:/apexdata) to minimize microwave link traffic; initial reduction on “apexdev2”, later maybe also on “display2”
• System VNC: control2:1
• Observing VNC(s): apexdev2:1
• Other VNC(s) (Wobbler GUI, SHFI, etc.): apexdev2:2/3/4
• Observing accounts only on “apexdev2”

39. Operational Changes [2]
• Syslogs are now split per hour to reduce link traffic
• Syslogs can be loaded into “jlog” for inspection (“jlog -u $APECSSYSLOGS/APECS-<time stamp>.syslog.xml”)

40. Operational Changes [3]
• Two new accounts:
  • “apexops”: pointing & focus model, system source & line catalog administration in $APECSCONFIG
  • “apexdata”: the raw & science data and obslog areas are owned by this account to avoid manipulations via the “apex” account, whose password is not secret. The “apexdata” password must not be given to observers!!

41. Operational Changes [4]
• As a consequence, the data production programs have to run under “apexdata”
• This is accomplished by special scripts that use corresponding “ssh” and “sudo” commands (the pattern is sketched below):
  • fitsWriter start | stop | restart
  • onlineCalibrator start | stop | restart
  • obsLoggerServer start | stop | restart
  • obsEngine start | stop | restart (for convenience)
• Do not use the old restart scripts!
• The overall system still runs under “apex”!!
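
The pattern these scripts use might look like the following sketch (the real APECS scripts and the remote command may differ; “startFitsWriter” is made up for illustration, while “display2” hosting the FitsWriter is taken from the deployment slide):

```python
#!/usr/bin/env python
# Hypothetical sketch of the ssh + sudo pattern used by the wrapper scripts.
import subprocess
import sys

def run_as_apexdata(host, command):
    """Run `command` on `host` under the "apexdata" account via ssh/sudo."""
    return subprocess.call(["ssh", host, "sudo", "-u", "apexdata", command])

if __name__ == "__main__":
    action = sys.argv[1] if len(sys.argv) > 1 else "restart"
    # e.g. the FitsWriter runs on "display2"; "startFitsWriter" is invented
    sys.exit(run_as_apexdata("display2", "startFitsWriter " + action))
```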

42. Operational Changes [5]
• tycho, apexOptRun and MonitorQuery are now to be run on “opt2”
• DB2 entries now obey the complete naming hierarchy
• “stopAllAPECSClients” now stops observer processes too; always use “restartAPECSServers”, which includes stopping the clients. A restart now takes only 6 min.!
• ABM console now via “minicom 1” on “opt2”

43. APEX Staff To-Do List
• All Linux machines in Sequitor should be updated to SL 5.2
• At least “llaima” should soon be available for offline data reduction with SL 5.2 and APECS 2.0
• The LDAP and DNS services need to be moved to new servers (ideally off normal PCs)
• The remaining web cams etc. need to be reconfigured to the maintenance subnet
• Port the “fieldtalk” programs for the optical camera to SL 5.2

44. We would like to thank the APEX staff for their support during the installation and tests of APECS 2.0!
