

  1. Photon Systems Controls and Data Systems Gunther Haller Particle Physics and Astrophysics Division Photon Control and Data Systems (PCDS) Group June 9, 2009 v2

  2. Outline • LCLS versus LUSI Responsibilities • PPS/HPS (see previous presentation by Enzo) • XTOD X-Ray Transport Optics & Diagnostics • AMO Instrument Controls • LUSI Instruments • Diagnostics • Machine Protection System • Laser Safety System • EPICS Software • Femto-Second Laser Timing System • Data Acquisition (see Amedeo Perazzo presentation) • 2-D Detector Control and Readout (XPP, CXI, XCS) • Offline/Computing

  3. LCLS & LUSI • Common photon-area Controls and DAQ System Design for LCLS and LUSI • Includes Front-End Enclosure (FEE), Near Experimental Hall (NEH), X-Ray Tunnel (XRT) and Far Experimental Hall (FEH) • Most of XTOD is located in FEE • LCLS Controls & DAQ W.B.S 1.6.2 • AMO experiment (NEH, hutch 1) • All controls and DAQ • Common services for all hutches • Examples • PPS/HPS/LSS • 120 Hz beam-quality data interface • Machine protection system interface • Network interface • Local science data online processing & cache • 2-D detector • Control and DAQ for detector • LUSI Control & DAQ W.B.S • XPP, XCS, CXI Instruments, Diagnostics • Examples of documents released • 1.1-523 XES-LUSI ICD (includes Controls and Data systems) • 1.1-516 XES Photon Controls to Electron Beam Controls System ICD • 1.1-517 XES Photon Controls to Electron Beam Controls MPS ICD

  4. XES Near & Far Hall Hutches and Beamline Layout (diagram, not to scale; ~230 m of beamline with the AMO, SXR, and MEC hutches indicated)

  5. X-Ray Diagnostics in Front End Enclosure (annotated layout diagram) • Indirect imager / K measurement / spectral reticule • Total energy calorimeter • Direct imager • Solid attenuator • Gas detectors • Diagnostic package • Fast valve + fixed mask + slit • 5 mm diameter collimators • Gas attenuator • FEL offset mirror system (10⁻¹⁰ Torr) • Low-energy and high-energy beam paths (offsets: 24 mm vertical, 900 mm horizontal) • Muon shield

  6. XTOD in FEE • Phase 1 of XTOD in commissioning • Controls installed • Testing is paced by the availability of installed beam-line items

  7. Photon Area Controls and Data Systems Devices • Created web pages listing the supported controls and data-acquisition devices for the photon area • For all experiments in the x-ray area (AMO, XPP, XCS, SXR, CXI, etc.) • SXR might have up to 8 different end-stations • http://confluence.slac.stanford.edu/pages/viewpage.action?pageId=9175609 • Additional devices can be added after discussions with the Photon Controls and Data Systems (PCDS, G. Haller) group

  8. AMO Instrument Control • Requirements listed in Engineering Specification Documents (ESDs) • 1.6-108 AMO Controls ESD, status: released • 1.6-109 AMO DAQ ESD, status: released • Interfaces specified in Interface Control Document (ICD) • 1.1-515 XES AMO Controls ICD, status: released • Held Final Design Review 11-10-08 • http://confluence.slac.stanford.edu/display/PCDS/AMO-FDR • Purchased all hardware • Racks with infrastructure/AC installed in hutch 1 • Controllers/DAQ being tested in lab before moving to NEH early next month • See pictures later in presentation

  9. AMO and its DAQ Instruments • Diagnostic Section • magnetic bottle electron time-of-flight spectrometer • x-ray emission spectrometers • pulse energy monitor • beam profile screens • High Field Physics End-Station • electron time-of-flight spectrometers • ion spectrometers • time-of-flight, imaging, momentum • x-ray emission spectrometers

  10. Example: Vacuum • All gauge controllers are MKS 937A • Interface • Terminal server – DIGI TS16 MEI • Automation Direct PLC • All ion pump controllers are Gamma Vacuum DIGITEL MPC dual • All valves are controlled by PLC relay module • The out/not-out state of all valves goes into the MPS system to prevent damage if a valve closes unexpectedly.
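As an illustration of this interface chain, here is a minimal sketch of polling one gauge channel through the terminal server, assuming the MKS 937A serial line is exposed as a raw TCP port on the DIGI TS16; the hostname, port, and command string below are illustrative placeholders, not values from the slides.

import socket

# Hypothetical terminal-server host/port; the DIGI TS16 maps each serial
# line to its own TCP port (port assignment is site-specific).
TS_HOST = "ts-amo-vac01"   # illustrative hostname
TS_PORT = 2101             # illustrative raw-socket port for the gauge line

def read_pressure(channel: int = 1) -> str:
    """Query one pressure channel of an MKS 937A through the terminal server.

    The command string is an assumption based on the controller's simple
    ASCII request/response protocol; check the 937A manual for the exact
    syntax and termination characters.
    """
    with socket.create_connection((TS_HOST, TS_PORT), timeout=2.0) as sock:
        sock.sendall(f"PR{channel}\r".encode("ascii"))
        reply = sock.recv(64).decode("ascii", errors="replace").strip()
    return reply  # e.g. a pressure string such as "1.2E-08"

if __name__ == "__main__":
    print("gauge channel 1 pressure:", read_pressure(1))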

  11. Example: Motion • Control System provides support for all motions • Motors • IMS MDrive Plus2 integrated controller and motor • IMS MForce Plus2 controller for control of in vacuum and other specialized motors • Newport motor controllers • Others as required • Pneumatic motion • Solenoid Driver chassis, SLAC 385-001
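In the EPICS-based control system described later in this presentation, each such axis would typically appear as an EPICS motor record; a minimal sketch of commanding a move from Python with pyepics follows (the PV name is a made-up placeholder).

from epics import caget, caput

# Hypothetical motor-record PV for one AMO table axis; actual PV naming
# follows the facility convention and is not given in the slides.
PV = "AMO:TBL:01:X"

def move_and_wait(position_mm: float, timeout: float = 30.0) -> float:
    """Request an absolute move and block until the motor record finishes."""
    caput(PV + ".VAL", position_mm, wait=True, timeout=timeout)  # .VAL starts the move
    return caget(PV + ".RBV")  # .RBV is the readback position

if __name__ == "__main__":
    print("now at", move_and_wait(12.5), "mm")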

  12. AMO Test Setup in Lab (photos) • Racks in lab, AMO controllers being installed • Pulse picker and DAQ test setup

  13. AMO in Hutch 1 • Racks installed, AC connected • Will move loaded crates from lab to NEH racks in July

  14. Software: AMO Example: Moving Tables

  15. AMO Test Setup in Lab (photos) • AMO table motion control mock-up for motion software algorithm • Smart Motors

  16. LUSI & SXR • LUSI includes the instruments • XPP (NEH, hutch 3) • XCS (FEH, hutch 4) • CXI (FEH, hutch 5) • Have written and released ESDs and ICDs for each instrument • Held XPP Final Design Review, CXI & XCS Preliminary Design Review • See under AMO and CXI and XCS at http://confluence.slac.stanford.edu/display/PCDS/PCDS+Home • SXR • Beamline was added to the scope • Have drafts of ESD and ICDs • Held discussions with several end-stations on how to make them compatible with LCLS

  17. Common Diagnostics Readout • Quad-detector geometry diagram (labels: R1, R2, q1, q2, Target, L, FEL) • E.g. intensity, profile monitor, intensity-position monitors • E.g. Canberra PIPS or IRD SXUV large-area diodes (single or quad) • Amplifier/shaper/ADC for control/calibration/readout • Four-diode design • On-board calibration circuits not shown • Board designed, fabricated, loaded, is in test
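For the quad-diode option, the beam position is conventionally derived from difference-over-sum ratios of the four diode signals; a small sketch of that standard calculation follows (the quadrant ordering and calibration constants are placeholders, not taken from the board design).

def quad_position(a: float, b: float, c: float, d: float,
                  kx: float = 1.0, ky: float = 1.0):
    """Difference-over-sum position estimate from four quadrant diodes.

    a = top-left, b = top-right, c = bottom-left, d = bottom-right signals
    (after pedestal subtraction).  kx, ky are geometry/gain calibration
    constants that would come from the on-board calibration circuits.
    """
    total = a + b + c + d
    if total <= 0:
        raise ValueError("no signal on quad detector")
    x = kx * ((b + d) - (a + c)) / total   # horizontal offset
    y = ky * ((a + b) - (c + d)) / total   # vertical offset
    return x, y, total                     # total also serves as intensity

# Example: beam pushed toward the top-right quadrant
print(quad_position(1.0, 1.4, 0.9, 1.3))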

  18. MPS Rack Layout (Reminder) • Dedicated optical fiber: 1 single-mode pair per Link Node for custom MPS signaling protocol • TCP/IP networking: redundant single-mode pairs for 1000Base-LX Ethernet • Access to EPICS process variables on MPS Link Nodes via local Ethernet switch • MPS Link Node console logins via terminal server • Remote power-cycling of Link Nodes via APC AP7900 Switched Rack PDU • All optical networking paths route through experiment hall patch panels and terminate at the fiber patch panel in the MCC

  19. Installation Status • Near Experiment Hall Machine Protection Rack (B950S-04) installed • 48-strand fiber trunk pulled between B950B-42 (NEH Server Room) and B950S-04; the NEH MPS optical networking path is complete • Ethernet switch installed in rack B950S-04, connected to fiber, and checked

  20. MPS Installation Status • MPS trunk field rails assembled • MPS trunk cables pulled between the hutches, the FEE, and rack B950S-04 • All DIN rails for rack B950S-04 assembled, tested, and installed • 12 MPS Link Node Input Aggregation Rails • 6 MPS Trunk Head-End Rails

  21. PPS/MPS Racks • Installed in NEH

  22. NEH Laser Bay Laser Safety System Layout and Components

  23. Laser Safety System (photos) • Laser room entrance • Start of installation on first table • Laser Safety System control rack

  24. EPICS/Python/Qt • EPICS (Experimental Physics and Industrial Control System): • Control software for real-time systems • Monitor (pull scheme) • Alarm • Archive • Widely used at SLAC and other labs • More: http://www.aps.anl.gov/epics/ • Python/Qt provides the user interface on top of the EPICS drivers and records • System is currently used for XTOD and AMO, provided as part of the XES Photon Controls Infrastructure • Later used for SXR, CXI, XPP, XCS, MEC
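A minimal sketch of the Python/Qt-on-EPICS layering: a PyQt label that follows an EPICS process variable through a pyepics monitor callback. PyQt4 is assumed here since it was current at the time, and the PV name is illustrative.

import sys
from PyQt4 import QtGui, QtCore
from epics import PV

PVNAME = "AMO:DIAG:GD:ENERGY"     # illustrative PV, not from the slides

class PvLabel(QtGui.QLabel):
    """Label that tracks an EPICS PV using a camonitor-style callback."""
    changed = QtCore.pyqtSignal(str)

    def __init__(self, pvname):
        super(PvLabel, self).__init__("waiting...")
        self.changed.connect(self.setText)            # marshal update onto the GUI thread
        self.pv = PV(pvname, callback=self._on_change)

    def _on_change(self, pvname=None, value=None, **kw):
        # Called from the Channel Access thread; emit a Qt signal instead of
        # touching the widget directly.
        self.changed.emit("%s = %s" % (pvname, value))

if __name__ == "__main__":
    app = QtGui.QApplication(sys.argv)
    w = PvLabel(PVNAME)
    w.show()
    sys.exit(app.exec_())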

  25. Consoles: PyQt versus EDM

  26. Status • AMO EPICS drivers almost complete • More work scheduled over the next two months • High-level control panels still needed

  27. Femto-Second Laser Timing System • RF Cavity System provided by SLAC • Laser Fiber driver/receiver system provided by LBL • Tested performance of LBL system prototype: better than 20 fsec drift performance (100 fsec is spec) • 2-Receiver, 1-Driver system will be installed at SLAC in July 09.

  28. Designed/Built at LBL

  29. RF Cavity Electronics for Femto-Second Timing System in Lab (photos) • Control/DAQ crate with IOC and SLAC 119 MHz waveform digitizer • RF cavity • SLAC chassis

  30. Example: LUSI Instrument CXI (layout diagram) • Diagnostics/common optics • Diagnostics & wavefront monitor • 1 micron sample environment (*) • 0.1 micron KB & sample environment, particle injector and IToF (CD-4) • 1 micron KB • Reference laser • All early science except items marked (*)

  31. Controls Subsystems • Vacuum • Motion • Viewing • Power Supplies • Racks and Cabling • Other items • Software: EPICS/Python/Qt • Type of controls • Valve Control • Vacuum Controls • Pop-In Profile Monitor Controls • Pop-In Intensity Monitor Controls • Intensity-Position Monitor Controls • Slit Controls • Attenuator Controls • Pulse Picker Controls • KB Mirror Controls • X-Ray Focusing Lens Control • Sample Environment Controls • Particle Injector Controls • Ion ToF Controls • Vision Camera Controls • Detector Stage Controls • Reference Laser Controls • DAQ Controls

  32. CXI Components to Control • X-Ray Optics KB system • Motion • Vendor provided, integration with LCLS • Reference Laser • Motion • Sample Environment • Sample Chamber • Motion, vacuum, vision • Ion ToF • HV, DC/pulser, digitizer • Instrument Stand • Motion • Detector Stage • Motion, vacuum, thermal • Particle Injector • Motion, vacuum, digitizer, vision, integration of commercial component • Vacuum System • Valve and Vacuum Controls

  33. CXI Components to Control (cont'd) • Diagnostics and Common Optics • Pop-In Profile Monitor • Motion, Viewing • Pop-In Intensity • Motion, Digitization • Intensity Position • Motion, Digitization • Slit System • Motion • Attenuator • Motion • Pulse-Picker • Motion, Viewing • X-Ray Focusing Lens • Motion • CXI specific interface and programming • Racks & Cabling • Workstations • Vision Cameras • Beam Line Processor • Channel Access Gateway • Machine Protection System • Configuration • Data Acquisition • Use standard controllers on photon area controller list as much as possible

  34. LUSI Data Acquisition for Pixel Detectors • Cornell and Brookhaven 2-D pixel detectors are configured & read out using the SLAC ATCA Reconfigurable Cluster Element modules • Details in following slides • XPP XAMP Detector with custom ASIC • CXI Detector with custom ASIC

  35. Data System Architecture (diagram: XPP-specific layers plus common Photon Control Data Systems (PCDS) layers) • L0: Control (one) • L1: Acquisition (many), fed by digitizers + cameras, beam line data, and timing • L2: Processing (many) • L3: Data Cache (many) • DAQ system primary features • Trigger and readout • Process and veto • Monitoring • Storage

  36. XAMP 2D-Detector Control and DAQ Chain (diagram: Brookhaven XPP/XCS 2D detector ASIC → SLAC FPGA front-end board → fiber → ATCA crate with SLAC DAQ boards, e.g. the SLAC Reconfigurable Cluster Element module) • XAMP (XPP) LUSI instrument custom integrated circuits from Brookhaven are already connected at SLAC to the SLAC LCLS high-performance DAQ system • XPP BNL XAMP detector, 1,024 x 1,024 array • Uses 16 each 64-channel FexAmp BNL custom ASICs • Instantaneous readout: 16 ASICs x 4 ch x 20 MHz x 16 bit ≈ 20 Gbit/sec into FPGA • Output of FPGA: 250 Mbytes/s at 120 Hz (1024 x 1024 x 2 x 120) • In addition BNL has an ATCA crate with SLAC modules to develop software and test with the detector • ATCA • Advanced Telecommunications Computing Architecture • Based on backplane serial communication fabric, 10-GE • 2 SLAC custom boards (also used in other SLAC experiments) • 8 x 2.5 Gbit/sec links to detector modules • Dataflow and processing • Managed 24-port 10-G Ethernet switching • Essentially 480 Gbit/sec switch capacity • Naturally scalable
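A quick back-of-the-envelope check of the rates quoted above, using only the numbers on the slide:

# XAMP front-end link rate: 16 ASICs x 4 output channels x 20 MHz x 16 bit
asics, ch_per_asic, sample_rate_hz, bits = 16, 4, 20e6, 16
into_fpga_gbit = asics * ch_per_asic * sample_rate_hz * bits / 1e9
print("into FPGA: %.1f Gbit/s" % into_fpga_gbit)      # ~20.5 Gbit/s

# FPGA output: 1024 x 1024 pixels x 2 bytes x 120 Hz
out_mb = 1024 * 1024 * 2 * 120 / 1e6
print("FPGA output: %.0f MB/s" % out_mb)              # ~252 MB/s, i.e. ~250 MB/s

# ATCA switch capacity: 24 ports x 10 Gbit/s, counting both directions
print("switch capacity: %d Gbit/s" % (24 * 10 * 2))   # 480 Gbit/s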

  37. Example: XPP Online Processing • Electronics gain correction (in RCE) • Response of amplifying electronics is mapped during calibration • Science data images are corrected for channel gain non-uniformity + non-linearity. • Dark image correction (in RCE) • Dark images accumulated between x-ray pulses • Averaged dark image subtracted from each science data image • Flat field correction (in RCE) • Each science data image is corrected for non-uniform pixel response • Event filtering (in RCE or later) • Events are associated with beam line data (BLD) via timestamp and vetoed based upon BLD values. Veto action is recorded. • Images may be sparsified by predefined regions of interest. • Event binning (processing stage) • Images (and normalization) belonging to the same bin (dt, Eg, ..) are summed together
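A minimal NumPy sketch of the per-image correction chain listed above (linear gain, dark subtraction, flat field); the calibration maps are placeholders, non-linearity correction is omitted, and the actual RCE implementation is not shown in the slides.

import numpy as np

def correct_image(raw, gain, dark, flat):
    """Apply the correction chain from the slide to one science image.

    raw  : uint16 detector frame
    gain : per-pixel electronics gain map from calibration (linear part only)
    dark : averaged dark image accumulated between x-ray pulses
    flat : per-pixel response (flat-field) map
    All arrays share the detector shape; values here are placeholders.
    """
    img = raw.astype(np.float32) / gain   # electronics gain correction
    img -= dark                           # dark image subtraction
    img /= flat                           # flat-field (pixel response) correction
    return img

# Toy 1024x1024 example with unit calibrations
shape = (1024, 1024)
raw = np.random.randint(0, 4096, shape, dtype=np.uint16)
corrected = correct_image(raw, np.ones(shape, np.float32),
                          np.zeros(shape, np.float32),
                          np.ones(shape, np.float32))
print(corrected.shape, corrected.dtype)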

  38. Example: XPP Monitoring • A copy of the data is distributed (multicast) to monitoring nodes on the DAQ subnet. • The monitoring nodes will provide displays for experimenters’ viewing: • Corrected XAMPS images at ≥ 5 Hz • Histories of veto rates, beam intensity, + other BLD values. • Reduced analysis of sampled binned data (versus scan parameter) • Implemented with Qt (C++/Python open source GUI)

  39. Example: XPP XAMPS Data Rates (data-flow diagram: XAMPS → 4 x 2.5 Gb PGP → L1: RCE → 10 GbE → L2: Processing → 10 GbE → L3: Cache, with L0: Control, timing, and beam line data as in the architecture slide) • 240 MB/s (480 MB/s) out of the XAMPS detector • < 200 MB/s after processing • n x (200 MB – 20 GB) in the data cache • Binned data archived at end of run (mins – hrs) • Expect ~6 – 60 GB / day

  40. CXI 2D-Detector Mechanical/Electrical Vacuum Assembly (SLAC PPA Engineering electrical/mechanical package; annotated model with one quadrant raft removed to show the pixel detectors) • Positioning plate • Supports quadrant raft • Mounts to drive system • Cam follower mounted to torque ring • Hole size remotely adjustable via PCDS controls • Cut-outs in base plate for cold straps and cables • Cold strap

  41. Quadrant Board and Electrical Interfaces • Quadrant raft provides structural support and stability for the double-detector packages • Feet mount on quadrant raft through holes in the quadrant boards • This is also the thermal path • Quadrant boards provide grounding, power, and signal interface to the PAD detector package • 1 flex cable per detector • 1 FPGA on each quadrant board • (Diagram labels: cold strap, quadrant raft, mounting feet, double-detector package (4 per quadrant), detector, quadrant board 1, quadrant board 2)

  42. ASIC Board • Rigid-flex ASIC board (SLAC design) • ASIC Bump-bonded to detector • ASIC/detector package bonded to carrier board

  43. CXI 2D-Detector Control and DAQ Chain (diagram: Cornell detector/ASIC with SLAC quadrant board and carrier board in vacuum → ground isolation → fiber → ATCA crate with SLAC DAQ boards, e.g. the SLAC RCE ATCA module) • Each Cornell detector has ~36,000 pixels • Controlled and read out using Cornell custom ASIC • ~36,000 front-end amplifier circuits and analog-to-digital converters • Initially 16 x 32,000-pixel devices, then up to 64 x 32,000-pixel devices • 4.6 Gbit/sec average with > 10 Gbit/sec peak

  44. CXI Online Processing • Electronics gain correction (in RCE) • Response of amplifying electronics is mapped during calibration • Science data images are corrected for channel gain non-uniformity + non-linearity. • Dark image correction (in RCE) • Dark images accumulated between x-ray pulses • Averaged dark image subtracted from each science data image • Flat field correction (in RCE) • Each science data image is corrected for non-uniform pixel response • Event filtering (in RCE or later) • Events are associated with beam line data (BLD) via timestamp and vetoed based upon BLD values. Veto action is recorded. • Images may be sparsified by predefined regions of interest.

  45. CXI Online Processing (cont'd) • Event processing (processing stage) • Examples are • Sparsification (region of interest) • Locating center • Reducing data by binning pixels • Masking errant pixels (saturated, negative intensity from dark image subtraction due to e.g. noise, non-functioning pixels, edge pixels from a moving center) • Filling in missing data with centro-symmetric equivalent points • Transforming camera geometry due to solid angle coverage and dead space between tiles • Radial averaging, showing intensity versus scattering angle or momentum transfer • Computing the 2D autocorrelation function (single FFT) and storing it, essentially at a rate of 1 Hz with 4 MB (2 Mpixel x 2 bytes) frames • Peak finding (locate and fit Gaussian intensity peaks); there may be multiple peaks in some cases and the peak-finding algorithms should be able to identify up to a few thousand peaks • The CXI instrument will have an ion time-of-flight detector which will produce data at 120 Hz; the online processing of this data involves data reduction based on thresholding and vetoing based on thresholding or the fitting of peak positions and heights
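A sketch of two of the steps listed above, the single-FFT 2D autocorrelation and radial averaging, in NumPy; the frame size and bin count are illustrative.

import numpy as np

def autocorrelation_2d(frame):
    """2D autocorrelation via one FFT round trip (Wiener-Khinchin)."""
    f = np.fft.fft2(frame.astype(np.float32))
    return np.fft.fftshift(np.fft.ifft2(np.abs(f) ** 2).real)

def radial_average(frame, center=None, nbins=200):
    """Mean intensity versus radius (proxy for scattering angle / q)."""
    ny, nx = frame.shape
    cy, cx = center if center is not None else (ny / 2.0, nx / 2.0)
    y, x = np.indices(frame.shape)
    r = np.hypot(y - cy, x - cx)
    bins = np.linspace(0, r.max(), nbins + 1)
    idx = np.digitize(r.ravel(), bins) - 1
    sums = np.bincount(idx, weights=frame.ravel(), minlength=nbins)
    counts = np.bincount(idx, minlength=nbins)
    return sums[:nbins] / np.maximum(counts[:nbins], 1)

# ~2 Mpixel frame, as quoted for the 1 Hz autocorrelation case
frame = np.random.poisson(5.0, (1448, 1448)).astype(np.float32)
acf = autocorrelation_2d(frame)
profile = radial_average(frame)
print(acf.shape, profile.shape)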

  46. CXI Monitoring • A copy of the data is distributed (multicast) to monitoring nodes on the DAQ subnet. • The monitoring nodes will provide displays for experimenters’ viewing: • corrected detector images at ≥ 5 Hz • histories of veto rates, beam intensity, + other BLD values. • Reduced analysis of sampled binned data (versus scan parameter) or other processing tbd • Implemented with Qt (C++/Python open source GUI)

  47. Offline • Offline Analysis Examples • XCS Multi-tau multi-Q processing • CXI Classification • Averaging • Alignment • Phase Retrieval • To be addressed • CXI • Data Sorting/Averaging (Classification) • Phase Retrieval

  48. Offline Data Management System Functions • Translate data into a form for long-term scientific access • Translate to HDF5 format from the online XTC (eXtended Tag Container) format (C++) • Store data attributes in a database (science metadata) • Store translated data and archive to tape • Provide access for scientists to stored and archived data • Access by attributes (science metadata) • Manage data ownership and access restrictions • Data access for: processing clusters, user workstations, offsite export (network and other means) • Manage derived data products
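A minimal sketch of the output side of the XTC-to-HDF5 translation step, writing one frame plus metadata attributes with h5py; the group, dataset, and attribute names are illustrative rather than the actual LCLS/NeXus schema.

import h5py
import numpy as np

def write_frame(filename, frame, run, timestamp, photon_energy_ev):
    """Write one detector frame and its metadata into an HDF5 file.

    The group/dataset layout here is ad hoc; the real translator follows
    the chosen HDF5/NeXus conventions rather than this example schema.
    """
    with h5py.File(filename, "a") as f:
        grp = f.require_group("run_%04d" % run)
        dset = grp.create_dataset("frame_%d" % timestamp, data=frame,
                                  compression="gzip", compression_opts=4)
        dset.attrs["timestamp"] = timestamp
        dset.attrs["photon_energy_eV"] = photon_energy_ev

if __name__ == "__main__":
    write_frame("cxi_example.h5",
                np.zeros((1024, 1024), dtype=np.uint16),
                run=12, timestamp=1244500000, photon_energy_ev=8300.0)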

  49. Offline System Architecture Steffen Luitz

  50. Components • Data format: HDF5/NeXus chosen because of support and wide availability of language bindings • File manager: iRODS (next generation of SDSC SRB) • Tape manager: interface iRODS to SLAC HPSS and mass storage system (SL8500) • High-performance file systems: Lustre • Databases: MySQL • Monitoring: Ganglia • Data transfer: currently performing data transfer studies with the CAMP group in Germany (Peter Holl) to write and retrieve few-hundred-gigabyte files using a few data transfer options • sftp (expect 2 MB/s) • bbcp (expect 20 MB/s using around 15 streams) • FTD (around 20 MB/s using 64 streams) • User accounts, etc. via SCCS, Unix • In 2010 will move to a server-based solution: XROOTD, iRODS, http • Steffen Luitz
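For scale, the quoted throughputs imply very different turnaround times for the few-hundred-gigabyte test files; a quick arithmetic sketch with a 300 GB example file:

# Transfer-time estimates for a 300 GB file at the throughputs quoted above
size_gb = 300.0
for tool, mb_per_s in [("sftp", 2.0), ("bbcp (15 streams)", 20.0), ("FTD (64 streams)", 20.0)]:
    hours = size_gb * 1000.0 / mb_per_s / 3600.0
    print("%-18s %6.1f MB/s -> %5.1f hours" % (tool, mb_per_s, hours))
# sftp at 2 MB/s takes ~42 hours; bbcp/FTD at 20 MB/s take ~4.2 hours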
