NLC Archiving Requirements (Preliminary)

Presentation Transcript


  1. NLC Archiving Requirements (Preliminary)

  2. Why does the NLC need a new Archiver?
  • Since the NLC will be such a complex machine, troubleshooting problems requires greater flexibility in the archived data than is currently available
  • Archived data for some devices needs to be on the scale of milliseconds over a short period (seconds), while data for other devices may be on the scale of minutes over a longer period (tens of minutes to hours)
  • Data viewing should present a consistent interface across the many control system applications: archived data, correlation plot data, buffered data acquisitions, configuration files, etc.

  3. Why does the NLC need a new archiver (cont.)?
  • Collaborators will be based internationally, so the data must be accessible from around the world
  • The amount of data is huge (not including fast-time-based archives):
      600,000 pvs x 80 bytes/pv x 20 sets of data/hour ≈ 1.0 Gbyte/hour
      1.0 Gbyte/hour x 24 hours ≈ 24 Gbytes/day
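
  The arithmetic above can be checked with a quick back-of-the-envelope script; the constants are the figures quoted on the slide, while the script itself is only an illustrative sketch:

      # Rough archive-volume estimate using the figures quoted on the slide.
      PV_COUNT = 600_000        # process variables to archive
      BYTES_PER_PV = 80         # bytes stored per pv per data set
      SETS_PER_HOUR = 20        # archived data sets per hour

      bytes_per_hour = PV_COUNT * BYTES_PER_PV * SETS_PER_HOUR
      gbytes_per_hour = bytes_per_hour / 1e9
      gbytes_per_day = gbytes_per_hour * 24

      print(f"~{gbytes_per_hour:.1f} Gbyte/hour")   # ~1.0 Gbyte/hour
      print(f"~{gbytes_per_day:.0f} Gbytes/day")    # ~23 Gbytes/day (the slide rounds to 24)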

  4. Requester and Archiver
  • Requester
    • Interface between user and archiver
    • Supports time-based entries, monitor-based entries, triggered entries, time-based + monitor-based entries and fast-timed entries (ring buffers)
    • Validates requests
    • Bookkeeper of archived channels/rates/requesters
    • Limits requests when necessary
    • Generates diagnostic reports
  • Archiver
    • Retrieves pv data at specified rates or monitor intervals and stores the data in files quickly and efficiently
    • Limited ring buffer capability (see the sketch below)
    • Ability to create "other" data files for configuration data, correlated data, etc.
    • Must be able to handle up to ~600,000 pvs
    • Much of the data will come in bursts instead of at regular intervals
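
  The fast-timed entries and ring buffers mentioned above suggest a fixed-depth per-pv buffer that silently overwrites its oldest samples, keeping only a short recent window. A minimal sketch of that idea follows; the class name, default depth and (timestamp, value) sample format are assumptions, not part of the requirements:

      import time
      from collections import deque

      class FastPVRingBuffer:
          """Fixed-depth ring buffer for one pv's fast-timed samples.

          Once full, the oldest samples are overwritten, so only the most
          recent window (e.g. a few seconds at millisecond rates) is kept.
          """

          def __init__(self, depth=4096):
              self._samples = deque(maxlen=depth)   # (timestamp, value) pairs

          def record(self, value, timestamp=None):
              # Called at the fast sampling rate; the oldest entry drops automatically.
              self._samples.append((time.time() if timestamp is None else timestamp, value))

          def snapshot(self):
              # Freeze the current window, e.g. when a trigger fires or on request.
              return list(self._samples)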

  5. Data Converter and Data Retriever
  • Data Converter
    • Runs periodically to convert data from fast file save format to fast file access format
    • Not part of the archiver
    • Ages and compresses data when necessary (one possible approach is sketched below)
    • Calculates statistics of compressed data
    • Keeps master index of data locations
  • Data Retriever
    • Accepts requests to retrieve data
    • Validates requests
    • Utilizes master index to locate data
    • Can process data in fast file save format and also fast file access format
    • Can retrieve data from archiver, correlation plots, buffered data acquisitions and configuration files
    • Sends data back to requester
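
  One plausible reading of "ages and compresses data" and "calculates statistics of compressed data" is to collapse raw samples into coarser time bins while retaining summary statistics. The sketch below assumes that policy; the bin width and the particular statistics kept are among the open questions on slide 8:

      from statistics import mean

      def age_samples(samples, bin_seconds=60):
          """Collapse (timestamp, value) samples into fixed-width time bins,
          keeping count/mean/min/max per bin as the retained statistics."""
          bins = {}
          for ts, value in samples:
              start = int(ts // bin_seconds) * bin_seconds   # bin start time
              bins.setdefault(start, []).append(value)

          aged = []
          for start in sorted(bins):
              values = bins[start]
              aged.append({
                  "bin_start": start,
                  "count": len(values),
                  "mean": mean(values),
                  "min": min(values),
                  "max": max(values),
              })
          return aged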

  6. Browser
  • User interface for data retrieval
  • Web-based component
  • Capability to display graphs and histograms
  • Capability to perform simple calculations (mean, std dev, etc.) and simple fits (see the sketch below)
  • Ability to zoom, pan and lasso graphics
  • Ability to export data to various output formats (MATLAB, spreadsheets, etc.)
  • Ability to display time-based data, correlated data, correlated history data, configuration data, etc.
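
  The "simple calculations" and spreadsheet-export capabilities above amount to something like the following sketch; the function names and CSV layout are assumptions, and a real browser would drive these from its graphical interface:

      import csv
      from statistics import mean, stdev

      def summarize(values):
          # The kind of "simple calculation" the browser is expected to offer.
          return {"n": len(values),
                  "mean": mean(values),
                  "std_dev": stdev(values) if len(values) > 1 else 0.0}

      def export_csv(path, samples):
          # Export (timestamp, value) pairs in a spreadsheet-friendly format.
          with open(path, "w", newline="") as f:
              writer = csv.writer(f)
              writer.writerow(["timestamp", "value"])
              writer.writerows(samples)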

  7. Requester/Archiver Questions
  • Requester Questions:
    • Can anyone make archiver requests? Should there be special privileges for different types of requests (based on rate? subsystem?)
    • Do people based internationally need this capability? Should it be web-based?
  • Archiver Questions:
    • How closely sync'd in time do the IOCs need to be?
    • What are the requirements for fast-time-based entries (ring buffers)? How many data points should be available? What is the maximum frequency which needs to be sampled (based on hardware requirements)?

  8. Data Converter/Data Retriever Questions
  • Data Converter Questions:
    • How is data aged? Should statistics be kept on all aged data? If so, what type? (May depend on hardware requirements.)
    • Do users have input as to how much data is kept?
    • How long should data be kept available online?
    • How often should data be converted? Once daily? More often?
    • What are the performance requirements?
  • Data Retriever Questions:
    • What are the performance requirements? How much data, and how fast, must retrieval be for it to be acceptable to users?

  9. Browser Questions
  • Does the browser need security imposed, or can anyone request data?
  • What are the display performance requirements? How many pvs? How many graphs? Timing?
  • How sophisticated should the calculation capability be? Can the browser do simple calculations and simple fits and rely on exporting data for sophisticated calculations and fits?
  • Or, do we want the browser to encompass everything the user wants to do with the data? Should MATLAB (or some other calculation engine) and a spreadsheet program be integrated into the browser?

  10. Technical Challenges
  • Optimizing algorithms that store and access data quickly and efficiently
  • Finding storage media that can handle the volume of data to be archived while meeting performance requirements
  • Creating a robust system with built-in redundancy that handles failover seamlessly
  • Designing a system that meets today's requirements yet remains flexible, since the requirements will inevitably change over time