
Database Reports and the IOC Crawler

Presentation Transcript


  1. Database Reports and the IOC Crawler Presented by Katia Danilova 09/01/2005

  2. Please • Hold your questions and comments; write down your questions and suggestions! • A discussion is planned after the presentation • We can use your notes to address the problems later • Thank you!

  3. Why discuss Reports and the Crawler together: • They are “complementary” programs providing access to IOC configuration info for the client (user) • The Crawler takes IOC data from EPICS servers and puts it into the database • Web-based reports deliver this data from the database to the end user • The ROCS system is under development • Reports on request (in the form of a spreadsheet or flat file) are possible
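As a rough illustration of the shared database the two programs meet in, the sketch below creates a toy table that the Crawler could fill and the web reports could read. The table, column names, database name, and credentials are assumptions for illustration only, not the actual IRMIS/ROCS schema.

use strict;
use warnings;
use DBI;

# Placeholder connection; database name and credentials are assumptions.
my $dbh = DBI->connect('dbi:mysql:irmis', 'user', 'pass', { RaiseError => 1 });

# Toy table: the Crawler writes rows like these, web reports only read them.
$dbh->do(q{
    CREATE TABLE IF NOT EXISTS ioc_profile (
        ioc_name      VARCHAR(64) PRIMARY KEY,
        epics_version VARCHAR(32),
        boot_dir      VARCHAR(255),
        last_crawled  DATETIME
    )
});
$dbh->disconnect;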

  4. Data Flow View:

  5. Crawler • The Crawler is a computer program which “crawls” through the IOC configuration-related files on EPICS boot servers, collects the information, constructs a profile for each IOC, and automatically saves this data in a relational database. • Language used: • Initially: Java (Jeff Patton) • Currently: Perl (Don Dohan, Jeff Patton, Greg Lawson) • Relational database used: • MySQL (Argonne) • Oracle (SNS)
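A minimal Perl sketch of the crawl-and-store idea described above (not the actual IRMIS crawler): walk the IOC directories on a boot server, build a small profile per IOC, and save it through DBI. The boot root, the iocInfo/epicsVersion path, the ioc_profile table, and the connection details are all assumptions.

use strict;
use warnings;
use DBI;

my $boot_root = '/epics/boot';    # assumed boot server root, not a real SNS path
my $dbh = DBI->connect('dbi:mysql:irmis', 'user', 'pass', { RaiseError => 1 });

opendir my $dh, $boot_root or die "cannot open $boot_root: $!";
for my $ioc (grep { -d "$boot_root/$_" && !/^\./ } readdir $dh) {
    my %profile = (name => $ioc);

    # Read the EPICS version from the iocInfo/ area the slides mention (path assumed).
    if (open my $fh, '<', "$boot_root/$ioc/iocInfo/epicsVersion") {
        my $ver = <$fh>;
        close $fh;
        if (defined $ver) { chomp $ver; $profile{epics_version} = $ver; }
    }

    # Save (or refresh) the IOC profile; REPLACE INTO is MySQL-specific.
    $dbh->do('REPLACE INTO ioc_profile (ioc_name, epics_version) VALUES (?, ?)',
             undef, $profile{name}, $profile{epics_version});
}
closedir $dh;
$dbh->disconnect;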

  6. Collaboration • Participants: Argonne and SNS • Don Dohan (Argonne) • Coles Sibley, Jeff Patton, Greg Lawson (SNS) • The Crawler is part of IRMIS (Integrated Relational Model of Installed Systems) • The basic package is made at Argonne • Other labs can add modules to adapt the program to their realities – this is what Greg Lawson does for Control Systems

  7. Examples of files parsed: • startup.cmd • bootline • st.cmd • iocInfo/ • bspVersion • epicsVersion • vxVersion… • pvList • db, dbd, template files
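As one concrete example of this kind of parsing, the hedged sketch below pulls PV (record) names out of a loaded .db file. The file path is hypothetical, and real EPICS database files allow more syntax variation than this single regular expression covers.

use strict;
use warnings;

# Extract PV (record) names from one EPICS .db file.
sub pv_names_from_db {
    my ($db_file) = @_;
    my @pvs;
    open my $fh, '<', $db_file or return ();
    while (my $line = <$fh>) {
        # EPICS records look like:  record(ai, "Sys:Dev:Temp") { ... }
        push @pvs, $2 if $line =~ /^\s*record\s*\(\s*(\w+)\s*,\s*"([^"]+)"\s*\)/;
    }
    close $fh;
    return @pvs;
}

# Hypothetical file path, for illustration only.
print "$_\n" for pv_names_from_db('/epics/boot/ioc-example/example.db');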

  8. Why the Control Systems group cares about the Crawler: • The latest information (about each IOC, how many people use this or that version, software changes…) is easily accessible through reports and viewing programs • Some problems are already identified: • Mistakes/typos in PV names • It can tell when somebody is not following the naming standards • In case of disaster recovery: we can look at the last configuration and reconstruct the IOC from the DB • If a bug is found in a certain version of a driver: we can find all IOCs that require manual changes
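A small reporting sketch along these lines: once the Crawler has filled the database, finding every IOC that loads a suspect driver version is a single query. The ioc_module table, its columns, and the version string are assumptions, not the real IRMIS schema.

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:irmis', 'user', 'pass', { RaiseError => 1 });

# Hypothetical buggy driver/module version reported by a developer.
my $buggy = 'drvXy-1.2';

my $sth = $dbh->prepare(
    'SELECT ioc_name FROM ioc_module WHERE module_version = ?');
$sth->execute($buggy);

print "IOCs loading $buggy (candidates for manual changes):\n";
while (my ($ioc) = $sth->fetchrow_array) {
    print "  $ioc\n";
}
$dbh->disconnect;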

  9. In perspective • Rather than pulling info from the IOC, info can be pushed into the IOC from the database automatically • If global changes are required: all IOCs can be updated at once by making changes in one place in the DB and loading them automatically, instead of restoring everything manually • If a bug is found: changes to all IOCs that use the same driver can be made at the same time • If, in addition to being written to disk, all IOC info goes to the DB, it can be used for near-real-time logging • DB applications that talk directly to IOCs can be created
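Purely as an illustration of the "push" direction (which the slide presents as future work, not something that exists), the sketch below generates dbLoadRecords lines for one IOC from database rows instead of a hand-edited boot script. The ioc_db_load table, IOC name, and output file name are assumptions.

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:irmis', 'user', 'pass', { RaiseError => 1 });
my $ioc = 'ioc-example';    # hypothetical IOC name

# ioc_db_load is an assumed table: one row per database file the IOC should load.
my $sth = $dbh->prepare(
    'SELECT db_file, macros FROM ioc_db_load WHERE ioc_name = ? ORDER BY load_order');
$sth->execute($ioc);

open my $out, '>', "$ioc-generated.cmd" or die "cannot write: $!";
while (my ($db_file, $macros) = $sth->fetchrow_array) {
    $macros = '' unless defined $macros;
    print $out qq{dbLoadRecords("$db_file", "$macros")\n};
}
close $out;
$dbh->disconnect;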

  10. Current issues to solve for developers: • The Crawler is under development => • Every time Argonne makes changes to improve the program, SNS has to make adjustments (Jeff Patton has to adapt all the changes to Oracle) • Need to develop GUI viewing programs (like the Archiver and ROCS) • Currently, when IOC engineers change the file structure, or just move the files, the Crawler needs to be updated to be able to find the info

  11. For the IOC engineers: • Keep the standard organization of the directories, otherwise the Crawler does not know where to look for the files. Order is important! • Keep the standard structure of the files the program parses, otherwise the Crawler cannot find the data. Order is important! • If there are no concrete standards yet, perhaps it is time to develop them?
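One hedged way to make such a standard checkable: a short script that warns when an IOC boot directory is missing the items the Crawler expects. The boot root and the list of required items are stand-ins for whatever standard is actually adopted.

use strict;
use warnings;

my $boot_root = '/epics/boot';              # assumed boot server root
my @expected  = qw(st.cmd iocInfo);         # assumed per-IOC requirements

opendir my $dh, $boot_root or die "cannot open $boot_root: $!";
for my $ioc (grep { -d "$boot_root/$_" && !/^\./ } readdir $dh) {
    for my $item (@expected) {
        print "WARNING: $ioc is missing $item\n"
            unless -e "$boot_root/$ioc/$item";
    }
}
closedir $dh;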

  12. My sources: • IRMIS collaboration meeting • Relational Database Collaboration @ Argonne National Laboratory.ppt • Argonne National Laboratory: IRMIS PV Crawler.ppt • Coles Sibley: SNS Requirements.ppt • Jeff Patton: ORNL EPICS RDB Tools.ppt • Jeff Patton: Extending RDB Core.ppt • An overview of IRMIS • Interviews with J. Patton, G. Lawson, C. Sibley, E. Williams

  13. Crawler Animation: • A student project completed by Katia Danilova for a Flash class at UT, spring 2005 • Requirement: develop an animated web-based instructional/educational unit (3–5 minutes in length, with funny creatures designed) • This unit: • shows a little of how the IOC boots up • shows how the Crawler gathers IOC configuration info from the EPICS IOC boot server • shows how a client, for example a web-based report, gets updated info from the DB
