MBARI’s Shore Side Data System


Presentation Transcript


  1. MBARI’s Shore Side Data System: From Ships, ROVs, Moorings, AUVs, & ? To Bytes, Plots, Pictures, Samples, & Video

  2. What Are Our Goals?
     • Build data systems that can grow over time
     • Make adding instruments and data routine
     • Easily add new and unimagined components
     • Scale to meet growth needs of observing systems
     • Create an extensible IT umbrella
     • Encompass the real world of data sources
       • shipboard and shore-side systems
       • isolated buoys and networked observatories
       • one-off data files and high-speed (Gb net) streaming data
     • Embrace image, video and document archive formats
     • Still provide users with “do what I want”

  3. Cruise (Expedition) Interface

  4. Samples Interface

  5. Video Annotations Interface

  6. 3D Visualization Interface

  7. What Has MBARI Learned?
     • Metadata: It must accompany the data
       • Data without metadata is like a directory with no Readme
       • A system’s power relies on good knowledge of its data
     • Metadata: It must accompany the instrument
       • Every connector between the two increases error rates
       • Once data and metadata are detached, reattaching them is painful
     • Metadata: It must be flexible and yet structured
       • Flexible: you’ll need to define new kinds of data sources
       • Structured: consistency => automation => value
     • Distributed storage, great interfaces, transparency
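As an illustration of the “directory with no Readme” point above, here is a minimal sketch contrasting a bare record with one that carries its own description. The field names, units, and values are hypothetical, not an SSDS record format.

```python
# Illustration of "data without metadata is like a directory with no Readme":
# the same numbers, bare versus carried with their description.
bare_record = (1070409600, 13.9, 33.4)   # which is which? seconds? degC? PSU?

described_record = {
    "metadata": {
        "variables": ["time", "temperature", "salinity"],   # hypothetical names
        "units": ["s since 1970-01-01", "degC", "PSU"],
        "source": "mooring CTD (example only)",
    },
    "values": [1070409600, 13.9, 33.4],
}

# With the description attached, a generic tool can label, convert, and merge
# the values without human detective work.
meta = described_record["metadata"]
for name, unit, value in zip(meta["variables"], meta["units"],
                             described_record["values"]):
    print(f"{name} = {value} {unit}")
```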

  8. About MOOS: MBARI’s Ocean Observing System
     • A major observing platform development initiative
       • Multi-platform, cabled & uncabled, benthic to surface
     • “What Would It Take?” — develop and test ideas
       • Need a way to store metadata with instrument
       • Need a way to submit metadata to data system
     • Result: Answers that can work anywhere
       • Local ‘intelligent’ storage: PUCK concept
       • Consistent services: Instrument SW Infrastructure
       • We can iterate to good, tested solutions
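The PUCK concept can be pictured with a short sketch: the instrument carries its own metadata “datasheet,” which the host reads at plug-in time and registers with the shore side before any science data flows. The payload layout, field names, and registration step below are assumptions for illustration, not the actual PUCK protocol.

```python
# Sketch of the PUCK idea: metadata travels with the instrument and is
# forwarded to the data system before any data. Layout and names are
# hypothetical, not the real PUCK protocol.
import json
from dataclasses import dataclass

@dataclass
class InstrumentDatasheet:
    uuid: str            # unique instrument identifier
    manufacturer: str
    model: str
    metadata_xml: str    # description of the instrument's data records

def read_datasheet(raw_payload: bytes) -> InstrumentDatasheet:
    """Parse the metadata payload stored on the instrument (JSON here)."""
    return InstrumentDatasheet(**json.loads(raw_payload.decode("utf-8")))

def register_with_data_system(sheet: InstrumentDatasheet) -> None:
    """Submit the description shoreside before any data from the instrument."""
    print(f"Registering {sheet.model} ({sheet.uuid}) and its record description")

if __name__ == "__main__":
    payload = json.dumps({                       # stored on the instrument
        "uuid": "urn:example:instrument:0042",   # hypothetical identifier
        "manufacturer": "Sea-Bird",
        "model": "SBE 37",
        "metadata_xml": "<RecordDescription>...</RecordDescription>",
    }).encode("utf-8")
    register_with_data_system(read_datasheet(payload))
```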

  9. MOOS (Showing Data Flow). Diagram: devices on the deployed platform (ocean side) send data through a communications portal to the Shore Side Data System (shore side), where data tracking, archiving, and data presentation feed applications/interfaces and user tools.

  10. About SSDS: The Shore Side Data System
      • A MOOS Development Project
        • Goals: low cost, flexible, expandable, reliable
        • Future systems beyond MOOS (e.g., MARS)
        • Now in 3rd year, deploying initial elements
      • Key Tenets of SSDS Development
        • Iterative development—improve it as we go
        • Test with real data—new and archival
        • Build for change—use modular interfaces

  11. Shore Side Data System Requirements
      • Ingest data in any described format and save it
      • Capture, publish data descriptions (metadata)
      • Provide standards-based access to data
        • Raw data, and other common digital formats
        • APIs for common visualization and analysis tools
        • User-oriented web interfaces, quick-look plots
      • Merge data (different sources & time intervals)
      • Support data visualization & quality control
      • Provide data access security as needed
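One way to picture the “merge data (different sources & time intervals)” requirement: align a 10-minute mooring series with hourly meteorological data by the nearest earlier timestamp. The pandas call below only illustrates the operation, not how SSDS implements it; the variable names and values are invented.

```python
# Merging two time series sampled at different intervals: each CTD record
# picks up the most recent met observation at or before its own timestamp.
import pandas as pd

ctd = pd.DataFrame({
    "time": pd.date_range("2003-12-03 00:00", periods=6, freq="10min"),
    "temperature_C": [14.1, 14.0, 13.9, 13.9, 13.8, 13.8],
})
met = pd.DataFrame({
    "time": pd.date_range("2003-12-03 00:00", periods=2, freq="1h"),
    "wind_speed_ms": [5.2, 6.1],
})

merged = pd.merge_asof(ctd, met, on="time", direction="backward")
print(merged)
```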

  12. SSDS Elements. Diagram of the data flow: arriving data and metadata enter through Ingest and the Data Tracker; automated data flow carries them to Archiving, the Data Catalog, shared descriptions, and external data stores; on-demand interactions over internal interfaces and the Web I/F answer data requests with data presentation, data for analysis, and (re)processed and new data sets for applications.

  13. How Does It Work?
      • First, the developer describes what’s in an instrument’s data records (the metadata).
      • That description is stored with (or near) the instrument, and sent to SSDS before any data from the instrument. SSDS tracks this info.
      • Data records generated include the data type.
      • SSDS automatically routes data of each type to the correct ‘data bucket’.
      • SSDS automagically knows about the data, because they’ve been described. Now it can:
        • Plot, Print, Search, Merge
        • Format (on request), Describe (in files & headers)
        • Send to Applications, Point to by variable name
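A minimal sketch of this describe-then-route flow, using a hypothetical record layout and bucket names rather than SSDS’s actual schema:

```python
# Describe first, route automatically: a record description is registered
# before any data, then each arriving record carries its data-type ID and
# lands in the matching "data bucket". All names here are hypothetical.
from collections import defaultdict

# The developer's description of each record type, registered in advance.
descriptions = {
    "ctd-v1": {"variables": ["temperature", "salinity", "pressure"],
               "units": ["degC", "PSU", "dbar"]},
}

buckets: dict[str, list] = defaultdict(list)

def ingest(record: dict) -> None:
    """Route a record to its bucket using the data type it declares."""
    dtype = record["data_type"]
    if dtype not in descriptions:
        raise ValueError(f"Undescribed data type: {dtype}")
    buckets[dtype].append(record["values"])

ingest({"data_type": "ctd-v1", "values": [13.9, 33.4, 101.2]})

# Because the data are described, generic services can act on them,
# e.g. point to a value by variable name:
idx = descriptions["ctd-v1"]["variables"].index("salinity")
print("salinity =", buckets["ctd-v1"][0][idx])
```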

  14. Standard Interfaces and Tools. Diagram: the SSDS elements from the previous slide, annotated with the standards and tools in play, including FGDC and XML/DTDs for metadata, SQL for archiving, Z39.50 for the data catalog, DODS for external data stores and data presentation, and netCDF, Ferret, LAS, inGrid, ncBrowse, and HTTP browsers for data presentation and analysis.
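From the user’s side, the standards-based path means a generic client can open an SSDS-served dataset like any other DODS/OPeNDAP or netCDF source. The URL and variable name below are hypothetical, and netCDF4 is only one possible client.

```python
# Opening a served dataset with a standard client; the URL is hypothetical.
from netCDF4 import Dataset

url = "http://dods.example.org/ssds/moos-test-mooring/ctd.nc"  # hypothetical
with Dataset(url) as ds:               # accepts OPeNDAP URLs or local files
    print(ds.variables.keys())         # variables named in the shared description
    temp = ds.variables["temperature"] # assumed variable name
    print(temp.units, temp[:10])       # units come from the metadata
```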

  15. Strategies (Hard-Earned)
      • Low threshold for user entry (minimal XML)
      • Stay away from domain-specific solutions
        • Example: ‘deployment’ is a useful concept
      • Minimize internal structure & assumptions
        • Our biggest challenge: flexible architecture
      • Be agnostic about input data & file formats
      • Maximize access/presentation features
        • Provide many views into data (common ones first)
      • Take advantage of OO methods and reuse
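To suggest what “low threshold for user entry (minimal XML)” might look like, here is a short descriptor a data provider could supply, plus a check that it parses. The element and attribute names are invented for illustration; they are not the actual SSDS schema.

```python
# A minimal, hand-written descriptor for one instrument's record stream.
# Element and attribute names are illustrative only.
import xml.etree.ElementTree as ET

minimal_descriptor = """
<DataProducer name="Test Mooring CTD">
  <output>
    <DataContainer type="stream">
      <RecordDescription recordType="1" bufferStyle="ASCII">
        <RecordVariable name="temperature" units="degC" columnIndex="1"/>
        <RecordVariable name="salinity" units="PSU" columnIndex="2"/>
      </RecordDescription>
    </DataContainer>
  </output>
</DataProducer>
"""

root = ET.fromstring(minimal_descriptor)
for var in root.iter("RecordVariable"):
    print(var.get("name"), var.get("units"))
```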

  16. SSDS Multi-View Interface

  17. SSDS Multi-View Interface

  18. Data Integration Strategy
      • Data can be remote (managed by links)
      • Domain-specific tasks are done externally
        • Domain-specific calibration and QC
        • Data reprocessing and conversions
        • Non-automatable data sets (time series)
        • Custom views unique to a domain or medium
      • SSDS is the access point / service provider
        • Maintain focus on core services and interfaces
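A small sketch of “data can be remote (managed by links)”: the catalog keeps a reference to the external store and resolves it on request rather than copying the bytes. The catalog fields and URL are hypothetical.

```python
# Catalog entries that point at externally managed data.
catalog = {
    "auv-ctd-example": {
        "description": "AUV CTD section (example entry)",
        "location": "http://external.example.org/auv/ctd.nc",  # hypothetical
        "managed_by": "external",   # the link is served, not the bytes
    },
}

def resolve(dataset_id: str) -> str:
    """Return where a client should go for the actual data."""
    return catalog[dataset_id]["location"]

print(resolve("auv-ctd-example"))
```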

  19. Development Status
      • First deployed for the MOOS Test Mooring
        • 10/1: Training with development tools
        • 10/7: First actual code written
        • 11/8: First end-to-end test (in use ever since)
        • 11/19: Demonstrated Java GUI for data access
        • 12/3: Data is live from the deployed mooring
      • These are prototype solutions (first round)
      • Planning for AUV CTD data management

  20. Summary
      We designed and built a flexible, dynamic data system with an open architecture.
      • Metadata is critical to observatory and instrument operation.
      • Standard interfaces enforce modularity.
      • A layered metadata model with generic concepts provides multiple data access paths.
      • Iterative development processes support fast product deployment and improvements.

  21. Acknowledgements
      • Monterey Bay Aquarium Research Institute
      • David and Lucile Packard Foundation
      • The SSDS Team:
        • Kevin Gomes, John Graybeal, Mike McCann, Brian Schlining, Rich Schramm, Dan Wilkin
      • The ISI Team:
        • Led by Duane Edgington and Tom O’Reilly
      • All our committed and helpful users

  22. Contacts
      • Shore Side Data System:
        • John Graybeal, IAG Lead, 831-775-1956, graybeal@mbari.org
      • Video Annotation and Reference System (VARS):
        • Dan Wilkin, Lead Developer, 831-775-1865, wilkin@mbari.org
      • Samples Database Interface:
        • Susan Von Thun, Samples Coordinator, 831-775-2006, svonthun@mbari.org
      • Cruise (Expedition), SSDS, and ROV 3D Interfaces:
        • Mike McCann, Lead Developer, 831-775-1769, mccann@mbari.org
      See Poster at AGU
