
SURA IT Committee Meeting March 22, 2005






Presentation Transcript


  1. SCOOP Status. SURA IT Committee Meeting, March 22, 2005. Sara J. Graves, Ph.D., Director, Information Technology and Systems Center; University Professor, Computer Science Department, University of Alabama in Huntsville; Director, Information Technology Research Center, National Space Science and Technology Center. 256-824-6064, sgraves@itsc.uah.edu

  2. Whereas, the Southeastern Universities Research Association has proposed the creation of an open-access network of distributed sensors, linked via an ultra-fast network to state-of-the-art computing systems that track and model the southeastern coastal zone in real time and provide components of a more comprehensive coastal security infrastructure, known as the Southeastern Coastal Ocean Observing Program (SCOOP); now, therefore, be it Resolved, That the Southern Governors’ Association supports SURA’s Southeastern Coastal Ocean Observing Program to bring more effective protection of life and property to the increasingly developed coastal zone, to offer a vehicle for bringing the extensive and widely dispersed intellectual talent of the ocean sciences community to address problems of homeland security via an integrated and spatially distributed program, and to aid in addressing the ecological and environmental concerns endangering the health and safety of inhabitants and marine resources.

  3. SURA’s Southeastern Coastal Ocean Observing Program (SCOOP) will facilitate the assimilation of observational data into community models and provide a distributed data ingestion and support grid with broadband connectivity. This is expected to become a coastal counterpart to the Global Ocean Data Assimilation Experiment (GODAE), with emphasis on the southeast.

  4. Board of Trustees Meeting, Nov 2002
  • Data fusion is critical
  • Modeled and observed fields must have equal representation
  • Use GODAE (Global Ocean Data Assimilation Experiment) as a guide for CODAE (Coastal Ocean Data Assimilation Experiment)
  • SURA is a strong brand (we should use it)
  • Focused sub-regional efforts with specified deliverables that would be new and exciting
  • Broad SURA effort targeted on building a culture supporting region-wide collaboration in shared scientific goals

  5. Integrated Ocean Observing System (IOOS) serves national needs for:
  • Detecting and forecasting oceanic components of climate variability
  • Facilitating safe and efficient marine operations
  • Ensuring national security
  • Managing resources for sustainable use
  • Preserving and restoring healthy marine ecosystems
  • Mitigating natural hazards
  • Ensuring public health

  6. National Federation of Regional Systems
  • National Backbone
    • Satellite remote sensing
    • In situ sensing: reference & sentinel station network
    • Link to global ocean component
    • Data standards & exchange protocols
  • Regional Systems
    • Regional priorities: effects of climate change & effects of land-based sources
    • Increased resolution, additional variables

  7. Overarching Principles for Coastal Observing Programs
  • 1. A national coastal observing program will necessarily consist of regional and sub-regional components.
  • 2. National, regional and sub-regional observing systems must consist of three interconnected aspects: (i) spatially distributed sensor arrays; (ii) data management and dissemination hubs; and (iii) nowcasting and forecasting models that are fused with assimilated observational data.
  • 3. The creation and long-term viability of nested, integrated and sustained coastal observing systems will depend on a high level of interagency coordination.

  8. SCOOP Vision Statement. The SURA Coastal Ocean Observing and Prediction (SCOOP) program is an initiative to create an open-access, distributed national laboratory for scientific research and coastal operations. SCOOP is designed to complement the efforts of both Ocean.US (the organization responsible for implementing the national Integrated Ocean Observing System, IOOS) and the coastal component of NSF’s Ocean Research Interactive Observatory Networks (ORION) project. The SCOOP emphasis is on interoperability in order to create a real-time observation system for both monitoring and prediction. Through SURA universities, SCOOP will provide the expertise and IT infrastructure to integrate observing systems that currently exist and to incorporate emerging systems. This will promote the effective and rapid fusion of observed data with numerical models, and facilitate the rapid dissemination of information to operational, scientific, and public and private users.

  9. Overarching Goals for SCOOP
  • System of Systems
    • Ocean Observing = IOOS & OOI & ORION
    • Coastal ocean component of the Global Earth Observing System of Systems (GEOSS)
    • Components: (i) sensor arrays, (ii) data management & communication, (iii) predictive models
  • Distributed National Lab for Research & Applications
    • IT glue... bricks & mortar
    • Research to operations
    • Academic & federal agency & industry partnership
  • IT Enabling Big Science
    • Environmental prediction
    • Standards enable innovation
    • Interoperable community solving the really big problems

  10. Planned Capabilities
  • Validate accurate and timely short- and long-term predictions
  • Simultaneous measurements of winds, waves, currents, water density, nutrients, water quality, biological indices, and fish stocks under all conditions
  • Focus on storm surge, wind waves, and surface currents, with special attention to predicting and visualizing phenomena that cause damage and inundation of coastal regions during severe storms, hurricanes and possibly tsunamis
  • Bridge the gap between scientific research and coastal operations

  11. SCOOP Science Goals
  • Assess and predict the coastal response to extreme atmospheric events – focus on storm surge, flooding & waves
  • Modular modeling tools for regional issues (wave coupling, sediment suspension, etc.)
  • Standardized interfaces for data and (coupled) model interoperability
  • Ensemble prediction – forecasts based on many independent model runs

  12. SCOOP Research Goals
  • Measure, understand and predict environmental conditions
  • Provide R&D support for operational agencies including NOAA, the U.S. Navy, and others
  • Include outreach and education components that ensure the relevance of observing activities

  13. SCOOP Objectives
  • Develop and deploy standards and protocols for data management, exchange, translation and transport
  • Implementation of existing standards and protocols (e.g., FGDC, OGC, web services)
  • Application of Grid technologies
  • Deployment of the communications infrastructure to link ocean sensors operating in extreme environmental conditions to people who need timely information
  • Cultivation of industry partners

  14. Coordination is Key
  • Ocean.US – National Office for Integrated and Sustained Ocean Observations; coordinates development of an operational, integrated and sustained Ocean Observing System (created by NOPP). http://www.ocean.us/
  • Integrated Ocean Observing System (IOOS) – a national effort to create an Integrated Ocean Observing System. http://www.openioos.org/
  • National Oceanographic Partnership Program (NOPP) – 15 federal agencies providing leadership and coordination of national research and education programs. http://www.nopp.org/
  • National Federation of Regional Associations – provides a framework for orchestrating regional collaborations. http://www.usnfra.org/
  • NSF Ocean Research Interactive Observatory Networks (ORION) – an emerging network of science-driven ocean observing systems. http://www.orionprogram.org/default.html

  15. Interoperability is Key
  • Ocean.US Data Management and Communications (DMAC) Plan – provides the framework for interoperability. http://dmac.ocean.us/dacsc/imp_plan.jsp
  • Open Geographic Information Systems (GIS) Consortium (OGC) – an open consortium of industry, government, and academia developing interface specifications to support interoperability. http://www.opengis.org
  • Marine Metadata Interoperability – a community effort to make marine metadata easier to find, access and use. http://www.marinemetadata.org/

  16. Interoperability Demonstration – a collaboration of NOAA and ONR grant recipients. www.openioos.org

  17. SCOOP System Development
  • Funding provided by ONR, NOAA
  • 2004 list of SCOOP partners:
    • Consortium for Oceanographic Research and Education
    • Gulf of Maine Ocean Observing System (GoMOOS)
    • Louisiana State University, Center for Computation & Technology
    • Louisiana State University, Coastal Studies Institute
    • Southeast Atlantic Coastal Ocean Observing System (SEACOOS)
    • Southeast Coastal Ocean Observations Regional Association (SECOORA)
    • Texas A&M University & Gulf Coast Ocean Observing System (GCOOS)
    • University of Alabama in Huntsville
    • University of Delaware (Mid-Atlantic Regional Association, MACOORA)
    • University of Florida
    • University of Miami, Center for Southeastern Tropical Advanced Remote Sensing
    • University of North Carolina
    • Virginia Institute of Marine Science

  18. SCOOP Program Elements
  • Data Standards – metadata standards compliant with existing and emerging standards; standard data models to facilitate aggregation
  • Data Grid – OGC Web services for distributed maps and data (see the request sketch below); augmenting with new data, e.g., surface currents
  • Model Grid – storm surge & wave prediction; modular, standardized prediction system
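
The Data Grid element above relies on OGC Web services for distributed maps and data. Below is a minimal Python sketch of what such a request can look like, assuming a WMS 1.1.1 endpoint; the server URL and layer name are hypothetical placeholders, not actual SCOOP services.

```python
# Minimal sketch of an OGC WMS 1.1.1 GetMap request, the kind of web
# service the SCOOP Data Grid element builds on. The endpoint and layer
# name below are hypothetical placeholders, not real SCOOP services.
from urllib.parse import urlencode
from urllib.request import urlretrieve

WMS_BASE = "http://example.edu/scoop/wms"  # hypothetical WMS endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "surface_currents",          # hypothetical layer name
    "SRS": "EPSG:4326",
    "BBOX": "-98.0,18.0,-60.0,46.0",       # roughly the SE US coastal zone
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
    "TIME": "2005-03-22T00:00:00Z",        # WMS time dimension (slide 21)
}

url = f"{WMS_BASE}?{urlencode(params)}"
urlretrieve(url, "currents.png")  # fetch the rendered map image
```

Because the request is just a parameterized URL, any WMS-speaking client (web browser, GIS tool, script) can retrieve the same map, which is the interoperability point of the slide.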

  19. SCOOP Data Architecture (high level) [Diagram: regional data providers #1–#N feed a Regional Association Data Center (archive) and other Regional Association data centers over transport mediums still under discussion (LDM…???); NDBC connects via modem; SCOOP modeling partners are TBD…???; results are served to web browsers via HTTP & HTML and to GIS clients via OGC protocols.]

  20. SCOOP Prediction System, All Versions
  • Standard naming conventions – adopt existing community standards where appropriate (e.g., CF or NCEP) and add our own conventions only when necessary (see the netCDF sketch below).
  • Mechanisms for tracking metadata, e.g., provenance, forcing, source of OBCs (open boundary conditions), forcing used to create OBCs, etc.
  • Portals – entry point for access to models & model output; deals with authentication & authorization.
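
Slide 20 names the CF conventions as a candidate naming standard and calls for provenance-style metadata tracking. The sketch below shows, using the netCDF4 Python library, how such conventions might be recorded in a model output file; the file name and attribute values are illustrative assumptions, not SCOOP's actual conventions.

```python
# Sketch of CF-style naming plus provenance attributes in netCDF,
# using the netCDF4 library; file name and attribute values are
# illustrative, not SCOOP's actual conventions.
import numpy as np
from netCDF4 import Dataset

ds = Dataset("surge_forecast.nc", "w")
ds.Conventions = "CF-1.0"
# Provenance-style global attributes (slide 20: forcing, OBC source, ...)
ds.source = "hypothetical surge model run"
ds.forcing = "NCEP winds (illustrative)"
ds.obc_source = "large-scale model OBCs (illustrative)"

ds.createDimension("time", None)
t = ds.createVariable("time", "f8", ("time",))
t.units = "hours since 2005-03-22 00:00:00"
t.standard_name = "time"

eta = ds.createVariable("sea_surface_height", "f4", ("time",))
eta.standard_name = "sea_surface_height_above_geoid"  # a CF standard name
eta.units = "m"

t[:] = np.arange(6)
eta[:] = np.zeros(6)
ds.close()
```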

  21. SCOOP Prediction System, Version 1.0
  • Modular wind forcing
  • Modular embedded regional models
  • Coupled models – for existing groups
  • Using existing computational resources
  • Verification – real-time model-data comparisons (see the verification sketch below)
  • Model-GIS interface & OGC Web services
  • Web mapping with roads, etc.
  • Web mapping with time sequences (WMS)
  • Standardized time-series verification
  • Openioos.org for displaying results
  • Other…?
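
As a concrete illustration of the standardized time-series verification bullet, here is a minimal numpy sketch of the kind of real-time model-data comparison statistics a verification step might compute; the series values are invented.

```python
# A minimal sketch of standardized time-series verification: compare a
# modeled water-level series against station observations. The numbers
# are invented for illustration.
import numpy as np

observed = np.array([0.52, 0.61, 0.80, 1.05, 0.95, 0.70])  # water level (m)
modeled  = np.array([0.48, 0.66, 0.85, 1.20, 1.01, 0.65])

bias = np.mean(modeled - observed)                  # mean error
rmse = np.sqrt(np.mean((modeled - observed) ** 2))  # root-mean-square error
corr = np.corrcoef(modeled, observed)[0, 1]         # linear correlation

print(f"bias={bias:+.3f} m  rmse={rmse:.3f} m  r={corr:.3f}")
```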

  22. SCOOP Operational Prediction System, Version 1.0 [Diagram: large-scale response – forcing from NOAA/NCEP (ETA), operational wave predictions (BIO/GoMOOS), and operational tide/surge predictions (SABLAM) – feeds the regional response at Regional Model Centers #1 and #2, with NOAA/NCEP/UNC? (EDAS), enhanced winds (Miami), and coupled wave-surge predictions (Miami), plus an archive. Annotations: standardize transport/encapsulation (XML, FTP, LDM, OPeNDAP…?) and standardize model interfaces.]

  23. SCOOP Verification & Visualization [Diagram: prediction-system modeling centers #1 and #2 (regional or otherwise) and data-system regional data centers #1 and #2 feed regional web servers #1 and #2 and other information providers (OGC, RSS…?). Annotations: standardize model-GIS interfaces; standardize verification tools & data.]

  24. Prediction System Task Elements for Version 1.0 (Task – Lead Partner):
  • Data standards – TAMU
  • Data transport – UAH
  • Data translation & mgmt – UAH
  • Coupled modeling – Miami
  • Nested modeling – VIMS
  • Customized configuration – TAMU
  • Visualization services – LSU
  • Verification & validation – Miami
  • Computing & storage resources – LSU
  • Security – TAMU
  • Grid management middleware – LSU
  • Web mapping demonstration – GoMOOS

  25. SCOOP Data Architecture: High-Level Services [Diagram: regional data providers supply observation data, with data and metadata flowing to Regional Association Data Centers, which host metadata services, the SCOOP catalog, data translation services, an archive/repository broker, and data access services; these serve users, modeling partners, other data centers, models and applications, and the GeoSpatial OneStop / FGDC Clearinghouse user interface (metadata only).]

  26. SCOOP Data Architecture Specifics: Data Acquisition – example technologies to support dynamic transport and metadata cataloging [Diagram: observation data moves from regional data providers and modelers to the Regional Association Data Centers' archive/repository broker and data access services via LDM, while metadata is harvested (e.g., XML metadata harvest) into the SCOOP catalog's metadata services, which hold metadata only.]

  27. SCOOP Data Architecture Specifics: Data Discovery & Access – example technologies to support dynamic transport, analysis and visualization [Diagram: users, modeling partners, and other data centers discover data through the GeoSpatial OneStop / FGDC Clearinghouse and the SCOOP catalog's metadata query services (XML, Z39.50, SOAP), including the IOOS interoperability demo (OGC WMS); data access goes through the archive/repository broker, data translation services, and data access services via HTTP, FTP, OPeNDAP, OGC, and ESML.] An OPeNDAP access sketch follows below.
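
OPeNDAP is one of the access protocols listed above. As a sketch of what subsetting access looks like from the client side, the netCDF4 Python library can open an OPeNDAP URL directly, so a client pulls only the slice it needs rather than the whole file; the URL and variable name below are hypothetical.

```python
# Sketch of client-side OPeNDAP access: open a remote dataset by URL and
# read one slice. The URL and variable name are hypothetical examples.
from netCDF4 import Dataset

url = "http://example.edu/opendap/scoop/adcirc_surge.nc"  # hypothetical
ds = Dataset(url)                    # opens a connection, not a download
eta = ds.variables["zeta"][-1, :]    # e.g., last time step of water level
print(eta.shape)
ds.close()
```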

  28. SCOOP Information Architecture: example metadata exchange technologies to support data discovery [Diagram: local metadata at regional data providers is harvested (XML, SOAP) into the SCORE-based SCOOP catalog and SCOOP data dictionary at Regional Association data/service centers; ingest and query services expose the catalog to GeoSpatial OneStop (Z39.50, FGDC records, manual updates), to the IOOS interoperability demo (OGC WMS GetCapabilities, HTTP, data list & metadata), and to the SCOOP interactive search UI (SOAP), supporting both interactive and automated data discovery by users, modeling partners, models, and applications.]

  29. SCORE: Accomplishments & Plans
  • SCORE is the catalog and services infrastructure for SCOOP data management
  • Data & Model Survey provided an initial snapshot of partners' data (observations and model results)
  • Developed a database schema for SCORE (a guessed sketch follows below) to support:
    • Strawman SCOOP Catalog: requesting input on improved capabilities
    • IOOS demo: working with the GoMOOS team to integrate the catalog with the demo
    • FGDC Clearinghouse to support Geospatial One-Stop: plan to create FGDC metadata records from SCORE
  • Issues:
    • What data management functionality is needed within SCOOP?
    • Metadata services for data collections, data files/streams, general model information, information on specific model runs, …
    • How to coordinate metadata and data management across sites?
    • How to automate population of SCORE?
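
The slide mentions a database schema for SCORE without showing it, so the following is a guessed, minimal sketch of what one catalog table supporting data discovery might look like, using SQLite; the table and column names are assumptions, not the actual SCORE schema.

```python
# Guessed, minimal sketch of a SCORE-like catalog: one table of data
# collections with access pointers, queryable for discovery. Not the
# actual SCORE schema, which the slides do not show.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE collections (
        id          INTEGER PRIMARY KEY,
        provider    TEXT,      -- e.g., 'GoMOOS', 'TAMU'
        title       TEXT,
        parameter   TEXT,      -- e.g., 'water_level', 'winds'
        access_url  TEXT,      -- web, FTP, or OPeNDAP pointer
        protocol    TEXT       -- 'HTTP' | 'FTP' | 'OPeNDAP'
    )
""")
con.execute(
    "INSERT INTO collections VALUES (1, 'GoMOOS', 'Buoy water levels', "
    "'water_level', 'http://example.edu/opendap/gomoos.nc', 'OPeNDAP')"
)

# Discovery query: where can I get water-level data?
for row in con.execute(
        "SELECT provider, access_url FROM collections "
        "WHERE parameter = 'water_level'"):
    print(row)
```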

  30. Science Goals for Version 2.0
  • Environmental Prediction
    • Prediction systems fuse models & observations
    • Nonlinear dynamics limits predictability – Lorenz's seagull
    • Probability and statistics – ensemble modeling (see the sketch below)
  • Hurricane Surge & Waves
    • Biggest uncertainty is in the winds
    • Ensemble of winds: different models or different simulations
    • New paradigm & new metrics for skill assessment
  • Research to Operations
    • Improving upon SLOSH – a good idea 30 years ago
    • GIS compatibility enabling application & visualization
    • OpenIOOS.org is the high-visibility “front end”
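
To make the ensemble-modeling bullet concrete, here is a small numpy sketch that summarizes peak surge across ensemble members as a mean, a spread, and an exceedance probability, the kind of probabilistic product the slide argues for; all member values are invented.

```python
# Sketch of the ensemble idea on this slide: run many wind forcings,
# then summarize peak surge as a distribution rather than one number.
# Member values are invented.
import numpy as np

# Peak surge (m) at one location from N ensemble members
members = np.array([2.1, 2.4, 1.9, 2.8, 2.2, 3.1, 2.5, 2.0])

mean = members.mean()
spread = members.std(ddof=1)          # ensemble spread
p_gt_2p5 = np.mean(members > 2.5)     # P(peak surge > 2.5 m)

print(f"mean={mean:.2f} m  spread={spread:.2f} m  P(>2.5 m)={p_gt_2p5:.0%}")
```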

  31. Version 2.0 Workflow [Diagram: ensemble wind fields from varied and distributed sources (NCEP, MM5, NCAR, or a synthetic wind ensemble) force wave and/or surge models (ADCirc, ElCirc or WAM/SWAN) run as an ensemble across distributed resources; users select region and time range; data are transformed and transported; output is analyzed, stored, and catalogued in regional archives; results are disseminated for verification and visualization via OpenIOOS.]

  32. SCOOP: Data-to-Model (D2M) Real-time Transport and Translation (Nested Model) Scenario [Diagram: NCEP (NAM) wind forecasts and high-resolution wind forecasts (MM5; WRF in the future) flow by LDM-push (alternately FTP-pull) through D2M nodes at UAH (POCs: Matt Smith, Ken Keiser) offering translation services – subset, subsample, re-format, re-grid – via ESML; translated winds and fluxes drive regional oceanic/coastal models at UNC (ADCIRC; POCs: Rick Luettich, Brian Blanton), VIMS (ELCIRC; POC: Harry Wang), TAMU (POC: Gerry Creager), UF, and others (Model X); water levels return through D2M to localized models, users, and archives via LDM, OPeNDAP, and FTP push/pull.]
  • Atmospheric model products are “translated” through D2M to the form requested by the client model. Currently, using FTP-pull, all NAM grids 0-84 h for the 4 runs (00, 06, 12, 18 UTC) of AWIP12 and AWIP32 are sent to a D2M node and translated.
  • Via LDM, UNC, TAMU, & UF have access to the raw and translated model data.
  • Partners use translated observation/model data in their models, then push their results to a D2M node. Currently, ADCIRC output files (text and netCDF) are being pushed to a D2M node (for translation) and to other modeling partners via LDM.
  • Resulting translated data products are pushed to a client model's site and made available for other transport vehicles (FTP, OPeNDAP, OGC, etc.) for use in retrospective studies and other applications. Likewise, the output of other models can be processed through D2M for translation steps requested by other client models. (A re-gridding sketch follows below.)
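
One of the D2M translation services named above is re-gridding. The sketch below interpolates a coarse wind field onto a finer coastal-model grid with scipy; the grids and values are invented, and this only illustrates the operation, not the actual D2M code.

```python
# Illustrative re-grid step, one of the D2M "translation services":
# interpolate an atmospheric wind field onto a finer ocean-model grid.
# Grids and values are invented stand-ins.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Coarse atmospheric grid (NAM-like coverage), invented values
lat_a = np.linspace(18.0, 46.0, 29)
lon_a = np.linspace(-98.0, -60.0, 39)
u_wind = np.random.default_rng(0).normal(5.0, 2.0, (29, 39))  # m/s

interp = RegularGridInterpolator((lat_a, lon_a), u_wind)

# Finer coastal-model grid over a subdomain
lat_o = np.linspace(28.0, 32.0, 81)
lon_o = np.linspace(-92.0, -88.0, 81)
pts = np.array(np.meshgrid(lat_o, lon_o, indexing="ij")).reshape(2, -1).T
u_on_ocean_grid = interp(pts).reshape(81, 81)
print(u_on_ocean_grid.shape)  # (81, 81)
```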

  33. Data Management Goals
  Version 1:
  • Provided a high-level data catalog for SCOOP data discovery, with descriptions of partner data holdings and pointers to partner data access points (web, FTP, OPeNDAP, etc.)
  • Based the initial catalog on Data & Model Survey results
  • Coordinated with Data Transport (Task 2) to develop an initial LDM network to exchange data in near real time among SCOOP partners
  • Coordinated with Data Standards (Task 1) on development of metadata keywords for SCOOP
  Version 2:
  • Expand SCOOP data discovery capabilities based on evolving data management practices of SCOOP partners
  • Support the IOOS Demo
  • Field an FGDC Clearinghouse node for SCOOP (see the metadata sketch below)
  • Monitor Marine Metadata Interoperability activities and their potential interaction with SCOOP
  • Assist SCOOP partners in developing standard metadata to describe their data collections
  • Continue coordination with all partners on data management issues
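
The plan to create FGDC metadata records from SCORE suggests generating CSDGM-style XML from catalog fields. Below is a heavily abbreviated sketch using Python's ElementTree; it shows only a few identification elements and is not a complete, validating FGDC record, and the field values are illustrative.

```python
# Heavily abbreviated sketch of emitting FGDC (CSDGM)-style XML from
# catalog fields, as the plan to derive FGDC records from SCORE
# suggests. Not a complete, validating record; values are illustrative.
import xml.etree.ElementTree as ET

meta = ET.Element("metadata")
idinfo = ET.SubElement(meta, "idinfo")

citeinfo = ET.SubElement(ET.SubElement(idinfo, "citation"), "citeinfo")
ET.SubElement(citeinfo, "origin").text = "Example SCOOP partner"
ET.SubElement(citeinfo, "title").text = "Buoy water levels (illustrative)"

descript = ET.SubElement(idinfo, "descript")
ET.SubElement(descript, "abstract").text = (
    "Near-real-time coastal water levels; illustrative entry only."
)

print(ET.tostring(meta, encoding="unicode"))
```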

  34. Distributed Data Integration – merged data product for on-demand visualization [Figure: ITSC knowledge base merges layers for countries, coastlines, cyclone events, MCS events, and AMSU-A Channel 01. Caption: AMSU-A data overlaid with MCS and cyclone events, merged with world boundaries from GLOBE.]

  35. Heterogeneity Leads to Data Usability Problems
  • Science data characteristics:
    • Many different formats, types and structures (18 and counting for atmospheric science alone!)
    • Different states of processing (raw, calibrated, derived, modeled or interpreted)
    • Enormous volumes

  36. Interoperability: Accessing Heterogeneous Data [Diagram: "The Problem" shows applications needing a separate reader or format converter for each of data formats 1-3; "The Solution" shows one ESML library reading all three formats via per-file ESML description files.]
  • One approach: enforce a standard data format, but…
    • Difficult to implement and enforce
    • Can't anticipate all needs
    • Some data can't be modeled or is lost in translation
    • Converting legacy data is costly
  • A better approach: Interchange Technologies – Earth Science Markup Language

  37. What is ESML?
  • It is a specialized markup language for Earth Science metadata based on XML – NOT another data format.
  • It is a machine-readable and -interpretable representation of the structure, semantics and content of any data file, regardless of data format.
  • ESML description files contain external metadata that can be generated by either the data producer or the data consumer (at the collection, data set, and/or granule level).
  • ESML provides the benefits of a standard, self-describing data format (like HDF, HDF-EOS, netCDF, GeoTIFF, …) without the cost of data conversion.
  • ESML is the basis for core Interchange Technology that allows data/application interoperability.
  • ESML complements and extends data catalogs such as FGDC and GCMD by providing the use/access information those directories lack.
  http://esml.itsc.uah.edu
  A toy description-file sketch follows below.
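
To illustrate the ESML idea of external, machine-readable structure descriptions, here is a toy sketch: an XML description tells a generic reader how to decode a flat binary file, so the reader itself needs no format-specific code. The element names are invented stand-ins, not the real ESML schema.

```python
# The ESML idea in miniature: structure lives in an external XML
# description, so one generic reader can decode any file it describes.
# The element names here are invented stand-ins, not the real ESML schema.
import numpy as np
import xml.etree.ElementTree as ET

description = """
<dataset file="skin_temp.bin">
  <field name="skin_temperature" dtype="float32" shape="180 360"/>
</dataset>
"""

root = ET.fromstring(description)
field = root.find("field")
shape = tuple(int(n) for n in field.get("shape").split())

# Write a fake binary file so the sketch is self-contained
np.zeros(shape, dtype=field.get("dtype")).tofile(root.get("file"))

# Generic reader: no format knowledge beyond the description file
data = np.fromfile(root.get("file"), dtype=field.get("dtype")).reshape(shape)
print(field.get("name"), data.shape)
```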

  38. ESML in Action: ingest surface skin temperature data into numerical weather models (MM5, ETA, RAMS)
  • Purpose: use ESML to incorporate observational data into the numerical models for simulation
  • Skin temperatures come in a variety of data formats:
    • GOES – McIDAS
    • Reanalysis data – GRIB
    • MM5 model – binary
    • AVHRR – HDF
    • MODIS – EOS-HDF
  [Diagram: ESML description files for GOES, reanalysis GRIB, and MM5 binary files feed the ESML library, which supplies the models.]
  • Scientists can:
    • Select remote files across the network
    • Select different observational data to increase model prediction accuracy
  http://vortex.nsstc.uah.edu/~sud/web/default.htm
