
Comprehensive Large Array-data Stewardship System Status Presented to DAARWG

Comprehensive Large Array-data Stewardship System Status Presented to DAARWG. Kern Witcher CLASS Program Manager November 8, 2012. Agenda. CLASS Summary Program Overview Data Center Migration CLASS Capabilities CLASS System Evolution Other Initiatives Challenges CLASS and DAARWG.





Presentation Transcript


  1. Comprehensive Large Array-data Stewardship System Status Presented to DAARWG Kern Witcher, CLASS Program Manager, November 8, 2012

  2. Agenda CLASS Summary Program Overview Data Center Migration CLASS Capabilities CLASS System Evolution Other Initiatives Challenges CLASS and DAARWG

  3. CLASS Summary • CLASS meets its original intended purpose (large-volume data) • Suomi-NPP • Volume ingested – 1.32 PB • Volume delivered – 3.53 PB • Files delivered – over 34M • 22 major datasets included • AVHRR, GOES, IASI, CFS-R • 2.27 PB safely stored at 2 sites • CLASS must evolve to become a sustainable NOAA enterprise • NOAA’s desire for CLASS to be the Enterprise Archive • Significant challenges must be addressed

  4. Program Overview

  5. Program Transition • Program/contract management was transitioned from OSD to NCDC in May of 2011 • New WBS and program management staff established • Increased communications between CLASS development and Data Centers • CY3* extension in Fall of 2011 to end of FY12 • CY4 extension through Q1 of FY13 • CY5 to begin Q2 of FY13. CY5 delayed to allow for • CLASS response to emerging Data Center consolidation • data migration activities • beginning the “pivot” or evolution of CLASS to the Enterprise Archive System • Effort is dependent on final funding allocations for FY13 • Contract may reach ceiling as early as FY15 • Acquisition planning needs to begin by January 2013 * CY – Contract Year

  6. Organization [Org chart spanning NCDC, NGDC, NODC, the CLASS Operations Planning Board, COWG, the Support Services Division, the Remote Sensing Applications Division, the NESDIS CIO, and other NESDIS offices.] Key roles: • CLASS Program Manager – 100% (Kern Witcher) • Budget Execution – 5% (T. Leary) • COR/Project Engineer + CWIP – 100% (Jim Goudouros) • Government Task Monitor – 30% (N. Ritchey) • Alt COR – 5% (J. Niemiec) • IT Specialist/Systems Engineer – 100% (Jay Morris) • Budget Analyst – 5% (L. Cholid) • Operations Manager – 100% (D. Carter) • OMB Exhibit 300 – 30% (T. Cohen) • Inventory Specialist, NCDC – 10% (A. Annis, acting) • Supervisory ISSO – 100% (Scott Koger) Total of 5 dedicated personnel

  7. Current Program Milestones [Gantt chart, FY09–FY17 by quarter: milestone timelines for the GOES-R, NPP, Climate Model Data, NEXRAD, NDE, and Jason-3 campaigns, Data Center Migration, and CLASS releases (R3–R6.0, NEAAT 1–4), spanning development, construction, integration/test, and operations.] Notes: CDR – Critical Design Review; FOR – Flight Operations Review; PDR – Preliminary Design Review; SDR – Systems Definition Review; SRR – System Requirements Review; ORR – Operational Readiness Review; NCT – NPP Connectivity Test; ICD – Interface Control Document. Currently focused on NESDIS and NWS data sets.

  8. Ongoing Challenges • Architectural Boundary Responsibilities • Common vs. Unique Data Ingest • Data Center Migration • Data Center Consolidation • Governance • Budget • Transition to New Architecture

  9. CLASS Definition From CLASS Level 1 Requirements (preliminary), signed 2008 Purpose: Support long-term, secure storage of NOAA-approved data, information, and metadata and enable access to these holdings through both human and machine-to-machine interfaces. Capabilities will be provided in 3 primary functional areas (Open Archival Information System Reference Model, OAIS-RM, components): • Ingest – provide mechanisms by which data, information, and metadata are transferred to and organized within the storage system. Issue: • Feasibility – cannot afford/maintain unique solutions

  10. CLASS Definition (con’t) • Archival Storage – provide common enterprise means for data, information, and metadata to be stored by the system and the capability to refresh, migrate, transform, update, and otherwise manage these holdings as part of the preservation process. • Access – provide common enterprise access capability enabling users to identify, find, and retrieve the data and information of particular interest to the user. Issue: • Open to interpretation – IT systems versus Data Center stewardship responsibilities • Optimal data access varies by observation (vertical profiles vs. spatial patterns, gridded vs. swaths, time series vs. synoptic)

  11. CLASS Goals Goals: As an enterprise solution, CLASS will reduce anticipated cost growth associated with storing environmental datasets by: • Providing common services for acquisition, security, and project management for the IT system supporting NOAA Archives • Consolidating stove-piped, legacy archival storage* systems • Relieving data owners of archival storage-related system development and operations issues *Archival storage provides the services and functions for the storage, maintenance, and retrieval of archival information packages. Archival storage functions include receiving archival information packages from ingest and adding them to permanent storage, managing the storage hierarchy, refreshing the media on which archive holdings are stored, performing routine and special error checking, providing disaster recovery capabilities, and providing archival information packages to access to fulfill orders.

  12. High Level Requirements for CLASS From CLASS Level 1 Requirements (Preliminary, 2008) 5.1 Core Mission Requirements 5.1.1 CLASS shall provide defined and documented human and machine-to-machine interfaces by which archives may securely store, maintain, and provide access to their data, information, and metadata holdings for indefinite periods. 5.1.2 CLASS shall ingest, provide long-term, secure storage for, and provide access to baseline information holdings. 5.1.3 CLASS shall provide long-term, secure storage of and common access to information pertaining to processing of CLASS-maintained information holdings, including documentation, processing algorithms, and procedures. 5.1.4 CLASS shall comply with applicable National Archives and Records Administration (NARA) regulations. 5.1.5 CLASS shall initiate pilot programs with the GEO-IDE project to support risk-reducing development and phased integration of standards for metadata, machine-to-machine interfaces, and archives. Proposed NOAA Action: Finalize CLASS Level 1 Requirements

  13. Current CLASS Governance The CLASS Program has numerous oversight and approval authorities. From the CLASS Level 1 Requirements: • CLASS is designated a "major" NOAA system in accordance with criteria in NOAA's Administrative Order (NAO) 216-108. • The NOAA Observing Systems Council (NOSC) is the designated oversight council and will review the CLASS project at each Key Decision Point (KDP). • The project will be reviewed as a major project by the Program Management Council (PMC), unless delegated. • As an IT project, the CLASS project will be reviewed by the NOAA Information Technology Review Board (NITRB) and by the Commerce Information Technology Review Board (CITRB). • Changes to project baseline, scope, and direction shall be approved by the Deputy Under Secretary for Oceans and Atmosphere. Proposed NOAA Action: Provide clear lines of authority and a well-defined governance structure

  14. Data Center Migration

  15. Goal, Objective, and Outcomes From NOAA Data Centers Data Migration Plan v1.2, signed May 2011 Data Center Migration Status • Goal: Prior to the end of FY 2015, the CLASS Operational System will be the primary safe storage/access capability for all environmental data holdings under the auspices of NOAA’s Data Centers. • Objective: The objective of this plan is migration of all historical environmental data holdings on current Data Center storage systems to CLASS, in addition to the ingest into CLASS [of] near real-time environmental data streams currently received by the Data Centers. This objective involves archival storage as well as elements of ingest, data access, data management, and preservation planning.

  16. Starting Point Data Center Migration Status (con’t) • Incoming data streams already migrated to CLASS include • POES, GOES, Jason-2, MetOp • Incoming data streams solely in CLASS • Suomi-NPP, GCOM-W (future GOES-R, future JPSS, future Jason-3)

  17. Data Center Status Data Center Migration Status (con’t) • Each Data Center is developing its own individual plans, including schedule, metrics, and milestones • CLASS PM included several pilot activities in CLASS’ Contract Year 4 (CY4) • Cloud access pilot (NODC, NCDC) – ongoing • NEXRAD pilot (NCDC) – completed • NCDC in situ migration prototype activity – completed • Next steps to be included in CLASS’ CY5 activities

  18. Data Center Migration Status (con’t) CLASS accomplishments in FY12 for Data Center Migration • Aerospace completed Phase I (“as is” analysis) of the Data Center ConOps • Completed Data Center Migration Requirements Documents • Completed Data Center Migration Interface Control Documents • Migrated NCDC historical data set • Completed NEXRAD Pilot 2-week migration test • Initiated Cloud Pilot Project

  19. Data Center Migration Status (con’t) • Uncertainty regarding FY15 timeline • Reductions in CLASS funding available for migration activity could delay implementation • Increasing comms costs, operational costs • FY13 President’s Budget has increased ORF to solve this • Data Center consolidation activities • Individual Data Center plans not yet finalized • Several possible technological solutions still being evaluated with Data Centers and CLASS

  20. Summary Data Center Migration Status (con’t) • The Data Centers have an overall plan in place for the migration of their historical environmental data to CLASS by FY15. • Many historical data sets already migrated – 60% by volume – but many smaller sets still need to be completed. • Several large data streams already migrated, but many incoming data streams still need to be transitioned to CLASS storage. • Each Data Center is working on its individual plans and coordinating with the CLASS PM. • The CLASS PM will address next steps in FY13/CY5, contingent upon available funds for migration activities.

  21. Current Capabilities

  22. Current CLASS Assets Direct connectivity to: • ESPC – NOAA Environmental Satellite Processing Center • National Ice Center • NOAA CoastWatch • JPSS Interface Data Processing Segment (IDPS) Sites: CLASS NCDC (Asheville, NC), CLASS NGDC (Boulder, CO), CLASS NSOF (Suitland, MD), and the development team in Fairmont, WV (development environment and satellite landing zone), with replication via the NOAA Science Network (N-Wave). Functions: operations, ingest, storage (disk and tape), public access, and a test and integration environment. Key capabilities: • Tape library capacity – two 10,000-tape robotic libraries with a total storage capacity of 15.5 PB (LTO4 tapes, native) • Spinning disk capacity – 2.1 PB (NSOF, NCDC, NGDC) • 10 Gb/s internal network backbone • Red Hat Linux OS server count – 48 (576 processors) • 10 Gb/s WAN (N-Wave)

  23. CLASS Interfaces to NPP and JPSS-1 [Diagram: the primary JPSS IDPS at NSOF (current Block 1.2 for NPP; future Block 1.5/2.0 for NPP, J-1, and J-2), with a backup IDPS at the CBU in Fairmont for COOP events, delivers data through operational and I&T receipt nodes over the 10 Gb/s N-Wave CLASS backbone WAN to the operational full-service nodes at CLASS NGDC (Boulder) and CLASS NCDC (Asheville); GRAVITE, STAR, SDS, and C3S also interface with the IDPS.] Legend: RN = Receipt Node (data delivery to archive); FSN = Full Service Node (ingest, archive, and user access).

  24. CLASS Interface to GOES-R GOES-R (in development)

  25. Data Access from CLASS [Diagram of the present CLASS, following the OAIS-RM: a Producer submits data to Ingest; Archival Storage, Data Management, Administration, and Preservation Planning maintain the holdings; a Consumer sends requests through Access – search and find information, submit ad hoc orders, standing orders, and service requests – and receives results.]

  26. NPP Data Dissemination [Diagram: data downlinked via Svalbard, Norway reaches the IDPS; the IDPS delivers to CLASS with a 6-hour delay (~4.7 TB/day), with replication between CLASS nodes in ~20 minutes over the 10 Gb/s N-Wave.] The 6-hour delay imposed by CLASS minimizes retransmission requests and ensures control of limited-distribution files. CLASS provides near-instantaneous subscription delivery (~3.3 TB/day each) to SDS, GRAVITE, and STAR, as well as public subscriptions (near-instantaneous delivery). Public ad hoc block orders (up to 18 TB/day): 50% complete within 0 to 6 hours, 27% within 6 to 12 hours, 22% within 12 to 72 hours, 1% over 72 hours. Public ad hoc bulk orders: up to 7 days. Note: near-instantaneous subscription delivery is limited by network bandwidth.

  27. Data Currently Archived in CLASS [Table of datasets representing 95% of the archival holdings by volume (single node).]

  28. Customer Groupings that Access CLASS [Chart: CLASS users by domain, TB delivered, 2012.]

  29. Current Constraints • Access limited through the CLASS Web site • New data sets to archive require significant software development efforts • Implementation of the Trusted Internet Connection (TIC) will affect throughput and system performance • Access to data holdings has limited on-line disk capability • Manual vs. automated operations (e.g., load balancing) CLASS is addressing these constraints through several initiatives and standard software releases that will be discussed as part of the enterprise discussion

  30. System Evolution Ongoing Initiatives

  31. CLASS Projected Archive Volumes

  32. NOAA Enterprise Archive System (CLASS) Cumulative Total Volume by Data Type

  33. CLASS Projected Ingest and Delivery

  34. CLASS Evolution to an Enterprise Archival System Evolve the existing CLASS hardware and software infrastructure into a distributed, modular, service-oriented architecture. Allow greater flexibility in supporting, to the maximum extent possible, not only large data arrays from satellite programs but also additional archival storage services for all of NOAA’s environmental data that has been approved for archive. Working with the NOAA National Data Centers, the new Enterprise Archival Storage architecture will consist of: • Generic ingest services for flexible data acquisition • Flexible access services using existing community-standard, open-source, and emerging technologies such as cloud services • A standardized metadata repository to support a variety of search and discovery services • Long-term, secure archival storage and data management capabilities (continued)

  35. CLASS Evolution to an Enterprise Archival System (con’t) [Diagram: present CLASS alongside the future CLASS (in green). The NOAA Enterprise Archival Storage System provides Ingest, Archival Storage, Access, Data Management, and Administration and Preservation behind ingest, access, dissemination, and administration interfaces. Future components include pre-ingest services, access services, dissemination services, satellite, model, and in situ data gateways, commercial and private cloud services, Climate.gov, ordering and discovery, direct delivery, and web accessible folders, connecting Producer to Consumer.] CLASS provides components of the OAIS-RM as services and interfaces. Additional systems and services are implemented by the data centers.

  36. Ongoing Initiatives Leading toward the Enterprise Archival System CLASS pivot points: Pre-Ingest Services (Receipt Node/Gateway), Access Services, Dissemination Services, Common Access Interface (M2M), Common Storage Service (Cloud Pilot Project), Core System Upgrades, Data Managers’ Toolkit, Rules-Based Middleware, and Common Interface Definition. • Receipt Node/Gateway • Performs data integrity checks • Provides temporary data storage • Provides translation of data if necessary • Provides a static XML schema for passing the elements to CLASS • Assigns a UUID for preservation in CLASS • Passes metadata information to search and discovery services
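The Receipt Node/Gateway steps above (integrity check, UUID assignment, metadata hand-off) can be pictured as one minimal pre-ingest routine. This is an illustrative sketch only; the function and field names are hypothetical, not the actual CLASS gateway interface:

```python
import hashlib
import uuid


def preingest(payload: bytes, source: str) -> dict:
    """Hypothetical pre-ingest step: compute a checksum for the data
    integrity check, assign a UUID for preservation, and build the
    minimal metadata record handed to search and discovery services."""
    return {
        "uuid": str(uuid.uuid4()),          # identifier preserved in the archive
        "source": source,                   # originating gateway (satellite, model, in situ)
        "sha256": hashlib.sha256(payload).hexdigest(),  # integrity check on receipt
        "size_bytes": len(payload),
    }


record = preingest(b"granule-bytes", source="satellite")
```

In a real gateway the record would also be serialized against the static XML schema before being passed to CLASS.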

  37. Ongoing Initiatives Leading toward the Enterprise Archival System (con’t) • M2M (Common Access API) • Search, order, and holdings information • RESTful, asynchronous interaction with external systems • Returns all data submitted via the Common Ingest Interface • Metadata publishing to data center catalogs
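The RESTful, asynchronous interaction described above follows a submit-then-poll pattern: a client submits an order, receives an identifier immediately, and polls until fulfillment completes. The in-memory model below is purely illustrative; the function and field names are hypothetical, not the real CLASS M2M API:

```python
# Hypothetical asynchronous order flow: submit returns at once with an
# order id; the caller polls until the order is marked complete.
def submit_order(orders: dict, dataset: str, start: str, end: str) -> str:
    order_id = f"order-{len(orders) + 1}"
    orders[order_id] = {"dataset": dataset, "range": (start, end), "status": "pending"}
    return order_id


def poll_order(orders: dict, order_id: str) -> str:
    return orders[order_id]["status"]


orders = {}
oid = submit_order(orders, "VIIRS-SDR", "2012-11-01", "2012-11-02")
status = poll_order(orders, oid)   # "pending" until fulfillment completes
```

The asynchronous shape matters because large block and bulk orders can take hours to days to fulfill, so a synchronous request/response interface would not work.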

  38. Ongoing Initiatives Leading toward the Enterprise Archival System (con’t) • Common Storage Service • Access area for newly arrived and often-used data sets • Single interface for all access systems • Push-once, read-many model • Utilizes cloud services and cloud technologies in a hybrid model • Extends the CLASS cache out to the Enterprise
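The push-once, read-many model above can be sketched as a simple cache: a granule is written to the common access area once, and every access system reads the same copy. The class below is a hypothetical in-memory stand-in for what would really be cloud-backed storage:

```python
class StorageCache:
    """Push-once / read-many: repeated pushes of the same key are
    no-ops, while any number of access systems can read the one copy."""

    def __init__(self):
        self._store = {}
        self.pushes = 0
        self.reads = 0

    def push(self, key, data):
        if key not in self._store:   # push once
            self._store[key] = data
            self.pushes += 1

    def read(self, key):
        self.reads += 1
        return self._store.get(key)


cache = StorageCache()
cache.push("granule-1", b"data")
cache.push("granule-1", b"data")   # duplicate push is ignored
```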

  39. Ongoing Initiatives Leading toward the Enterprise Archival System (con’t) • Hardware/OS refresh • All CLASS software migrated to Linux on new hardware • Significant improvements in performance • HPSS migration will provide more reliability and flexibility in configuration options

  40. Ongoing Initiatives Leading toward the Enterprise Archival System (con’t) • Common Administration Interface • A single interface for data managers supporting: • Stewardship tools • Metadata updating and versioning • Data holdings monitoring and statistics

  41. Ongoing Initiatives Leading toward the Enterprise Archival System (con’t) • Rules-based middleware • Data transport • Moves data objects between systems • Data information sharing • Synchronizes metadata between systems • Holds data location information • Orchestration • Implements the Enterprise process flow and routes products through the Enterprise • Controlled by data managers and stewards
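The orchestration idea above — data managers declare rules, and the middleware evaluates them to route products through the Enterprise — can be sketched as a small rule table. The predicates and destination names are illustrative assumptions, not actual CLASS configuration:

```python
# Hypothetical rule table: each entry pairs a predicate over product
# metadata with a destination; the first matching rule wins.
RULES = [
    (lambda m: m.get("type") == "satellite", "archive+cache"),
    (lambda m: m.get("restricted", False), "archive-only"),
    (lambda m: True, "archive"),          # default rule
]


def route(metadata: dict) -> str:
    """Return the destination chosen by the first matching rule."""
    for predicate, destination in RULES:
        if predicate(metadata):
            return destination
    return "archive"
```

Because the rules are data rather than code, data managers and stewards can change routing behavior without redeploying the middleware itself.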

  42. Ongoing Initiatives Leading toward the Enterprise Archival System (con’t) • Common Interface Definition • Supports interaction between the components in the OAIS-RM • Granules • Collections • Browse images • Static and well documented • Common set of elements • Common schema • Unique identifiers
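The "common set of elements" idea above can be pictured as one static record type shared by every component interaction, whether the object is a granule, a collection, or a browse image. The field names and schema version below are hypothetical illustrations, not the actual CLASS interface definition:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class InterfaceElement:
    """One common, static schema for objects exchanged between
    OAIS-RM components: granules, collections, and browse images."""
    uid: str                       # unique identifier
    kind: str                      # "granule", "collection", or "browse-image"
    title: str
    schema_version: str = "1.0"    # static, well-documented schema


granule = InterfaceElement(uid="uid-0001", kind="granule", title="Example granule")
collection = InterfaceElement(uid="uid-0002", kind="collection", title="Example collection")
```

A single frozen schema with unique identifiers is what lets independent systems exchange these objects without per-pair translation code.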

  43. Evolution Roadmap (preliminary) [Roadmap chart, FY2012–FY2016, in three phases: NODC – metadata, cloud access, access path, dissemination; NCDC – stewardship pilot, staging, Data Net; NGDC – iRODS, HPSS, M2M; Data Center Migration (NCDC, NGDC, NODC) – archive storage, archive path. Concurrent CLASS initiatives: NPP, Service MOB, GCOM-W, JPSS, GOES-R. On-hold programs: Jason.]

  44. CLASS Evolution to an Enterprise Archival System CLASS will evolve to: Implement well-defined interfaces based on industry-standard protocols and best practices • Reduce the need for costly custom software development and allow the flexibility to use a multitude of different COTS discovery tools, such as rules-based middleware for data sharing, search, and discovery. Be scalable to support growth in volume, and extensible • Add functionality and services as new technologies mature and enterprise needs surface. Provide a significant and rapid return on investment for NOAA and the Nation • Enable the efficient and inexpensive archive of many NOAA products that are currently awaiting archival services • Enable the Data Centers to provide new services and products to their customers. Allow the efficient leveraging of other elements of the NOAA Enterprise • Ground systems, processing centers, and distribution capabilities through generic services and interfaces

  45. Other Initiatives that Intersect with CLASS

  46. Federal Data Center Consolidation Notional Fairmont consolidation plan: the NOAA Enterprise Archival System at Fairmont, WV would host operations, ingest, storage (disk and tape), public access, the test and integration environment, the development environment, the satellite receipt node/gateway, and the development team, with direct connectivity to the Environmental Satellite Processing Center (ESPC) Product Distribution and Access (PDA), the National Ice Center, NOAA CoastWatch, the JPSS Interface Data Processing Segment (IDPS), and GOES-R Product Distribution (PD). NSOF, Suitland, MD would provide direct connectivity to the ESPC PDA backup, the JPSS IDPS backup, and the GOES-R PD backup. Users, the Data Centers, and NOAA gateways connect via the NOAA Science Network (N-Wave), with notional commercial cloud services and an off-site backup facility. Fairmont consolidation approach: Phase I – migrate development, test, and integration environments; Phase II – establish backup ingest capabilities for JPSS, GOES-R, PDA; Phase III – migrate NGDC assets to Fairmont; Phase IV – consolidate NCDC assets into Fairmont.

  47. NESDIS Data Center Consolidation Implementation team established. Begin in 2015? Budget? Complete by 2023? Three Data Centers into one? Final organizational structure? The plan is not yet approved and is in preliminary planning phases, but it is a popular idea within NOAA.

  48. NESDIS Enterprise Ground System NOAA/NESDIS is assembling an Enterprise Ground System (EGS) to meet its mission; it is in the planning phases • Technical Reference Model • PDA Feasibility Study • Common Storage Various approaches are under consideration, including archive and dissemination. Driving question: Does CLASS become the Enterprise Archive part of the EGS?

  49. CLASS Programmatic Challenges Support for the revision and rebaseline of Level 1 Requirements • Analysis of Alternatives (Exhibit 300) • Update full lifecycle cost estimate (GAO Report) Clear program governance structure Adequate funding for operations Rebalance of requirements/cost/schedule for GOES-R and JPSS • Budget profile currently does not align with requirements or delivery schedule • Funding demarcation • Where does the Program’s funding responsibility end? • Post-launch support • Operations • Access

  50. CLASS Technical Challenges Metadata – need to define common elements and formats (ISO, ECHO, etc.) for ingest and access Interoperability – Data Information Exchange Language (can include metadata and data formats) for ingest and access Catalog – search and discovery services outside of the CLASS domain (Catalog Services for the Web); data harvester Storage – external storage for dissemination and access outside the CLASS domain ConOps – CLASS has a different role for satellite data and in situ data
