Summary of MetOp Data Flow and Latency

Presentation Transcript


  1. Summary of MetOp Data Flow and Latency
     November 13, 2008
     Selina M. Nauman

  2. COPC Action Item 2007-2.13
     • NESDIS will evaluate the costs needed to make non-NOAA satellite data streams (e.g., Jason-2, MetOp) available to the OPCs and/or the user community in a more timely manner.
     • CSAB, 22 October 2008: Recommend closing with Selina Nauman's briefing to COPC. The briefing will include a comparison of NOAA's satellite latency with non-NOAA satellite latency.

  3. COPC Action Item 2007-2.13
     • NOAA analysis confirms that the link from Svalbard to the NOAA Gateway in Darmstadt drives the data latency
       • The issue arises before NOAA receives the data
     • Two aspects of the issue need to be considered:
       • Latency requirement: current MetOp and future Jason-2 data transfer within established latency requirements
       • Concept of operations between NOAA and foreign partners: changing data distribution procedures to reduce latency requires modifying existing agreements and operational scenarios

  4. IJPS System Overview
     [Diagram: NOAA and MetOp satellite instruments downlink to the Svalbard PCDA and the Fairbanks and Wallops CDAs; data flow between the EPS Overall Ground Segment (OGS), including the CGS at Darmstadt, and the NOAA Ground Segment, including the SOCC and ESPC at Suitland. Links legend: space to ground, national, international, internal.]

  5. Determining MetOp Data Latency
     • A snapshot of one MetOp pass from October 22, 2008 (Julian Day 296) was analyzed
     • The data processing and data flow times at the key interfaces (Svalbard, Darmstadt, SOCC, ESPC) were recorded in an Excel spreadsheet (see the sketch below)
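The spreadsheet analysis boils down to differencing the timestamps recorded at each interface along the Svalbard, Darmstadt, SOCC, ESPC path. A minimal Python sketch of that bookkeeping, using hypothetical timestamps rather than the actual values from the October 22 pass:

```python
from datetime import datetime

# Hypothetical timestamps for one MetOp pass; the real values were
# recorded in the Excel spreadsheet described above.
events = [
    ("AOS at Svalbard",           datetime(2008, 10, 22, 12, 0)),
    ("Received at Darmstadt CGS", datetime(2008, 10, 22, 13, 32)),
    ("Received at SOCC",          datetime(2008, 10, 22, 13, 40)),
    ("ESPC processing complete",  datetime(2008, 10, 22, 13, 45)),
]

# Report the per-hop delay and the cumulative age of the data.
start = events[0][1]
for (prev_name, prev_t), (name, t) in zip(events, events[1:]):
    hop = (t - prev_t).total_seconds() / 60
    total = (t - start).total_seconds() / 60
    print(f"{prev_name} -> {name}: {hop:.0f} min (cumulative {total:.0f} min)")
```

Laid out this way, the dominant hop stands out immediately, which is how the Svalbard-to-Darmstadt leg was identified as the driver.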

  6. Current MetOp Pipeline Processing Timeline
     [Timeline in minutes: AOS at ~0, followed by LOS and CDA Svalbard ingest; Darmstadt CE transmission starts at ~92; Darmstadt CE transmission and orbital file complete at ~96; first granule received at DAPE Gateway at ~105; last granule received at SPP at ~108; ESPC granule processing at ~180 to ~184.]
     Data age will be ~85-115 minutes for all granules
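The ~85-115 minute claim is a per-granule check: data age is processing-complete time minus sensing time. A minimal sketch, with hypothetical granule times chosen only to illustrate the window:

```python
from datetime import datetime

# Hypothetical (sensing time, ESPC processing-complete time) pairs for
# three granules from one dump; real values came from the pass snapshot.
granules = [
    (datetime(2008, 10, 22, 11, 0),  datetime(2008, 10, 22, 12, 55)),
    (datetime(2008, 10, 22, 11, 40), datetime(2008, 10, 22, 13, 10)),
    (datetime(2008, 10, 22, 12, 20), datetime(2008, 10, 22, 13, 45)),
]

for sensed, done in granules:
    age_min = (done - sensed).total_seconds() / 60
    print(f"data age {age_min:.0f} min (within ~85-115 window: {85 <= age_min <= 115})")
```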

  7. A reship of one blind-orbit N-18 GAC delayed processing by 9+ hours (September 25). Data latency is the time difference between the start time of the GAC/FRAC orbit and the completion of AVHRR Level 1B processing (see the sketch below).
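Written out, that definition is a single subtraction. A minimal sketch, with illustrative times rather than the September 25 values:

```python
from datetime import datetime

def avhrr_latency_minutes(orbit_start: datetime, level1b_complete: datetime) -> float:
    """Latency per the definition above: GAC/FRAC orbit start time to
    completion of AVHRR Level 1B processing, in minutes."""
    return (level1b_complete - orbit_start).total_seconds() / 60

# A nominal pass vs. a reshipped blind orbit delayed past nine hours.
print(avhrr_latency_minutes(datetime(2008, 9, 25, 0, 0), datetime(2008, 9, 25, 1, 45)))  # 105.0
print(avhrr_latency_minutes(datetime(2008, 9, 25, 0, 0), datetime(2008, 9, 25, 9, 30)))  # 570.0
```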

  8. Latency Summary
     • Data flow within ESPC has very little impact on the overall data delay
     • Data flow from Svalbard to the NOAA Gateway in Darmstadt drives the data latency
     • Initial Joint Polar System Coordination meeting, November 18-20
       • EUMETSAT's planned upgrades will be discussed

  9. BACKUP SLIDES

  10. MetOp GDS Data Latency

  11. MetOp GDS Data Latency (cont.)

  12. MetOp Global Data Stream Data Flow
      • Svalbard
        • Acquisition Of Signal
        • Loss Of Signal
      • Darmstadt
        • Core Ground System
        • NOAA Gateway
        • Communications Element
      • SOCC
        • Communications Element
      • ESPC
        • Sky-x
        • Advanced Front End Processor
        • Diamond (primary Level 1B processor, P570)
        • Shared Processing Gateway or Data Distribution Server

  13. DATMS-U Connectivity
      [Network diagram, dated 2/20/2008, showing DATMS-U links among the centers: AFWA (64 Mb total), NOAA/NWSTG & NCEP (26 Mb total, 100 Mb FE), FNMOC (74 Mb total, 100 Mb FE), NWS Monterey (6 Mb, 100 Mb FE, TLS), NAVO (12 Mb total, pass-thru to NOAA), UCAR/CDAAC & NAVICE, and NOAA/NESDIS (DAPE Gateway, 1 Gb Ethernet), with individual circuits of 6 to 50 Mb.]
      LEGEND: CDAAC – COSMIC Data Analysis & Archive Center; FE – Fast Ethernet; Gb – Gigabits/sec; Mb – Megabits/sec; OC 12 – connection backside to DATMS-U; TLS – Transport Layer Security link; UCAR – University Corp for Atmospheric Research; NJDATMS-U connectivity – DATMS-U connection to physical AP 1000
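For a rough sense of how these link speeds translate into delay, transfer time is just volume divided by bandwidth. A minimal sketch, assuming a hypothetical per-pass volume of 700 MB (the briefing does not state the actual MetOp pass size):

```python
# Transfer time = data volume / link bandwidth.
PASS_VOLUME_MB = 700  # hypothetical figure, chosen only for illustration

for site, link_mbps in [("NWS Monterey", 6), ("FNMOC (FE)", 100), ("DAPE Gateway (GigE)", 1000)]:
    seconds = PASS_VOLUME_MB * 8 / link_mbps  # MB -> megabits, then divide by Mb/s
    print(f"{site} at {link_mbps} Mb/s: {seconds / 60:.1f} min")
```

At 6 Mb/s that hypothetical pass takes roughly 16 minutes to move, versus under a minute on the Fast Ethernet links, which is why the slower circuits matter for latency planning.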

  14. ESPC PROCESSING (Jason-2 TM-NRT v2)
