
Charge to AWG SST Application Team: Product Validation



1. Charge to AWG SST Application Team: Product Validation

Validation priorities and science drivers
• What needs validating and why?
• Operational considerations
• Minimum requirements driven by the MRD
  • Sea Surface Temperature
  • Ocean Currents (as part of the Ocean Dynamics Team)
  • Ocean Fronts (as part of the Ocean Dynamics Team)
• Error characteristics of “truth”/in-situ data
  • Buoys (SST)
  • M-AERI (SST and Ocean Fronts)
  • CODAR (Ocean Currents)
  • OSCAR (Ocean Currents)
• Include NCEP model forecasts/nowcasts in validation
• Interaction with end users, such as CoastWatch, NCEP, etc.

2. Charge to AWG SST Application Team: Pre-Launch Product Validation

Validation data sources (all need error characteristics established)
• Moored Buoys (SST)
• Drifting Buoys (Ocean Currents)
• CODAR (Ocean Currents)
• M-AERI (SST and Ocean Fronts)
• TSG from Ships-of-Opportunity (Ocean Fronts)
• OSCAR (Ocean Currents)
• Various NCEP products
• Heritage GOES Frontal Product

3. Charge to AWG SST Application Team: Post-Launch Product Validation

Validation data sources (all need error characteristics established)
• Moored Buoys (SST)
• Drifting Buoys (Ocean Currents)
• CODAR (Ocean Currents)
• M-AERI (SST and Ocean Fronts)
• TSG from Ships-of-Opportunity (Ocean Fronts)
• OSCAR (Ocean Currents)
• Various NCEP products
• Heritage GOES Frontal Product

4. Charge to AWG SST Application Team: Product Validation

Methodology
• Identify where, when, and under what conditions validation of the product is difficult or easy
  • SST validation is better under cloud-free skies
  • High SST-gradient areas: good for fronts, but an added complication for SST
  • SST: skin vs. bulk temperature (see the sketch below)
  • Currents: validation within CODAR range is best
  • Fronts: difficult to determine and validate in the Gulf of Mexico during summer due to a near-homogeneous SST field
• Identify limitations
  • Various regions (e.g., coastal California) have seasonal cloud cover
  • All SST validation is dependent on the cloud mask
  • Moored buoys provide limited spatial coverage
  • Frontal validation relies on Ships-of-Opportunity
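Because the radiometer senses the ocean skin while buoys report a bulk temperature, the two must be reconciled before comparison. Below is a minimal illustrative sketch in Python; the fixed 0.17 K cool-skin offset is an assumption (a commonly cited nighttime value), and the function name is hypothetical rather than part of any AWG system.

```python
def skin_to_bulk(skin_sst_k, cool_skin_offset_k=0.17):
    """Illustrative first-order skin-to-bulk SST adjustment.

    The satellite retrieves the skin temperature, which is typically a
    few tenths of a kelvin cooler than the bulk temperature a buoy
    measures. The 0.17 K offset is an assumed nighttime value used here
    only for illustration; a real comparison would model the offset's
    dependence on wind speed and solar heating instead of a constant.
    """
    return skin_sst_k + cool_skin_offset_k


# An M-AERI-style skin reading of 299.83 K maps to a bulk-equivalent
# estimate of 300.0 K under this assumed offset.
print(skin_to_bulk(299.83))
```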

5. Methodology (continued)

• How will product accuracies be reported?
  • Validation against a subset of the available input-output data
  • Daily/weekly/monthly reports through various portals: email, webpage, etc.
  • Raw statistics, histograms, and plots of statistics versus other relevant parameters (time, latitude, other EDRs, etc.)
• What routine performance metrics are needed? (see the sketch below)
  • RMSE (the AWG requirement), bias, variance, and standard deviation
  • All metrics subset by the other relevant parameters
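As a concrete illustration of the metrics listed above, here is a minimal Python/NumPy sketch that computes them from paired satellite and in-situ SSTs. The function name and interface are hypothetical, not part of the AWG processing system.

```python
import numpy as np


def matchup_stats(sat_sst, insitu_sst):
    """Routine validation metrics over satellite-minus-in-situ differences.

    Hypothetical helper covering the slide's list: RMSE (the AWG
    requirement), bias, variance, and standard deviation.
    """
    diff = np.asarray(sat_sst, dtype=float) - np.asarray(insitu_sst, dtype=float)
    return {
        "n": diff.size,
        "bias": diff.mean(),                  # mean satellite-minus-buoy difference
        "rmse": np.sqrt(np.mean(diff**2)),    # root-mean-square error
        "variance": diff.var(ddof=1),
        "stdev": diff.std(ddof=1),
    }


# Three toy matchups (kelvin): a small warm bias shows up in every metric.
print(matchup_stats([300.1, 299.8, 301.0], [300.0, 300.0, 300.5]))
```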

6. Charge to AWG Application Teams for Product Validation

Methodology
• How validation statistics will be stratified:
  • Season
  • Region
  • Validation data set
  • Specific buoy/CODAR/Ship-of-Opportunity
  • Nowcasts/forecasts
  • Truth EDR (e.g., “truth SST between 0 and 5 °C”)
  • Cloud/clear confidence
  • Other EDRs (e.g., wind speed, air-sea temperature difference)
  • Other varying time and space scales
• What output diagnostics are needed?
  • Histograms, scatterplots, indeterminate data; maps of clear-retrieval frequency
• What data (satellite and validation) need to be saved in “match” files to allow routine and deep-dive assessments of product quality? (see the sketch below)
  • Latitude, longitude, time
  • Input IR/visible channels
  • GOES-R SST, Fronts, Currents
  • Buoy matchups
  • CODAR matchups
  • OSCAR matchups
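To make the match-file idea concrete, here is a small pandas sketch: each row carries the fields listed above for one matchup, and the grouped statistics show one possible stratification (season by truth-SST band). All column names and values are illustrative assumptions, not a specified file format.

```python
import pandas as pd

# Hypothetical match-file contents: one row per satellite/in-situ matchup.
matchups = pd.DataFrame({
    "lat": [28.5, 35.1, 25.0],
    "lon": [-88.0, -122.4, -90.2],
    "time": pd.to_datetime(["2009-01-15T12:00", "2009-07-01T12:10",
                            "2009-07-02T03:00"]),
    "goes_sst": [295.3, 288.4, 303.1],   # retrieved SST (K)
    "buoy_sst": [295.0, 288.9, 302.8],   # "truth" SST (K)
    "wind_speed": [4.2, 9.8, 2.1],       # ancillary field for stratification
})

# Stratify the satellite-minus-buoy difference by season and truth-SST band.
matchups["season"] = matchups["time"].dt.quarter
matchups["sst_band_c"] = pd.cut(matchups["buoy_sst"] - 273.15,
                                bins=[0, 5, 15, 25, 35])
diff = matchups["goes_sst"] - matchups["buoy_sst"]
print(diff.groupby([matchups["season"], matchups["sst_band_c"]],
                   observed=True).agg(["mean", "std", "count"]))
```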

7. Charge to AWG Application Teams for Product Validation

Methodology
• Recommendation on approach for detailed cal/val:
  • Routine generation of validation statistics and difference plots (estimated vs. observed)
  • Anomalies versus climatological fields (see the sketch below)
  • Regular case studies of flagged regions to determine causes of anomalies and potential cures
  • Regular feedback from the user community
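A minimal sketch of the anomaly step: subtract a climatological field and flag cells whose departure exceeds a threshold, so they can be queued for the case studies mentioned above. The 2 K threshold is an illustrative assumption, not a requirement.

```python
import numpy as np


def flag_anomalies(retrieved_sst, climatology_sst, threshold_k=2.0):
    """Return the anomaly field and a boolean mask of flagged cells.

    The 2 K threshold is illustrative only; flagged regions would be
    handed to the regular case studies for diagnosis.
    """
    anomaly = retrieved_sst - climatology_sst
    return anomaly, np.abs(anomaly) > threshold_k


# Toy 2x2 field: only the lower-right cell departs by more than 2 K.
retrieved = np.array([[300.0, 301.5], [299.0, 305.0]])
climatology = np.array([[300.2, 301.0], [299.5, 301.0]])
anomaly, flagged = flag_anomalies(retrieved, climatology)
print(anomaly)
print(flagged)
```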

8. Charge to AWG Application Teams for Product Validation

Methodology
• Short-term/long-term validation considerations:
  • SST: use the current SST cal/val methodology as the baseline approach for GOES-R
  • Fronts: perhaps use climatological frontal positions for future validation and develop anomaly fields?
  • Currents: as this is a relatively new product, remain open to any opportunities to update the validation approach

9. Recommended MRD Changes: SST

• Ask Sasha!

10. Recommended MRD Changes: Currents

• Threshold:
  • Include a directional accuracy of 45 degrees (see the sketch below)
  • Mapping accuracy of 2 km at nadir?
  • Product refresh rate: 3 hr within CONUS & EEZ, 12 hr otherwise
  • Accuracy of 2 m/s
  • Cloud-condition qualifier: no retrievals in cloudy areas
• Goal:
  • Directional accuracy of 45 degrees
  • Product refresh rate: 60 min within CONUS & EEZ, 6 hr otherwise
  • Accuracy of 1 m/s
  • Cloud-condition qualifier: no retrievals in cloudy areas
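To show how the proposed 45-degree directional accuracy might be evaluated, here is a sketch that wraps direction differences onto [0, 180] degrees before comparing against the threshold. The function name is hypothetical.

```python
import numpy as np


def direction_error_deg(retrieved_deg, reference_deg):
    """Smallest angular separation between two current directions.

    Differences are wrapped so that, e.g., 350 vs. 20 degrees counts
    as a 30-degree error, not a 330-degree one.
    """
    d = np.abs(np.asarray(retrieved_deg) - np.asarray(reference_deg)) % 360.0
    return np.minimum(d, 360.0 - d)


# The first pair passes the proposed 45-degree threshold; the second fails.
err = direction_error_deg([350.0, 90.0], [20.0, 180.0])
print(err, err <= 45.0)  # [30. 90.] [ True False]
```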

11. Recommended MRD Changes: Fronts

• Threshold:
  • No MRD requirements at this time
  • Similar to Currents:
    • Accuracy of 4 km
    • Product refresh rate of 6 hr
• Goal:
  • Accuracy of 4 km
  • Product refresh rate of 60 min

12. Revisit readiness of algorithms with respect to meeting requirements

• Define capabilities and possible shortfalls of the algorithms at the first delivery, second, …
• Fronts
  • Cloud contamination
  • Flagging of high-gradient SST regions as clouds
• Currents
  • Cloud contamination
  • Image navigation
  • Lack of near-term consecutive images
  • Magnitude/direction under variable wind conditions
• SST

13. Revisit readiness of algorithms with respect to meeting requirements (continued)

• Mitigation strategy for improving performance (e.g., first-delivery SST meets spec in clear conditions but not in moderate aerosol conditions, …)
• Fronts & Currents
  • Use multi-image averages (6 hr?) to alleviate clouds in persistently cloudy areas (see the sketch below)
  • Flexibility to adapt from SST to brightness temperatures in regions of horizontally homogeneous SST
  • Use the NCEP Real-Time Ocean Forecast System as a “first guess” for ocean-current direction
• Survey the capabilities of other new and existing algorithms at addressing shortfalls in the selected algorithms
• SST
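A sketch of the multi-image averaging idea: stack consecutive retrievals over a window (6 hr is the slide's suggestion), mask cloudy pixels, and average what remains, so a persistently cloudy pixel is filled whenever any image in the window was clear there. Array shapes and names are assumptions.

```python
import numpy as np


def composite_sst(stack, cloud_mask):
    """Average consecutive SST images, ignoring cloud-masked pixels.

    `stack` has shape (n_images, ny, nx); `cloud_mask` is True where a
    pixel is cloudy. Pixels cloudy in every image come back as NaN.
    """
    clear_only = np.where(cloud_mask, np.nan, stack)
    return np.nanmean(clear_only, axis=0)


# Two 2x2 images; the top-left pixel is cloudy in the first image only,
# so the composite takes its value from the second image alone.
stack = np.array([[[300.0, 301.0], [299.0, 298.0]],
                  [[302.0, 301.5], [299.4, 298.2]]])
cloud_mask = np.zeros(stack.shape, dtype=bool)
cloud_mask[0, 0, 0] = True
print(composite_sst(stack, cloud_mask))
```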

14. Define issues that need to be addressed by the GOES-R risk reduction program

• Detailed analysis studies (general)
  • Analyze regions and situations where the algorithms perform poorly
• Potential exploratory algorithm development
  • Improved exploitation of new ABI channels for SST (using alternative proxies such as MODIS)
  • Decision tree utilizing various ocean-current algorithms under sub-optimal conditions? (see the sketch below)
  • Potential use of brightness temperatures (Tb) for ocean currents
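The exploratory decision-tree idea could look something like the sketch below: route each scene to a retrieval strategy based on observing conditions. Every branch condition, threshold, and strategy name here is a hypothetical placeholder meant only to illustrate the concept, not an approach the team has endorsed.

```python
def pick_current_algorithm(cloud_fraction, hours_since_last_clear_image):
    """Hypothetical routing of ocean-current retrieval by conditions.

    All thresholds and strategy names are placeholders illustrating
    the decision-tree concept from the slide.
    """
    if cloud_fraction > 0.8:
        # Persistent cloud: fall back to a multi-image composite.
        return "multi-image composite feature tracking"
    if hours_since_last_clear_image > 6:
        # Long gap between usable images: lean on a model first guess.
        return "model first-guess (e.g., RTOFS) constrained tracking"
    return "standard sequential-image SST feature tracking"


print(pick_current_algorithm(cloud_fraction=0.9, hours_since_last_clear_image=2))
```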
