L2OS: Product performance summary v550 highlights



The SMOS L2 OS Team

Presentation Transcript
SMOS L2OS performance
Many presentations have been made on SSS retrieval issues, starting at IOCP Keypoint 1, plus several validation exercises using Argo float data

Present status: the first SMOS general reprocessing has provided a coherent data set, built with L1OP v5.04 and L2OS v5.50, for product performance analysis

SMOS L2 OS Product Performance Status Report issue 2.3, October 2012 (draft under improvement):

  • Status of main SSS retrieval problems
  • Analysis of one year of SMOS salinity, 2011
  • User Data Product performance review
  • Status tables for UDP fields: grid point data, salinity, flags and quality descriptors

SMOS L2OS performance
  • L2OS operational processor history
  • reprocessing 2010-11
  • v550 main change: double OTT (asc/desc), updated monthly (bi-weekly and centered for reprocessing); a minimal sketch of the per-pass correction follows this list
  • Other minor 500 & 550 modifications (several flags/filtering improvements, bugs fixed, new LUTs for roughness,...)
  • Strong impact of L1 modifications
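
The per-pass OTT correction mentioned above can be illustrated as follows. This is a minimal sketch, assuming hypothetical arrays `tb_meas` and `tb_model` (brightness temperatures over the OTT reference region, one row per snapshot) and a boolean `ascending` flag per snapshot; the names and flat layout are illustrative, not the operational L2OS interfaces.

```python
# Minimal sketch, assuming hypothetical inputs: tb_meas and tb_model are
# brightness temperatures over the OTT reference region with shape
# (snapshot, antenna-frame cell), and `ascending` is a boolean flag per
# snapshot.  Names and the flat layout are illustrative, not the
# operational L2OS interfaces.
import numpy as np

def compute_ott(tb_meas, tb_model, ascending):
    """One OTT per pass direction: mean measured-minus-modelled TB per cell."""
    return {
        "asc": np.nanmean(tb_meas[ascending] - tb_model[ascending], axis=0),
        "desc": np.nanmean(tb_meas[~ascending] - tb_model[~ascending], axis=0),
    }

def apply_ott(tb_meas, ascending, ott):
    """Subtract the pass-matched OTT from measured TB before SSS retrieval."""
    corrected = tb_meas.copy()
    corrected[ascending] -= ott["asc"]
    corrected[~ascending] -= ott["desc"]
    return corrected

# Synthetic example: 100 snapshots, 64 cells, different asc/desc biases.
rng = np.random.default_rng(0)
tb_model = 100.0 + rng.normal(0.0, 1.0, (100, 64))
asc = rng.random(100) < 0.5
tb_meas = tb_model + np.where(asc[:, None], 0.8, -0.5) + rng.normal(0.0, 0.2, (100, 64))
tb_corr = apply_ott(tb_meas, asc, compute_ott(tb_meas, tb_model, asc))
```

In v550 the two OTTs would be recomputed on the monthly (or bi-weekly, centered) schedule noted above.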
SMOS OS retrieval problems

Issues still degrading SMOS SSS retrieval, with different degrees of improvement in the new L1 & L2 processor versions:

Land contamination

RFI

Unrealistic drifts: long/short TB drifts, geophysical

Sun tails

Galactic noise modelling

Roughness effects

Auxiliary data quality: wind vector, TEC, ...

Issues 3, 4 and 5 above (unrealistic drifts, Sun tails, galactic noise modelling) impact the spatial bias correction (OTT technique)

[Names noted on the slide against the issues above: Paul (+ Nicolas); Paul, Joe (+ Jérôme); Jacqueline; Joe; Paul]

[Figure: SSS minus climatology maps for L1 v3.46 (with bug corrected) + L2 v3.17 and for L1 v5.04 + L2 v5.50, with RFI, land contamination and the expected impact of Gibbs annotated. No flag filtering has been applied, to keep contaminated data. By C. Gabarró, ICM/SMOS-BEC]

[Figure: difference maps: L1 v3.46 + L2 v3.17 (with bug corrected) minus climatology, L1 v5.04 + L2 v3.17 minus climatology, and L2 v5.50 minus L2 v3.17]

[Figure: ascending minus descending monthly average SSS maps, March 2011 and July 2011 reprocessed]

[Figure: ascending minus descending monthly average SSS maps, September 2011 and November 2011 reprocessed]

Roughness effects

Pre-launch roughness models and models fitted to SMOS data

  • Wind-induced Tb at θ=32.5° from 3 models and SMOS data
  • Pre-launch: poor fit to the ECMWF wind speed sensitivity
  • Tuned after analysis of SMOS data: much better agreement
  • Clear non-linear behaviour with wind speed (see the fitting sketch below)

[Figure: wind-induced Tb versus wind speed, pre-launch and post-tuning, V-pol and H-pol panels. By S. Guimbard, ICM/SMOS-BEC]
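
The non-linear re-fit noted in the list above can be sketched as follows, as a minimal illustration only.

```python
# Minimal sketch, assuming hypothetical arrays: wind_speed (m/s, e.g. ECMWF)
# and dtb (wind-induced Tb residual in K at a fixed incidence angle).  The
# quadratic fit only illustrates the non-linear wind-speed behaviour; it is
# not the operational v550 roughness parametrisation.
import numpy as np

def fit_roughness(wind_speed, dtb, deg=2):
    """Least-squares polynomial fit of wind-induced Tb versus wind speed."""
    return np.poly1d(np.polyfit(wind_speed, dtb, deg))

# Synthetic example: a non-linear "truth" plus noise, then the re-fit.
rng = np.random.default_rng(1)
ws = rng.uniform(0.0, 15.0, 500)
dtb_obs = 0.25 * ws + 0.015 * ws**2 + rng.normal(0.0, 0.3, ws.size)
model = fit_roughness(ws, dtb_obs)
print(model(7.0))   # predicted wind-induced Tb at 7 m/s
```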

Roughness effects

Differences between the 3 roughness models: highly reduced after fitting to SMOS data (SSS2 is now also semi-empirical)

Yearly statistics of L3 1º x 1º monthly maps (filtering out wind >12 m/s):

                Global           S. Pacific       N. Pacific
                (60N-60S)        (OTT region)     (45N-60N)
                bias     STD     bias     STD     bias     STD
  SSS1-SSS2     0.01    -0.02    0.02    -0.01   -0.06     0.01
  SSS1-SSS3     0.02    -0.03    0.03     0.00   -0.09     0.00
  SSS2-SSS3     0.01    -0.01    0.01     0.01   -0.03    -0.01
  • Variability in the SSS field due to the different roughness corrections is lower than the requirement (0.1 for 100 km, 30-day averages)
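
The comparison behind the table above can be sketched as below. This is a minimal sketch, assuming hypothetical 1° monthly L3 arrays and a wind field on the same grid; region limits, the 12 m/s threshold and all names are illustrative.

```python
# Minimal sketch, assuming hypothetical 1-deg monthly L3 arrays sss_a, sss_b
# with shape (lat, lon, month), a wind field on the same grid, and a latitude
# vector.  Region limits, the 12 m/s threshold and all names are illustrative.
import numpy as np

def region_stats(sss_a, sss_b, wind, lat, lat_min, lat_max, wind_max=12.0):
    """Bias and STD of sss_a - sss_b inside a latitude band, excluding high wind."""
    band = (lat >= lat_min) & (lat <= lat_max)
    diff = (sss_a - sss_b)[band]
    diff = np.where(wind[band] <= wind_max, diff, np.nan)
    return np.nanmean(diff), np.nanstd(diff)

# e.g. the global band of the table: bias, std = region_stats(sss1, sss2, wind, lat, -60, 60)
```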
Roughness effects

Retrieved SSS using the three roughness models implemented in v550

Roughness effects

After filtering by Dg_quality_SSSx

Ascending orbit on 19 October 2012
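
Screening by Dg_quality_SSSx can be sketched as below; the arrays, the threshold value and the convention that lower descriptor values mean better quality are assumptions for illustration, not taken from the product specification.

```python
# Minimal sketch, assuming hypothetical per-grid-point arrays sss and
# dg_quality; the threshold value and the convention that lower descriptor
# values mean better quality are assumptions, not taken from the product
# specification.
import numpy as np

def filter_by_quality(sss, dg_quality, threshold):
    """Set grid points whose quality descriptor exceeds the threshold to NaN."""
    return np.where(dg_quality <= threshold, sss, np.nan)

# usage: sss3_clean = filter_by_quality(sss3, dg_quality_sss3, threshold=1.0)
```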


Product performance review

Objective: to report performance of the fields in the User Data Product

  • Analysis of SSS accuracy
  • Analysis of quality descriptors (flags and counters)
  • Summary in status tables
  • No analysis for Acard (no in situ data for validation)
  • No analysis for non-retrieved parameters (SST, WS)
  • No analysis for by-products (modelled TBx,y,h,v at 42.5º)
Global SSS maps

[Figure: 10-day / 1° averaged SSS, 3-12 August 2011, for the old processors (L1 v3.46, L2OS v3.17) and the new processors (L1 v5.04, L2OS v5.50); the zonal average of SMOS minus Argo is shown on a ±0.2 scale. By J. Martínez, ICM/SMOS-BEC]
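
The zonal SMOS-Argo comparison shown in the figure can be sketched as follows. This is a minimal sketch, assuming hypothetical gridded inputs for one 10-day period; gridding Argo to 1° and the 5° band width are simplifications.

```python
# Minimal sketch, assuming hypothetical gridded inputs: sss_smos and sss_argo
# with shape (lat, lon) for one 10-day period, plus a latitude vector in
# degrees.  Gridding Argo to 1 deg and the 5-deg band width are simplifications.
import numpy as np

def zonal_difference(sss_smos, sss_argo, lat, band=5.0):
    """Mean SMOS-minus-Argo difference per latitude band."""
    diff = sss_smos - sss_argo
    edges = np.arange(lat.min(), lat.max() + band, band)
    centres = 0.5 * (edges[:-1] + edges[1:])
    means = [np.nanmean(diff[(lat >= lo) & (lat < hi)])
             for lo, hi in zip(edges[:-1], edges[1:])]
    return centres, np.array(means)
```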

SSS accuracy: first approach

Zero order accuracy:

Comparison to climatology: 2011 yearly statistics of 9-day L3 maps in 10°x10° or 2°x2° regions of contrasted conditions, computed separately for ascending and descending passes (a box-statistics sketch follows the list below)

  • range usually OK
  • anomaly within a few tenths
  • important asc/desc differences
  • exceptions: RFI areas, low SST, high variability
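
A minimal sketch of the zero-order check described above, assuming a hypothetical stack of 9-day L3 maps, a climatology on the same grid and lat/lon vectors; box limits and names are illustrative.

```python
# Minimal sketch, assuming a hypothetical stack of 9-day L3 SSS maps with
# shape (time, lat, lon), a climatology with shape (lat, lon) on the same
# grid, and lat/lon vectors in degrees.  Box limits and names are illustrative.
import numpy as np

def box_anomaly_stats(sss_l3, climatology, lat, lon, lat0, lat1, lon0, lon1):
    """Mean and STD of the SSS-minus-climatology anomaly inside a lat/lon box."""
    ilat = (lat >= lat0) & (lat <= lat1)
    ilon = (lon >= lon0) & (lon <= lon1)
    anom = (sss_l3 - climatology)[:, ilat][:, :, ilon]
    return np.nanmean(anom), np.nanstd(anom)

# usage for a 10 x 10 deg box: box_anomaly_stats(maps, clim, lat, lon, -30, -20, -130, -120)
```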

SSS accuracy: in situ validation

Comparison SMOS-Argo:

Proxy for absolute accuracy (with care, since Argo floats do not sample SSS right at the surface). The only data set available for global analysis. More precise local comparisons are possible (moored buoys, surface drifters)

Diagnostic sites defined in Product Performance Evaluation Plan

  • Interesting ocean situations
  • In situ sampling programs
  • Expected problems

SSS accuracy: in situ validation

  • L3 binned products, filtering poor L2 grid points, wind <12 m/s, centre of swath, 2011 yearly statistics for ascending/descending/both orbits

Different analysis methodologies:

    • SMOS data selection (filtering by flags)
    • Argo data selection and interpolation to (sub)surface values
    • Match-up criteria (a collocation sketch follows this list)
    • Statistical approach
    • BEC: 10-d/2º, monthly/1º maps, SSS3, 400 km, Argo 7.5m, box averaging
    • LOCEAN: 50 km/15-d, 100 km/15-d collocations, SSS1 weighted, 300 km
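
A minimal sketch of a SMOS-Argo match-up along the lines listed above, assuming hypothetical inputs: Argo records (lat, lon and a near-surface salinity already extracted at ~5-10 m) and SMOS L3 SSS on a regular grid for the same period. The simple radius search and box average stand in for the BEC and LOCEAN choices, which differ in collocation radius, weighting and averaging.

```python
# Minimal sketch, assuming hypothetical inputs: Argo records (lat, lon and a
# near-surface salinity already extracted at ~5-10 m) and SMOS L3 SSS on a
# regular lat/lon grid for the same period.  The simple radius search and box
# average stand in for the BEC and LOCEAN choices, which differ in radius,
# weighting and averaging.
import numpy as np

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between points given in degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    a = np.sin((p2 - p1) / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(np.radians(lon2 - lon1) / 2) ** 2
    return 2.0 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def matchup_stats(argo_lat, argo_lon, argo_sss, grid_lat, grid_lon, grid_sss, radius_km=300.0):
    """Bias and STD of SMOS minus Argo, averaging SMOS within radius_km of each float."""
    glat, glon = np.meshgrid(grid_lat, grid_lon, indexing="ij")
    diffs = []
    for la, lo, s in zip(argo_lat, argo_lon, argo_sss):
        near = grid_sss[haversine_km(la, lo, glat, glon) <= radius_km]
        if np.isfinite(near).any():
            diffs.append(np.nanmean(near) - s)
    diffs = np.asarray(diffs)
    return np.nanmean(diffs), np.nanstd(diffs)
```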
SSS accuracy: in situ validation

[Statistics shown on the slide for the BEC (1° / 30-day) and LOCEAN (50 km / 15-day) comparisons: 0.04-0.15, ±0.5-0.6, ±0.2-0.3, and up to ±1.3]

Validation in specific regions

Monthly 1° maps: regional comparison to Argo; SMOS ascending orbits; ±300 km; 3-12 m/s wind

Regional statistics (values for the regions shown on the slide):
  Bias:  -0.04   0.02  -0.07  -0.15
  STD:    0.25   0.38   0.48   0.31

[Figure: SSS September 2011, SMOS (top) and Argo (bottom). By J. Boutin et al., LOCEAN]

Interannual variability

Large spatial structures in the 2011-2010 SSS difference are qualitatively consistent between SMOS and Argo

SMOS provides higher spatial resolution (plus coastal and RFI errors!)

[Figure: SMOS 2011-2010 and Argo 2011-2010 difference maps. By S. Guimbard, ICM/SMOS-BEC]

SSS status summary table

  • Based on regional computations (yearly averages)

Flags and descriptors

  • Reported status for:
    • 25 control flags
    • 22 science flags
    • 19 product confidence descriptors
  • Fields: name, definition, dependencies, thresholds, status, usability, comments

Example:

status: OK = implementation checked

comments: can include an evaluation of the flag's usefulness in terms of its impact on filtering data to improve the quality of the SSS maps, and eventually conclusions on the threshold values
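
How such flags are typically exercised when filtering grid points can be sketched as below, assuming a hypothetical packed integer flag word per grid point and invented bit positions; the real L2OS control/science flag layout is defined in the product specification and is not reproduced here.

```python
# Minimal sketch, assuming a hypothetical packed integer flag word per grid
# point and invented bit positions; the real L2OS control/science flag layout
# is defined in the product specification and is not reproduced here.
import numpy as np

FLAG_BITS = {            # hypothetical example bits
    "land_contamination": 0,
    "rfi_suspected": 1,
    "sun_tails": 2,
}

def rejected(flag_word, names):
    """True where any of the named flags is raised in the packed flag word."""
    mask = 0
    for name in names:
        mask |= 1 << FLAG_BITS[name]
    return (flag_word & mask) != 0

# usage: keep = ~rejected(science_flags, ["rfi_suspected", "sun_tails"])   # science_flags: hypothetical array
```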

Conclusions
Analysis of reprocessed 2011 data set

Main issues continue to be the same

Linked to calibration, image reconstruction and modelling of geophysical variability

L2OS team working on possible corrections

First version of Product Performance Status Report well advanced: shaping information for users

SMOS SSS globally not reaching mission requirements, regionally approaching them
