CLIVAR PERSPECTIVE
  • Re-analysis and Seasonal Forecasting activities are common ground between the GODAE GOV and CLIVAR communities.

This intersection should be acknowledged rather than disputed; often the same groups are involved.

The intersection covers several GOV areas:

    • Evaluation of the Observing System (discussed)
    • Metrics, Intercomparison (discussed)
    • Coupled forecasting (not covered in this workshop, brief reports)
  • Shared methodology, software and data sets
    • Data Assimilation methods
    • Models: GOV leading the way in high resolution (HR), with climate modelling following
    • Observation Quality Control
    • Validation Data Sets
    • Surface Fluxes
    • Intercomparisons, ensembles, diagnostics, data repositories
CLIVAR PERSPECTIVE
  • Issues addressed during the workshop
    • Status of ongoing efforts and developments in data assimilation (DA)
    • How to consolidate community efforts for Observing System Evaluation.
    • How to consolidate community efforts for providing useful products.
    • Need for a framework for two-way communication between observation providers and users (the climate and ocean forecasting communities).
  • Briefly presented but not discussed
    • Design/impact of future observing systems
    • OSSEs (Observing System Simulation Experiments)

In what follows:

  • Brief summary of recent developments
  • Observing System Evaluation: methods, diagnostics, consolidation
  • Reanalysis: metrics for evaluation, intercomparison, product consolidation

Ongoing Efforts and Developments
  • Delayed-mode OSEs (Observing System Experiments) for seasonal forecasting and re-analyses
    • JMA (completed) and ECMWF (ongoing)
    • Note: in seasonal forecasting the analysis system (a forced ocean model) differs from the forecasting system (a coupled model).
  • New ocean reanalysis products:
    • GLORYS, PEODAS, GMAO
    • CFS, ORAS4, CMCC
  • Intercomparison of reanalysis heat content
  • Developments in DA systems
    • Formulation of B: merging ensemble and variational techniques.
    • Efficient algorithms
  • Techniques for automatic diagnostics on the impact of observations on analyses and forecasts: DoF, BV, LV,…
  • Relationship between Volume and Heat Transports. Implications for the AMOC monitoring array
Observing System Evaluation (Methodologies)
  • Routine near-real-time OSEs (not applicable to seasonal forecasting/reanalyses)
  • Routine monitoring of observation impact with easy-to-compute metrics in observation space
    • Likely to underestimate impact; the metrics are still to be defined
  • Delayed-mode OSEs
    • Performed for the relevant observing systems
    • Frequency: whenever the forecasting/analysis system changes
    • Expensive and not reactive enough
    • In systems with bias correction, it is not easy to separate the impact of individual forecasting systems
    • Still recommended, since they allow more in-depth analysis
  • Observation footprint: this can be done outside the DA community
  • Automated routine diagnostics:
    • Desirable, but not all systems are ready; needs development
    • Automatic forecast sensitivity is difficult in current seasonal forecasting systems
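The delayed-mode OSE logic above can be sketched in a few lines: run a control analysis that assimilates everything, a denial run that withholds one observing system, and measure the impact as the error difference against verifying data. The function names and the synthetic fields below are purely illustrative, not part of any GOV system.

```python
import numpy as np

def rmse(field, reference):
    """Root-mean-square error of an analysis field against verifying data."""
    return float(np.sqrt(np.mean((field - reference) ** 2)))

def ose_impact(control, denial, reference):
    """Impact of an observing system, measured as the RMSE degradation
    when its observations are withheld (denial run) relative to the
    control run that assimilates all observations.
    Positive values mean the withheld observing system was helping."""
    return rmse(denial, reference) - rmse(control, reference)

# Toy example with synthetic fields (illustration only)
rng = np.random.default_rng(0)
truth = rng.standard_normal(1000)
control = truth + 0.1 * rng.standard_normal(1000)  # assimilates all obs
denial = truth + 0.3 * rng.standard_normal(1000)   # one system withheld

print(ose_impact(control, denial, truth))  # positive: the withheld system helps
```

This is also why OSEs are "expensive and not reactive": each impact number requires a full extra analysis run per observing system and per system version.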
Observing System Evaluation (Diagnostics)
  • How to measure impact?
    • The impact of the OS on skill is the ultimate metric, but it is quite stringent, because our forecasting systems are not discerning enough.
  • Hierarchy of diagnostics:
    • Illustrate impact. Descriptive. Specific cases. No measure of skill
    • Statistics of the impact. (Mean, RMS). Descriptive. No measure of skill.
    • Impact in case studies with in depth analysis and scientific reasoning for potential improvement.
    • Impact in case studies with clear demonstration of improvement
    • Statistics of improvement. Impact on Skill.
    • Impact on Skill with reasoning
  • Care is needed in interpreting and presenting the results; they can easily be misunderstood.
    • Results depend on the DA and forecasting system
Observing System Evaluation (Consolidation)
  • Contribute to the exchange of QC decisions
    • Reanalysis/SF systems to send data to GODAE server
  • Gathering of FG-Obs data files in a common repository
    • This can be the same GODAE server.
  • University of Reading can start diagnostics and visualization projects shortly if the data is in a central repository
  • Keep a good up to date Bibliography on Observing System Evaluation in GODAE/GSOP webpages
  • Observation Impact Statements (OIS)
    • Need to Identify targets and applications
    • Qualify the statements: "GOV OIS" was suggested; an alternative is to describe the area of benefit: OF OIS, SF OIS, ORA OIS, …
  • Other
    • Need to engage with the modelling community
    • Need for targeted studies addressing specific questions
    • Need for specific case studies
Consolidation of Reanalysis Products (I)
  • Clear need to exploit the value of the ensemble of reanalysis products
    • Especially those brought up to Near Real Time.
  • A proposal has been put forward, which appears viable
    • It follows a minimum-effort criterion rather than aiming for a comprehensive solution. The implementation can be improved as time progresses, and the participants and hosts can change as considered convenient.
  • Details
    • OOPC web page to host some climate indices from multi-reanalysis subsurface products:
      • Heat content indices (for different regions; possibly the same regions as for SST). Public
      • MOC and heat transport at 26N, compared with RAPID. May not be public
    • Voluntary centres responsible for specific variables: they compute indices for OOPC (if required) and perform more detailed comparisons, which can be displayed on their own web pages.
    • All interested participating reanalysis centres to provide data in the requested format (either original grid or a common grid).
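A regional heat content index of the kind proposed above is essentially an area-averaged vertical integral of rho0 * cp * T. A minimal sketch, assuming simple rectangular arrays and illustrative constants (the function name and array layout are hypothetical, not a prescribed format):

```python
import numpy as np

RHO0 = 1025.0  # reference seawater density, kg m^-3
CP = 3985.0    # specific heat capacity of seawater, J kg^-1 K^-1

def heat_content_index(temp, dz, area, mask):
    """Area-averaged upper-ocean heat content (J m^-2) over a region.

    temp : (nz, ny, nx) temperature in degrees C
    dz   : (nz,) layer thicknesses in m
    area : (ny, nx) grid-cell areas in m^2
    mask : (ny, nx) boolean, True inside the index region
    """
    # Vertical integral of rho0 * cp * T over the layers -> (ny, nx), J m^-2
    column = RHO0 * CP * np.tensordot(dz, temp, axes=(0, 0))
    # Area-weighted average over the masked region
    w = np.where(mask, area, 0.0)
    return float((column * w).sum() / w.sum())
```

Agreeing on such conventions (reference density, integration depth, region masks, common grid) is exactly what the requested-format discussion above is about.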
Consolidation of Reanalysis Products (II)
  • Voluntary Centres
    • NCEP for heat content
    • BMRC for Salinity
    • University Of Reading for Transports
    • Grenoble for surface fluxes
    • MERCATOR for SL?
    • Canada for Sea Ice
  • A document with timelines and requirements to be circulated by the end of July 2011.
Metrics for Evaluation of Reanalysis Products
  • Not discussed in detail
  • Two types of metrics:
    • Metrics for evaluation of standard DA system
      • O-B, O-A,…
      • Consistency with prescribed B & R
    • Metrics to evaluate the temporal and spatial consistency
      • Long time series needed
      • Preferably independent data
  • Some recommended data sets have already been identified in previous community documents: GSOP, GODAE
    • They may need revising (example, sea level gauges)
  • Individual groups are responsible for the validation of their own reanalyses (this is how they gain credibility)
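The standard observation-space metrics listed above (O-B, O-A, consistency with the prescribed error covariances) reduce to simple statistics over the innovation and residual vectors. A minimal sketch, assuming 1-D numpy arrays; the last term is the scalar form of the Desroziers consistency diagnostic, E[(O-A)(O-B)], commonly compared against the prescribed observation-error variance in R:

```python
import numpy as np

def innovation_stats(obs, background, analysis):
    """Standard DA evaluation metrics in observation space.

    Returns mean and RMS of O-B (innovations) and O-A (residuals),
    plus the Desroziers estimate of observation-error variance,
    which can be checked against the prescribed R."""
    omb = obs - background   # O-B: innovations
    oma = obs - analysis     # O-A: residuals
    return {
        "mean_omb": float(omb.mean()),
        "rms_omb": float(np.sqrt((omb ** 2).mean())),
        "mean_oma": float(oma.mean()),
        "rms_oma": float(np.sqrt((oma ** 2).mean())),
        "desroziers_r_var": float((oma * omb).mean()),
    }
```

In a well-behaved system the analysis fits the observations more closely than the background (RMS of O-A below RMS of O-B), and a nonzero mean O-B flags bias.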
Actions
  • Agree on contents and format for the QC and FG/AN-Obs (FB) files. Invite reanalysis groups to send data to the GODAE US repository. (Matt, Hernandez, Jim, Magdalena)
  • To send relevant data to the GODAE US repository
  • Continue diagnostics on QC decisions. (Keith)
  • To start diagnostics on FG-Obs (Keith)
  • Keep a good up to date Bibliography on Observing System Evaluation in GODAE/GSOP webpages (Kirsten, Nico, all)
  • Draft a document on data requests from reanalysis groups for routine monitoring (Yan Xue (HC), Oscar (salinity), Keith (MOC), Bernard (surface fluxes), Greg (sea ice))
  • Agree on a minimum set of indices to display on the OOPC webpages (Ed Harrison, Yan Xue, Keith)
  • Producing centres to send data to the processing centres (all)
  • Contact the SL community to recommend a good-quality subset of tide gauges for reanalysis validation (AVISO SL is very valuable but short). (Gilles Larnicol)
Recommendations
  • Quite a lot were made regarding diagnostics, interpretation and evaluation
    • These can be summarized at the end
  • Recommendations on both routine and one-off evaluation of the observing system by different groups
    • Publish the results, or otherwise make them known to the community