
Global OSSEs at NCEP



  1. Global OSSEs at NCEP Stephen J. Lord NOAA/NWS/NCEP/EMC http://www.emc.noaa.gov/research/osse

  2. Co-Authors and Contributors
  Michiko Masutani*1#, Stephen J. Lord1, John S. Woollen1+, Weiyu Yang1+, Haibing Sun2%, Thomas J. Kleespies2, G. David Emmitt3, Sidney A. Wood3, Bert Katz1+, Russ Treadon1, John C. Derber1, Steven Greco3, Joseph Terry4
  1 NOAA/NWS/NCEP/EMC, Camp Springs, MD
  2 NOAA/NESDIS, Camp Springs, MD
  3 Simpson Weather Associates, Charlottesville, VA
  4 NASA/GSFC, Greenbelt, MD
  # RS Information Systems
  + Science Applications International Corporation
  % QSS Group, Inc.
  * Corresponding Author Address: Michiko Masutani, 5200 Auth Road Rm 207, Camp Springs, MD 20746. E-mail: michiko.masutani@noaa.gov

  3. Overview
  • Introduction to OSSEs
    • Motivation
    • Basic Concepts
  • The NCEP OSSE System
    • Components
    • Testing
  • Preliminary NCEP results for Doppler Wind Lidar (DWL)
  • Planned upgrades
  • Conclusions

  4. Introduction to OSSEs: Motivation
  • Costs of developing, maintaining & using observing systems often exceed $100M per instrument
  • To date, there have been no quantitatively based decisions on the design & implementation of future observing systems
  • Significant time lags between instrument deployment and eventual operational NWP use

  5. Introduction to OSSEs: Motivation (cont.)
  • Doing OSSEs will
    • Enable data formatting and handling in advance of the "live" instrument
    • Exercise preliminary quality control algorithms, provided simulated obs have realistic properties
  • OSSEs can provide quantitative information on observing system impacts and data assimilation system components
    • Requirements definition and instrument design for new instruments
    • Evaluation of alternative mixes of current instruments
    • Data assimilation system diagnosis and improvement
    • Understanding and formulation of observational errors
  • There is no perfect solution, but more information can lead to better planning and decisions

  6. Introduction to OSSEs: Basic Concepts
  [Schematic: the cycle Observations → Data Assimilation → Analysis → Forecast Model → Verification, run through a common real/OSSE data assimilation system. Real case: existing real observations, with & without, where truth is nature itself and is never known exactly. OSSE case: new & existing simulated observations, with & without, where truth is the Nature Run.]
  A conceptual sketch of the two cycles follows.
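To make the schematic concrete, here is a conceptual sketch; every function name is a placeholder standing in for a box in the diagram, not NCEP code. The essential difference between the two cycles is that an OSSE has the Nature Run as a known truth, so forecasts can be verified exactly and not-yet-existing instruments can be simulated.

```python
# Conceptual sketch only: placeholder functions for the schematic's boxes
# (observations -> data assimilation -> analysis -> forecast model ->
# verification); none of this is NCEP code.

def assimilation_cycle(background, observations, assimilate, forecast,
                       verify, truth=None):
    """Run one pass through the schematic's cycle.

    Real system: `truth` is unknown, so verification is indirect.
    OSSE: the Nature Run is passed as `truth`, so the forecast can be
    verified exactly, and `observations` may include simulated data from
    instruments that do not yet exist.
    """
    analysis = assimilate(background, observations)   # data assimilation
    next_background = forecast(analysis)              # forecast model
    score = None if truth is None else verify(next_background, truth)
    return next_background, score
```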

  7. Introduction to OSSEs: Basic Concepts (cont.)
  • Simulated observations should
    • Exhibit the same system impact as real observations
    • Contain the same kinds of errors as real observations (e.g., representativeness)
      • The Nature Run is truncated spectrally in space & time
      • Real nature is not truncated
    • Be produced by instrument models different from those used in the data assimilation system (e.g., radiances); a sketch follows this list
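A minimal sketch of the last point, with hypothetical operator and variable names: the instrument model used to simulate the observation from the Nature Run is deliberately distinct from the forward operator the assimilation system applies, so the OSSE does not assume a perfect observation operator.

```python
# Minimal sketch with hypothetical names: simulating one observation from
# Nature Run (NR) fields. Per the slide, h_instrument should differ from the
# forward operator used inside the data assimilation system.
import numpy as np

def simulate_observation(nr_profile, h_instrument, noise_std, rng):
    """Map an NR profile to observation space and add instrument noise."""
    y_true = h_instrument(nr_profile)           # "perfect" obs from NR truth
    return y_true + rng.normal(0.0, noise_std)  # plus random instrument error

# Toy usage with a stand-in operator (simple two-level vertical average):
rng = np.random.default_rng(0)
nr_temperature = np.linspace(288.0, 210.0, 31)      # a T213/L31-like column
h = lambda prof: 0.5 * (prof[10] + prof[11])        # hypothetical operator
obs = simulate_observation(nr_temperature, h, noise_std=1.0, rng=rng)
```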

  8. Current NCEP OSSE System Components
  • Nature Run (NR)
    • ECMWF Reanalysis model (ERA-15)
    • Resolution T213/L31 (~60 km)
    • 06 UTC 5 Feb 1993 – 00 UTC 7 March 1993
    • Climatologically representative period
    • Good agreement in synoptic behavior
    • Clouds need adjustment
  • Observations
    • Conventional (raobs, surface, pireps, MDCRS (1993))
    • HIRS, MSU from NOAA-11, NOAA-12
    • Cloud-track winds
  • NCEP Global Data Assimilation System
    • 1999 version
    • T62/L28 (~200 km horizontal resolution)
    • Radiance assimilation

  9. Evaluation of NR clouds and adjustment
  NR clouds are evaluated and adjusted. [Figure: frequency distribution (in %) of ocean areas by low-level cloud cover, in twenty 5% bands. Solid line: NR cloud cover without adjustment; dashed line: with adjustment.] The diagnostic is sketched below.
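The diagnostic behind this figure is simple to reproduce; a sketch, assuming the input is a 1-D array of ocean-point low-level cloud fractions in [0, 1]:

```python
# Sketch of the cloud-cover frequency diagnostic (assumed input format).
import numpy as np

def cloud_cover_frequency(low_cloud_fraction):
    """Frequency (in %) of cloud cover in twenty 5% bands (0-5, ..., 95-100)."""
    edges = np.linspace(0.0, 1.0, 21)               # 21 edges -> 20 bands
    counts, _ = np.histogram(low_cloud_fraction, bins=edges)
    return 100.0 * counts / counts.sum()

# e.g., compare cloud_cover_frequency(nr_low_cloud) before and after the
# adjustment to reproduce the solid vs. dashed curves.
```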

  10. Testing: OSSE Calibration
  • Compare real-data sensitivity to sensitivity with simulated data
    • The relative order of impacts should be the same for the same instruments
    • Magnitudes need not be identical but should be proportional
  • Quality control (rejection statistics)
  • Error characteristics (fits of background to obs)
  These diagnostics are sketched below.
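A hedged sketch of two of the calibration statistics named above; the names are illustrative, and the operational versions are computed inside the GDAS itself:

```python
# Illustrative calibration statistics: both should look similar when
# computed from the real-data system and from the simulated-data system.
import numpy as np

def rejection_rate(qc_flags):
    """Fraction of observations rejected by quality control (True = rejected)."""
    return np.asarray(qc_flags, dtype=bool).mean()

def background_fit(obs, background):
    """Standard deviation of observation-minus-background (O-B) departures."""
    departures = np.asarray(obs) - np.asarray(background)
    return departures.std(ddof=1)

# Calibration check: rejection_rate(...) and background_fit(...) from the
# real-data run should roughly match the same statistics from the OSSE run.
```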

  11. Calibration of Simulated Data Impacts vs. Real
  [Figure: 500 hPa height anomaly correlation for 72-hour forecasts, Northern Hemisphere (NH) and Southern Hemisphere (SH) panels.]

  12. Observational Error Formulation: Surface & Upper Air
  [Schematic: surface and upper-air observations simulated at the Nature Run surface vs. at the real surface, with the Nature Run, NCEP model, and real orography shown for comparison.]

  13. Observational Error Formulation: Surface & Upper Air (cont.)
  Observations simulated with systematic representation error.
  • With random error only:
    • Data rejection rate is too small (top panel)
    • Fit of obs is too small (bottom panel)
  [Figure: two panels with height (Z) and time axes comparing the error formulations; a sketch of the two error models follows.]
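A hedged sketch (assumed names and array shapes) of the two error models being contrasted: purely random error versus an added vertically correlated component mimicking systematic representativeness error. As the slide notes, with random error alone too few simulated obs are rejected and they fit the background too closely.

```python
# Illustrative error models for simulated soundings; not NCEP code.
import numpy as np

def add_random_error(y, sigma, rng):
    """Uncorrelated instrument-like noise at every level."""
    return y + rng.normal(0.0, sigma, size=y.shape)

def add_systematic_representation_error(y, sigma, length_scale, z, rng):
    """Add an error correlated in the vertical (levels z), so whole layers
    are displaced together rather than fluctuating independently."""
    white = rng.normal(0.0, 1.0, size=y.shape)
    weights = np.exp(-0.5 * ((z[:, None] - z[None, :]) / length_scale) ** 2)
    correlated = (weights / weights.sum(axis=1, keepdims=True)) @ white
    return y + sigma * correlated / correlated.std()

rng = np.random.default_rng(1)
z = np.linspace(0.0, 15.0, 31)             # level heights in km (illustrative)
profile = np.linspace(288.0, 210.0, 31)    # a temperature-like column
noisy = add_systematic_representation_error(profile, 1.0, 2.0, z, rng)
```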

  14. Preliminary NCEP results for Doppler Wind Lidar (DWL)
  • All levels (Best-DWL): the ultimate DWL, providing full tropospheric line-of-sight (LOS) soundings, clouds permitting.
  • DWL-Upper: an instrument providing mid- and upper-tropospheric winds, only down to the levels of significant cloud coverage.
  • DWL-PBL: an instrument providing wind observations only from clouds and the PBL.
  • Non-Scan DWL: a non-scanning instrument providing full tropospheric LOS soundings, clouds permitting, along a single line paralleling the ground track.
  [Figure: number of DWL LOS winds, 12 Feb 1993, for each concept.]

  15. Doppler Wind Lidar (DWL) Impact
  [Figure: time-averaged anomaly correlations between forecast and NR meridional wind (V) fields at 200 hPa and 850 hPa, plotted against forecast hour. Anomaly correlations are computed for zonal wavenumber 10–20 components; differences from the anomaly correlation of the control run (conventional data only) are plotted.] The metric is sketched below.
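The verification metric in this caption can be sketched as follows, with illustrative names and fields assumed on a regular latitude-longitude grid: band-pass the anomalies to zonal wavenumbers 10–20 with an FFT along longitude, then correlate forecast and Nature Run anomalies.

```python
# Sketch of a wavenumber-band-limited anomaly correlation; names illustrative.
import numpy as np

def bandpass_zonal(field, kmin=10, kmax=20):
    """Keep only zonal wavenumbers kmin..kmax of a (lat, lon) field."""
    spec = np.fft.rfft(field, axis=-1)
    k = np.arange(spec.shape[-1])
    spec[..., (k < kmin) | (k > kmax)] = 0.0
    return np.fft.irfft(spec, n=field.shape[-1], axis=-1)

def anomaly_correlation(fcst, truth, climatology, kmin=10, kmax=20):
    """Anomaly correlation restricted to the chosen zonal wavenumber band."""
    fa = bandpass_zonal(fcst - climatology, kmin, kmax).ravel()
    ta = bandpass_zonal(truth - climatology, kmin, kmax).ravel()
    return fa @ ta / (np.linalg.norm(fa) * np.linalg.norm(ta))

# The slide plots anomaly_correlation(experiment, NR, clim) minus the same
# quantity for the control run (conventional data only), vs. forecast hour.
```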

  16. Effect of Observational Error on DWL Impact
  [Figure: percent improvement over the control forecast (without DWL), plotted against forecast length.]
  • Open circles: RAOBs simulated with systematic representation error
  • Closed circles: RAOBs simulated with random error
  • Orange: Best-DWL
  • Purple: Non-Scan DWL

  17. Planned Upgrades
  • Advanced instruments (e.g., AIRS) and impact experiments at higher resolution require larger computing resources, a more realistic observation distribution and Nature Run, and advanced data assimilation
  • Computing
    • Current: 32-processor SGI O2000
    • Future: share the former NCEP operational computer (IBM SP) until April 2004 (NPOESS funding)
  • Data Assimilation
    • Current: 1999 version of the SSI and global model
    • Future: 2003 version of the SSI and model
      • Advanced radiative transfer for hyperspectral sounders
      • Improved computational efficiency
      • Prognostic cloud water
      • Improved mountain drag
  • Observations and Nature Run
    • Current: 1993 obs distribution; 4-week T213 NR
    • Future: 2003–04 obs distribution; longer, higher-resolution NR

  18. Conclusions
  • OSSE methodology has matured into a scientifically valid, though not perfect, approach for evaluating
    • Instrument design
    • Cost–capability tradeoffs
    • Observing system design
    • Impact of components
    • Data assimilation system behavior
  • The NCEP system has been developed
    • Attention to details of obs simulation and Nature Run calibration
    • DWL design is a test case
    • Can be applied to other NPOESS instruments (e.g., CrIS)
    • Can be used for observing system design (e.g., THORPEX)
  • Preliminary results for DWL show encouraging impact
  • Future: testing at higher resolution with an improved data assimilation system
    • More realistic observations (1993 -> 2004) and Nature Run
