
NCEP Dropout Team Briefing JAG/ODAA Meeting OFCM October 2008








  1. NCEP Dropout Team Briefing, JAG/ODAA Meeting, OFCM, October 2008. Jordan Alpert, Bradley Ballish, DaNa Carlis and Krishna Kumar. Presentation by Bradley Ballish, National Centers for Environmental Prediction. “Where America’s Climate, Weather and Ocean Prediction Services Begin”

  2. Overview • Dropouts in forecast quality (low 500 hPa anomaly correlation, or AC, scores) are a problem for the NCEP Global Forecast System (GFS) • Experiments with ECMWF analyses as input • Experiments with changing data QC for GSI input • COPC and NCEP want real-time analysis of forecast skill dropouts • Summary

  3. On approximately a monthly basis, poor forecasts, or “skill score dropouts,” plague GFS performance. (Figure: NH AC and SH AC time series.)
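The dropout metric on this slide is the 500 hPa anomaly correlation (AC) between the forecast and the verifying analysis. As a minimal sketch (function and variable names are ours; operational verification additionally uses cos(latitude) area weights, supplied here via the optional `weights` argument):

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology, weights=None):
    """Centered anomaly correlation between a forecast field and the
    verifying analysis, both taken as departures from climatology.
    Inputs are 2-D lat/lon fields, e.g. 500 hPa geopotential height."""
    fa = forecast - climatology          # forecast anomaly
    va = analysis - climatology          # verifying (analysis) anomaly
    if weights is None:
        weights = np.ones_like(fa)       # replace with cos(lat) in practice
    # Remove the area-mean anomaly ("centered" AC)
    fa = fa - np.average(fa, weights=weights)
    va = va - np.average(va, weights=weights)
    num = np.average(fa * va, weights=weights)
    den = np.sqrt(np.average(fa**2, weights=weights) *
                  np.average(va**2, weights=weights))
    return num / den
```

A "dropout" in this sense is a day-5 AC that falls well below its recent running mean; a perfect forecast scores 1.0 and an anomaly of the wrong sign scores negatively.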

  4. Experiments with ECMWF Analyses as Input Experiments using the ECMWF analysis as input to the GSI/GFS (ECM runs) show positive impact in dropout cases where the GFS did poorly • Here the GSI analysis is run using ECMWF analysis values on a 1x1 degree grid at mandatory pressures as pseudo radiosonde data, the sole data input to the GSI, with the resulting forecast labeled ECM GFS • The background for these runs is the GDAS analysis • Additional experiments use the ECMWF analysis only in select areas (overlays) • Experiments are run using the ECM analysis as the background for the GSI (ECMANLGES runs) with the usual operational observational data • The operational GSI can be rerun using the background from a 6-hour-previous ECM run; these are labeled INTERPGES runs
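The pseudo-radiosonde step above can be sketched as follows: walk the 1x1 degree analysis grid at each mandatory level and emit one observation record per point, optionally restricted to an overlay latitude band. This is an illustration only; the function, record layout, and the exact level list are our assumptions, not the operational encoding:

```python
import numpy as np

# Mandatory pressure levels (hPa); the exact operational list is assumed here.
MANDATORY_LEVELS = (1000, 850, 700, 500, 400, 300, 250, 200, 150, 100, 70, 50)

def make_pseudo_obs(field, lats, lons, levels=MANDATORY_LEVELS, lat_band=None):
    """Turn a gridded analysis field of shape (level, lat, lon) into a flat
    list of pseudo radiosonde observations, one per grid point and level.
    lat_band=(south, north) restricts output to an overlay region,
    e.g. (-70, -35) for a midlatitude OVRLY."""
    obs = []
    for k, p in enumerate(levels):
        for j, lat in enumerate(lats):
            if lat_band is not None and not (lat_band[0] <= lat <= lat_band[1]):
                continue  # outside the overlay "patch"
            for i, lon in enumerate(lons):
                obs.append({"lat": float(lat), "lon": float(lon),
                            "pressure_hPa": p, "value": float(field[k, j, i])})
    return obs
```

In an ECM run the full-globe list (no `lat_band`) is the sole input to the GSI; the OVRLY runs pass a latitude band so ECMWF values replace data only in the selected region.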

  5. October 2007 dropout cases, 10/21-10/22, 12, 18, 00, 06, 12Z. A trough in the central Pacific shows differences between ECMWF (no dropout) and GFS (had a dropout). (Map annotations: OVRLY “patch” box; select-observation delete area.) Detailed analysis shows significant differences in a number of wind maxima inside the broad Pacific trough. (First of a number of dropouts…)

  6. 5-Day Anomaly Correlation Scores at 500 hPa for Dropout Cases: ECM Performs Better than GFS (NH), 2007-2008. ECM runs are a good representation of the ECMWF analysis. OVRLY defines sensitive/potential areas for QC improvements. Dropouts are regime-dependent and causes may vary.

  7. 5-Day Anomaly Correlation Scores at 500 hPa for Dropout Cases: ECM Performs Better than GFS (SH). SH ECM runs consistently improve GFS forecasts for dropout events and in general. OVRLYm = midlatitude OVRLY; OVRLYp = polar OVRLY.

  8. midlatOVRLY results, Oct.-Dec. 2007. GFS = operational GFS; ECMWF = operational ECMWF; ECM = pseudo-obs of the ECMWF analysis run by the GFS; midlatOVRLY = ECMWF pseudo-obs between 35S-70S. midlatOVRLY results show significant improvements for dropout cases and are comparable to ECM runs for some cases.

  9. OVRLY and GES test results. GFS = operational GFS; ECMWF = operational ECMWF; ECM = pseudo-obs of the ECMWF analysis run by the GFS; midlatOVRLY = ECMWF pseudo-obs between 20S-60S; polarOVRLY = ECMWF pseudo-obs between 60S-90S; ecmanlGES = ECM guess + GDAS observations input to the GSI, then a GFS forecast; interpGES = 3-, 6-, and 9-hr forecasts from the previous ECM cycle used as the GES + GDAS obs input to the GSI, then a GFS forecast. Using ECM as a GES removes the dropout except when ECMWF has a dropout as well (e.g., 2008031812). Midlat vs. polar OVRLY runs tend to vary in how effectively they reproduce ECMWF scores.
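The experiment family on this slide differs only in where the GSI gets its background (GES) and its observations. The pairing can be summarized in a small table (labels paraphrased from the slide; the string values are illustrative tags, not real data-set names):

```python
# Background/observation pairing for each experiment, paraphrased from the slide.
EXPERIMENTS = {
    "GFS":         {"background": "gdas_6hr_forecast",            "observations": "operational"},
    "ECM":         {"background": "gdas_analysis",                "observations": "ecmwf_pseudo_obs_global"},
    "midlatOVRLY": {"background": "gdas_analysis",                "observations": "ecmwf_pseudo_obs_20S_60S"},
    "polarOVRLY":  {"background": "gdas_analysis",                "observations": "ecmwf_pseudo_obs_60S_90S"},
    "ecmanlGES":   {"background": "ecm_analysis",                 "observations": "operational"},
    "interpGES":   {"background": "previous_ecm_fcst_3_6_9hr",    "observations": "operational"},
}

def uses_operational_obs(name):
    """True for runs that keep the full operational observation stream."""
    return EXPERIMENTS[name]["observations"] == "operational"
```

The key contrast is that the GES experiments (ecmanlGES, interpGES) change only the background while keeping operational observations, which isolates the background's contribution to the dropout.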

  10. Experiments with Changing Data QC for GSI Input • Experiments are run using the GSI analysis with select suspect data deleted • The select data are identified by looking for observations in sensitive areas with large differences from the background that passed QC, with other known problems, or that the ECMWF did not draw for but the GSI did • The sensitive areas are identified by examining the sequence of forecast errors from hour 0 to 120 hours to estimate where a forecast error may originate, along with checking Langland’s plots of sensitivity (see next slide for an example) • Future sensitivity estimates will involve estimates from ensemble variances and other forecast stability measures • For NH dropouts, the sensitive area is often in the Pacific • So far, no obvious smoking-gun bad data have been found at hour zero. Further work is being done to look for problem data earlier than hour zero
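The selection step above, withholding observations with large background departures inside a sensitive area, can be sketched as follows (the function name, array layout, and 3-sigma threshold are our illustrative assumptions, not the GSI's actual gross-error check):

```python
import numpy as np

def flag_suspect_obs(obs, background, obs_error, in_sensitive_area, threshold=3.0):
    """Return a boolean mask of observations to withhold from a GSI rerun:
    those whose absolute departure from the background exceeds `threshold`
    times the assumed observation error AND that lie inside a
    pre-identified sensitive area (boolean array of the same length)."""
    departure = np.abs(np.asarray(obs) - np.asarray(background))
    suspect = departure > threshold * obs_error
    return suspect & np.asarray(in_sensitive_area)
```

Rerunning the analysis without the flagged observations, as on the next slide, then measures whether those data were actually responsible for the dropout.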

  11. Deletion of suspect satellite winds had a small impact on skill

  12. COPC Wants Real-Time Analysis of Forecast Skill Dropouts • The Committee on Operational Processing Centers (COPC), comprising the NCEP, Navy and Air Force forecast centers, wants procedures to analyze causes of forecast skill dropouts in real time • How can differences between the NCEP and FNMOC analyses and the ECMWF analysis be examined in real time? • How can T-PARC researchers help these efforts?

  13. Summary • ECM runs have shown the NCEP GFS can produce good forecast skill in most dropout cases by using ECMWF analyses as input • Using the ECM runs to improve the background for the GSI, or using the ECMWF analysis in limited areas, also helps forecast skill • Although one can find many cases of bad conventional (non-satellite-radiance) observational data that passed QC, it is difficult to find such examples in dropout cases
