Response to NCEP Reviews/NCEP Update

Presentation Transcript

  1. Response to NCEP Reviews/NCEP Update Dr. Louis W. Uccellini Director, NCEP 90th AMS Annual Meeting Atlanta, Georgia January 19, 2010 “Where America’s Climate, Weather, Ocean and Space Weather Services Begin”

  2. Outline • General response to UCAR reviews • 2009 NCEP highlights • Computer transition • Model implementations • Performance metrics • “Dropout Team” accomplishments • Prediction of December 8-9 Midwest Storm • Climate portal • Reanalysis status • Center highlights for 2009 • 2010 model implementations • Budget information • New building status • Summary

  3. Response to Reviews

  4. General Response • Deep appreciation for the amazing amount of work and dedication of the review teams • I believe many of the issues articulated can be addressed within NCEP resources • Other issues will necessitate efforts within the NOAA budget approval process • We have already begun acting on several of the recommendations (e.g., secured 2 FTEs for the SPC fire weather program) • Individual centers are assessing the reports and communicating with the workforce – laying out strategies for moving forward (e.g., EMC-NCO)

  5. UCAR Review of EMC • Recognition of EMC’s fundamental role in • Scientific maintenance of NCEP’s operational systems • Transition to operations of new capabilities • Enhancement of current operational capabilities • Recognition of high level of organizational achievement • Concerns and issues • Global system falling behind international competitors • Too many “models” • Interactions with the research community less than desirable • Less than adequate computing resources • Mismatch of human resources and scope of work • Working relationship with NCO • Recommendations • Single atmosphere-ocean-land surface modeling system • Implement an advanced data assimilation system across atmospheric applications • Work more closely with NCO on • Implementation process • Strategic issues (e.g., computing resources) • Improving relationships at all levels • More details will be available after the Review Team presentation at AMS next week

  6. EMC and NCO Response to UCAR Review Team • EMC should: • Have access to substantially increased computer capabilities • Match human resources to the stated operational mission • Embrace an entirely new approach to model development and implementation • focus on a single, powerful, flexible atmosphere-ocean-land surface modeling system • involve the entire national weather and climate modeling community • Employ data assimilation capabilities that are significantly advanced beyond those now used • NCEP must alleviate tensions between NCO and EMC • EMC and NCO Directors and staff recognize these tensions and are determined to alleviate them and improve working relationships at all levels • Since July 2009, the NCO and EMC Directors have been working together to: • Begin to improve and streamline the implementation process • Manage progress toward individual implementations on a weekly basis • Manage use of HPC resources at NCEP’s disposal (Resource Allocation Council) • Support operational applications from the NOAA National Ocean Service • Applications planned for 2010-2011 implementation • Great Lakes, Chesapeake Bay, Tampa Bay, Delaware Bay, Gulf of Mexico, Columbia River • Introduce Tsunami and Space Weather to NCEP’s operational applications • Advocate for increased HPC resources through NOAA HPC management and upper-level NOAA program management • EMC and NCO Directors are • Supportive of and enthusiastic about these developments • Determined to institute an atmosphere of full cooperation between NCO and EMC at management and working levels

  7. Next Steps • Continue to engage the Review Executive Board at each step to ensure the response meets recommendations, and provide semi-annual reports for the larger community • NCEP management meeting (January 28) • Develop a matrix for all recommendations • Track all related activities • Look to the FY10 budget to initiate changes • Separate out the longer-term issues and recommendations and begin engaging the review committee and the NOAA process for solutions • Explore various approaches for standing up an NCEP Advisory group; establish by end of FY10 • Longer term • NOAA/PPBES process (e.g., NOAA is already addressing the NCEP operational computer) • NWS OSIP process • NCEP “Summit” in March 2010 (organized by NWSHQ/OCWWS) • Engage NOAA’s Environmental Model Program (EMP) • All NOAA modeling, HPCC, and resource issues need to be worked through EMP • Engage NWSEO at each step

  8. 2009 NCEP Highlights

  9. Central Computer System (CCS) • Transition to IBM Power 6 complete; declared operational August 12, 2009 • 69.7 trillion calculations/sec – a factor of 4 increase over the IBM Power5 • 4,992 processors, 20 terabytes of memory, 330 terabytes of disk space • 1.7 billion observations/day; 27.8 million model fields/day • Primary: Gaithersburg, MD; Backup: Fairmont, WV – guaranteed switchover in 15 minutes • Web access to models as they run on the CCS • [Chart: popularity of NCEP Models web page, millions of hits, 2001–2009]

  10. FY2009 Implementations

  11. Record Value

  12. [Chart: day at which the forecast loses useful skill (anomaly correlation = 0.6), Northern Hemisphere calendar-year means; y-axis: forecast day]
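The AC = 0.6 threshold above refers to the anomaly correlation between forecast and verifying-analysis anomalies, the conventional cutoff for "useful" skill. A minimal sketch of the centered anomaly correlation follows; the synthetic fields are illustrative only and this is not NCEP's verification code:

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    """Centered anomaly correlation (AC) between a forecast field and its
    verifying analysis, both expressed as departures from climatology.
    An AC of 0.6 is the conventional threshold for useful forecast skill."""
    f = forecast - climatology          # forecast anomaly
    a = analysis - climatology          # analysis anomaly
    f = f - f.mean()                    # center the anomalies
    a = a - a.mean()
    return float((f * a).sum() / np.sqrt((f ** 2).sum() * (a ** 2).sum()))

# Illustration with a made-up 500 hPa height field (values are synthetic)
clim = np.full((10, 10), 5500.0)
analysis = clim + np.random.default_rng(0).normal(0.0, 50.0, (10, 10))
perfect = analysis.copy()
print(anomaly_correlation(perfect, analysis, clim))   # ~1.0 for a perfect forecast
```

A forecast identical to the verifying analysis scores ~1.0; a forecast whose anomalies have the wrong sign everywhere scores ~-1.0.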

  13. Placeholder for Dropout slides (2)

  14. GFS dropouts with scores ≤ 0.70 • October has produced a total of 17 NH dropouts over the last 3 years • NH dropouts decrease during the winter months • In general, almost 3 times as many dropouts occur in the SH as in the NH (36 NH vs. 93 SH)

  15. Actions Undertaken through “Drop Out” Team • Goal: Improve the GFS by using dropouts as case studies, measuring ECMWF–GFS differences and other statistics to solve QC and other related problems • Currently operational: • Updated the station dictionary (a dropout was relieved during the testing period) • Regional ATOVS Retransmission (RARS), resulting in more polar orbits • P5 to P6 transition change in data dump time for a longer data window • More prompt NESDIS response to satellite data issues • Pending: • Asymmetric satellite wind quality control • Real Time Data Monitoring System (RTDMS) extended to a 30-day archive

  16. Prediction of December 8-9 Midwest Storm • Dec 8-9 Midwest snowstorm model “spread” from 5 days out, with the white swath showing the spread of the ECMWF, Canadian, and GFS ensembles • Increasing realization that the NAM and SREF are becoming important tools for Day 1-3 forecasts of extreme events • 48 hours out, with the NAM, GFS, ECMWF, UKMET, and Canadian tracks plotted, the NAM was the left outlier, tracking the storm right over Milwaukee; the SREF had a similar track • Verification

  17. NCS Climate Portal: Data & Services NOAA is developing a comprehensive Climate Portal that provides ready access to all NOAA climate data, products, and services.

  18. NCEP Climate Forecast System Reanalysis • Complete as of November 2009 • A global, high resolution, coupled atmosphere-ocean-land surface-sea ice system for the period 1979-2009. • Atmosphere resolution: 38 km (T382), 64 levels extending to 0.26 hPa • Ocean resolution: 0.25 degree at the equator, 40 levels to 4737 m depth • Products available at hourly time resolution, 0.5 degree horizontal resolution, and at 37 standard pressure levels • CFSR products began being transmitted on December 7, 2009, to the NOAA National Climate Data Center (NCDC), the official dissemination outlet for the CFSR. NCAR will also host a copy of the CFSR. • CPC is in the process of generating its operational climate diagnostics products from the CFSR data. • An operational implementation of the entire CFSR system, including all hindcast model runs, is scheduled for Q1 of FY11.

  19. Highlights from 2009 • NCO – Implementation of Power6; migrating NAWIPS to AWIPS2 • EMC – Porting the NCEP Production System to the Power 6 system (with NCO) • HPC – Operational implementation of the forecast graphics and narratives for Alaska • OPC – Ocean surface current and SST forecast (experimental) • CPC – Contribution to the NOAA Climate Portal • AWC – Operational production of G-AIRMET • SPC – Routine issuance of high-time-resolution probabilistic thunderstorm forecasts in support of the aviation and severe weather communities • NHC – Continued reduction of absolute track errors and increased skill in track forecasts for critical decision times • SWPC – Secured budget increase for operational transition; initiated WSA-Enlil model transition

  20. FY2010 Model Implementations • SREF: Increase resolution to 32 km • GEFS/NAEFS: T126 → T190 • GFS: Increase resolution to T574 (27 km) • NAEFS: Include FNMOC • WaveWatch III: Global multi-grid wave model • HIRES window: Improved high-resolution WRF model • Test mode: GSFC/GMAO/NCEP 4D-Var data assimilation is proceeding
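The T-numbers above (T126, T190, T574) denote triangular spectral truncation, and the quoted kilometre figures are approximate equivalent grid spacings. One common rule of thumb for a quadratic Gaussian grid divides Earth's circumference by 3N+1; this is an assumed convention for illustration, not NCEP's official conversion, and published figures (e.g., 27 km for T574) can differ depending on the grid convention used:

```python
import math

EARTH_RADIUS_KM = 6371.0

def grid_spacing_km(truncation):
    """Rough equatorial grid spacing for triangular truncation T_N,
    assuming a quadratic Gaussian grid with 3N+1 points around a latitude
    circle (one common convention; others yield somewhat different numbers)."""
    circumference = 2.0 * math.pi * EARTH_RADIUS_KM
    return circumference / (3 * truncation + 1)

for t in (126, 190, 382, 574):
    print(f"T{t}: ~{grid_spacing_km(t):.0f} km")
```

Under this convention T382 comes out near 35 km and T574 near 23 km, the same order as the 38 km and 27 km quoted in the slides; the spread reflects differing resolution conventions rather than an error in either number.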

  21. FY2010 Budget Information

  22. NCEP Historical Base Funding (Direct Appropriation): $106.9M • Another $10-$15M from other funding sources

  23. New Building Status • Work stoppage in December 2008 when 75% complete • Developer filed Federal lawsuit to recover “damages” from U.S. Government • Developer filed for bankruptcy, June 2009 (in County Court) • Receiver appointed by County Court in August 2009 and has taken over security, dehumidification services and repaired water leaks • GSA continuing negotiations with Receiver to enable construction of facility to be completed • Federal lawsuit heard on January 11, 2010; judgment expected in February • Most optimistic schedule shows work resuming in July 2010 and NCWCP ready for occupancy in July 2011

  24. Summary • Positioning NCEP to respond to review in most positive, timely manner • Working an aggressive model implementation schedule – addressing the “Drop Out” issue • Hopeful on new building issue, but still in the courts

  25. Appendix

  26. Reaching Our Goals • Goal: Improve the GFS by using dropouts as case studies, measuring ECMWF–GFS differences and other statistics to solve QC problems • Extended the network of partners upstream of the data flow concerned with dropouts (NESDIS, NRL, COPC,…) • Developed an additional framework for parallel experiments on raw observations (dumps) from data tanks, e.g., station dictionary changes • Performed impact experiments modifying or withholding select observations within the PREPBUFR or non-conventional files • Comparisons between the GFS and ECMWF using controlled (ECM) experiments to show the horizontal and vertical location of the initial-condition differences that caused dropouts • Established standard procedures for when dropouts occur • High-resolution (91-layer) ECMWF analysis in addition to low-resolution (ECM) experiments • Compiled statistics on how the GSI draws to observation types to quantify inherent biases and implement improvements to QC programs – an ongoing program • Real Time Data Monitoring System (RTDMS) extended to 30 days

  27. FY2010 Implementations
