DTC Management Board Meeting, 24 January 2013
Presentation Transcript

  1. DTC Management Board Meeting, 24th January 2013 (Bill Kuo)

  2. Topics for DTC MB meeting: • AOP 2013 plan and priorities • Execution of AOP 2013: • NOAA-UCAR Cooperative Agreement • Transition to next phase of DTC • DTC Science Advisory Board membership • DTC sponsors’ funding and priorities • DTC Executive Committee (EC) meeting preparation • Challenges

  3. Planning for AOP 2013 • Preliminary guidance for the DTC is a ‘flat budget’; however, potential cuts are possible. • DTC has taken into account the initial guidance provided by the DTC MB during the October 2012 MB meeting • Final budget for the DTC will most likely not be known until Spring 2013 • DTC Task Leads will present: • Accomplishments in 2012 • Proposal for 2013 • DTC MB will need to decide: • What tasks fall within the 85% budget level? • The ranking of tasks between the 85% and 100% budget levels

  4. DTC funding sources (in $K) * Due to period of performance for some projects, some funding is ‘committed carry-over’ for AOP 2012.

  5. DTC budget allocations (in $K) *Note verification support for T&E activities is included in the Verification task in AOP 2012. In AOP 2011, they resided with task areas conducting the test.

  6. Execution of AOP 2013 • DTC AOP has adopted the period of performance of March 1 – February 28. • Current NWS-UCAR Cooperative Agreement (CA) will end by August 2013. NWS indicated that: • NCAR budget through August 2013 can be transferred via NWS-UCAR CA • Remainder of funds needs to be transferred using OAR-NSF CA (subject to NSF cost-recovery fee) • NOAA will administer a competitive RFP in 2013 for the next DTC CA • When will the process be completed? • How would this impact AOP 2014 planning?

  7. DTC Science Advisory Board • The DTC Science Advisory Board consists of 14 members. • Six SAB members whose terms will expire by June 2013: • Brian Colle, SUNY Stony Brook • James Doyle, NRL • Bob Dumais, ARL • Cliff Mass, U. of Washington (chair) • Tom Henderson, ESRL • David Bright, AWC • DTC MB needs to discuss: • Who should be retained? • Nomination of new members • Is the operation of the SAB effective? Suggestions for change?

  8. DTC Science Advisory Board

  9. DTC Sponsors’ Funding and Priorities • DTC funding sponsors include NOAA/OAR, NOAA/HFIP, USWRP, GSD, AFWA, NCAR, NSF • HFIP, USWRP, AFWA and NSF have provided guidance on the allocation of their funding to support specific tasks that fall within the core areas of the DTC • NOAA/OAR, GSD, and NCAR have allowed flexibility to allocate funds according to priorities set by the DTC MB • Recently, questions have been raised with regard to the allocation of NOAA/OAR funds: • Should NOAA (or OAR) provide guidance on the priority for NOAA/OAR funds (instead of the entire DTC MB)? • This change would have a significant impact on DTC operation, as NOAA/OAR is the largest funding source for the DTC

  10. DTC EC Meeting Preparation • DTC EC meeting will be held 12 Feb 2013 in Silver Spring • Executive Committee will: • Review and approve DTC AOP 2013 and priorities • Review and approve DTC SAB membership • DTC MB needs to identify important issues/topics for DTC EC discussion • Next phase of DTC, RFP process, use of NOAA/OAR funds • Future direction of DTC • DTC partnership between NOAA, AFWA, NCAR, and NSF • Others?

  11. Challenges • Some DTC task areas are becoming subcritical due to budget reductions: • Should we consider consolidating task areas? • Should the DTC focus on certain aspects (e.g., physics)? • The SAB recommended the DTC put greater emphasis on T&E at the expense of community support. • Are we ready to take a large step back from conducting tutorials for our publicly released software packages? • Should we reduce the number of packages that we support for the community? • How can the DTC be more effective in R2O? • Development and testing of next-generation NWP systems • Migration toward a unified modeling system (a recommendation from the UCACN) • Development of a 10-year strategic plan for EMC • Establishment and operation of an “ECMWF-like” facility

  12. Summary • The DTC has come a long way toward establishing a community facility with a robust management structure and planning process, strong community connections, and partnership among sponsors. • The joint decision-making process by the DTC MB is a very important mechanism for: • Setting priorities for the DTC • Maintaining the partnership among sponsors • Setting the future direction of the DTC

  13. Community Interactions Louisa Bogar Nance

  14. Outline • Software Systems • Community Outreach Events

  15. Software Systems • A framework for bringing together operational capabilities and research innovations to accelerate the transition of new technology into operations by facilitating carefully controlled, extensive T&E. • Close collaboration between the DTC & developers is critical to the success of this work!

  16. Software System Philosophy • Shared resource w/ distributed development that includes capabilities of current operational systems • Ongoing development maintained under a mutually agreed-upon software management plan • Code repository maintained under version control software • Protocols for proposing & approving modifications to the software • Testing standards • Code review committee • Additional testing standards to more thoroughly check the integrity of the evolving code base

  17. Accelerating R2O Transitions for GSI, Before and After: An Example • Timeline (2008-2012): in 2008, GSD initiates merging the cloud analysis code into the existing GSI; GSD/DTC later commit code changes to the GSI trunk as a trial case to set up the GSI R2O transition procedure • Issues related to: version control, code portability, development coordination, a formal code commit procedure for external groups, coding standards, standard pre-commit tests • Can we wait this long for any other R2O transition?

  18. Accelerating R2O Transitions for GSI, Before and After: An Example • 2008: GSD initiates merging the code into the existing GSI • 2009: Establish DTC community GSI repository with the multi-platform feature; GSD starts to use the GSI repository; establish EMC operational GSI repository • 2010: DTC leads effort to form the GSI Review Committee (GRC) and set up the GSI R2O procedure (including repository syncing); GSD/DTC commit code changes to the GSI trunk(s) as a trial case to set up the GSI R2O transition procedure • 2011: DTC commits portability-related changes to the GSI trunk(s); all GRC members start to follow the R2O transition procedure • 2012: GRC finalizes the R2O procedure; now it takes about one week for the GSI Review Committee to review a code change proposal and about one day to commit • Lessons learned through the DTC’s experience with GSI are being applied to all other software systems we work with!

  19. Current Software Systems • WRF – NWP model + pre- and post-processors • UPP – New package in 2011 – community code mgmt plan in process of being implemented • Model Evaluation Tools (MET) – verification package • Gridpoint Statistical Interpolation (GSI) data assimilation system, including GSI-hybrid capability • WRF for Hurricanes – set of tools for tropical storm forecasting, including the coupled atmosphere-ocean system & the stand-alone GFDL vortex tracker • Modular end-to-end ensemble system (repository & code mgmt plan established during AOP 2011) • NOAA Environmental Modeling System (NEMS) (repository established during AOP 2011 – working towards a code mgmt plan) • Software system management for GSI, NEMS and the ensemble systems has greatly benefited from having DTC staff members, Hui Shao and Eugene Mirvis, co-located with EMC staff!

  20. Making New Capabilities Available to Operations (2009-present) WRF (including atmospheric component of HWRF) • Enhanced interoperability for NMM-E, including moving nest • Radiation – RRTMG • Cumulus – Tiedtke, NSAS and Grell (uncoupled only) • New capabilities (3-nest) and physics updates from AOML/HRD HWRF • Extension of POM coupling to Eastern Pacific (URI) GSI • NCAR/MMM’s aerosol optical depth data assimilation function • Numerous contributions from GSD (e.g., cloud analysis) and GMAO (both new features and enhancements to existing features) MET • Baldwin-Elmore spatial significance tool (DTC Visitor Program) • Capability to convert TRMM satellite data into MET readable format

  21. Contributions to Operational Software Systems • Ensemble system • Bias correction and downscaling for SREF • Ensemble Kalman Filter • Working w/ EMC & ESRL to set up a code management plan – DTC will likely be a major contributor to this effort • NEMS • Enhanced portability (DTC staff and DTC Visitor Program) • Co-leading movement towards a common repository for external NCEP libraries

  22. Publicly Released Packages • Philosophy: • Periodic releases made available to the community that include the latest developments of new capabilities & techniques • Additional testing, including multiple computing platforms and compiler options • Centralized support (in collaboration with developers): software downloads, documentation, email helpdesk, tutorials (online and onsite) • Current packages: WRF, UPP, HWRF, GFDL vortex tracker, GSI, MET

  23. Registered Users *All WRF users were required to re-register starting in 2008 – the number corresponds to those who have registered since 2008

  24. AOP 2012 Code Releases and Tutorials

  25. AOP 2013 Code Releases and Tutorials (85-100%)

  26. AOP 2013 Code Releases and Tutorials (Just beyond 100%)

  27. Software System for AOP 2013 – 85%

  28. Discussion Items for Software Systems • Maintenance of the SREF code repository does not currently fall within the 100% scenario (only the NEMS portion of the work is currently part of the plan) - what does this mean for the future of this repository? • Community support - are we ready to take a large step back from conducting tutorials for our publicly released software packages? • What type of planning should the DTC be doing with respect to the potential upcoming transition of HWRF to NEMS?

  29. Community Outreach Events • An important mechanism for bringing together research and operations to discuss how to work together to advance NWP

  30. Outreach efforts – AOP 2012 • DTC-sponsored events • Mesoscale Modeling • Annual WRF Users Workshop (25-29 Jun 2012) • Ensembles • Mini-workshop w/ GIFS-TIGGE working group (June 2012) • DTC & NUOPC Ensemble Design Workshop (10-12 Sept 2012) • Verification – invited presentations • Unidata Triennial Users Workshop (3; July 2012) • Earthcube Workshop (Dec 2012) • Full-day tutorial for Turkish Met Service

  31. DTC & NUOPC Ensemble Design Workshop • Focus: quantification & characterization of uncertainty • Main conclusion: a more scientific approach is needed to answer ensemble design questions • Plan for the future: • Establish a standard set of metrics that will allow for useful inter-comparison of ensemble formulations & ways of dealing w/ uncertainty • Establish a small set of target parameters • Establish a global ensemble data archive for research (e.g., ensemble ICs and perturbations, forecasts from major centers, etc.) • Establish a clean experimental program • BAMS paper by Scott Sandgathe, Brian Etherton, Barb Brown & Ed Tollerud

  32. DTC-Sponsored Events – AOP 2013 • 85% scenario: • Annual WRF Users Workshop • Proposed but did not make 100%: • Verification workshop • GSI Workshop • Given the DTC’s mission to serve as a bridge between the research and operational NWP communities, are we stepping back too much from sponsoring workshops?

  33. Mesoscale Modeling Jamie Wolff Collaborators: NOAA’s Environmental Modeling Center NOAA’s Earth System Research Laboratory NCAR’s Mesoscale and Microscale Meteorology Division North Carolina State University División de Energías Renovables, CIEMAT, Madrid, Spain University of Washington

  34. Mesoscale Modeling AOP 2012 Activities

  35. Key Accomplishments Inter-comparison Testing and Evaluation MMET

  36. WRF Testing and Evaluation (T&E) • End-to-end system: WPS, WRFDA, WRF, UPP, and MET • Test Period: 1 July 2011 – 29 June 2012 • Retrospective forecasts: 48-h warm start forecasts initialized every 36 h w/ DA • Domain: 15-km CONUS grid • Evaluation: • Surface and Upper Air ((BC)RMSE, bias) • Temperature, Dew Point Temperature, Winds • Precipitation (GSS, frequency bias) • 3-h and 24-h accumulations • GO Index • Statistical Significance Assessment • Compute confidence intervals (CI) at the 99% level • Apply pair-wise difference methodology • Compute statistical significance (SS) and practical significance (PS)
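The precipitation metrics named on this slide (GSS and frequency bias) are derived from a 2x2 contingency table of forecast vs. observed threshold exceedances. A minimal sketch with illustrative function names (not the MET implementation):

```python
# Sketch of the precipitation verification metrics above, computed from a
# 2x2 contingency table. Function and variable names are illustrative.

def contingency_table(fcst, obs, threshold):
    """Count hits, misses, false alarms, and correct negatives."""
    hits = misses = false_alarms = correct_negs = 0
    for f, o in zip(fcst, obs):
        fcst_event, obs_event = f >= threshold, o >= threshold
        if fcst_event and obs_event:
            hits += 1
        elif obs_event:
            misses += 1
        elif fcst_event:
            false_alarms += 1
        else:
            correct_negs += 1
    return hits, misses, false_alarms, correct_negs

def frequency_bias(hits, misses, false_alarms):
    """Ratio of forecast event count to observed event count (1 = unbiased)."""
    return (hits + false_alarms) / (hits + misses)

def gilbert_skill_score(hits, misses, false_alarms, correct_negs):
    """GSS (equitable threat score): threat score adjusted for random hits."""
    total = hits + misses + false_alarms + correct_negs
    hits_random = (hits + misses) * (hits + false_alarms) / total
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)
```

These would be computed for each accumulation interval (3-h and 24-h, per the slide) and threshold before aggregation.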

  37. WRF Inter-comparison T&E • Functionally similar operational environment testing • WRF Data Assimilation and 6-hr warm start • WRFDAv3.3.1 + WRFv3.3.1 w/ LoBCs from LIS w/ Noahv2.7.1 • WRFDAv3.4 + WRFv3.4 w/ LoBCs from LIS w/ Noahv2.7.1 • WRFDAv3.4 + WRFv3.4 w/ LoBCs from LIS w/ Noahv3.3 • Evaluation included: • Impact assessment of WRF system version • Performance assessment of the LIS input data set

  38. Background Error Files • Used gen_be to produce seasonal background error covariance files • Cold start cases initialized at 00 and 12 UTC daily for ~15 days during each season (GFS only - no SST or LIS) • Pseudo single observation test: resulting analysis increment from a single observation of the v-component of the wind with a 1 m/s innovation • Summer: 2012072118; Winter: 2012011906

  39. WRF v3.3.1–v3.4 Results • SS (light shading) and PS (dark shading) pair-wise differences for the annual aggregation of surface temperature, dew point, and wind BCRMSE and bias, aggregated over the full set of cases and the entire integration domain
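The pair-wise difference methodology behind the SS shading can be sketched as follows. The helper names are hypothetical, the z value approximates the 99% level, and the DTC's actual interval construction may differ:

```python
# Sketch of the pair-wise difference significance test: for each matched
# case, take the difference of an error statistic (e.g., BCRMSE) between
# two configurations, then form a confidence interval on the mean
# difference. If the interval excludes zero, the difference is
# statistically significant. Names and the CI construction are
# illustrative assumptions, not the operational implementation.
import math
import statistics

def pairwise_diff_ci(errors_a, errors_b, z=2.576):
    """Return (mean_diff, lower, upper) for paired case-by-case differences.

    z = 2.576 approximates a 99% normal-theory confidence interval.
    """
    diffs = [a - b for a, b in zip(errors_a, errors_b)]
    mean = statistics.mean(diffs)
    std_err = statistics.stdev(diffs) / math.sqrt(len(diffs))
    return mean, mean - z * std_err, mean + z * std_err

def is_significant(lower, upper):
    """Significant when the confidence interval excludes zero."""
    return lower > 0 or upper < 0
```

Pairing the cases removes case-to-case variability that both configurations share, which is why matched retrospective forecasts are required.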

  40. Regional Temperature Bias Verification • Panels: WRF v3.3.1 w/ Noah v2.7.1 vs. WRF v3.4 w/ Noah v2.7.1, 00 UTC 12-h and 24-h forecasts

  41. GO Index Version Difference

  42. Key Accomplishments Inter-comparison Testing and Evaluation MMET

  43. Testing Protocol Motivation • Wide range of NWP science innovations under development in the research community • A testing protocol is imperative to advance innovations through the research-to-operations (R2O) process efficiently and effectively • Three-stage process: • Proving ground for the research community • Comprehensive T&E performed by the DTC • Pre-implementation testing at operational centers

  44. Mesoscale Model Evaluation Testbed (MMET) • What: A mechanism to assist the research community with the initial stage of testing, to efficiently demonstrate the merits of a new development • Provide model input and observational datasets to utilize for testing • Establish and publicize baseline results for select operational models • Provide a common framework for testing; allow for direct comparisons • Where: Hosted by the DTC; served through the Repository for Archiving, Managing and Accessing Diverse DAta (RAMADDA) www.dtcenter.org/eval/mmet

  45. MMET Cases • Initial solicitation of cases from DTC Science Advisory Board members and Physics Workshop participants – great response and enthusiasm towards the endeavor • Cases currently available within MMET: • 20090228 – Mid-Atlantic snow storm where the North American Mesoscale (NAM) model produced high QPF shifted too far north • 20090311 – High dew point predictions by the NAM over the upper Midwest and in areas of snow • 20091007 – High-Resolution Window (HIRESW) runs underperformed compared to the coarser NAM model • 20091217 – “Snowpocalypse ’09”: NAM produced high QPF over the mid-Atlantic, with a lack of cessation of precipitation associated with decreasing cloud tops over eastern North Carolina • 20100428-0504 – Historic Tennessee flooding associated with an atmospheric river event • 20110404 – Record-breaking severe report day • 20110518-26 – Extended period of severe weather outbreaks covering much of the Midwest and extending into the eastern states later in the period • 20111128 – Cutoff low over the SW US; the NAM had difficulties throughout the winter breaking down cutoff lows and progressing them eastward • 20120203-05 – Snow storm over Colorado, Nebraska, etc.; the NAM predicted too little precipitation in the warm sector and too much snow north of the front (a persistent bias)

  46. User Case #1 (Jimenez and Dudhia) • 20100428-20100504 – Extended case focused on the historic Tennessee flooding event • Forecasts: • WRF v3.4 ARW baseline configuration namelist from the DTC • WRF v3.4 ARW namelist with topo_wind=1 activated • CONUS domain at 15-km resolution • Utilized IC and BC files provided by the DTC for model initialization • Utilized observation files provided by the DTC for verification
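The topo_wind switch referenced here is a WRF &physics namelist option that enables the Jimenez-Dudhia topographic correction for surface winds. A minimal, hedged fragment of namelist.input (all other physics options omitted; consult the WRF documentation for the exact per-version settings):

```fortran
&physics
 ! Topographic correction for surface winds (Jimenez-Dudhia scheme).
 ! 0 = off (default); 1 = on. Specified per domain.
 topo_wind = 1,
/
```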

  47. User Case #1 (Jimenez and Dudhia) Wind Speed Time Series

  48. User Case #1 (Jimenez and Dudhia) • Wind Speed Error (topo_wind=1) • 00 UTC 20100428 through 00 UTC 20100504 (every 3 hours) • Plot: average wind speed across the domain, topo_wind=1 vs. observed (shading indicates underforecast/overforecast)

  49. User Case #1 (Jimenez and Dudhia) • Wind Speed 6-day Average Error (default vs. topo_wind=1) • Status of testing: • Overall 6-day domain-average error with topo_wind=1 is smaller than with the default • Reduces the diurnal mean bias but does not capture the full diurnal amplitude • Looking into whether reduced convective mixing and vertical transport of momentum are causing the overall lower speeds

  50. Proposed Activities for 2013