
Working Group 2 Report: Uncertainty & Parameter Estimation


Presentation Transcript


  1. Working Group 2 Report: Uncertainty & Parameter Estimation Tom Nicholson, Co-Chair Mary Hill, Co-Chair Annual ISCMEM Public Meeting November 29, 2011 Rockville, MD

  2. Outline • WG 2 Objective and Goals • Members and Participants • Activities, Technical Projects and Proposals • Products and Presentations at WG2 Meetings • Future Challenges • Recommendations for FY2012

  3. WG2 Objective • Coordinate ongoing and new research conducted by U.S. Federal agencies that focuses on parameter estimation methods and uncertainty assessment strategies and techniques to support environmental models and applications. What is needed to achieve this objective? Coordination of research staff and their management through wiser use of our limited resources.

  4. WG2 Goals • Basics: Develop a common understanding of parameter estimation in the context of model development, and the sources of uncertainty in the context of model predictions. Develop a common terminology. • Existing Tools: Identify, evaluate, and compare available uncertainty analysis strategies, tools and software. • New Tools: Develop and apply new parameter estimation, uncertainty, and sensitivity analysis methodologies. • Exchange: Facilitate exchange of techniques through teleconferences, technical workshops, professional meetings, and ISCMEM Web site links, and interaction with other WGs. • Communicate: Develop ways to better communicate uncertainty to decision makers (measures, visualization).

  5. Members and Participants: Tom Nicholson, NRC; Mary Hill, USGS; Todd Anderson, DOE; Tommy Bohrmann, EPA-Athens; Larry Deschaine, HydroGeologic, Inc.; Earl Edris, USACOE; Boris Faybishenko, LBNL; Pierre Glynn, USGS; Philip Meyer, PNNL; Yakov Pachepsky, USDA-ARS; Tom Purucker, EPA-Athens; Yoram Rubin, UC Berkeley; Brian Skahill, USACOE; Matt Tonkin, SSPA; Gene Whelan, EPA-Athens; Steve Yabusaki, PNNL; Ming Ye, Florida State U; Ming Zhu, DOE

  6. Activities: Teleconferences We conduct teleconferences to: • review and discuss ongoing research studies and software development • Uncertainty Methodologies • Model Abstraction • Field projects involving application of parameter estimation and uncertainty methods • formulate proposals for field applications of parameter estimation and uncertainty methods • communicate upcoming training, workshops and technical meetings • develop tool box of methods and software • exchange technical reports and software • identify databases and information sources (e.g., Web site addresses and field study databases).

  7. Activities: Conference sessions • Organized AGU session on Uncertainty Assessment, Optimization, and Sensitivity Analysis in Integrated Hydrologic Modeling as Application of Hydroinformatics, 2011 AGU Fall Meeting, San Francisco, CA. • WG Co-Chair chaired the Statistical and Applied Mathematical Sciences Institute (SAMSI) Workshop on Uncertainty in the Geosciences, September 2011, Research Triangle Park, NC. • WG Co-Chair participated in the International Conference on Radioecology and Environmental Radioactivity, Hamilton, Ontario, June 2011, with a focus on field applications of environmental models (e.g., Fukushima and Chernobyl sites).

  8. Activities: Computer Software • JUPITER API – Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface, published for building computer programs that analyze process models (USGS and EPA) (Banta and others, 2006) • Hydrologic Conceptual Model, Parameter and Scenario Uncertainty Methodology (PNNL and NRC) • Model Abstraction Techniques for model structure and parameter estimation (ARS and NRC) • Organized Sessions at the Fall AGU Meeting 2005 – 2011

  9. Seminars at WG2 Teleconferences in FY2011 • Joint Working Groups Briefing on the Chernobyl Cooling Pond Decommissioning and Remediation Proposal (as Case Study for Improving Scientific Basis for Multimedia Environmental Modeling and Risk Assessment) by Boris Faybishenko, Lawrence Berkeley National Laboratory, on August 3, 2011, to provide comments and questions for the upcoming meeting in Kiev, Ukraine. • Update Report on the Chernobyl Cooling Pond Proposal by Boris Faybishenko, Lawrence Berkeley National Laboratory, in October 2011, to provide comments and questions for the upcoming meeting in Kiev, Ukraine. • The “How” of Environmental Modeling: Toward Enhanced Transparency and Refutability by Mary Hill, USGS, on November 7, 2011, to discuss advantages of establishing a base set of model sensitivity analysis and uncertainty evaluation measures, to be used along with any other performance measures of interest.

  10. Presentations at WG2 Meetings in FY2011 • Chernobyl Cooling Pond Decommissioning and Remediation Proposal (as Case Study for Improving Scientific Basis for Multimedia Environmental Modeling and Risk Assessment) by Boris Faybishenko, Lawrence Berkeley National Laboratory, on August 3, 2011, to provide comments and questions for the upcoming meeting in Kiev, Ukraine.

  11. Presentations at WG2 Meetings in FY2011 • The “How” of Environmental Modeling: Toward Enhanced Transparency and Refutability by Mary Hill, USGS on November 7, 2011 to discuss advantages of establishing a base set of model sensitivity analysis and uncertainty evaluation measures, to be used along with any other measures of interest.

  12. MODEL FIT (Misfit / Overfit)
  • How to include many data types with variable quality, as typical of environmental systems? Error-based weighting, single objective function
  • Is model misfit or overfit a problem? Maximum likelihood variance

  MODEL SENSITIVITY ANALYSIS and UNCERTAINTY EVALUATION
  Observations → Parameters
  • Which parameters can be estimated with the observations? b/SDb, CSS-PCC
  • Which observations are most important to the parameters? Leverage, Cook’s D
  • Are any parameters dominated by one observation and, thus, its error? Leverage, DFBETAS
  • How certain are the estimated parameter values? b/SDb, parameter confidence intervals: linear, nonlinear*, bootstrap*

  Parameters → Predictions
  • Which parameters are most important to the predictions? PPR
  • How certain are the predictions? z/SDz, prediction confidence intervals: linear, nonlinear*, Monte Carlo*

  Observations → Predictions
  • Which observations are most important to the predictions? OPR, cross-validation*
  • Which of many conceptual models are likely to have the best predictions? AICc, BIC, KIC

  *takes many model runs
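Several of the measures named on slide 12 (leverage, Cook's D, DFBETAS, linear parameter confidence intervals, AICc/BIC) come from classical linear-regression diagnostics. The following is a minimal illustrative sketch in Python/NumPy on synthetic data, not the Working Group's software: the data, model, and variable names are hypothetical, and for brevity DFBETAS uses the full-sample error variance rather than the usual leave-one-out variance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 30, 2                                    # observations, parameters
X = np.column_stack([np.ones(n), rng.uniform(0.0, 10.0, n)])   # design matrix
y = X @ np.array([1.0, 0.5]) + rng.normal(0.0, 0.3, n)         # synthetic observations

b, *_ = np.linalg.lstsq(X, y, rcond=None)       # least-squares parameter estimates
resid = y - X @ b
s2 = resid @ resid / (n - k)                    # error variance estimate

XtX_inv = np.linalg.inv(X.T @ X)
h = np.einsum('ij,jk,ik->i', X, XtX_inv, X)     # leverage: diagonal of hat matrix

# Cook's D: influence of each observation on the whole fit
cooks_d = resid**2 / (k * s2) * h / (1.0 - h)**2

# DFBETAS: scaled change in each parameter if observation i were dropped
c = XtX_inv @ X.T                               # c[j, i] = d b_j / d y_i
dfbetas = (c * resid / (1.0 - h)) / np.sqrt(s2 * np.diag(XtX_inv))[:, None]

# Linear 95% parameter confidence intervals: b +/- t * SD(b)
sd_b = np.sqrt(s2 * np.diag(XtX_inv))
t = stats.t.ppf(0.975, n - k)
ci = np.column_stack([b - t * sd_b, b + t * sd_b])

# Information criteria for ranking alternative model structures (Gaussian errors)
ssr = resid @ resid
aic = n * np.log(ssr / n) + 2 * k
aicc = aic + 2.0 * k * (k + 1) / (n - k - 1)    # small-sample correction
bic = n * np.log(ssr / n) + k * np.log(n)
```

The nonlinear, bootstrap, and Monte Carlo interval variants starred on the slide replace these closed-form linear expressions with repeated model runs, which is why the slide flags them as expensive.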

  13. WG2 Strategy Energize the science and technology through closer linkage to decision making: • better understand the methods being used in parameter estimation and uncertainty analyses • establish a base set of model sensitivity analysis and uncertainty evaluation measures, in addition to the other performance measures • use and compare different methods in practical situations

  14. Future Challenges Integrate Parameter Estimation and Uncertainty Strategies and Techniques through: • Testing and demonstrating parameter estimation and uncertainty techniques using cooperative field studies and existing datasets • Organizing Technical Workshops, in cooperation with the other WGs, which focus on the WG2 Recommended Proposals • Working with EPA-Athens to identify additional uncertainty, sensitivity and calibration tools and applications for their Website list of tools (e.g., COSU-API, JUPITER, PEST, UCODE) • Coordinating uncertainty assessment and parameter estimation methods and software with other ISCMEM WGs, the Federal technical community, and international partners (e.g., Canadian Nuclear Safety Commission, IAEA) • Identifying training opportunities and posting manuals and example applications for the EPA-Athens Website Listings and iemHUB

  15. Recommendations for FY2012 • Assist development and creation of other working groups • Take advantage of the relevance of uncertainty and parameter estimation to all environmental modeling and monitoring fields. • Develop and conduct joint ISCMEM teleconferences • WG1 (Software System Design; design of uncertainty and parameter estimation software and data fusion) • WG3 (Reactive Transport Models and Monitoring; support decision making) • Act as an incubator to build support for new ideas • Proposed WG on monitoring based on the importance of monitoring to uncertainty and parameter estimation, and vice versa • Sponsor technical workshops on endorsed studies • Endorsed studies: Naturita, CO; Hanford 300, WA; OPE3 Beltsville, MD • Potential future international case studies (Chernobyl Cooling Pond) • ISCMEM Website • Revise under EPA’s iemHUB to enhance Information Transfer of Technical Reports and Data Sources
