
Modeling Complex Systems – How Much Detail is Appropriate?



Presentation Transcript


  1. Modeling Complex Systems – How Much Detail is Appropriate? David W. Esh, US Nuclear Regulatory Commission. 2007 GoldSim User Conference, October 23-25, 2007, San Francisco, CA

  2. Overview • Background • Model development process • Model complexity • Model abstraction • Examples • Conclusions

  3. Background • The issue of how much detail to include in models of complex systems is not new. • 14th-century philosophers such as William of Ockham were already weighing different approaches to explaining the world around them. • Decisions regarding model complexity apply to all fields of study. • Modern tools and computational capabilities present unique opportunities.

  4. Model – representation of essential aspects of a system (existing or planned)

  5. Model Development Process: Key Questions • Why are you using a model? • What is the purpose of your model? • Who is your audience? • What are your resources?

  6. Model Development Process: Key Questions • Why are you using a model? • Developing understanding (integrating, generalizing, testing) • Directing research (identify data gaps, propose new lines of research) • Representing reality (prohibitively costly or can’t observe) • What is the purpose of your model? • Is the decision controversial? • Is it high risk? ($, safety, etc.)

  7. Model Development Process: Key Questions • Who is your audience? • Technical, lay person, policy • High competency, low competency, mix • What are your resources? • Now and future • Computational • Time • For collection of additional information

  8. Model Development Process: Example. What is Performance Assessment? • Systematic analysis of what could happen at a site • What can happen? • How likely is it? • What can result?
  Why use it? • Complex system • Systematic way to evaluate data • Internationally accepted approach
  How is it conducted? • Collect data • Develop scientific models • Develop computer code • Analyze results
  NRC would require a Performance Assessment to: • Provide site and design data • Describe barriers that isolate waste • Evaluate features, events, and processes that affect safety • Provide technical basis for models and inputs • Account for variability and uncertainty • Evaluate results from alternative models, as needed
  [Diagram: Overview of Performance Assessment, a learning process – Site Selection and Characterization → Collect Data → Site Characteristics / Design and Waste Form → Develop Conceptual Models → Develop Numerical and Computer Models → Combine Models and Estimate Effects]

  9. Model Complexity Goals: • Simple is better (all things equal) • Broader scope • Systematic approach Metrics: • Accuracy • Explanatory Power • Reliability and Validity “Theories should be as simple as possible, but no simpler.”

  10. Model Complexity • Can improve model fit (But does it improve explanatory power?) • Can identify the need for enhancements • Increases difficulty in understanding • Increases difficulty in working with it • Increases computational burden So how do I decide?

  11. Model Complexity – How Much? [Chart: complexity vs. effort of approaches] • Comparison of features • Mass balance (watershed) • GIS-based analysis • Model comparisons • Analogs • Long-term field experiments • Isotopic studies

  12. Model Complexity – How Much? • No complete methodologies (generally) • Iteration (+/- interactions) • Statistical analysis of results • Visualization (data and output) • Metamodels • Most modelers put too much in, to manage the risk of leaving something out • If complexity is not inextricably linked with accuracy, there may exist an opportunity to simplify
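Of the techniques above, metamodeling lends itself to a short sketch: fit a cheap surrogate to a handful of runs of an expensive model, check its accuracy, then use it where many evaluations are needed. The model function, sample count, and polynomial order below are illustrative assumptions, not from the presentation.

```python
import numpy as np

def expensive_model(x):
    """Stand-in for a costly simulation (hypothetical, for illustration)."""
    return np.exp(-x) * np.sin(2.0 * x) + 0.5 * x

# 1. Sample the expensive model at a small number of design points.
x_train = np.linspace(0.0, 3.0, 15)
y_train = expensive_model(x_train)

# 2. Fit a low-order polynomial metamodel (surrogate) to the samples.
coeffs = np.polyfit(x_train, y_train, deg=5)
metamodel = np.poly1d(coeffs)

# 3. Verify the surrogate against the expensive model before trusting it
#    in, e.g., a Monte Carlo uncertainty analysis.
x_test = np.linspace(0.0, 3.0, 200)
max_err = np.max(np.abs(metamodel(x_test) - expensive_model(x_test)))
print(f"max abs error of surrogate: {max_err:.4f}")
```

The verification step matters: a metamodel is only as good as its agreement with the underlying model over the region of interest.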

  13. Model Complexity – How Much? 1 Prices go up, farmers produce more (too much) 2 Prices go down, farmers produce less 3 Repeat
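The price–production loop above can be sketched as a minimal simulation; the linear supply and demand coefficients are illustrative assumptions, chosen only to reproduce the overshoot-and-correct cycle.

```python
# Minimal sketch of the slide's price/production feedback loop (the classic
# cobweb dynamic). All coefficients are illustrative assumptions.

def simulate(periods=10, price=1.5):
    """Alternate supply responding to price and price responding to supply."""
    prices = [price]
    for _ in range(periods):
        supply = 0.8 * price        # 1. high price -> farmers produce more
        price = 2.0 - supply        # 2. oversupply -> price falls (linear demand)
        prices.append(price)        # 3. repeat
    return prices

prices = simulate()
# Price overshoots, corrects, and oscillates around the equilibrium (2/1.8)
print([round(p, 3) for p in prices])
```

Even a model this small shows the qualitative behavior on the slide; whether that level of detail is "enough" depends on the decision the model supports.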

  14. Model Complexity – How Much? • Models provide information to think about, they don’t do your thinking for you • Decision makers need to reason about the issues • Model abstraction approaches can and should be used [Figure: P(decision) vs. effort and complexity]

  15. Model Abstraction Example: NUREG/CR-6884, Model Abstraction Techniques for Soil-Water Flow and Transport

  16. Model Abstraction • Need to start with a broad model space – allows exploratory analysis essential to abstraction • Reduce complexity – maintain validity • Show the abstraction represents the complex model Benefits: • Less $ • Fewer inputs • Easier to integrate • Easier to interpret Types (not exhaustive): • Drop unimportant parts • Replace with simpler part • Coarsen ranges of values • Group parts together
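One of the abstraction types listed, "group parts together," can be sketched as lumping a hypothetical two-compartment release chain into a single compartment chosen to preserve the mean residence time, then checking that the abstraction represents the complex model. The rates and tolerance are assumptions for illustration, not from NUREG/CR-6884.

```python
import numpy as np

def chain_release(k1, k2, t):
    """Cumulative release from a two-compartment first-order chain
    (A -> B -> release, unit initial mass, distinct rates k1 != k2)."""
    return 1.0 - (k2 * np.exp(-k1 * t) - k1 * np.exp(-k2 * t)) / (k2 - k1)

def lumped_release(k_eff, t):
    """Cumulative release from a single lumped first-order compartment."""
    return 1.0 - np.exp(-k_eff * t)

k1, k2 = 0.5, 2.0                      # per-year rates (assumed)
k_eff = 1.0 / (1.0 / k1 + 1.0 / k2)    # preserve mean residence time (2.5 yr)

# Abstraction step: show the simpler model tracks the complex one.
t = np.linspace(0.0, 10.0, 101)
diff = np.max(np.abs(chain_release(k1, k2, t) - lumped_release(k_eff, t)))
print(f"max difference over 10 yr: {diff:.3f}")
```

The lumped model drops one input (a rate constant) and is easier to interpret; the comparison quantifies what that simplification costs.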

  17. Model Abstraction: Example Benefit • Uncertainty analysis • Simpler model yielded stronger results (6 variables identified compared to 3) • Allowed focused refinement of the model • Complexity can have many unintended consequences
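The uncertainty-analysis step above (sample uncertain inputs, run the model, rank inputs by influence on the output) can be sketched as a simple Monte Carlo screening. The toy model, input ranges, and correlation-based ranking are hypothetical illustrations, not the method used in the assessment.

```python
import numpy as np

# Monte Carlo screening sketch: which uncertain inputs drive the output?
rng = np.random.default_rng(42)
n = 2000

# Three uncertain inputs with assumed uniform ranges
x1 = rng.uniform(0.0, 1.0, n)   # constructed to influence the output strongly
x2 = rng.uniform(0.0, 1.0, n)   # constructed to influence it weakly
x3 = rng.uniform(0.0, 1.0, n)   # no influence at all

# Hypothetical model output with a little noise
y = 5.0 * x1 + 0.5 * x2 + rng.normal(0.0, 0.1, n)

# Rank inputs by absolute correlation with the output
corrs = {name: abs(np.corrcoef(x, y)[0, 1])
         for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]}
ranking = sorted(corrs, key=corrs.get, reverse=True)
print(ranking)
```

In a simpler model, this kind of screening tends to separate influential from non-influential variables more cleanly, which is the benefit the slide describes.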

  18. Conclusions • Methodologies to address the level of model complexity continue to evolve • Model abstraction can have many benefits when done properly • Simple is better (all things being equal)
