
ABSVal


Presentation Transcript


  1. ABSVal
Goal: Adapt accepted Best Practices of VV&A* to simulation models that...
• Display emergent behavior
• Are used to model military effects on population dynamics and social phenomena as well as military decision making
• Support analyses
*maybe refine the general Best Practices

  2. B.L.U.F.: WHAT I THINK WE LEARNED or CONFIRMED
• VV&A universally reviled.
  • EXCEPTION: DMSO employees and alumni
• VV&A general principles do not map well to the analysis domain.
  • describing anticipated use is problematic
  • analysis = exploration
• ABS translates to Models exhibiting Emergent Behavior.
  • pre-production experimentation focused on achieving top-down control (building predictive capability) over dynamics that initially display emergence
  • scientifically examining emergence is interesting
• We can help analysts and decision makers distinguish good analysis from bad analysis.
• Minimally acceptable sim-to-app validation is recognizable, achievable, useful, and rare.

  3. OUTLINE
• SIMULATION MODELS
• A LITTLE EMERGENT BEHAVIOR
• THINKING ANALYSIS
• EXPOSING THE MATCH

  4. “In contrast to this interest in model-related technology, there has been far too little interest in the substance of the models and the validity of the lessons learned from using them. In our view, the DoD does not appreciate that in many cases the models are built on a base of sand.”
The Base of Sand Problem: A White Paper on the State of Military Combat Modeling, Paul K. Davis and Donald Blumenthal

  5. WORKABLE DEFINITIONS
• Conceptual Model – Description of the system in some abstracted/symbolic formalism, usually mathematics.
• Verification – The simulation executable faithfully reflects the Conceptual Model.
• Validation – The degree to which the system described in the conceptual model is appropriate in supporting the intended use.
• Accreditation – The judgement, made by someone responsible for the outcome, that the simulation is adequately verified and valid for the intended use.

  6. COMMENTS
• Conceptual models are always incomplete.
• Verification of a simulation is a scientific endeavor if the conceptual model is complete.
• A simulation is never “Valid.”
• Analytical intended uses are difficult to deal with…
  • Repetition is very rare.
  • Analysts have no way to scientifically express the intended use.
  • Analysts often accept very poor data and models, and often express grave caveats for their results.

  7. IDEALIZED DEVELOPMENT PROCESS [diagram: natural system → conceptual model → implementation and design → executable code → ideal sim, connected by formal transitions T(M)]

  8. IDEALIZED DEVELOPMENT PROCESS [same diagram, with transitions labeled: abstraction/modeling, mapping to sim design pattern/software design, coding and testing]

  9. IDEALIZED DEVELOPMENT PROCESS [same diagram; the abstraction step is annotated: “Driven by analytic task. More later...”]

  10. REALITY FOR BIG-IRON SIMS [diagram: natural system → executable code → ideal sim, with only abstraction and data development as transitions]

  11. FOR A GIVEN ANALYSIS [the idealized development-process diagram again, marked “FOCUS FOR ANOTHER DAY”]

  12. "The more complex the model, the harder it is to distinguish unusual emergent behavior from programming bugs." Douglas Samuelson Renowned Operations Research Analyst

  13. ABSVal PROJECT
• ABSVal Framework
• Test Examples
  • Pythagoras COIN
  • SZ/BZ Obstacle Reduction Analysis
• Conclusions

  14. VALIDATION
First-Principles Validation
• Assess the theory behind the conceptual model, and predict its impact on the ensuing analysis
• Examine the implementation of the theory, and predict its impact on the ensuing analysis
• Examine the combinations of theories used together
Results Validation
• Compare output data to data from another source
  • Historical case
  • Another model
  • Intuition

  15. VALIDATION
First-Principles Validation
• Assess the theory behind the conceptual model, and predict its impact on the ensuing analysis
• Examine the implementation of the theory, and predict its impact on the ensuing analysis
• Examine the combinations of theories used together
Results Validation
• Compare output data to data from another source
  • Historical case
  • Another model
  • Intuition
VALIDATION = EXPOSITION + ASSESSMENT: the EVIDENCE on which acceptance is based.

  16. [development-process diagram]
• Simulations displaying emergent behavior are difficult to validate because it is difficult to predict their behavior from the Conceptual Model.
• Therefore there is greater pressure to use results validation.

  17. “All models are wrong, some are useful.” George Box Wartime Statistician

  18. ANALYSIS
• Predict the response (absolute)
• Predict the response (as compared to a baseline)
• Predict the functional form of the response for a set of independent variables
• Predict the sign of the gradient (set of 1st derivatives)
• Is there any response?
• Predict the min/max of the response over a high-dimensional domain
• Predict xi in [Li, Ui] such that response > c
• Characterize the probabilistic nature of the response
Compared to the physical sciences, these are very humble goals. Might a medical/biological mindset be more appropriate?
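A goal like "predict the sign of the gradient" can be pursued with modest machinery. Below is a minimal sketch, assuming a toy stand-in for the simulation (the `simulate` function and its response surface are purely illustrative, not from any model discussed here): run replicates on either side of a design point, then take a central finite difference of the replicate means.

```python
import random
import statistics

def simulate(x, seed):
    """Toy stochastic response surface standing in for a real simulation run."""
    rng = random.Random(seed)
    return 3.0 * x - 0.5 * x * x + rng.gauss(0.0, 0.2)

def gradient_sign(x, h=0.5, reps=30):
    """Estimate the sign of d(response)/dx at x from replicated runs."""
    hi = [simulate(x + h, seed) for seed in range(reps)]
    lo = [simulate(x - h, seed + 1000) for seed in range(reps)]
    slope = (statistics.mean(hi) - statistics.mean(lo)) / (2 * h)
    return 1 if slope > 0 else -1

# The toy trend 3x - 0.5x^2 rises at x = 1 and falls at x = 5,
# so the estimated signs should be +1 and -1 respectively.
```

Averaging replicates before differencing keeps run-to-run noise from flipping the estimated sign, which is the humble but useful claim the slide is after.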

  19. …more ANALYSIS
Provide the best decision support possible, and include a useful assessment of the value of the analysis vis-à-vis the questions & issues at hand.

  20. IDEAL STUDY PROCESS
• DETERMINE THE QUESTION
• DETERMINE THE MOE’s and the EEA’s
• FIND or BUILD the BEST (SIMULATION) MODEL
• PRODUCTION RUNS
• PRESENT (and DEFEND) RESULTS

  21. SIMULATION-SUPPORTED ANALYSIS
• Baseline/Excursion or Factorial Experiment
• Driven to answer Analysis Questions
• Key Elements of Analysis
• Constraints, Limitations, and Assumptions
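A baseline/excursion or factorial experiment can be laid out mechanically once factors and levels are chosen. A minimal sketch, with illustrative factor names (not the ABSVal project's actual factors):

```python
from itertools import product

# Illustrative factors and levels for a full-factorial design.
factors = {
    "terrain": ["urban", "jungle", "alpine"],
    "scale": ["platoon", "company"],
    "sensor": ["baseline", "upgraded"],
}

def full_factorial(factors):
    """Return one design point (a dict of settings) per factor-level combination."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

design = full_factorial(factors)
# 3 * 2 * 2 = 12 design points, each a dict of factor settings.
```

Each design point then becomes one set of production runs, which keeps the experiment auditable against the Analysis Questions.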

  22. Schism
• Agent-based simulations use modular rules and local reasoning to produce realistic and/or interesting emergent aggregate behavior.
  • Surprise is good**
• Successful simulation testing (core to face/results validation) is based on demonstrating credibility across the range of potential input.
  • Surprise is not good**
** Refined later in this talk

  23. GOAL: STOP BEING SURPRISED
[diagram: Explore → Surprise → Explain → Accept/reject, cycling until “In control, no more surprises”, then Production Runs]
• How do we tell about this experience?
• “Unnatural acts” reflect negatively on a sim
• Once we achieve top-down control, is there still emergent behavior?
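One concrete way to operationalize the explore/surprise/accept-or-reject loop is to commit to expected output bounds in advance and flag every run that lands outside them. A hedged sketch, with illustrative bounds and responses:

```python
def flag_surprises(runs, low, high):
    """Return indices of runs whose response falls outside [low, high]."""
    return [i for i, y in enumerate(runs) if not (low <= y <= high)]

# Illustrative responses from an exploration sweep; the analyst committed
# in advance to the plausible band [0.0, 1.0].
responses = [0.42, 0.55, 1.30, 0.48, -0.10]
surprising = flag_surprises(responses, low=0.0, high=1.0)
# surprising == [2, 4]: these runs must be explained before acceptance.
```

Writing the bounds down before the sweep is what turns "surprise" into an auditable event rather than an impression.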

  24. ELEMENTS (Adaptation of the Yost Scale)
SIMULATION DYNAMICS:
• based on accepted physical laws
• based on accepted social dynamics
• based on common sense distillation
• simple model relic required to facilitate actions
• simple model relic required to maintain consistency
• top-down human intervention
DATA:
• authoritative value
• measured
• witnessed
• argued by logic
• sensible range
• guess/arbitrary
• dimensionless
RELEVANT DYNAMICS + REQUIRED DATA = ELEMENT
e.g. underwater detection using observed detection range data in a cookie-cutter model

  25. ELEMENTS (Adaptation of the Yost Scale)
SIMULATION DYNAMICS:
• based on accepted physical laws
• based on accepted social dynamics
• based on common sense distillation
• simple model relic required to facilitate actions
• simple model relic required to maintain consistency
• top-down human intervention
DATA:
• authoritative value
• measured
• witnessed
• argued by logic
• sensible range
• guess/arbitrary
• dimensionless
[annotations: ANALYTICALLY DESIRABLE, CONTROLLABLE ABSTRACTION]
RELEVANT DYNAMICS + REQUIRED DATA = ELEMENT
e.g. underwater detection using observed detection range data in a cookie-cutter model
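The slide's own example element, underwater detection using observed detection range data in a cookie-cutter model, is simple enough to sketch. The 2.5 km range below is an illustrative stand-in for a measured value, not a sourced figure:

```python
import math

# Illustrative "measured" rung on the data side of the Yost scale.
OBSERVED_DETECTION_RANGE_KM = 2.5

def detects(sensor_xy, target_xy, range_km=OBSERVED_DETECTION_RANGE_KM):
    """Cookie-cutter dynamics: detect iff the target lies within the observed range."""
    return math.dist(sensor_xy, target_xy) <= range_km

# Element = relevant dynamics (cookie-cutter rule) + required data (observed range).
```

Pairing simple, transparent dynamics with measured data is exactly the "high on both scales" combination the annotations call analytically desirable.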

  26. “It’s the Data, Stupid.” George Akst, Phalanx, DEC 07

  27. Mike Bauman, Constraints, Limitations, and Assumptions Guide, TRAC-TD-05-011 (rev. 1), January 2008. TRADOC Analysis Center, 255 Sedgwick Avenue, Fort Leavenworth, KS 66027-2345

  28. PARSING SOURCES OF VARIABILITY
• C.L.A.: Constraints, Limitations, and Assumptions necessary to scope the analysis and interpret the results
• CASES: details necessary to support the model; cases to be considered to achieve analytical goals
• DYNAMIC CONTEXT: has impact on the circumstances relevant to exercising the core model dynamics; creates situations, not elements of analysis
• CORE: drives the results of your experiment; aligns with the key elements of analysis

  29. IMPACT ON ANALYSIS [layer diagram: core / dynamic context / cases / constraints, limitations, and assumptions]
• Agent-based design is reputed to enable fast and easy construction of dynamic context
  • Dynamic Context elements can display emergent behavior to add variability
  • Emergent behavior is often not predictable/controllable
• Big-iron simulations often have parametric (knob) control over Case elements
  • impossible to promote these to Dynamic Context or Core elements
  • should NOT be elements of analysis
• Ideally, analysts should have the most faith in their Core elements
  • should have high-quality data (high on the Yost scale)
  • should have well-studied dynamics (high on the Yost scale)
  • must not display uncontrolled emergent behavior
• Limitations on the Core = Limitations of the simulation for analytical purposes
• Core and Dynamic Context elements should be results-proven to be consistent with SME judgment (explainable 1st derivative)
• Core elements should be results-proven to be highly influential (see Scientific Method of Choosing Model Fidelity)

  30. IMPACT ON ANALYSIS [layer diagram: core / dynamic context / cases / constraints, limitations, and assumptions]
• Agent-based design is reputed to enable fast and easy construction of dynamic context
  • Dynamic Context elements can display emergent behavior to add variability
  • Emergent behavior is often not predictable/controllable
• Big-iron simulations often have parametric (knob) control over Case elements
  • impossible to promote these to Dynamic Context or Core elements
  • should NOT be elements of analysis
• Ideally, analysts should have the most faith in their Core elements
  • should have high-quality data (high on the Yost scale)
  • should have well-studied dynamics (high on the Yost scale)
  • must not display uncontrolled emergent behavior
• Limitations on the Core = Limitations of the simulation for analytical purposes
• Core and Dynamic Context elements should be results-proven to be consistent with SME judgment (explainable 1st derivative)
• Core elements should be results-proven to be highly influential (see Scientific Method of Choosing Model Fidelity)
Taxonomy for sources of variability reflecting the relationship between model dynamics and analytical goals.
** Jargon for communicating how a sim element relates to the analysis.
** Identifies the appropriate role for elements with emergent behavior in an analysis.

  31. a solid connection… [image: the Golden Gate Bridge, spanning analytical requirements and simulation/data capabilities]

  32. or, not so much. [image: the Tacoma Narrows Bridge, spanning analytical requirements and simulation/data capabilities]

  33. RECOMMENDED HANDLING

  34. EXAMPLE
• Question: What is the tactical value of LW components to a rifle squad?
• Core: weapon, computer/comm/SA, sight/NVG
• Dynamic Context: paths of maneuver, acquisitions and detections, paths & actions of threat, attrition, …
• Cases: terrain type (urban, jungle, alpine), scale (company, platoon), mission (HVT, defend a FOB)
• CLA: kinetic outcome, unambiguous threat, terrain representation

  35. EXAMPLE
• Question: What is the tactical value of LW components to a rifle squad?
• Core: weapon, computer/comm/SA, sight/NVG
• Dynamic Context: paths of maneuver, acquisitions and detections, paths & actions of threat, attrition, … [annotation: emergent behavior dynamics fit here]
• Cases: terrain type (urban, jungle, alpine), scale (company, platoon), mission (HVT, defend a FOB) [annotation: don’t average over these cases]
• CLA: kinetic outcome, unambiguous threat, terrain representation

  36. “Those claims to knowledge that are potentially falsifiable can then be admitted to the body of empirical science, and then further differentiated according to whether they are (so far) retained or indeed are actually falsified.” Karl Popper, Philosopher of Science

  37. NEGATIVE INFORMATION for IN-VALIDATION [layer diagram: core / dynamic context / cases / constraints, limitations, and assumptions]
• Element not data-driven
• Element not controllable
• Element displays undesired emergent behavior
• Element displays unexplainable 1st-order influence (results schism unexplainable)
• Element not in the anticipated layer
  • level of influence is more/less than anticipated by analyst
  • dynamics or data are...
    • too low on the Yost scale
    • mismatched vis-à-vis the Yost scale

  38. NEGATIVE INFORMATION = IN-VALIDATION? [layer diagram, annotated on a spectrum from “no concern” to “show stopper”]
• Negative information scopes the analytical value of results.
• Analyst’s art: responsibly expand this scope.
“This approach uses a very unrealistic model of certain dynamics, but it creates adequate dynamic context to stimulate the core elements in a way useful to our analytic goals.”

  39. “Computer programs should be verified, Models should be validated, and Analysts should be accredited.” Alfred G. Brandstein, Renowned Military Operations Research Analyst, Founder of Project Albert

  40. THE ANALYST
Prior to any experience with the simulation, can the Analyst...
• Pose analytic questions mathematically?
• Describe the experiment?
• Identify Core vs. Dynamic Context elements?
• Specify CLA elements?
• Evaluate Core elements on the Yost scale?
• Disclose all outcomes the analyst anticipates matching with the simulation (Test Cases)?
Once experience has been gained, can the Analyst...
• Explain changes to the anticipated Core/Dynamic Context/Case/CLA classification?
• Describe all testing and tuning required?
• Quantify the level of influence of each Core & Dynamic Context element statistically?
• Avoid integrating (averaging) Cases?
• Explain the impact of each CLA on the results?
• Statistically determine the level of agreement of the simulation outcomes with the Test Cases?
The resulting analysis should be peer-reviewed.
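"Statistically determine the level of agreement" can be as modest as a two-sample Kolmogorov-Smirnov distance between simulation outcomes and Test Case data. A minimal stdlib-only sketch with illustrative numbers (a real study would compare the statistic against a critical value for the sample sizes):

```python
def ks_statistic(a, b):
    """Max vertical distance between the empirical CDFs of samples a and b."""
    a, b = sorted(a), sorted(b)
    points = sorted(set(a) | set(b))
    def ecdf(sample, x):
        return sum(1 for v in sample if v <= x) / len(sample)
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in points)

# Illustrative data: five simulation outcomes vs. five Test Case observations.
sim_outcomes = [0.9, 1.1, 1.0, 1.2, 0.8]
test_case = [1.0, 1.05, 0.95, 1.15, 0.85]
d = ks_statistic(sim_outcomes, test_case)
# Small d suggests agreement; large d is negative information for in-validation.
```

The point is not the particular statistic but that the analyst commits to a quantitative agreement measure rather than eyeballing the match.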

  41. Bottom line

  42. • Have lots of computational experience with your model.
• Understand and be able to control its emergent behavior.
• Plan and execute experiments; document them.
• Disclose the relationship between each important sim element and the analytical goal:
  • Core
  • Dynamic Context
  • Cases
  • CLA

  43. QUESTIONS?
