
Presentation Transcript


1. Validation Methodology for Agent-Based Simulations Workshop: DoD Validation Baseline. Ms. Lisa Jean Moya, WernerAnderson, Inc., 01 May 2007

2. Outline
• Validation defined
• General approach
• Issues for ABS validation

3. Outline (next section: Validation defined)
• Validation defined
• General approach
• Issues for ABS validation

4. DoD Definitions (DODI 5000.61)
• Verification: the process of determining that a model implementation and its associated data accurately represent the developer's conceptual description and specifications
• Validation: the process of determining the degree to which a model and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model
• Accreditation: the official certification that a model, simulation, or federation of models and simulations and its associated data are acceptable for use for a specific purpose
The workshop focus is validation.

5. Utility of Validation
• Military analysis requires the capability to evaluate an environment dominated by non-physical effects
  • Cold War analysis is not sufficient
  • Fighting the last war is not good enough
• Subject matter expertise needs codification and expansion
• Make appropriate use of M&S
  • Avoid using bad M&S/analysis
  • Avoid throwing out good M&S/analysis

6. DoD 5000 on M&S VV&A
• Much attention paid to "principals" but little to "principles"
  • Provides DoD authoritative definitions
  • Little emphasis on the "hows"
• Policies and procedures for M&S applications are set at the DoD Component level
  • Allows the tailoring of VV&A policies and procedures to the needs of the user
  • Likely to result in inconsistencies: little to no standardization of TTPs

7. Outline (next section: General approach)
• Validation defined
• General approach
• Issues for ABS validation

8. Validation Steps (DMSO, VV&A Recommended Practices Guide, Validation Special Topic)
• Verify M&S requirements
• Develop V&V plan
• Validate conceptual model
• Verify design
• Verify implementation
• Validate results

9. General Process (adapted from DMSO, VV&A Recommended Practices Guide)
The six validation steps (verify M&S requirements, develop V&V plan, validate conceptual model, verify design, verify implementation, validate results) are applied both to the basic representation and to the effect of interactions.
• Assessment of the basic representation
  • Appropriate referents
  • Rule set (alone and in the composition)
  • Instantiation
  • Interpretation
  • Trajectory
• Empirical assessment
  • Another model: mathematical, simulation, formalism
  • Historical event
  • Live experiment
• Assessment methods: SME/Turing, statistical, metric (a statistical sketch follows this slide)
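To make the "statistical" and "metric" assessment methods concrete, here is a minimal sketch that compares a sample of simulation outputs against an empirical referent sample. The data, distributions, and variable names are hypothetical stand-ins, not from the workshop.

```python
# Minimal sketch of a statistical/metric results assessment: compare
# simulation outputs against an empirical referent. The samples below are
# hypothetical; a real assessment would use operational or experimental data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
referent = rng.normal(loc=10.0, scale=2.0, size=200)   # e.g., observed engagement times
simulated = rng.normal(loc=10.3, scale=2.1, size=200)  # e.g., model-produced engagement times

# Two-sample Kolmogorov-Smirnov test: do the samples plausibly share a distribution?
statistic, p_value = stats.ks_2samp(simulated, referent)
print(f"KS statistic = {statistic:.3f}, p = {p_value:.3f}")

# A simple metric check to complement the hypothesis test.
print(f"mean error = {simulated.mean() - referent.mean():+.3f}")
```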

10. Problem Domain (adapted from DMSO, VV&A Recommended Practices Guide, Requirements Special Topic)
M&S requirements lie in the overlap between the problem, user, and simulation domain areas.
• Problem domain
  • Application types: analysis, training, acquisition
  • Physics: laws, forces, systems
  • Representational requirements: performance and behaviors of real-world entities
  • Missions, doctrine, operations, rules of engagement/deployment
• User domain
  • Use cases, e.g., scenario
  • Representation fidelity: mission, enemy, terrain, troops, time available (METT-T)
  • Behaviors, tactics
• Simulation domain
  • Implements functions and features

11. Finding a Referent (DMSO, VV&A Recommended Practices Guide, Validation Special Topic)
• Experimental data
• Empirical data
• Experience, knowledge, and intuition of SMEs
• Validated mathematical models
• Qualitative descriptions
• Other simulations
• Combinations of the types described above
Conceptual model: the content and internal representations of the M&S, including logic and algorithms; it recognizes assumptions and limitations.

12. Human Behavior Model Referents
• SMEs
• Empirical observations or experimental data from actual operations
• Models of human behavior
• Models of physiological processes
• Models of sociological phenomena
• Simulations of human behavior

13. When a Referent Doesn't Exist (DMSO, VV&A Recommended Practices Guide, Validation Special Topic)
• Assemble from known components of the system or procedure
• Assemble from known basic phenomena underlying the system's behavior
• Build a scale model of the system or its components and perform experiments
• Use the referents for a similar existing system or similar situations

14. Conceptual Model Components (DMSO, VV&A Recommended Practices Guide, Conceptual Model Special Topic)
• Requirements and specifications
• Data/nouns (inputs and outputs): objects with attributes, behavior states, and resources
• Actions/activities/verbs: functions and algorithms that create/change data or create additional actions
• Constraints: relationships, environment, geometry
• Simulation environment
The model should be as simple as possible, but not too simple. (A data-structure sketch follows this slide.)
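One way to picture these components is as a small data structure. The sketch below follows the slide's taxonomy; the class names and the vehicle example are illustrative assumptions, not DMSO artifacts.

```python
# A minimal sketch of the conceptual-model components above as a data
# structure. Field names follow the slide's taxonomy; the vehicle example
# is hypothetical.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Object:
    """A data/noun element: attributes, behavior states, resources."""
    name: str
    attributes: dict = field(default_factory=dict)
    behavior_state: str = "idle"
    resources: dict = field(default_factory=dict)

@dataclass
class Action:
    """An action/verb element: a function that creates or changes data."""
    name: str
    effect: Callable[[Object], None]

@dataclass
class Constraint:
    """A constraint: a relationship that must hold over the objects."""
    description: str
    holds: Callable[[Object], bool]

# Example: a vehicle whose fuel may never go negative.
vehicle = Object("vehicle", attributes={"fuel": 100.0})
burn = Action("burn_fuel", lambda o: o.attributes.update(fuel=o.attributes["fuel"] - 10))
nonneg = Constraint("fuel >= 0", lambda o: o.attributes["fuel"] >= 0)

burn.effect(vehicle)
assert nonneg.holds(vehicle)
```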

15. Conceptual Model Analysis
• Test/analyze the component algorithms of the overall model to validate each individually
  • Mathematical analysis
  • Results of component algorithms should match available data
  • Increases confidence that the interactions of the collected algorithms (i.e., the overall model) are valid
• Algorithm testing (sketched below)
  • Use a 3rd-party program (e.g., Excel)
  • Should examine a range of data
• Assumption testing (a supplementary or alternative approach)
  • Determine assumptions (rarely stated): structural, causal, and mathematical
  • Identify the operational impacts of assumptions relative to the intended application
  • Determine the acceptability of those operational impacts with the Application Sponsor (Accreditation Authority)
• If they exist, unexpected/emergent interactions should appear in model output
  • However, interactions between algorithms may not be addressed
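The algorithm-testing bullet can be made concrete with a minimal sketch: exercise one component algorithm over a range of inputs and compare against independently computed reference values. The detection model, reference numbers, and tolerance below are hypothetical.

```python
# Minimal sketch of component-algorithm testing: check one algorithm over a
# range of inputs against reference values computed independently (e.g., in
# a 3rd-party tool such as Excel). The model and references are notional.
import math

def detection_probability(range_km: float) -> float:
    """Component algorithm under test: a notional exponential detection model."""
    return math.exp(-range_km / 5.0)

# Reference values spanning a range of data, computed independently.
reference = {1.0: 0.8187, 5.0: 0.3679, 10.0: 0.1353}

for r, expected in reference.items():
    got = detection_probability(r)
    assert abs(got - expected) < 1e-3, f"range {r}: got {got:.4f}, expected {expected:.4f}"
    print(f"range {r:5.1f} km: model {got:.4f} vs reference {expected:.4f}  OK")
```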

16. V&V Technique Taxonomy (DMSO, VV&A Recommended Practices Guide, V&V Techniques Special Topic)
• Informal
  • Determine "reasonableness"; most commonly used, subjective
  • Audit, review, face validation, inspection, Turing test
• Static
  • Assess the accuracy of the design; automated tools available
  • Analyses: semantic/structural, data/control, interface, traceability
• Dynamic (a regression-testing sketch follows this slide)
  • Assess model execution; requires model instrumentation
  • Tests: acceptance, fault/failure, assertion, execution, regression, predictive validation, structure, sensitivity, statistical
• Formal
  • Complex, time consuming
  • Induction, inference, predicate calculus, proof of correctness
How much V&V is done depends on budgetary considerations, the significance of the supported decisions, and the risk of inaccuracy.
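As an example of one dynamic technique, here is a minimal regression-testing sketch: outputs of an instrumented model run are compared against a stored baseline so that changes which silently alter results are caught. The model stub and baseline values are hypothetical.

```python
# Minimal sketch of regression testing, one of the dynamic techniques above.
# The "model" is a seeded stand-in; the baseline was captured from a prior
# accepted run of this same stub.
def run_model(seed: int) -> list[float]:
    """Stand-in for an instrumented simulation run returning key outputs."""
    import random
    rng = random.Random(seed)
    return [round(rng.uniform(0, 1), 6) for _ in range(3)]

BASELINE = {42: [0.639427, 0.025011, 0.275029]}  # captured from an accepted build

for seed, expected in BASELINE.items():
    assert run_model(seed) == expected, f"regression: seed {seed} diverged"
print("regression suite passed")
```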

17. 7 Recommended Techniques (DA-PAM 5-11, Verification, Validation, and Accreditation of Army M&S)
• Face validation: SME review
• Comparison to other M&S: legacy, non-Government, alternative formulation
• Functional decomposition: validating the parts, assuming the whole
• Sensitivity analyses: run boundary conditions (sketched below)
• Visualization: output appears to match intent
• Turing tests: "if it walks like a duck, ..."
• Model-test-model: anticipate, experiment, refine
Each technique has its drawbacks.
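A minimal sketch of the sensitivity-analysis technique, assuming a notional one-parameter model: sweep the input across its range, including the boundary conditions, and check that the output stays bounded and moves in the expected direction.

```python
# Minimal sketch of a sensitivity analysis with boundary conditions: sweep
# one input across its range, including the extremes. The model is notional.
def hit_probability(dispersion: float) -> float:
    """Notional model: accuracy degrades as weapon dispersion grows."""
    return 1.0 / (1.0 + dispersion)

sweep = [0.0, 0.1, 1.0, 10.0, 1000.0]  # includes both boundary conditions
outputs = [hit_probability(d) for d in sweep]

assert all(0.0 <= p <= 1.0 for p in outputs), "output left its valid range"
assert outputs == sorted(outputs, reverse=True), "expected monotone decrease"
for d, p in zip(sweep, outputs):
    print(f"dispersion {d:7.1f} -> Pk {p:.4f}")
```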

18. Intuition vs. Data
Intuition
• Results match intuitive expectations (a dynamic technique)
• SMEs use intuition and estimates of expected behaviors and outputs; model and system behaviors are considered subjectively
• Best used in the early stages of development
• Issues
  • Dependent on experience with the system being modeled to provide intuitive expectations
  • Subject to human error
  • Difficult to predict unexpected/emergent behaviors from intuition/experience
Data (a sketch follows this slide)
• Results match data from past experience: historical, exercise, other models (a dynamic technique)
• Reasonable results
  • Predictive validation: results provide a reasonable prediction of subsequent real-world behavior/results
  • Historical/exercise/model data should generate outputs similar to the associated results
• Models should be consistent
  • Multiple models for the same system should produce the "same" results from the "same" data
• Issue: systematic biases will not be detected
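The data-driven checks above lend themselves to simple metrics. A minimal sketch, with entirely hypothetical numbers: predictive validation scored as RMSE against historical results, plus a consistency check between two models fed the same data (which, as the slide notes, cannot detect a shared systematic bias).

```python
# Minimal sketch of data-based validation metrics. All numbers are
# hypothetical placeholders.
import math

historical = [12.0, 15.5, 9.8, 20.1]   # e.g., exercise outcomes
model_a    = [11.6, 16.0, 10.1, 19.5]  # model A predictions for the same cases
model_b    = [11.9, 15.7, 10.0, 19.8]  # independent model B, same inputs

def rmse(pred, ref):
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(ref))

print(f"model A vs history: RMSE = {rmse(model_a, historical):.3f}")
print(f"model B vs history: RMSE = {rmse(model_b, historical):.3f}")
# Consistency: multiple models should produce the "same" results from the
# "same" data, within a stated tolerance. This cannot catch a bias shared
# by both models.
print(f"model A vs model B:  RMSE = {rmse(model_a, model_b):.3f}")
```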

19. Outline (next section: Issues for ABS validation)
• Validation defined
• General approach
• Issues for ABS validation

20. Agent Validation (DMSO, VV&A Recommended Practices Guide, Human Behavioral Representation (HBR) Special Topic; Moya & Tolk, Toward a Taxonomy of Agents & MAS)
• Evaluate:
  • Conceptual model design
  • Knowledge base
  • Engine and knowledge base implementation (a knowledge-base check is sketched below)
  • Integration with the simulation environment
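One piece of knowledge-base evaluation can be automated cheaply. A minimal sketch, assuming a purely hypothetical condition-to-action rule format: scan the rule set for conditions mapped to conflicting actions.

```python
# Minimal sketch of a knowledge-base consistency check: flag any condition
# that maps to more than one action. The rule format is illustrative only.
rules = [
    ("enemy_near and low_ammo", "withdraw"),
    ("enemy_near and not low_ammo", "engage"),
    ("enemy_near and low_ammo", "engage"),  # conflicts with the first rule
]

seen: dict[str, str] = {}
for condition, action in rules:
    if condition in seen and seen[condition] != action:
        print(f"CONFLICT: '{condition}' maps to both '{seen[condition]}' and '{action}'")
    seen.setdefault(condition, action)
```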

21. Agent System Validation (Moya & Tolk, Toward a Taxonomy of Agents & MAS)
• Effect of parameter settings and system/agent instantiations (ranges, settings, interpretations, rules); a sweep is sketched below
• Interactions
• Overall results
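A minimal sketch of the parameter-settings check, using an entirely hypothetical toy agent system: sweep one rule parameter across its range and record the overall result, so the effect of settings and interactions becomes visible.

```python
# Minimal sketch of a system-level parameter sweep. The toy model (agents
# making progress toward a goal under a notional cooperation rule) is
# entirely hypothetical.
import random

def run_system(cooperation: float, n_agents: int = 50, steps: int = 100, seed: int = 0) -> float:
    rng = random.Random(seed)
    positions = [0.0] * n_agents
    for _ in range(steps):
        mean_pos = sum(positions) / n_agents
        for i in range(n_agents):
            # Rule: blend individual random progress with movement toward the group.
            own = rng.uniform(0.0, 1.0)
            positions[i] += (1 - cooperation) * own + cooperation * (mean_pos - positions[i]) * 0.1
    return sum(positions) / n_agents  # overall result: mean progress

for c in [0.0, 0.25, 0.5, 0.75, 1.0]:  # sweep the rule setting across its range
    print(f"cooperation = {c:.2f} -> mean progress = {run_system(c):6.2f}")
```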

22. Areas Affecting HBR Validity (DMSO, VV&A Recommended Practices Guide, HBR Special Topic)
• Interactions between multiple behaviors
  • Assumes that interacting nonlinear behaviors will create even more convoluted nonlinear behavior
• Dependencies between properties in the behavior space
• Sensitivities between behavior space property changes
• Nonlinear behavior
  • Errors can hide or be misinterpreted
• Nonlinear component behavior transitions
• Complex environmental interactions
• Stochastic behaviors (a replication sketch follows this slide)
• Probabilistic sensing
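For the stochastic-behaviors point, a single run can hide or mimic an error, so validity checks should look at replicated output distributions. A minimal sketch with a hypothetical probabilistic-sensing model:

```python
# Minimal sketch of replicated runs for a stochastic behavior: summarize the
# output distribution across many seeds rather than trusting one run. The
# sensing model is a hypothetical stand-in (binomial: 20 glimpses at p=0.7).
import random
import statistics

def detections(seed: int, p_sense: float = 0.7, glimpses: int = 20) -> int:
    rng = random.Random(seed)
    return sum(rng.random() < p_sense for _ in range(glimpses))

results = [detections(seed) for seed in range(500)]
mean = statistics.mean(results)
sd = statistics.stdev(results)
print(f"mean detections = {mean:.2f}, sd = {sd:.2f} (analytic mean = 14.0)")

# Compare the replicated mean against the analytic expectation.
z = (mean - 14.0) / (sd / (500 ** 0.5))
print(f"z-score of replicated mean vs expectation: {z:+.2f}")
```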
