
Presentation Transcript


  1. Validation Methodology for Agent-Based Simulations Workshop: Perspectives on Agent-Based Simulation and VV&A
     Dr. Bob Sheldon, Joint and External Analysis Branch, Operations Analysis Division, Marine Corps Combat Development Command, 01 May 2007

  2. Overview
     • VV&A and Agent-Based Simulation (ABS) thoughts from Dr. George Akst, Senior Analyst, Marine Corps Combat Development Command (MCCDC)
     • MORS historical perspectives on VV&A and ABS
     • Personal reflections

  3. Perspectives from Dr. Akst
     • It’s the data, stupid!
       • How do you come up with data for parameter Z = x.x %?
       • Especially a problem for Irregular Warfare (IW)
     • Sometimes, model developers who are structuring algorithms don’t worry about data and assume data can be developed after the fact
     • Dr. Kirk Yost’s data triage: consider data sources when building models
       • Generally accepted (produced regularly by some believable source)
       • Semi-valid (reasonable information derived from various sources)
       • Judgment and knobs
     • If you start with meaningless data and execute a design of experiments with 2^10 runs (just because you can), then you will have 2^10 useless results (see the sketch after this slide)
     • To be useful, ABS need to provide more than just simplistic insights
     • ABS should go beyond being an automated tool that regurgitates SME intuition
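A minimal sketch of the scale point above: a two-level full-factorial design over 10 factors already produces 2^10 = 1,024 runs. The factor names and settings below are hypothetical, not from the briefing; the point is only that a large design built on meaningless inputs yields an equally large pile of meaningless outputs.

```python
# Two-level full-factorial design of experiments: 10 binary factors
# give 2**10 = 1024 runs. Factor names here are made up; a big design
# over meaningless inputs yields 1024 meaningless results.
from itertools import product

factors = [f"factor_{i}" for i in range(10)]  # 10 hypothetical two-level factors

# Every combination of low (0) and high (1) settings
design = [dict(zip(factors, levels))
          for levels in product([0, 1], repeat=len(factors))]

print(len(design))   # 1024 design points
print(design[0])     # e.g. {'factor_0': 0, ..., 'factor_9': 0}
```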

  4. More Dilbert Data

  5. Perspectives from Dr. Akst
     • Two ends of the spectrum
       • Engineering-level model: should very closely predict how the system would operate in the real world
       • Campaign-level model: measure relative differences that changes to forces, tactics, or equipment have on the outcome
     • Trying to literally match a combat model’s results with some other set of results (real world, experiment, or another model) is not realistic
     • What validation is:
       • Failure to invalidate after concerted effort
       • Ascertaining that results are “plausible”: no obvious logic flaws, and results are “reasonable” and “relatively consistent” with past modeling results
     From “Musings on Verification, Validation, and Accreditation (VV&A) of Analytical Combat Simulations,” Phalanx, September 2006

  6. MORS Meetings on VV&A
     • Simulation Validation (SIMVAL), October 1990
     • SIMVAL II, April 1992
     • SIMVAL '94, September 1994
     • Simulation Validation tutorial, MORSS & ALMC, 1995 (Pete Knepell)
     • SIMVAL '99: Making VV&A Effective and Affordable, January 1999
     • Evolving Validation Topics in MORS
       • Descriptive validity, structural validity, predictive validity
       • Structural validation, output validation
       • Conceptual model validation, data validation, and output validation

  7. MORS Meetings on ABS
     • New Techniques: A Better Understanding of their Application to Analysis, November 2002
       • Included a 1-day tutorial on Agent-Based Models
     • Agent-Based Models and Other Analytic Tools in Support of Stability Operations, October 2005
     • Plus substantial coverage in MORSS working groups, e.g., WG 31 – Computing Advances in Military OR and WG 32 – Social Science Methods

  8. Personal Reflections
     • How to validate (or invalidate) counter-intuitive results (e.g., Surprise)
       • Clay Thomas: “Analysis either verifies your intuition or educates your intuition.”
     • Simple visualization helps validation
       • Gantt chart example for sortie generation (a sketch follows this slide)
       • Provide visualization that SMEs understand
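A minimal sketch of the kind of Gantt-style display mentioned above, assuming invented sortie data; the aircraft names, start times, and durations are hypothetical, not the original example. The goal is a simple picture a SME can eyeball for plausibility.

```python
# Gantt-style chart of sortie generation: one row per aircraft, one bar
# per sortie. All data below is hypothetical.
import matplotlib.pyplot as plt

sorties = {  # aircraft -> list of (start hour, duration in hours)
    "AC-1": [(0, 2), (4, 2), (9, 3)],
    "AC-2": [(1, 3), (6, 2)],
    "AC-3": [(2, 2), (5, 2), (10, 2)],
}

fig, ax = plt.subplots()
for row, (aircraft, spans) in enumerate(sorties.items()):
    ax.broken_barh(spans, (row - 0.4, 0.8))  # (xmin, width) pairs per bar
ax.set_yticks(range(len(sorties)))
ax.set_yticklabels(sorties.keys())
ax.set_xlabel("Hour")
ax.set_title("Sortie generation timeline (hypothetical data)")
plt.show()
```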

  9. Personal Reflections (Cont’d)
     • Ready access to source code helps
       • Example: effect of a (0,1) parameter
     • Good mathematical documentation a plus

  10. Personal Reflections (Cont’d)
      • Comparing counter-intuitive results to “intuitive” results: a case study
        • At a Project Albert workshop, the agent-based model Socrates gave counter-intuitive results
        • Simulation attrition results varied over 3 phases with 2 breakpoints
        • When I fit a Lanchester linear model to the results, the regions where the fit was “bad” corresponded to the counter-intuitive results (a fitting sketch follows this slide)
        • Drill-down investigation explained these anomalies
        • Mysterious results were due to scenario data & tuning parameters
      “Comparing the Results of a Nonlinear Agent-Based Model to Lanchester’s Linear Model,” Maneuver Warfare Science 2002
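As a rough illustration of the fitting step: under Lanchester’s linear (area-fire) law, each side’s loss rate is proportional to the product of the two force levels, dB/dt = -a·R·B and dR/dt = -b·R·B. The sketch below estimates a and b from a time series by least squares; the force-level data is invented, since the actual Socrates output is not reproduced here. Time steps where the residual is large would flag the counter-intuitive phases described on the slide.

```python
# Fit Lanchester's linear law to attrition data by least squares.
# The force-level time series below is hypothetical.
import numpy as np

R = np.array([100, 96, 91, 85, 80, 76, 73], dtype=float)  # Red strength
B = np.array([100, 97, 93, 90, 86, 83, 81], dtype=float)  # Blue strength

rb = R[:-1] * B[:-1]          # R*B at the start of each time step
blue_losses = -np.diff(B)     # observed Blue casualties per step
red_losses = -np.diff(R)      # observed Red casualties per step

# Least-squares (through the origin) estimates of the attrition coefficients
a = np.dot(rb, blue_losses) / np.dot(rb, rb)
b = np.dot(rb, red_losses) / np.dot(rb, rb)

blue_resid = blue_losses - a * rb  # large residuals => poor-fit regions
print(f"a={a:.2e}  b={b:.2e}  max |residual|={np.abs(blue_resid).max():.2f}")
```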

  11. Questions?
      Image: Juan Muñoz, Five Seated Figures, 1996, Hirshhorn Museum and Sculpture Garden
