
Babes in the Woods: How Naïve Analysts Clashed with Trained Killers to the Mutual Benefit of All



Presentation Transcript


  1. Babes in the Woods: How Naïve Analysts Clashed with Trained Killers to the Mutual Benefit of All. Fred Cameron, Operational Research Advisor to the Director General Land Capability Development

  2. Kingston, Ontario Directorate of Land Synthetic Environments Defence Research and Development Canada

  3. Land Capability Development Operational Research Team • Mr Fred Cameron (Team Leader) • Mr Roger Roy • Ms Eugenia (Jenny) Kalantzis • Mr Ian Chapman • Mr François Cazzolato • Maj Bruce Chapman (to join July 2006) • The Army Experimentation Centre • Part of the Directorate of Land Synthetic Environments (Army’s home for Computer-based Wargames, Models, and Simulations for Training and Experimentation) • Strategic Analyst – Mr Peter Gizewski • Science Advisor – Mr Regan Reshke

  4. Deciding Which Method to Use – A Meta-Decision • A meta-model of the meta-decision process • An analysis: • Description • What happened? • Ascription • Were there causes and effects? • Prescription • What should we do in future? • Proscription • What should we avoid in future?

  5. Cameron’s Rubbish Bin Model of Decision Making – March’s Garbage Can Model • Born of many years of observation • Employed for organizations that are “organized anarchies” • Basis for a workshop on “Ambiguity and Command” in 1986 • Used provocatively: is it really that bad? Sources: Cohen, March, and Olsen (1972), March and Weissinger-Baylon (1986), and March (1994)

  6. March’s Garbage Can Model of Decision Making Complex interactions between: • Actors (decision makers) • Problems • Choice opportunities • Potential solutions

  7. March’s Garbage Can Model of Decision Making • Some ways to “improve” garbage-can decision making • Increase the heat and pressure • Force more frequent interactions • Put more garbage in the can • Add more potential solutions • Add more problems, more actors, more decision opportunities too? • Make actors “stickier” • Hope that they will carry problems or potential solutions with them long enough to get a better fit
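The confluence dynamics behind slides 6 and 7 can be sketched as a toy simulation. This is illustrative only: the four streams, the arrival rates, the ten-way "fit" rule, and the stickiness mechanism are assumptions for the sketch, not taken from Cohen, March, and Olsen.

```python
import random

def garbage_can(ticks=10000, stickiness=0.0, seed=1):
    """Toy garbage-can model: a decision happens only when an actor,
    a problem, and a potential solution coincide at a choice
    opportunity AND the solution happens to fit the problem.
    `stickiness` is the probability an unsolved problem is carried
    into the next tick, giving it more chances to meet a fit."""
    rng = random.Random(seed)
    decisions, held_problem = 0, None
    for _ in range(ticks):
        # Four loosely coupled streams flowing through the organization.
        actor_here = rng.random() < 0.5
        opportunity = rng.random() < 0.5
        problem = held_problem if held_problem is not None else (
            rng.randrange(10) if rng.random() < 0.5 else None)
        solution = rng.randrange(10) if rng.random() < 0.5 else None
        held_problem = None
        if actor_here and opportunity and problem is not None and solution is not None:
            if problem == solution:          # confluence plus a lucky fit
                decisions += 1
            elif rng.random() < stickiness:
                held_problem = problem       # carry it forward, hoping for a fit
        elif problem is not None and rng.random() < stickiness:
            held_problem = problem
    return decisions

# Stickier actors hold problems long enough to meet a fitting solution,
# so more decisions get made.
print(garbage_can(stickiness=0.0), garbage_can(stickiness=0.9))
```

Run repeatedly with different seeds and the sticky variant consistently closes more decisions, which is the point of the "make actors stickier" recommendation.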

  8. Description • “Deciding Which Method to Use” • OR analysts make great critics of the decision-making of others • But, turn the lens into a mirror • The Analyst’s Toolbox… Our Experience • Canadian Army Modelling and Simulation • Specific Examples and Lessons

  9. The Operational Research Tool Box – Some Examples (Source: Future Army Development Plan, 1998). A spectrum of OR/OA methods from soft to hard; moving toward the hard end demands more time and resources and yields more credibility, the soft end less: • Structured brainstorming • Scenarios • Analysis of historical data • Options analysis, decision analysis • Group ranking • Seminar war games • Analytical tools – mathematical analysis • Computer-based war games (JANUS, OneSAF) • Trials, exercises, and evaluations

  10. Our Early Meta-decision Rules for “Deciding Which Method to Use” • If time and resources are short, and credibility is not an issue, then use soft OR methods • If credibility is paramount, and time and resources are available, then use hard OR methods • Our experience supports: “hard and soft methods can be used together at appropriate stages of a study (usually soft for problem formulation, hard for resolution)”
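These early meta-decision rules are simple enough to write down as a rule table. A minimal sketch, assuming three boolean inputs; the function name and return labels are illustrative, not from the slide.

```python
def choose_method(time_short: bool, resources_short: bool,
                  credibility_paramount: bool) -> str:
    """Encodes the early meta-decision rules: soft OR when time or
    resources are short and credibility is not at stake; hard OR when
    credibility is paramount and time and resources allow; otherwise
    combine the two (soft for problem formulation, hard for resolution)."""
    if (time_short or resources_short) and not credibility_paramount:
        return "soft"
    if credibility_paramount and not (time_short or resources_short):
        return "hard"
    return "soft-then-hard"

print(choose_method(True, True, False))    # soft
print(choose_method(False, False, True))   # hard
print(choose_method(True, False, True))    # soft-then-hard
```

The fall-through case captures the slide's own caveat: when the rules conflict, the two families of methods are used together at different stages of the study.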

  11. Simulation Types. [Diagram: a two-by-two grid of people (real or simulated) against equipment (real or simulated). Live simulation: real people, real equipment. Virtual simulation: real people, simulated equipment. Constructive simulation: simulated people, simulated equipment.]
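The two-by-two on this slide follows the standard live/virtual/constructive taxonomy, which reduces to a small lookup. The function name is illustrative.

```python
def simulation_type(people_real: bool, equipment_real: bool) -> str:
    """Standard live/virtual/constructive taxonomy: live = real people
    operating real equipment; virtual = real people operating simulated
    equipment; constructive = simulated people operating simulated
    equipment. The fourth cell (simulated people operating real
    equipment) has no standard name."""
    if people_real and equipment_real:
        return "live"
    if people_real:
        return "virtual"
    if not equipment_real:
        return "constructive"
    return "unnamed"

print(simulation_type(True, True))    # live
print(simulation_type(True, False))   # virtual
print(simulation_type(False, False))  # constructive
```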

  12. Goal: Synthetic Environment. [Diagram: real-world C2 systems linked to a constructive joint war game; Army, Navy, and Air Force constructive simulations at the operational level; Army, Navy, and Air Force virtual simulators and live simulation at the tactical level; all interconnected by links.]

  13. Constructive Simulations • CAST – Command and Staff Trainer • UK’s ABACUS

  14. Constructive Simulations

  15. Lesson 1: AE-7B* Armed Griffon Helicopter • Aim. To provide insights into differences in effectiveness and survivability between an armed and an unarmed helicopter • Purpose. To validate the incorporation of aerial firepower requirements in the CH-146 Griffon Helicopter mid-life upgrade • Simulation Structure. Constructive and virtual computer-based simulation supported by OneSAF Testbed Baseline (OTB), with virtual workstations for Griffon helicopters and Pointer-type Unmanned Air Vehicles (UAV) * AE = Army Experiment

  16. Lesson 1: AE-7B Armed Griffon Helicopter. [Diagram: constructive OTBSAF at the tactical level, linked over DIS to virtual Griffon and Pointer workstations.]

  17. AE-7B – The Textbook Case. [Timeline, Jun 02 to Apr 03, through Define, Develop, Conduct, and Analyze phases, report in Apr 03: strong sponsor involvement; weak synthetic environment specification; late terrain generation; late scenario build; excellent participant base; automated small-sample statistics; late Griffon simulation delivery. Corrective action: implemented a Synthetic Environment Statement of Requirement.]

  18. Lesson 2: LOE* 0301 – TUAV in Controlled Airspace • Aim. To investigate possible reduced airspace restrictions for UAVs if Air Traffic Controllers have better situational awareness • Objective. To identify any differences in situational awareness enabled by various improvements to ATC situational awareness (in the vicinity of Kabul) • Simulation Structure. Constructive and virtual computer-based simulation supported by OneSAF Testbed Baseline (OTB), with virtual workstations for ATC Tower, Air Defence and ATC radars, and UAV * LOE = Limited Objective Experiment

  19. Lesson 2: LOE 0301 – TUAV in Controlled Airspace. [Diagram: constructive OTBSAF at the tactical level, linked over DIS to virtual ATC radar, tower operations, TUAV, and air defence radar/electro-optical workstations.]

  20. LOE 0301 – The Short Circuit. [Timeline, May 03 to Nov 03, through Define, Develop, Conduct, Analyze, and Report phases: staff-check problem identification; sponsor identification; iterative prototyping; incomplete terrain; sponsor participant access; automated situation-awareness analysis tools; report pre-preparation. Corrective action: create a Terrain/Visualization Cell.]

  21. Ascription • Does the Garbage Can Model fit? Is it really that bad? • Decision outcomes driven by temporal confluence of decision makers, choice opportunities, problems, and potential solutions • Is the method we use for “deciding which method to use” any better than that?

  22. The Operational Research Skill Set. Each method in the toolbox, soft through hard, demands its own skills. Toolbox: structured brainstorming; scenarios; analysis of historical data; options analysis, decision analysis; group ranking; seminar war games; analytical tools – mathematical analysis; computer-based war games (JANUS, OneSAF); trials, exercises, and evaluations. Skill set: facilitator; scribe; co-author of scenarios; enumerator of votes; analyst of preferences; modeller; mathematician/statistician; experimenter in simulation space; experimenter in live trials.

  23. Roles of Soft OR Methods • To support thinking and planning by an individual analyst or decision maker – problem articulation • To support discussion between consultant and decision maker – problem negotiation • To support debate and conclusion among decision makers – group decision support • To initiate or strengthen organisational capabilities – organisational development – Source: Steve Cropper presentation in Holt and Pickburn (2001)

  24. Prescription • Consider the full spectrum in the tool box • Prepare for the full range of required skills • Implement a “lessons” process, and strive to improve

  25. Application of the Spectrum of Methods • Soft and hard OA should be viewed as equally useful and part of the analyst’s toolkit. These should be viewed as a spectrum of options within the analyst’s toolkit which should be selected and used as appropriate. • It was recognised that hard and soft methods can be used together at appropriate stages of a study (usually soft for problem formulation, hard for resolution). • Analysts need to be aware of all OA techniques and, broadly where they should and should not be employed. Better education may help to achieve this. • An expert practitioner (or team) is needed with expertise in the range of methods appropriate to the problem. – Extracts from Holt and Pickburn (2001), pp. 15, 17 and 19

  26. Proscription • Woe, woe, and thrice woe • Beware of complexity

  27. Models, Complexity and Error. [Chart: model error as a function of complexity, the sum of specification error (falling as complexity grows) and measurement error (rising as complexity grows); the aim is to choose the complexity that minimizes model error.]

  28. Models, Complexity and Error. [Chart: if we can increase the accuracy of performance characteristics, measurement error falls, so we can accommodate greater complexity and overall error drops.]

  29. Models, Complexity and Error. [Chart: alternatively, minimize model error by decreasing complexity.]
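The trade-off sketched in slides 27 to 29 can be shown numerically. The functional forms below are assumptions chosen only to reproduce the shape of the argument (specification error falling with complexity, measurement error rising with it); they are not from the slides.

```python
def total_error(complexity: float, measurement_accuracy: float = 1.0) -> float:
    """Model error as the sum of two assumed components:
    specification error shrinks as the model captures more of the
    system, while measurement error grows with the number of
    performance characteristics that must be estimated."""
    specification_error = 1.0 / complexity
    measurement_error = 0.1 * complexity / measurement_accuracy
    return specification_error + measurement_error

def best_complexity(measurement_accuracy: float = 1.0) -> float:
    """Grid search for the complexity that minimizes total error."""
    candidates = [c / 100 for c in range(10, 2000)]
    return min(candidates, key=lambda c: total_error(c, measurement_accuracy))

c1, c2 = best_complexity(1.0), best_complexity(4.0)
print(c1, total_error(c1, 1.0))
print(c2, total_error(c2, 4.0))
# Better measurement accuracy both lowers the achievable error and
# lets the model carry more complexity before errors dominate.
```

This reproduces both slides: improving measurement accuracy (slide 28) lowers overall error and shifts the optimum toward higher complexity, while with fixed accuracy the only lever is reducing complexity (slide 29).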

  30. Main Recommendation • Make the actors “stickier”: • Analysts need to be aware of all OA techniques and, broadly where they should and should not be employed. Better education may help to achieve this • An expert practitioner (or team) is needed with expertise in the range of methods appropriate to the problem Source: Holt and Pickburn (2001)

  31. References • Michael D. Cohen, James G. March, and Johan P. Olsen (1972) “A Garbage Can Model of Organizational Choice”, Administrative Science Quarterly, Vol. 17, No. 1, pp. 1–25 • James G. March and Roger Weissinger-Baylon, eds. (1986) Ambiguity and Command: Organizational Perspectives on Military Decision-Making. Marshfield, MA: Pitman Publishing Inc. • James G. March (1994) A Primer on Decision Making: How Decisions Happen. New York: The Free Press • Future Army Development Plan (1998) Kingston, Ontario: Directorate of Land Strategic Concepts • John Holt and George Pickburn (2001) OA Techniques for the Future. Farnborough, UK: Defence Evaluation and Research Agency, 30 March 2001

  32. Questions – Discussion Fred Cameron Tel: +1.613.541.5010 ext 2470 Email: Cameron.FWP2@forces.gc.ca
