
Improving Confidence in the Assessment of System Performance in Differing Scenarios.


Presentation Transcript


  1. Improving Confidence in the Assessment of System Performance in Differing Scenarios. T D Clayton Cardinal Consultants

  2. 1. Context 2. Scenario Dependency of Input Data 3. Choosing Scenarios to Assess 4. Modelling Widely Differing Scenarios 5. Example Study 6. Summary and Conclusions

  3. SYSTEM EFFECTIVENESS ASSESSMENT

  4. SYSTEM EFFECTIVENESS ASSESSMENT Warhead Lethality

  5. SYSTEM EFFECTIVENESS ASSESSMENT Warhead Lethality Combat modelling

  6. [Diagram: SYSTEM EFFECTIVENESS ASSESSMENT, linking subsystem inputs (Sensor Performance, Guidance System, Warhead / Fuze Performance, Operator Performance, Other subsystems) with Combat modelling, Wargaming and Tactical / Strategic studies]

  7. Purpose of System Effectiveness Studies • Research / long term development objectives • Medium term procurement objectives • Design optimisation • Procurement decisions • Input to Operational / Tactical Studies

  8. But, whatever the purpose, scenario assumptions are critical. Or rather, we should assume they are until proven otherwise.

  9. Rule 1 Everything is scenario dependent.

  10. [Slide 6 diagram repeated, with the top-level elements highlighted]

  11. [Slide 6 diagram repeated, with all elements highlighted]

  12. Pk = 0.47

  13. • Nature of ground around the target • Presence of adjacent trees, or protective earthworks • Azimuth distribution • Elevation distribution • Relative value of M-kill, F-kill, P-kill, K-kill • Likelihood of multiple hits • Using an MFK value as a probability?
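A single figure such as the Pk = 0.47 on slide 12 folds all of these dependencies into one number. As a minimal sketch of the alternative (not from the original study; all probabilities, bands and weights below are hypothetical), a scenario-weighted Pk can be computed by averaging conditional kill probabilities over the azimuth and range distributions a given scenario implies:

```python
import numpy as np

# Hypothetical conditional kill probabilities Pk(azimuth band, range band).
# Rows: azimuth bands (front, side, rear); columns: range bands (short, long).
pk_table = np.array([[0.30, 0.15],
                     [0.60, 0.40],
                     [0.70, 0.45]])

# Scenario-dependent weights: how often each azimuth / range band occurs.
# These are exactly the distributions that change from scenario to scenario.
azimuth_weights = np.array([0.50, 0.35, 0.15])
range_weights = np.array([0.70, 0.30])

# Expected Pk for this scenario: weighted average over both distributions.
expected_pk = azimuth_weights @ pk_table @ range_weights
print(f"Scenario-weighted Pk = {expected_pk:.2f}")  # 0.41 here, not a universal 0.47
```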

  14. The Multi-Disciplinary Problem: Lethality Expert • Combat Modeller • Systems Modeller

  15. The Management Solution Establish roles and responsibilities for managing the interfaces between expert groups.

  16. Responsibilities of the Interface Manager • Understand methodologies and assumptions at all levels • Organise training / briefings to help expert groups broaden their knowledge • Conduct studies to measure scenario dependencies of results • Maintain a knowledge base of dependencies and “corrections” • Involvement in planning of studies, addressing assumptions • Involvement in reporting of studies, especially assumptions

  17. [Diagram: Studies 1, 2 and 3 feed a MAIN DATABASE OF STUDY RESULTS, supporting ‘offline’ analysis tools, comparison & analysis, study planning and analysis, and data provided to other studies, alongside a DATABASE OF SCENARIO COMPENSATION FACTORS]

  18. [Diagram: Studies 1, 2 and 3 feed the MAIN DATABASE OF STUDY RESULTS; SCFs calculated from new studies go into the DATABASE OF SCENARIO COMPENSATION FACTORS, where assessment and comparison of SCFs produces modified SCFs]
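The slides do not spell out how an SCF is calculated. One plausible minimal form, assuming an SCF is simply the ratio of a result in a given scenario to the same result in a reference scenario (system names and all numbers below are hypothetical):

```python
# Hypothetical study results: the same MoE for one system assessed in a
# reference scenario and in two other scenarios.
study_results = {
    ("system_a", "reference"): 0.47,
    ("system_a", "desert"): 0.31,
    ("system_a", "urban"): 0.58,
}

def scenario_compensation_factor(results, system, scenario, reference="reference"):
    """SCF as the ratio of the scenario result to the reference result."""
    return results[(system, scenario)] / results[(system, reference)]

# Build the database of SCFs from completed studies...
scf_db = {
    scenario: scenario_compensation_factor(study_results, "system_a", scenario)
    for scenario in ("desert", "urban")
}

# ...then use it to adjust a new study that only assessed the reference scenario.
new_reference_result = 0.52
estimated_desert_result = new_reference_result * scf_db["desert"]
print(scf_db, round(estimated_desert_result, 2))
```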

  19. Rule 2 You will never assess the right scenarios.

  20. Scenario Parameters • Climate: Temperature; Precipitation • Ground: Vegetation; Topology; Roads • Geography & Politics: Geographic isolation; Neighbouring countries; Local civilian population • Opposing Ground Equipment, Posture & Deployment: Technology; Numbers; Own Intell.; Posture (defensive, attacking); Deployment and detectability • Air Capability: Aircraft types; Level of technology; Numbers; Own Intell. • Anti-Air Capability: Numbers of units; Capability; Own Intell. • Maritime: Maritime involvement; Capability • Opposing Max. Cap.: Nuc., Chem., Bio.; Short range; Long range • Opposing Troops: Numbers; Capability • BLUE ROLE: Peace keeping; combat (defensive); combat (hunt and kill)

  21. [Chart: Scenario 1, Scenario 2 and Scenario 3 shown as discrete assessed points]

  22. [The same chart, with the scenarios placed along a continuous parameter axis]
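If the point of slides 21 and 22 is that assessed scenarios are only discrete samples from a continuous space, the practical consequence of Rule 2 is that the scenario actually faced will usually fall between the assessed points. A minimal sketch of estimating performance at an unassessed point (the parameter, the values and the choice of linear interpolation are all illustrative assumptions):

```python
import numpy as np

# Assessed scenarios as points along one continuous parameter,
# e.g. typical engagement range in metres (values hypothetical).
assessed_parameter = np.array([400.0, 1200.0, 2500.0])  # Scenarios 1-3
assessed_moe = np.array([0.62, 0.47, 0.28])             # MoE at each point

# Estimate the MoE for an unassessed scenario at parameter = 800 m.
# Linear interpolation is itself a scenario assumption worth challenging.
estimated_moe = np.interp(800.0, assessed_parameter, assessed_moe)
print(f"Estimated MoE at 800 m: {estimated_moe:.2f}")  # approximately 0.55
```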

  23. Rule 3 A combat model cannot address widely differing scenarios.

  24. Example Study Comparative assessment of two candidate cannon systems for light armoured vehicles.

  25. ORIGINAL STUDY PLAN Input data → Engagement Model (developed for this study) → Combat model (existing), run across 3 Scenarios

  26. REVIEW OF PROVIDED DATA 1. When multiple hits are likely, a single-shot kill probability (SSKP) may not be appropriate. 2. Lethality figures give no azimuth dependency. 3. No information on range dependency. 4. Data required for a wider range of target types. Lethality models were re-run, in concert with the Engagement model.
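Point 1 can be made concrete. Treating each hit as an independent trial with a fixed SSKP gives P(kill after n hits) = 1 - (1 - SSKP)^n, but cumulative damage means later hits on an already-damaged target are not independent of earlier ones. A minimal sketch of the gap (the conditional probabilities are hypothetical):

```python
# Independence assumption: every hit is a Bernoulli trial with the same SSKP.
sskp = 0.15
n_hits = 4
p_kill_independent = 1 - (1 - sskp) ** n_hits  # about 0.48

# Cumulative damage: per-hit kill probability rises with each surviving hit
# (hypothetical conditional probabilities for illustration).
p_kill_given_prior_hits = [0.15, 0.25, 0.40, 0.60]
p_survive = 1.0
for p in p_kill_given_prior_hits:
    p_survive *= 1 - p
p_kill_cumulative = 1 - p_survive  # about 0.85

print(round(p_kill_independent, 2), round(p_kill_cumulative, 2))
```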

  27. REVIEW OF EXISTING COMBAT MODEL 1. Tends to choose tanks as the preferred target type. 2. All targets are land vehicles. 3. Terrain in all 3 scenarios tends to give long engagement ranges. 4. No variations in meteorological visibility or day/night, again leading to long ranges. 5. Same Blue positions for both System A and System B. 6. Units are static when firing.

  28. THE ALTERNATIVE APPROACH 1. Use a range of methods, including Military Judgement, to derive intermediate data and distributions reflecting a wide range of scenarios: • relative frequencies of target types engaged • engagement range distributions • azimuth distributions • probability of kill per burst, as a function of range and target type

  29. THE ALTERNATIVE APPROACH 1. Use a range of methods, including Military Judgement, to derive intermediate data and distributions reflecting a wide range of scenarios. 2. Develop a simple tool to calculate specific Measures of Effectiveness from the input data and distributions. MoE 1: Military Worth of kills per burst MoE 2: Military Worth of kills per ammunition load
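A minimal sketch of the core calculation such a tool might perform, combining the slide-28 intermediate data into the two MoEs (the target types, worths, frequencies, range weights, kill probabilities and load size below are all hypothetical):

```python
import numpy as np

# Hypothetical intermediate data: relative frequencies with which each
# target type is engaged, and the military worth of killing each type.
engage_freq = np.array([0.5, 0.3, 0.2])  # APC, truck, bunker
worth = np.array([3.0, 1.0, 2.0])

# Probability of kill per burst by target type (rows) and range band (columns),
# weighted by a scenario-wide engagement range distribution.
pk_per_burst = np.array([[0.40, 0.20],
                         [0.55, 0.35],
                         [0.30, 0.10]])
range_weights = np.array([0.6, 0.4])  # short / long range bands

# MoE 1: expected military worth of kills per burst.
pk_by_type = pk_per_burst @ range_weights
moe1 = float(np.sum(engage_freq * worth * pk_by_type))

# MoE 2: expected military worth of kills per ammunition load.
bursts_per_load = 12  # hypothetical ammunition load
moe2 = moe1 * bursts_per_load

# Slide 31's Military Judgement adjustments would enter here as explicit,
# reviewable multiplicative factors on moe1 / moe2.
print(f"MoE 1 = {moe1:.2f}, MoE 2 = {moe2:.1f}")
```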

  30. THE ALTERNATIVE APPROACH 1. Use a range of methods, including Military Judgement, to derive intermediate data and distributions reflecting a wide range of scenarios. 2. Develop a simple tool to calculate specific Measures of Effectiveness from the input data and distributions. • quick to develop • quick to run • facilitates review and scrutiny of data • stores data and maintains audit trails

  31. THE ALTERNATIVE APPROACH 1. Use a range of methods, including Military Judgement, to derive intermediate data and distributions reflecting a wide range of scenarios. 2. Develop a simple tool to calculate specific Measures of Effectiveness from the input data and distributions. • permit results to be adjusted by Military Judgement, to account for factors not addressed by the calculations: the value of the ability to fire on the move; the value of the greater manoeuvrability afforded by the lighter system

  32. SUMMARY AND CONCLUSIONS Appropriate methods of addressing scenario dependencies are essential to ensure study conclusions are valid. 1. ALL DATA should be regarded as being scenario-dependent. It is very useful to have an analyst in every team with special responsibility for addressing this problem. 2. Using combat models to compare performance of systems can be hazardous. Consider using a range of methods to generate intermediate results which are open to scrutiny and to sensitivity studies.

  33. [Navigation slide: hyperlinks to each section of the presentation]

  34. Further Development of the CST Tool 1. Development of a proper library of routines 2. Improved statistical routines for increased speed 3. Automated methods for parametric studies 4. Use of EDMS (electronic document management system) technologies to manage and access study reports

  35. CURRENT ISSUES / PROBLEMS WITH CST-01 1. It is not clear how best to address the problem of firing multiple bursts at a target, depending upon whether it is perceived to be killed. 2. It is not clear whether (and how) costs (or numbers of units) should be included, or handled separately.
