
DOE in TEMPs, T&E Concepts, Test Plans and BLRIPs

Lessons Learned from Case Studies


Purpose

  • Discuss lessons learned from past tests

  • Illustrate how DOE thinking can be applied to TEMPs, Test Plans, and other documents


Outline

  • Overview

  • Elements of “DOE” Process

  • Examples

    • Quantitative, Mission-Oriented Metrics

    • Coverage of Operational Envelope

    • Confidence and Power of Test

  • Summary


Overview

  • Based upon DOT&E initiative:

    • “Whenever possible, our evaluation of performance must include a rigorous assessment of the confidence level of the test, the power of the test and some measure of how well the test spans the operational envelope of the system.”

  • IDA conducted an analysis of select BLRIPs from the last two years

    • Noted a structured approach to testing that captures many aspects of these concepts

    • The analysis also identified areas of potential improvement

  • Next step: modify TEMPs, T&E concept papers, Test Plans, and BLRIPs to incorporate “DOE” concepts


Elements of “DOE” Process

  • Have quantitative, mission-oriented metrics:

    • What question(s) are we trying to answer?

      • e.g., Can a unit equipped with the Mobile Gun System (MGS) successfully accomplish its missions?

    • What are the applicable metrics?

  • Describe how well the operational envelope is covered:

    • Identify factors that drive performance

      • e.g., threat, terrain, environment, mission

    • Identify levels for each factor

    • Show how well the test covers the operational envelope

      • For both individual test periods and the overall test program

  • Calculate the confidence level and power of the test (a planning sketch appears below):

    • Test plan:

      • Significance, Power, Effect Size, sample size …

    • Test reports:

      • XX% confidence intervals

      • Confidence that performance is above threshold

  • Consider whether standard DOE techniques are applicable

There is no “one size fits all” solution
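
To make the confidence/power element concrete, here is a minimal planning sketch (not from the briefing) that solves for sample size given significance, power, and effect size, using a normal approximation for comparing two success proportions; the 0.60 and 0.80 rates are hypothetical.

# Illustrative sketch: sample size needed to compare two proportions
# at a given significance level and power (normal approximation).
from scipy.stats import norm

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sided two-proportion z-test."""
    z_a = norm.ppf(1 - alpha / 2)          # critical value for significance
    z_b = norm.ppf(power)                  # quantile for desired power
    pbar = (p1 + p2) / 2                   # pooled proportion under H0
    num = (z_a * (2 * pbar * (1 - pbar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return num / (p1 - p2) ** 2

# Hypothetical planning values: legacy rate 0.60, goal of detecting 0.80
print(round(n_per_group(0.6, 0.8)))        # ~81 trials per group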


Quantitative, Mission-Oriented Metrics


Mission-Oriented Metrics

  • Case studies identified several areas for potential improvement

  • Ensure metrics and KPPs are measurable and testable

    • As defined, many are not, e.g., “The Mobile Gun System (MGS) primary armament must defeat a standard infantry bunker and create an opening in a double reinforced concrete wall, through which infantry can pass.”

  • Frequently mission-oriented metrics do not have thresholds

    • Consider whether they should have a threshold

  • Is the standard “at least as good as (or better than) the legacy system”?

    • Do you have quantitative data on the legacy system?

Look at metrics during the JCIDS process


Surveys

  • Surveys have frequently been qualitative and poorly designed

  • There is a science behind survey design; use it

    • Be quantitative (e.g., Likert scale; see the sketch below)

  • During analysis, watch for discrepancies between numerical scores and written comments

Be careful with surveys
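
As a small illustration of quantitative survey scoring, here is a sketch (with hypothetical responses) that summarizes 5-point Likert answers with a mean and a t-based confidence interval, which can then be cross-checked against written comments.

# Illustrative sketch: quantitative summary of 5-point Likert responses.
# The responses below are hypothetical.
from statistics import mean, stdev
from scipy.stats import t

scores = [4, 5, 3, 4, 2, 5, 4, 3, 4, 4]    # 1 = strongly disagree ... 5 = strongly agree
n = len(scores)
m, s = mean(scores), stdev(scores)
half = t.ppf(0.975, n - 1) * s / n ** 0.5  # 95% t-interval half-width
print(f"mean = {m:.2f}, 95% CI = ({m - half:.2f}, {m + half:.2f})")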


Coverage of Operational Envelope


Coverage of the Operational Envelope

  • 1st Step: Identify factors & levels of interest

    • In the case studies, factors & levels of interest were sometimes specified up front; in other cases they were added retrospectively

  • 2nd Step: Determine breadth of coverage of operational envelope

    • Tools illustrated in the following examples: cross-tabular matrices, continuous plots, and other graphical representations (a minimal cross-tab sketch follows this list)

    • These are examples; do not restrict yourself to them

  • Power analysis can help determine if test design is sufficient

    • See the next section of this brief
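
A minimal sketch of the cross-tabular idea, using hypothetical factors, levels, and tested points: enumerate the full factorial envelope and flag untested cells as gaps.

# Illustrative sketch: cross-tabular coverage check of the operational
# envelope. Factors, levels, and "tested" combinations are hypothetical.
from itertools import product

factors = {
    "Mission": ["Raid", "Area Defense"],
    "Terrain": ["Urban", "Open"],
    "Threat":  ["Low", "Medium", "High"],
}
tested = {("Raid", "Urban", "Medium"), ("Raid", "Open", "High"),
          ("Area Defense", "Urban", "High")}

# Walk the full factorial envelope and flag the untested cells (gaps).
for cell in product(*factors.values()):
    mark = "TESTED" if cell in tested else "GAP"
    print(f"{cell}: {mark}")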




Mobile Gun System (MGS) – Coverage of Operational Envelope

  • 4 Factors: Mission Type, Terrain Type, Threat Level & Illumination

  • IOT test design builds on evidence from previous events

    • Mission Rehearsal Exercise prior to unit deployment (basis for the Section 231 report)

    • Field data from unit deployment

  • IOT scoped to focus on voids in the medium and high threat levels

Weather: as it occurred; not controlled

Early deployment changed the original DOE plan

Lesson Learned: “DOE” identified gaps in coverage, partially filled from other sources


USS Virginia – Anti-Submarine Warfare (ASW) Search

  • What is the operational envelope? (factors and levels)

    • Environmental Factors

      • Shipping Levels and Sea State (ambient noise)

      • Sound Velocity Profiles (several types – each with different sound propagation characteristics)

    • Target types and operating modes

      • SSN (signature, sonar capability/proficiency)

      • SSK (signature, operating modes, sonar capability/proficiency)

    • Test submarine configurations (two towed arrays and wide aperture array)

    • Scenarios (area search, barrier search, cued intercept, multiple targets)

  • The cross-tabular matrix from the previous example might not illustrate breadth of coverage appropriately!




USS Virginia – ASW Coverage of Operational Envelope

  • Plot simplifies environmental and target type factors into ordinal comparisons

  • Only tested Virginia with the TB-29 towed array (inadequacy noted in the BLRIP)

  • Area search considered most difficult; other scenarios not examined in IOT&E

  • Stimulated sensors to simulate the multiple-target scenario

  • No SSK testing with Virginia conducted

    • ARCI data used to provide the assessment

  • Two Virginia tests do not cover the entire environmental space

[Figure: coverage plot with regions marked “Unknown Performance” and “Historical data sufficient to assess performance”; a response curve is difficult to determine from two SSN tests]

Lesson Learned: “DOE” helped identify gaps




USS Virginia – Strike Coverage of Operational Envelope

Strike mission broken into phases with multiple factors/levels (a roll-up sketch follows below):

P(Missile Placement) = P_EP × P_A × P_TGT × P_L × P_M

Limited missile firings will be discussed later
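
As a simple illustration of the roll-up, the sketch below multiplies hypothetical phase probabilities (the values are not from the BLRIP) to get the overall missile-placement probability.

# Illustrative sketch: rolling up strike-mission phase probabilities.
# All phase estimates below are hypothetical.
from math import prod  # Python 3.8+

phases = {"P_EP": 0.98, "P_A": 0.95, "P_TGT": 0.90, "P_L": 0.97, "P_M": 0.93}
p_placement = prod(phases.values())
print(f"P(Missile Placement) = {p_placement:.3f}")  # ~0.756 for these values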


Joint Chemical Agent Detector (JCAD)

  • What is the operational envelope? (factors and levels)

    • Agent (9 agents and 2 simulants)

    • Temperature, water vapor concentration, agent concentration, interferent (continuous)

    • Environment (sand, sun, wind, rain, snow, fog)

    • Service (Army, Air Force, Navy, Marine Corps)

    • JCAD Mode (Monitor, Survey, TIC)

    • Operator (Any MOS to CBRN Specialist)

    • TTP (Monitor Mission, Survey Mission, Decon Support)




Joint Chemical Agent Detector (JCAD) – Coverage of Operational Envelope

“DOE” applied to full test program for breadth of coverage

Response Surface Design applied to chamber tests (a design sketch follows below)
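
The briefing cites a response surface design for the chamber tests; as a generic stand-in (not the actual I-optimal JCAD design), here is a face-centered central composite layout in two coded factors.

# Illustrative sketch: face-centered central composite design in two coded
# factors (e.g., temperature and agent concentration). This is a generic
# response-surface layout, not the I-optimal design actually used.
from itertools import product

factorial = list(product([-1, 1], repeat=2))   # corner points
axial = [(-1, 0), (1, 0), (0, -1), (0, 1)]     # face-centered star points
center = [(0, 0)] * 3                          # replicated center points

design = factorial + axial + center
for run, (x1, x2) in enumerate(design, 1):
    print(f"run {run:2d}: temperature={x1:+d}, concentration={x2:+d}")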




Joint Chemical Agent Detector (JCAD) – DT Chamber Test: Response Surface Design

“DOE” helps determine whether gaps are significant to the overall assessment


Confidence and Power of Test


Confidence and Power of Test

  • Test Planning vs. Test Reporting

  • Test Planning

    • What confidence level do we need?

    • Construct the power of the test – do we have a high probability that the test will detect important differences?

  • Test Reporting

    • Provide confidence levels for all results.

    • Provide confidence above threshold where required.


Joint Chemical Agent Detector (JCAD) – Power of Test

  • Power Analysis for JCAD Chamber Test

    • DT Testing

    • Statistical Response Surface Design (I-Optimal)

    • High-power test plan (a simulation sketch follows below)

*S:N – signal-to-noise ratio: the goal detectable difference expressed as a ratio to the design standard deviation
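
One way to check a power claim for a designed test is simulation; this sketch (hypothetical sample size and S:N) estimates the power of a two-sample t-test at a given signal-to-noise ratio.

# Illustrative sketch: estimating power by simulation for a given
# signal-to-noise ratio (S:N = detectable difference / std. deviation).
# The sample size and S:N below are hypothetical.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
n, sn, alpha, reps = 12, 1.5, 0.05, 5000

hits = 0
for _ in range(reps):
    a = rng.normal(0.0, 1.0, n)   # baseline condition
    b = rng.normal(sn, 1.0, n)    # shifted by S:N standard deviations
    if ttest_ind(a, b).pvalue < alpha:
        hits += 1
print(f"estimated power ~ {hits / reps:.2f}")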


Mobile Gun System (MGS) – Power of Test

  • Original Test Plan (Sample Size = 22)

  • DOE Interrupted by Deployment (Sample Size = 16)

*S:N – signal-to-noise ratio: the goal detectable difference expressed as a ratio to the design standard deviation

Lesson Learned: a smaller sample size decreases power (illustrated in the sketch below)
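
A quick illustration of the lesson, using a hypothetical 0.60 threshold and a true success rate of 0.80 (neither value is from the MGS test): normal-approximation power for a one-sided proportion test at n = 22 versus n = 16.

# Illustrative sketch: how the drop from 22 to 16 missions reduces power.
# Rates and threshold are hypothetical.
from scipy.stats import norm

def power(n, p0=0.60, p1=0.80, alpha=0.05):
    z_a = norm.ppf(1 - alpha)
    se0 = (p0 * (1 - p0) / n) ** 0.5   # SE under the null
    se1 = (p1 * (1 - p1) / n) ** 0.5   # SE under the alternative
    crit = p0 + z_a * se0              # rejection boundary on the proportion scale
    return 1 - norm.cdf((crit - p1) / se1)

for n in (22, 16):
    print(f"n = {n}: power ~ {power(n):.2f}")
# Power falls from roughly 0.63 at n = 22 to roughly 0.49 at n = 16.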




EA-18G/EA-6B Comparison – Confidence Intervals

[Figure from the DOT&E EA-18G BLRIP: percent success for each system, shown with confidence intervals]

Confidence intervals make it clear that performance is comparable (a minimal sketch follows below)
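
A minimal sketch of the comparison (counts are hypothetical, not the BLRIP data): Wilson score intervals for each system's percent success, whose overlap supports a "comparable performance" reading.

# Illustrative sketch: Wilson score intervals for two systems' percent
# success, to support a side-by-side comparison. Counts are hypothetical.
from scipy.stats import norm

def wilson(successes, n, conf=0.95):
    z = norm.ppf(1 - (1 - conf) / 2)
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * ((p * (1 - p) / n + z**2 / (4 * n**2)) ** 0.5) / denom
    return center - half, center + half

for name, (k, n) in {"EA-18G": (17, 20), "EA-6B": (15, 19)}.items():
    lo, hi = wilson(k, n)
    print(f"{name}: {k}/{n} success, 95% CI = ({lo:.2f}, {hi:.2f})")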


Mobile Gun System (MGS) – Confidence Intervals

Even without a threshold, confidence intervals quantify how well the metric was measured




MH-60R/S P3I – Confidence Above Threshold

For both aircraft, all mission failures were due to legacy airframe issues vice P3I systems

Lesson Learned: the data needed to calculate confidence above threshold were not available; watch the data collection and management plan (a minimal calculation sketch follows below)
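
Had the data been available, the calculation itself is straightforward; this sketch uses hypothetical counts and a hypothetical 0.70 threshold to compute confidence that performance exceeds the threshold via an exact binomial tail.

# Illustrative sketch: confidence that performance exceeds a threshold,
# via an exact binomial calculation. Counts and threshold are hypothetical.
from scipy.stats import binom

k, n, threshold = 18, 22, 0.70            # observed successes, trials, requirement

# One-sided p-value for H0: p <= threshold; confidence is its complement.
p_value = binom.sf(k - 1, n, threshold)   # P(X >= k | p = threshold)
print(f"confidence that p > {threshold}: {1 - p_value:.2f}")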


Mobile Gun System (MGS) – Data Analysis

“DOE” illustrates how performance varies across envelope

  • Overall Mission Success Rate is 69%

  • Mission success was tied to the unit achieving its assigned objectives and to unit losses


USS Virginia Metrics – Confidence Intervals

Statistical metrics may require special techniques

Provide supplementary details from past testing (e.g., “Previous Tomahawk testing demonstrated …”); use factors and past data to identify limited test scenarios (a minimal sketch follows below)
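
One hedged way to fold past data into a limited new test is a beta-binomial update; the sketch below uses entirely hypothetical legacy and new-firing counts.

# Illustrative sketch: folding past test data into an assessment of a
# limited new test via a beta-binomial update. All counts are hypothetical.
from scipy.stats import beta

prior_k, prior_n = 45, 50   # hypothetical legacy (e.g., past Tomahawk) results
new_k, new_n = 4, 4         # hypothetical limited new firings

post = beta(1 + prior_k + new_k, 1 + (prior_n - prior_k) + (new_n - new_k))
lo, hi = post.ppf(0.025), post.ppf(0.975)
print(f"posterior mean = {post.mean():.2f}, 95% interval = ({lo:.2f}, {hi:.2f})")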


Summary


Summary

  • Next Steps: Modify TEMPs, T&E concept papers, Test Plans, and BLRIPs to incorporate “DOE” concepts

  • Have quantitative, mission-oriented metrics:

    • What question(s) are we trying to answer?

    • What are the applicable metrics?

  • Describe how well the operational envelope is covered:

    • Identify factors that drive performance

    • Identify levels for each factor

    • Show how well the test covers the operational envelope

      • For both individual test periods and the overall test program

  • Calculate the confidence level and power of the test:

    • Test plan:

      • Significance, Power, Effect Size, sample size …

    • Test reports:

      • XX% confidence interval

      • Confidence that performance is above threshold

  • Consider whether standard DOE designs are applicable

