slide1

ASW METOC Metrics: MPRA Committee Report

Bruce Ford

Clear Science, Inc. (CSI)

bruce@clearscienceinc.com

Tom Murphree

Naval Postgraduate School (NPS)

murphree@nps.edu

Brief for ASW METOC Metrics Symposium Two

02-04 May, 2007

slide2

MPRA Focus Committee Report

  • Scope
  • Customers
  • METOC inputs to mission phases
  • METOC performance metrics
  • Customer performance metrics
  • Operational performance metrics
  • Proxy operational metrics
  • Other metrics
  • Data collection systems
  • Data analysis process
  • Operational modeling
  • Funding Levels

slide3

MPRA Focus Committee Members

  • LEAD: Clear Science – Mr. Bruce Ford
  • NOAD Kadena OIC - LCDR Danny Garcia
  • NOAD JAX OIC – LT Eric MacDonald
  • CPRG - CDR Sopko
  • NRL - Pat Hogan
  • APL- UW – Mr. Bob Miyamoto
  • FNMOC – LTJG Dave Watson
  • PDD South - Doug Lipscombe
  • SPA - Paul Vodola, Matt McNamara, Luke Piepkorn

slide4

MPRA Focus Committee Report

  • Scope
  • The building block from which MPRA metrics could be built is the data surrounding an individual MPRA mission
    • Mission execution package (MEP)
    • Verification of MEP discrete elements
    • Mission objectives (GREEN, PURPLE)
    • Mission outcomes (PURPLE)
    • Note: No routinely produced planning product for individual MPRA missions
  • Expanded scope will be proposed for additional metrics

slide5

MPRA Focus Committee Report

  • Customers
  • Primary
    • MPRA Aircrews – primary focus
  • Secondary
    • Wing/TSC stations
    • Supported activities
      • Other warfare communities

slide6

MPRA Focus Committee Report

Planning Timeline

  • Climo/advisory inputs at IPC/MPC/FPCs for large-scale exercises
  • Wing-level training is planned about a month in advance of missions
  • Individual aircrew planning occurs within 24-36 hours prior to the mission
    • No routinely produced planning product
    • Planning info passed informally (conversation, phonecon)
    • Amount of planning is mission dependent (e.g., multistatic missions may involve more planning by aircrew)

slide7

MPRA Focus Committee Report

Planning Timeline

  • GREEN messages released within 24 hours prior to launch (fields captured in the sketch after this list)
    • Mission date/event number
    • Mission type
    • Squadron and callsign
    • On-station area
    • Flight levels
    • On and off station times
  • Mission execution brief
    • Conducted about 3 hours prior to mission launch
    • MEP briefed
    • Copy of MEP provided to the aircrew
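
As a concrete illustration of how the discrete GREEN fields listed above could feed an automated collection system, here is a minimal sketch; the record layout and field names are assumptions, not a fielded message schema.

```python
# Minimal sketch: the discrete GREEN-message fields listed above,
# captured as a structured record for automated metrics collection.
# Field names and sample values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class GreenMessage:
    mission_date: str
    event_number: str
    mission_type: str
    squadron: str
    callsign: str
    on_station_area: str
    flight_levels: str
    on_station_time: str
    off_station_time: str

msg = GreenMessage("02 MAY 07", "E01", "ASW", "VP-XX", "PD-01",
                   "AREA X", "FL050-FL150", "1400Z", "2200Z")
```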

slide8

MPRA Focus Committee Report

Execution Timeline

  • During the mission, data are collected for inclusion in the PURPLE
    • Weather conditions data
    • Bathythermograph (BT) data
    • Ambient noise (AN) data

slide9

MPRA Focus Committee Report

Debrief Timeline

  • Other Post-mission Activities
  • Mission Construction and Evaluation (MC&E) assigns mission grade – within a week following mission
  • Mission data archived for a month
  • BT data archived for 1 year

slide10

MPRA Focus Committee Report

Data for Potential METOC Performance Metrics

  • List may be expanded; we recommend that all verifiable elements be collected and verified
  • A verification scheme needs to be developed (a minimal sketch follows this list)
  • Many ranges may be forecast, but few are verified
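
A minimal sketch of what element-by-element verification could look like, assuming each verifiable MEP element is stored as a forecast/observed pair with a tolerance; the element names, values, and tolerances are illustrative.

```python
# Minimal sketch of element-by-element MEP verification. Assumes each
# verifiable element is a (forecast, observed, tolerance) record;
# all names and numbers are illustrative.
from dataclasses import dataclass

@dataclass
class ElementRecord:
    name: str         # e.g., "sonic layer depth (ft)"
    forecast: float
    observed: float
    tolerance: float  # max absolute error still counted as a hit

def verify(records):
    """Per-element error plus an overall hit rate."""
    results, hits = {}, 0
    for r in records:
        error = r.forecast - r.observed
        hit = abs(error) <= r.tolerance
        hits += hit
        results[r.name] = {"error": error, "hit": hit}
    results["hit_rate"] = hits / len(records) if records else float("nan")
    return results

records = [
    ElementRecord("sonic layer depth (ft)", 150.0, 180.0, 50.0),
    ElementRecord("significant wave height (ft)", 4.0, 6.5, 2.0),
]
print(verify(records))
```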

slide11

Key ASW Issue: Sensor Performance Prediction

  • Considerable effort goes into predicting sensor performance (measurements, databases, models)
  • This results in an estimate of signal-to-noise ratio (SNR) on target
    • Fundamental metric
    • SNR potentially a very good proxy metric
  • Difficult to compare SNR to “fleet detections”
    • The detect-to-engage sequence involves many more factors in detection, classification, and localization
  • Can we compare predicted SNR to measured SNR for MPRA?
    • Not the final step, but a key metric (see the sketch below)
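
A minimal sketch of the predicted-versus-measured SNR comparison, assuming per-sensor SNR pairs in dB are available from the tactical decision aid and from mission reconstruction; the numbers are illustrative.

```python
# Sketch: comparing predicted SNR to measured SNR across a mission's
# sensors. Values are illustrative, not from any fielded system.
import math

def snr_verification(pairs):
    """pairs: (predicted_dB, measured_dB) tuples, one per buoy/contact.
    Returns mean bias and RMSE, both in dB."""
    errors = [p - m for p, m in pairs]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

pairs = [(12.0, 9.5), (8.0, 10.0), (15.0, 13.0)]  # illustrative values
bias, rmse = snr_verification(pairs)
print(f"bias = {bias:+.1f} dB, RMSE = {rmse:.1f} dB")
```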
slide12

MPRA Focus Committee Report

Potential Customer Performance Metrics

slide13

MPRA Focus Committee Report

Other Metrics – Drawn from PURPLEs

  • Number of ASW contacts detected by acoustic sensors
    • Possibly subdivided further by sensor type (e.g., sonobuoy, EER)
  • Number of ASW contacts detected by MAD sensors
  • Number of ASW contacts detected by IR sensors
  • Number of ASW contacts detected by RADAR sensors
  • Number of ASW contacts detected by visual sensors
  • Number of surface contacts detected by IR sensors
  • Number of surface contacts detected by RADAR sensors
  • Number of surface contacts detected by visual sensors
  • Contact investigation time
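
A minimal sketch of tallying the PURPLE-derived counts listed above, assuming each detection is reduced to a (contact type, sensor) record; the record layout is an assumption made for illustration.

```python
# Sketch: tallying contact detections by sensor type from
# PURPLE-derived records. The record layout is assumed.
from collections import Counter

purple_records = [
    {"contact": "ASW", "sensor": "sonobuoy"},
    {"contact": "ASW", "sensor": "MAD"},
    {"contact": "surface", "sensor": "RADAR"},
    {"contact": "surface", "sensor": "visual"},
]

counts = Counter((r["contact"], r["sensor"]) for r in purple_records)
for (contact, sensor), n in sorted(counts.items()):
    print(f"{contact} contacts detected by {sensor}: {n}")
```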

slide14

MPRA Focus Committee Report

Potential Operational Impacts Metrics

  • Draw correlations between METOC performance metrics and customer performance metrics (see the sketch after this list)
  • Proposed proxy metrics: sonic layer depth (SLD), below-layer gradient (BLG), visibility, and significant wave height
  • Elements with consistently high correlations over time may be good proxy operational impacts metrics
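
A minimal sketch of the proxy-screening idea, assuming per-mission series are available for one METOC performance metric and one customer performance metric; the data and variable names are illustrative.

```python
# Sketch: screening a candidate proxy metric by correlating a METOC
# performance series with a customer performance series over many
# missions. Pure-Python Pearson correlation; data are illustrative.
import math

def pearson(x, y):
    """Pearson correlation of two equal-length per-mission series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

sld_forecast_error = [10.0, 40.0, 5.0, 60.0, 20.0]         # METOC metric, per mission
contact_investigation_min = [12.0, 30.0, 9.0, 45.0, 18.0]  # customer metric, per mission
r = pearson(sld_forecast_error, contact_investigation_min)
print(f"r = {r:.2f}")  # consistently high |r| would flag SLD as a proxy candidate
```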

slide15

MPRA Focus Committee Report

Data Collection Methods – Three Proposed Levels

  • Primary data collection
    • Largely automated process
    • Data drawn from MEP inputs, GREENs, and PURPLEs, with limited free-form entries (if any)
    • MEP inputs collected by the MEP builder
    • MEP builder
      • Proposed web interface for entering discrete elements
      • Potentially automated data pulls (e.g., RBC, JAAWIN)
      • Collects data for metrics computation
      • Outputs a brief-ready slide for the start of the MEP
    • QC by RBC or other organization (a record-layout sketch follows this list)
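
A minimal sketch of the per-mission record a MEP-builder web form might capture, assuming discrete elements are stored as name/value pairs and a QC flag is set by the reviewing organization; the schema is hypothetical.

```python
# Hypothetical per-mission record for a MEP-builder web form; field
# names are assumptions, not a fielded schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MepEntry:
    mission_id: str
    event_number: str
    on_station_area: str
    flight_levels: str
    # discrete element name -> forecast value, entered via the web form
    elements: dict = field(default_factory=dict)
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    qc_passed: bool = False  # set by RBC (or other QC organization) after review

entry = MepEntry("2007-05-02A", "E01", "AREA X", "FL050-FL150",
                 elements={"sonic layer depth (ft)": 150.0})
```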

slide16

MPRA Focus Committee Report

Data Collection Methods – Three Proposed Levels

  • Secondary data collection
    • To be undertaken if primary data collection is inadequate
    • Collect additional information from mission debriefs that is not included in PURPLEs
    • Would require NOAD personnel directly collecting/entering information

slide17

MPRA Focus Committee Report

Data Collection Methods – Three Proposed Levels

  • Tertiary/exercise-level data collection
    • Flag missions as part of an exercise (MPRA, surface ASW, etc.)
    • Collect data regarding impacts of METOC information on the exercise planning process (e.g., IPC/MPC/FPC)
    • Collect data on outcomes from post-exercise (hot wash) meetings
    • Prepare whole-exercise data for further analysis

slide18

MPRA Focus Committee Report

Data Analysis/Display - Multi-Level Access

  • Display determined by user permissions (see the filtering sketch below)
  • Level 1 – Single-mission metrics information
  • Level 2 – Multiple-mission, single METOC unit metrics, displayed by any combination of:
    • NOAD
    • Geographical region
    • Span of time
  • Level 3 – Multiple-mission, multiple METOC unit metrics, displayable by:
    • METOC unit
    • Geographical region
    • Span of time
    • Includes directorate-level metrics for top-level users
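
A minimal sketch of the permission-based filtering described above, mapping the three access levels onto a metrics query; role names and record fields are assumptions.

```python
# Levels 1-3 as described above; role names and fields are assumptions.
LEVELS = {"aircrew": 1, "noad": 2, "directorate": 3}

def visible_missions(user, missions):
    """user: {"role", "unit", "mission_id"}; missions: list of dicts."""
    level = LEVELS[user["role"]]
    if level >= 3:                       # Level 3: all units/regions/times
        return missions
    if level == 2:                       # Level 2: one METOC unit's missions
        return [m for m in missions if m["unit"] == user["unit"]]
    return [m for m in missions          # Level 1: single mission only
            if m["id"] == user["mission_id"]]

missions = [{"id": "A1", "unit": "NOAD JAX"}, {"id": "B1", "unit": "NOAD Kadena"}]
print(visible_missions({"role": "noad", "unit": "NOAD JAX", "mission_id": ""}, missions))
```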

slide19

MPRA Focus Committee Report

Data Analysis/Display – Level 1

slide20

MPRA Focus Committee Report

Data Analysis/Display – Level 2

[Chart: example Level 2 display comparing metrics across NOAD A, NOAD B, and NOAD C]

slide21

MPRA Focus Committee Report

Data Analysis/Display – Level 3

slide22

MPRA Focus Committee Report

Operational Modeling

  • Recommend modeling studies to simulate ASW conflicts and project existing metrics forward
    • Identify sensitivities of warfighter to METOC information
    • Provide a basis for the metrics evaluation process
    • Inform future funding and R&D decisions
    • Improve data collection methods
    • Align training and research to add value to and improve METOC information
  • Collected metrics data should be integrated with operational modeling in a continual feedback loop (a toy sketch follows this list)
    • Real-world data used to improve the fidelity of the operational model
    • Model results used to identify the type and methods of data to collect
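
A deliberately toy sketch of the proposed feedback loop: observed mission data tunes the model, and model sensitivities steer the next round of collection. Every structure here is an assumption made for illustration.

```python
# Toy feedback loop: mission data tunes a model bias term, and model
# sensitivities steer the next round of data collection.
def update_model(model, observed_errors):
    """Nudge the model's bias term toward the observed mean error."""
    observed_bias = sum(observed_errors) / len(observed_errors)
    model["bias"] += 0.5 * (observed_bias - model["bias"])
    return model

def collection_priorities(model):
    """Collect more of whatever the model is most sensitive to."""
    s = model["sensitivity"]
    return sorted(s, key=s.get, reverse=True)

model = {"bias": 0.0, "sensitivity": {"SLD": 0.8, "visibility": 0.3}}
model = update_model(model, [1.2, 0.8, 1.0])
print(model["bias"], collection_priorities(model))
```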

slide23

MPRA Focus Committee Report

Funding Recommendations

  • 1. Bare bones – Initiate the primary collection system, with analysis and display of metrics. Provide training to personnel who will enter data or administer the collection system.
  • 2. Adequate to complete the project – Same as 1 above, but also institute the secondary collection system and conduct operational modeling when feasible.
  • 3. Completely funded – Same as 1 and 2 above, but also institute the tertiary (exercise-level) data collection system. Train and equip R&A personnel to enter data and administer the exercise metrics system.

slide24

Back-up Slides

slide25

Can we predict sensor performance?

[Diagram: a tactical decision aid (ASPECT) produces a predicted SNR; the P-3 flies the mission; post-mission reconstruction at the TSC determines what happened and yields a measured SNR; predicted SNR is then compared with measured SNR]