
ASW METOC Metrics: Symposium Hot Wash



Presentation Transcript


  1. ASW METOC Metrics: Symposium Hot Wash
  ASW METOC Metrics Symposium Two, 02–04 May 2007

  2. Symposium Objectives
  • Identify connections between ASW METOC metrics efforts and NMETL standards.
  • Review ASW METOC metrics committee reports for MPRA, ARBC, and NOAT areas of ASW support, including identification of:
    • report strengths and limitations
    • additional investigations needed
    • optimal set of metrics for MPRA, ARBC, and NOAT areas
  • Outline plan for ASW METOC metrics effort for Valiant Shield 2007 (VS07)
  • Establish course of action for overall ASW METOC metrics program

  3. Symposium Objectives
  1. Identify connections between ASW METOC metrics efforts and NMETL standards.
  • Received METL training from Mr. Dave Brown
  • Discussed the METL/ASW metrics tie-in:
    • ASW METOC metrics can enhance the effectiveness of METLs
    • ASW METOC metrics can determine mission essential criteria for METOC elements
    • ASW METOC metrics can serve as measures within METL standards
    • METLs can focus METOC metrics attention on elements within customer-specific METLs
    • Align METL standards with attainable metrics
    • Propose measures that will be collected on a regular, long-term basis
  • Noted potential METL shortcomings:
    • The METL system is based on endless improvement rather than a point of diminishing returns
    • Mission essential criteria are assumed, not tested
    • Many measures of standards are provided for most METLs, but few measures are collected or monitored

  4. The Navy’s NMETL Process
  [Flow diagram: mission analysis draws on assigned missions, specified tasks, implied tasks, commanders’ guidance, plans/orders, doctrine, and local knowledge (ROC-POE, OPTASKS, MCPs, MFTs, lessons learned, best practices). The UJTL and NTTL, together with Navy enterprise and operating concepts, provide a common language and framework for “NMETs” at each level of war. The process specifies conditions, establishes standards and mission essential criteria, annotates supported tasks, and identifies command-linked tasks to produce the NMETL.]

  5. Symposium Objectives
  2. Review ASW METOC metrics committee reports for MPRA, ARBC, and NOAT areas of ASW support
  • Identified report strengths and limitations
  • Discussed the need for additional investigation in the course of VS07
  • Proposed exercise metrics:
    • Collect MPRA, NOAT, or RBC data binned by exercise
    • Collect exercise planning impacts
    • Collect exercise outcomes
    • Compute exercise metrics and prepare for R&A analysis
  • Proposed an initial set of metrics for MPRA, NOAT, and RBC
    • Initial metrics to be tested in VS07
  • Customer measures of success provide the biggest challenge:
    • They differ based on who within the customer organization you ask
    • They are often based on peripheral factors (e.g., training qualifications) rather than tactical performance

  6. MPRA Focus Committee Report
  Potential Operational Impacts Metrics
  • Draw correlations between METOC performance metrics and customer performance metrics
  • Proposed proxy metrics: SLD, BLG, visibility, and significant wave height
  • Elements with high correlations over time may be good proxy operational-impacts metrics
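The proxy-metric screening idea above can be sketched as a simple correlation check: compute the correlation between a METOC element's forecast error and a customer performance measure over many events, and flag elements whose correlation stays strong. The sketch below is illustrative only; the per-event numbers are invented placeholders, not exercise data.

```python
# Sketch: screen a candidate proxy metric (e.g., SLD forecast error)
# against a customer measure of success by computing their correlation
# across events. All data values below are invented placeholders.
import statistics

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-event records: SLD forecast error (ft) vs. a
# customer measure of success (fraction of detection opportunities met).
sld_error = [5.0, 12.0, 3.0, 20.0, 8.0, 15.0]
customer_score = [0.9, 0.6, 0.95, 0.4, 0.8, 0.5]

r = pearson_r(sld_error, customer_score)
# A consistently strong (here, strongly negative) r over time would mark
# SLD as a useful proxy operational-impacts metric.
print(f"SLD error vs. customer score: r = {r:+.2f}")
```

The same check would be run per element (SLD, BLG, visibility, significant wave height) and per customer measure, retaining only the pairings whose correlation holds up across exercises.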

  7. NOAT Focus Committee Report
  Potential METOC Performance Metrics
  • List may be expanded; recommend that all verifiable elements be collected and verified
  • A verification scheme needs to be developed:
    • Verifying BT data should be launched during the analysis valid time
    • Many ranges may be forecast but few verified. For each verified range, strive to record:
      • Sensor
      • Mode (active/passive)
      • Sensor depth
      • Target depth
      • Propagation path
      • Environmental data source (MODAS, MODAS LT, BT)
      • Predicted range
  • NOAT tends to use RBC products without changes. If the NOAT modifies an RBC product, the change and the reasons for it need to be documented, and the NOAT-modified products need to be verified.
  • Collect NOAT data via an event-by-event NOAT battle watch log.
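One possible shape for the verification bookkeeping above: key each forecast range by the attributes the slide lists (sensor, mode, depths, path, data source), and pair a verifying observation only with a forecast whose key matches. The field names and units here are assumptions for illustration, not an established schema.

```python
# Sketch of range-verification bookkeeping for NOAT/RBC predictions.
# Each forecast carries the attributes the committee asks to record;
# verification pairs an observed range with the matching forecast.
# Field names/units are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class RangeForecast:
    sensor: str
    mode: str             # "active" or "passive"
    sensor_depth_m: int
    target_depth_m: int
    prop_path: str        # e.g., "direct", "bottom-bounce"
    env_source: str       # "MODAS", "MODAS LT", or "BT"
    predicted_range_kyd: float

def verify(forecasts, observed_range_kyd, key):
    """Return (predicted, error) for the forecast matching the key, else None."""
    for f in forecasts:
        if (f.sensor, f.mode, f.sensor_depth_m, f.target_depth_m) == key:
            return f.predicted_range_kyd, f.predicted_range_kyd - observed_range_kyd
    return None  # a forecast range with no verifying observation stays unverified

fcsts = [RangeForecast("hull sonar", "active", 20, 100, "direct", "MODAS", 12.0)]
result = verify(fcsts, observed_range_kyd=10.5, key=("hull sonar", "active", 20, 100))
print(result)  # (12.0, 1.5)
```

Binning the resulting (predicted, error) pairs by environmental data source would show whether MODAS-, MODAS LT-, or BT-based predictions verify best.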

  8. Symposium Objectives
  3. Outline plan for ASW METOC metrics effort for VS07
  • Initial MPRA, RBC, and NOAT metrics agreed upon
  • Metrics data collection objectives identified:
    • Product performance
    • Customer performance
    • Operational impacts
    • METOC support process
  • VS07 metrics manning finalized

  9. VS07 METOC Metrics Manning
  • GUAM (8 PERS): Dr. Bob Miyamoto; LCDR Garcia, AG1 Mason, & AG1 Herring; LT Scheidecker & AG1 Colonvalentin; NAVO – Mr. Mike Horowitz & Mr. Jon Sheptis
  • IWO JIMA (3 PERS): AGC Berger, AG2 Long, & AG1 Ingram
  • CTF 74 (augmented by YOKO): Metrics/Training – LT Parker; CUS – LT Sowards, LT Caton, & STGCS McRae
  • RV Cory Chouest (3 PERS): LT Price; CUS – Mr. Jesse James & Mr. Jim Compton
  • USNS Impeccable (5 PERS): NOAC YOKO – AG2 Sacco; CUS – Mr. Jeff Friesz, Mr. Bill Rodgers, Mr. Mike Lamczyk, & Mr. Brian Turban
  • DESRON 21 JCS (4 PERS): LT Ray, AG1 Welch, & AG2 Loht; Metrics Rider – Mr. Bruce Ford
  • DESRON 15 KHK (8 PERS): YOKO – LT Campo, AGC Saenz, AG1 Maxey, AG1 McLeod, & AG3 Figeoura; CUS – LT Poteete & LT Preuss; Metrics Rider – LCDR Simms
  • DESRON 23 NIMITZ (4 PERS): AGC Fargo, AG2 Dukes, & AG3 Waddell; Metrics Rider – LT Lambert
  • THIRD FLEET (JFMCC, 1 PER): AGC Green
  • COMPACFLT (1 PER): CUS – CWO3 McGowan

  10. General VS07 Metrics Data Collection Plan
  • Aboard each CV:
    • Mimic the NOAT data collection role by collecting all:
      • Discrete data
      • Verifying in-situ data
      • Recommendations (tactical and mitigation)
      • Recommendation outcomes
      • Customer measures of success
    • Investigate other potential data sources, methods, customer measures of success, etc.
  • At the RBC:
    • Monitor watch officer actions and log for potential data collection
    • Monitor NOAT interactions
    • Observe the product generation process
  • Embedded with the deployed MPRA TSC (MOCC):
    • Mimic MPRA data collection by collecting:
      • Data from each GREEN and PURPLE
      • Discrete forecast elements
    • Investigate other potential data sources, methods, customer measures of success, etc.

  11. General VS07 Metrics Data Collection Plan (continued)
  • Embedded with each major staff:
    • Mimic the NOAT data collection role by collecting all:
      • Recommendations (tactical and mitigation)
      • Recommendation outcomes
      • Customer measures of success
    • Investigate other potential data sources, methods, customer measures of success, etc.

  12. Symposium Objectives
  4. Establish course of action for overall ASW METOC metrics program
  • Results of VS07 METOC metrics collection and analysis are intended to fine-tune the overall project plan
  • Project scope will depend on future funding
    • Estimated to be a three-year project (’08–’10)
  • Initial project funding proposals were briefed for:
    • Real-world data collection/display component (NPS-CSI)
    • Operational modeling component (SPA)
  • Need to identify multi-year funding sources:
    • RTP
    • TOC/USW
  • Feedback provided on draft proposals; additional detail on proposals forthcoming

  13. Potential ASW Metrics Data Collection System
  [System diagram: NOAT, MPRA, RBC, and R&A (exercise) metrics nodes feed a central metrics server through quality control. METOC data sources include the MEP builder, recco information, GREEN contact information, PURPLE in-situ data, freeform data entry, forecast/analysis data, the watch officer log, and a NOAT survey. Non-METOC data sources include exercise intentions, planning impacts, exercise outcomes, flag exercise records, customer measures of success, and objective data collection.]
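The data flow in the system diagram, several collection nodes pushing tagged records through a quality-control gate into one metrics store, can be sketched minimally as below. The node names come from the slide; the record layout and QC rules are invented placeholders.

```python
# Minimal sketch of the diagrammed data flow: metrics nodes push records
# into a central store, with a quality-control gate before acceptance.
# Node names are from the slide; the record/QC details are assumptions.
NODES = {"NOAT", "MPRA", "RBC", "R&A"}

class MetricsServer:
    def __init__(self):
        self.records = []

    def ingest(self, node, payload):
        """QC gate: reject records from unknown nodes or without a
        timestamp; otherwise store them tagged by source node."""
        if node not in NODES or "time" not in payload:
            return False
        self.records.append({"node": node, **payload})
        return True

server = MetricsServer()
ok = server.ingest("NOAT", {"time": "2007-05-02T00:00:00Z",
                            "element": "in-situ BT", "value": 18.2})
bad = server.ingest("UNKNOWN", {"element": "no timestamp"})  # fails QC
print(ok, bad, len(server.records))  # True False 1
```

Tagging every record with its source node is what later lets exercise records be binned by NOAT, MPRA, or RBC origin for the R&A analysis.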

  14. Model Design
  [Diagram: cyclical, metrics-based analysis of METOC performance. VS07 lessons learned and METOC data collection development feed the model, which serves as a laboratory for the METOC support process (products, timelines), the ASW commander’s decision process (inputs, timelines), the mission planning process, etc. Operations analysis against the ASW metrics project objectives yields performance thresholds and a performance benchmark, supports evaluation of new support products and concepts, and produces updated benchmarks.]

  15. Back-up Slides

  16. VS07 Tasks
  Real-World Data Metrics Component
  • Identify key data to be collected
  • Develop:
    • collection process, including training materials
    • data analysis methods
    • reporting tools
  • Collect data on:
    • product performance
    • customer performance
    • operational impact
    • METOC support process
  • Conduct and report:
    • analyses of product performance
    • operational impacts

  17. VS07 Tasks (continued)
  Real-World Data Metrics Component
  • Coordinate and integrate data collection efforts
  • Integrate real-world data and operational modeling efforts
  • Coordinate with R&A efforts
  • Develop initial components of the larger ASW METOC metrics project
  • Apply VS07 findings to planning for the larger ASW METOC metrics project
  • Plan, organize, and participate in post-exercise meetings, symposium, and reports
  • Coordinate with the automated ocean products metrics RTP project

  18. SPA Participation in Metrics Data Collection
  • One ops analyst at CTF-74:
    • Participate in METOC data collection
    • Gather information on the METOC support process
    • Gain understanding of the ASW decision support process
  • Complementary to the existing metrics manning plan
  • Objective is to examine the overall ASW decision support process:
    • View the METOC support process from a “top-down” perspective
    • Provide preliminary information regarding ASW ops model design
      • ASW decision process (includes, but is not limited to, METOC inputs)
      • Exploring CDR COAs and “what-if” scenarios
    • Provide initial contextual data on the ASW mission

  19. SPA Participation in Metrics Data Collection
  • Deliverables
    • Provide input into:
      • VS07 overview and event summary
      • Command/decision structure
      • METOC nodes and their network/relationship to decision-makers
      • Supporting ops analysis
    • Develop foundation for modeling the decision process from the ASW CDR perspective
  • Budget
    • $25K to support pre-exercise preparation, exercise support, and post-exercise evaluation

  20. SPA Participation in Metrics Data Collection
  • Alternative use of funding: preliminary operations model design
  • Objective: develop a model schematic of an MPRA ASW mission, including:
    • the METOC support process
    • potential linkages to operational impacts
  • Deliverable: provides the foundation for FY08–FY10 modeling efforts
