
METOC Metrics for Naval Special Warfare



Presentation Transcript


  1. METOC Metrics for Naval Special Warfare. Tom Murphree and CDR Rebecca Stone, Naval Postgraduate School (NPS); Gregg Jacobs, Rick Allard, and Larry Hsu, Naval Research Laboratory - Stennis Space Center (NRL-SSC); Dr. Paul Vodola, Luke Piepkorn, and Tom Pentony, Systems Planning and Analysis (SPA); Bruce Ford, Clear Science, Inc. (CSI). A Research, Development, and Transition Project Funded by SPAWAR / PMW180. Brief for NSW METOC Metrics Meeting, San Diego, CA, 22-23 March 2007. Murphree et al., METOC Metrics, Feb 07, murphree@nps.edu

  2. Definitions
  • Metrics: Objective, quantitative, data-based measures of an organization's operations, products, and services. Examples:
  • Metrics of product quality
  • Metrics of effects of products on customers
  • METOC Metrics: Metrics of a METOC organization's products and impacts. Three main types:
  • Performance metrics: metrics of capacity, readiness, quality, and efficiency / return on investment
  • Impacts metrics: metrics of impacts on warfare customer operations (planning, execution, post-assessment)
  • Phenomena metrics: metrics that relate product performance, customer performance, or operational impacts to specific environmental phenomena
  • Methods for generating METOC metrics:
  • Collect and analyze real world data on METOC and customer ops
  • Model METOC and customer ops

  3. 3-D METOC Metrics Space
  [Figure: a 3-D space with a Metrics Type axis (METOC Performance, Proxy Ops Impacts, Operational Impacts), an Organizational axis (individual unit, multiple units, whole organization), and a Spatial/Temporal axis (small region / short period to multiple regions / long period). For details, see the speaker notes section of this slide.]
  The metrics process is to be conducted in this 3-D space and continuously over time.

  4. Goals
  1. Develop and transition to operational use systems for:
  a. collecting data from NSW METOC units and their customers
  b. quantifying NSW METOC performance and impacts on customer operations
  c. modeling and predicting impacts of NSW METOC support on war fighting operations
  2. Identify methods for improving quality and efficiency of NSW METOC support.
  3. Recommend:
  a. focus directions for METOC resources
  b. methods to improve METOC support
  c. methods to improve warfighter use of METOC support
  d. methods to incorporate METOC into OPNAV assessments

  5. Objectives
  • Develop METOC metrics system:
  a. data collection methods
  b. databases
  c. data analysis tools
  d. operations analysis models
  e. online, real time access and reporting
  • Apply systems to:
  a. collect and analyze data
  b. simulate impacts of METOC products on warfighting operations
  c. determine metrics that quantify the performance and operational impacts of METOC forecasts
  • Transition to the METOC community a metrics toolset composed of data collection, database, data analysis, and modeling systems for NSW.
  • Develop recommendations for METOC leadership.

  6. Methods
  • Develop automated system to collect and analyze real world data.
  • Develop and apply warfare mission model.
  • Use real world data to modify and verify model.
  • Use model to identify additional data to collect and analyses to conduct.

  7. Process for Developing Metrics of METOC Impacts on Military Operations
  [Figure: METOC forecasts (or other products) are compared with METOC observations to produce METOC performance metrics; operational plans are compared with operational outcomes to produce operational performance metrics; the two are then combined into metrics of METOC impacts on operational performance.]
  We apply this process to both real world data and output from a military mission model.
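The pairing process above can be sketched in a few lines. This is a minimal illustration, not the project's actual system: the Go/No Go categories, mission records, and field names are all invented for the example.

```python
# Pair forecasts with observations (METOC performance) and plans with
# outcomes (operational performance), then condition operational
# performance on forecast correctness (an impact metric).

def accuracy(pairs):
    """Fraction of (predicted, actual) pairs that match."""
    return sum(p == a for p, a in pairs) / len(pairs)

# Toy mission records (all values hypothetical).
missions = [
    {"fcst": "Go",    "obs": "Go",    "planned": "success", "outcome": "success"},
    {"fcst": "Go",    "obs": "No Go", "planned": "success", "outcome": "failure"},
    {"fcst": "No Go", "obs": "No Go", "planned": "delay",   "outcome": "delay"},
    {"fcst": "Go",    "obs": "Go",    "planned": "success", "outcome": "success"},
]

fcst_perf = accuracy([(m["fcst"], m["obs"]) for m in missions])        # METOC performance
ops_perf  = accuracy([(m["planned"], m["outcome"]) for m in missions]) # operational performance

# Impact metric: operational performance when the forecast verified vs not.
hit  = [m for m in missions if m["fcst"] == m["obs"]]
miss = [m for m in missions if m["fcst"] != m["obs"]]
ops_when_fcst_right = accuracy([(m["planned"], m["outcome"]) for m in hit])
ops_when_fcst_wrong = accuracy([(m["planned"], m["outcome"]) for m in miss]) if miss else None

print(fcst_perf, ops_perf, ops_when_fcst_right, ops_when_fcst_wrong)
```

In real data the same join would run over debrief records rather than a hand-written list, but the structure of the comparison is the same.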

  8. Integration of Real World and Model Metrics
  [Figure: real world data collection and analysis yields real world metrics; an operational engagement model yields model metrics; synthesis of the two yields improved metrics, process improvements, and improved warfighting operations.]
  See briefs by Paul Vodola for more on our METOC metrics modeling efforts.

  9. METOC Metrics Contact Information
  Tom Murphree, Ph.D.
  Department of Meteorology, Naval Postgraduate School
  254 Root Hall, 589 Dyer Road, Monterey, CA 93943-5114
  831-656-2723 (commercial), 312-756-2723 (DSN), 831-656-3061 (fax)
  murphree@nps.edu
  jtmurphr@nps.navy.smil.mil
  http://wx.met.nps.navy.mil/metrics/metrics_reports.html
  http://wx.met.nps.navy.mil/smart-climo/reports.php

  10. METOC Metrics Backup Slides

  11. Overview NPS METOC Metrics Program Tom Murphree, Ph.D.
  Quantitative, objective metrics of the performance and operational impacts of forecasts are critical in enabling the military meteorology and oceanography communities to: (1) assess and improve their support for war fighting operations; and (2) increase their participation in Pentagon-level assessment and budgeting processes. We are conducting a multi-year program to develop, test, and transition to operational use methods for objectively measuring both: (1) the performance of forecasts; and (2) the impacts of forecasts on the planning and operations of end users of the forecasts. The two main methods we have developed and are now transitioning to operational use are:
  1. an online data collection and analysis system with automated real time data displays for assessing both the performance of forecasts and their impacts on end-user operations
  2. a rules-based operations analysis model that simulates the impacts of forecasted and observed environmental conditions on end-user operations
  We have applied our methods to the collection and analysis of data from Operation Iraqi Freedom, the Naval Strike and Air Warfare Center, Naval strike and amphibious units, Air Mobility Command, Air Combat Command, and Pacific Air Forces. Our analysis results include: (a) forecast verification metrics (e.g., accuracy, false alarm rate, probability of detection, bias, Heidke skill); (b) operational impacts metrics (e.g., delay and cancellation of missions, operations saved from adverse environmental impacts, correlation of environmental mitigation rate and negative operational impacts rate, changes in weapons load outs); and (c) metrics of the relationship of specific environmental phenomena to forecast performance and operational impacts. We will begin development of a METOC metrics program for Naval Special Warfare (NSW) in fall 2006. The deliverables from our program include:
  1. online data collection and analysis system (now in operational use)
  2. operations analysis model (soon to be in operational use)
  3. metrics of actual performance and operational impacts
  4. metrics of simulated operational impacts
  5. recommendations to the meteorology and oceanography communities
  Our program is based on a collaboration of Naval Postgraduate School (NPS) faculty and staff, Navy and Air Force students at NPS, operational Navy and Air Force units, and operations research and IT contractors. Reports on our program are available at: http://wx.met.nps.navy.mil/metrics/metrics_reports.html. For more information on the NPS METOC Metrics Program, please contact the program director, Dr. Tom Murphree, at: murphree@nps.edu

  12. Evolution of NPS and SPA METOC Metrics Work

  13. Evolution of NPS and SPA METOC Metrics Work

  14. Evolution of NPS and SPA METOC Metrics Work. NPS program reports and briefs available at: http://wx.met.nps.navy.mil/metrics/metrics_reports.html

  15. NPS & SPA METOC Metrics Reports – Most are available at: http://wx.met.nps.navy.mil/metrics/metrics_reports.html
  • Systems Planning and Analysis, 1995. Impact of Environment on Amphibious Operations in Mining Environment. Report to NRL-SSC.
  • LCDR A. Cantu, USN, 2001. The Role of Weather in Class A Naval Aviation Mishaps. Master of Science Thesis, Naval Postgraduate School. Co-Advisors: C. Wash and T. Murphree.
  • Systems Planning and Analysis, 2001. Assessing the Value of Accurate Meteorological Forecasting in Strike Operations. Report to CNMOC.
  • LCDR B. Martin, USN, 2002. METOC and Naval Afloat Operations: Risk Management, Safety, and Readiness. Master of Science Thesis, Naval Postgraduate School. Advisor: T. Murphree; Co-Advisor: C. Wash.
  • Systems Planning and Analysis, 2003. METOC Campaign Impact Analysis. Report to Oceanographer of the Navy.
  • LCDR J. Hinz, USN, 2004. Developing and Applying METOC Metrics to Sea Strike: A Case Study of Operation Iraqi Freedom. Master of Science Thesis, Naval Postgraduate School. Advisor: T. Murphree; Co-Advisor: C. Wash.
  • Capt. J. Anderson, USAF, 2004. An Analysis of a Dust Storm Impacting Operation Iraqi Freedom, 25-27 March 2003. Master of Science Thesis, Naval Postgraduate School. Advisor: C. Wash; Second Reader: T. Murphree.
  • Capt. J. Jarry, USAF, 2005. Analysis of Air Mobility Command Weather Mission Execution Forecasts: Metrics of Forecast Performance and Impacts on War Fighting Operations. Master of Science Thesis, Naval Postgraduate School. Advisor: T. Murphree; Co-Advisor: Col. D. Smarsh.
  • LCDR M. Butler, USN, 2005. Automated Metrics of METOC Forecast Performance and Operational Impacts. Master of Science Thesis, Naval Postgraduate School. Advisor: T. Murphree.
  • LCDR J. Hinz, USN, T. Murphree, and C. Wash, 2005. Developing and Applying METOC Metrics to Sea Strike: A Case Study of Operation Iraqi Freedom. Battlespace Atmospheric and Cloud Impacts on Military Operations (BACIMO) Conference Papers (available at: http://www.nrlmry.navy.mil/BACIMO/2005/bacimo.html).
  • Capt. J. Jarry, USAF, T. Murphree, and Col. D. Smarsh, USAF, 2005. Analysis of Air Mobility Command Weather Mission Execution Forecasts: Metrics of Forecast Performance and Impacts on War Fighting Operations. BACIMO Conference Papers (available at: http://www.nrlmry.navy.mil/BACIMO/2005/bacimo.html).
  • LCDR R. A. Cantu, USN, LCDR M. Butler, USN, and T. Murphree, 2005. The Impacts of Weather Forecasts on Military Operations: A System for Conducting Quantitative Real-World Analyses. BACIMO Conference Papers (available at: http://www.nrlmry.navy.mil/BACIMO/2005/bacimo.html).
  • Systems Planning and Analysis, 2005. Assessments of the Value of Timely and Accurate METOC Information. Report to CNMOC.
  • Maj K. Darnell, USAF, 2006. Analysis of Weather Forecast Impacts on United States Air Force Combat Operations. Master of Science Thesis, Naval Postgraduate School. Advisor: T. Murphree; Co-Advisor: Col. D. Smarsh.
  • LT J. Callahan, USN, 2006. Metrics of METOC Forecast Performance and Operational Impacts on Carrier Strike Operations. Master of Science Thesis, Naval Postgraduate School. Advisor: T. Murphree; Co-Advisor: CDR R. Stone.

  16. Major Results
  1. Real World Metrics:
  • Based on Navy and AF data
  • Product performance metrics (aka quality metrics)
  • Operational impacts metrics (aka impacts metrics)
  2. Online Real World Metrics System:
  • Collects and analyzes data
  • Produces forecast performance and operational impacts metrics
  3. Weather Impacts Assessment Tool (WIAT):
  • Modeling system for simulating impacts of forecasts and observed weather on strike operations
  • Produces simulated operational impacts metrics

  17. Effects of Environmental Uncertainty on Mission Planning and Execution
  [Figure: forecast uncertainty (predicted vs actual environment, better vs worse) drives mission planning uncertainty; forecast performance and mission performance together determine mission outcome uncertainty (predicted, better outcome vs actual, worse outcome), even for accurate predictions.]
  • Opportunity Costs
  • Actual is better than predicted
  • Over-prepared for environmental risks
  • Spend too many resources, too much time
  • Resources better used for other locations / missions
  • Effectiveness Costs
  • Actual is worse than predicted
  • Under-prepared for environmental risks
  • Deploy too few resources, too little time
  • Increased risk of mission failure
  We need to know the relationships between forecasts, planning, and outcomes in order to quantify, and eventually reduce, outcome uncertainty and associated costs.

  18. Dollars and Metrics
  • Metrics are quantitative measures of performance
  • May relate performance to other areas (e.g., visibility forecast accuracy and CVN green deck for air ops)
  • Performance related to customer dollars
  • Iterative process
  • Metrics do not need to be dollar related initially!
  From NPS thesis research by LCDR Jake Hinz, 2004

  19. 3-D METOC Metrics Space: Initial Focus Region
  [Figure: the 3-D metrics space of slide 3 (Metrics Type, Organizational, and Spatial/Temporal axes), with the initial focus region (~next 1-3 years) highlighted. For details, see the speaker notes section of this slide.]
  The metrics process is to be conducted in this 3-D space and continuously over time.

  20. 3-D METOC Metrics Space: End State Focus Region
  [Figure: the same 3-D metrics space, with the end state focus region (3+ years ahead) highlighted. For details, see the speaker notes section of this slide.]
  The metrics process is to be conducted in this 3-D space and continuously over time.

  21. Schematic METOC Inputs to Planning and Execution of Military Operations
  [Figure: an information supply chain linking the military objective, mission task definition, and operational decision making to METOC intelligence analysis steps — identifying data needs, exploratory data analysis (EDA), structured data analysis (SDA), prediction (recommendation and modeling), and explanation (reporting) — shared between operational planners and METOC personnel, with drill down strengthening the case, clarifying objectives and possible COAs, and adding operational value. See the speaker notes for this slide for explanations of each component of this figure.]
  Based on NPS thesis research of LCDR Jake Hinz, USN, 2004

  22. WIAT - METOC Inputs to CAS/KI Planning and Execution
  [Figure: longer term inputs (JIPTL: Joint Integrated Prioritized Target List; longer-term target designation; METOC forecasts) and shorter term inputs (short-term target designation; ATO sortie generation; available planes in the air; real-time METOC phenomena, METOC observations, and nowcasts) feed the outputs: planes select and strike targets.]
  We need to determine how METOC inputs affect outputs (e.g., effectiveness of target selection, weapons selection, strikes, etc.).

  23. Forecast Performance Metrics - OIF
  [Figure: forecast accuracy (%), Mission X, Location Y — accuracy for all forecasts by forecast lead time (00-24, 24-48, 48-72, 72-96, and 96-120 hrs), comparing the 00-24 hr and 96-120 hr forecasts with observations.]
  From NPS thesis research of LCDR Jake Hinz, USN, 2004. For official use only.

  24. Operational Impacts Metrics - OIF
  [Figure: OIF aviation sorties, plans and outcomes, March-April 2003, with a dust storm and a front marked.]
  From NPS thesis research of LCDR Jake Hinz, USN, 2004. For official use only.

  25. Forecast Performance and Operational Impacts Metrics: OIF
  [Figure: percent weather cancellations by observed and forecasted conditions category (24% highlighted).]
  From NPS thesis research of LCDR Jake Hinz, USN, 2004. For official use only.

  26. Forecast Performance Metrics: OIF
  [Figure: forecast accuracy, Mission X, Location Y — percent forecast accuracy for all RYG forecasts vs forecast lead time, for the March front and the April front.]
  From NPS thesis research of LCDR Jake Hinz, USN, 2004. For official use only.

  27. Forecast Performance Metrics: AMC
  [Figure: false alarm rate for No Go mission execution forecasts, with candidate minimum performance and target benchmarks marked.]
  From NPS thesis research of Capt Jeff Jarry, USAF, 2005. For official use only.

  28. Forecast Performance Metrics: AMC — Comparison of Forecast Accuracy (FAC) and Skill Metrics
  [Figure: FAC for Go forecasts, No Go forecasts, and all forecasts; Heidke skill for all forecasts; and critical skill for No Go forecasts.]
  Overall performance is very good. But the high overall FAC comes mainly from the relatively common Go events; persistent Go forecasts would have given almost the same overall FAC. FAC and skill for the relatively uncommon No Go events are much lower. Thus, in the No Go cases of critical importance to operators, forecast performance is much lower than overall.
  From NPS thesis research of Capt Jeff Jarry, USAF, 2005. For official use only.
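The point above can be demonstrated with the standard 2x2 contingency-table scores. This is a generic sketch (not the thesis code), with "No Go" as the event of interest; the counts are invented, chosen so that No Go events are rare.

```python
# 2x2 contingency table for No Go forecasts:
# hits (a), false alarms (b), misses (c), correct negatives (d).

def verify(a, b, c, d):
    n = a + b + c + d
    fac = (a + d) / n                                 # forecast accuracy, all forecasts
    pod = a / (a + c) if a + c else float("nan")      # probability of detection
    far = b / (a + b) if a + b else float("nan")      # false alarm rate
    # Heidke skill score: accuracy relative to random chance
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return fac, pod, far, hss

# 1000 missions, only 50 actual No Go events.
fac, pod, far, hss = verify(a=30, b=40, c=20, d=910)
print(f"FAC={fac:.2f} POD={pod:.2f} FAR={far:.2f} HSS={hss:.2f}")
# FAC=0.94 POD=0.60 FAR=0.57 HSS=0.47

# Baseline: always forecast Go. Overall FAC is nearly identical, but skill is zero.
fac0, _, _, hss0 = verify(a=0, b=0, c=50, d=950)
print(f"always-Go FAC={fac0:.2f} HSS={hss0:.2f}")
# always-Go FAC=0.95 HSS=0.00
```

With rare events, overall accuracy rewards the trivial always-Go forecast; POD, FAR, and Heidke skill expose the difference on the No Go cases that matter to operators.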

  29. Operational Impacts Metrics: AMC
  [Figure: Air Mobility Command WXM unit, FY 2004 — monthly counts (OCT through SEP, plus FY04 average) of weather mitigation actions taken by operators, estimated number of missions saved, and estimated number of unnecessary actions taken.]
  Saved Mission:
  a. No Go forecast for original plan is accurate; and
  b. Go forecast for accepted mitigation plan is accurate; and
  c. Mission successful using mitigation plan; and
  d. Mission would have failed using original plan.
  From NPS thesis research of Capt Jeff Jarry, USAF, 2005. For official use only.
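The four saved-mission criteria above form a simple conjunctive rule. A minimal sketch over a hypothetical per-mission record (the field names are illustrative, not from the thesis data schema):

```python
# "Saved Mission" predicate: all four criteria (a)-(d) must hold.

def is_saved_mission(m):
    return (m["orig_fcst"] == "No Go" and m["orig_wx"] == "No Go"  # (a) No Go fcst for original plan accurate
            and m["mitig_fcst"] == "Go" and m["mitig_wx"] == "Go"  # (b) Go fcst for mitigation plan accurate
            and m["mitig_success"]                                  # (c) mission succeeded under mitigation plan
            and m["orig_would_fail"])                               # (d) original plan would have failed

mission = {"orig_fcst": "No Go", "orig_wx": "No Go",
           "mitig_fcst": "Go", "mitig_wx": "Go",
           "mitig_success": True, "orig_would_fail": True}
print(is_saved_mission(mission))  # True

# An accepted mitigation that was not actually needed fails criterion (d):
# it counts toward "unnecessary actions taken", not "missions saved".
unnecessary = dict(mission, orig_would_fail=False)
print(is_saved_mission(unnecessary))  # False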

  30. Operational Impacts Metrics: AMC — METOC Mitigation Rate and Weather Delays
  • Mitigation rate = mitigation recommendations accepted / mitigation recommendations provided
  • Correlation coefficient -0.48, 93% certainty
  • High percentage of must fly missions
  • Useful in reducing decision maker uncertainty
  From NPS thesis research of Capt Jeff Jarry, USAF, 2005. For official use only.
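The two quantities on this slide can be sketched as follows. The monthly accepted/provided counts and delay rates below are invented for illustration; the slide's reported value (-0.48) comes from the real AMC data.

```python
# Mitigation rate and its Pearson correlation with a weather-delay rate.

def mitigation_rate(accepted, provided):
    return accepted / provided

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical monthly (accepted, provided) counts and delay fractions.
rates = [mitigation_rate(a, p) for a, p in [(8, 10), (5, 10), (9, 12), (3, 9)]]
delays = [0.10, 0.25, 0.08, 0.30]   # fraction of missions weather-delayed
r = pearson(rates, delays)
print(r < 0)  # negative correlation: more accepted mitigation, fewer delays
```

A negative correlation is the expected signature of useful mitigation: months in which operators accept more of the recommended mitigations see fewer weather delays.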

  31. Online Real World Metrics System – Overview
  • A web based system for use by METOC units in measuring forecast performance, and the operational impacts of: a. forecasts; b. METOC phenomena
  • Allows near real time data entry by METOC units: http://web.ntsstl.nps.navy.smil.mil/Metrics/shipboard_metrics/strike_metrics/index.php
  • Delivers metrics reports within seconds of data being entered.
  • Standardized for use by METOC units supporting strike operations, but readily adaptable to other customer operations (e.g., ASW).
  • Includes database for long term data collection and analyses.
  • Optimized for low impact on work loads of METOC units and their customers.

  32. Online Real World Metrics System – System Schematic
  [Figure: Carrier Strike Group (CSG) personnel obtain data via interaction with the air wing and enter it remotely via the web site; at NPS, the data are written to a database and analyzed per NPS metrics procedures; output reports are delivered via the web site, available to the CSG on demand via an automated metrics calculation process, and delivered by the OA division to air wing personnel.]
  From NPS thesis research of LT Jeremy Callahan, USN, 2006

  33. Online Real World Metrics System – METOC Support and Data Collection During Mission Planning and Debriefing
  3-4 days prior – ATO Development:
  • Weather briefing given to ATO planners at the Combined Air Operations Center (CAOC)
  • ATO developed based on needs of the Combined Forces Air Component Commander (CFACC) while taking large scale weather into consideration
  • ATO promulgated to supporting units
  3 hours prior – Flight Brief:
  • DD-175 and safety of flight
  • Airfield forecast
  • Enroute & target weather
  • Divert weather
  • Launch and recovery weather
  1-2 hours after mission execution – Event Debrief:
  • Data collected during standard intelligence debriefing
  • Most weather impacts reviewed for intelligence report
  • Supplemental data obtained one-on-one after intelligence debriefing
  From NPS thesis research of LT Jeremy Callahan, USN, 2006

  34. Online Real World Metrics System – Data Collection
  [Figure: mission information (mission data; forecast negative weather events) and debriefing information (observed negative weather events; impacts of weather on mission; changes made to original mission by pilot; target changes and cancellations; mitigating actions).]
  From NPS thesis research of LT Jeremy Callahan, USN, 2006

  35. Online Real World Metrics System – Data Collection
  From NPS METOC Metrics SIPRNet Site: http://web.ntsstl.nps.navy.smil.mil/Metrics/shipboard_metrics/strike_metrics/index.php
  From NPS thesis research of LT Jeremy Callahan, USN, 2006

  36. Online Real World Metrics System – Data Collection
  Mission information and debriefing information screens, from the NPS METOC Metrics SIPRNet Site: http://web.ntsstl.nps.navy.smil.mil/Metrics/shipboard_metrics/strike_metrics/index.php
  From NPS thesis research of LT Jeremy Callahan, USN, 2006

  37. Online Real World Metrics System – Data Collection
  Data collected: observed negative weather events; forecast negative weather events; impacts of weather on mission; changes made to original mission by pilot; target changes and cancellations. From the NPS METOC Metrics SIPRNet Site: http://web.ntsstl.nps.navy.smil.mil/Metrics/shipboard_metrics/strike_metrics/index.php
  From NPS thesis research of LT Jeremy Callahan, USN, 2006

  38. Forecast Performance Metrics: Probability of Detection of Negative Impact Phenomena – ACC & PACAF
  [Figure: probability of detection (POD) for planning and mission execution forecasts of negative impact weather phenomena — cloud ceiling, cloud layers, surface visibility, visibility aloft, in-flight icing, icing run times, turbulence, and contrails — with a 60% reference line.]
  Forecast accuracies and probabilities of detection for negative impact phenomena are ≥ 60% for all our real world Navy and Air Force data sets.
  From NPS thesis research of Maj Karen Darnell, USAF, 2006. For official use only.

  39. Operational Impacts Metrics: Mission Plan Changes – ACC & PACAF
  [Figure: percentage of missions for which the indicated changes were made due to planning and/or execution forecasts — based only on the planning forecast, based only on the execution forecast, and based on both.]
  From NPS thesis research of Maj Karen Darnell, USAF, 2006. For official use only.

  40. Operational Impacts Metrics: Missions and Weapons Saved - NSAWC/Fallon
  [Figure: bars a, b, and c.]
  • Bars a and b show the percent of NSAWC missions for which the weather forecast led to the mission being saved from (a) delays or cancellation or (b) incorrect weapons load outs.
  • Bar c shows the percent of missions that could have been saved from delays, cancellations, or weapons changes had the mission plans been changed in response to the forecasts.
  • Net result: 36% of missions were or could have been saved.
  From NPS thesis research of LCDR Mark Butler, USN, 2005. For official use only.

  41. Operational Impacts Metrics: Mission Plan Changes - NSAWC/Fallon
  [Figure: percent of missions that experienced the indicated types of impacts on their scheduling, weapons or tactics selection, and/or effectiveness due to negative METOC conditions.]
  From NPS thesis research of LCDR Mark Butler, USN, 2005. For official use only.

  42. Operational Impacts Metrics: Negative Impact Phenomena - NSAWC/Fallon
  [Figure: percent of missions that experienced negative impacts from the indicated weather phenomena.]
  From NPS thesis research of LCDR Mark Butler, USN, 2005. For official use only.

  43. Operational Impacts Metrics: Responses to Forecasts of Negative Impacts – ACC & PACAF
  Percent of missions for which:
  • Planning forecasts indicated negative weather impacts: 36%
  • Planning forecasts led to mission plan changes: 21%
  • Execution forecasts indicated negative weather impacts: 39%
  • Execution forecasts led to mission plan changes: 21%
  • Missions that experienced negative weather impacts: 36%
  Missions that avoided, or could have avoided, negative weather impacts based on the forecasts issued: ≥ 36% for all our real world Air Force and Navy data sets.
  From NPS thesis research of Maj Karen Darnell, USAF, 2006. For official use only.

  44. Online Real World Metrics System – Users
  • In operational use by Naval Pacific Meteorology and Oceanography Detachment at NAS Fallon in support of Naval Strike and Air Warfare Center (NSAWC).
  • System in operational testing by Enterprise, Eisenhower, Stennis, and Iwo Jima.
  • Web site and database components of system being transitioned from NPS to FNMOC. Beta version expected on FNMOC development server winter 2007.
  • System adapted and tested for use by USAF Air Combat Command (ACC) and Pacific Air Forces (PACAF).
  • Adaptations of system for other warfare areas:
  • System being adapted for use in NSW (collaboration with NOSWC)
  • Discussions underway to adapt system for automation of ocean model product metrics (collaboration with NRL-SSC)
  • Planning underway to adapt system for use in ASW (collaboration with ASW directorate)

  45. Online Real World Metrics System – Value
  • Allows METOC units to routinely and efficiently document their performance and the impacts they have on their customers' operations (e.g., impacts on mission planning and execution).
  • Eliminates need for METOC units to develop expertise in metrics data analysis and IT issues.
  • Provides immediate results to units and their customers once data has been entered.
  • Provides METOC leadership with a real time dashboard display of individual unit and overall community:
  • Capacity / readiness
  • Quality
  • Impacts
  • Efficiency / return on investment
  See brief by CDR Steve Woll for more on uses of metrics by METOC leadership.

  46. Online Real World Metrics System – Value
  • Provides METOC leadership with quantitative measures for:
  • allocating resources for research, development, operations, education, and training
  • use in OPNAV assessments and budget decisions
  • Provides warfighter leadership with quantitative measures for:
  • assessing how their performance (e.g., mission success) is affected by environmental conditions and forecasts
  • determining how to more effectively use METOC info in planning and executing missions (e.g., what products to use, when to use them, how to weight them in decision making processes, etc.)
  • determining how to allocate resources for research, development, operations, education, and training
  • accounting for METOC factors in M&S, OPNAV assessments, and budget decisions
  See brief by CDR Steve Woll for more on uses of metrics by METOC leadership. See brief by CDR Mike Angove for more on uses of metrics in assessment / budget processes.

  47. Future Extensions - ASW: Metrics as Tools for Evaluating Different METOC Support Strategies
  [Figure: level of effort of METOC support activities — COCOM battlespace prep, survey program, TAS, TOS, mission execution briefs, and OPLAN/CONPLAN studies — across time scales from years to hours and levels from strategic to operational to tactical.]
  From ASW Coordination/CONOPs Conf, 14 Mar 05; CAPT Jeff Best, CNMOC Director for ASW; CDR Van Gurley, CNMOC Deputy Director for ASW

  48. Future Extensions - ASW: Metrics as Tools for Evaluating Different METOC Support Strategies
  [Figure: the same chart as slide 47, adding level of impact and the support activities — operational planning team engagement, mission planning cell engagement, OPLAN/CONPLAN development, environmental reconnaissance, and environmental reconstruction and analysis.]
  From ASW Coordination/CONOPs Conf, 14 Mar 05; CAPT Jeff Best, CNMOC Director for ASW; CDR Van Gurley, CNMOC Deputy Director for ASW

  49. Future Extensions - ASW: Metrics as Tools for Evaluating Different METOC Support Strategies
  [Figure: the same chart, highlighting the trade space opportunity between level of impact and level of effort.]
  From ASW Coordination/CONOPs Conf, 14 Mar 05; CAPT Jeff Best, CNMOC Director for ASW; CDR Van Gurley, CNMOC Deputy Director for ASW

  50. Future Extensions - ASW
  Preliminary proposal: a METOC metrics program for ASW should be designed to:
  • Develop ASW real world data collection, analysis, and reporting system
  • Develop ability to model ASW operations and METOC support of them
  • Use real world and model systems to test and experiment (e.g., with variations in objectives, weighting of objectives, means to meeting objectives, success measures, etc.)
  • Report results rapidly, automatically, and in optimal formats to both METOC personnel and ASW customers
  • Support facts-based decision making at multiple levels by both METOC personnel and ASW customers
