
METOC Metrics for MIW: Metrics System Design/Best Practices






Presentation Transcript


1. METOC Metrics for MIW: Metrics System Design/Best Practices. Tom Murphree, David Meyer, Naval Postgraduate School (NPS), murphree@nps.edu. Bruce Ford, Manuel Avila, Clear Science, Inc. (CSI), bruce@clearscienceinc.com. Paul Vodola, Matt McNamara, Luke Piepkorn, and Ed Weitzner, Systems Planning and Analysis (SPA), pvodola@spa.com. Presented to MIW METOC Metrics Symposium I, CNMOC, Stennis Space Center, MS.

2. METOC Metrics: Steps for Developing a Metrics System
1. Determine what we want to know and be able to do once we have a fully functioning metrics system.
2. Determine what metrics we need in order to know and do these things.
3. Determine what calculations need to be done in order to produce the desired metrics.
4. Determine what data need to be collected in order to do the desired calculations (i.e., data analyses).
5. Determine the process to use to collect and analyze the needed data.
6. Implement the data collection and analysis process.
   a. If data can be collected, go to step 7.
   b. If data can't be collected, repeat steps 1-5 until it can be.
7. Use metrics obtained from steps 1-6 to improve processes, products, and operational impacts.
8. Assess the results of steps 1-7.
9. Make adjustments to steps 1-8.
10. Repeat steps 1-9 until satisfied with the process and the outcomes from the process.
The steps above describe the process for the real-world data component of a METOC metrics project. The steps are the same for the operational analysis and modeling component of the project, except for steps 4-6, in which data collection is replaced by modeling and generation of model output (model proxy data). A schematic rendering of this loop appears in the sketch below.
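Read as control flow, steps 6-10 form an iterative loop with an inner retry on data collection. The following Python sketch is a schematic rendering only; every function, value, and goal string in it is a hypothetical placeholder, not part of any fielded system.

```python
# Schematic of the 10-step metrics-development loop above. All functions are
# hypothetical stand-ins for real planning and collection activities.

def plan(goals):
    # Steps 1-5: goals -> metrics -> calculations -> data -> collection process
    return {"goals": goals, "data_needed": ["forecasts", "observations", "outcomes"]}

def try_collect(collection_plan, attempt):
    # Step 6: implement collection; pretend the first attempt fails (step 6b)
    return ["record"] if attempt > 0 else None

def develop_metrics_system(goals, max_cycles=5):
    for cycle in range(max_cycles):                  # steps 9-10: adjust and repeat
        p = plan(goals)
        records = try_collect(p, cycle)
        if records is None:
            continue                                 # step 6b: repeat steps 1-5
        metrics = {"accuracy_pct": 80 + 5 * cycle}   # step 3's calculations on the data
        print(f"cycle {cycle}: {metrics}")           # step 7: use metrics to improve support
        if metrics["accuracy_pct"] >= 90:            # step 8: assess results
            return metrics
    return None

develop_metrics_system(["assess MIW forecast impact"])
```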

3. METOC Metrics: Data and Methods
[Figure: flow diagram. METOC forecasts* and METOC observations feed METOC performance metrics; operational plans and operational outcomes feed operational performance metrics; together these yield metrics of METOC impacts on operational performance. Apply this process to both real-world data and output from operations models. *or other products]

4. 3-D METOC Metrics Space
[Figure: three-axis diagram. Metrics Type axis: METOC performance, proxy ops impacts, operational impacts. Organizational axis: individual unit, multiple units, whole organization. Spatial/Temporal axis: small region/short period up to multiple regions/long period. The metrics process is to be conducted in this 3-D space and continuously over time. For details on this figure, see the speaker notes section of this slide and the symposium summary.]

5. 3-D METOC Metrics Space: Initial Focus Region
[Figure: the same three-axis diagram as slide 4, with an initial focus region (~next 1-3 years) marked within the space. The metrics process is to be conducted in this 3-D space and continuously over time. For details, see the speaker notes section of this slide and the symposium summary.]

6. 3-D METOC Metrics Space: End State Focus Region
[Figure: the same three-axis diagram, with an end-state focus region (3+ years ahead) marked within the space. For details, see the speaker notes section of this slide and the symposium summary.]

7. 3-D METOC Metrics Space: MIW Examples
[Figure: the 3-D space populated with MIW examples. Metrics Type axis: T forecast accuracy, SLD forecast accuracy, and SLD forecast accuracy to screen penetration correlation. Organizational axis: unit level support, RBC, CNMOC, MIW community. Spatial/Temporal axis: single point/one day, exercise region/several weeks, multiple ops areas/several years. The metrics process is to be conducted in this 3-D space and continuously over time.]

8. Basic METOC Support Information Flow [figure]

9. Basic METOC Support Information Flow
[Figure: information flow between METOC and its customers.]

10. Basic METOC Support Cycle
• The concept applies to most areas of METOC support.
• For METOC metrics data collection to succeed, we must collect information regarding each cycle component.
• In addition, there are post-event data to collect to complete the picture.
• This cycle may be repeated multiple times prior to an event.

11. Basic METOC Support Cycle: Short Lead (Real World) Operations
[Figure: a planning cycle coupled to an execution cycle.]

12. Basic METOC Support Cycle: Exercise Planning
[Figure: exercise timeline running from IPC to MPC to FPC, then the pre-event period and execution.]

13. Basic METOC Support Cycle: Short Duration Exercises
[Figure: the exercise timeline (IPC, MPC, FPC, pre-event, event, post-event), with observations and outcomes collected around event execution.]

14. Basic METOC Support Cycle: Long Duration Exercises (>24 hours)
[Figure: the same timeline, with multiple forecasts, multiple obs, and multiple outcomes collected across the event.]

15. Basic METOC Support Cycle: Inter-exercise
[Figure: timeline showing pre-event preparation for the next event while an event is in progress.]

16. Exercise Data Collection Goal
[Figure: the long-duration exercise timeline (IPC, MPC, FPC, pre-event, event), with many forecasts, many obs, and many outcomes collected.]
For each briefing cycle, collect:
• Forecast elements
• Verifying obs
• Recommendations made (by category)
• Changes between pre- and post-forecast/recommendation plans
• Customer outcomes
A hypothetical record schema for one briefing cycle is sketched below.
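The per-cycle collection list above implies a small record schema. The following Python sketch is one hypothetical rendering, assuming a single recommendation per cycle and free-text outcomes; all field names and example values (including the exercise name) are illustrative, not drawn from any fielded system.

```python
from dataclasses import dataclass, field

@dataclass
class BriefingCycleRecord:
    """One record per briefing cycle: the smallest 'unit' of support data."""
    exercise: str                                           # exercise or operation name
    cycle_time: str                                         # timestamp of the briefing cycle
    forecast_elements: dict = field(default_factory=dict)   # element -> forecast value
    verifying_obs: dict = field(default_factory=dict)       # element -> observed value
    recommendation_category: str = "none"                   # e.g., "sensor placement"
    plan_changed: bool = False                              # plan change pre- vs post-forecast?
    customer_outcome: str = ""                              # e.g., "search completed on schedule"

# Hypothetical example record for one briefing cycle
rec = BriefingCycleRecord(
    exercise="EXERCISE_X",
    cycle_time="2008-05-12T06:00Z",
    forecast_elements={"SLD_ft": 45, "sea_state": 2},
    verifying_obs={"SLD_ft": 40, "sea_state": 2},
    recommendation_category="sensor placement",
    plan_changed=True,
    customer_outcome="search completed on schedule",
)
```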

17. Exercise METOC Metrics
[Figure: the same exercise timeline, with many forecasts, many obs, and many outcomes.]
Metrics derivable from the collected records:
• % accuracies
• Numbers and types of recommendations made
• % of time recommendations impacted the plan (influenced a plan change)
• Reasons for not implementing recommendations
• Customer outcomes
• Comparisons with briefing cycle metrics
…all sortable by time, geographic region, unit, or a combination. A sketch of the roll-up calculation follows.
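Continuing the hypothetical BriefingCycleRecord schema from the previous sketch, the roll-up from records to exercise-level metrics might look like this; the accuracy tolerance is an assumed parameter, not a doctrinal threshold.

```python
from collections import Counter

def exercise_metrics(records, tolerance=5.0):
    """Aggregate briefing-cycle records into exercise-level metrics."""
    hits = total = impacted = 0
    rec_types = Counter()
    for r in records:
        for element, forecast in r.forecast_elements.items():
            observed = r.verifying_obs.get(element)
            if observed is None:
                continue                       # unverified element: skip
            total += 1
            if abs(forecast - observed) <= tolerance:
                hits += 1                      # forecast counted accurate within tolerance
        if r.recommendation_category != "none":
            rec_types[r.recommendation_category] += 1
            impacted += r.plan_changed         # True counts as 1
    n_recs = sum(rec_types.values())
    return {
        "pct_accuracy": 100 * hits / total if total else None,
        "recommendations_by_type": dict(rec_types),
        "pct_recs_impacting_plan": 100 * impacted / n_recs if n_recs else None,
    }

print(exercise_metrics([rec]))   # rec: the example record from the previous sketch
```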

18. Exercise METOC Metrics: Benefits of Time in Collecting Metrics Data
Examples:
• During 2008, when METOC recommendations regarding sensor placement were implemented, XX mines were detected, as opposed to XX mines when METOC sensor placement recommendations were not implemented.
• During exercise planning conferences, METOC recommendations influenced exercise locations XX% of the time.
• XX prediction accuracy has improved over the last XX months for exercises happening along the Gulf Coast.
• MIW RBC performance surface data was used in XX% of customer briefs, and the overall accuracy of those briefs was XX%.

19. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Broad Categories
1. Institutionalization within the military unit – addressing paradigm shifts that must occur to ensure regular and consistent data collection
2. Human behavioral factors – understanding the priorities and limitations of those tasked with entering critical data
3. Human-machine interface design – designing a system that is intuitive and allows rapid entry, updating, administration, and use by the military unit's managers
Note: Many best practices may fall into multiple categories.

20. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #1: Begin with the end in mind
• Determine what measures of success (metrics) the leadership needs to monitor, how often, in what format, etc.
• Determine what metrics the individual forecast activity needs to monitor
• Determine what metrics the individual forecaster needs to monitor
• Conceptualize/sketch the eventual metrics output for all levels
• Ensure the end is kept in mind throughout the planning process

21. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #2: Involve key persons from all levels of the organization in the planning process
• Often one level of an organization does not fully understand how other levels operate, or its information is dated
• Where possible, meet with key persons at a central location, with no duties other than planning the metrics process, for 1-2 days, depending on the scope of the metrics project. Multiple meetings may be required.
• Form multi-level committees to fact-find if necessary

22. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #3: Focus the scope of data collection at the correct level
• Very often leaders will want grand-scale metrics that depend on smaller-scale metrics (without a system providing the smaller-scale metrics)
• Metrics are most useful when they provide information to multiple levels of the organization (e.g., individual forecaster, immediate supervisor, forecast activity commander, directorate and higher)
• Fact-based metrics are most useful when developed from data from the lowest levels of the organization
• It is critical to collect data on the smallest "unit" of support (e.g., forecast, mitigation recommendation)
• Quality higher-level metrics (directorate, CNMOC) rely on lower-level data collection/metrics
• Operational modeling is enhanced by the quality and quantity of real-world information (e.g., significant numbers of data points)

23. Bricks and House Analogy
Used to illustrate the:
• Lowest common metrics unit
• Hierarchy of metrics
[Figure: a house labeled "Impact on Warfighter Customers (higher-level metrics)" built from bricks, each brick a support unit record.]
Each "brick":
• Represents a different warfare support area or a subset of an area
• It takes many records to make good high-level metrics
• Each record must be well constructed to make quality high-level metrics
A support unit record contains:
• Forecast data
• Verification data
• Customer plans
• Customer outcomes
• Recommendations
• Other data

24. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #4: Invest effort, time, and funds in the planning process
• Metrics efforts can be very involved, so changes in mid-development can be confusing and costly
• Collection interfaces are complicated web-database applications
• Changes are often the result of a lack of understanding of how the forecast activity works or how the customer warfighter conducts operations
• It pays to get familiar with both the forecast activity and the warfighter:
  • Involve them in the planning process
  • Observe real-world operations
  • Test metrics collection ideas within an exercise

25. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #5: Assist all levels of the organization in understanding how the metrics process works and in defining attainable project scope
• State and re-state the goals and objectives
• Some forecasters will fear evaluation of their performance
• Some will perceive the effort as a "big brother" tactic
• Prepare to repeat main points
• When possible, use easy-to-understand conceptual analogies for grasping and easily discussing key topics. Examples follow on the next two slides.

26. Bricks and House Analogy
Used to illustrate the:
• Lowest common metrics unit
• Hierarchy of metrics
[Figure: the same house-of-bricks diagram as slide 23: higher-level metrics (impact on warfighter customers) built from many well-constructed support unit records.]
Each "brick":
• Represents a different warfare support area or a subset of an area
• It takes many records to make good high-level metrics
• Each record must be well constructed to make quality high-level metrics
A support unit record contains: forecast data, verification data, customer plans, customer outcomes, recommendations, and other data.

27. Fence and Gates Analogy: Overall Concept
[Figure: immediate goals enclosed by a fence, with gates opening toward future capabilities at priorities 1, 2, and 3.]
Used for defining the scope of a metrics project/phase/near-future objective.

28. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #6: Start with "low-hanging fruit"
• Establish a metrics program using an area that will yield quality results
• Don't tackle the hardest problems first
• Use simpler problems to work out metrics procedures and technical hurdles
Examples:
• NSW: worldwide desire for metrics, but the initial METOC metrics effort covered training missions in the Southern California operating area
• ASW: refined potential collection to three phases; the reach-back cell was the low-hanging fruit, the NOATs more challenging

29. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #7: Be aware that you may need to propose significant procedural changes. Propose them when necessary.
• Some metrics data collection will require major changes in the way forecast activities function
  • E.g., a requirement to collect post-mission data from returning strike warfare aircrews
• Some metrics data collection may require changes in the way the warfighter customer does business
  • E.g., allowing forecast personnel to participate in post-mission briefs
• What to do:
  • Get buy-in from the highest levels of leadership, both to mandate change within the forecast organization and to lobby for cooperation from the warfighter organization
  • If multiple procedural changes are necessary, consider instituting them one at a time to allow "institutionalization" before the next change

30. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #8: Have the smallest possible impact on the forecast activity
• Lobby for only those procedural changes that are essential for collecting the necessary metrics data
• Automate the collection wherever possible:
  • Collect observation data post-transmission if possible
  • If forecast data is transmitted and parse-able, collect it automatically (see the sketch below)
  • Some customer plans and outcomes can be collected remotely (e.g., Navy MPRA)
  • Avoid re-entering data if possible
• These are opportunities to innovate the process
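As an illustration of the "parse-able forecast" point: if the forecast activity already transmits a structured text product, a small parser can harvest metrics data with zero forecaster effort. The message format below is entirely hypothetical; a real implementation would target whatever product the activity actually transmits.

```python
import re

# Hypothetical transmitted forecast product (format invented for illustration)
MESSAGE = """\
FCST EXERCISE_X VALID 121200Z
SLD 45 FT
SEASTATE 2
VIS 7 NM
"""

def parse_forecast(text):
    """Pull forecast elements out of a transmitted product automatically."""
    elements = {}
    for line in text.splitlines()[1:]:              # skip the header line
        m = re.match(r"([A-Z]+)\s+([\d.]+)", line)  # ELEMENT VALUE [UNITS]
        if m:
            elements[m.group(1)] = float(m.group(2))
    return elements

print(parse_forecast(MESSAGE))
# {'SLD': 45.0, 'SEASTATE': 2.0, 'VIS': 7.0}
```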

31. Best Practices for Measuring the Impact of Environmental Products on Warfighters: Forecast Builder [figure]

32. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #9: Have the smallest possible impact on the warfighter customer
• If possible, do not involve the warfighter customer (e.g., collect plans and outcomes remotely, or from embedded forecast personnel)
• Leverage existing procedures such as regular pre/post-mission briefings (e.g., strike debrief)
• If information is not available via other means:
  • Clear special interactions with the warfighter customer's leadership
  • Designate knowledgeable key persons within the warfighter organization from whom to collect information
  • Collect the required data in a consistent, professional manner (e.g., same time, same place, same face). This will institutionalize the process.

33. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #10: Where changes are required to collect metrics data, strive to institutionalize those changes as quickly as possible
• Stress the need to make metrics collection part of the overall environmental support process:
  • Mission planning
  • Forecast
  • Mission execution/observation
  • Post-mission debrief
• Paradigm shift: the forecast process is not complete until the metrics data is collected
  • This thinking must be pushed down to the forecaster level
• In general, change works best when:
  • It benefits the personnel who are required to change (e.g., use of a handy tool, time savings)
  • It quickly becomes part of the process (institutionalization)

34. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #11: Use collected data to model warfighting scenarios and sensitivities
• Extends the capability of real-world collection and display of metrics data
• The current situation does not reflect realistic wartime possibilities
• Advances in campaign modeling/simulation can apply current metrics to complicated potential scenarios by:
  • Holding the environmental metrics fixed and varying the campaign variables
  • Holding the campaign variables fixed and varying the environmental variables
• Lessons learned in simulations can be used to adjust the forecast process to maximize the positive impact on the warfighter customer
A toy sensitivity sketch follows.
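The two sensitivity modes amount to a simple experimental design over a campaign model. In the Python sketch below, campaign_outcome is an invented toy function standing in for a real campaign simulation; nothing about it reflects actual MIW modeling.

```python
def campaign_outcome(forecast_accuracy, n_sweeps):
    """Toy stand-in for a campaign model: mines cleared as a function of a
    METOC performance metric (forecast accuracy) and a campaign variable."""
    return round(100 * forecast_accuracy * (1 - 0.5 ** n_sweeps))

# Mode 1: hold the environmental metric fixed, vary the campaign variable
for n in (1, 2, 4):
    print(f"accuracy=0.80, sweeps={n}: {campaign_outcome(0.80, n)} mines cleared")

# Mode 2: hold the campaign variable fixed, vary the environmental metric
for acc in (0.60, 0.80, 0.95):
    print(f"accuracy={acc:.2f}, sweeps=2: {campaign_outcome(acc, 2)} mines cleared")
```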

35. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #12: Make the collection interface uncomplicated and intuitive
• Use the best available research on human-machine interface design
• Favor the most common outcomes: if 90% of missions report no impact, make that report the easiest to make
• If possible, customize the report form for the forecast activity/mission
• Require that collection forms follow the workflow of the personnel entering the data
• Simple electronic forms that expand depending on user inputs work well (see the sketch below)
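One way to favor the most common outcomes and let forms expand depending on user inputs is a branching report flow: the no-impact case is a single answer, and detail fields appear only when an impact is reported. A minimal console sketch; the field names and flow are assumptions, not a fielded form.

```python
def collect_mission_report():
    """Minimal branching report form: the common case (no impact) is one answer."""
    impact = input("Did environmental conditions impact the mission? [y/N] ").strip().lower()
    if impact != "y":
        return {"impact": False}            # the ~90% case: done in one step
    # The form "expands" only for the uncommon case
    return {
        "impact": True,
        "phenomenon": input("Which phenomenon (e.g., SLD, sea state, visibility)? "),
        "effect": input("Effect on mission (delay/abort/degraded/other)? "),
        "forecast_flagged": input("Was it forecast? [y/N] ").strip().lower() == "y",
    }
```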

36. [figure]

37. Best Practices for Measuring the Impact of Environmental Products on Warfighters
Best practice #13: Make METOC metrics reports that are real-time, on-demand, and level-appropriate
• Allow metrics to be displayed by:
  • Time span
  • Forecast activity/forecaster
  • Geographical region
  • Environmental phenomena
  • Any/all of the above
• Prepare a metrics display that is appropriate to the visitor:
  • Leadership will dictate what is appropriate for their organization
  • Forecast activity X may not need to see activity Y's forecast metrics
  • Leaders will likely require a customized "dashboard" display
A filter-and-aggregate sketch follows.
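The display dimensions listed here map naturally onto a filter-then-aggregate query. A minimal sketch over flat record dicts; the field names, sample values, and region labels are assumptions for illustration only.

```python
def filter_records(records, start=None, end=None, region=None, forecaster=None):
    """Slice metrics records by any combination of the slide's display dimensions."""
    out = []
    for r in records:
        if start and r["time"] < start: continue
        if end and r["time"] > end: continue
        if region and r["region"] != region: continue
        if forecaster and r["forecaster"] != forecaster: continue
        out.append(r)
    return out

# Hypothetical verification records: hit = forecast verified as accurate
records = [
    {"time": "2008-05-12", "region": "Gulf Coast", "forecaster": "A", "hit": True},
    {"time": "2008-06-02", "region": "SOCAL", "forecaster": "B", "hit": False},
]
subset = filter_records(records, region="Gulf Coast")
accuracy = 100 * sum(r["hit"] for r in subset) / len(subset)
print(f"Gulf Coast accuracy: {accuracy:.0f}%")
```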

38. [Figure: metrics display distinguishing negative impacts mitigated from negative impacts received.]

39. Best Practices for Measuring the Impact of Environmental Products on Warfighters: Summary
1. Begin with the end in mind
2. Involve key persons from all levels of the organization in the planning process
3. Focus the scope of data collection at the correct level
4. Invest effort in the planning process
5. Assist all levels of the organization in understanding how the metrics process works and in defining attainable project scope
6. Start with "low-hanging fruit"
7. Be aware that you may need to propose significant procedural changes; propose them when necessary
8. Have the smallest possible impact on the forecast activity
9. Have the smallest possible impact on the warfighter customer
10. Where changes are required to collect metrics data, strive to institutionalize those changes as quickly as possible
11. Use collected data to model warfighting scenarios and sensitivities
12. Make the collection interface uncomplicated and intuitive
13. Ensure metrics reports are real-time, dynamic, and level-appropriate

40. References: http://wx.met.nps.navy.mil/metrics/metrics_reports.html

41. Project Contact Information

Naval Postgraduate School, Department of Meteorology, 254 Root Hall, 589 Dyer Road, Monterey, CA 93943-5114; Fax: 831-656-3061
• Tom Murphree, Ph.D. (project lead), murphree@nps.edu, murphrjt@nps.navy.smil.mil, Phone: 831-656-2723
• David Meyer, dwmeyer@nps.edu, Phone: 831-656-3647

Clear Science, Inc., 7801 Lonestar Rd, Suite #17, Jacksonville, FL 32211; Fax: 904-379-9704
• Bruce W. Ford, bruce@clearscienceinc.com, fordbw@tsc-jax.navy.smil.mil, Phone: 904-379-9704
• Manuel Avila, manny@clearscienceinc.com, Phone: 904-379-9704

Systems Planning and Analysis, Inc., 2001 N. Beauregard Street, Alexandria, VA 22311; Fax: 703-399-7365
• Paul Vodola, Ph.D., pvodola@spa.com, Paul.Vodola_Contractor@spa.dtra.smil.mil, Phone: 703-399-7225
• Matt McNamara, mmcnamara@spa.com, Phone: 703-399-7266
• Luke Piepkorn, lpiepkorn@spa.com, Phone: 703-399-7239
• Ed Weitzner, eweitzner@spa.com, Phone: 703-399-7229

42. Backup Slides

43. Abstract
The collection of data for determining the impacts of METOC products on military operations is generally problematic. In some special situations, the data collection process may be completely automated, but in the vast majority of cases, data collection from and/or by humans is required. In our experience, impacts data collection systems requiring human intervention that adhere to a growing set of best practices greatly increase the likelihood of collecting accurate, complete, quantitative, and objective data on a continual basis. Such best practices fall into three categories:
1. Institutionalization within the military unit – addressing paradigm shifts that must occur to ensure regular and consistent data collection
2. Human behavioral factors – understanding the priorities and limitations of those tasked with entering critical data
3. Human-machine interface design – designing a system that is intuitive and allows rapid entry, updating, administration, and use by the military unit's managers
This presentation proposes a set of best practices within each category on which to design and build data collection systems for assessing operational impacts. Examples of operational impacts metrics systems that we have developed for USAF and USN units are presented, along with examples of the resulting impacts metrics.

44. Steps for Developing a METOC Metrics System
1. Determine what we want to know and be able to do once we have a fully functioning metrics system.
2. Determine what metrics we need in order to know and do these things.
3. Determine what calculations need to be done in order to produce the desired metrics.
4. Determine what data need to be collected in order to do the desired calculations (i.e., data analyses).
5. Determine the process to use to collect and analyze the needed data.
6. Start with "low-hanging fruit." Implement the data collection and analysis process.
   a. If data can be collected, go to step 7.
   b. If data can't be collected, repeat steps 1-5 until it can be.
7. Use metrics obtained from steps 1-6 to improve processes, products, and operational impacts.
8. Assess the results of steps 1-7.
9. Make adjustments to steps 1-8.
10. Repeat steps 1-9 until satisfied with the process and the outcomes from the process.
The steps above describe the process for the real-world data component of a METOC metrics program. The steps are the same for the operational analysis and modeling component of the proposed program, except for steps 4-6, in which data collection is replaced by modeling and generation of model output (model proxy data).

45. Definitions
• Metrics: objective, quantitative, data-based measures of an organization's operations, products, and services.
• METOC (Meteorology and Oceanography) Metrics: metrics of a METOC organization's products and impacts. Three main types:
  • Performance metrics: metrics of capacity, readiness, quality, and efficiency/return on investment
  • Impacts metrics: metrics of impacts on warfare customer operations (planning, execution, post-assessment)
  • Phenomena metrics: metrics that relate product performance, customer performance, or operational impacts to specific environmental phenomena
• Methods for generating METOC metrics:
  • Collect/analyze real-world data on METOC and customer ops
  • Model METOC and customer ops

46. Why Bother with Measuring Our Impact on the Warfighter?
• Better forecasts
• Increased safety
• Provides a truth-based tool for managers who allocate:
  • Research dollars
  • Personnel
  • Equipment
  • Time
• Budget justification/defense

47. 3-D METOC Metrics Space
[Figure: the same three-axis diagram as slide 4 (Metrics Type, Organizational, and Spatial/Temporal axes). The metrics process is to be conducted in this 3-D space and continuously over time.]

48. Integration of Real World and Model Metrics
[Figure: real-world data collection and analysis yields real-world metrics; an operational engagement model yields model metrics; the two are synthesized into improved metrics, process improvements, and improved warfighting operations.]
