
Submarine Maintenance & Modernization EPS Metrics Read Ahead Presentation






Presentation Transcript


  1. Submarine Maintenance & Modernization EPS Metrics Read Ahead Presentation Prepared 8 March 2007

  2. SUBMARINE FORCE DIRECTION FOR 2007
  R 192306Z DEC 06 COMSUBFOR NORFOLK VA SUBMARINE FORCE DIRECTION FOR 2007
  • Alignment/Enterprise Management
  • CFT Alignment - extend effective alignment through our Cross Functional Teams by:
    1) mapping processes and costs to develop a productivity baseline,
    2) developing major process metrics, and
    3) developing hierarchical metrics that directly support USE top tier objectives.
  • Common readiness processes
  • Common metrics - develop a predictive metric, common to the Undersea and Naval Air Enterprises, that improves productivity in ships' depot maintenance programs. Develop accurate maintenance notionals to properly predict budget requirements.

  3. METRICS 101 REFRESHER
  • You can’t manage what you don’t understand, you can’t understand what you don’t measure, you can’t measure what you don’t define, you can’t define what you don’t understand…
  • “You get what you measure” (metrics drive behaviors)
  • Metric types:
    • Lagging (rear view mirror) / Leading (windshield)
    • Control / Improve
    • Activity (process) / Results (output)
    • Continuous / Discrete
  • “Garbage In = Garbage Out”
  • Data vs. Information
  • Dashboard / Balanced Scorecard
  • Cascading levels – integrated and aligned
  • Defined and understood (data source, methodology, all of the above…)
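To make the “defined and understood” bullet concrete, here is a minimal sketch of how each metric’s type, data source, and methodology could be recorded in one place. The field names and the example Ao entry are illustrative assumptions, not part of the original brief.

```python
# Illustrative sketch only: a structure that makes a metric's definition explicit
# (type, data source, methodology), per the "defined and understood" bullet above.
from dataclasses import dataclass
from enum import Enum


class MetricType(Enum):
    LEADING = "leading"    # "windshield" - predictive
    LAGGING = "lagging"    # "rear view mirror" - historical


@dataclass
class MetricDefinition:
    name: str              # e.g. "Ao"
    owner: str             # e.g. "Sub Team 1"
    metric_type: MetricType
    data_source: str       # where the raw data comes from
    methodology: str       # how the value is computed
    drives_behavior: str   # what behavior the metric is intended to drive


# Hypothetical example entry for the Ao metric tracked by Sub Team 1.
ao_metric = MetricDefinition(
    name="Ao (operational availability)",
    owner="Sub Team 1",
    metric_type=MetricType.LAGGING,
    data_source="CNO availability start/completion dates",
    methodology="uptime / (uptime + downtime) over the reporting period",
    drives_behavior="minimize days lost to depot maintenance",
)
```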

  4. Metrics/Data State of the Union
  METRICS PACKAGES / PRESENTATIONS
  • NAVSEA Tier 3 EBA 1 Performance Metrics for CNO Availabilities
  WAR ROOM
  • CNO Availability Assessment Report (04)
  • DMP/ERO/EOH/SRA Comparison Report (Cannon)
  METRICS
  • Ao (Sub Team 1)
  • Total Alt changes per FY (Sub Team 1)
  • Days gained by MCA (Sub Team 1)
  • Non-nuclear ShipAlt planning and authorization letters (Sub Team 1)
  • DSRA/ERO/EOH/DMP major ship system analysis (PMS 392)
  • Production vs. Services (PMS 392)
  • Total TYCOM manday comparison (PMS 392)
  • Cost per % complete (Shipyards)
  • WF 250 (Shipyards)
  • 100-700 percent complete (Shipyards)
  • Testing (Shipyards)
  • Overtime (Shipyards)
  • Closed Key Ops (Shipyards)
  • AWP timeliness (SUBMEPP)
  • AWP product quality and accuracy (SUBMEPP)
  • Maintenance instruction document program feedback (SUBMEPP)
  • Rotatable pool delivery timeliness (SUBMEPP)
  • Rotatable pool component quality (SUBMEPP)
  • Timeliness of feedback answered (SUBMEPP)
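As a quick illustration of the inventory above, the sketch below organizes the listed metrics into a simple catalog keyed by owning organization; the structure itself is an assumption, and only the metric names come from the slide.

```python
# Sketch (illustrative): the metric inventory on this slide as a catalog keyed by
# owning organization, a possible starting point for an integrated, cascading set.
METRIC_CATALOG = {
    "Sub Team 1": [
        "Ao",
        "Total Alt changes per FY",
        "Days gained by MCA",
        "Non-nuclear ShipAlt planning and authorization letters",
    ],
    "PMS 392": [
        "DSRA/ERO/EOH/DMP major ship system analysis",
        "Production vs. Services",
        "Total TYCOM manday comparison",
    ],
    "Shipyards": [
        "Cost per % complete",
        "WF 250",
        "100-700 percent complete",
        "Testing",
        "Overtime",
        "Closed Key Ops",
    ],
    "SUBMEPP": [
        "AWP timeliness",
        "AWP product quality and accuracy",
        "Maintenance instruction document program feedback",
        "Rotatable pool delivery timeliness",
        "Rotatable pool component quality",
        "Timeliness of feedback answered",
    ],
}

# Quick summary of how many metrics each organization currently owns.
for owner, metrics in METRIC_CATALOG.items():
    print(f"{owner}: {len(metrics)} metrics")
```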

  5. NAE / Carrier Maintenance and Modernization – Current vs. Future Cost [chart]

  6. Ao Metric

  7. Ao Metric Analysis
  • 2 out of 31 data points from FY05-06 were executed without losing any days (714 PNS, 764 NNSY)
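The slides do not spell out how Ao is calculated. Assuming the conventional definition of operational availability (uptime divided by uptime plus downtime), a minimal sketch looks like this; the example numbers are hypothetical, not taken from the slide.

```python
# Minimal sketch of the standard operational availability (Ao) calculation.
# The presentation does not define Ao explicitly; this assumes the conventional
# Ao = uptime / (uptime + downtime) over a reporting period.

def operational_availability(uptime_days: float, downtime_days: float) -> float:
    """Return Ao as the fraction of total time the unit was available."""
    total = uptime_days + downtime_days
    if total == 0:
        raise ValueError("uptime and downtime cannot both be zero")
    return uptime_days / total


# Hypothetical example: a boat with 335 days up and 30 extra days lost
# to a late availability in a 365-day period.
print(f"Ao = {operational_availability(335, 30):.3f}")
```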

  8. USS SANTA FE (SSN 763) FY06 DMP, 9/30/06 – 10/30/07
  02 March 2007 (A+23) – PS: Mark Evans, CO: CDR Vern Parks
  1. KEY PROJECT PARAMETERS
  2. AVAILABILITY ASSESSMENT
  WAR ROOM METRICS PAGE – EXECUTION – SLIDE 1 OF 21

  9. USS MEMPHIS (SSN-691) PIRA, Pre-Availability Planning, 7/01/07 – 4/01/08
  Portsmouth Naval Shipyard – 26 February 2007 (A-18 weeks)
  KEY PROJECT PARAMETERS
  WAR ROOM METRICS PAGE – PLANNING – SLIDE 1 OF 15

  10. ERO Comparison

  11. DMP Comparison

  12. Sub Team One Metrics
  • FY04-06: 100% of Alts had changes

  13. Sub Team One Metrics

  14. Sub Team One Metrics

  15. Sub Team One Metrics

  16. LANT/PAC Comparison
  Data shows a large delta between LANT and PAC performance:
  • Non-VLS ERO, 93-06
    • Average (LANT): 25.5-month duration, $247,980,000 per avail, 367,801 mandays
    • Average (PAC): 28.7-month duration, $308,600,000 per avail, 432,039 mandays
  • VLS DMP, 93-06
    • Average (LANT): 13.5-month duration, $133,380,000 per avail, 178,072 mandays
    • Average (PAC): 16.9-month duration, $156,672,000 per avail, 201,031 mandays
  • Non-VLS DMP, 89-92
    • Average (LANT): 12.1-month duration, $67,920,000 per avail, 151,162 mandays
    • Average (PAC): 18-month duration, $103,540,000 per avail, 169,583 mandays
  *Data taken from the Cannon report
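For reference, the sketch below recomputes the LANT/PAC deltas implied by the Cannon-report averages quoted above. The figures are copied from the slide; the percentage calculation itself is added here as an illustration.

```python
# Sketch: recompute the LANT vs. PAC deltas from the Cannon-report averages
# quoted on the slide. Figures are from the slide; the delta math is added here.

AVERAGES = {
    # availability type: (LANT, PAC) tuples of (months, dollars, mandays)
    "Non-VLS ERO 93-06": ((25.5, 247_980_000, 367_801), (28.7, 308_600_000, 432_039)),
    "VLS DMP 93-06":     ((13.5, 133_380_000, 178_072), (16.9, 156_672_000, 201_031)),
    "Non-VLS DMP 89-92": ((12.1,  67_920_000, 151_162), (18.0, 103_540_000, 169_583)),
}

for avail_type, (lant, pac) in AVERAGES.items():
    # Percentage by which the PAC average exceeds the LANT average.
    deltas = [(p - l) / l * 100 for l, p in zip(lant, pac)]
    print(f"{avail_type}: PAC exceeds LANT by "
          f"{deltas[0]:.0f}% duration, {deltas[1]:.0f}% cost, {deltas[2]:.0f}% mandays")
```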

  17. XYZ Draft Metric (months)

  18. Interview Results
  Question: What is the USE submarine maintenance and modernization METRIC that is most important / useful to you?
  • Ao
  • Cost per percent complete
  • Total cost of avail
  • 100-700 percent complete
  • Resources per day requested/received
  • X=Y=Z

  19. Interview Results
  Question: What was missing?
  • BPMP deviation metrics
  • How well we are doing as a deviation from “X”
  • A metric that shows whether we missed or beat the CNO date, and by how many days
  • Cost of a day of sub time
  • Lots of data/metrics, but none that drive behavior
  • A metric that indicates the backlog of maintenance
  • Amount of work deferred from avails

  20. SUBMARINE FORCE DIRECTION FOR 2007
  R 192306Z DEC 06 COMSUBFOR NORFOLK VA SUBMARINE FORCE DIRECTION FOR 2007
  • Alignment/Enterprise Management
  • CFT Alignment - extend effective alignment through our Cross Functional Teams by:
    1) mapping processes and costs to develop a productivity baseline,
    2) developing major process metrics, and
    3) developing hierarchical metrics that directly support USE top tier objectives.
  • Common readiness processes
  • Common metrics - develop a predictive metric, common to the Undersea and Naval Air Enterprises, that improves productivity in ships' depot maintenance programs. Develop accurate maintenance notionals to properly predict budget requirements.
  How is the USE / M&S CFT / ST1 executing this direction?

  21. Brainstorm Discussion
  • Do the EXISTING metrics give you confidence in the decisions you make?
  • Do the EXISTING metrics drive the right behaviors?
  • Do you understand how your responsibilities & decisions affect Ao?
  • What more or different metrics do you need?
  • When do you need the metrics?
  • When do you need to make decisions based on those metrics? (XYZ)
  • How do delays or disruptions in your process map “block” affect execution cost and time?
  • Are there metrics at all appropriate “places” on the end-to-end process map?
