Managing Customer Expectations Through an Integrated Approach to Risk Management, Program Performance Measures, and Trade Studies. By: Dr. Joe H. Dean & Edward L. Safford III, PE Lockheed Martin Aeronautics Company.

Presentation Transcript


  1. Managing Customer Expectations Through an Integrated Approach to Risk Management, Program Performance Measures, and Trade Studies By: Dr. Joe H. Dean & Edward L. Safford III, PE Lockheed Martin Aeronautics Company

  2. Program Performance Measures (PPMs) are Valuable Metrics for Establishing Expectations Across Both Customer and Contractor Organizations
  • PPMs Consist of Technical, Cost and Schedule Categories.
    – Technical Performance Measures (TPMs) Are Key Engineering Metrics That Summarize Performance Resulting From Decisions Regarding Detailed Design Variables.
    – Cost Performance Measures (CPMs) Cover Development, Production (Implementation) and Support Phases.
    – Schedule Performance Measures (SPMs) Are Appropriate for Development and Production (Implementation) Acquisition Phases.
  • Negotiated Objectives and Thresholds for TPMs, CPMs and SPMs Bound the Trade Space (Also Known as the Design Space).

  3. Program Performance Measures Impacted by Software Performance Measures

     Category    Measure                                                              Planned   Objective   Threshold
     Technical   T1 - Bus Frame Latency Time (msec)                                      40         20          60
     Technical   T2 - Direct Connect Latency for Analog, Synchro or Discrete Wires        4          2          18
                 (msec)
     Technical   T3 - Interface Signals (# Handled)                                    2600       3600        2400
     Cost        C1 - Development Cost (Eng Hrs)                                        28K        25K         30K
     Cost        C2 - Anomaly Cost (Eng Hrs)                                             2K        .1K          3K
     Schedule    S1 - Development Time (Months)                                          16         15          21
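
As a concrete illustration, the PPM table can be captured as plain data and screened automatically. This is a hedged sketch, not program tooling: the `PPM` class, its field names and the `higher_is_better` flag are invented for the example; the values come from the slide's table.

```python
# Illustrative sketch only: the PPM class and field names are invented
# for this example; the values come from the slide's PPM table.
from dataclasses import dataclass

@dataclass
class PPM:
    name: str
    planned: float
    objective: float
    threshold: float
    higher_is_better: bool  # signals handled improve upward; latency, cost, time downward

    def within_bounds(self) -> bool:
        """Planned performance must be no worse than the customer threshold."""
        if self.higher_is_better:
            return self.planned >= self.threshold
        return self.planned <= self.threshold

ppms = [
    PPM("T1 - Bus Frame Latency Time (msec)",      40,    20,    60, False),
    PPM("T2 - Direct Connect Latency (msec)",       4,     2,    18, False),
    PPM("T3 - Interface Signals (# Handled)",    2600,  3600,  2400, True),
    PPM("C1 - Development Cost (Eng Hrs)",      28000, 25000, 30000, False),
    PPM("C2 - Anomaly Cost (Eng Hrs)",           2000,   100,  3000, False),
    PPM("S1 - Development Time (Months)",          16,    15,    21, False),
]

# Every planned value sits inside the negotiated trade space.
print(all(p.within_bounds() for p in ppms))
```

The objective and threshold together bound the trade space the slide describes: trade studies may move a planned value anywhere between them without renegotiation.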

  4. Risk Assessments Based on a Work Breakdown Structure Provide a Comprehensive Look at Program Risks
  [WBS chart: top-level branches Air Vehicle, Support Systems and Programmatics, with lower-level elements including Airframe (Fwd/Cntr/Wing/Aft, Dorsal/Empennage), Avionics, Unique Avionics, Software OFPs, Integration Test Station, Sys. Integ. Test, SW Maint. Facility, Tech Pubs, Spares, Depot Sup. Equip., O & I Sup. Equip., Training, Eng & Other, Flight Test, DT&E, EMC EMI/RFC and Test Vehicle Deliverables.]
  Note: Shaded WBS Elements Are Software Focused.

  5. Significant Risks Within the Software Maintenance Facility WBS Element
  1 - Software Automated Test (SWAT) - Some SWAT Procedures Assume a Particular Machine State Prior to Their Execution and Therefore May Fail to Give Repeatable Results if the Procedures Are Ever Rearranged in Their Order of Execution. This Means That Some Latency and Interface Tests Cannot Be Considered Reliable.
  2 - SWAT Procedure Conversions - Early Automated Test Procedures Were Developed in a Prototype Language for Avionic Subsystems That Have Since Been Updated. These Procedures Must Be Converted to the New SWAT Language and Updated for the New or Modified Avionics. They May Not Be Converted Correctly or May Not Be Converted in Time to Support All of the Latency and Interface Tests.
  3 - New Simulation Host Computer - A Next Generation Computer Is Needed to Provide Sufficient Reserve Execution Time Within Each Avionics Frame for Growth and to Prevent Overruns. This State-of-the-Art Hardware and Operating System Combination Is Likely to Have Some Bugs That Cause Unreliable Operation, Lengthening Error Isolation and Reducing the Number of Tests That Can Be Run.
  4 - Reuse Software Incompatibilities - There Are Significant Incompatibilities Between the New Host Computer and Its Previous Generation. This Could Drive New Development for Portions of the Software Baseline and SWAT Procedures That Were Planned to Be Rehosted and Reused Without Modification, Leaving Fewer Cost and Schedule Resources Available for Testing.

  6. Assessment of Significant Risks Within the Software Maintenance Facility

                                                       Prob. of     Program Consequences
     Risk Item                                         Occurrence   Technical              Cost              Schedule
     1 - SWAT Testing Is Not Repeatable/Reliable          .3        T1=+6ms, T2=+15ms      C1=+10K, C2=+8K   S1=+3m
     2 - Conversion of SWAT Proc. Is More Difficult       .3        T3=-1600sig            C2=+18K           S1=+10m
         Than Anticipated
     3 - Host Computer Not Reliable for Extended          .7        T1=+10ms, T3=-300sig   C1=+10K, C2=+4K   (none)
         Sim. Runs
     4 - Rehosted SWAT Procedure                          .7        T1=+6ms, T3=-200sig    C1=+30K, C2=+4K   S1=+3m
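
The risk table lends itself to a simple data encoding that supports quick queries, such as which risks threaten the schedule. A minimal sketch, with the dictionary layout assumed for illustration and the probabilities and deltas taken from the table:

```python
# Illustrative encoding of the risk-assessment table; the dict layout is
# assumed for this example. Deltas are the consequence if the risk occurs:
# msec for T1/T2, signals for T3, Eng Hrs for C1/C2, months for S1.
risks = {
    "SWAT Testing Is Not Repeatable/Reliable":
        {"p": 0.3, "T1": +6, "T2": +15, "C1": +10_000, "C2": +8_000, "S1": +3},
    "Conversion of SWAT Proc. Is More Difficult Than Anticipated":
        {"p": 0.3, "T3": -1600, "C2": +18_000, "S1": +10},
    "Host Computer Not Reliable for Extended Sim. Runs":
        {"p": 0.7, "T1": +10, "T3": -300, "C1": +10_000, "C2": +4_000},
    "Rehosted SWAT Procedure":
        {"p": 0.7, "T1": +6, "T3": -200, "C1": +30_000, "C2": +4_000, "S1": +3},
}

# Which risks threaten the schedule measure S1?
schedule_risks = [name for name, r in risks.items() if "S1" in r]
print(len(schedule_risks))  # 3 of the 4 risks carry a schedule slip
```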

  7. “Expected” and Worst Case PPM Degradations are Based on Risk Assessments
  TPMs and CPMs Are Calculated Using Expected Value Procedures.
  • Planned Performance Level - Assumption: No Risks Impact This Measure.
  • “Expected” Performance Level - Assumption: Probability Multiplied by Degradation and Summed for All Risks Impacting the TPM or CPM of Interest.
  • Worst Case Performance Level - Assumption: All Risks Occur.
  (Chart Shows the Levels Ordered From the Preferred Outcome: Planned, “Expected”, Worst Case.)
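
The expected-value procedure can be sketched in a few lines. The probabilities and consequences below are taken from the risk-assessment table; the function names are invented for the example, and the schedule measure S1 is deliberately excluded because it uses the longest-pole rule instead.

```python
# Sketch of the expected-value adjustment for TPMs and CPMs (function
# names invented for illustration). Each entry: (probability, impacts).
risks = [
    (0.3, {"T1": 6, "T2": 15, "C1": 10_000, "C2": 8_000}),
    (0.3, {"T3": -1600, "C2": 18_000}),
    (0.7, {"T1": 10, "T3": -300, "C1": 10_000, "C2": 4_000}),
    (0.7, {"T1": 6, "T3": -200, "C1": 30_000, "C2": 4_000}),
]

def expected_degradation(measure: str) -> float:
    """'Expected' level: probability times degradation, summed over all risks."""
    return sum(p * impacts.get(measure, 0) for p, impacts in risks)

def worst_case_degradation(measure: str) -> float:
    """Worst case level: all risks occur, so the raw degradations add."""
    return sum(impacts.get(measure, 0) for _, impacts in risks)

# Bus frame latency T1, planned at 40 msec:
print(round(expected_degradation("T1"), 1))  # 0.3*6 + 0.7*10 + 0.7*6 = 13.0 msec
print(worst_case_degradation("T1"))          # 6 + 10 + 6 = 22 msec
```

On these numbers the worst-case T1 of 40 + 22 = 62 msec would breach the 60 msec threshold even though the expected 53 msec does not, which is the kind of insight the risk-adjusted PPMs are meant to surface.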

  8. “Expected” and Worst Case PPM Degradations are Based on Risk Assessments (Cont.)
  SPMs Are Calculated Using the “Longest Pole in the Tent” Concept.
  • Planned Performance Level - Assumption: No Risks Impact This Measure.
  • “Expected” Performance Level - Assumption: Probability Multiplied by Worst Case Schedule Slip.
  • Worst Case Performance Level - Assumption: Worst Schedule Slip Incurred From Any Risk.
  (Chart Shows the Levels Ordered From the Preferred Outcome: Planned, “Expected”, Worst Case.)
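
The longest-pole rule differs from the expected-value rule in that slips are not summed across risks. A minimal sketch using the slip data from the risk table (variable names assumed for the example):

```python
# "Longest pole in the tent" sketch for the schedule measure S1 (months);
# slip data comes from the risk-assessment table. Slips are NOT summed:
# the worst single slip drives the worst case.
schedule_risks = [
    {"name": "SWAT Testing Not Repeatable/Reliable", "p": 0.3, "slip": 3},
    {"name": "SWAT Proc. Conversion More Difficult", "p": 0.3, "slip": 10},
    {"name": "Rehosted SWAT Procedure",              "p": 0.7, "slip": 3},
]

longest_pole = max(schedule_risks, key=lambda r: r["slip"])
worst_case_slip = longest_pole["slip"]               # worst single slip, not a sum
expected_slip = longest_pole["p"] * worst_case_slip  # probability times worst slip

planned = 16  # months, planned S1
print(planned + expected_slip)    # 16 + 0.3*10 = 19.0 months
print(planned + worst_case_slip)  # 16 + 10 = 26 months
```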

  9. Representative Program Performance Measure Adjusted for Risk
  [Chart: A Representative Program Performance Measure Plotted Against Its Objective and Threshold, Showing the Planned Performance Level (No Risks Occur), the Risk Adjusted or “Likely” Performance Level, and the Worst Case Scenario (All Risks Occur), With the Preferred Outcome Toward the Objective.]
  Note: In This Example the “Likely” Performance Measure Is Above the Customer Threshold.

  10. Program Performance Measures Adjusted for Risk Provide Rapid Decision Making Insight

      Category   Measure                       Objective   Threshold
      TPMs       Bus Frame Latency Time           20ms        60ms
      TPMs       Direct Connect Latency            2ms        18ms
      TPMs       Interface Signals Handled        3600        2400
      CPMs       Development Cost                  25K         30K
      CPMs       Anomaly Cost                      .1K          3K
      SPM        Development Time                   15          21

  11. Comparison of Specific and Traditional Risk Assessments
  Statement of Risk: SWAT Testing Is Not Repeatable/Reliable. Prob. of Occurrence: .3. Risks Identified at 3rd Level WBS.
  • Specific Risk Assessment - Conveys Expectations to Customer. [Chart: Planned, “Likely” and “Worst Case” levels for T1, T2, T3, C1, C2 and S1 plotted against their Objectives and Thresholds.]
  • Traditional Risk Assessment - Alerts Customer to Severity. [Chart: Probability of Occurrence (Low to High) plotted against Consequence of Failure (Negligible to Critical), separately for the Technical, Cost and Schedule consequences.]

  12. Risk Fall Back Decisions Should be Scheduled and Statused Along with Other Program Trade Studies
  [Schedule chart, 2001-2004: Prime Arch. & Design; Prime Development, Code, & Unit Test; Prime Module Integ.; Prime Integ. Test; Release to Flight Test. Software Maintenance Facility (SWMF) track: Reconfigure Prior SWMF, New Host Computer, SWMF Development, SWMF Integ. & Test. Workstation Based Simulation / Avionics Simulation Environment (ASE) track, with a “Begin Fall Back Tasks?” decision point.]

  13. Program Performance Measures are Key Trade Study Decision Criteria

                                  Fall Back      Discontinue Primary   Pursue Both Primary   Delay Decision
      Measure                     Not Needed     & Pursue Fall Back    & Fall Back           Two Months
      Bus Frame Latency Time      40 msec        50 msec               40 msec               40 msec?
      Direct Connect Latency       4 msec         4 msec                4 msec                4 msec?
      Interface Signals Handled   2600 signals   2400 signals          2600 signals          2600 signals?
      Development Cost            28K hrs        28K hrs               35K hrs               28K hrs + per day slip
      Anomaly Cost                 2K hrs        4.8K hrs              4.8K hrs              2K hrs?
      Development Time            16 months      18 months             18 months             18 months?
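
The threshold screen implied by the decision criteria can be sketched as code. The two options below use only pairings that are well supported by the slide's values (the planned PPM values for "Fall Back Not Needed"; the fall back's 50 msec latency, 2400 signals, 4.8K-hr anomaly cost and 18-month schedule); the dictionary layout, key names and function name are assumptions for the example.

```python
# Illustrative threshold screen for trade options (layout and names
# invented for this sketch; thresholds come from the slide's PPM table).
thresholds = {
    "bus_frame_latency_msec":      (60,     "max"),
    "direct_connect_latency_msec": (18,     "max"),
    "interface_signals_handled":   (2400,   "min"),
    "development_cost_hrs":        (30_000, "max"),
    "anomaly_cost_hrs":            (3_000,  "max"),
    "development_time_months":     (21,     "max"),
}

options = {
    "Fall Back Not Needed": {
        "bus_frame_latency_msec": 40, "direct_connect_latency_msec": 4,
        "interface_signals_handled": 2600, "development_cost_hrs": 28_000,
        "anomaly_cost_hrs": 2_000, "development_time_months": 16},
    "Discontinue Primary & Pursue Fall Back": {
        "bus_frame_latency_msec": 50, "direct_connect_latency_msec": 4,
        "interface_signals_handled": 2400, "development_cost_hrs": 28_000,
        "anomaly_cost_hrs": 4_800, "development_time_months": 18},
}

def violations(values: dict) -> list:
    """Measures that breach their customer acceptability threshold."""
    bad = []
    for measure, (limit, sense) in thresholds.items():
        failed = values[measure] > limit if sense == "max" else values[measure] < limit
        if failed:
            bad.append(measure)
    return bad

for name, vals in options.items():
    print(name, "->", violations(vals))
# The fall back's 4.8K-hr anomaly cost breaches the 3K-hr threshold.
```

A screen like this only flags threshold breaches; weighting the surviving options against the objectives remains a trade-study judgment.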

  14. Managing to Customer Acceptability Thresholds for PPMs Provides Effective, High-Payoff Management Control
  • PPMs Adjusted by Risk Provide an Important Communications Bridge Between Contractor and Customer.
  • CAIV Implementation Is Improved by Including Awareness of Risk Adjusted CPMs in Trade Study Decision Making.
  • Establishing a Comparative Visual Reference for Thresholds and Objectives for Planned Achievement of PPMs Is a Useful Summary Level Decision Aid for Trade Studies.
  • Using Results of a Risk Assessment to Determine the Likely Levels of PPMs Influences the Determination of Weights for Trade Study Decision Criteria.
  • Using Worst Case Scenario Risk Data Provides a “Lower” Bound for Key PPMs by Accounting for All Currently Known Risks.
