
Advanced Risk Management and Its Applications to Systems Engineering

Advanced Risk Management and Its Applications to Systems Engineering. The Hampton Roads Area International Council on Systems Engineering Conference Proceedings, Newport News, Virginia, November 8th and 9th, 2007. Colin K. S. Barrows, Ph.D. (Presenter); Resit Unal, Ph.D.



  1. Advanced Risk Management and Its Applications to Systems Engineering
  The Hampton Roads Area International Council on Systems Engineering Conference Proceedings, Newport News, Virginia, November 8th and 9th, 2007
  Colin K. S. Barrows, Ph.D. (Presenter); Resit Unal, Ph.D.; Charles B. Keating, Ph.D.; C. Ariel Pinto, Ph.D.; Trina M. Chytka, Ph.D.
  Old Dominion University, Norfolk, Virginia; NASA Langley Research Center, Hampton, Virginia

  2. Presentation Outline __________________________________________________________________________________________________________________
  • Introduction
  • Motivations & Values: Research Problem; Research Motivations and Value
  • Methodology & Application: Research Design and Approach; Application; Computations and Results
  • Summary: Limitations, Conclusions and Remarks

  3. Introduction __________________________________________________________________________________________________________________
  Systems Engineering Domain | Engineering Management Domain
  [Figure: Life Cycle Cost Commitment — cost incurred vs. cost committed (0%–100%) over time, across the life-cycle phases Conceptual & Preliminary Design; Detailed Design & Integration; Construction or Production; and Use, Refinement & Disposal. Farr (2006)]

  4. Problem __________________________________________________________________________________________________________________ How do we effectively elucidate the extent of the ‘estimation of cost uncertainty bias’ found in high-risk environments when using expert judgment elicitation to aid the decision-making process?

  5. Motivations __________________________________________________________________________________________________________________
  • To investigate an epistemological approach that emphasizes the subjective nature of risk.
  • To obtain insight into the ‘estimation of cost uncertainty bias’ found in expert judgment elicitation of cost parameters.
  • To elucidate the under- or overestimation of the cost uncertainty bias within high-risk environments, specifically high-consequence conceptual designs.

  6. Value __________________________________________________________________________________________________________________
  Allows decision makers to:
  • Be better informed in addressing this ‘estimation of cost uncertainty’ bias in relation to the decision-making process.
  • Justify their actions based on the scientific or technical specifics of a problem.
  • Provide documented arguments in support of, or to justify use of, the expert judgment method.

  7. Approach __________________________________________________________________________________________________________________
  [Diagram: High Consequence Conceptual Engineering Environment (LV: Conceptual Design Phase) — expert judgment elicitation (cost commitment in the conceptual design phase) and the degree of the estimation of cost uncertainty: underestimation (failure to identify cost uncertainty in the conceptual design phase) vs. overestimation (exaggeration of identified cost uncertainty in the conceptual design phase).]
  Conducted in two distinct phases:
  • A structured method for the elicitation process, coupled with the respective bias-handling techniques (qualitative).
  • Data analysis: an evidence-based approach intended to assess the remaining ‘estimation of cost uncertainty bias’ (quantitative).

  8. Qualitative Process — Developed Method for the Elicitation Process, Based on Ross (2002)

  9. Approach __________________________________________________________________________________________________________________ Quantitative Process
  • Basic likelihood-of-cost-uncertainty assignments were elicited from the experts.
  • Obtained a Calibration Coefficient (CC) by:
  • Identifying the expertise (E) of each expert
  • Identifying the Confidence/Risk Profile (CRP) of each expert
  • Substituting into the Calibration Coefficient equation to obtain the CC
  • The CC was used to calibrate the elicited assessments from the experts.
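The slides name the inputs to the calibration step (E and CRP) but not the form of the CC equation itself, so the sketch below is purely illustrative: it assumes both scores are normalized to [0, 1] and simply averaged, and the function names are hypothetical, not the authors'.

```python
def calibration_coefficient(expertise: float, crp: float) -> float:
    """Hypothetical CC: the actual equation is not given on the slide,
    so this sketch averages the two normalized [0, 1] scores."""
    return (expertise + crp) / 2.0

def calibrate(elicited_likelihood: float, cc: float) -> float:
    """Discount an expert's elicited likelihood assignment by the CC."""
    return elicited_likelihood * cc

# Example: an expert with expertise 0.9 and confidence/risk profile 0.5
cc = calibration_coefficient(0.9, 0.5)   # ~0.7
calibrated = calibrate(0.8, cc)          # ~0.56
```

However the real CC is defined, the role shown here is the same: it scales each expert's raw assignments before they are combined.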

  10. Approach __________________________________________________________________________________________________________________ Quantitative Process
  • Combined the calibrated assessments using Yager’s rule of combination.
  • Obtained the belief and plausibility measures: the upper and lower limits of uncertainty (considered to be the calibrated and aggregated limits; the control for the study).
  • Found the aggregated but uncalibrated limits for each expert and mapped them against the limits of the control.
  • Monte Carlo simulation was used to model the data, allowing further analysis of the ‘estimation of cost uncertainty bias’.
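Yager’s rule itself is standard evidence theory: focal elements are intersected as in Dempster’s rule, but conflicting (empty-intersection) mass is assigned to the universal set rather than renormalized away. A minimal sketch, with made-up cost-uncertainty states for illustration:

```python
def yager_combine(m1, m2, theta):
    """Combine two basic assignments with Yager's rule: masses of
    intersecting focal elements multiply, and any mass landing on the
    empty set (conflict) is grounded on the universal set theta."""
    theta = frozenset(theta)
    combined = {}
    for b, mb in m1.items():
        for c, mc in m2.items():
            a = frozenset(b) & frozenset(c)
            key = a if a else theta  # conflict -> universal set, no renormalizing
            combined[key] = combined.get(key, 0.0) + mb * mc
    return combined

# Two experts' (illustrative) assignments over cost-uncertainty states
THETA = frozenset({"low", "high"})
m1 = {frozenset({"low"}): 0.6, THETA: 0.4}
m2 = {frozenset({"high"}): 0.5, THETA: 0.5}
m12 = yager_combine(m1, m2, THETA)  # conflict mass (0.6 * 0.5) goes to THETA
```

Keeping conflict on the universal set, instead of Dempster’s renormalization, is what makes Yager’s rule conservative when experts disagree: disagreement shows up as extra ignorance rather than inflated confidence.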

  11. Application __________________________________________________________________________________________________________________
  • Interested in cost uncertainty for a Launch Vehicle (LV) life cycle cost (LCC).
  • A questionnaire was developed using Adobe Acrobat (PDF format), which facilitated:
  • Electronic distribution, completion and return of the questionnaire.
  • Data extraction from the questionnaire by linking it directly to MS Excel spreadsheets and the simulation program for computations.

  12. Computations & Results __________________________________________________________________________________________________________________
  • The CCs were obtained for each of the experts.
  • The calibrated assignments were aggregated using Yager’s rule.

  13. Computations & Results __________________________________________________________________________________________________________________
  Three alternate representations of uncertainty regarding cost parameters:
  • A Belief measure (strongest)
  • A Plausibility measure (weakest)
  • A Basic assignment (collected evidence/likelihood of the cost parameters)

  14. Computations & Results __________________________________________________________________________________________________________________ An Example of Calculating the Belief (Bel) and Plausibility (Pl) Limits
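Slide 14’s worked example is in the figure, but the definitions behind it are standard evidence theory: Bel(A) sums the mass of all focal elements wholly contained in A (the lower limit), while Pl(A) sums the mass of all focal elements that intersect A (the upper limit). A sketch with an invented mass function:

```python
def belief(m, a):
    """Bel(A): total mass of focal elements wholly contained in A (lower limit)."""
    a = frozenset(a)
    return sum(mass for b, mass in m.items() if frozenset(b) <= a)

def plausibility(m, a):
    """Pl(A): total mass of focal elements that intersect A (upper limit)."""
    a = frozenset(a)
    return sum(mass for b, mass in m.items() if frozenset(b) & a)

# Illustrative combined basic assignment over cost-uncertainty states
m = {
    frozenset({"low"}): 0.3,
    frozenset({"low", "mid"}): 0.5,
    frozenset({"low", "mid", "high"}): 0.2,
}
# belief(m, {"low"}) -> 0.3 and plausibility(m, {"low"}) -> ~1.0,
# so [Bel, Pl] bounds the evidence that cost uncertainty is "low".
```

The gap between Bel and Pl is exactly the "limits of uncertainty" the slides plot: mass committed to broader focal elements widens the interval.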

  15. Computations & Results __________________________________________________________________________________________________________________
  • Obtained the combined solution spaces from the simulation; these are considered to be aggregated and calibrated (the control for the study).
  • The resulting ranges were fairly diminutive at a few error-state values (y-axis).
  • The wider limits of uncertainty values (x-axis) depict the error-state values where uncertainty is greater.
  Combined Belief and Plausibility Functions for LV 1
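How the study’s Monte Carlo model was parameterized is not shown on the slides; the sketch below only illustrates the general idea of sampling cost-uncertainty values between the belief (lower) and plausibility (upper) limits at one error-state value. The function name and the uniform-sampling assumption are mine, not the authors’.

```python
import random

def simulate_interval(bel, pl, n=10_000, seed=0):
    """Draw n cost-uncertainty values uniformly between the belief and
    plausibility limits and return their mean (a stand-in for whatever
    statistic the study's simulation actually produced)."""
    rng = random.Random(seed)
    draws = [rng.uniform(bel, pl) for _ in range(n)]
    return sum(draws) / n

# E.g. at one error-state value with [Bel, Pl] limits of [0.3, 0.7]:
mean_val = simulate_interval(0.3, 0.7)
```

Repeating this per error-state value yields the simulated solution spaces that the slides overlay against the control.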

  16. Computations & Results __________________________________________________________________________________________________________________ Example of Overlay Graphs of Experts’ Uncalibrated Solution Spaces vs. the Control
  • Overlay graphs of cost-parameter uncertainty solution spaces for each expert’s uncalibrated assignments, mapped against the controls.
  • In contrast to the controls, these were not diminutive at any point.
  • The wider limits of cost uncertainty values (x-axis) depict the error states (y-axis) where uncertainty is greater.

  17. Computations & Results __________________________________________________________________________________________________________________
  • Identified whether the level of cost uncertainty changed as the designs matured: the solution spaces of LV 1 and LV 2 were mapped against each other.
  • Cost uncertainty estimates (x-axis) had overlapping and interchanging limits at the various error-state values (y-axis).
  Solution Space for the Combined Functions of LV 1 and LV 2 Compared

  18. Computations & Results __________________________________________________________________________________________________________________
  • The result was similar for the combined experts’ solution spaces.
  • The ranges of the experts’ uncertainty solution spaces varied when compared.

  19. Limitations __________________________________________________________________________________________________________________
  • The limited availability of high-risk experts, owing to the uniqueness of conceptual design environments, has potential implications for the reliability and validity of the study.
  • The test application was of average size; larger or more complex applications may be more challenging.
  • There may be dissimilarities in effectiveness from the previously validated questionnaire due to modifications; however, its grounding philosophies were enforced.

  20. Summary & Remarks __________________________________________________________________________________________________________________
  This evidence-based approach concluded that:
  • There is more of the estimation-of-cost-uncertainty bias present than is being anticipated.
  • From the perspective of maturing designs, when comparing the two launch vehicles: the estimated cost-parameter uncertainty at different error-state values was interchangeably larger or smaller when compared.

  21. Summary & Remarks __________________________________________________________________________________________________________________
  Benefits of the study include:
  • Contributing to and supporting the expert judgment method as it relates to the ‘estimation of uncertainty bias’ and its relationship to the decision-making process.
  • Assisting in the justification of decision makers’ actions based on the scientific or technical specifics of a problem.
  • Providing a non-probabilistic approach to risk analysis, compared with previous methods.
  • Addressing the epistemological (emphasizing the subjective nature) approach to the performance of the system being studied.

  22. Thank You Questions?
