
When Do You Need Systems of Systems Engineering: A Quantitative Analysis


Presentation Transcript


  1. When Do You Need Systems of Systems Engineering: A Quantitative Analysis
  Jo Ann Lane
  17 March 2009
  University of Southern California, Center for Systems and Software Engineering

  2. Overview
  • Key definitions
  • Scope of research
  • Methodology
  • Model implementation
  • Results of research
  • Conclusions and future work

  3. What is a “System of Systems”?
  • Very large systems developed by creating a framework or architecture to integrate constituent systems
  • SoS constituent systems independently developed and managed
    • New or existing systems in various stages of development/evolution
    • May include a significant number of COTS products
    • Have their own purpose
    • Can dynamically come and go from the SoS
  • SoS exhibits emergent behavior not otherwise achievable by the component systems
  • Typical domains
    • Business: enterprise-wide and cross-enterprise integration to support core business enterprise operations across functional and geographical areas
    • Military: dynamic communications infrastructure to support operations in a constantly changing, sometimes adversarial, environment
  Based on Mark Maier’s SoS definition [Maier, 1998]

  4. Types of SoS
  • Virtual [Maier, 1998]
    • Lacks a central management authority and a clear SoS purpose
    • Often ad hoc and may use a service-oriented architecture where the constituent systems are not necessarily known
  • Collaborative [Maier, 1998]
    • Constituent system engineering teams work together more or less voluntarily to fulfill agreed-upon central purposes
    • No SoSE team to guide or manage activities of constituent systems
  • Acknowledged [Dahmann, 2008]
    • Have recognized objectives, a designated manager, and resources at the SoS level (SoSE team)
    • Constituent systems maintain their independent ownership, objectives, funding, and development approaches
  • Directed [Maier, 1998]
    • SoS centrally managed by a government, corporate, or Lead System Integrator (LSI) organization and built to fulfill specific purposes
    • Constituent systems maintain the ability to operate independently, but their evolution is subordinated to the centrally managed purpose
  This research focused on identifying the “home ground” for two of these types: collaborative and acknowledged SoSs...

  5. Scope of Research
  • Research question
    • When is it cost effective to establish and use a system of systems engineering (SoSE) team to oversee and guide the evolution of a system of systems (SoS)?
  • Hypothesis
    • There exists a threshold where it is more cost effective to manage and engineer capability changes to an SoS using an SoSE team, and this threshold can be determined by modeling the SoS system complexity and desired capability interdependency characteristics.
  Focus is on software-intensive SoSs owned by the United States Department of Defense (DoD)...

  6. Statement of Topic and Contribution (continued)
  • Research contribution
    • Provides guidance to DoD leadership with respect to the management of sets of inter-related systems that are functioning as a system of systems
    • Guidance also applies to SoSs in other domains that are managed as collaborative or acknowledged SoSs
  • The model for management and engineering guidance also provides
    • A method for conducting trade-off analyses of different approaches to implementing a given SoS capability for a given SoS
    • A model that can evolve into an SoSE cost model through calibration for a given SoS or SoS domain
    • A cost model that can better model complex systems

  7. Methodology
  • Using COSYSMO, developed a process model that can compare the SoS management strategies as SoS characteristics are varied:
    • SoS size (number of constituent systems)
    • Size of SoS capability (number of equivalent nominal requirements)
    • Scope of SoS capability (number of constituent systems affected by the SoS capability)
    • Constituent system volatility (level of constituent system change being engineered at the same time as the SoS capability)
  • Process model based on data from
    • 18 large-scale DoD SoS programs
    • 16 DoD systems that participate as constituent systems in one or more SoSs
  • Analyze model outputs to determine under what conditions an SoSE team is cost effective

  8. SoSE Process Model Overview
  • Purpose
    • Estimate and compare the effort required to implement an SoS capability using two different management approaches:
      • Collaborative (no SoSE team)
      • Acknowledged (SoSE team with limited authority/control)
  • Assumptions and constraints
    • All constituent systems currently exist and have their own evolutionary paths based on system-level stakeholder needs/desires
    • Model assumes SoSE and traditional SE teams are using relatively mature processes
    • SoS capabilities are software-intensive
    • No SoS capability/requirements volatility
    • SoS internal volatility represented by constituent system volatility
    • No accommodation of schedule factors or the asynchronous nature of SoS constituent system upgrades
    • Management of SoS internal interfaces reduces complexity for systems

  9. Systems Engineering Requirements Categories
  • Requirements related to SoS capabilities
    • Acknowledged SoS: initially engineered at the SoS level by the SoSE team, with support from constituent system engineers for those systems impacted by the SoS capability, then allocated to constituent systems for further SE
    • Collaborative SoS: not engineered at the SoS level, but must be engineered fully at the constituent system level through collaborative efforts with other constituent system engineers
  • Non-SoS requirements related to constituent system stakeholder needs
    • Must be monitored by the SoSE team to identify changes that might adversely impact the SoS
    • Represent on-going volatility at the constituent system level that is occurring in parallel with SoS capability changes

  10. SoSE Model Structure
  Focus is on software-intensive SoSs owned by the US DoD, the number and volatility of constituent systems within an SoS, and the complexity of typical capability enhancements to the SoS...

  11. Overview of SoSE SDM Flow
  [Flow diagram: a system capability is converted to COSYSMO size units (an equivalent set of “sea-level” requirements); calculations based on SoS characteristics/size and the capability implementation approach then use the COSYSMO algorithm to produce the effort using an “acknowledged” SoSE team and the effort for a “collaborative” SoS.]

  12. Model Parameters by SDM Construct
  • Stocks
    • Inputs: SoS Equivalent Requirements
    • Outputs: SoSE Effort; SoS Upgrade Effort with SoSE; SoS Upgrade Effort without SoSE
  • Flows: Capability Rate; SoSE Effort Rate; SE Effort Rate with SoSE; SE Effort Rate without SoSE
  • Converter parameters
    • COSYSMO effort multipliers: COSYSMO SoSE EM; COSYSMO SE EM with SoSE; COSYSMO SE EM without SoSE; COSYSMO SE EM
    • SoS complexity factors: number of systems in SoS; number of systems affected by capability; average system rate of change
  General form of the COSYSMO equation:
    Effort (person-months) = [38.55 * EM * (size)^1.06] / 152
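  The general COSYSMO form above can be read as a one-line computation. Below is a minimal sketch in Python, assuming only the constants quoted on the slide (38.55, a 1.06 size exponent, and 152 person-hours per person-month); the function name and the example values are illustrative, not part of the original model documentation.

```python
def cosysmo_effort_pm(size_eq_reqs: float, effort_multiplier: float = 1.0) -> float:
    """Systems engineering effort in person-months from COSYSMO 'size'
    (equivalent nominal requirements) and a composite effort multiplier."""
    person_hours = 38.55 * effort_multiplier * (size_eq_reqs ** 1.06)
    return person_hours / 152.0  # 152 person-hours per person-month

# Illustrative values only: 100 equivalent requirements at a nominal (1.0) multiplier
print(round(cosysmo_effort_pm(100), 1))  # ~33.4 person-months
```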

  13. SoSE Effort Multiplier: 2.50

  14. Effort Multiplier for SoSE Monitoring of Constituent System Requirements: 0.47

  15. SE Effort Multiplier for SoS Requirements with SoSE Support: 1.06

  16. SE Effort Multiplier for SoS Requirements without SoSE Support: 1.79

  17. SE Effort Multiplier for System-Specific (Non-SoS) Requirements: 0.72
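  For later reference, the five multipliers quoted on slides 13-17 can be gathered into a single lookup structure. This is only an illustrative convenience; the key names are mine, not part of the model.

```python
# Effort multipliers quoted on slides 13-17; key names are illustrative.
EFFORT_MULTIPLIERS = {
    "sose_capability": 2.50,      # SoSE engineering of SoS capability requirements (slide 13)
    "sose_monitoring": 0.47,      # SoSE monitoring of constituent-system (non-SoS) requirements (slide 14)
    "cs_sos_with_sose": 1.06,     # constituent-system SE of SoS requirements, with SoSE support (slide 15)
    "cs_sos_without_sose": 1.79,  # constituent-system SE of SoS requirements, no SoSE support (slide 16)
    "cs_non_sos": 0.72,           # constituent-system SE of system-specific requirements (slide 17)
}
```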

  18. Effort Calculations: SoSE Effort
  SoSE Effort = 38.55 * [ (SoS_CR / SoS_Treq) * (SoS_Treq)^1.06 * EM_SoS-CR + (SoS_MR / SoS_Treq) * (SoS_Treq)^1.06 * EM_SoS-MR ] / 152
  Where:
  • Total SoSE requirements (SoS_Treq) = SoS capability requirements (SoS_CR) + SoS “monitored” requirements (SoS_MR)
  • SoS “monitored” requirements = [∑ SE non-SoS requirements being addressed in the current upgrade cycles of all SoS constituent systems] * “oversight factor”
  • “Oversight factor” = 5%, 10%, or 15% (values based on expert judgment from various CSSE affiliates and the SoS SE Guidebook team)
  Based on the COCOMO II approach for combining components with different EMs (SoS changes and constituent system oversight)
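  A hedged sketch of how this slide-18 calculation could be coded, assuming the effort multipliers from slides 13-14 and the oversight-factor range quoted above; variable and function names are illustrative, not from the dissertation.

```python
def sose_effort_pm(sos_capability_reqs: float,
                   constituent_non_sos_reqs: float,
                   oversight_factor: float = 0.10,        # slide quotes 5%, 10%, 15%
                   em_capability: float = 2.50,           # slide 13
                   em_monitoring: float = 0.47) -> float: # slide 14
    """Person-months for the SoSE team: engineering the SoS capability plus
    monitoring a fraction of constituent-system (non-SoS) requirements."""
    monitored_reqs = constituent_non_sos_reqs * oversight_factor
    total_reqs = sos_capability_reqs + monitored_reqs
    scaled_size = total_reqs ** 1.06
    hours = 38.55 * ((sos_capability_reqs / total_reqs) * scaled_size * em_capability
                     + (monitored_reqs / total_reqs) * scaled_size * em_monitoring)
    return hours / 152.0
```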

  19. Effort Calculations (continued): Single System Effort with Support from SoSE Team
  Total single system requirements with SoSE (CS_TreqSoSE) = SoS requirements allocated to the system (SoS_CSalloc) + SE requirements in the upgrade cycle (CS_nonSoS)
  Single system SE effort with SoSE team = 38.55 * [ 1.15 * (SoS_CSalloc / CS_TreqSoSE) * (CS_TreqSoSE)^1.06 * EM_CS-CRwSoSE + (CS_nonSoS / CS_TreqSoSE) * (CS_TreqSoSE)^1.06 * EM_CSnonSoS ] / 152
  Based on the COCOMO II approach for combining components with different EMs, plus a 15% “tax” to support the SoSE team in their engineering of the SoSE requirements. The 15% represents half of the system design effort in the EIA 632 tasks.

  20. Effort Calculations (continued): Single System Effort with No SoSE Team Support
  Total single system requirements without SoSE (CS_TreqwoSoSE) = SoSE capability requirements (SoS_CR) + SE non-SoS requirements (CS_nonSoS)
  Single system SE effort without SoSE team = 38.55 * [ (SoS_CR / CS_TreqwoSoSE) * (CS_TreqwoSoSE)^1.06 * EM_CS-CRnSoSE + (CS_nonSoS / CS_TreqwoSoSE) * (CS_TreqwoSoSE)^1.06 * EM_CSnonSoS ] / 152
  Based on the COCOMO II approach for combining components with different EMs (SoS changes and non-SoS changes)
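  The two constituent-system calculations on slides 19-20 can be sketched the same way, which is what makes the side-by-side comparison of the “acknowledged” and “collaborative” approaches possible. Again, names are illustrative; the 1.15 factor is the 15% SoSE-support “tax” from slide 19 and the multipliers are the slide 15-17 values.

```python
def cs_effort_with_sose_pm(sos_reqs_allocated: float, non_sos_reqs: float,
                           em_sos: float = 1.06,            # slide 15
                           em_non_sos: float = 0.72) -> float:  # slide 17
    """Person-months for one constituent system when an SoSE team pre-engineers
    the SoS capability (includes the 15% tax to support the SoSE team)."""
    total = sos_reqs_allocated + non_sos_reqs
    size = total ** 1.06
    hours = 38.55 * (1.15 * (sos_reqs_allocated / total) * size * em_sos
                     + (non_sos_reqs / total) * size * em_non_sos)
    return hours / 152.0

def cs_effort_without_sose_pm(sos_capability_reqs: float, non_sos_reqs: float,
                              em_sos: float = 1.79,            # slide 16
                              em_non_sos: float = 0.72) -> float:  # slide 17
    """Person-months for one constituent system when the SoS capability must be
    engineered collaboratively, with no SoSE team support."""
    total = sos_capability_reqs + non_sos_reqs
    size = total ** 1.06
    hours = 38.55 * ((sos_capability_reqs / total) * size * em_sos
                     + (non_sos_reqs / total) * size * em_non_sos)
    return hours / 152.0
```

  Under the acknowledged approach, the total upgrade effort would be the SoSE-team effort plus the sum of the with-SoSE system efforts over the affected systems; under the collaborative approach it is the sum of the without-SoSE system efforts over those same systems.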

  21. Range of SoS Complexity Factor Values

  22. Results of Research
  [Charts: Scenario 1 (SoS size varies); Scenario 2 (SoS size varies); Scenario 3 (SoS size varies); Scenario 4 (SoS size varies)]

  23. Results of Research (continued)
  [Charts: Scenario 5 (SoS size varies); Scenario 6 (SoS size varies); Scenario 7-a (SoS size = 10); Scenario 7-b (SoS size = 100)]

  24. Results of Research (continued)
  [Charts: Scenario 8-a (SoS size = 10); Scenario 8-b (SoS size = 100); Scenario 9 (SoS size = 10); Scenario 10 (SoS size = 5)]

  25. Results of Research (continued)
  [Charts: Scenario 11 (SoS size = 5); Scenario 12 (SoS size = 5)]

  26. Conclusions
  When is it cost effective to create and empower an SoSE team to oversee and guide the evolution of an SoS?
  There exists a threshold where it is more cost effective to manage and engineer changes to an SoS using an SoSE team, and this threshold can be determined by modeling the SoS’s interdependency and complexity characteristics.
  SoSE model parameters: SoS size; scope/size of SoS change; CS volatility; SoSE oversight

  27. Conclusions (continued)
  • An SoSE team is cost effective when
    • The SoS contains more than a “few” systems
    • SoS capability changes typically affect a “significant percentage” of constituent systems
    • SoS capability requirements are a “significant percentage” of the total requirements being addressed by constituent systems in an upgrade cycle
    • SoS oversight activities and the rate of capability modifications/changes being implemented are sufficient to keep an SoSE team engaged (i.e., little-to-no slack time)
  • An SoSE team is NOT cost effective when
    • The number of systems in the SoS is “small”
    • The constituent system volatility is high and the SoS changes are small

  28. Conclusions (continued)
  • The “oversight factor” (the amount of effort spent by the SoSE team to monitor non-SoS changes in the constituent systems) is a key factor in determining the cost effectiveness of the SoSE team
    • More work is needed to determine a more accurate “oversight factor”
    • This factor may vary across SoSs
  • There may be reasons other than cost to engage an SoSE team
    • Importance of the SoS
    • Critical SoS performance requirements requiring extensive analysis at the SoS level

  29. Future Work
  • Expand the SoSE model to
    • Include schedule factors to allow trade-offs between “faster” and “cheaper”
    • Include quality factors based on complexities and the resulting rework due to inadequate SoS engineering
    • Allow users to specify particular constituent system configurations to support capability alternative trade-offs
  • Investigate the factors in going from an Acknowledged SoS to a Directed SoS

  30. Backup Charts

  31. Traditional SE and SoSE Activities
  [Figure comparing SoSE activities (SoS SE Guidebook view, based on interviews and analysis of 18 DoD SoSs in various stages) with traditional SE activities (Defense Acquisition Guide [DoD, 2006] view), both operating within an external environment:
  • Translating capability objectives
  • Assessing (actual) performance against capability objectives
  • Developing, evolving, and maintaining the SoS design/architecture
  • Understanding systems & relationships (includes plans)
  • Monitoring & assessing changes
  • Orchestrating upgrades to the SoS
  • Addressing new requirements & solution options]

  32. Key COSOSIMO Research Findings
  • Limitations of COSYSMO for “Directed” SoSE effort estimation
    • Missing cost factors
      • Cost/schedule compatibility of the proposed SE approach
      • Level of overall risk resolution
      • Number of constituent systems and associated organizations
      • Constituent system maturity and stability
      • Constituent system readiness
    • Need to adjust for SoSE oversight of constituent system SE
    • Need the ability to assign different EMs to various parts of the SE effort

  33. Using System Dynamics Models to Explore Alternatives or Influences in the Development of Large Software-Intensive Systems
  • System dynamics modeling tools: visual modeling tools that allow one to conceptualize, simulate, and analyze models of dynamic systems and processes
    • Consist of causal loops or stock-and-flow diagrams
    • Models are executable, allowing users to explore behaviors of the model as variables representing process influences are changed
  • Examples:
    • Hybrid/plan-driven ICM [Madachy et al., 2007]
    • Intergovernmental collaboration [Cresswell et al., 2002]
    • Inter-organizational baseline alignment [Greer et al., 2005]
    • Requirements volatility [Ferreira, 2002]
    • Under-allocation of resources in early phases of a project [Black and Repenning, 2001]
    • Interactions between concurrently developed projects [Ford and Sterman, 2003]
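  This is not a reproduction of the dissertation’s iThink model; the snippet below is only a minimal sketch of the stock-and-flow idea described above (a stock accumulates a flow over simulated time through simple numerical integration), with made-up names and rates.

```python
def run_stock_and_flow(effort_rate_pm_per_month: float, months: float, dt: float = 1.0) -> float:
    """Accumulate a flow (person-months per month) into a stock over simulated time."""
    effort_stock = 0.0                     # stock: accumulated person-months
    for _ in range(int(months / dt)):
        inflow = effort_rate_pm_per_month  # flow: in a real model this could depend on other stocks
        effort_stock += inflow * dt        # simple Euler integration step
    return effort_stock

# Hypothetical example: a constant 4 person-months/month effort rate over a 12-month upgrade cycle
print(run_stock_and_flow(effort_rate_pm_per_month=4.0, months=12))  # 48.0
```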

  34. Model Validity Rationale
  • SoSE model description
    • Comparison model based on a modified version of the validated academic systems engineering cost model, COSYSMO
    • Modifications based upon key findings of the OSD SoSE case studies
  • Validation goal: show that the SoSE cost model is a valid method for conducting sensitivity analyses of two different SoS management strategies
    • Collaborative
    • Acknowledged
  • Not part of the validation goal: the estimation of actual effort associated with a specific SoS or a given set of SoSs
    • The calibration/validation of the SoSE model for this purpose is left for future work

  35. Model Validity Rationale (continued)
  • Validity argument
    • COCOMO II and academic COSYSMO are multiple regression models that have been calibrated and validated with actual data, primarily from DoD programs
    • Academic COSYSMO calibration data contains 3 SoS data points
    • Most other COSYSMO calibration data points interface to other systems, which implies that they are part of one or more SoSs
  • The SoSE model was developed using
    • Academic COSYSMO, which includes the ability to distribute effort across SE phases
    • A locally validated COSYSMO extension to adjust effort for reuse/oversight of evolving system components
    • The COCOMO II technique of using multiple effort multipliers to characterize components with different characteristics and complexities
  • SoSE model parameters
    • Based on ranges of size drivers determined through case studies and surveys
    • Use nominal cost driver values unless reasons were identified in SoSE or SE survey data to indicate otherwise
  • The result is a relative comparison of the two management approaches

  36. Model Validity Rationale (continued)
  • Validity argument (continued)
    • The prediction accuracies (PRED factors) are
      • COCOMO [Clark and Reifer, 2007]
        • PRED(30) = 75% (with no stratification of projects)
        • PRED(30) = 80% (with stratification of projects)
      • COSYSMO [Valerdi, 2005]
        • PRED(30) = 75% (with stratification of projects)
        • PRED(30) = 85% (anecdotal evidence from local calibrations)
    • The OSD SoSE case studies show that SoS systems engineers perform the same types of activities as addressed by the SE cost model, COSYSMO
    • The OSD SoSE case studies identify differences between SoSE and SE for a single system, and most of these differences are with respect to parameters in the SE cost model, COSYSMO
    • There exists a local (single organization) calibrated and validated method within COSYSMO to estimate effort for oversight of related/interfacing systems or reusable components [Wang et al., 2008]

  37. References
  Ackoff, R. (1971); “Towards a System of Systems Concepts”, Management Science, Vol. 17, No. 11, Theory Series, pp. 661-671.
  ANSI/EIA (1999); ANSI/EIA-632-1998 Processes for Engineering a System.
  Berry, B. (1964); “Cities as Systems within Systems of Cities”, The Regional Science Association Papers, Volume 13.
  Black, L. and Repenning, N. (2001); “Why Firefighting is Never Enough: Preserving High-Quality Product Development”, System Dynamics Review, Vol. 17, No. 1, pp. 33-62.
  Blanchard, B. and Fabrycky, W. (1998); Systems Engineering and Analysis, Prentice Hall.
  Boehm, B., Abts, C., Brown, A. W., Chulani, S., Clark, B., Horowitz, E., Madachy, R., Reifer, D. J. and Steece, B. (2000); Software Cost Estimation With COCOMO II, Prentice Hall.
  Boehm, B. and Lane, J. (2006); “21st Century Processes for Acquiring 21st Century Software-Intensive Systems of Systems”, CrossTalk - The Journal of Defense Software Engineering, Vol. 19, No. 5, pp. 4-9.
  Boehm, B., Valerdi, R., Lane, J., Brown, A. (2005); “COCOMO Suite Methodology and Evolution”, CrossTalk - The Journal of Defense Software Engineering, Vol. 18, No. 4, pp. 20-25, April 2005.
  Cocks, D. (2006); “How Should We Use the Term ‘System of Systems’ and Why Should We Care?”, Proceedings of the 16th Annual INCOSE International Symposium.
  Cresswell, A. et al. (2002); “Modeling Intergovernmental Collaboration: A System Dynamics Approach”, Proceedings of the 35th Annual Hawaii International Conference on System Sciences.
  Dahmann, J. and Baldwin, K. (2008); “Understanding the Current State of US Defense Systems of Systems and the Implications for Systems Engineering”, Proceedings of the IEEE Systems Conference, Montreal, Canada, 7-10 April.
  Department of Defense (DoD) (2006); Defense Acquisition Guidebook, Version 1.6, accessed at http://akss.dau.mil/dag/ on 2/2/2007.
  Department of Defense (DoD) (2008); Systems Engineering Guide for System of Systems, Version 1.0.
  Dorner, D. (1996); The Logic of Failure, Metropolitan Books.
  Ferreira, S. (2002); Measuring the Effects of Requirements Volatility on Software Development Projects, Ph.D. Dissertation, Arizona State University.
  Finley, J. (2006); “Keynote Address”, Proceedings of the 2nd Annual System of Systems Engineering Conference.
  Ford, D. and Sterman, J. (2003); “Iteration Management for Reduced Cycle Time in Concurrent Development Projects”, Concurrent Engineering Research and Application (CERA) Journal.
  Friedman, T. (2005); The World is Flat: A Brief History of the Twenty-First Century, Farrar, Straus and Giroux, New York.
  Greer, D., Black, L., Adams, R. (2005); “Improving Inter-Organizational Baseline Alignment in Large Space System Development Programs”, Proceedings of the IEEE Aerospace Conference.
  Highsmith, J. (2000); Adaptive Software Development: A Collaborative Approach to Managing Complex Systems, Dorset House Publishing.

  38. References (continued)
  INCOSE (2006); Systems Engineering Handbook, Version 3, INCOSE-TP-2003-002-03.
  isee Systems (2007); “iThink”, http://www.iseesystems.com/Softwares/Business/ithinkSoftware.aspx, accessed on 2/10/2007.
  ISO/IEC (2002); ISO/IEC 15288:2002(E) Systems Engineering - System Life Cycle Processes.
  Kreitman, K. (1996); “From ‘The Magic Gig’ to Reliable Organizations: A New Paradigm for the Control of Complex Systems”, Symposium on Complex Systems Engineering, http://cs.calstatela.edu/wiki/index.php/Symposium_on_Complex_Systems_Engineering, accessed on 1/11/2007.
  Krygiel, A. (1999); Behind the Wizard’s Curtain, CCRP Publication Series, July 1999, p. 33.
  Lane, J. and Boehm, B. (2007); Modern Tools to Support DoD Software Intensive System of Systems Cost Estimation: A DACS State-of-the-Art Report, Data and Analysis Center for Software, DACS Report Number 347336.
  Lane, J. and Valerdi, R. (2007); “Synthesizing System-of-Systems Concepts for Use in Cost Estimation”, Systems Engineering, Vol. 10, No. 4.
  Lu, S. (2003); Engineering as Collaborative Negotiation: A New Paradigm for Collaborative Engineering, http://wisdom.usc.edu/ecn/about_ECN_what_is_ECN.htm, accessed on 2/14/2007.
  Madachy, R., Boehm, B., and Lane, J. (2007); “Assessing Hybrid Incremental Processes for SISOS Development”, Software Process: Improvement and Practice, Vol. 12, Issue 5, pp. 461-473.
  Maier, M. (1998); “Architecting Principles for Systems-of-Systems”, Systems Engineering, Vol. 1, No. 4, pp. 267-284.

  39. References (continued)
  NAVSTAR Global Positioning System Joint Program Office, http://gps.losangeles.af.mil/, accessed on 12/6/2006.
  Northrop, L., et al. (2006); Ultra-Large-Scale Systems: The Software Challenge of the Future, Software Engineering Institute.
  Pinney, B. (2001); Projects, Management, and Protean Times: Engineering Enterprise in the United States, 1870-1960, Ph.D. Dissertation, Massachusetts Institute of Technology.
  Pressman, J. and Wildavsky, A. (1973); Implementation: How Great Expectations in Washington are Dashed in Oakland; Or, Why It’s Amazing that Federal Programs Work at All, This Being a Saga of the Economic Development Administration as Told by Two Sympathetic Observers Who Seek to Build Morals on a Foundation of Ruined Hopes, University of California Press.
  Rechtin, E. (1991); Systems Architecting: Creating & Building Complex Systems, Prentice Hall.
  SEI (2001); Capability Maturity Model Integration (CMMI), CMU/SEI-2002-TR-001.
  Sheard, S. (2006); “Foundations of Complexity Theory for Systems Engineering of Systems of Systems”, Proceedings of the IEEE Conference on System of Systems Engineering.
  United States Air Force (USAF) Scientific Advisory Board (SAB) (2005); Report on System-of-Systems Engineering for Air Force Capability Development, Public Release SAB-TR-05-04.
  Valerdi, R. (2005); Constructive Systems Engineering Cost Model, Ph.D. Dissertation, University of Southern California.
  Valerdi, R. and Wheaton, M. (2005); “ANSI/EIA 632 as a Standardized WBS for COSYSMO”, AIAA-2005-7373, Proceedings of the AIAA 5th Aviation, Technology, Integration, and Operations Conference, Arlington, Virginia.
  Wang, G., Valerdi, R., Ankrum, A., Millar, C., and Roedler, G. (2008); “COSYSMO Reuse Extension”, Proceedings of the 18th Annual International Symposium of INCOSE, The Netherlands.
