
Systems Engineering = Best Practices


Presentation Transcript


  1. Systems Engineering = Best Practices. Nicholas M. Torelli, Deputy Director, Human Capital and Specialty Engineering, Systems and Software Engineering (SSE), Office of the Deputy Under Secretary of Defense for Acquisition and Technology. Presented to the 2009 Systems Engineering Summit: Harnessing Best Practices, sponsored by the Huntsville Regional Chapter of INCOSE, US Army, RDECOM, AMRDEC, DAU, and NASA. March 3, 2009

  2. DoD Vision for Systems Engineering: Systems engineering principles and disciplines are fully accepted and assimilated into the DoD acquisition workforce, positioning the DoD for acquisition excellence and leading to a stronger national defense.

  3. SSE’s 2009 Priorities • Acquisition work force development • Renewed focus on early application of systems engineering to affect affordability and total ownership cost • Increased level of visibility for our role in developmental test and evaluation • Systems of Systems engineering tools and techniques • Integration of program protection activities into acquisition oversight to address cyber threat • Measuring results of our efforts

  4. DoD SE Best Practice Continuum: Policy & Guidance • Developing the Workforce & Advancing SE Practice • Program Support & Assessment • Teamwork & Collaboration

  5. Overview of Acquisition Policy Changes* • Mandatory Materiel Development Decision (MDD) • Mandatory competing prototypes before MS B • Mandatory PDR and a report to the MDA ("the sliding PDR") [PDR Report to MDA if before MS B; formal PDR Assessment by MDA if after MS B] • Configuration Steering Boards at Component level to review all requirements changes • Renewed emphasis on manufacturing during system development: re-titles the SDD phase to EMD with two sub-phases (Integrated System Design, and System Capability and Manufacturing Process Demonstration) and establishes consideration of manufacturing maturity at key decision points • Mandatory system-level CDR with an initial product baseline, followed by a Post-CDR Report to the MDA • Post-CDR Assessment by the MDA between EMD sub-phases. [Chart: DoDI 5000.02 life-cycle framework: Strategic Guidance, Joint Concepts, CBA, and the JCIDS documents (ICD, CDD, CPD); MDD, Materiel Solution Analysis, MS A, Technology Development, MS B (with PDR), Engineering and Manufacturing Development (with CDR), MS C, Production and Deployment, Full Rate Production Decision Review, and O&S.] * DoDI 5000.02, 8 December 2008

  6. Acquisition Policy Opportunities for SE • Early SE engagement with programs • Program Support Reviews (PSRs) Pre-MS A/B/C • Risk Reduction activities (e.g., Technical Risk assessment in AoAs, Competitive Prototyping) • SE Technical Reviews - Informed Trades for Feasible Solutions • Developmental Test and Evaluation • Integrated DT/OT • Updated T&E Strategy at MS A

  7. New Systems Engineering Enclosure • Codifies several previous SE policy memoranda • Codifies a number of SE-related policies and statutes since 2003: • Environmental Safety and Occupational Health • Modular Open Systems Approach • Data Management and Technical Data Rights • Item Unique Identification • Spectrum Supportability • Corrosion Prevention and Control • Introduces new policy on Configuration Management

  8. DoD Guidance • Defense Acquisition • Systems Engineering • Developmental Test and Evaluation (DTE) • Modeling and Simulation (M&S) • Safety • System Assurance http://www.acq.osd.mil/sse/pg/guidance.html

  9. DoD SE Best Practice Continuum: Policy & Guidance • Developing the Workforce & Advancing SE Practice • Program Support & Assessment • Teamwork & Collaboration

  10. Recommendations feeding the OSD Systems Engineering Strategy. NRC Study Recommendations: • Milestones A/B critical • Correct SE staffing • Component development planning • Pre-MS A analysis. DSB DT&E Report & 231 Report Recommendations: • Integrated DT/OT • Reliability improvement • Early SE • Access to relevant data for evaluation & decision making. Systemic Analysis Recommendations*: • Achievable acquisition strategy • Enhanced gate review process • Enhanced staff capabilities. Resulting strategy elements: OSD policy & guidance, DT&E revitalization, reliability, workforce development, and enhanced SE pre-MS B. *Based on 3,700 program assessment findings from 40 Program Support Reviews

  11. Workforce Development. The Defense Acquisition Community: 126,033 Government and Military Certified Professionals; 500,000+ Defense Industry Personnel. SSE is the Functional Leader for the SE, T&E, and PQM workforce.

  12. SE Human Capital Strategy • FY08 NDAA Section 852: DoD Acquisition Workforce Development Fund, >$300M per year across DoD • SE, PQM, and T&E initiatives to recruit, train, and retain the workforce • DoD Human Capital Initiative: published annex for the SPRDE, PQM, and T&E career fields • SE "core competency" assessment effort; completion expected Summer 2009 • "Program Systems Engineer" career path • Partnership with the INCOSE SE Certification Program: "CSEP-Acq" aligned with the Defense Acquisition Guidebook; equivalency granted for DAU courses SYS101 and SYS202 • Expanding potential for industry "certifications": what the future could hold

  13. Human Capital Initiatives (Defense Acquisition Workforce Development Fund [1]). [1] Based on NDAA Section 852, Defense Acquisition Workforce Development Act

  14. Notional DoD Systems Engineering Population. [Chart: workforce size by age band (25-35, 35-45, 45-55, >55).]

  15. Notional DoD Systems Engineering Workforce Strategy. [Chart: workforce size by age band (25-35, 35-45, 45-55, >55), annotated with strategy elements.] Recruit: interns and journeymen at the younger age bands; Highly Qualified Experts at the senior end. Train: mentor. Retain: rotational training assignments, education with industry, qualification cards, incentives, hazardous duty pay for PSEs??

  16. NDIA Systems Engineering Division E&T Committee. Mission: Strengthen Systems Engineering capabilities through education, training, and experience across government, industry, and academia. 2009 Task: To address the issue of "self-proclaimed" systems engineers, the E&T Committee is attempting to determine the core essential set of competencies that distinguish systems engineers from domain engineers. Contact (Government Committee Chair): Dr. Don Gelosh, CSEP-Acq, ODUSD (A&T) SSE / HC, 703-695-0472, donald.gelosh@osd.mil

  17. SE Competency Model: Composite Competency Map for the Systems Engineer. [Diagram: competence is decomposed into units of competence (e.g., SE Technical, SE Technical Management, Systems Thinking, Analytical Problem Solving, Professional / Interpersonal), each consisting of elements with a competence level per element: for SE Technical, elements such as Stakeholder Requirements Analysis, Requirements Analysis, Architecture Design, and Implementation; for SE Technical Management, elements such as Technical Planning, Decision Management, Risk Management, and Configuration Management. The map is viewed across the life cycle (Materiel Solution Analysis, Technology Development, EMD, Production & Deployment, Operations & Support) and establishes minimum essential SE criteria. Factors that affect the SE competency profile include domain knowledge, time, growth in competence level, and the essence of systems (behavior, structure, goal, method of problem solving).]
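As a concrete (and purely illustrative) reading of the structure this slide depicts, the sketch below models units of competence, elements, and proficiency levels and computes competency gaps against an assessed profile. The class names, level scale, and example data are assumptions for illustration, not the official DoD or INCOSE competency model.

```python
# Minimal sketch of a competency-map data structure (illustrative assumption;
# class names, levels, and example data are not the official DoD/INCOSE model).
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List


class Level(Enum):
    AWARENESS = 1
    SUPERVISED_PRACTITIONER = 2
    PRACTITIONER = 3
    EXPERT = 4


@dataclass
class Element:
    name: str              # e.g., "Requirements Analysis"
    required_level: Level  # minimum essential criterion for the role


@dataclass
class UnitOfCompetence:
    name: str              # e.g., "SE Technical"
    elements: List[Element] = field(default_factory=list)


@dataclass
class CompetencyProfile:
    role: str              # e.g., "Program Systems Engineer"
    units: List[UnitOfCompetence] = field(default_factory=list)

    def gaps(self, assessed: Dict[str, Level]) -> List[str]:
        """Names of elements where the assessed level falls below the requirement."""
        return [
            e.name
            for u in self.units
            for e in u.elements
            if assessed.get(e.name, Level.AWARENESS).value < e.required_level.value
        ]


# Hypothetical usage: two SE Technical elements, one assessed above the requirement.
profile = CompetencyProfile(
    role="Program Systems Engineer",
    units=[
        UnitOfCompetence(
            "SE Technical",
            [
                Element("Requirements Analysis", Level.PRACTITIONER),
                Element("Architecture Design", Level.PRACTITIONER),
            ],
        )
    ],
)
print(profile.gaps({"Requirements Analysis": Level.EXPERT}))  # ['Architecture Design']
```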

  18. Research to Practice. [Diagram: the acquisition community (DoD and industry) provides lessons learned and challenges to the SERC; the SERC delivers new and improved SE methods, processes, and tools (MPTs), which feed ongoing development of policy, guidance, and education & training. SERC governance and collaborators include co-sponsors, a government PMO, tasking activities, industry, associations, and academia.]

  19. Need for SE Research • State-of-the-practice must keep up with application needs of DoD acquisition • Methods, processes, and tools to enable effective acquisition and sustainment of systems • Leveraging Modeling & Simulation for SE • Size and complexity of modern systems drive need for commensurate extensions of SE, • To System of Systems (SoS), to complex systems, to “architecting,” to net-centric sets of services • Validation & Verification challenges • SE “theory and practice” should be inclusive of, and establish linkages to, challenged sub-specialty areas • Software engineering, reliability engineering, system safety, costing, etc.

  20. Stevens-SERC Team. Stevens Institute of Technology, Lead University. University Partners: • Auburn University • Air Force Institute of Technology • Carnegie Mellon University • Fraunhofer Center at UMD • Massachusetts Institute of Technology • Missouri University of Science and Technology (S&T) • Pennsylvania State University • Southern Methodist University • Texas A&M University • Texas Tech University • University of Alabama in Huntsville • University of California at San Diego • University of Maryland • University of Massachusetts • University of Southern California • University of Virginia • Wayne State University. The DoD Systems Engineering Research Center will be responsible for systems engineering research that supports the development, integration, testing, and sustainability of complex defense systems, enterprises, and services. Stevens has MOUs with AFIT, NPS, and DAU to develop enhanced SE courseware for competency development within DoD. Further, SERC members are located in 11 states, near many DoD facilities and all DAU campuses.

  21. Initial SERC Research Tasks • “Assessing Systems Engineering Effectiveness in Major Defense Acquisition Programs (MDAPs)” • USC task lead, with support from Stevens, Fraunhofer Center, MIT, University of Alabama at Huntsville • Characterize SE effectiveness within the context of DoD acquisition and identify methods of measurement suitable for application to project execution organizations (e.g., defense contractors), program management organizations (e.g., project managers and program executive offices) and oversight organizations (e.g., OUSD(AT&L)) (OSD) • “Evaluation of Systems Engineering Methods, Processes, and Tools (MPT) on Department of Defense and Intelligence Community Programs” • Stevens task lead, with support from USC, University of Alabama at Huntsville, and Fraunhofer • Examine and recommend areas for advancing current SE methods, processes, and tools (MPTs) as they are applied across the DoD acquisition life cycle focusing on three different development environments: individual weapons systems, SoS, and network-centric systems (NSA)

  22. Ways Your Organizations Can Become Involved… • Identify SE research challenges • Provide funding to sponsor research as a Tasking Activity • Collaborate on DoD-sponsored research by identifying pilot programs to participate, providing relevant acquisition information, and identifying subject matter experts • Make use of research findings to improve systems engineering on acquisition programs

  23. DoD SE Best Practice Continuum: Policy & Guidance • Developing the Workforce & Advancing SE Practice • Program Support & Assessment • Teamwork & Collaboration

  24. Our World of Stakeholders • DoD Components and Agencies: Product Centers; Program Executive Offices; Acquisition Programs (PMs and CSEs); practicing PMs and SEs; other DAWIA career fields • Industry: first-tier contractors (e.g., Lockheed Martin, Boeing, Northrop Grumman, Raytheon); second-tier contractors; practicing SEs; SE tool vendors • Education & Training Institutions: DoD organizations (DAU, AFIT, NPS); US universities with SE programs (~60); SE short course providers • Professional and Industrial Associations: DAU Alumni Association; INCOSE; NDIA SE Division and other association divisions; IEEE Standards Committee and Systems Council; AIAA SE Committee; TechAmerica; ISO

  25. DoD Challenge: the “A”cquisition Process. [Diagram: three overlapping processes with a “Sweet Spot” at their intersection.] Requirements: ICD, CDD, CPD; future threat and op tempo. Budgeting: Planning, Programming, and Budgeting process; FYDP; POM process; appropriation and authorization of funds. Acquisition: acquisition policy, guidance, and oversight; MDAP decision authority.

  26. DoD Goal: Increase the Overlap. [Diagram: enlarge the “Sweet Spot” where the Requirements, Budgeting, and Acquisition processes intersect.]

  27. Systems & Software Engineering Organization Overview

  28. Opportunities for SSE Engagement. SSE engagement areas: • Policy & Guidance (Systems Engineering, DT&E) • Program Support (Program Support Reviews; OIPT, T&E, and SE WIPTs; AOTR; Post-CDR Review & Assessment) • Workforce Planning (competency models, certification requirements, education & training) • Emerging Concepts (Systems of Systems, SE research) • Outreach (SE Forum, engagement strategy). Stakeholders: Congress and Sec Def (statutory direction), AT&L, requirements developers, Service Acquisition Executives, PEOs, program offices, prime contractors, second-tier contractors, and the education & collaboration infrastructure (professional / industry associations such as NDIA, INCOSE, AIA, ITEA, and TechAmerica; DAU; academic institutions; SERC). Engagement mechanisms: statutory and AT&L direction; ICD, CDD, CPD; DAB, ITAR, DSAB, OIPT; PSR, SEP, TEMP, technical reviews; CSEP-Acq, research, industry-university SE workforce initiatives, and industry-government projects.

  29. DoD Human Systems Integration (HSI). Problem: HSI is not being consistently integrated within DoD acquisition, resulting in increased ownership costs and lower system effectiveness. Approach: • Assign DUSD(A&T) Executive Authority to define policy, as necessary, to implement changes • Maximize the good work that has been done (policies, plans, methods, tools, and standards) • Engage DoD stakeholders and strengthen industry affiliations, including NDIA, TechAmerica, and INCOSE • Identify gaps and apply the necessary resources to resolve gaps or barriers to successful HSI implementation. Cross-DoD and technical community collaboration.

  30. Collaboration with INCOSE July 19 – 23, 2009 • International Panel: “What Defines a Systems Engineer? Comparing and Contrasting Global Perspectives on Systems Engineering Competency” • Moderator: Dr. Don Gelosh, Human Capital Strategy and Planning, OSD Directorate of Systems and Software Engineering • Dr. Arthur Pyster, Distinguished Research Professor, School of Systems and Enterprises, Stevens Institute of Technology; Deputy Executive Director, DoD Systems Engineering Research Center; and member of the INCOSE Board of Directors • Dr. John Snoderly, Program Director, Systems Engineering, Defense Acquisition University • Samantha Brown, President-Elect of INCOSE and Systems Engineering Innovation Centre (SEIC), Loughborough, UK • Mark Kupeski, Director, Complex Systems Integration, IBM Global Business Services • Dr. U. Dinesh Kumar, Professor in Quantitative Methods and Information Systems, Indian Institute of Management Bangalore • Professor Stephen Cook, Director, Defence and Systems Institute, University of South Australia

  31. INCOSE Multi-Level Professional Certification, from entry level to senior level, with discipline extensions: • ASEP: Associate Systems Engineering Professional • CSEP: Certified Systems Engineering Professional • CSEP-Acq: CSEP with US DoD Acquisition extension • ESEP: Expert Systems Engineering Professional

  32. CSEP with US DoD Acquisition Extensions • Targeted toward systems engineers who support or work in a US Department of Defense acquisition environment • Same core CSEP experience, education, and knowledge requirements • Additional acquisition knowledge items tested • Available since July 2008

  33. DoD Best Practices Clearinghouse (BPCh) The DoD Acquisition Best Practices Clearinghouse (BPCh) facilitates the selection and implementation of systems engineering and software acquisition practices appropriate to the needs of individual acquisition programs. The BPCh uses an evidence-based approach, linking to existing resources that describe how to implement various best practices. https://bpch.dau.mil/

  34. We Need You! • Browse the Clearinghouse for applicable best practices • Provide feedback on practices • Submit additional evidence for an existing practice • Submit a lead or a possible best practice • Tell others about the Clearinghouse • Consider becoming an editor to help vet practices submitted in your area of expertise

  35. DoD SE Best Practice Continuum: Policy & Guidance • Developing the Workforce & Advancing SE Practice • Program Support & Assessment • Teamwork & Collaboration

  36. OSD’s Program Support Reviews… “The Source of Systemic Root Cause Analysis Data.” Rigor in process ensures integrity: a tailorable, scalable methodology, the Defense Acquisition Program Support (DAPS) Methodology v1.1, drives the Plan, Conduct, Analyze, and Report steps, supported by tools and materials (templates, criteria, questions, training materials, execution guidance, and subject matter expertise).

  37. Program Support Review Data Set (since March 2004) • PSRs/NARs completed: 57 • AOTRs completed: 13 • Nunn-McCurdy Certifications: 13 • Participation on Service-led IRTs: 3 • Technical Assessments: 13 • Reviews planned for CY08: PSRs/NARs: 10; AOTRs: 1; Nunn-McCurdy: 1. Data derived from a diverse and broad program set.

  38. OSD Systemic Analysis: Data Model. Program review findings serve two levels of management: at the tactical, program, and portfolio management level, individual findings drive program-unique solutions; at the strategic management level, the collective set of program findings is used to identify systemic issues at the root cause level and to develop corrective actions and recommendations that mitigate problems at their source across the DoD acquisition community, touching other processes (JCIDS, etc.), oversight (DAB/ITAB), execution (staffing), policy/guidance, education & training, and best practices.

  39. Deriving Systemic Issues… Starting from PSR findings on 44 programs, the “negative” findings are tagged with “Core” and “Systemic” root causes to cull the data into focused subsets; the subsets are analyzed and trended using Pareto charting and pivot tables to produce preliminary analysis results, which are then hypothesis-tested (re-analyzed and reviewed) to yield systemic issues, defined as those that occurred on 50% or more of the programs.
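As a rough illustration of the tagging, Pareto, and 50%-threshold steps on this slide, the following Python sketch counts how many programs each root-cause tag appears on and flags those that occur on half or more of the programs. The program names, tag values, and findings are hypothetical, standing in for the actual PSR data.

```python
# Illustrative sketch of the cull / Pareto / 50%-threshold step described above.
# The findings, program names, and root-cause tags are invented for the example;
# they are not actual PSR results.
from collections import defaultdict

THRESHOLD = 0.5  # a root cause is "systemic" if it occurs on >= 50% of programs

# Each record: (program, systemic root-cause tag assigned to a "negative" finding)
findings = [
    ("Program A", "Unrealistic schedule"),
    ("Program A", "Staffing shortfall"),
    ("Program B", "Unrealistic schedule"),
    ("Program B", "Immature technology"),
    ("Program C", "Unrealistic schedule"),
    ("Program C", "Staffing shortfall"),
    ("Program D", "Immature technology"),
]

programs = {program for program, _ in findings}
programs_per_tag = defaultdict(set)
for program, tag in findings:
    programs_per_tag[tag].add(program)

# Pareto ordering: tags sorted by the number of programs they affect, descending.
pareto = sorted(programs_per_tag.items(), key=lambda item: len(item[1]), reverse=True)

for tag, progs in pareto:
    share = len(progs) / len(programs)
    flag = "SYSTEMIC" if share >= THRESHOLD else ""
    print(f"{tag:22s} {len(progs)}/{len(programs)} programs ({share:.0%}) {flag}")
```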

  40. SRCA Milestones & Results • Sep 07 – Feb 08: OUSD/SSE root cause analysis yields systemic issues and preliminary recommendations • Mar 08 – Oct 08: OUSD/SSE-sponsored NDIA Task Group develops executable acquisition recommendations • Oct 08 – Dec 08: NDIA Task Group concludes the effort and proposes actions for both government and industry. Three recommendation areas: Acquisition Strategy & Planning (ASP): early and sufficient planning; Enhanced Staff Capability (ESC): people, the right skills and numbers; Decision Gate Review (DGR): clear, enforceable execution criteria.

  41. New Opportunities for Independent Reviews. What’s relevant: mandatory Milestone A for all “major weapon systems”; MS B after a system-level PDR* and a PDR Report to the MDA; EMDD with a Post-CDR* Report and MDA Assessment; PSRs and AOTRs in policy. [Chart: acquisition life cycle (Strategic Guidance, Joint Concepts, CBA, ICD, MDD, Materiel Solution Analysis, MS A, Technology Development, CDD, MS B, Engineering and Manufacturing Development and Demonstration, MS C, CPD, Production and Deployment, Full Rate Production Decision Review, O&S) annotated with potential independent technical reviews (PSRs and AOTRs) and the OTRR*.] • Program Support Reviews (PSRs): all ACAT ID and IAM programs; to inform the MDA on technical planning and management processes through risk identification and mitigation recommendations; to support OIPT program reviews and others, as requested by the MDA • Assessments of Operational Test Readiness (AOTRs): all ACAT ID and special interest programs; to inform the MDA, DOT&E, and CAE of the risk of a system failing to meet operational suitability and effectiveness goals; to support CAE determination of materiel readiness for IOT&E. * PDR: Preliminary Design Review; CDR: Critical Design Review; OTRR: Operational Test Readiness Review

  42. DoD SE Best Practice Continuum: Policy & Guidance • Developing the Workforce & Advancing SE Practice • Program Support & Assessment • Teamwork & Collaboration

  43. Always Our Focus. The Mission: Delivering Timely and Affordable Capabilities to the Warfighter. The Defense Acquisition Community: 126,033 Government and Military Certified Professionals; 500,000+ Defense Industry Personnel. For additional information: http://www.acq.osd.mil/sse

  44. DoD Guidance (1/2) • Defense Acquisition • Defense Acquisition Guidebook (DAG) • Integrated Defense Acquisition, Technology, & Logistics Life Cycle Management Framework, Version 5.3.2: December 8, 2008 • Systems Engineering • DAG Chapter 4, Systems Engineering • Systems Engineering Plan (SEP) Preparation Guide, Version 2.01, April 2008 • SEP Frequently Asked Questions (FAQs), February 8, 2008 • SE WIPT Brief: An Overview of Technical Planning and Systems Engineering Plan (SEP) Development, Version 1.0, March 20, 2008 • SE Working Integrated Product Team (WIPT) Generic Charter Template • Systems Engineering Guide for Systems of Systems, Version 1.0, August 2008 • Systems Engineering Assessment Methodology, Defense Acquisition Program Support (DAPS), Version 2.0, October 21, 2008 • Guide for Integrating Systems Engineering into DoD Acquisition Contracts, Version 1.0, December 11, 2006 • Risk Management Guide for DoD Acquisition, 6th Edition, Version 1, August 2006 • Integrated Master Plan / Integrated Master Schedule Preparation and Use Guide, Version 0.9, October 21, 2005 • Program Manager's Guide: A Modular Open Systems Approach (MOSA) to Acquisition, Version 2.0, September 2004 • Modular Open Systems Approach (MOSA) Program Assessment and Rating Tool (PART), Version 1.02 • DoD Guide for Achieving Reliability, Availability and Maintainability, August 1, 2005 • Program Reliability and Maintainability Review Template, Version 1.0, August 15, 2008 • Designing and Assessing Supportability in DOD Weapon Systems: A Guide to Increased Reliability and Reduced Logistics Footprint, October 24, 2003 • Technical Review Checklists http://www.acq.osd.mil/sse/pg/guidance.html

  45. DoD Guidance (2/2) • Developmental Test and Evaluation (DTE) • DAG Chapter 9, Integrated Test and Evaluation • Guide on Incorporating Test and Evaluation into DoD Acquisition Contracts, FINAL DRAFT FOR FINAL APPROVAL VERSION, September 2008 • Modeling and Simulation (M&S) • DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, January 15, 1998 • Recommended Practices Guide for Verification, Validation, and Accreditation (VV&A), Build 3.0, September 2006 • Guide for Modeling and Simulation for the Acquisition Workforce, Version 1.01, October 2008 • Safety • System Safety — ESOH Management Evaluation Criteria for DoD Acquisition, Version 1.1, January 2007 • Joint Systems Safety Review Guide for USSOCOM Programs, Version 1.1, October 12, 2007 • Unmanned System Safety Guide for DoD Acquisition, Version 0.92, June 27, 2007 • Joint Weapons and Laser Safety Review Guide (Draft), Version 0.92, September 4, 2007 • ESOH in Acquisition: Integrating ESOH into SE, Version 3.0, January 2008 • System Assurance • DAG Chapter 7, Acquiring Information Technology and National Security Systems • DAG Chapter 8, Intelligence, Counterintelligence, and Security Support • DoD 5200.1-M, Acquisition Systems Protection Program, March 16, 1994 • Engineering for System Assurance, Version 1.0, October 2008 http://www.acq.osd.mil/sse/pg/guidance.html

  46. DoD Best Practices Clearinghouse (BPCh) Program Goals • Useful Information: help finding, selecting, and implementing practices appropriate to the user’s situation; fill the gap between “what” and “how” • Active Knowledge Base: not just another practice list; experience data is updated, expanded, and refined, encouraging organic growth • A Single Source: for answers about practices, how to apply them, when they are good to use, lessons learned, and risks to avoid • Living Knowledge: a repository of validated practices with consistent, verifiable information, integrated with DoD communities of practice (ACC) and experts.
