
DoDAF Improved Harmonization with Systems Engineering -Initial Discussion-


Presentation Transcript


  1. JCIDS SE DAS OPS PPBE CPM DoDAF Improved Harmonization with Systems Engineering -Initial Discussion- February 2012

  2. Purpose
  • Streamline and improve the alignment between DoD Architectural Descriptions (ADs) and Systems Engineering (SE), including models (views and viewpoints), model data, artifacts, documents and data items
  • There are two objectives:
  • Establish standard cross-DoD relationships between:
  • ADs and SE artifacts
  • ADs and SE documents and processes
  • Eliminate the redundancy in producing and maintaining ADs and SE artifacts and documents

  3. Initial Approach
  First objective: Establish standard cross-DoD relationships between:
  • ADs and SE artifacts
  • ADs and SE documents and processes
  Approach:
  • Survey current SE policy and guidance
  • Analyze primary SE document standards and guidance
  • Establish the common requirement types that comprise SE artifacts and their use in SE processes
  • Formulate relationships between SE artifacts and ADs:
  • Models (DoDAF)
  • Data (DM2)

  4. Common Semantics Required to Manage Requirements, Design and Test Complexity

  5. Common Lexicon Facilitates Auditable Traceability and Reduces Ambiguity
  Vision, Guidance, Capability, Activity, Information, Rules, Desired Effect, Conditions, Locations, Measures, Resources, Performers, Organizations, Person Types, Skill, Standards, Agreements, Project, Measure Types, Systems, Services
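
As an aside on how a common lexicon enables auditable traceability: once both Architectural Descriptions and SE artifacts tag their content with the same term set, trace links become typed records that can be queried mechanically. The following is a minimal, hypothetical Python sketch; the class names, term subset and example identifiers are illustrative, not part of DoDAF or the DM2.

```python
# Hypothetical sketch: a shared lexicon as an enumeration, so that every
# traceability link between an Architectural Description element and an
# SE artifact is typed with a common term. Names are illustrative only.
from dataclasses import dataclass
from enum import Enum

class LexiconTerm(Enum):
    VISION = "Vision"
    GUIDANCE = "Guidance"
    CAPABILITY = "Capability"
    ACTIVITY = "Activity"
    DESIRED_EFFECT = "Desired Effect"
    MEASURE = "Measure"
    PERFORMER = "Performer"
    # ... remaining DM2 terms elided

@dataclass(frozen=True)
class TraceLink:
    ad_element: str      # e.g., an OV-5b activity name
    se_artifact: str     # e.g., an SRD requirement identifier
    term: LexiconTerm    # shared semantic type of the linked content

links = [
    TraceLink("Conduct Resupply", "SRD-REQ-042", LexiconTerm.ACTIVITY),
    TraceLink("Sustained Sortie Rate", "SPS-KPP-003", LexiconTerm.MEASURE),
]

# Because both sides use one lexicon, an audit is a simple typed query:
activity_links = [l for l in links if l.term is LexiconTerm.ACTIVITY]
```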

  6. Primary DoD SE Requirements Documents Examined
  • USAF-2010: Systems Requirements Document (SRD)
  • Navy-2008: Systems Design Specification (SDS)
  • Army-1999: System Performance Specification (SPS)
  • OSD Contracting 1999-2000 Data Item Descriptions (DIDs), in current use per MIL-STD-961:
  • Operational Concept Description (OCD)
  • System/Subsystem Specification (SSS)
  • System/Subsystem Design Description (SSDD)
  • Software Requirements Specification (SRS)
  • Software Design Description (SDD)
  • Software Product Specification (SPS)

  7. Primary DoD SE Requirements Document Analysis
  An initial analysis of Component (Army, Navy, etc.) SE policies, directives, guidance and standards revealed:
  • A multitude of guidance documents
  • Components generally follow the contents of the DIDs prescribed by MIL-STD-961E (or the original MIL-STD-490) for primary requirements documents
  • Artifacts from primary SE requirements documents are replicated in numerous supporting documents in various forms
  • Regardless of Component, SE documents generally address common requirement types (next slide)

  8. SE Primary Requirements Documents -Common Requirement Types-

  9. Establishing Relationships -System Specifications- -Common Requirement Types-

  10. Establishing Relationships -Primary Documents-

  11. Analysis to Date: Summary of Issues
  • Redundancy and Ambiguity
  • Ambiguous relationship between Architectural Descriptions (ADs) and Systems Engineering (SE) products
  • Inadequate specificity in SE document standards (e.g., DIDs) relative to the architecture data, models and descriptions used in requirements and architecture documents (e.g., ICDs, CDDs, ISPs)
  • Governance
  • Ambiguity regarding the role of ADs in SE processes (requirements documents, MIL-STD-881C)
  • Inconsistent precedence and use of DoDAF models in SE documents
  • SE specification documents are not standardized across Components (e.g., SDS, SRD, PS, MIL-STD-961 DIDs)
  • Inconsistencies in OSD SETR guidelines and SE guidance relative to DoDAF and architecture*
  • Traceability
  • Inadequate traceability between AD and SE artifacts
  • Inadequate traceability below the document level (not consistent among Components, Programs and Projects)
  *DEFENSE ACQUISITION PROGRAM SUPPORT METHODOLOGY, Version 2.0, January 9, 2009: "4.1.3.C2: The technical system architecture descriptions should use mandated Operational View (OV), System View (SV), and Technical View (TV) products as described in the DoDAF, and should be integral to the system design. There should be System Description Documents (SDDs) and System Capability Specifications (SCSs) that address those for the system and major subsystems."
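
The traceability gaps listed above become mechanically detectable once links are recorded below the document level, i.e., between individual elements. A small hedged sketch of such an audit follows; all requirement and model-element identifiers are invented for illustration.

```python
# Hypothetical element-level traceability audit: flag SE requirements
# that have no link to any Architectural Description element. The
# identifiers are invented for illustration.
se_requirements = ["SRD-001", "SRD-002", "SSS-014", "SRS-103"]

ad_traces = {
    "SRD-001": ["OV-5b:Conduct Resupply"],
    "SSS-014": ["SV-1:Comms Node"],
}

untraced = [r for r in se_requirements if not ad_traces.get(r)]
print("Requirements lacking AD traceability:", untraced)
# Requirements lacking AD traceability: ['SRD-002', 'SRS-103']
```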

  12. Next: Systems Engineering and Architecture Harmonization and Efficiency
  [Slide shows a notional systems development "V": Capabilities Based Assessment (CBA) and Material Solutions Analysis (MSA) feed Technology Development (TD) and Engineering & Manufacturing Development (E&MD); system design proceeds down through component design, build and unit test, then back up through component, subsystem and system verification to system validation and operations & maintenance. Milestones A, B and C and the SETR reviews gate progression; DoDAF viewpoints (CV, OV, DIV1-3, SV, SvcV, StdV), JCIDS documents (ICD, preliminary/final CDD and ISP, CPD) and TEMP variants (capabilities, operational, system) are produced along the way, with each review associated with its work products (e.g., SRR with the SRD, OCD, SPS and SCS; SFR with the SSS, SDS, SRS and IRS; CDR with the SSDD, IDD, SPD and DBDD).]
  DoDAF Viewpoints:
  • All (AV) • Capabilities (CV) • Operational (OV) • Data / Information (DIV) • Systems (SV) • Services (SvcV) • Standards (StdV)
  JCIDS Documents:
  • Initial Capabilities Document (ICD) • Capability Development Document (CDD) • Capability Production Document (CPD) • Information Support Plan (ISP)
  Systems Engineering Technical Reviews:
  • System Requirements Review (SRR) • System Functional Review (SFR) • Preliminary Design Review (PDR) • Critical Design Review (CDR) • Test Readiness Review (TRR) • System Verification Review (SVR)
  Typical Systems Engineering Work Products:
  • System Requirements Document (SRD)
  • Operational Concept Description (OCD)
  • System Capability Specifications (SCSs)
  • Systems Performance Specification (SPS)
  • System Design Specification (SDS)
  • System/Subsystem Specification (SSS)
  • System/Subsystem Design Description (SSDD)
  • Software Requirements Specification (SRS)
  • Software Design Description (SDD)
  • Software Product Specification (SPS)
  • Data Base Design Document (DBDD)
  • Interface Requirements Specification (IRS)
  • Interface Control Document (ICD) / Interface Design Document (IDD)
  • Test and Evaluation Master Plan (TEMP)

  13. Way Forward -Opportunities-

  14. Back Ups

  15. SE Primary Requirements Documents -Common Requirement Types-

  16. DoD/Industry SE Guidance

  17. SETR Milestones

  18. Establishing Relationships -Interface Specifications- Update

  19. Top 5 Systems Engineering Issues in the Defense Industry
  • Key Systems Engineering practices known to be effective are not consistently applied across all phases of the program life cycle.
  • Insufficient Systems Engineering is applied early in the program life cycle, compromising the foundation for initial requirements and architecture development.
  • Requirements are not always well-managed, including the effective translation from capabilities statements into executable requirements to achieve successful acquisition programs.
  • Collaborative environments, including SE tools, are inadequate to effectively execute SE at the joint capability, System-of-Systems (SoS) and system levels.
  • The quantity and quality of Systems Engineering expertise is insufficient to meet the demands of the government and the defense industry.
  Mr. Gary Blohm, Director, US Army RDECOM Communications Electronics Research, Development and Engineering Center, 12 March 2010

  20. Summary of Issues and Improvement Opportunities -From POA&M-

  21. DoD/Industry SE Guidance
  • DAU-2001
  • OSD-2008
  • Industry INCOSE-2010
  • DISA-2011
  • Navy SysCOMs-2004
  • Army AMC-2007
  • USAF SMC-2005
  • Navy ASN RDA-2006

  22. Correlated Data?
  Problem: How to establish and maintain consistency between the numerous products.
  [Diagram residue. The slide relates products grouped as Warfighter Requirements, Program/Acquisition, Verification, Supporting GFI, Policy (MIL-STDs, Handbooks, Pamphlets) and Organizational Interfaces, with "CD" (correlated data) links among products such as the CBA, MSA, ICD, CDD, CPD, ISP, TEMP, SEP, ILSP, TDS/AS, WBS/IMP/IMS, Information Assurance Certs, V&V Plans & Reports, Exit Criteria, SETR Criteria and Checklists, DT&E/OT&E, SRD, SPS, SDS, SSS, IRS, etc.]
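
The consistency problem stated on this slide reduces, in its simplest form, to checking that a data element shared by several documents carries one agreed value. A hypothetical sketch of that check follows; the document names and values are invented, not drawn from a real program.

```python
# Hypothetical sketch of the "correlated data" idea: treat each program
# document as a bag of named data elements and flag any element whose
# value disagrees across documents.
from collections import defaultdict

documents = {
    "CDD": {"sortie_rate_per_day": 12, "crew_size": 3},
    "SRD": {"sortie_rate_per_day": 12, "crew_size": 4},
    "TEMP": {"sortie_rate_per_day": 10},
}

def find_inconsistencies(docs):
    """Return {element: {value: [documents]}} for elements with >1 value."""
    seen = defaultdict(lambda: defaultdict(list))
    for doc_name, elements in docs.items():
        for key, value in elements.items():
            seen[key][value].append(doc_name)
    return {k: dict(v) for k, v in seen.items() if len(v) > 1}

for element, values in find_inconsistencies(documents).items():
    print(f"Inconsistent '{element}': {values}")
# Inconsistent 'sortie_rate_per_day': {12: ['CDD', 'SRD'], 10: ['TEMP']}
# Inconsistent 'crew_size': {3: ['CDD'], 4: ['SRD']}
```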

  23. DoDAF 2.0 Conceptual Data Model

  24. Scoping Architectures to be "Fit-for-Purpose"
  The architect is the technical expert who translates the decision-maker's requirements into a set of data that can be used by engineers to design possible solutions. Establishing the scope of an architecture is critical to ensuring that its purpose and use are consistent with specific project goals and objectives. The term "Fit-for-Purpose" is used in DoDAF to describe an architecture (and its views) that is appropriately focused (i.e., responds to the stated goals and objectives of the process owner, is useful in the decision-making process, and responds to internal and external stakeholder concerns). Meeting intended objectives means taking those actions that either directly support customer needs or improve the overall process undergoing change. At each tier of the DoD, goals and objectives, along with any corresponding issues, should be addressed according to the established scope and purpose (e.g., Departmental, Capability, SE, and Operational), as shown in the notional diagram in the figure below.

  25. DoDAF Meta-model Groups Mapping to Viewpoints and DoD Key Processes

  26. Establishing the Scope for Architecture Development

  27. The DM2 Conceptual Data Model -Key Concepts-
  • Activity: Work, not specific to a single organization, weapon system or individual, that transforms inputs (Resources) into outputs (Resources) or changes their state.
  • Resource: Data, Information, Performers, Materiel, or Personnel Types that are produced or consumed.
  • Materiel: Equipment, apparatus or supplies that are of interest, without distinction as to their application for administrative or combat purposes.
  • Information: The state of something of interest that is materialized -- in any medium or form -- and communicated or received.
  • Data: Representation of information in a formalized manner suitable for communication, interpretation, or processing by humans or by automatic means. Examples could be whole models, packages, entities, attributes, classes, domain values, enumeration values, records, tables, rows, columns, and fields.
  • Architectural Description: Information describing an architecture, such as an OV-5b Operational Activity Model.
  • Performer: Any entity -- human, automated, or any aggregation of human and/or automated -- that performs an activity and provides a capability.
  • Organization: A specific real-world assemblage of people and other resources organized for an on-going purpose.
  • System: A functionally, physically, and/or behaviorally related group of regularly interacting or interdependent elements.
  • Person Type: A category of persons defined by the role or roles they share that are relevant to an architecture.
  • Service: A mechanism to enable access to a set of one or more capabilities, where the access is provided using a prescribed interface and is exercised consistent with constraints and policies as specified by the service description. The mechanism is a Performer. The capabilities accessed are Resources -- Information, Data, Materiel, Performers, and Geo-political Extents.
  • Capability: The ability to achieve a Desired Effect under specified (performance) standards and conditions through combinations of ways and means (activities and resources) to perform a set of activities.
  • Condition: The state of an environment or situation in which a Performer performs.
  • Desired Effect: A desired state of a Resource.
  • Measure: The magnitude of some attribute of an individual.
  • Measure Type: A category of Measures.
  • Location: A point or extent in space that may be referred to physically or logically.
  • Guidance: An authoritative statement intended to lead or steer the execution of actions.
  • Rule: A principle or condition that governs behavior; a prescribed guide for conduct or action.
  • Agreement: A consent among parties regarding the terms and conditions of activities that said parties participate in.
  • Standard: A formal agreement documenting generally accepted specifications or criteria for products, processes, procedures, policies, systems, and/or personnel.
  • Project: A temporary endeavor undertaken to create Resources or Desired Effects.
  • Vision: An end that describes the future state of the enterprise, without regard to how it is to be achieved; a mental image of what the future will or could be like.
  • Skill: The ability, coming from one's knowledge, practice, aptitude, etc., to do something well.
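
The definitions above imply a small type system: Performers perform Activities, Activities consume and produce Resources, and a Capability combines ways and means to achieve a Desired Effect under Conditions. A minimal, hypothetical Python encoding of a few of these relationships follows; it is an illustration, not the normative DM2 schema, and all instance data is invented.

```python
# Minimal, hypothetical encoding of a few DM2 key concepts and their
# relationships; not the normative DM2 physical exchange specification.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Resource:
    name: str  # Data, Information, Materiel, a Performer, or a Personnel Type

@dataclass
class Activity:
    name: str
    consumes: List[Resource] = field(default_factory=list)  # inputs
    produces: List[Resource] = field(default_factory=list)  # outputs

@dataclass
class Performer(Resource):
    performs: List[Activity] = field(default_factory=list)  # a Performer is a Resource

@dataclass
class Condition:
    description: str  # state of the environment in which a Performer performs

@dataclass
class DesiredEffect:
    description: str  # a desired state of a Resource

@dataclass
class Capability:
    name: str
    activities: List[Activity]   # the "ways"
    resources: List[Resource]    # the "means"
    conditions: List[Condition]
    effect: DesiredEffect

# Invented example wiring the concepts together:
fuel = Resource("Fuel")
sortie = Activity("Conduct Sortie", consumes=[fuel])
squadron = Performer("Fighter Squadron", performs=[sortie])
cap = Capability(
    name="Air Interdiction",
    activities=[sortie],
    resources=[squadron, fuel],
    conditions=[Condition("Day/night, adverse weather")],
    effect=DesiredEffect("Hostile ground movement halted"),
)
```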

  28. DoD Systems Engineering Technical Reviews (SETRs)
  DoD SETR Checklists
  The updated DAG will also describe and refer to these technical review risk assessment checklists. The checklists accessible in the TR CLM are being updated for DoD usage. Seven of the checklists have been updated and are now accessible on the SE COP. User comments and recommendations for checklist improvements are solicited.
  NOTE: OSD has established the policy that all of the checklists are intentionally "locked" to preclude minor question changes that could shift an evaluation score of "red" to something less problematic ("yellow" or "green"). Most of the Service Technical Authorities endorse this policy. If the checklists were unlocked, any program or evaluator could reword a question to evoke a satisfactory response, potentially eliminating oversight during a technical review.
  The checklists can be tailored to exclude questions that are not applicable to a given program by selecting "NA" for those questions; the checklist programming then ignores them when summing the totals for each category of responses.
  https://acc.dau.mil/CommunityBrowser.aspx?id=25710&lang=en-US
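
The tailoring rule in the note, i.e., "NA" responses are excluded when summing category totals, amounts to a simple filtered tally. A short sketch follows, assuming a red/yellow/green/NA response scale (the scale itself is an assumption about the checklists, not confirmed by the slide):

```python
# Hypothetical tally of SETR checklist responses, mirroring the stated
# rule that "NA" answers are ignored when summing each category.
from collections import Counter

responses = ["green", "yellow", "red", "NA", "green", "NA", "yellow"]

totals = Counter(r for r in responses if r != "NA")
answered = sum(totals.values())

print(totals)   # Counter({'green': 2, 'yellow': 2, 'red': 1})
print(f"{answered} of {len(responses)} questions scored")
```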

  29. Joint Test and Evaluation Methodology (JTEM)
  DEVELOPMENT STANDARD OPERATING PROCEDURE (SOP), Version 2, January 15, 2011 (Joint Test and Evaluation Methodology (JTEM) Joint Test and Evaluation project)
  https://acc.dau.mil/adl/en-US/403465/file/56099/MeasuresDevelopmentSOPv2_2011-01-15.pdf

  30. Measures Framework Relationship Diagram
  Capability Hypothesis: If one has a combination of means and ways under a set of standards and conditions, then one can perform tasks and achieve desired effects.
  DEVELOPMENT STANDARD OPERATING PROCEDURE (SOP), Version 2, January 15, 2011 (Joint Test and Evaluation Methodology (JTEM) Joint Test and Evaluation project)
  https://acc.dau.mil/adl/en-US/403465/file/56099/MeasuresDevelopmentSOPv2_2011-01-15.pdf
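
Read as a conditional, the capability hypothesis has a simple logical shape: means and ways, under standards and conditions, imply task performance and achievement of desired effects. The toy predicate below is one hedged reading of that conditional; the structures and the truth-functional treatment are assumptions for illustration, not logic taken from the SOP.

```python
# Toy predicate form of the capability hypothesis. The boolean fields
# stand in for the SOP's measures framework, which evaluates each of
# these aspects with measures rather than simple truth values.
from dataclasses import dataclass

@dataclass
class CapabilityCase:
    has_means: bool        # resources (materiel, performers) available
    has_ways: bool         # activities/tactics defined to employ the means
    meets_standards: bool  # performance standards satisfied
    conditions_hold: bool  # operational/environmental conditions in effect

def hypothesis(case: CapabilityCase) -> bool:
    """True when the antecedent predicts tasks performed and effects achieved."""
    return (case.has_means and case.has_ways
            and case.meets_standards and case.conditions_hold)

print(hypothesis(CapabilityCase(True, True, True, False)))  # False: conditions not met
```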

  31. DoDAF 2.0 Associations Used in Measures Framework
  DEVELOPMENT STANDARD OPERATING PROCEDURE (SOP), Version 2, January 15, 2011 (Joint Test and Evaluation Methodology (JTEM) Joint Test and Evaluation project)
  https://acc.dau.mil/adl/en-US/403465/file/56099/MeasuresDevelopmentSOPv2_2011-01-15.pdf

  32. Required Relationships for Mission-Based Assessment Mission-Based Test and Evaluation Assessment Process Guidebook, April 1, 2011 https://acc.dau.mil/adl/en-US/439602/file/56839/MBTE_Assessment_Process_Guidebook_2011-04-01.pdf

  33. Complex Task Model Mission-Based Test and Evaluation Assessment Process Guidebook, April 1, 2011 https://acc.dau.mil/adl/en-US/439602/file/56839/MBTE_Assessment_Process_Guidebook_2011-04-01.pdf

  34. System/SoS Scoring Table Mission-Based Test and Evaluation Assessment Process Guidebook, April 1, 2011 https://acc.dau.mil/adl/en-US/439602/file/56839/MBTE_Assessment_Process_Guidebook_2011-04-01.pdf

  35. Aggregate System/SoS Scoring Table Mission-Based Test and Evaluation Assessment Process Guidebook, April 1, 2011 https://acc.dau.mil/adl/en-US/439602/file/56839/MBTE_Assessment_Process_Guidebook_2011-04-01.pdf
