Report of architecture and product working group



Report of Architecture and Product Working Group

ICM Workshop

Washington, DC

July 17, 2008


Working Group Members

J. D. Baker, BAE Systems

A. Winsor Brown, USC-CSSE

Karl Brunson, Lockheed Martin

Paul Croll, CSC

Thomas Knott, OSD

Art Pyster, Stevens

Paul Russell, Aerospace

Robert Schwenk, Army ASA(ALT)

J. Bruce Walker, SAF/AQRE

Lee Zhou, Boeing



Working Group Charter

Identify and prioritize the most important issues associated with Architecture and Products (engineering artifacts) for the Incremental Commitment Model (ICM) and Competitive Prototyping (CP)

Suggest OSD initiatives and other actions to address those issues



Definition of Architecture

IEEE 1471: fundamental organization of a system embodied in its components, their relationships to each other, and to the environment, and the principles guiding its design and evolution.

Don Firesmith (from the SEI): The set of all the most important, pervasive, higher-level strategic decisions, inventions, engineering trade-offs, and assumptions (DIETAs), and their associated rationales concerning how the system meets its allocated and derived product and process requirements.


The Firesmith definition is the more useful for CP and ICM.


Focus

Because CP is conducted to reduce risk, and the ICM is a risk-driven life cycle model, we focused on how to use Architecture and Product to understand, manage, and reduce risk.

As defined by Firesmith, the architecture includes many DIETAs and their rationale, not just the risky ones.

For CP and anchor points in the ICM, we will focus on risky DIETAs; i.e., DIETAs with weak rationale which, if wrong, could have a significant negative impact on program cost, schedule, or performance.

Strong rationale is based on objective evidence. Weak rationale is based on assertion and opinion.
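The triage described above can be sketched in code. This is a hypothetical illustration, not an artifact of the working group: the `DIETA` record, the impact levels, and the `risky` filter are all illustrative assumptions about how one might separate weakly-justified, high-impact decisions from the rest.

```python
# Hypothetical sketch: triaging DIETAs by rationale strength and potential
# impact. A DIETA is risky when its rationale rests on assertion/opinion
# rather than objective evidence AND being wrong would significantly hurt
# cost, schedule, or performance. All names/thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class DIETA:
    """A decision, invention, engineering trade-off, or assumption."""
    description: str
    rationale: str
    evidence_based: bool   # True if the rationale rests on objective evidence
    impact: str            # "low", "medium", or "high" if the DIETA is wrong

def risky(dietas):
    """Return DIETAs with weak (assertion-based) rationale and significant impact."""
    return [d for d in dietas
            if not d.evidence_based and d.impact in ("medium", "high")]

decisions = [
    DIETA("Use message bus X", "Benchmarked under projected peak load", True, "high"),
    DIETA("Single sign-on via service Y", "Vendor claims it scales", False, "high"),
    DIETA("Log format Z", "Team preference", False, "low"),
]

for d in risky(decisions):
    print(d.description)   # only the weakly-justified, high-impact decision
```

Only the second decision surfaces: its rationale is a vendor assertion rather than evidence, and the impact of being wrong is high.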



System Architecting Paradigm

Three activities should happen concurrently and iteratively:

Systems and software engineers establish the most critical requirements/objectives – including those for “ilities”

Systems and software architects develop a system and software architecture that the architects believe will simultaneously support all critical requirements/objectives

Engineers evaluate the architecture for how well it really supports critical requirements/objectives, creating substantiating evidence for the architecture or identifying weaknesses in it

Today, it is common for any of these activities to be shortchanged, especially the third.



Types of Evidence

Analytic models

Scenario-based execution of prototypes

Scenario-based execution of simulations

Benchmarking

Appeal to historical analogy (we did something similar several times before)

Architecture Quality Cases (analogous to safety cases) with claims, arguments, and evidence

Process execution results, such as test results from early software builds
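An Architecture Quality Case, listed above, can be pictured as a small data structure. This is a hypothetical sketch, not the working group's notation: the class names and the `substantiated` check are illustrative assumptions about how claims, arguments, and evidence might hang together, analogous to a safety case.

```python
# Hypothetical sketch of an Architecture Quality Case: a top-level claim,
# supported by arguments, each backed by evidence items drawn from the
# evidence types listed above. Structure and names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Evidence:
    kind: str        # e.g. "analytic model", "prototype run", "benchmark"
    summary: str

@dataclass
class Argument:
    text: str
    evidence: list = field(default_factory=list)

@dataclass
class Claim:
    text: str
    arguments: list = field(default_factory=list)

    def substantiated(self) -> bool:
        # Crude strength check: the claim has at least one argument, and
        # every argument cites at least one piece of evidence.
        return bool(self.arguments) and all(a.evidence for a in self.arguments)

claim = Claim("The architecture meets its throughput objective")
claim.arguments.append(Argument(
    "Prototype sustained 2x projected peak load",
    [Evidence("prototype run", "Scenario-based load test, build 3")],
))
print(claim.substantiated())  # True
```

A claim whose arguments rest on no evidence would fail the check, mirroring the distinction above between strong (evidence-based) and weak (assertion-based) rationale.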



CP/ICM Issues and Actions (unordered)

Architectures expressed using DoDAF typically do not include all of the DIETAs in sufficient detail to support rigorous evaluation.

Action: Develop architectural representation guidance requiring DIETAs to be developed in sufficient detail to support rigorous evaluation. For example, DoDAF architectures typically don’t contain enough information to perform safety case analyses or to understand the security properties of the system.

The “ilities” are often understated in the requirements/objectives, yet are often a key source of problems later in system development. An architectural view for each of the relevant quality characteristics is required.

Action: Develop guidance requiring “ilities” to be sufficiently documented and articulating what sufficient means.

Action: Research how to present sufficient information in the views to support adequate evaluation.



Examples of Quality Characteristics

Efficiency

Completeness

Correctness

Security

Compatibility

Interoperability

Maintainability

Expandability

Testability

Portability

Hardware Independence

Software Independence


Installability

Reusability

Reliability

Error Tolerance

Availability

Usability

Understandability

Ease of Learning

Operability

Communicativeness

Survivability

Flexibility


CP/ICM Issues and Actions (unordered)

Architectures often do not state the rationale (evidence) for their DIETAs in sufficient detail to understand which ones are particularly risky.

Action: Develop guidance requiring the rationale for DIETAs to be stated in sufficient detail and articulating what sufficient means.

There is no guidance for what evidence is adequate for any given situation or how that evidence should be presented (analogous to the problem of knowing when you have tested enough). How much prototyping is “enough”? How much evidence is “enough”?

Action: Conduct research on how much prototyping and evidence is enough and then document the research results in guidance.

Action: Engage Chris Powell on his dissertation research based upon his assessment of ACAT 1D program architectures since July 2004.



CP/ICM Issues and Actions (unordered)

Government program offices are probably not staffed with enough people with the skills to request the correct evidence from the supplier and to evaluate that evidence when the supplier provides it. Government offices should not request evidence unless they are able to evaluate it.

Action: Consider forming an architecture assessment team (and other types of assessment teams) at the OSD level that would be a resource available to interested programs.

Since competing suppliers will have different architectures, the architectures will have different risk profiles and therefore require different evidence. Who decides what evidence will be provided? The government? The supplier? How will the government fairly evaluate competing prototypes when presented with different types of evidence?

Action: Investigate legal and contractual implications of specific evidence requirements.



CP/ICM Issues and Actions (unordered)

A competition should involve regular submission of evidence – not just once at the end of the competition. Can suppliers “fix” problems along the way and resubmit stronger evidence? It would seem to be in the government’s best interest to allow this, but it could be construed as “unfair” by some competitors.

Action: Investigate legal and contractual implications of requesting regular submission of evidence, and propose ways to enable it.

Creating evidence is often dependent on exercising scenarios, which are extremely difficult to generate in sufficient number and sufficient diversity to uncover weak DIETAs, especially for systems of systems (SoS).

Action: Research how to generate an adequate and diverse set of scenarios, especially for SoS, or investigate alternative approaches to developing scenarios.



ICM Issues and Actions (Unordered)

Providing evidence for an SoS at regular milestones is especially challenging because the evidence provided by the individual system elements may not be available when originally expected. Performing impact analysis across elements when something changes is also challenging.

Action: Research how to perform impact analysis across elements and how to respond to “breakage” in synchronization across elements.

As development progresses from milestone to milestone, new evidence reconfirming key DIETAs is needed. There is no guidance as to what that evidence should be and how often it should be collected.

Action: Research what evidence is required to reconfirm key DIETAs and then document the approaches in guidance.



ICM Issues and Actions (Unordered)

Program offices are inherently biased when it comes to evaluating evidence that a supplier is making sufficient progress to pass a milestone. Having independent non-advocate reviews of evidence eliminates that problem, but can be expensive and difficult to staff.

Action: Investigate the cost and feasibility of independent non-advocate reviews versus the cost of inadequate review that results from not using independent reviewers.



Value and Ease of Implementing Actions

