
System Architecture Measurement


Presentation Transcript


  1. System Architecture Measurement

  2. Continuation of NDIA Measurements Task
     • Goal of last year's task was to:
       • Identify a set of leading indicators that provide insight into technical performance at major decision points for managing programs quantitatively across their life cycle, with emphasis on the Technology Development (TD) and Engineering and Manufacturing Development (EMD) phases.
       • Build upon objective measures in common practice in industry, government, and accepted standards. Do not define new measures unless currently available measures are inadequate to address the information needs.
       • Select objective measures based on essential attributes (e.g., relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, and accuracy).
       • Measures should be commonly and readily available, with minimal additional effort needed for data collection and analysis.

  3. Architecture Measurement
     • Architecture was a high-priority area, but no measures met the criteria
     • Architectures must be complete, consistent, and correct
       • Complete – All the elements required to describe the solution are present and all the requirements/needs are addressed
       • Consistent – The artifacts that describe the architecture are internally consistent and consistent with external constraints (e.g., external interfaces)
       • Correct – The architecture satisfies all requirements within the program constraints (cost, schedule, etc.) and does so to the greatest extent among the alternatives
     • This results in a need for multiple types of measurement

  4. Architecture Measurement
     • Two types of measurement needed
       • Completeness/Maturity and Consistency
         • Are all the elements required present at the current program phase?
         • Are all requirements accounted for?
         • Does it tie together? Within an architecture level? Between levels? Between artifact types?
       • Correct = Solution Quality
         • Does it meet the stakeholder needs?
         • Does it avoid known architecture deficiencies?
         • Does it do so better than alternatives?
     • Traditionally this was determined at the milestone reviews and was a lagging indicator
     • Model-based architecting (or architecture modeling) makes the evaluation of completeness and consistency feasible as a leading indicator (a small consistency-check sketch follows this list)
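To make the consistency idea concrete, here is a minimal Python sketch, assuming hypothetical artifact exports (the interface IDs and artifact contents are invented, not drawn from any particular tool), that cross-checks the interfaces referenced on block diagrams against an interface control list:

```python
# Hypothetical cross-check between two architecture artifacts: interfaces
# referenced on block diagrams vs. entries in the interface control list.
# Artifact contents and IDs are assumptions for illustration only.

diagram_interfaces = {"IF-001", "IF-002", "IF-003", "IF-007"}
interface_list = {"IF-001", "IF-002", "IF-004"}

missing_from_list = diagram_interfaces - interface_list   # referenced but never defined
unused_in_diagrams = interface_list - diagram_interfaces  # defined but never referenced

print(f"Referenced but not defined: {sorted(missing_from_list)}")
print(f"Defined but not referenced: {sorted(unused_in_diagrams)}")
```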

  5. Quantitative Measurement
     • Two types of measures required
       • Quantitative
       • Qualitative
     • Goal is to measure whether an architecture is complete and consistent
       • Easier with model-based architecting
     • Anticipated artifacts / completed artifacts
     • Internal reports showing missing data and inconsistencies between artifacts
       • Supported by many of the architecture tools, but requires effort on the part of the program to create and customize
     • Models help visualize heuristics as well
     • Examples (a few are sketched in code below)
       • Progress chart
       • Requirements trace reports (SELI)
       • TBx closure rate and TBx counts (SELI)
       • Empty data field counts
       • Visual reviews of artifacts
       • Other reports from the modeling tool database that address consistency
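The sketch below illustrates a few of these quantitative indicators (artifact completion ratio, untraced requirements, empty data fields, and TBD/TBR counts) against an assumed, simplified model export; the field names and counts are illustrative assumptions, not output from any specific modeling tool.

```python
# Minimal sketch of quantitative completeness indicators drawn from a
# hypothetical architecture-model export. Field names and values are assumed.

requirements = [
    {"id": "R-1", "text": "Provide 28 V power", "trace": ["SYS-4"], "verification": "Test"},
    {"id": "R-2", "text": "TBD endurance",      "trace": [],        "verification": ""},
    {"id": "R-3", "text": "Mass < TBR kg",      "trace": ["SYS-9"], "verification": "Analysis"},
]

anticipated_artifacts = 40   # artifacts planned for this program phase
completed_artifacts = 28     # artifacts released in the model so far

artifact_completion = completed_artifacts / anticipated_artifacts
untraced = [r["id"] for r in requirements if not r["trace"]]
empty_fields = sum(1 for r in requirements for v in r.values() if v in ("", []))
tbx_count = sum(1 for r in requirements if "TBD" in r["text"] or "TBR" in r["text"])

print(f"Artifact completion: {artifact_completion:.0%}")
print(f"Requirements without trace: {untraced}")
print(f"Empty data fields: {empty_fields}")
print(f"Open TBD/TBR items: {tbx_count}")
```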

  6. Qualitative Measurement
     • Goal is to ensure the architecture is correct and coherent
       • Does it meet stakeholder needs within the program constraints?
       • Is it better than the alternative architectures in satisfying stakeholder needs?
     • Still somewhat subjective, but has aspects that can be measured
     • Can only be determined in comparison to the alternatives
       • TPMs and MOE/KPP satisfaction compared
     • Examples (one comparison is sketched below)
       • TPM/MOE radar charts
       • Estimate at Completion vs. TPM/MOE
       • Architecture design trade study records
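A minimal sketch of the comparative idea, assuming invented MOE names, scores, and thresholds: each candidate architecture is scored against the same measures and compared by how many it satisfies.

```python
# Sketch of comparing candidate architectures by MOE/TPM satisfaction.
# Candidate names, scores, and thresholds are illustrative assumptions.

candidates = {
    "Arch A": {"range": 0.9, "payload": 0.6, "cost": 0.7},
    "Arch B": {"range": 0.7, "payload": 0.8, "cost": 0.9},
}
thresholds = {"range": 0.8, "payload": 0.7, "cost": 0.6}

for name, scores in candidates.items():
    met = [m for m, s in scores.items() if s >= thresholds[m]]
    print(f"{name}: meets {len(met)}/{len(thresholds)} MOEs ({', '.join(met)})")
```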

  7. Internal Measurements/Heuristics
     • Additional ways to measure architecture quality
     • Heuristics – "Does it look right?"
       • Review of the model artifacts can sometimes indicate whether an architecture exhibits good or bad characteristics, such as low cohesion or high levels of coupling
     • Internal metrics (sketched below)
       • Number of internal interfaces
       • Number of requirements per architecture element can indicate an imbalance
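As an illustration, the sketch below computes two such internal metrics, interface count per element and requirements allocated per element, and flags possible coupling or imbalance; the model contents and thresholds are invented assumptions.

```python
# Sketch of internal architecture metrics: interfaces per element and
# requirements allocated per element, with simple flags for possible
# high coupling or allocation imbalance. Data and thresholds are assumed.

from collections import Counter

interfaces = [("Nav", "Comms"), ("Nav", "Power"), ("Nav", "Payload"), ("Comms", "Power")]
allocations = {"Nav": 42, "Comms": 8, "Power": 12, "Payload": 3}

interface_count = Counter()
for a, b in interfaces:
    interface_count[a] += 1
    interface_count[b] += 1

avg_reqs = sum(allocations.values()) / len(allocations)
for element, reqs in allocations.items():
    flags = []
    if interface_count[element] >= 3:
        flags.append("high coupling?")
    if reqs > 2 * avg_reqs:
        flags.append("requirement imbalance?")
    print(f"{element}: {interface_count[element]} interfaces, {reqs} reqs {flags}")
```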

  8. Example Progress Table/Chart

  9. Example Architecture "Radar" Chart / Table
     [Radar chart with axes for Attribute 1 through Attribute N]
     The "utility function" for the architecture assessment is a simple weighted sum of the assessed attribute values; repeat for each candidate architecture (see the sketch below).
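A hedged sketch of that weighted-sum utility, with invented weights and assessed attribute values for two candidate architectures:

```python
# Sketch of the weighted-sum "utility function" described above, repeated
# for each candidate architecture. Weights and values are illustrative.

weights = {"Attribute 1": 0.3, "Attribute 2": 0.2, "Attribute 3": 0.2,
           "Attribute 4": 0.2, "Attribute 5": 0.1}

candidates = {
    "Arch A": {"Attribute 1": 7, "Attribute 2": 5, "Attribute 3": 8,
               "Attribute 4": 6, "Attribute 5": 4},
    "Arch B": {"Attribute 1": 6, "Attribute 2": 8, "Attribute 3": 6,
               "Attribute 4": 7, "Attribute 5": 9},
}

for name, values in candidates.items():
    utility = sum(weights[a] * values[a] for a in weights)
    print(f"{name}: utility = {utility:.2f}")
```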

  10. Structural Heuristics
     • "The eye is a fine architect. Believe it." – Wernher von Braun, 1950
     • "A good solution somehow looks nice." – Robert Spinrad, 1991

  11. Heuristics
     [Diagram contrasting two candidate partitionings: one with high external complexity, one with low external complexity]
     Which partitioning is better? Why?
