
Verification of Software Architectures via Reviews

Presentation Transcript


  1. Verification of Software Architectures via Reviews. Dr. Forrest Shull, FC-MD

  2. Outline • Lessons Learned about Software Reviews at NASA • Tailoring Lessons Learned for IV&V Architecture Verification • Mechanisms for Continuous Improvement

  3. Software Inspection • A long history of research & application shows that structured human inspection is one of the most cost-effective practices for achieving quality software: • “Cost savings rule”: Finding & fixing software defects is about 100x more expensive after delivery than in early lifecycle phases, for certain types of defects. • IBM: 117:1 between code and use • Toshiba: 137:1 between pre- and post-shipment • Data Analysis Center for Software: 100:1 • “Inspection effectiveness rule”: Reviews and inspections find over 50% of the defects in an artifact, regardless of the lifecycle phase in which they are applied. • 50-70% across many companies (Laitenberger) • 64% on large projects at Harris GCSD (Elliott) • 60% in PSP design/code reviews (Roy) • 50-95%, rising with increased discipline (O’Neill) • … many others
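
To make the arithmetic behind these two rules concrete, here is a minimal back-of-the-envelope sketch in Python. The 100:1 cost ratio and a ~60% detection rate come from the slide; the defect count and early-phase fix cost are hypothetical placeholders.

    def inspection_savings(total_defects, early_fix_cost,
                           cost_ratio=100, detection_rate=0.6):
        """Estimated cost avoided by finding a fraction of defects before delivery."""
        found_early = total_defects * detection_rate
        late_cost = early_fix_cost * cost_ratio   # cost to fix the same defect after delivery
        return found_early * (late_cost - early_fix_cost)

    # Hypothetical inputs: 200 defects, $100 to find & fix one in an early phase.
    print(inspection_savings(total_defects=200, early_fix_cost=100.0))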

  4. Key Lessons Learned [Process diagram: Input: work product. Resources: procedures, schedule & staff time, training. Activities: select inspection team and present background information; identify potential defects; meeting to find & record defects; fix defects; verify fixes. Output: improved work product (open issues / solutions). By-products: defects (to prevent repetition of defects) and metrics (for monitoring and controlling the process and for continuous S/W process improvement).]

  5. Key Lessons Learned • Focusing the review materials • Giving each reviewer a particular and unique perspective on the document under review • Making individual review of a document an active (rather than passive) undertaking • Articulating the quality aspects of interest • …An approach known as Perspective-Based Inspection incorporates these key points.

  6. Perspective-Based Inspection (PBI) • History • 2001-2003: SARP research funding • Refined basic approach, implemented on NASA projects and in classrooms, measured results • 2004: Tech infusion with Flight Software Branch / GSFC • Goal: Produce new Branch standards • 2004: Tech infusion with United Space Alliance / JSC • Goal: Reduce defect slippage over current increment • 2007-2009: SARP funding for improved general inspection planning • 2008: Tech infusion with L-3 Communications / IV&V • Goal: Effective approaches for inspecting system models • 2008-2009: NSC funding for inspection training and workshops • Other Results and Benefits • Improved defect detection effectiveness substantially (NASA & elsewhere) • Helped re-invigorate inspection practices • Forms the basis of industrial and academic training

  7. Recent Applications with NASA Teams • SSC: LH Barge Ladder Logic • ARC: Simulink models for smallsat • Wallops: SIPS Core Slaving Computation Process • JSC: LIDS • GSFC & MSFC: NPR 7150.2 procedural requirements for software engineering

  8. Focusing the Review • Guidelines regarding amount of technical material that can be effectively inspected • Based on hundreds of previous NASA inspections • 20 to 40 pages / hour of meeting time • Implies that choices need to be made about where to focus • Find the areas with: • High risk • Large complexity • Critical communication • Architectural “danger signs”? • Problematic development history? (e.g. author stovepipes)
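
As a rough planning aid, the 20-40 pages per meeting hour guideline can be turned into a quick check of how much material a review can realistically cover. This sketch assumes a hypothetical page count and meeting budget.

    def meeting_hours_needed(pages, pages_per_hour=30):
        """Midpoint of the 20-40 pages/hour guideline by default."""
        return pages / pages_per_hour

    pages_to_review = 240   # hypothetical document size
    available_hours = 4     # hypothetical meeting budget

    needed = meeting_hours_needed(pages_to_review)
    if needed > available_hours:
        coverage = available_hours / needed
        print(f"Only {coverage:.0%} of the material fits; focus on high-risk, "
              f"complex, or communication-critical sections.")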

  9. Focusing the Review For systems with history, analysis and visualization can help focus on appropriate system components

  10. Focusing the Review • Potential “code smells” can be automatically detected for further investigation. • ATFD = Access To Foreign Data • WMC = Weighted Methods per Class • TCC = Tight Class Cohesion • God Class detection: (class uses directly more than a few attributes of other classes: ATFD > FEW (5)) AND (functional complexity of the class is very high: WMC ≥ VERY HIGH (47)) AND (class cohesion is low: TCC < ONE THIRD (0.33))
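
Because the God Class rule above is just a conjunction of metric thresholds, it is easy to automate once a metrics tool has produced per-class values. The sketch below is a minimal illustration, with a hypothetical ClassMetrics record standing in for the tool's output.

    from dataclasses import dataclass

    FEW = 5            # ATFD threshold
    VERY_HIGH = 47     # WMC threshold
    ONE_THIRD = 1 / 3  # TCC threshold (0.33)

    @dataclass
    class ClassMetrics:          # hypothetical record; real values come from a metrics tool
        name: str
        atfd: int                # Access To Foreign Data
        wmc: int                 # Weighted Methods per Class
        tcc: float               # Tight Class Cohesion

    def is_god_class(m: ClassMetrics) -> bool:
        """Flag the class for further investigation when all three conditions hold."""
        return m.atfd > FEW and m.wmc >= VERY_HIGH and m.tcc < ONE_THIRD

    candidates = [ClassMetrics("CommandDispatcher", atfd=9, wmc=52, tcc=0.21)]  # example values
    print([m.name for m in candidates if is_god_class(m)])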

  11. Review Perspectives • Improves both team and individual results by up to 33%. Rationale: • Focus the responsibilities of each inspector • Minimize overlap among inspector responsibilities • Maximize union of defects found [Venn diagrams: defects found by three undifferentiated reviewers (Rev. 1, Rev. 2, Rev. 3) vs. three perspective-based reviewers (Designer, Tester, Use-based), showing less overlap and a larger union with perspectives]
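
The overlap/union idea behind the Venn diagram can be expressed directly with set operations. In this sketch each reviewer's findings are a set of defect IDs; the IDs are invented for illustration.

    designer  = {1, 2, 3, 8}      # defect IDs found by each perspective (invented)
    tester    = {3, 4, 8, 9}
    use_based = {5, 8, 10}

    all_found = designer | tester | use_based   # union: the team's total coverage
    overlap = (designer & tester) | (designer & use_based) | (tester & use_based)

    print(f"Unique defects found by the team: {len(all_found)}")
    print(f"Defects found by more than one reviewer: {len(overlap)}")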

  12. System Architecture Review Perspectives • Tailoring required for new domain, new issues, new artifacts [Diagram: needs, goals, and objectives elucidation leads to architecture construction. Architecture verification: Is the architecture complete, correct, consistent, and testable? Architecture validation: Are business / user needs adequately reflected in the architecture? Do the defined functionalities map to the architecture?]

  13. Review Perspectives • Possible perspectives for this domain: • Architect • Assesses architecture against requirements & existing models • Quality foci: Appropriate level of detail; completeness and clarity; consistency and correctness of models • Domain expert • Assesses whether architecture accurately and usefully captures domain • Quality foci: Correctness of architecture (independent of models); identification of stakeholders and evaluation of usability from their POV; checking flow of control & use of reusable components • Quality assurance • Assesses whether architecture and system can be validated properly • Quality foci: Handling of exception cases / unexpected system conditions; robustness of system; testability
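
One way to make these assignments operational is to keep them as review-planning data. The structure below is illustrative only: the perspective names and quality foci are taken from the slide, while the round-robin assignment helper is a hypothetical convenience.

    # Perspective names and quality foci from the slide; structure is illustrative.
    PERSPECTIVES = {
        "Architect": [
            "Appropriate level of detail",
            "Completeness and clarity",
            "Consistency and correctness of models",
        ],
        "Domain expert": [
            "Correctness of architecture (independent of models)",
            "Stakeholder identification and usability from their point of view",
            "Flow of control and use of reusable components",
        ],
        "Quality assurance": [
            "Exception cases / unexpected system conditions",
            "Robustness of system",
            "Testability",
        ],
    }

    def assign_perspectives(reviewers):
        """Pair each reviewer with one perspective, round-robin."""
        names = list(PERSPECTIVES)
        return {r: names[i % len(names)] for i, r in enumerate(reviewers)}

    print(assign_perspectives(["Alice", "Bob", "Carol"]))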

  14. Articulating Defect Types • Assigning 5-7 defect types per reviewer helps them focus on the task appropriately • Sources of defect categories: • Checklists and defect categories from previous work • Experienced inspectors • What is it that the most experienced people are looking for? • Can be gathered by: • Asking them! • Noting recurring questions at inspection meetings • Defect/discrepancy report databases • What types of problems seem to be common? • What types of problems take the longest to close?
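
Mining a defect/discrepancy report database for the last two questions amounts to grouping reports by type and averaging time-to-close. The sketch below assumes hypothetical record fields (type, opened, closed) rather than any specific NASA tool's schema.

    from collections import defaultdict
    from datetime import date

    reports = [  # hypothetical defect/discrepancy records
        {"type": "interface mismatch",     "opened": date(2008, 1, 10), "closed": date(2008, 3, 2)},
        {"type": "missing error handling", "opened": date(2008, 2, 1),  "closed": date(2008, 2, 12)},
        {"type": "interface mismatch",     "opened": date(2008, 2, 20), "closed": date(2008, 5, 15)},
    ]

    counts, total_days = defaultdict(int), defaultdict(int)
    for r in reports:
        counts[r["type"]] += 1
        total_days[r["type"]] += (r["closed"] - r["opened"]).days

    # The most common and the slowest-to-close types suggest candidate defect categories.
    for t in counts:
        print(f"{t}: {counts[t]} reports, avg {total_days[t] / counts[t]:.0f} days to close")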

  15. Articulating Defect Types • Review has proven effective at finding architectural defects of the following types: • Ambiguity • Formatting / notation / identifier problems • Incomplete architecture • Critical functional definition or detail missing • Missing analysis of possible states / allowable parameters / other info for coders and testers • Extraneous / superfluous components • Problem with supporting or reference documents • Reference to outdated supporting document • Reference to wrong supporting document • Incomplete list of reference documents

  16. Articulating Defect Types • Review has proven effective… (cont.) • Incorrect architecture • Conflicting elements • Incorrect as per domain knowledge or parent requirements • Unsuitability to the current problem • Performance could be improved • Problems with error handling • Error conditions not listed or referenced • Missing error handling requirements • Inappropriate functions may be allowed during safe mode • Too many / unnecessary error checks are included, which will affect performance • Interface problems • Interfaces not specified • Interfaces do not match

  17. Active Review • Reading through a document one page at a time is not the most effective approach. • Improved results from using scenarios to guide reviewers • Should be appropriate to the perspective • Should be appropriate to reviewer’s expertise, e.g. • Creating models relevant to the analysis • Ordering components by complexity, importance • Reviewers look for relevant defect types as they follow the scenario

  18. Active Review • Some examples: • Identify use cases, and trace control flow through the architecture from the start conditions onward • Identify scenarios based on business drivers, analyze how the architecture will handle each (SEI’s ATAM method) • Identify potential failure modes, analyze where the functionality resides for handling each • Conduct a separate domain analysis, map the results into the architecture
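
As an illustration of the third scenario (failure modes), the sketch below checks that every candidate failure mode maps to some architectural component that handles it. Both the architecture table and the failure-mode list are made-up examples.

    # Which component handles each failure mode? (both tables are invented examples)
    architecture = {
        "FaultManager": ["sensor dropout", "watchdog timeout"],
        "CommDriver":   ["link loss"],
    }
    failure_modes = ["sensor dropout", "link loss", "low battery"]

    handler_of = {mode: comp for comp, modes in architecture.items() for mode in modes}
    for mode in failure_modes:
        owner = handler_of.get(mode)
        print(f"{mode}: " + (f"handled by {owner}" if owner else "no handler found -> potential defect"))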

  19. Continuous Improvement • Review materials as “Experience Base” • Defect taxonomy, perspectives, scenarios represent best knowledge about how to effectively find defects • Can be improved over time as we learn from outcomes • Example: based on collected data, should Defect Type X be better defined or removed from the checklist? Does Defect Type Y seem to be on-target and effective?

  20. Continuous Improvement • Using metrics to focus defect types: analysis of past defect history and analysis of found defects both feed the materials for focusing the review • Analysis of defect disposition: issues detected vs. • Closed • Withdrawn / not an issue
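
A disposition analysis like the one above reduces to counting how detected issues were resolved. The sketch below uses an invented list of dispositions to show the closed vs. withdrawn split and the resulting false-positive rate.

    from collections import Counter

    dispositions = ["closed", "closed", "withdrawn", "closed", "withdrawn"]  # invented data

    tally = Counter(dispositions)
    detected = len(dispositions)
    print(f"Detected: {detected}, closed: {tally['closed']}, withdrawn: {tally['withdrawn']}")
    print(f"Withdrawn / not-an-issue rate: {tally['withdrawn'] / detected:.0%}")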

  21. Continuous Improvement • Use measurement results to further optimize, e.g., • Consistently missing a certain type of defect? • Add a new defect type and associated questions to the relevant perspectives • Missing a whole set of defects? • Consider whether a new perspective should be added. • Do we never find defects of type x? Are there perspectives that don’t seem to add much beyond what the other perspectives find? • Consider deleting the relevant defect type or perspective. • Do we consistently face a lot of problems of type x? • Add more reviewers using the relevant perspectives
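
These tuning rules can be approximated with simple set comparisons over recent inspection data. In the sketch below, the per-perspective findings and the list of escaped defect types are invented for illustration.

    found = {  # defect types each perspective caught across recent inspections (invented)
        "Architect":         {"ambiguity", "incomplete architecture"},
        "Domain expert":     {"incorrect architecture", "ambiguity"},
        "Quality assurance": {"ambiguity"},
    }
    escaped = {"interface mismatch", "missing error handling"}  # slipped past inspection

    all_caught = set().union(*found.values())
    print("Consider new defect types / questions for:", escaped - all_caught)

    for p, types in found.items():
        others = set().union(*(t for q, t in found.items() if q != p))
        if not (types - others):
            print(f"{p} adds nothing unique; consider revising or replacing this perspective.")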

  22. Summary • We have many lessons learned about the effectiveness of software reviews for verification at NASA • Architecture reviews are configurable based on: • Specific quality goals of the mission • Specific process in which architecture was produced / will be used • Review materials also permit several “hooks” for creating experience bases on practical and effective techniques for verification

  23. Contact information Forrest Shull fshull@fc-md.umd.edu 301-403-8970
