NASA Software IV&V Facility

Presentation Transcript
1. NASA Software IV&V Facility Metrics Data Program
Mike Chapman, chapman@ivv.nasa.gov, 304-367-8341
Pat Callis, Patrick.E.Callis@nasa.gov, 304-367-8309
Ken McGill, Kenneth.G.McGill@nasa.gov, 304-367-8300
June 5, 2003

2. The Metrics Data Program Goal
Establish a centralized repository that provides consistent, fully-involved software product data across multiple domains:
• To improve the effectiveness of software assurance
• To improve the effectiveness of software research
• To improve the ability of projects to predict software errors early in the lifecycle

3. Data, Data Everywhere: The Data Drought
There is very little fully-involved software product data…
• Error data associated with the smallest functional unit
• Requirements traced through the design to the smallest functional unit
…available to those who need it.

4. SARP PI Survey
To what degree has a lack of software defect data from actual NASA projects impacted your SARP-funded research?
• Greatly (9): Lack of software defect data from actual NASA projects has seriously hampered my research.
• Moderately (8): It would be nice to have more (or some) real project data, but I have found non-NASA data sources or other workarounds.
• Not significantly (5): My project has not been impacted because I either have all the data I need or I don't need software defect data.
Note: Response totals are in parentheses.

5. The Lack of Data from a Project Perspective
• Error data associated with the smallest functional unit
  • There is little value in this activity for the project
• Requirements traced through the design to the smallest functional unit
  • Projects only need to trace requirements to the executable (CSC level or above)
• Fully-involved data made available to the research community
  • Vulnerability of the program
  • Proprietary issues

6. The Quest for the Holy Grail
Regardless of the development model, the following is needed:
• Requirements (many-to-one issue)
• Problem reports (development, test, user, maintenance)
• Design traceability
• Code (smallest functional unit)
• Association to the smallest functional unit

7. Recruitment of Project Data
• Existing repository data
• Error Data Enhancement Program

8. Error Data Enhancement Program
The goal of the enhancement effort is to recruit projects to work with the MDP to provide fully-involved software product data – the Holy Grail. The MDP team will provide:
• Requirements analysis and traceability support
• Configuration management support (error tracking)
• Metrics generation and analysis
• Database and web support
• Machine learning analysis

9. MDP Repository
[Architecture diagram: participating projects (Project 1 … Project n) submit metrics and error data through a ClearQuest interface; the data is sanitized, stored behind a firewall, and served to researchers as queries through the MDP web site.]

10. Benefits
• Agency benefits:
  • Improved ability to predict errors early in the lifecycle
  • Improved ability to assess the quality of the software
• Research community benefits:
  • Availability of quality error and metric data
  • Availability of a support team for data needs
• Participating project benefits:
  • Additional metrics analysis
  • Additional error analysis
  • Problem tracking tool
  • Other support such as requirements traceability

  11. Where to Find More Information mdp.ivv.nasa.gov

12. Site Metrics
Web site activity for three months:
• 596 hits
• 46 accounts
• 146 logins
• 85 downloads of data

13. Special Data Requests
• Five time stamps of KC-2 data
• Sanitized activity fields of JM-1 data
• Error reports from a CSCI of KC
Note: Five papers have been written from the repository data so far.

14. Current Data Requests
• CM-1 requirements data and associated errors – JPL
• CM-1 data including metrics – Dolores Wallace, SATC
• KC semantic metrics generation – Letha Etzkorn, UA–Huntsville
• JM-1 time stamp data (five sets) – Tim Menzies, WVU

15. A Study
• v(G) – cyclomatic complexity – independent linear paths
• ev(G) – essential complexity – unstructured constructs
• e – programming effort – mental effort
• l – program level, estimated as (2/µ1)·(µ2/N2) – the level at which a program can be understood

  16. Background Slides on Metrics

17. McCabe Metrics
Halstead: programmers read code; too many “words” → errors.
McCabe: paths between “words”; twisted paths → errors.
• v(G): cyclomatic complexity ≈ # independent paths = edges − nodes + 2 (or edges − nodes + 1 when a virtual exit→entry edge makes the graph strongly connected)
• m = # one-entry/one-exit sub-graphs
• ev(G): essential complexity = v(G) − m
• iv(G): design complexity (reflects complexity of calls to other modules)
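
A minimal sketch of the v(G) computation, assuming the control-flow graph is supplied as an adjacency list (the encoding and function name here are illustrative assumptions, not MDP tooling):

```python
def cyclomatic_complexity(cfg):
    """v(G) = edges - nodes + 2 for a single-entry/single-exit CFG.

    Adding a virtual exit->entry edge (making the graph strongly
    connected) gives the edges - nodes + 1 form; both count the same
    linearly independent paths.
    """
    nodes = len(cfg)
    edges = sum(len(successors) for successors in cfg.values())
    return edges - nodes + 2

# One if/else decision over straight-line code:
cfg = {
    "entry": ["test"],
    "test": ["then", "else"],  # the branch adds the extra path
    "then": ["exit"],
    "else": ["exit"],
    "exit": [],
}
print(cyclomatic_complexity(cfg))  # 2
```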

18. Halstead Metrics
• µ = vocabulary = µ1 + µ2
• N = length = N1 + N2
• V = volume = N·log2(µ)
• V′ = potential volume = (2 + µ2′)·log2(2 + µ2′)
• L = level = V′/V
• D = difficulty = 1/L
• L′ = 1/D
• E = effort = V/L′
• T = time = E/18
Example: for 2 + 2 + 3, the operators are the two ‘+’ tokens and the operands are 2, 2, 3, so N1 = 2, N2 = 3, µ1 = 1, µ2 = 2. µ1′ ≈ 2; µ2′ = # input parameters. µ1, µ2, N1, and N2 can be found via simple tokenizers.
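
The formulas above translate directly into code. A hedged sketch, assuming the raw counts (including µ2′) are supplied by the caller:

```python
import math

# Sketch of the slide's Halstead formulas, computed from raw counts.
# mu2_star (distinct input/output parameters) must be supplied;
# nothing here parses real source code.

def halstead(mu1, mu2, N1, N2, mu2_star):
    mu = mu1 + mu2                                     # vocabulary
    N = N1 + N2                                        # length
    V = N * math.log2(mu)                              # volume
    V_star = (2 + mu2_star) * math.log2(2 + mu2_star)  # potential volume V'
    L = V_star / V                                     # program level
    D = 1 / L                                          # difficulty
    L_prime = 1 / D                                    # L' = 1/D (equals L)
    E = V / L_prime                                    # effort
    T = E / 18                                         # time in seconds (Stroud number 18)
    return {"mu": mu, "N": N, "V": V, "L": L, "D": D, "E": E, "T": T}

# The slide's example expression 2 + 2 + 3:
print(halstead(mu1=1, mu2=2, N1=2, N2=3, mu2_star=2))
```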

19. Operators
Tokens counted as operators:
• Control constructs: if (…) …, if (…) … else, switch (…), case <label>:, default:, goto <label>, do … while (…), while (…) …, for (…; …; …), return, break, continue
• Keywords and calls: sizeof, new, delete, enum, struct, union, this->, <function name>( )
• Arithmetic, logical, and bitwise: ! % & * + , - . / ; < > ? ^ | ~ << >> && || ++ --
• Comparison and assignment: = == != >= <= += -= *= /= %= &= ^= |= >>= <<=
• Grouping: ( ) [ ] { } ->
• In any other cases not covered: … ? … : …

20. Operands
• Variables and identifiers
• Constants (numeric literals and strings)
• Function names when used during calls
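
Slide 18's "simple tokenizers" remark can be illustrated with a toy scanner over the operator and operand classes of slides 19 and 20. This covers only a small subset of the operator table and is an assumption for illustration, not the MDP's actual tooling:

```python
import re

# Toy tokenizer in the spirit of slide 18's "simple tokenizers" remark.
# It handles only a subset of the slide-19 operator table and the
# slide-20 operand kinds (identifiers, numeric literals, strings).

OPERATORS = r"<<=|>>=|\+\+|--|->|&&|\|\||[+\-*/%&|^!<>=]=?|[()\[\]{},;?:.~]"
OPERANDS = r"[A-Za-z_]\w*|\d+(?:\.\d+)?|\"[^\"]*\""

def halstead_counts(code):
    ops = re.findall(OPERATORS, code)
    # Blank out operators first so operand matching sees only what remains.
    args = re.findall(OPERANDS, re.sub(OPERATORS, " ", code))
    return {"mu1": len(set(ops)), "mu2": len(set(args)),
            "N1": len(ops), "N2": len(args)}

print(halstead_counts("2+2+3"))
# -> {'mu1': 1, 'mu2': 2, 'N1': 2, 'N2': 3}
```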

21. Error Metrics
ED = ER/KLOC, where ED is error density, ER is the number of error reports, and KLOC is thousands of source lines of code. For example, 12 error reports against a 4 KLOC module give ED = 3 errors/KLOC.

22. OO Metrics
• Number of Children (NOC) – number of sub-classes
• Depth – level of a class in the class hierarchy
• Response for Class (RFC) – number of local methods plus the number of methods called by local methods (>100)
• Weighted Methods per Class (WMC) – sum of the complexities of the methods (>100)
• Coupling Between Object Classes (CBO) – dependency on classes outside the class hierarchy (>5)
• Lack of Cohesion of Methods (LCOM) – the use of local instance variables by local methods
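
A hedged sketch of how three of these metrics can be computed over a small hand-built class model (the model shape and every name below are illustrative assumptions, not the MDP repository's schema):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ClassModel:
    name: str
    parent: Optional[str] = None
    methods: dict = field(default_factory=dict)     # method -> set of called methods
    complexity: dict = field(default_factory=dict)  # method -> complexity weight

def noc(classes, name):
    """Number of Children: count of direct sub-classes."""
    return sum(1 for c in classes if c.parent == name)

def wmc(cls):
    """Weighted Methods per Class: sum of per-method complexities."""
    return sum(cls.complexity.values())

def rfc(cls):
    """Response for Class: local methods plus the methods they call."""
    called = set().union(*cls.methods.values()) if cls.methods else set()
    return len(set(cls.methods) | called)

base = ClassModel("Sensor",
                  methods={"read": {"calibrate"}, "calibrate": set()},
                  complexity={"read": 3, "calibrate": 2})
classes = [base, ClassModel("IRSensor", parent="Sensor")]
print(noc(classes, "Sensor"), wmc(base), rfc(base))  # 1 5 2
```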

23. ARM Metrics
• Weak phrases (adequate, be able to) – clauses that cause uncertainty
• Incomplete (TBD, TBR) – words and phrases that indicate the spec may not be fully developed
• Options (can, optionally) – words that give the developer latitude
• Imperatives (shall, may, will, should) – words that are explicit
• Continuances (below, as follows, and) – extensive use of continuances can indicate complex requirements
• Directives (for example, figure, table) – examples or illustrations
• Lines of text (size)
• Document structure (levels)
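
A minimal sketch of counting these indicators over requirements text, using only the example phrases above (the real ARM tool ships much larger phrase dictionaries):

```python
import re

# Indicator categories and example phrases taken from the slide.
INDICATORS = {
    "weak_phrases": ["adequate", "be able to"],
    "incomplete":   ["TBD", "TBR"],
    "options":      ["can", "optionally"],
    "imperatives":  ["shall", "may", "will", "should"],
    "continuances": ["below", "as follows", "and"],
    "directives":   ["for example", "figure", "table"],
}

def arm_counts(text):
    counts = {name: 0 for name in INDICATORS}
    for name, phrases in INDICATORS.items():
        for phrase in phrases:
            pattern = r"\b" + re.escape(phrase) + r"\b"
            counts[name] += len(re.findall(pattern, text, re.IGNORECASE))
    counts["lines_of_text"] = len(text.splitlines())
    return counts

req = "The system shall log faults and should report them as follows: TBD."
print(arm_counts(req))
```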

  24. Program Background Slides

25. Problem Report Fields
Error identifier: (alpha-numeric)
Headline: (text – short description)
Submitted-on: (date, yymmdd)
Severity: (1 through 5)
Status: (NVTCDARMB)
System mode: (operations versus test)
Request type: (problem or enhancement)
Problem-type: (requirements, design, source code, COTS, documentation, hardware, etc.)
Problem mode: (PR or action item)
Assigned-to:
CCB approval: (date)
Impacts-csci: (high-level design element)
Impacts-csc: CSC
Impacts-class/file: class/file
Impacts-method/function/module: method/function/module
Impacts-requirement:
Impacts-design element:
Resolution: (source code, COTS, documentation, not a bug, unreproducible)
Problem: (text)
Analysis: (text)
Resolution: (text)
Closed-on: (date)

26. Project Non-Specific – Universal
Error identifier: (alpha-numeric)*
Submitted-on: (yymmdd)
Severity: (1 through 5)
Status: (NVTCDARMB)
Mode: (operations versus test)
Request type: (problem or enhancement)
Problem-type: (requirements, design, source code, COTS, documentation, etc.)
Impacts-csci: (high-level software design element)*
Documents: (what documents are affected?)
CCB approval: (date)
Resolution: (source code, COTS, documentation, not a bug, unreproducible)
Verified-on: (date)
Closed-on: (date)
* May need to be sanitized.
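
This universal record maps naturally onto a typed structure. A sketch, with field names taken from the slide and types and class name that are assumptions for illustration:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class UniversalProblemReport:
    error_identifier: str          # alpha-numeric; may need to be sanitized
    submitted_on: date             # yymmdd on the slide
    severity: int                  # 1 through 5
    status: str                    # one letter of NVTCDARMB
    mode: str                      # "operations" or "test"
    request_type: str              # "problem" or "enhancement"
    problem_type: str              # requirements, design, source code, ...
    impacts_csci: str              # high-level design element; may need sanitizing
    documents: str                 # which documents are affected
    ccb_approval: Optional[date] = None
    resolution: Optional[str] = None   # source code, COTS, documentation,
                                       # not a bug, unreproducible
    verified_on: Optional[date] = None
    closed_on: Optional[date] = None
```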

27. Project Non-Specific – Expanded
Impact data:
Impacts-csc: CSC*
Impacts-class/file: class/file*
Impacts-method/function/module: method/function/module*
Recommend-change: source code
Process data:
How-found: (e.g., acceptance test)
When-found: (e.g., acceptance testing)
Analysis-due: 020322
Assigned-Eval-on: 020322
Assigned-Implement-on: 020323
Implement-due: 020325
Fix-date: 020325
Fixed-on: (date)
In-Test-on: 020325
Test-name: (numeric id)*
Test-system: (hardware)
Verify-date:
Merge-build-id:
Deferred-on:
Build-name: (alpha-numeric identifier)
Patch-rel-name:
Patch-rel-date: (date)
Automated history entries: (alpha-numeric)
Costing data:
Cost: (high, medium, low)
Est-fix-hours:
Est-fix-date: (date)
Est-Num-SLOC:
Rev-fix-time:
Rev-fix-date: (date)
SLOC-Type:
SLOC-count:
Fix-hours:
Miscellaneous data:
Operating-system:
Priority: (high, medium, low)
Enhancement: (Y or N)
Workaround: Y
Iteration: (version bug identified in)
* May need to be sanitized.

28. Project Specific – Universal
Headline: (text – short problem description)
Problem: (text – expanded problem description)
Analysis: (text)
Resolution: (text)
Closure: (text)
Submitter-id:
Assigned-to:
Closer-id:
Question: Can project-specific data be sanitized?
