NASA Software IV&V Facility

Metrics Data Program

Mike Chapman [email protected] 304-367-8341

Pat Callis [email protected] 304-367-8309

Ken McGill [email protected] 304-367-8300

June 5, 2003

The Metrics Data Program Goal
  • Establish a centralized repository that provides consistent, fully-involved software product data across multiple domains:
    • To improve the effectiveness of software assurance
    • To improve the effectiveness of software research
    • To improve the ability of projects to predict software errors early in the lifecycle

Data, Data Everywhere: The Data Drought

There is very little fully-involved software product data…

  • Error data associated with the smallest functional unit
  • Requirements traced through the design to the smallest functional unit

…available to those who need it.

SARP PI Survey

To what degree has a lack of software defect data from actual NASA projects impacted your SARP-funded research?

  • Greatly (9): Lack of software defect data from actual NASA projects has seriously hampered my research.
  • Moderately (8): It would be nice to have more (or some) real project data, but I have found non-NASA data sources or other workarounds.
  • Not significantly (5): My project has not been impacted because I either have all the data I need or I don't need software defect data.

Note: The totals are in parentheses.

The Lack of Data from a Project Perspective
  • Error data associated with the smallest functional unit
    • There is little value in this activity for the project
  • Requirements traced through the design to the smallest functional unit
    • Projects only need to trace requirements to the executable (CSC level or above)
  • Fully-involved data made available to the research community
    • Vulnerability of the program
    • Proprietary issues
The Quest for the Holy Grail

Regardless of the development model, the following is needed:

[Diagram: requirements (with a many-to-one issue), design, and code traced down to the smallest functional unit, and problem reports from development, test, user, and maintenance activities associated to that same smallest functional unit.]

Recruitment of Project Data
  • Existing repository data
  • Error Data Enhancement Program
Error Data Enhancement Program

The goal of the enhancement effort is to successfully recruit projects to work with the MDP to provide fully-involved software product data – the Holy Grail.

The MDP team will provide:

  • Requirements analysis and traceability support
  • Configuration Management Support (Error Tracking)
  • Metrics generation and analysis
  • Database and web support
  • Machine learning analysis
MDP Repository

[Architecture diagram: participating projects (Project 1 … Project n) keep their project metrics and error data in ClearQuest behind a project-security firewall; sanitized data and stored metrics and error data, together with other project data, populate the MDP repository, which is reached through the MDP web site and a ClearQuest interface that serve queries over the sanitized data.]

Benefits
  • Agency benefits:
    • The improved ability to predict errors early in the lifecycle
    • The improved ability to assess the quality of the software
  • The research community benefits:
    • Availability of quality error and metric data
    • Availability of a support team for data needs
  • Participating projects benefit:
    • Additional metrics analysis
    • Additional error analysis
    • Problem tracking tool
    • Other support such as requirements traceability
Site Metrics

Web Site activity for 3 months:

  • 596 hits
  • 46 accounts
  • 146 logins
  • 85 downloads of data
Special Data Requests
  • Five time stamps of KC-2 data
  • Sanitized activity fields of JM-1 data
  • Error Reports from CSCI of KC

Note: Five papers have been written from the repository data so far.

Current Data Request
  • CM-1 requirements data and associated errors - JPL
  • CM-1 data including metrics – Dolores Wallace SATC
  • KC semantic metrics generation – Letha Etzkorn UA – Huntsville
  • JM-1 time stamp data (five sets) – Tim Menzies WVU
A Study
  • v(G) – cyclomatic complexity – independent linear paths
  • ev(G) – essential complexity – unstructured constructs
  • e – programming effort – mental effort
  • l – program level = (2/µ1) * (µ2/N2) – the level at which a program can be understood
McCabe Metrics

Halstead: programmers read code. Too many "words" → error.

McCabe: paths between "words". Twisted paths → error.

v(G): cyclomatic complexity = # of paths (roughly) = edges − nodes + 1

m = # of one-entry/one-exit sub-graphs

ev(G): essential complexity = v(G) − m

iv(G): design complexity (reflects complexity of calls to other modules)
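
As an illustrative sketch (not from the original slides), these counts can be computed directly from a control-flow graph; the node and edge lists below, and the value of m, are hypothetical inputs.

    # Sketch: McCabe metrics from a control-flow graph (CFG).
    # The CFG is given as node and edge lists; m (the number of
    # one-entry/one-exit sub-graphs) is supplied by the caller.

    def cyclomatic_complexity(nodes, edges, components=1):
        # v(G) = E - N + 2P for a CFG with P connected components; the
        # slide's "edges - nodes + 1" corresponds to a graph whose exit
        # is wired back to its entry.
        return len(edges) - len(nodes) + 2 * components

    def essential_complexity(v_g, m):
        # ev(G) = v(G) - m
        return v_g - m

    # Hypothetical example: an if/else inside a loop.
    nodes = ["entry", "loop", "if", "then", "else", "join", "exit"]
    edges = [("entry", "loop"), ("loop", "if"), ("if", "then"), ("if", "else"),
             ("then", "join"), ("else", "join"), ("join", "loop"), ("loop", "exit")]

    v_g = cyclomatic_complexity(nodes, edges)     # 8 - 7 + 2 = 3
    print(v_g, essential_complexity(v_g, m=2))    # 3 1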

Halstead Metrics

µ = µ1 + µ2

N = length = N1 + N2

V = volume = N * log2(µ)

V' = (2 + µ2') * log2(2 + µ2')

L = level = V'/V

D = difficulty = 1/L

L' = 1/D

E = effort = V/L'

T = time = E/18

e.g. 2+2+3: N1 = 3, N2 = 2, µ1 = 2, µ2 = 2

µ1' = 2 (ish); µ2' = # of input parameters

µ1, µ2, N1, and N2 could be found via simple tokenizers.
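
A minimal sketch (not part of the slides) of how the measures follow from the four primitive counts. The token lists are hypothetical, the counting convention is the usual one (N1/µ1 for operators, N2/µ2 for operands), and the difficulty shortcut used here is the reciprocal of the program level l = (2/µ1) * (µ2/N2) given on the earlier "A Study" slide.

    import math

    # Sketch: Halstead measures from primitive counts for the expression "2+2+3".
    operators = ["+", "+"]        # N1 occurrences, mu1 distinct
    operands  = ["2", "2", "3"]   # N2 occurrences, mu2 distinct

    N1, N2 = len(operators), len(operands)
    mu1, mu2 = len(set(operators)), len(set(operands))

    mu = mu1 + mu2                  # vocabulary
    N = N1 + N2                     # length
    V = N * math.log2(mu)           # volume
    D = (mu1 / 2) * (N2 / mu2)      # difficulty (common formulation)
    L = 1 / D                       # level = (2/mu1) * (mu2/N2)
    E = V / L                       # effort
    T = E / 18                      # estimated time, in seconds

    print(N, round(V, 2), round(E, 2), round(T, 2))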

Operators

Tokens counted as operators include:

  • Control and keyword constructs: if (…) …, if (…) … else, switch (……), case, default:, do … while (…), while (…) …, for (… ; … ; …), goto, return, break, continue, sizeof, new, delete, enum, struct, union, this->
  • Symbolic operators: ! % & * + , - . / ; < > = ? ^ | ~ >= <= == != >> << += -= *= /= %= &= ^= |= >>= <<= && || ++ -- [ ] { } ( ) -> … ? … : …
  • ( ) in any other cases not covered
Operands
  • Variables and identifiers
  • Constants (numeric literal/string)
  • Function names when used during calls
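
As the Halstead slide notes, these counts can come from a simple tokenizer. The sketch below is an illustration, not the MDP tool: it splits a small C-like snippet with a regular expression and classifies tokens against a deliberately tiny, hypothetical operator table (the full table is the Operators slide above).

    import re

    # Sketch: classify tokens of a C-like snippet as Halstead operators or
    # operands.  Identifiers and literals count as operands; anything found
    # in the (abbreviated, hypothetical) operator table counts as an operator.
    OPERATORS = {"if", "else", "return", "while", "for",
                 "==", "+", "-", "*", "/", "=", "<", ">",
                 "(", ")", "{", "}", ";"}

    TOKEN = re.compile(r"[A-Za-z_]\w*|\d+(?:\.\d+)?|==|[+\-*/=<>(){};]")

    def halstead_counts(source):
        ops, opnds = [], []
        for tok in TOKEN.findall(source):
            (ops if tok in OPERATORS else opnds).append(tok)
        return {"N1": len(ops), "N2": len(opnds),
                "mu1": len(set(ops)), "mu2": len(set(opnds))}

    print(halstead_counts("if (x > 0) { y = x + 1; } else { y = 0; }"))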
Error Metrics

ED = error density = ER / KLOC (error reports per thousand source lines of code); for example, a module with 8 error reports in 2 KLOC has ED = 4.

OO Metrics
  • Number of Children (NOC) – number of sub-classes
  • Depth – level of the class in the class hierarchy
  • Response for Class (RFC) – number of local methods plus the number of methods called by local methods (>100)
  • Weighted Methods per Class (WMC) – sum of the complexities of the methods (>100)
  • Coupling Between Object Classes (CBO) – dependency on classes outside the class hierarchy (>5)
  • Lack of Cohesion of Methods (LOCM) – the use of local instance variables by local methods
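
A rough sketch (not the MDP tooling) of how a few of these metrics could be derived from a simple, hypothetical class model; the per-method complexities would come from the v(G) values discussed earlier.

    # Sketch: WMC, NOC, and RFC over a hypothetical two-class model.
    # Each class records its parent, its methods with their cyclomatic
    # complexity, and the methods each local method calls.
    classes = {
        "Sensor":    {"parent": None,     "methods": {"read": 3, "reset": 1},
                      "calls": {"read": ["validate"], "reset": []}},
        "GpsSensor": {"parent": "Sensor", "methods": {"read": 5, "to_fix": 2},
                      "calls": {"read": ["parse_nmea"], "to_fix": []}},
    }

    def wmc(name):
        # Weighted Methods per Class: sum of the complexities of the methods.
        return sum(classes[name]["methods"].values())

    def noc(name):
        # Number of Children: direct sub-classes.
        return sum(1 for c in classes.values() if c["parent"] == name)

    def rfc(name):
        # Response for Class: local methods plus methods called by local methods.
        cls = classes[name]
        called = {m for callees in cls["calls"].values() for m in callees}
        return len(set(cls["methods"]) | called)

    print(wmc("GpsSensor"), noc("Sensor"), rfc("GpsSensor"))   # 7 1 3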

ARM Metrics
  • Weak Phrases (adequate, be able to) – clauses that cause uncertainty
  • Incomplete (TBD, TBR) – words and phrases that indicate the spec may not be fully developed
  • Options (can, optionally) – words that give the developer latitude
  • Imperatives (shall, may, will, should) – words that are explicit
  • Continuances (below, as follows, and) – extensive use of continuances can indicate complex requirements
  • Directives (for example, figure, table) – examples or illustrations
  • Lines of text (size)
  • Document structure (levels)
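
A minimal sketch (an illustration, not NASA's ARM tool) of keyword-based counting over requirement text; the phrase lists are abbreviated from the categories above and the sample requirement is hypothetical.

    import re

    # Sketch: count ARM-style indicator phrases in requirements text.
    CATEGORIES = {
        "weak_phrases": ["adequate", "be able to"],
        "incomplete":   ["TBD", "TBR"],
        "options":      ["can", "optionally"],
        "imperatives":  ["shall", "may", "will", "should"],
        "continuances": ["below", "as follows", "and"],
        "directives":   ["for example", "figure", "table"],
    }

    def arm_counts(text):
        counts = {category: sum(
                      len(re.findall(r"\b" + re.escape(p) + r"\b", text, re.IGNORECASE))
                      for p in phrases)
                  for category, phrases in CATEGORIES.items()}
        counts["lines_of_text"] = len(text.splitlines())
        return counts

    req = "The system shall log faults as follows: TBD. The operator should be able to reset it."
    print(arm_counts(req))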
Problem Report Fields

Error Identifier: (alpha-numeric)
Headline: (text – short description)
Submitted-on: (date, yymmdd)
Severity: (1 thru 5)
Status: (NVTCDARMB)
System Mode: (operations versus test)
Request type: (problem or enhancement)
Problem-type: (requirements, design, source code, COTS, documentation, hardware, etc.)
Problem Mode: (PR or action item)
Assigned-to:
CCB Approval: (date)
Impacts-csci: (high-level design element)
Impacts-csc: CSC
Impacts-class/file: class/file
Impacts-method/function/module: method/function/module
Impacts-requirement
Impacts-design element
Resolution: (source code, COTS, documentation, not a bug, unreproducible)
Problem: (text)
Analysis: (text)
Resolution: (text)
Closed-on: (date)
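
As an illustrative data-structure sketch only (field names follow the slide; the types and the sample values are assumptions), a problem report could be represented like this for loading into the repository:

    from dataclasses import dataclass
    from typing import Optional

    # Sketch: a problem-report record carrying a subset of the fields above.
    # Field names follow the slide; types and defaults are assumptions.
    @dataclass
    class ProblemReport:
        error_identifier: str            # alpha-numeric
        headline: str                    # short description
        submitted_on: str                # yymmdd
        severity: int                    # 1 thru 5
        status: str                      # one of the NVTCDARMB states
        request_type: str                # "problem" or "enhancement"
        problem_type: str                # requirements, design, source code, ...
        impacts_csci: Optional[str] = None
        resolution: Optional[str] = None
        closed_on: Optional[str] = None

    # Hypothetical record.
    pr = ProblemReport("KC1-042", "Telemetry parser drops last field",
                       "030415", 3, "N", "problem", "source code")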

Project Non-Specific - Universal

Error Identifier: (alpha-numeric)*
Submitted-on: (yymmdd)
Severity: (1 thru 5)
Status: (NVTCDARMB)
Mode: (operations versus test)
Request type: (problem or enhancement)
Problem-type: (requirements, design, source code, COTS, documentation, etc.)
Impacts-csci: (high-level software design element)*
Documents: (What documents are affected?)
CCB Approval: (date)
Resolution: (source code, COTS, documentation, not a bug, unreproducible)
Verified-on: (date)
Closed-on: (date)

* May need to be sanitized.

Project Non-specific - Expanded

Impact data:
Impacts-csc: CSC*
Impacts-class/file: class/file*
Impacts-method/function/module: method/function/module*
Recommend-change: source code

Process data:
How-found: (e.g., Acceptance Test)
When-found: (e.g., Acceptance Testing)
Analysis-due: 020322
Assigned-Eval-on: 020322
Assigned-Implement-on: 020323
Implement-due: 020325
Fix-date: 020325
Fixed-on: (date)
In-Test-on: 020325
Test-name: (numeric id)*
Test-system: (hardware)
Verify-date:
Merge-build-id:
Deferred-on:
Build-name: (alpha-numeric identifier)
Patch-rel-name:
Patch-rel-date: (date)
Automated history entries: (alpha-numeric)

Costing data:
Cost: (high, medium, low)
Est-fix-hours:
Est-fix-date: (date)
Est-Num-SLOC:
Rev-fix-time:
Rev-fix-date: (date)
SLOC-Type:
SLOC-count:
Fix-hours:

Miscellaneous data:
Operating-system:
Priority: (High, Medium, Low)
Enhancement: (Y or N)
Workaround: Y
Iteration: (version the bug was identified in)

* May need to be sanitized.

Project Specific - Universal

Headline: (text – short problem description)
Problem: (text – expanded problem description)
Analysis: (text)
Resolution: (text)
Closure: (text)
Submitter-id:
Assigned-to:
Closer-id:

Question: Can project-specific data be sanitized?

