Empirical Study of Software Quality and Reliability

14 November 2007
High Assurance Systems Engineering Conference (IEEE HASE 2007)

Jeff Tian, Ph.D., PE
Southern Methodist University
Dallas, TX USA 75275
Tel: (214) 768-2861, FAX: (817) 768-3085
tian@engr.smu.edu

Michael F. Siok, PE
Lockheed Martin Aeronautics Company
P.O. Box 748, MZ 8604
Fort Worth, TX 76101
Tel: (817) 935-4514
Mike.F.Siok@lmco.com

Avionics Software Development
  • How much software process is enough?
  • How much software is enough?
  • Benchmarks?
  • In-house best practice?

Costs too much. Takes too long.

Metrics, metrics, metrics.

Good Software = f(productivity, reliability, quality)

Aircraft Software at LM Aero . . .

[Figure: aircraft systems diagram showing avionics elements (Controls, Displays, Radar, Mission Computer, Electro-Optical, Global Positioning, Weapons, External Stores, Inertial) and flight-control elements (Stick and Throttle, Flight Control, Ailerons, Rudder(s), Stabilizers, Gyros, Accelerometers, Engine(s))]

  • Characteristics
  • Large Complex Systems
  • Decades Lifespan
  • Frequent Software Updates
  • Mix of Computation Types
    • Computational
    • Displays
    • Logic/State Machine
    • Signal Processing
    • Feedback Control
  • Hard & Soft Real-Time
  • Severe Computing Resource Constraints
  • COTS requirements
  • Legacy reuse
Aircraft Software at LM Aero . . .
  • Embedded Code size ~2 MSLOC & climbing fast
  • Hundreds of object instances
  • Some COTS
  • Products: F-16, C-130, U-2, F-22, JSF, support equipment, and others
Background (Cont’d)
  • Avionics Software Managers want to know . . .
    • How are projects performing, individually and collectively?
    • Is OO better than SA/SD?
    • Does programming language make a difference?
    • Productivity, reliability, quality . . . How are we doing?
    • Issues . . .
      • Measuring success
      • Benchmarking
      • How to improve
  • Use statistics to provide some answers
Topics

IEEE HASE 2007 Paper: "Empirical Study of Embedded Software Quality and Productivity" by Michael F. Siok and Jeff Tian

  • Background
  • Avionics Software Project Data
  • Data Analysis
  • Wrap Up and Next Steps
Avionics Software Project Data
  • 39 software projects carefully chosen for study
    • Project Application Domain (a.k.a. project complexity)
    • Project Size
    • Software Development Methodology
    • Programming Language used
  • Metrics Framework
    • Size
    • Cost
    • Schedule
    • TP & Quality
Avionics Software Project Data (Cont’d)
  • The metrics framework provides a common reference for capturing each metric and its scope
  • 27 metrics were available from all projects and normalized where appropriate (a minimal normalization sketch follows this list)
    • Comparable
      • Individually
      • In Aggregate
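
As an illustration only (not from the original slides), a minimal sketch in Python of the kind of size normalization that makes metrics comparable across projects; the project names, defect counts, and sizes are hypothetical.

# Hypothetical normalization: express raw defect counts per KSLOC so that
# projects of very different sizes can be compared on the same scale.
defects = {"proj_a": 42, "proj_b": 310}
ksloc = {"proj_a": 12.4, "proj_b": 156.7}
defect_density = {p: defects[p] / ksloc[p] for p in defects}
print(defect_density)  # proj_a ~3.39 defects/KSLOC, proj_b ~1.98 defects/KSLOC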
Data Analysis: Approach
  • Study metrics data using Descriptive Statistics
    • Individual Metrics
    • Metrics in aggregate
  • Perform Hypothesis Tests to answer management questions
    • Domain Separation
    • OO vs. SA/SD
    • Size vs. Reliability
    • Size vs. Productivity
    • Cost vs. Reliability
    • Cost vs. Productivity
    • Language vs. Productivity
  • Summarize & Report Results
Data Analysis (Cont’d): Descriptive Statistics
  • Used to discover interesting characteristics of each metric in the dataset (a descriptive-statistics sketch follows below)
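
As an illustration only (not from the slides), a minimal sketch of per-metric descriptive statistics in Python with pandas; the metric names and values are hypothetical.

import pandas as pd

# Hypothetical per-project metrics; column names are illustrative only.
projects = pd.DataFrame({
    "size_ksloc": [12.4, 48.0, 3.2, 156.7, 22.9],
    "defect_density": [0.8, 1.4, 0.3, 2.1, 1.0],  # defects per KSLOC
    "productivity": [2.1, 1.7, 3.0, 1.2, 2.4],    # KSLOC per staff-month
})

# describe() reports count, mean, std, min, quartiles, and max per metric.
print(projects.describe())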
Data Analysis (Cont’d): Hypothesis Tests
  • Hypothesis Testing
    • Since the data were not normally distributed, non-parametric tests were used to accept or reject hypotheses
      • H0: the compared data come from the same population
      • H1: the compared data come from different populations

Wilcoxon Rank Sum Test
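
As an illustration only (not the authors' code), a minimal sketch of such a two-sample non-parametric comparison using SciPy's Wilcoxon rank-sum test; the two groups and their defect-density values are hypothetical stand-ins for, e.g., OO vs. SA/SD projects.

from scipy.stats import ranksums

# Hypothetical defect-density samples (defects per KSLOC) for two groups.
oo_projects = [0.6, 0.9, 1.1, 0.7, 1.3]
sasd_projects = [1.2, 1.8, 1.5, 2.0, 1.1]

stat, p_value = ranksums(oo_projects, sasd_projects)
# Reject H0 (same population) when p < .05, matching the slides' criterion.
print(f"W = {stat:.3f}, p = {p_value:.4f}, reject H0: {p_value < 0.05}")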

Data Analysis (Cont’d): Reporting
  • Hypothesis Testing (Continued)
    • Conducted on the metrics; results collected and summarized
      • Majority-rules policy on accept/reject test results (a sketch follows below)
        • Reject = difference at the .05 level of significance
        • Accept = no difference
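
As an illustration only (not from the slides), a sketch of the majority-rules summarization over per-metric test results; the metric names, p-values, and aggregation structure are assumptions.

# Hypothetical per-metric Wilcoxon p-values for one comparison (e.g., OO vs. SA/SD).
p_values = {"defect_density": 0.03, "productivity": 0.20, "cost_per_ksloc": 0.01}

ALPHA = 0.05
rejections = sum(p < ALPHA for p in p_values.values())
# Majority rules: declare a population difference only if most metrics reject H0.
verdict = "different populations" if rejections > len(p_values) / 2 else "no difference"
print(f"{rejections}/{len(p_values)} metrics reject H0 -> {verdict}")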
Wrap Up and Next Steps
  • The avionics software organization needed a methodology to
    • Assess project performance
    • Assess project performance relative to other similar projects
    • Identify and act on opportunities for improvement
  • Software project data was difficult to acquire
    • The raw data itself was actually very easy to get; qualifying it was not
    • Projects had to demonstrate selected process controls
      • Managed variability in metrics
    • Projects had to submit data to the company metrics repository and use it
    • Projects had to validate their data in the company repository
  • The analysis method was fairly simple and straightforward
    • Descriptive statistics to study metric behaviors
    • Hypothesis testing
    • Summary reporting to capture analysis results for action
Wrap Up and Next Steps (Cont’d)
  • Statistical testing did not uncover a clear ‘best all-around project’
    • The goal is to identify well-rounded, best-in-class project(s)
    • That is, projects that demonstrate the best in cost, schedule, performance, and quality
  • Use Data Envelopment Analysis (DEA) as the benchmarking method to identify best-in-class software projects (a DEA sketch follows this list)
    • Non-parametric analysis method
    • Establishes a multivariate production-efficiency frontier
  • Statistical analysis coupled with DEA will provide a repeatable methodology to study and assess company software project data
    • To understand software project and organizational performance
    • To identify the best-performing software projects
    • To clearly identify practical software process & product improvement opportunities
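
As an illustration only (not the authors' implementation), a minimal sketch of an input-oriented CCR DEA model solved as a linear program with scipy.optimize.linprog; the choice of inputs and outputs and all data values are hypothetical.

import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(inputs, outputs, k):
    """Input-oriented CCR efficiency of unit k; inputs (m, n), outputs (s, n)."""
    m, n = inputs.shape
    s = outputs.shape[0]
    # Variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    # Input rows: sum_j lambda_j * x_ij <= theta * x_ik.
    A_in = np.hstack([-inputs[:, [k]], inputs])
    # Output rows: sum_j lambda_j * y_rj >= y_rk.
    A_out = np.hstack([np.zeros((s, 1)), -outputs])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -outputs[:, k]]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # efficiency in (0, 1]; 1.0 means on the frontier

# Hypothetical data: inputs = cost ($K), effort (staff-years);
# outputs = delivered KSLOC, quality score. Columns are projects.
X = np.array([[100.0, 80.0, 120.0], [10.0, 6.0, 14.0]])
Y = np.array([[50.0, 45.0, 55.0], [0.9, 0.95, 0.8]])
for k in range(X.shape[1]):
    print(f"project {k}: efficiency = {dea_ccr_efficiency(X, Y, k):.3f}")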


. . . to better the business practice of software development

Points of Contact

Michael F. Siok, PE
Lockheed Martin Aeronautics Company
P.O. Box 748, MZ 8604
Fort Worth, TX 76101
Tel: (817) 935-4514
Mike.F.Siok@lmco.com

Jeff Tian, Ph.D., PE
Southern Methodist University
Dallas, TX USA 75275
Tel: (214) 768-2861, FAX: (817) 768-3085
tian@engr.smu.edu