
Software Engineering Course

Testing

Ivica Crnkovic

Mälardalen University

Department of Computer Engineering

ivica.crnkovic@mdh.se


Software faults and failures

  • What does it mean that the software has failed?

    • The software does not do what the requirements describe

  • Reasons for the failures

    • Requirements incomplete, inconsistent, impossible to implement

    • Wrong interpretation of the requirements

    • The system design may contain a fault (misunderstanding requirements, wrong architecture, incomplete design)

    • Program design may contain a fault - wrong algorithm

    • The program code may be wrong - improper, incomplete, inconsistent implementation

    • Documentation can be wrong - it may describe the behavior of the system incorrectly


Some other reasons for the failures

  • Integration problems

    • The components integrated into a system are not compatible (wrong components, wrong component versions)

  • The performance of the entire system does not correspond to the requirements (expectations)

  • The system requires too many resources

  • Not failures but still unsuccessful

    • Bad Graphical User Interface

    • Complicated process

    • The requirements missed the point (the users actually wanted something else)


Defect classifications

  • What type of defect?

  • When did it occur (when the failure, when the fault)?

  • Severity level?

  • What is the root cause of the defect?

  • ….



Fault classification (HP)

Every fault is classified in three dimensions:

ORIGIN: WHERE was the fault introduced?

  • Specification/requirements
  • Design
  • Code
  • Environment/support
  • Documentation
  • Other

TYPE: WHAT kind of fault is it (the possible types depend on the origin)?

  • Specification/requirements: requirements or specifications, functionality
  • Design: HW interface, SW interface, user interface, functional description, (inter-)process communications, data definition, module design, logic description, error checking, standards
  • Code: logic, computation, data handling, module interface/implementation, standards
  • Environment/support: test HW, test SW, integration SW, development tools

MODE: WHY did the fault occur?

  • Missing, unclear, wrong, changed, better way
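As a sketch, a fault record under this HP scheme could be kept as a small data structure; the Python names below are illustrative, not part of the HP model itself:

    from dataclasses import dataclass
    from enum import Enum

    class Origin(Enum):              # WHERE the fault was introduced
        SPECIFICATION = "specification/requirements"
        DESIGN = "design"
        CODE = "code"
        ENVIRONMENT = "environment/support"
        DOCUMENTATION = "documentation"
        OTHER = "other"

    class Mode(Enum):                # WHY the fault occurred
        MISSING = "missing"
        UNCLEAR = "unclear"
        WRONG = "wrong"
        CHANGED = "changed"
        BETTER_WAY = "better way"

    @dataclass
    class FaultRecord:
        origin: Origin
        fault_type: str              # WHAT; the legal values depend on the origin
        mode: Mode

    # Example: a wrong algorithm found in the code
    fault = FaultRecord(Origin.CODE, "logic", Mode.WRONG)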


What are the most frequent faults?

Example: fault statistics (HP)

[Figure: HP fault statistics chart.]


Costs to repair the failures

[Figure: $$$ to fix the problems, plotted against the origin of problems: specification/requirements, design, code, integration. The repair cost rises steeply for faults that originate early but are found late.]


How to avoid a late detection of the faults?

  • Use the development process that best suits the project

  • Do proper planning

  • Ensure the quality in each process step

    • perform reviews

    • perform tests

V model - document produced

The V model pairs each development phase with the test level that checks it, and names the documents produced:

  • REQUIREMENTS ANALYSIS produces the RS; the acceptance test description is written against it, and ACCEPTANCE TESTING validates the requirements before OPERATION & MAINTENANCE.
  • SYSTEM DESIGN produces the SD and FS; the system design test description is written against them, and SYSTEM TESTING verifies the design.
  • PROGRAM DESIGN produces the DS; the module test descriptions are written against it.
  • CODING produces the code; UNIT & INTEGRATION TESTING runs the module tests and produces the test analysis report.


The testing process

[Figure: overview of the testing process, detailed below.]


The testing process - details

  • Unit test: each component code is tested against the design specifications, giving a tested component.
  • Integration test: the tested components are combined, giving integrated modules.
  • Function test: the integrated modules are checked against the system functional requirements, giving a functioning system.
  • Performance test: the functioning system is checked against the other software requirements, giving verified, validated software.
  • Acceptance test: the software is checked against the customer requirements specification, giving an accepted system.
  • Installation test: the accepted system is tested in the user environment - the SYSTEM is then IN USE!


Who performs the tests?

  • Program authors (module tests)

  • The developer group (module tests, component tests)

  • Test teams

    • Integration tests, subsystem tests

    • System tests

    • “Big configuration test”

  • Special customers (running alpha, beta releases)

  • Customers with support people (acceptance tests)

[Figure: release timeline - development, then alpha releases, then beta releases, then the official release; over time the emphasis shifts from new functions to fixing defects.]


Unit Testing

Different methods to test the code

  • Code review - go through the code

    • Code author, or a review group in the developer team (peer review)

  • Code inspection

    • Inspect the code against a prepared list of concerns (data-types, style guide…)

  • Proving code correct

    • Formal verification

  • Choosing test cases

    • Inputs used to test the system, and the outputs predicted from these inputs if the system operates according to its specification (a sketch follows below)
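A minimal sketch of chosen test cases, assuming a hypothetical leap_year function as the unit under test; each test pairs an input with the output predicted by the specification:

    import unittest

    def leap_year(year: int) -> bool:
        """Unit under test: the Gregorian leap-year rule."""
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    class LeapYearTests(unittest.TestCase):
        def test_divisible_by_4(self):
            self.assertTrue(leap_year(2024))    # input 2024, predicted output True

        def test_century_not_leap(self):
            self.assertFalse(leap_year(1900))   # centuries are not leap years...

        def test_every_400_years(self):
            self.assertTrue(leap_year(2000))    # ...unless divisible by 400

    if __name__ == "__main__":
        unittest.main()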


Black-box testing

  • Approach to testing where the program is considered as a ‘black-box’

  • The program test cases are based on the system specification

  • Test planning can begin early in the software process (an example of specification-derived test cases follows below)
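For illustration, test cases can be derived from the specification alone, before any code structure is known. The specification and the classify function here are invented: "classify(age) accepts 0..120; below 18 is 'minor', otherwise 'adult'; anything outside the range is an error":

    import unittest

    def classify(age: int) -> str:
        if not 0 <= age <= 120:
            raise ValueError("age out of range")
        return "minor" if age < 18 else "adult"

    class ClassifyBlackBoxTests(unittest.TestCase):
        def test_equivalence_classes(self):
            self.assertEqual(classify(10), "minor")   # representative of 0..17
            self.assertEqual(classify(40), "adult")   # representative of 18..120

        def test_boundaries(self):
            self.assertEqual(classify(17), "minor")   # just below the boundary
            self.assertEqual(classify(18), "adult")   # on the boundary

        def test_invalid_input(self):
            with self.assertRaises(ValueError):
                classify(-1)                          # outside the valid range

    if __name__ == "__main__":
        unittest.main()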


Interface testing

  • Takes place when modules or sub-systems are integrated to create larger systems

  • Objectives are to detect faults due to interface errors or invalid assumptions about interfaces

  • Particularly important for object-oriented development as objects are defined by their interfaces

The test is applied to sub-systems, not to individual modules.


Interface types

  • Parameter interfaces

    • Data passed from one procedure to another (see the sketch after this list)

  • Shared memory interfaces

    • Block of memory is shared between procedures

  • Procedural interfaces

    • Sub-system encapsulates a set of procedures to be called by other sub-systems

  • Message passing interfaces

    • Sub-systems request services from other sub-systems

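A sketch of the kind of fault interface testing catches at a parameter interface: two modules that each pass their unit tests but disagree about units. Both functions are invented for illustration:

    # Module A returns a distance in kilometres; module B expects metres.
    def route_length_km(legs: list[float]) -> float:
        """Module A: sums the legs of a route, in kilometres."""
        return sum(legs)

    def travel_time_s(distance_m: float, speed_ms: float) -> float:
        """Module B: expects the distance in METRES."""
        return distance_m / speed_ms

    # Both modules pass their unit tests in isolation; an interface test of
    # the integrated pair exposes the invalid assumption about units:
    d = route_length_km([120.0, 80.0])     # 200.0 (kilometres)
    t = travel_time_s(d, 25.0)             # wrong by a factor of 1000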


White-box testing (structural testing)

  • Derivation of test cases according to program structure. Knowledge of the program is used to identify additional test cases

  • Objective is to exercise all program statements, not all path combinations (see the example below)
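A sketch with an invented function: knowing the program structure, two test cases are enough to execute every statement, even though they leave some path combinations untried:

    def shipping_cost(weight: float, express: bool) -> float:
        cost = 5.0
        if weight > 10.0:        # branch 1
            cost += 2.0
        if express:              # branch 2
            cost *= 2.0
        return cost

    # Two structurally chosen tests execute every statement:
    assert shipping_cost(12.0, False) == 7.0   # takes branch 1 only
    assert shipping_cost(5.0, True) == 10.0    # takes branch 2 only
    # Statement coverage is 100%, yet only 2 of the 4 possible paths
    # (branch 1 x branch 2) have been exercised.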


Path testing

  • A white-box testing technique

  • Testing different combinations of program paths

  • Measuring the coverage percentage achieved by the tests (sketch below)
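Continuing the invented shipping_cost example from the previous slide: path testing exercises every combination of branch outcomes, and with n independent decisions there are 2^n paths, so full path coverage grows quickly:

    from itertools import product

    def shipping_cost(weight: float, express: bool) -> float:
        cost = 5.0
        if weight > 10.0:
            cost += 2.0
        if express:
            cost *= 2.0
        return cost

    # Enumerate all 2^2 = 4 path combinations and the expected results:
    expected = {(False, False): 5.0, (True, False): 7.0,
                (False, True): 10.0, (True, True): 14.0}
    for heavy, express in product([False, True], repeat=2):
        weight = 12.0 if heavy else 5.0
        assert shipping_cost(weight, express) == expected[(heavy, express)]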


Integration testing

[Figure: a component hierarchy A-G: A calls B, C and D; B calls E and F; D calls G.]

  • Tests on the sub-system level
  • Testing how the components work together


Integration test - bottom-up principle

[Figure, using the component hierarchy above: the leaves are tested first (Test E, Test F, Test G, Test C), then each sub-tree together with its parent (Test B,E,F; Test D,G), and finally the complete system (Test A,B,C,D,E,F,G).]
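Because the parent component does not yet exist when a leaf is tested bottom-up, each test needs a driver that calls the unit the way the parent eventually will. A minimal sketch with invented components:

    # Leaf component E, tested before its parent B exists.
    def component_e(x: int) -> int:
        return x * 2

    # A test driver stands in for the missing parent B: it feeds E the
    # calls that B will eventually make and checks the results.
    def driver_for_e() -> None:
        assert component_e(3) == 6
        assert component_e(0) == 0
        print("Test E passed")

    driver_for_e()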


Top-down testing

[Figure: first Test A alone, then Test A,B,C,D, and finally Test A,B,C,D,E,F,G.]

  • Test from the top level
  • Not all components are necessarily completed (and tested) yet
    • write STUBS which simulate the missing components (sketch below)
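A sketch of a stub, with invented components: the top-level A is tested before component B is written, so a stub with B's interface returns canned answers:

    # STUB for the unfinished component B: same interface, canned behaviour.
    def component_b_stub(order_id: int) -> float:
        return 100.0          # fixed price, good enough to exercise A

    # Top-level component A, under test; the real B can be injected later.
    def component_a(order_id: int, price_lookup=component_b_stub) -> float:
        price = price_lookup(order_id)
        return price * 1.25   # A's own logic: add 25% tax

    # Test A with the stub standing in for B:
    assert component_a(42) == 125.0
    print("Test A passed (B simulated by a stub)")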


Combination of top-down and bottom-up models

[Figure: every component is first tested individually (Test A ... Test G), then a partial integration is tested (Test A,B,C,D), and finally the complete system (Test A,B,C,D,E,F,G).]


Sandwich Integration

[Figure: the lower levels are integrated bottom-up (Test E, Test F, Test G, then Test B,E,F and Test D,G), the top component is tested alone (Test A), and everything is then combined in Test A,B,C,D,E,F,G.]


Microsoft synch-and-stabilize approach

  • Milestone 1: most critical features and shared components
    • design, code, prototype; usability testing; daily builds; feature integration; eliminate severe faults
  • Milestone 2: desirable features
    • design, code, prototype; usability testing; daily builds; feature integration
  • Milestone 3: least critical features
    • design, code, prototype; usability testing; daily builds; feature integration and completion
  • Release to manufacturing


Object-oriented systems testing

  • Less closely coupled systems. Objects are not necessarily integrated into sub-systems

  • Cluster testing. Test a group of cooperating objects

  • Thread testing. Test a processing thread as it weaves from object to object. Discussed later in real-time system testing

  • Complications with inheritance (see the sketch below)
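One sketch of the inheritance complication (classes invented for illustration): a method inherited unchanged still has to be re-tested in the subclass, because dynamic binding routes its internal calls to the overriding method:

    class Stack:
        def __init__(self) -> None:
            self.items: list[int] = []

        def push(self, x: int) -> None:
            self.items.append(x)

        def push_all(self, xs: list[int]) -> None:
            # Inherited unchanged by subclasses, but calls push() dynamically.
            for x in xs:
                self.push(x)

    class CountingStack(Stack):
        def __init__(self) -> None:
            super().__init__()
            self.count = 0

        def push(self, x: int) -> None:
            super().push(x)
            self.count += 1

    # push_all() was fully tested in Stack, yet it must be re-tested in
    # CountingStack: through dynamic binding it now also updates count.
    s = CountingStack()
    s.push_all([1, 2, 3])
    assert s.items == [1, 2, 3]
    assert s.count == 3    # new behaviour reached via the inherited method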


Object-oriented systems testing

Easier parts of testing:

  • Modularity
  • Small methods
  • Reuse
  • Interfaces identified earlier

Harder parts of testing:

  • Inheritance
  • Polymorphism
  • Dynamic binding
  • Encapsulation
  • Complex interfaces
  • More integration


Automated testing tools

  • Static Analyzers

    • Code analysis

      • Show some potential program errors and low-quality code (too many branches, complicated structure, bad naming policy); a toy analyzer is sketched below

      • Generate different graphs showing the program structure

      • Data analyzer (initialization, usage, etc.)

  • Dynamic Analyzers

    • Checking a running program

    • Memory leakage

    • Improper use of variables

    • Measuring timing

    • Measuring execution coverage

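As a toy illustration of static analysis, Python's standard ast module can flag functions with too many branch points without ever running the code (the threshold of 3 is arbitrary):

    import ast
    import textwrap

    # Source code to be analyzed - never executed, only parsed.
    SOURCE = textwrap.dedent("""
        def messy(x):
            for i in range(x):
                if i % 2:
                    while i > 0:
                        if i % 3:
                            i -= 1
            return x
    """)

    tree = ast.parse(SOURCE)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            branches = sum(isinstance(n, (ast.If, ast.For, ast.While))
                           for n in ast.walk(node))
            if branches > 3:
                print(f"warning: {node.name}() has {branches} branch points")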


When to stop with testing?

[Figure: the probability of existence of additional faults, plotted against the number of faults found to date. The more faults already found, the higher the probability that still more remain.]


Typical fault causes in each development phase

  • Requirements analysis: incorrect, missing or unclear requirements
  • System design: incorrect or unclear translation of the requirements; incorrect or unclear design specification
  • Program design: incorrect or unclear design specification; misinterpretation of the system design
  • Program implementation: misinterpretation of the program design; incorrect documentation; incorrect syntax or semantics
  • Unit/integration testing: incomplete test procedures; new faults introduced when old ones are corrected
  • System testing: incomplete test procedures
  • Maintenance: incorrect user documentation; poor human factors; new faults introduced when old ones are corrected; changes in requirements


The system-level testing steps (detail of the testing process)

  • Function test: the integrated modules are checked against the system functional requirements, giving a functioning system.
  • Performance test: the functioning system is checked against the other software requirements, giving verified, validated software.
  • Acceptance test: the software is checked against the customer requirements specification, giving an accepted system.
  • Installation test: the accepted system is tested in the user environment - the SYSTEM is then IN USE!


Test documentation

  • TEST PLAN: maps each system test function to the tests that cover it (e.g. Function 1 - test 1; Function 2 - tests 3, 4; ...).
  • TEST SPECIFICATION (one per test): requirements tested, functions tested, methods, conditions.
  • TEST DESCRIPTION (one per test): test data and test procedures (1., 2., ...).
  • Perform the test.
  • TEST ANALYSIS REPORT (one per test): the results.
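A sketch of this document chain as data structures, with invented field names, showing how the test plan links each function to the tests that cover it:

    from dataclasses import dataclass, field

    @dataclass
    class TestSpecification:
        test_id: int
        requirements_tested: list[str]
        functions_tested: list[str]
        methods: str
        conditions: str

    @dataclass
    class TestDescription:
        test_id: int
        test_data: dict
        procedures: list[str]       # step 1, step 2, ...

    @dataclass
    class TestAnalysisReport:
        test_id: int
        results: str

    @dataclass
    class TestPlan:
        # function name -> numbers of the tests that cover it
        coverage: dict[str, list[int]] = field(default_factory=dict)

    plan = TestPlan(coverage={"Function 1": [1], "Function 2": [3, 4]})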


Fault report - an example

FAULT REPORT: S.P0204.6.10.3016
ORIGINATOR: Joe Bloggs
BRIEF TITLE: Exception 1 in dps_c.c line 620 raised by NAS (during database load)
FULL DESCRIPTION: Started NAS endurance and allowed it to run for a few minutes. Disabled the active NAS link (emulator switched to standby link), then re-enabled the disabled link and CDIS exceptioned as above. (I think the re-enabling is a red herring.)
ASSIGNED FOR EVALUATION TO / DATE: 8/7/92
CATEGORISATION: 0 1 2 3 / Design / Spec / Docn
SEND COPIES FOR INFORMATION TO:
CONFIGURATION ID ASSIGNED TO PART: dpo_s.c
EVALUATOR'S COMMENTS: dpo_s.c appears to try to use an invalid CID, instead of rejecting the message. AWJ
ITEMS CHANGED (CONFIGURATION ID / IMPLEMENTOR / REVIEWER / BUILD-ISSUE NUM / INTEGRATOR): dpo_s.c v.10 / AWJ 8/7/92 / MAR 8/7/92 / 6.120 / RA 8-7-92
CLOSED - FAULT CONTROLLER, DATE: 9/7/92


Discrepancy report form

DRF Number: ____    Tester name: ____
Date: ____    Time: ____
Test Number: ____
Script step executed when failure occurred: ____
Description of failure: ____
Activities before occurrence of failure: ____
Expected results: ____
Requirements affected: ____
Effect of failure on test: ____
Effect of failure on system: ____
Severity level: (LOW) 1 2 3 4 5 (HIGH)