COTS Testing - PowerPoint PPT Presentation

Presentation Transcript
Differences from in-house components
  • Interfaces (pre- and post-conditions) are not clearly specified.
  • No architecture or code is available.
  • They are black boxes to the component user.

Why use COTS

Why COTS Testing
  • Failure of Ariane 5: the explosion resulted from insufficiently tested software reused from the Ariane 4 launcher.
Why rigorous evaluation of COTS?
  • Large number of alternative products.
  • Multiple stakeholders.
  • Large number of quality criteria.
  • Compatibility with other products.
Why evaluation is difficult
  • Large number of evaluation criteria.
  • Different opinions are usually encountered among different stakeholders.
  • Evaluation criteria are not easily measurable at evaluation time.
  • Gathering relevant information is prohibitively expensive.
  • The COTS market changes fast, so evaluation must be performed several times during the lifecycle.
  • Evaluation deals with uncertain information.
AHP Technique
  • Originally designed for economic and political science domains.
  • Requires a pairwise comparison of alternatives and pairwise weighting of selection criteria.
  • Enables consistency analysis of comparisons and weights, making it possible to assess the quality of the gathered information.
AHP Technique (contd.)
  • Allows alternatives to be measured on a ratio scale, so we can determine how much better one alternative is than another.
  • Practically usable only if the numbers of alternatives and criteria are sufficiently low, because comparisons are made by experts.
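As a rough illustration of the pairwise-comparison idea, priorities can be derived from a comparison matrix as sketched below. The matrix values and the column-average approximation of the eigenvector are illustrative, not taken from the slides.

```python
# Sketch of deriving AHP priorities from a pairwise comparison matrix.
# Entry m[i][j] states how much better alternative i is than alternative j;
# the values below are made up for illustration.

def ahp_priorities(m):
    """Approximate the principal-eigenvector weights by normalizing each
    column and averaging the rows (a common textbook approximation)."""
    n = len(m)
    col_sums = [sum(m[r][c] for r in range(n)) for c in range(n)]
    normalized = [[m[r][c] / col_sums[c] for c in range(n)] for r in range(n)]
    return [sum(row) / n for row in normalized]

# Three candidate COTS products compared on a single selection criterion:
comparisons = [
    [1.0, 3.0, 5.0],   # product A judged better than B and C
    [1/3, 1.0, 2.0],   # product B
    [1/5, 1/2, 1.0],   # product C
]
weights = ahp_priorities(comparisons)  # weights sum to 1; largest wins
```

Because the ratios are consistent-ish here, the weights come out strictly ordered A > B > C, which is exactly the ratio-scale ranking the slide describes.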
Selection in practice

Follows three stages:

  • Informal screening of a set of requirements using selection thresholds.
  • More systematic evaluation using the AHP process.
  • Detailed information gathering, involving testing, prototyping, and reading technical documents.
How to provide information to the user
  • Component meta-data approach.
  • Retro-components approach.
  • Component test bench approach.
  • Built-in test approach.
  • Component+ approach.
  • STECC strategy.
Component meta-data approach

[Figure: the component packages its binary code together with metadata supplied by the provider, such as call graphs and testing information.]
Component metadata (contd.)

[Figure: the component acts as a server; alongside its functionality, an internal metadata database answers metadata requests with the stored metadata.]
Retro-components approach

[Figure: as in the metadata approach, the component holds an internal metadata database, but requests carry test data as well as metadata queries.]
Component test bench approach
  • A set of test cases, called a test operation, is associated with each interface of a component.
  • A test operation defines the necessary steps for testing a specific method.
  • The concrete test inputs and expected test outputs are packaged in the test operation.
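A minimal sketch of packaging inputs and an expected output per interface method; the `TestOperation` class and the toy `Counter` component are hypothetical names, not from a real test-bench framework.

```python
# Hypothetical sketch: a test operation bundles concrete inputs and the
# expected output for one method of a component's interface.

class TestOperation:
    def __init__(self, method_name, inputs, expected):
        self.method_name = method_name
        self.inputs = inputs
        self.expected = expected

    def run(self, component):
        """Invoke the named method with the packaged inputs and compare."""
        actual = getattr(component, self.method_name)(*self.inputs)
        return actual == self.expected

class Counter:  # toy component under test
    def __init__(self):
        self.value = 0

    def add(self, n):
        self.value += n
        return self.value

op = TestOperation("add", (5,), 5)   # on a fresh Counter, add(5) should be 5
passed = op.run(Counter())
```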
Built-in test approach

[Figure: the component contains, alongside its functionality, a built-in tester and a test case generator.]
Built-in test approach (contd.)

[Figure: the component runs in two modes. In normal mode only the functionality is exposed; in maintenance mode the built-in tester and test case generator are active as well.]

Built-in test approach (contd.)

[Figure: built-in tests are inherited; a derived component obtains the base component's built-in tests through inheritance.]
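The normal/maintenance split of the built-in test approach can be sketched as follows; the class name and mode strings are illustrative, not from a real framework.

```python
# Sketch of a built-in-test component: functionality is always available,
# while the built-in tester only runs in maintenance mode.

class BITComponent:
    def __init__(self):
        self.mode = "normal"

    def add(self, a, b):
        """Ordinary functionality, available in both modes."""
        return a + b

    def self_test(self):
        """Built-in test cases, activated only in maintenance mode."""
        if self.mode != "maintenance":
            raise RuntimeError("self-test requires maintenance mode")
        return self.add(2, 3) == 5 and self.add(-1, 1) == 0

c = BITComponent()
c.mode = "maintenance"   # switch modes before invoking the built-in tests
ok = c.self_test()
```

A derived component would inherit `self_test` along with the functionality, which is the inheritance relationship the figure shows.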

Component+ approach

[Figure: a built-in-testing-enabled component exposes its functionality through an interface; a tester containing a test case generator and a test executor drives it, and a handler provides failure-recovery mechanisms.]
Disadvantages of BIT and Component+
  • Static nature.
  • They generally do not ensure that tests are conducted as required by the component user.
  • The component provider makes assumptions about the requirements of the component user, which again might be wrong or inaccurate.
STECC strategy

[Figure: the component acts as a server; a tester queries it, and an internal metadata database together with a test generator produces metadata and test cases on demand, alongside the functionality.]
Levels of Testing
  • Unit Testing.
  • Integration Testing.
  • System Testing.
Types of testing
  • Functionality Testing.
  • Reliability Testing.
  • Robustness Testing.
  • Performance Testing.
  • Load Testing.
  • Stress Testing.
  • Stability Testing.
  • Security Testing.
Certifying COTS

When considering a candidate component, developers need to ask three key questions:

  • Does component C fill the developer’s needs?
  • Is the quality of component C high enough?
  • What impact will component C have on system S?
Certification Techniques
  • Black-box component testing.
  • System-level fault injection.
  • Operational system testing.
  • Software wrapping.
  • Interface propagation analysis.
Black box Testing
  • To understand the behavior of a component, various inputs are executed and the outputs are analyzed.
  • To catch all types of errors, all possible combinations of input values would have to be executed.
  • To make testing feasible, test cases are selected randomly from the test case space.
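Random selection from the test case space can be sketched as below; the input space and sample size are artificial stand-ins.

```python
# Sketch: sample a manageable subset of the test case space at random,
# since executing every input combination is infeasible.

import random

random.seed(7)                      # fixed seed for a reproducible sample
input_space = range(10_000)         # stand-in for the full test case space
test_cases = random.sample(input_space, k=20)  # 20 distinct random inputs
```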
Black box test reduction using input-output analysis
  • Random testing is not complete.
  • To perform complete functional testing, the number of test cases can be reduced by input-output analysis.
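A toy illustration of the reduction (the component function and value sets are made up): if analysis shows the tested output ignores one of three parameters, combinations over that parameter can be dropped.

```python
# Sketch of input-output analysis reducing a combinatorial test suite.

from itertools import product

def component(a, b, c):       # toy component: output depends on a and b only
    return a + b

values = [0, 1, -1]           # candidate test values per parameter

exhaustive = list(product(values, repeat=3))                        # 3^3 = 27
reduced = [(a, b, values[0]) for a, b in product(values, repeat=2)]  # 3^2 = 9

# the reduced suite still exercises every distinct output of the component
assert {component(*t) for t in reduced} == {component(*t) for t in exhaustive}
```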
How to find I/O relationships
  • By static analysis or execution analysis of the program.
Fault Injection

[Figure: a fault simulation tool sits on the component's interfaces, turning requests into erroneous or malicious input and observing exceptions or missing responses.]
Operational System Testing
  • Complements system-level fault injection.
  • The system is operated with random inputs (valid and invalid).
  • Provides a more accurate assessment of COTS quality.
  • Ensures that a component is a good match for the system.
Software Wrapping

[Figure: an input wrapper checks input before it reaches the component, and an output wrapper checks the component's output.]
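A minimal sketch of wrapping an untrusted routine; `cots_sqrt` and `wrapped_sqrt` are hypothetical names standing in for a COTS call and its wrapper.

```python
# Sketch of software wrapping: the input wrapper screens arguments before
# they reach the COTS routine; the output wrapper sanity-checks the result.

def cots_sqrt(x):
    """Stand-in for an untrusted COTS routine."""
    return x ** 0.5

def wrapped_sqrt(x):
    if not isinstance(x, (int, float)) or x < 0:   # input wrapper
        raise ValueError("input rejected by wrapper")
    result = cots_sqrt(x)
    if result < 0:                                 # output wrapper
        raise RuntimeError("implausible output from component")
    return result
```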
Interface Propagation Analysis

[Figure: a fault injector is placed on the interface between COTS component 1 and COTS component 2.]

  • Modify the input, call the correct method.
  • Call the correct method, modify the output.
  • Call a perturbed function.
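The three perturbation modes listed above can be sketched with a toy stand-in function; the mode names are my own labels, not from any tool.

```python
# Sketch of interface propagation analysis: perturb the call between two
# components in one of three ways and observe the receiver's reaction.

def correct_method(x):          # stand-in for the downstream COTS method
    return 2 * x

def perturbed_call(x, mode):
    if mode == "modify_input":        # modify the input, call the correct method
        return correct_method(-x)
    if mode == "modify_output":       # call the correct method, modify the output
        return -correct_method(x)
    if mode == "perturbed_function":  # call a perturbed function instead
        return 2 * x + 1
    return correct_method(x)          # unperturbed baseline

baseline = perturbed_call(3, "none")          # what the system normally sees
faulty = perturbed_call(3, "modify_output")   # does the receiver cope with this?
```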
Fault Injection is used for
  • Robustness Testing.
  • Error Propagation Analysis.
  • Reliability Testing.
  • Security Testing.
COTS testing for OS failures

[Figure: a wrapper is placed between the COTS component and the operating system.]
Ballista approach
  • Based on the fault injection technique.
  • Test cases are generated using the parameter types of an interface.
  • Independent of internal functionality.
  • Testing is not complete.
Test Value Database
  • Integer data type: 0, 1, -1, MAXINT, -MAXINT, selected powers of two, powers of two minus one, and powers of two plus one.
  • Float data type: 0, 1, -1, +/-DBL_MIN, +/-DBL_MAX, pi, and e.
  • Pointer data type: NULL, -1 (cast to a pointer), a pointer to free’d memory, and pointers to malloc’ed buffers of various powers of two in size.
Test Value Database (contd.)
  • String data type (based on the pointer base type): includes NULL, -1 (cast to a pointer), a pointer to an empty string, a string as large as a virtual memory page, and a string 64K bytes in length.
  • File descriptor (based on the integer base type): includes -1, MAXINT, and various descriptors: to a file open for reading, to a file open for writing, to a file whose offset is set to end of file, to an empty file, and to a file deleted after the file descriptor was assigned.
Test case generation
  • All combinations of values for the parameter types are generated.
  • The number of test cases generated is the product of the sizes of the test value sets for the parameter types.
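Generating all combinations can be sketched with `itertools.product`; the value sets below are small illustrative subsets of the test bases listed above.

```python
# Sketch of Ballista-style test case generation: every combination of the
# per-type test values for a two-parameter interface.

from itertools import product

int_values = [0, 1, -1, 2**31 - 1]          # subset of the integer test base
ptr_values = [None, -1, "empty-string"]     # subset of the pointer test base

test_cases = list(product(int_values, ptr_values))
# 4 integer values x 3 pointer values -> 12 test cases
```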
Error propagation analysis
  • Interface propagation analysis is applied by injecting faults at one component.
  • This is done at the component integration level.
  • A known faulty input is injected into the system using the fault injector.
  • Components affected by this input are observed (how they handle the faulty input).
Middleware
  • An application’s execution and its middleware cannot be divorced in any meaningful way.
  • To predict the performance of an application component, the performance of its middleware should be analyzed.
Performance prediction methodology

An application’s performance prediction is a three-step process:

  • Obtaining technology performance.
  • Analyzing architecture-specific behavioral characteristics.
  • Analyzing application-specific behavioral characteristics.
Architecture behavior

[Figure: identify the application behavior within the architecture.]
Effect of database access through Middleware

[Figure: inside the container, a session bean may access the database directly or through an entity bean.]

  • The performance of the entity bean architecture is less than 50% of the performance of the session-bean-only architecture.
Effect of Server Threads
  • Performance increases from 2 to 32 threads, stabilizes around 32 to 64 threads, and gradually decreases as more threads are added, due to contention.
The Effect of Client Request Load
  • Client response time increases with the concurrent client request rate, due to contention for server threads.
Effect of Database Contention
  • Database contention reduces performance to between 20% and 49%.
Load Testing
  • It is performance testing under various loads.
  • Performance is measured as connections per second (CPS), throughput in bytes per second, and round-trip time (RTT).
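The three metrics can be derived from recorded samples as follows; the sample values and measurement window are illustrative.

```python
# Sketch: compute CPS, throughput, and mean RTT from load-test samples.

samples = [            # (bytes transferred, round-trip time in seconds)
    (2048, 0.12),
    (1024, 0.08),
    (4096, 0.20),
]
duration = 1.5         # wall-clock seconds for the measurement window

cps = len(samples) / duration                             # connections/second
throughput = sum(b for b, _ in samples) / duration        # bytes/second
mean_rtt = sum(rtt for _, rtt in samples) / len(samples)  # seconds
```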
Load Test Application

[Figure: a load test application drives, over Ethernet, a system under test consisting of a web server, an application server, and a database server.]
Testing strategy

Load tests are conducted in three phases:

  • Consumption of server resources as a function of the volume of incoming requests is measured.
  • Response time for sequential requests is measured.
  • Response time under a concurrent client request load is measured.
Security Risks with COTS
  • Component design.
  • Component procurement.
  • Component integration.
  • System maintenance.
Component Design
  • Inadvertently flawed component design.
  • Intentionally flawed component design.
  • Excessive component functionality.
  • Open or widely spread component design.
  • Insufficient or incorrect documentation.
Component Integration
  • Mismatch between product security levels (e.g., UNIX and CORBA security integration).

System Maintenance
  • Insecure updating.
  • Unexpected side effects.
  • Maintenance backdoors.
Risks revealed
  • Trojan horse in client.
  • Information leaking to swap file.
  • DBMS log files.
  • DBMS ordering of records.
Piracy avoidance techniques
  • Hardware and software tokens.
  • Dynamic Decryption of Code.
  • Watermarking.
  • Code Partitioning.
I-BACCI process
  • Decomposing the binary file of the component and filtering out trivial information.
  • Comparing the code sections between the two versions.
  • Identification of glue code functions.
  • Identification of change propagation in other components/the system.
  • Selection of test cases to cover only the affected glue code functions (functions in the firewall).
Methods for understanding
  • Binary reverse engineering.
  • Interface probing.
  • Partial automation of interface probing.
Binary reverse engineering
  • Derives the design structure (call graph, control graph) from binary code.
  • Source code can also be partially extracted using decompilation.
  • Decompiled source code has no comments, and its variable names are not meaningful.
  • Licenses often forbid decompilation back to source code.
Interface probing
  • The system developer designs a set of test cases, executes them, and analyzes the outputs.
  • Done in an iterative manner.
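One iteration of probing might look like the following; the black-box stand-in and its hidden precondition are invented for illustration.

```python
# Sketch of interface probing: execute chosen inputs against a black-box
# component and record the observed behavior to refine assumptions.

def component(x):
    """Black-box stand-in; the probing developer cannot see this body."""
    if x < 0:
        raise ValueError("negative input rejected")
    return x * x

observations = {}
for probe in (-1, 0, 3):
    try:
        observations[probe] = ("ok", component(probe))
    except ValueError as exc:
        observations[probe] = ("rejected", str(exc))

# the observations reveal an undocumented precondition: x must be >= 0
```

The next iteration would pick new probes (e.g., boundary values near the discovered limit) based on what this round revealed.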
Disadvantages
  • A large number of test cases have to be generated and analyzed.
  • Some properties may require significant probing, which can be tedious, labor-intensive, and expensive.
  • Developers may miss certain limitations and make incorrect assumptions.
Partial automation of interface probing
  • Based on interface probing.
  • Test cases are generated based on scenarios.
  • Testing is done in three phases:

Scenario description phase.

Search space specification phase.

Test case generation phase.