Levels of Software Testing - Presentation Transcript

  1. Levels of Software Testing

  2. Test Level (1/3) • Definition: a group of testing activities that are organized and managed together (in one-to-one correspondence with the development phases). • Levels of testing • Unit (component) test • Integration test • System test • Acceptance test

  3. Test Level (2/3) • Features • Independent planning activity (separate planning) • Independent test design, execution, completion (evaluation of exit criteria), reporting and test closure activities • Independent test team (organization) and independent test environment • Distinguishable content at each test level • A general goal (purpose) for each test level • A development work product that is referenced when designing test cases (the test basis) • A test target (the item being tested) • The need for test harnesses (drivers) and tool support • A specific testing approach and responsibilities

  4. The characteristics of each test level

  5. Unit Testing Test Level

  6. Unit Testing • Tests whether an individual software unit functions correctly • Tests each software unit (module, component) in isolation, without considering its connections (interfaces) to other parts • The test target is usually source code, and the main testing technique is white box • Verifies that the unit implements the functionality described in the design document • If necessary, it can be executed by another developer or a third party rather than the unit's own developer

  7. Unit Testing (2) • Objectives • Verifies the functioning of software units that are separately testable • Non-functional characteristics (e.g. memory leaks) • Robustness testing • Verifies error tolerance • Verifies interfaces within the module • Verifies local data and boundary values

  8. Unit Testing (3) • Unit test design techniques • Test cases are derived from work products such as the component specification, the software design or the data model • Control flow test, elementary comparison test (condition/decision coverage), equivalence partitioning, boundary value analysis
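
A minimal sketch of boundary value analysis at the unit level; the unit is_allowed_age() is hypothetical (not from the slides) and simply mirrors the age >= 13 rule used in the later control flow example:

    #include <assert.h>

    /* Hypothetical unit under test: accepts ages of 13 and above. */
    static int is_allowed_age(int age) {
        return age >= 13;
    }

    int main(void) {
        /* Boundary value analysis around the age == 13 boundary:
           values just below, on, and just above the boundary. */
        assert(is_allowed_age(12) == 0);
        assert(is_allowed_age(13) == 1);
        assert(is_allowed_age(14) == 1);
        return 0;
    }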

  9. Unit Testing (4) • Exit criteria for unit testing • The integrator decides whether the entry criteria for integration testing are satisfied • The test coverage planned in the unit test plan has been achieved • The specified test design techniques have been used • Decided through reviews of unit test cases and test results • In general practice, the developer decides

  10. Test Coverage • The extent to which a structure has been exercised by a test suite, expressed as a percentage of the items covered • Used to measure the thoroughness of testing • If coverage is not 100%, more tests may be designed to exercise the items that were missed and thereby increase coverage
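
For example, if a test suite executes 45 of the 50 statements in a unit, statement coverage is 45 / 50 = 90%, and the 5 unexecuted statements show exactly where additional test cases are needed.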

  11. Test Coverage (ctd.) • Classification of coverage • Structure-based: statement coverage, branch coverage, condition coverage, path coverage • Extensions: requirement coverage, functional coverage, entry/exit coverage
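
The following hypothetical function (not from the slides) illustrates the difference between statement and branch coverage:

    #include <assert.h>

    /* Illustrative function used to contrast statement and branch coverage. */
    static int abs_value(int x) {
        int result = x;
        if (x < 0)
            result = -x;      /* reached only when the decision is true */
        return result;
    }

    int main(void) {
        /* x = -5 alone executes every statement (100% statement coverage)
           but takes only the true outcome of the if, so branch coverage is 50%. */
        assert(abs_value(-5) == 5);
        /* Adding x = 5 exercises the false outcome as well,
           bringing branch coverage to 100%. */
        assert(abs_value(5) == 5);
        return 0;
    }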

  12. Design Technique - Control Flow Test
      IF age < 13 yr OR ID already used THEN
          error message
      ELSE IF age < 25 yr AND usage time per day < 3 hr THEN
          fee := 50
      ELSE
          fee := 100
      IF subscription period < 2 mth OR (subscription period <= 5 mth AND unpaid fee >= 200) OR no feedback = Y THEN
          raise fee by 200

  13. Control flow graph (nodes 1-8) for the example above • D1: age < 13 yr OR ID already used • D2: age < 25 yr AND usage time per day < 3 hr • D3: subscription period < 2 mth OR (subscription period <= 5 mth AND unpaid fee >= 200) OR no feedback = Y

  14. Design Technique - Control Flow Test • Test case example – 100% decision coverage? • TC1: 1, 2 • TC2: 1, 3, 4, 7 • TC3: 1, 3, 5, 8 • Combine the input and output actions of each decision point • Additional test cases for depth level 2 • TC4: 1, 3, 4, 8 • TC5: 1, 3, 5, 7

  15. Design Technique - Control Flow Test • Making the test script • TC id • Input actions (1, 2): choose a specific value for each path • Expected output: write the specific action or value used to verify the result
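
As a sketch of how such a test script might become executable, the code below implements the fee rules of slide 12 and runs TC1-TC3 with concrete input values; the function name, parameter list and the -1 "error message" return convention are assumptions:

    #include <stdio.h>

    /* Sketch of the fee rules from slide 12 (names and the -1 error
       convention are assumed, not taken from the slides). */
    int compute_fee(int age, int id_used, int usage_hours,
                    int sub_months, int unpaid_fee, int no_feedback) {
        int fee;
        if (age < 13 || id_used)                      /* D1 */
            return -1;                                /* error message path */
        if (age < 25 && usage_hours < 3)              /* D2 */
            fee = 50;
        else
            fee = 100;
        if (sub_months < 2 ||
            (sub_months <= 5 && unpaid_fee >= 200) ||
            no_feedback)                              /* D3 */
            fee += 200;
        return fee;
    }

    int main(void) {
        /* TC1 (path 1,2):     D1 true                      -> error (-1) */
        printf("TC1: %d\n", compute_fee(12, 0, 1, 6, 0, 0));
        /* TC2 (path 1,3,4,7): D1 false, D2 true,  D3 true  -> 50 + 200   */
        printf("TC2: %d\n", compute_fee(20, 0, 2, 1, 0, 0));
        /* TC3 (path 1,3,5,8): D1 false, D2 false, D3 false -> 100        */
        printf("TC3: %d\n", compute_fee(30, 0, 5, 6, 0, 0));
        return 0;
    }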

  16. CFT: Exercise
      Begin
          int x, y, power;
          float z;
          input(x, y);
          if (y < 0)
              power = -y;
          else
              power = y;
          z = 1;
          while (power != 0) {
              z = z * x;          /* z accumulates x^|y| */
              power = power - 1;
          }
          if (y < 0)
              z = 1 / z;
          output(z);
      End

  17. Integration Testing Test Level

  18. Integration Testing • Designed to verify the interfaces among system components • Integration testing tests interfaces between components and interactions with different parts of a system, such as the operating system, file system and hardware • Component integration testing tests the interactions between software components and is done after component testing • There may be more than one level of integration testing

  19. Approach of Integration Testing • Classification of integration testing approaches • Big bang integration • Bottom-up integration • Top-down integration • Backbone integration • Central, collaboration, layer, C/S integration

  20. Approach of Integration Testing
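
Top-down integration (slide 19) needs a test harness: higher-level components are tested before the lower-level ones exist, so stubs stand in for the missing parts. A minimal sketch, with all names purely illustrative:

    #include <stdio.h>

    /* Stub replacing a lower-level component that is not yet integrated;
       it returns a fixed, known value so the caller can be tested. */
    static double tax_rate_stub(const char *region) {
        (void)region;
        return 0.10;                      /* canned answer for the test */
    }

    /* Higher-level component under test, wired to the stub via a
       function pointer instead of the real tax service. */
    static double price_with_tax(double net, double (*tax_rate)(const char *)) {
        return net * (1.0 + tax_rate("default"));
    }

    int main(void) {
        double total = price_with_tax(100.0, tax_rate_stub);
        printf("total = %.2f (expected 110.00)\n", total);
        return 0;
    }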

  21. System Testing • Verifies the behavior of a whole system/product as defined by the scope of a development project or program • In system testing, the test environment should correspond to the final production environment as much as possible, to minimize the risk of environment-specific failures being missed • System testing investigates both the functional and non-functional requirements of the system • To be effective, system testing should be executed on software from which unit and integration defects have already been removed • Testers need to deal with incomplete or undocumented requirements

  22. System Testing – Test Basis • Requirement specifications • Requirements can be written as text or models • Incomplete requirements are common • Results of risk analysis • Business processes • Use cases • High-level descriptions of system behavior • Interactions with the OS and system resources

  23. System Testing (ctd.) • System testing investigates both the functional and non-functional requirements of the system • Functional requirement tests • Specification-based (black box) techniques, e.g. decision tables • Security testing: to detect threats from outside • Non-functional requirement tests • Structure-based techniques – to assess the thoroughness of testing of the menu structure or web page navigation • Performance testing, usability testing, reliability testing, etc.
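
A minimal sketch of specification-based testing with a decision table; the discount rule and all names are hypothetical, not from the slides. Each rule of the decision table becomes one black box test case:

    #include <assert.h>

    /* Decision table: conditions (member?, large order?) -> action (discount %). */
    struct rule { int member; int big_order; int discount_pct; };

    static const struct rule table[] = {
        {1, 1, 10},   /* member,     large order -> 10% */
        {1, 0,  5},   /* member,     small order ->  5% */
        {0, 1,  5},   /* non-member, large order ->  5% */
        {0, 0,  0},   /* non-member, small order ->  0% */
    };

    /* Hypothetical system behaviour being verified (black box: only
       inputs and expected outputs matter to the tester). */
    static int discount(int member, int big_order) {
        if (member && big_order) return 10;
        if (member || big_order) return 5;
        return 0;
    }

    int main(void) {
        for (unsigned i = 0; i < sizeof table / sizeof table[0]; i++)
            assert(discount(table[i].member, table[i].big_order)
                   == table[i].discount_pct);
        return 0;
    }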

  24. Acceptance Testing Test Level

  25. Acceptance Testing • Acceptance testing is often the responsibility of the customers or users of the SUT; other stakeholders may be involved as well • The customers decide whether to accept the system or program based on the acceptance test results

  26. Acceptance Testing • Goals • To establish confidence in the system's functional and non-functional characteristics (finding defects is not the main focus) • To assess the system's readiness for deployment and use • It is not necessarily the final level of testing • Acceptance testing may occur as more than just a single test level • A COTS software product may be acceptance tested when it is installed or integrated • Usability acceptance of a component may be done during component testing • Acceptance testing of a new functional enhancement may come before system testing

  27. Acceptance Testing • Typical forms of acceptance testing • User acceptance testing • Operational (acceptance) testing • Testing of backup/restore, disaster recovery, user management, maintenance tasks, periodic checks of security vulnerabilities • Contract and regulation acceptance testing • Acceptance criteria should be defined when the contract is agreed • Regulation acceptance testing is performed against any regulations that must be adhered to, such as governmental, legal or safety regulations • Alpha testing & beta testing • Factory acceptance testing (alpha) and site acceptance testing (beta)

  28. Types of Testing

  29. Test Types • Functional testing • Functional testing considers the external behavior of the software (black box testing) • Non-functional testing • Tests "how" the system works, e.g. performance testing, usability testing, etc. (ISO/IEC 9126) • Structural testing • May be based on the architecture of the software or the system • Confirmation testing and regression testing • After modification, to confirm that the original defect has been successfully removed and to discover any defects introduced or uncovered as a result of the change(s)

  30. Functional Testing • Test basis • Process flow model • State transition model • Use case model • Plain language specification • May be performed at all test levels • Specification-based technique

  31. Non-Functional Testing • Testing of software product characteristics • Non-functional testing may be performed at all test levels • These tests measure characteristics of systems and software that can be quantified on a varying scale • Quality model: ISO/IEC 9126 "Software Engineering – Software Product Quality"
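
As a minimal sketch of measuring a characteristic on a scale (here, response time), the test below times an operation and compares it with a budget; the operation and the 0.5-second budget are assumptions:

    #include <stdio.h>
    #include <time.h>

    /* Placeholder for the operation whose performance is being measured. */
    static void operation_under_test(void) {
        volatile long sum = 0;
        for (long i = 0; i < 10000000L; i++)
            sum += i;
    }

    int main(void) {
        clock_t start = clock();
        operation_under_test();
        double seconds = (double)(clock() - start) / CLOCKS_PER_SEC;

        /* A performance test passes or fails against a quantified budget. */
        printf("elapsed: %.3f s (budget: 0.500 s) -> %s\n",
               seconds, seconds <= 0.5 ? "PASS" : "FAIL");
        return 0;
    }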

  32. Structural Testing • Helps measure the thoroughness of testing through assessment of the coverage of a type of structure • Coverage: the extent to which a structure has been exercised by a test suite, expressed as a percentage of the items covered • Available test basis • Control flow model, menu structure model • Can be performed at all test levels • At the system testing level, coverage may be based on the architecture of the system, e.g. the program call tree (calling hierarchy) • Business models or menu structures at the acceptance or system integration testing levels

  33. Regression Testing • After modification, to discover any defects introduced or uncovered as a result of the changes • These defects may be either in the SUT or in another related or unrelated software component • It is performed whenever the software or its environment is changed • The extent of regression testing is based on the risk of not finding defects in software that was working previously (impact analysis) • It may be performed at all test levels and applies to functional, non-functional and structural testing • Regression testing is a strong candidate for automation because it is run many times
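
A sketch of what a small automated regression suite can look like: the same captured cases are re-run unchanged after every modification, and any previously passing case that now fails signals a regression. The unit trim_trailing_spaces() and its cases are hypothetical:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical unit that has been working in earlier releases. */
    static void trim_trailing_spaces(char *s) {
        size_t n = strlen(s);
        while (n > 0 && s[n - 1] == ' ')
            s[--n] = '\0';
    }

    int main(void) {
        /* Input/expected pairs captured while the previous release passed. */
        const char *inputs[]   = { "abc   ", "abc", "   ", "" };
        const char *expected[] = { "abc",    "abc", "",    "" };
        int failures = 0;

        for (unsigned i = 0; i < 4; i++) {
            char buf[16];
            strcpy(buf, inputs[i]);
            trim_trailing_spaces(buf);
            if (strcmp(buf, expected[i]) != 0) {
                printf("case %u failed: got \"%s\"\n", i, buf);
                failures++;
            }
        }
        printf("%d regression failure(s)\n", failures);
        return failures != 0;
    }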

  34. Maintenance Testing • Triggered by modification, migration or retirement of an existing operational software system • Modification • Planned enhancement changes (e.g. release-based), corrective and emergency changes, and changes of environment (e.g. upgrades or patches for the OS/DBMS) • Maintenance testing for migration • Operational testing of the new environment as well as of the changed software • Maintenance testing for retirement of a system • Data migration testing • Data archiving testing

  35. Maintenance Testing • Regression testing is emphasized in maintenance testing, after impact analysis of the changes • The scope of maintenance testing • Related to the change, the size of the existing system and the size of the change • Depending on the changes, testing may be done at any test level and for any test type