
  1. Intro Verification and Validation Processes Introduction Adrian Marshall

  2. Agenda • Introduction • Definitions • V&V • Test objectives • Testing challenges • Testing and the V model • Testing Approaches • Testing levels • Classes of test • Risk based testing • Requirements based testing • Test Methods • Common testing types • Testing tools

  3. Intro • Verification • IEEE 1012: the process of determining whether or not the products of a given phase of the software development lifecycle fulfil the requirements established during the previous phase. • ISO 12207: confirmation by examination and provision of objective evidence that specified requirements have been fulfilled

  4. Intro • Validation • IEEE 1012: the process of evaluating software at the end of the software development process to ensure compliance with software requirements • ISO 12207: confirmation by examination and provision of objective evidence that particular requirements for a specific intended use have been fulfilled

  5. Intro • Put more simply: • We Verify that the output of each software phase meets its requirements, and • We Validate that the software, at the end of the development effort, meets the overall intended use

  6. Types of V & V Activities • Requirements analysis and Traceability analysis • Design analysis • Interface analysis • Implementation evaluation • static - reviews, inspections, structure analysis • dynamic - simulation, prototyping, execution time analysis • formal - mathematical analysis of algorithms • Testing • Project & Management analysis

  7. Sources of guidance on V & V • V & V Standards • IEEE 1012 - Software V & V Plans • IEEE 1059 - Guide for Software V & V Plans • IEEE 1028 - Software Reviews & Audits • IEEE 829 - Software Test Documentation • Related Standards • ISO 12207 - Software Lifecycle Processes • ISO 9126 - Software Quality Characteristics • Text • V & V of Modern Software-Intensive Systems - Schulmeyer & Mackenzie, Prentice Hall, 2000

  8. Pros and Cons of V & V • Positive • early error detection • better product quality • better project planning • better adherence to standards, methods and practices • better decision support information • Cost of detection & prevention < cost of corrective action • Negative • additional time and effort required for V&V activities • additional cost (visible) • Independence of V&V can be hard for small organisations

  9. Defect Introduction by Phase What is known about the quality of software systems? Applied Software Measurement 2nd Edition, by Capers Jones. McGraw-Hill, 1997. ISBN: 0-07-032826-9

  10. Cost of Removing Defects What is known about the quality of software systems? Applied Software Measurement 2nd Edition, by Capers Jones. McGraw-Hill, 1997. ISBN: 0-07-032826-9

  11. Testing Definitions (1) • Testing is the process of executing a program with the intent of finding errors • Testing is an activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results • Testing is the process by which we understand the status of the benefits and the risk associated with release of a software system • Testing includes all activities associated with the planning, preparation, execution, and reporting of tests

  12. Testing Definitions (2) • “Testing cannot guarantee that errors are not present, rather it demonstrates that errors are present”….

  13. Test Objectives • Verifying the implementation of any or all products • Requirements and solution validation • Defect detection • Provide assessment of deployment risks • Provide performance & threshold benchmark data • Establish testing processes, assets, data and skills for on-going testing activities

  14. Testing Challenges • Complete testing is not possible • Testing work is creative and difficult • Testing is costly • Testing is often not seen as a core activity • Testers aim to find and report problems

  15. Testing Challenges • Technical personnel often do not want to become testers, leaving testing to non-technical system users • Testing requires independence • Testing is often a critical path activity • Testing is often trimmed to solve schedule or budget problems….

  16. Testing and the V Model (design activity / testing activity / build activity at each level) • Business requirements level: determine business requirements; review requirements, analyse test requirements; tested at the Acceptance level (test against business requirements, accept system) • Systems requirements level: determine system requirements; review solution, develop Master Test Plan; tested at the System level (install system, test against system requirements) • Design level: design solution; develop Detailed Test Plan/s; tested at the Integration level (integrate components, test integration of components) • Component level: design component solution; tested at the Component level (buy / build components, test components)

  17. Testing Approaches • Bottom up • tests smallest components / sub functions first • test drivers are required • Top down • tests major functional areas from the top down • stubs are used where lower levels are incomplete • Functional thread • process / path-oriented approach which crosses unit boundaries • Combined….
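The driver/stub distinction above can be sketched in code. This is a minimal illustration with invented names (`parse_record`, `total_amount`, and the stub are assumptions, not from the slides), not a real test harness:

```python
# Hypothetical example: scaffolding for bottom-up vs top-down integration.

def parse_record(line):
    """Lowest-level unit: the bottom-up starting point."""
    name, amount = line.split(",")
    return {"name": name.strip(), "amount": float(amount)}

def driver_test_parse_record():
    """Bottom-up: a test driver exercises the smallest unit directly."""
    assert parse_record("alice, 10.5") == {"name": "alice", "amount": 10.5}

def load_records_stub(path):
    """Top-down: a stub with canned data stands in for an incomplete lower level."""
    return [{"name": "stub", "amount": 0.0}]

def total_amount(loader, path):
    """Higher-level function: the top-down starting point."""
    return sum(r["amount"] for r in loader(path))

driver_test_parse_record()
assert total_amount(load_records_stub, "ignored.csv") == 0.0
```

As lower levels are completed, the stub is replaced by the real loader and the same top-level tests are re-run.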

  18. Testing Levels • Unit / Component: testing conducted to verify the implementation of the design for one software element (for example, unit, module, function, class instance, method) or a collection of software elements • Integration: an orderly progression of testing in which software elements, hardware elements, or both are incrementally combined and tested until the entire system has been integrated • System: the process of testing an integrated hardware and software system to verify that the system meets its specified requirements….
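At the unit level, one software element is verified against its design in isolation. A minimal sketch (the leap-year function is a hypothetical unit, not from the slides), using Python's standard `unittest` framework:

```python
import unittest

def leap_year(year):
    """Unit under test: the Gregorian leap-year rule."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTest(unittest.TestCase):
    """Unit test: verifies a single element, with no other components involved."""

    def test_ordinary_leap_year(self):
        self.assertTrue(leap_year(2024))

    def test_century_is_not_leap(self):
        self.assertFalse(leap_year(1900))

    def test_quadricentennial_is_leap(self):
        self.assertTrue(leap_year(2000))
```

Run with `python -m unittest` from the containing directory; integration-level tests would then combine this unit with the components that call it.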

  19. Classes of Test • White (or glass) box testing • designed with knowledge of how the system is constructed • aims to exercise the internal logical structure • statements, decisions, paths & exception handling evaluated • Black box testing • designed without knowledge of how the system is constructed • verifies that functional & performance requirements have been satisfied • focuses on the external behaviour of the system • Grey box testing • designed with some knowledge of how the system is constructed….

  20. White Box Testing • White box testing techniques • Control flow based testing (e.g. decision & statement coverage testing) • Statement coverage – each statement is executed at least once • Decision coverage – each conditional statement is executed at least once each way • Complexity based testing (e.g. McCabe cyclomatic complexity measure) – higher concentration of tests for more complex software • Boundary case and exception handling
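The gap between statement and decision coverage shows up on any `if` without an `else`. An illustrative sketch (the discount function is invented for this example):

```python
def apply_discount(price, is_member):
    # A single branch with no else: one test with is_member=True
    # executes every statement, yet never exercises the False outcome.
    if is_member:
        price = price * 0.9
    return price

# Statement coverage: this one test touches every statement.
assert apply_discount(100, True) == 90.0

# Decision coverage additionally requires the False branch,
# catching bugs that only appear when the discount is skipped.
assert apply_discount(100, False) == 100
```

This is why 100% statement coverage can still leave untested behaviour; decision coverage subsumes it.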

  21. Black Box Testing • Black box testing techniques • Equivalence partitioning • Boundary value analysis • Decision table • Testing from formal specifications • Error guessing • Exploratory testing
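Equivalence partitioning and boundary value analysis can be shown against a hypothetical specification (the eligibility rule and its 18–65 range are assumptions invented for illustration):

```python
def eligible(age):
    """Hypothetical spec: eligible if 18 <= age <= 65."""
    return 18 <= age <= 65

# Equivalence partitioning: one representative value per partition,
# since every value in a partition should behave the same way.
assert eligible(40)         # valid partition
assert not eligible(10)     # below-range partition
assert not eligible(80)     # above-range partition

# Boundary value analysis: test each boundary and its neighbours,
# where off-by-one defects cluster.
assert not eligible(17)
assert eligible(18)
assert eligible(65)
assert not eligible(66)
```

Both techniques are derived purely from the specification, with no knowledge of how `eligible` is implemented.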

  22. Risk-Based Testing • Test areas of risk with more rigor (greater coverage of functionality, and/or code) • Product risks may include: • Performance (capacity, throughput, accuracy, etc.) • Safety • Security (authentication…) • Complexity • Test areas of higher risk first. • Focus on consequences and likelihood.
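One common way to make "consequences and likelihood" operational is a numeric risk score that orders test areas. The areas and scores below are invented for illustration, not from the slides:

```python
# Hypothetical risk register: (consequence, likelihood) on a 1-5 scale.
areas = {
    "payment processing": (5, 4),
    "report formatting":  (2, 3),
    "authentication":     (5, 3),
}

# Score = consequence x likelihood; test the highest-scoring areas first
# and give them the greatest functional/code coverage.
ranked = sorted(areas, key=lambda a: areas[a][0] * areas[a][1], reverse=True)

assert ranked == ["payment processing", "authentication", "report formatting"]
```

The scoring scheme is deliberately simple; real schemes may weight consequence more heavily or add factors such as change frequency.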

  23. Requirements-Based Testing • Systematic requirements based testing ensures complete testing scope is analysed. • Focus areas (examples): • Functionality: Security, Accuracy, Regulatory Compliance, Technical Compliance • Reliability: Data Integrity, Error Handling, Fault Tolerance, Recoverability • Usability: User Friendliness, User Guidance, Adaptability, Clarity of Control, Error Handling, Conciseness, Ease of Learning, Documentation Quality, Ease of Installation • Performance: Throughput, Acceptable Response Time, Data Storage Requirements, Acceptable Memory Capacity, Acceptable Processing Speed • Portability: Portability to Different Hardware Platforms, Compatibility With Different Operating Systems, Conformance, Replaceability, Languages Supported….

  24. Test Methods

  25. Inspections and Reviews • Inspections and reviews require visual examination. • They can be conducted in the early definition phases and hence provide efficient defect rectification. • Can be influenced by the ability of the inspector/reviewer (use checklists to standardise).

  26. Common Testing Types (1) • Acceptance testing / User Acceptance Testing (UAT) Testing a system’s behaviour against the customer’s requirements • Alpha & beta testing Testing by a representative sample of users (internal = alpha, external = beta) • Installation testing Testing a system after installation in the target environment • Performance testing Testing against specified performance requirements (e.g. response time) • Reliability testing Testing of stability, endurance, robustness, and recoverability….

  27. Common Testing Types (2) • Regression testing Re-testing previously run tests to evaluate the impact that a software change may have on unaltered software components • Security testing Testing a system’s ability to prevent unauthorised use or misuse, authentication • Compatibility / Interoperability testing Testing the ability of software to operate and coexist with other (application and system) software and hardware • Stress testing Exercising a system at the maximum design load and beyond • Usability testing Testing a system’s user friendliness, ease of learning, and ease of use….
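A regression test often pins a previously fixed defect so that later changes to surrounding code cannot silently reintroduce it. A minimal hypothetical sketch (the function and its history are invented for illustration):

```python
def normalise(text):
    """Trim and lower-case user input.

    An earlier (hypothetical) version raised AttributeError when
    passed None; the check below is the fix.
    """
    if text is None:
        return ""
    return text.strip().lower()

# Regression tests: re-run after every change, always including
# the exact case that exposed the original defect.
assert normalise("  MiXeD ") == "mixed"
assert normalise(None) == ""   # the previously failing input
```

Kept in the permanent suite, these assertions turn a one-off bug fix into an ongoing guarantee.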

  28. Testing Tools • Test management tools • information repositories • document generators • defect management tools • requirements traceability and test coverage tools • Test execution tools • compilers, debuggers, link loaders • source code analysers (coverage & complexity) • GUI testers • functional record and replay tools / robots • performance / load and stress testing tools • security vulnerability analysis tools….

  29. Requirements • Requirements definition through design • Software Specifications • Requirement/specification reviews

  30. Software Specification “Twelve Requirements Basics for Project Success”, Dr. Ralph R. Young, Northrop Grumman Information Technology Defense Group

  31. Criteria for good requirements “Twelve Requirements Basics for Project Success”, Dr. Ralph R. Young, Northrop Grumman Information Technology Defense Group

  32. Review for testability • Review criteria: • Concise • Complete • Unambiguous • Consistent • Verifiable • Traceable

  33. Review • V & V can provide continuous information about the quality of the system and the development effort • cost of detection & prevention < cost of corrective action • Testing is a process by which we understand the status of the benefits and the risk associated with release of a software system. • There are many testing techniques available for developers and testers. • Risk based testing is used to focus scarce testing resources. • Systematic requirements based testing ensures complete testing scope is analysed. • Automated testing tools may be used to assist test management and execution.
