Testing: Verification and Validation

Presentation Transcript


  1. Testing: Verification and Validation

  2. Definitions • Error • A problem at its point of origin • Example: coding problem found in code inspection • Defect • A problem beyond its point of origin • Example: requirements problem found in design inspection, system failure during deployment

  3. Cost of Defects • According to a NASA analysis: • The cost of finding a defect during system test versus finding it in design: • Dollar cost is 100:1 • Cost in time is 200:1

  4. Cost of Defects • The cost of finding a defect in a requirement is $100; in test, $10,000 • On average, design and code reviews reduce the cost of testing by 50-80%, including the cost of the reviews. Wow!

  5. Cost of Repair [Boehm81] • [Figure: relative cost of repairing a defect by phase: Requirements 1, Design 5, Build 20, Test 50, Maintain 100]

  6. Quality and Profit • Which is better? • Time-to-profit may be improved by more investment in build quality early in the process • [Figure: support cost and revenue over time, comparing time-to-market and time-to-profit for two investment strategies [Fujimura93]]

  7. Product Stability • [Figure: number of defects versus time] • Measure defects (correlated with delivered unit) • System, product, component, etc. • Normalize defects to importance/criticality
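
One way to make "normalize defects to importance/criticality" concrete; the severity categories and weights below are assumptions for illustration, not from the slides:

```python
# Hypothetical criticality weights (assumption for illustration).
SEVERITY_WEIGHTS = {"critical": 10.0, "major": 3.0, "minor": 1.0}

def normalized_defect_rate(defects_by_severity, units_delivered):
    """Weight each defect by criticality and normalize to delivered units."""
    weighted = sum(SEVERITY_WEIGHTS[sev] * n for sev, n in defects_by_severity.items())
    return weighted / units_delivered

# Example: 2 critical, 5 major, 20 minor defects across 1000 delivered units.
print(normalized_defect_rate({"critical": 2, "major": 5, "minor": 20}, 1000))  # 0.055
```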

  8. Definitions • From Webster’s • Verify 1) to confirm or substantiate in law by oath 2) to establish the truth, accuracy, or reality of • Validate 2) to support or corroborate on a sound or authoritative basis <as in experiments designed to confirm an hypothesis>

  9. What’s the difference? • Verification is determining whether the system is built right: correctly translating design into implementation • Validation is determining whether the right system is built: does the implementation meet the requirements?

  10. Verification and Validation • Verification is applied at each transition in the development process • Validation is applied with respect to the results of each phase either for acceptance or process improvement. • Inspection for Verification • Testing for Validation

  11. Verification and Validation • What is the practical difference between verification and validation? • [Figure: development chain from User to Requirements, Architecture, Design, and Product]

  12. Verification and Validation • [Figure: Verification applied at each transition in the chain from User to Requirements, Architecture, Design, and Product]

  13. Verification and Validation • [Figure: as slide 12, adding User Validation of the Product against the User]

  14. Verification and Validation • [Figure: as slide 13, adding Requirements Validation]

  15. Verification and Validation • [Figure: as slide 14, adding Architectural Validation and Design Validation]

  16. Real-Time Systems • Not only do we have the standard software and system concerns… • …but performance is crucial as well • [Figure: real-time system concerns: Functional, Structural, Temporal]

  17. Real-Time Systems • In real-time systems (soft, firm, hard), correctness of function depends upon the ability of the system to be timely • In real-time systems, correct functionality may also depend on: • reliability, robustness, availability, security • If the system cannot meet any of these constraints, it may be defective
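
A minimal sketch of treating timeliness as part of correctness, using pytest; the deadline, the loop function, and its timing are hypothetical:

```python
import time

def run_control_loop_iteration():
    """Stand-in for the real control-loop step (hypothetical)."""
    time.sleep(0.002)  # pretend one iteration takes about 2 ms

def test_control_loop_meets_deadline():
    deadline_s = 0.010  # assumed 10 ms deadline from the timing requirement
    start = time.perf_counter()
    run_control_loop_iteration()
    elapsed = time.perf_counter() - start
    # In a real-time system a late answer is a defect even if it is functionally correct.
    assert elapsed <= deadline_s, f"deadline missed: {elapsed * 1000:.2f} ms"
```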

  18. Requirements Inspections • Biggest potential return on investment • Attributes of good requirement specification: • Unambiguous • Complete • Verifiable • Consistent • Modifiable • Traceable • Usable

  19. Requirements Inspections • Inspection objectives: • Each requirement is consistent with and traceable to prior information • Each requirement is clear, concise, internally consistent, unambiguous, and testable • Are we building the right system?

  20. Design Inspection • Opportunity to catch problems early. • Objectives: • Does the design address all the requirements? • Are all design elements traceable to specific requirements? • Does the design conform to applicable standards? • Are we building the system correctly?

  21. Test Procedure Inspections • Focus on verifying the validation process: does the test validate all the requirements using a formal procedure with predictable results and metrics? • Objectives: • Do validation tests accurately reflect requirements? • Have validation tests taken advantage of knowledge of the design? • Is the system ready for validation testing?

  22. Requirements Validation • Check: • Validity - Does the system provide the functions that best support customer need? • Consistency - Are there any requirements conflicts? • Completeness - Are all required functions included? • Realism - Can requirements be implemented with available resources and technology? • Verifiability - Can requirements be checked?

  23. Testing • Testing is an aspect of Verification and Validation • Testing can verify correct implementation of a design • Testing can validate accomplishment of requirement specifications • Testing is often tightly coupled with implementation (integration and evaluation), but it is also important to production

  24. When to Test • To test, you need something to evaluate • Algorithms • Prototypes • Components/Sub-systems • Functional Implementation • Complete Implementation • Deployed System • Testing can begin as soon as there’s something to test!

  25. Testing Participants [Kossiakoff03] • [Figure: test roles and responsibilities: System Engineering (test requirements and evaluation, test measurements, test planning, test architecture), Test Engineering (test equipment requirements, test conduct and analysis, test equipment), and Component Engineering]

  26. Testing Strategies • White Box (or Glass Box) Testing • Component level testing where internal elements are exposed • Test cases are developed with developers’ knowledge of critical design issues • Functional testing for verification • Black Box Testing • Component level testing where structure of test object is unknown • Test cases are developed using specifications only • Operational testing for validation
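
A small illustration of the two strategies against the same hypothetical component: the black-box test uses only the stated specification, while the white-box test deliberately targets the internal threshold branch:

```python
def classify_speed(speed_kts):
    """Example component (illustrative): maps airspeed to a mode."""
    if speed_kts < 0:
        raise ValueError("speed cannot be negative")
    if speed_kts < 140:
        return "APPROACH"
    return "CRUISE"

# Black-box test: derived from the specification alone
# ("speeds below 140 kts are APPROACH, all other valid speeds are CRUISE").
def test_black_box_spec_behaviour():
    assert classify_speed(100) == "APPROACH"
    assert classify_speed(300) == "CRUISE"

# White-box test: written with knowledge of the internals,
# deliberately exercising the branch at the 140 kts threshold.
def test_white_box_threshold_branch():
    assert classify_speed(139.9) == "APPROACH"
    assert classify_speed(140) == "CRUISE"
```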

  27. Black Box Testing • Positive Tests • Valid test data is derived from specifications • Both high and low probability data is used • Tests reliability • Negative Tests • Invalid test data is derived by violating specifications • Tests robustness of the test object • Both kinds of tests, with high and low probability events, are needed to develop statistical evidence for reliability and robustness
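
A sketch of positive and negative black-box cases for a hypothetical component with a specified valid range; the negative tests expect a controlled rejection rather than an uncontrolled failure:

```python
import pytest

def set_flap_angle(angle_deg):
    """Illustrative component: accepts flap settings of 0 to 40 degrees."""
    if not 0 <= angle_deg <= 40:
        raise ValueError(f"flap angle out of range: {angle_deg}")
    return angle_deg

# Positive test: valid data derived from the specification (tests reliability).
def test_positive_valid_settings():
    for angle in (0, 15, 40):  # common values plus both specification limits
        assert set_flap_angle(angle) == angle

# Negative test: invalid data that violates the specification (tests robustness);
# the component must fail in a controlled, predictable way.
def test_negative_invalid_settings_rejected():
    for angle in (-1, 41, 1000):
        with pytest.raises(ValueError):
            set_flap_angle(angle)
```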

  28. Testing Strategies • Test Envelopes • Given a behavior with 2 parameters we can establish the test envelope • Useful for identifying boundary conditions • [Figure: test envelope in the X-Y plane: a normal region bounded by Min X/Max X and Min Y/Max Y, surrounded by boundary and abnormal regions]

  29. Test Envelopes • Boundary conditions define positive and negative tests • Test cases should include • High probability zones in the normal region • High probability zones in the abnormal region • Low probability zones in the abnormal region if the outcome is catastrophic
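
For example, a parametrized pytest covering the normal region, its boundaries, and the abnormal region of a hypothetical two-parameter envelope (the limits are made up):

```python
import pytest

X_MIN, X_MAX = 0.0, 100.0   # assumed envelope limits for parameter X
Y_MIN, Y_MAX = 10.0, 50.0   # assumed envelope limits for parameter Y

def in_operating_envelope(x, y):
    """Illustrative check: True when (x, y) lies inside the normal region."""
    return X_MIN <= x <= X_MAX and Y_MIN <= y <= Y_MAX

@pytest.mark.parametrize("x, y, expected", [
    (50.0, 30.0, True),    # high-probability point in the normal region
    (0.0, 10.0, True),     # boundary: both parameters at their minima
    (100.0, 50.0, True),   # boundary: both parameters at their maxima
    (150.0, 30.0, False),  # abnormal region on X
    (50.0, 9.9, False),    # abnormal region just outside the Y boundary
])
def test_envelope_boundaries(x, y, expected):
    assert in_operating_envelope(x, y) is expected
```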

  30. Hierarchical Testing • Top-down testing • Developed early during development • High level components are developed • Low level components are “stubbed” • Allows for verification of overall structure of the system (testing the architectural pattern and infrastructure)
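
A minimal sketch of the stubbing idea: the low-level sensor is replaced by a stub with canned responses so the high-level logic can be verified early (all names are illustrative):

```python
class SensorStub:
    """Stub for a low-level component that is not yet implemented."""
    def read_altitude(self):
        return 10000  # canned response so the high-level logic can be exercised

class AltitudeAlerter:
    """High-level component under test (illustrative)."""
    def __init__(self, sensor):
        self.sensor = sensor

    def should_warn(self, floor_ft):
        return self.sensor.read_altitude() < floor_ft

def test_alerter_with_stubbed_sensor():
    alerter = AltitudeAlerter(SensorStub())
    assert alerter.should_warn(12000) is True
    assert alerter.should_warn(5000) is False
```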

  31. Hierarchical Testing • Bottom-up testing • Lowest level components are developed first • Dedicated “test harnesses” are developed to operationally test the low-level components • Good approach for flat, distributed, functionally partitioned systems (pipeline architecture)
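
A small sketch of a dedicated harness driving a low-level component with test vectors; the CRC routine and vectors are illustrative, not from the slides:

```python
def crc8(data: bytes) -> int:
    """Low-level component under test: CRC-8 with polynomial 0x07 (illustrative)."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ 0x07) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def harness(component, test_vectors):
    """Dedicated test harness: drives the component and collects mismatches."""
    failures = []
    for data, expected in test_vectors:
        actual = component(data)
        if actual != expected:
            failures.append((data, expected, actual))
    return failures

if __name__ == "__main__":
    vectors = [(b"", 0x00), (b"\x00", 0x00), (b"123456789", 0xF4)]
    print("failures:", harness(crc8, vectors))
```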

  32. Testing Strategies • Regression Testing • Testing that is done after the system has been modified • Assure that the things the system used to do, and still should do, still function • Assure that any new functionality behaves as specified
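
A toy example of the two kinds of checks after a modification: one test guards existing behavior, the other covers the new behavior (the function is hypothetical):

```python
def format_callsign(callsign):
    """Illustrative function: originally uppercased the callsign;
    a later modification also strips surrounding whitespace."""
    return callsign.strip().upper()

# Regression test: behavior the system already had must still hold
# after the modification.
def test_existing_behaviour_still_works():
    assert format_callsign("ba117") == "BA117"

# New-functionality test: the change itself behaves as specified.
def test_new_whitespace_handling():
    assert format_callsign("  ba117 ") == "BA117"
```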

  33. Testing Complications • When a test discrepancy occurs (the test “fails”) it could be a fault in: • Test equipment (test harness) • Test procedures • Test execution • Test analysis • System under test • Impossible performance requirement • The first step in resolution is to diagnose the source of the test discrepancy

  34. Operational Testing • Validation Techniques • Simulation • Simulate the real-world to provide inputs to the system • Simulate the real world for evaluating the output from the system • Simulate the system itself to evaluate its fitness • Simulation can be expensive
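
A simplified sketch of simulation-driven validation: a simulated environment generates inputs and the test checks the system's response against expected behavior (the climb profile, alert logic, and thresholds are all assumptions):

```python
import math

def simulated_altitude(t):
    """Environment simulation (hypothetical): climb at 1500 ft/min
    with a small sinusoidal disturbance."""
    return 1500.0 * (t / 60.0) + 20.0 * math.sin(t)

def altitude_alert(altitude_ft, target_ft=10000.0):
    """System under test (illustrative): warn when within 500 ft of the target."""
    return abs(target_ft - altitude_ft) <= 500.0

def test_alert_fires_during_simulated_climb():
    # Drive the system with simulated real-world inputs, one sample per second.
    fired = [t for t in range(0, 600) if altitude_alert(simulated_altitude(t))]
    assert fired, "alert never fired during the simulated climb"
    assert min(fired) >= 360  # should not fire before ~9500 ft (around t = 380 s)
```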

  35. Operational Testing • Simulation is a primary tool in real-time systems development • [Figure: avionics example: flight controls (flaps, ailerons, rudder, elevator), cockpit displays, propulsion systems, inertial navigation, radar]

  36. Operational Testing • Avionics integration labs develop an “airplane on a bench”

  37. Operational Testing • Full motion simulators are developed to train aircrews, test the usability of flight control systems, and evaluate human factors

  38. Operational Testing • Radar, INS, offensive, operability, and defensive avionics are tested in anechoic chambers

  39. Operational Testing • Flight control software and systems are installed on “flying test beds” to ensure they work

  40. Operational Testing • The whole system is put together and a flight test program undertaken

  41. Operational Test Plan • Types of tests • Unit Tests – Test a component • Integration Tests – Test a set of components • System Tests – Test an entire system • Acceptance Tests – Have users test system
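
A compact illustration of the unit/integration distinction with pytest; the conversion and route functions are hypothetical stand-ins:

```python
import pytest

def nm_to_km(nm):
    """Unit under test: convert nautical miles to kilometres."""
    return nm * 1.852

def route_distance_km(legs_nm):
    """Higher-level function that integrates the converter with route logic."""
    return sum(nm_to_km(leg) for leg in legs_nm)

# Unit test: exercises one component in isolation.
def test_unit_nm_to_km():
    assert nm_to_km(10) == pytest.approx(18.52)

# Integration test: exercises a set of components working together.
def test_integration_route_distance():
    assert route_distance_km([10, 5, 2.5]) == pytest.approx(32.41)
```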

  42. Operational Test Plan • The Operational Test Plan should identify: • Objectives • Prerequisites • Preparation, Participants, Logistics • Schedule • Tests • Expected Outcomes and Completion • For each specific test detail: • Measurement/Metric, Objective, Procedure
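
One possible (purely illustrative) way to record a single test plan entry so that objective, procedure, metric, and expected outcome stay together:

```python
# Hypothetical entry in an operational test plan; every field value is a placeholder.
flight_test_001 = {
    "objective": "Verify autopilot altitude hold within +/- 50 ft",
    "prerequisites": ["ground checks passed"],
    "participants": ["test pilot", "flight test engineer"],
    "schedule": "TBD",
    "metric": "maximum altitude deviation (ft)",
    "procedure": "Engage altitude hold in cruise for 10 minutes in light turbulence",
    "expected_outcome": "deviation <= 50 ft",
}
```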

  43. Verification and Validation Plans • Test plans, or more rigorous V&V plans, are often left for late in the development process • Many development models do not consider V&V, or focus on testing after the system is implemented • Verifying progress in development is a continuous, parallel process • Start thinking about V&V during requirements specification • How will these requirements be verified in design? Think traceability. • How will the implementation be verified? • What formulation of requirements will clarify system validation?

  44. Review • Testing can verify correct implementation of a design and verify operational performance • Testing can validate accomplishment of requirement specifications • A variety of test strategies must be tailored to the specific application depending upon: • Likely failure modes • Known complexities • Reliability and safety concerns
