
Safety Critical Systems 5 Formal Verification and Testing


Presentation Transcript


  1. Safety Critical Systems 5: Formal Verification and Testing. T 79.5303 Safety Critical Systems

  2. Formal verification/testing in the lifecycle [V-model diagram]: the left leg descends from Requirements Analysis (Requirements Model, Requirements Document) through Systems Analysis & Design (Functional/Architectural Model, Specification Document) and Software Design to Software Implementation & Unit Test; the right leg ascends through Module Integration & Test and System Integration & Test to System Acceptance, each stage checked against test scenarios derived on the way down. A configuration-controlled Knowledge Base grows in understanding until completion of the system: Requirements Documentation, Requirements Traceability, Model Data/Parameters, Test Definition/Vectors.

  3. Verification and validation • Verification is the process of determining that a system or module meets its specification. • Validation is the process of determining that a system is appropriate for its purpose. • Testing is a process used to verify or validate a system or its components.

  4. Prover Formal Verification

  5. Prover

  6. Prover

  7. Prover

  8. Prover

  9. Prover

  10. Prover

  11. Prover

  12. Prover

  13. Prover iLock for Signalling

  14. Testing in the lifecycle [the same V-model diagram as slide 2, here introducing the testing activities: requirements analysis and test scenarios on the left leg, unit test, module integration & test, system integration & test and system acceptance on the right, supported by the configuration-controlled Knowledge Base].

  15. Testing in different stages of the V model. Testing is performed during various stages of system development. Module testing – evaluation of a small function of the hardware/software. System integration testing – investigates the correct interaction of modules. System validation testing – checks that the complete system satisfies its requirements.
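As a concrete illustration of module testing, the sketch below exercises a single hypothetical function (`overspeed_alarm`, invented for this example and not taken from the course material) in isolation using Python's unittest framework; integration and validation testing would then combine such modules and check the complete system against its requirements.

```python
import unittest


def overspeed_alarm(speed_kmh: float, limit_kmh: float) -> bool:
    """Hypothetical module under test: raise an alarm when the speed limit is exceeded."""
    return speed_kmh > limit_kmh


class OverspeedAlarmModuleTest(unittest.TestCase):
    """Module test: exercises one small function in isolation."""

    def test_below_limit_no_alarm(self):
        self.assertFalse(overspeed_alarm(79.0, 80.0))

    def test_above_limit_alarm(self):
        self.assertTrue(overspeed_alarm(80.5, 80.0))

    def test_exactly_at_limit_no_alarm(self):
        # Boundary case: this sketch assumes a specification of "alarm only above the limit".
        self.assertFalse(overspeed_alarm(80.0, 80.0))


if __name__ == "__main__":
    unittest.main()
```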

  16. Forms of Testing. Dynamic testing – execution of the system or component in its natural or simulated environment: functional (test all functions), structural (test signals/test cases derived from the internal structure, glass-box), random (sample the n-dimensional input space). Static testing – reviews, inspections and walkthroughs; static code analysis for software. Modelling – mathematical representation of the behaviour of a system or its environment.
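A minimal sketch of the random form of dynamic testing named above: it samples a 2-dimensional input space and compares a hypothetical implementation against a reference oracle. The function names, input ranges and tolerance are assumptions made for illustration only.

```python
import math
import random


def system_under_test(x: float, y: float) -> float:
    """Hypothetical implementation being tested dynamically."""
    return x * x + y


def oracle(x: float, y: float) -> float:
    """Reference model / specification used to judge each random test case."""
    return x ** 2 + y


def random_test(n_cases: int = 10_000, seed: int = 1) -> int:
    """Random testing: sample the 2-dimensional input space and count failures."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_cases):
        x = rng.uniform(-1e3, 1e3)
        y = rng.uniform(-1e3, 1e3)
        if not math.isclose(system_under_test(x, y), oracle(x, y), rel_tol=1e-9):
            failures += 1
    return failures


if __name__ == "__main__":
    print("failures observed:", random_test())
```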

  17. Testing Methods. Black-box testing – requirements-based; no information about what is inside the system. White-box testing – detailed information about the system design is used to guide testing (open-view, glass-box). Gray-box testing – some visibility of the internal structure, but not detailed design information.

  18. Dynamic testing techniques. Dynamic testing standards: IEC 1508, BCS (British Computer Society), Def STAN 00-55 and DO-178B. Techniques include process simulation, error seeding/guessing, timing and memory tests, performance/stress testing, and probabilistic testing (to obtain values for failure rates).
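For error seeding, one classical estimator (not spelled out on the slide, so treat this as a supplementary sketch) assumes that a test campaign finds the same fraction of deliberately seeded faults as of genuine faults; under that assumption the number of genuine faults still remaining can be estimated as follows.

```python
def estimate_remaining_faults(seeded: int, seeded_found: int, real_found: int) -> float:
    """
    Classical fault-seeding estimate: if testing finds the same fraction of
    seeded faults as of genuine faults, the total number of genuine faults is
    approximately real_found * seeded / seeded_found.
    """
    if seeded_found == 0:
        raise ValueError("No seeded faults found - the estimate is undefined.")
    total_real = real_found * seeded / seeded_found
    return total_real - real_found  # genuine faults estimated to remain after testing


if __name__ == "__main__":
    # Example: 20 faults seeded, 16 of them found, 8 genuine faults found
    # => about 10 genuine faults estimated in total, 2 estimated to remain.
    print(estimate_remaining_faults(seeded=20, seeded_found=16, real_found=8))
```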

  19. Test planning

  Lifecycle Phase | Activity | Safety case
  Requirements | Hazard identification | Analysis results
  Test planning | Identify tests, integrity | Strategy for V/V
  Req/Design/Test | Trace hazards to specs | Risk reduction
  Req/Design | Define specs | Design analysis

  Safety Functional Requirements are the actual safety-related functions which the system, sub-system or item of equipment is required to carry out. (CENELEC)

  20. Simulator testing • Safety-critical standards, e.g. Def STAN 00-55, recommend that if a simulator is used to validate a safety-critical system then the simulator should be properly validated. • In industry, simulators are validated using ad hoc techniques and no guidelines on simulator validation are available.

  21. Simulator testing • Modified lifecycle model which illustrates the importance of environment simulation and helps to define the techniques which should be adopted. • This model expands the conventional ‘V’ model to form a ‘W’ model, where the left hand side represents the development of the product and the right hand side the development of the simulator used to test it. • The ‘W’ lifecycle model defines a similar set of phases for the development of the environment simulator to those used in the development of the product itself. This does not necessarily imply that the amount of effort required in the former is equal to that in the latter.

  22. The ‘W’ Model of the Software Development Lifecycle

  23. Statistical software testing • A type of random testing – inputs => outputs. • Provides a quantifiable metric of software integrity: probability of failure and reliability figures. • A proper environment simulation is needed. • A statistical method is needed to produce an estimate of the probability of failure and a measure of the confidence in that estimate.
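As a minimal sketch of that statistical reasoning, assuming statistically independent test demands drawn from the operational input profile and no observed failures, the upper bound on the per-demand failure probability after n failure-free demands follows from (1 - p)^n = 1 - confidence:

```python
def failure_probability_upper_bound(n_successful: int, confidence: float) -> float:
    """
    Upper bound on the per-demand failure probability after n successful,
    statistically independent test demands, solving (1 - p)^n = 1 - confidence.
    """
    return 1.0 - (1.0 - confidence) ** (1.0 / n_successful)


if __name__ == "__main__":
    # With 5000 failure-free demands we can claim, at 99% confidence,
    # a failure probability of roughly 1e-3 per demand.
    print(failure_probability_upper_bound(5000, 0.99))
```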

  24. Safety Case / Lifecycle 1

  25. Safety Case / Lifecycle 2

  26. Test plan / activities

  27. Definitions of Testability • The degree to which a system or component facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met. • The effort required to apply a given testing strategy to a system. • The ease with which faults in a system can be made to reveal themselves during testing.

  28. Enough Testing “How much testing do I need to do to prove that my system is safe?” • An industrial project produced results that included situations where failures were observed during testing. For example, a 99% confidence that the probability of failure on a test demand is smaller than 0.001 requires about 5000 demands, all of which are successful. • Safety-critical system testing starts when normal industrial testing procedures have passed without a single failure.
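The figure quoted above can be reproduced by inverting the relation used on slide 23: the number of consecutive failure-free demands needed for confidence C that the probability of failure per demand is below p follows from (1 - p)^n <= 1 - C, i.e. n = ln(1 - C) / ln(1 - p). A small sketch, again assuming independent demands and no observed failures:

```python
import math


def demands_required(target_pfd: float, confidence: float) -> int:
    """
    Number of consecutive failure-free test demands needed to claim, at the
    given confidence, that the probability of failure per demand is below
    target_pfd. Derived from (1 - target_pfd)^n <= 1 - confidence.
    """
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - target_pfd))


if __name__ == "__main__":
    # About 4600 failure-free demands for 99% confidence that pfd < 0.001,
    # in line with the "about 5000 demands" figure quoted above.
    print(demands_required(0.001, 0.99))
```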

  29. Testing Home assignment: 12.7 Describe the characteristics of the three major categories of dynamic testing and give examples of techniques that fall within each group. State whether each group corresponds to a black-box or a white-box approach. Please email to herttua@uic.asso.fr by 24 April 2008. References: KnowGravity, I-Logix, Contesse project, Prover
