
A Framework for Computer-aided Validation



  1. A Framework for Computer-aided Validation
Presented by Bret Michael
Joint work with Doron Drusinsky and Man-Tak Shing
Naval Postgraduate School, Monterey, CA
NASA IV&V Facility Workshop on Validation, Morgantown, WV

  2. Disclaimer
• The views and conclusions in this talk are those of the author and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the U.S. Government

  3. Conventional Approach to Conducting IV&V
• Relies on
  • Manual examination of software requirements and design artifacts
  • Manual and tool-based code analysis
  • Systematic or random independent testing of target code
• Poses seemingly insurmountable challenges
  • Most of these techniques are ineffective for validating the correctness of the developer's cognitive understanding of the requirements
  • For complex software-intensive systems, manual IV&V techniques are inadequate for locating subtle errors in the software
  • For example, sequencing behaviors that are observable only at runtime, and at such a fine granularity of time, make human intervention at runtime impractical

  4. Software Automation
• Holds the key to the validation and verification of the behaviors of complex software-intensive systems
• Relies on formal specification of system behaviors
• Requires breaking from time-honored rules of thumb about how to conduct IV&V
• Enables IV&V teams to
  • Accelerate their productivity
  • Cope with the impacts of accelerating technological change, or what Alan Greenspan refers to as the "revolution in information technology"

  5. IEEE Definitions
• Validation
  • "The process of evaluating a system or component during or at the end of the development process to determine whether a system or component satisfies specified requirements"
• Verification
  • "The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase"

  6. Current IEEE Standards View of Validation and Verification (V&V)
• Checking the
  • Correctness of a target system or component against a formal model that is derived from the natural-language requirements
  • Consistency and completeness of the formal models, without ensuring that the developer understands the requirements and that the formal models correctly match the developer's cognitive intent of the requirements

  7. IV&V Team's Independent Requirements Effort
• Describe the necessary attributes, characteristics, and qualities of any system developed to solve the problem and satisfy the intended use and user needs
• Ensure that its cognitive understanding of the problem, and of the requirements for any system solving the problem, is correct before performing IV&V on developer-produced systems

  8. Proposed Framework
• Incorporates advanced computer-aided validation techniques into the IV&V of software systems
• Allows the IV&V team to capture, via an executable system reference model, both
  • Its own understanding of the problem
  • The expected behavior of any proposed system for solving the problem

  9. Terminology as Used in the Framework
• Developer-generated requirements
  • The requirements artifacts produced by the developer of a system
• System reference model (SRM)
  • The artifacts developed by the IV&V team's own requirements effort

  10. Contents of an SRM
• Use cases and UML artifacts
• Formal assertions to describe precisely the behaviors necessary to satisfy system goals (i.e., to solve the problem) with respect to
  • What the system should do
  • What the system should not do
  • How the system should respond under non-nominal circumstances

  11. Prerequisites for Using Computer-Based V&V Technology
• Development of formal, executable representations of a system's properties, expressed as a set of desired system behaviors

  12. Classes of System Behaviors
• Logical behavior
  • Describes the cause and effect of a computation, typically represented as the functional requirements of a system
• Sequencing behavior
  • Describes behaviors that consist of sequences of events, conditions and constraints on data values, and timing
  • In its vanilla form, specifies sets of legal (or illegal) sequences
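As an illustration of a sequencing behavior, the legal orderings of events can be captured by a small acceptor over the event alphabet. The following hand-coded Java sketch (not StateRover output) checks a hypothetical rule invented for this example: a track must be opened before it is updated or closed, and may not be opened twice.

```java
import java.util.List;

// A minimal sketch of a sequencing-behavior assertion: a small state
// machine that accepts only legal event orderings. The events and the
// rule ("a track must be opened before it is updated or closed") are
// hypothetical, chosen only to illustrate the idea.
public class SequenceMonitor {
    public enum Event { OPEN, UPDATE, CLOSE }

    // Returns true iff the trace is a prefix of a legal sequence
    // of the form OPEN (UPDATE)* CLOSE, possibly repeated.
    public static boolean accepts(List<Event> trace) {
        boolean open = false;                 // current track state
        for (Event e : trace) {
            switch (e) {
                case OPEN:
                    if (open) return false;   // double open is illegal
                    open = true;
                    break;
                case UPDATE:
                    if (!open) return false;  // update before open
                    break;
                case CLOSE:
                    if (!open) return false;  // close before open
                    open = false;
                    break;
            }
        }
        return true;                          // prefix-closed acceptance
    }
}
```

An "illegal sequence" specification is the same machine with the verdict inverted; the acceptor form makes the set of legal sequences explicit and mechanically checkable.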

  13. Beyond Pure Sequencing
• Timing constraints
  • Describe the timely start and/or termination of successful computations at a specific point in time
  • Example: the deadline of a periodic computation or the maximum response time of an event handler
• Time-series constraints
  • Describe the timely execution of a sequence of data values within a specific duration of time
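A maximum-response-time constraint of the kind named above can be sketched as a monitor that timestamps each request and flags any response that arrives too late. The event names (`request`/`response`) and the deadline value are illustrative assumptions; timestamps are supplied explicitly, as they would be from an instrumented event log, so the check is deterministic.

```java
// A sketch of a timing-constraint monitor: every request must be
// followed by a response within a fixed deadline. The API and the
// single-outstanding-request simplification are assumptions made
// for this example only.
public class DeadlineMonitor {
    private final long deadlineMillis;
    private long pendingRequestTime = -1;   // -1 means no request pending
    private boolean violated = false;

    public DeadlineMonitor(long deadlineMillis) {
        this.deadlineMillis = deadlineMillis;
    }

    // Record the arrival time of a request.
    public void request(long timeMillis) {
        pendingRequestTime = timeMillis;
    }

    // Record the matching response; flag a violation if it is late.
    public void response(long timeMillis) {
        if (pendingRequestTime >= 0
                && timeMillis - pendingRequestTime > deadlineMillis) {
            violated = true;                // deadline missed
        }
        pendingRequestTime = -1;
    }

    public boolean ok() { return !violated; }
}
```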

  14. Use Cases and UML Artifacts of the SRM

  15. Categories of Formal Specifications of Behavior
• Assertion-oriented specifications
  • High-level requirements are decomposed into more precise requirements that are mapped one-to-one to formal assertions
• Model-oriented specifications
  • A single monolithic formal model (either a state-based or an algebraic system) captures the combined expected behavior described by the lower-level specifications of behavior
  • Describes the expected behavior of a conceptualized system from the IV&V team's understanding of the problem space
  • May differ significantly from the system design models created by the developers in their design space

  16. Example of Conducting Assertion-oriented Specification
• Start with a high-level requirement
  • R1. The track processing system can only handle a workload not exceeding 80% of its maximum load capacity at runtime
• Reify R1 into a lower-level requirement
  • R1.1. Whenever the average arrival rate (ART) of tracks exceeds 80% of MAX_COUNT_PER_MIN, the ART must be reduced back to 50% of MAX_COUNT_PER_MIN within 2 minutes, and the ART must remain below 60% of MAX_COUNT_PER_MIN for at least 10 minutes

  17. Continuation of Example
• Map R1.1 to a formal assertion expressed as a Statechart assertion
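Since the Statechart diagram itself cannot be reproduced in a transcript, the state logic that such an assertion encodes can be sketched by hand. The monitor below is not StateRover output; it assumes one arrival-rate sample per minute and uses MAX_COUNT_PER_MIN from the slide, with all other names invented for the sketch.

```java
// A hand-coded sketch of assertion R1.1 as a three-state monitor, fed
// one average-arrival-rate (ART) sample per minute. The one-sample-
// per-minute discretization is a simplifying assumption.
public class R11Monitor {
    private enum State { NORMAL, RECOVERING, HOLDING }

    private final double max;              // MAX_COUNT_PER_MIN
    private State state = State.NORMAL;
    private int minutesInState = 0;
    private boolean violated = false;

    public R11Monitor(double maxCountPerMin) { this.max = maxCountPerMin; }

    // Feed the ART observed during the last minute.
    public void sample(double art) {
        if (violated) return;
        minutesInState++;
        switch (state) {
            case NORMAL:                   // watch for the 80% threshold
                if (art > 0.8 * max) { state = State.RECOVERING; minutesInState = 0; }
                break;
            case RECOVERING:               // must reach 50% within 2 minutes
                if (art <= 0.5 * max)      { state = State.HOLDING; minutesInState = 0; }
                else if (minutesInState >= 2) violated = true;
                break;
            case HOLDING:                  // must stay below 60% for 10 minutes
                if (art >= 0.6 * max) violated = true;
                else if (minutesInState >= 10) { state = State.NORMAL; minutesInState = 0; }
                break;
        }
    }

    public boolean ok() { return !violated; }
}
```

The point of the graphical Statechart form is that exactly this kind of state-and-timer logic is drawn rather than hand-coded, and the drawing is executable.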

  18. Advantages of Using an Assertion-Oriented Specification Approach
• Requirements are traceable because they are represented, one-to-one, by assertions (acting as watchdogs for the requirements)
  • In contrast, a monolithic model is the sum of all concerns: on detecting a violation of the formal specification, it is difficult to map that violation to a specific human-driven requirement
• Assertion-oriented specifications have a lower maintenance cost than their model-oriented counterpart when requirements change (i.e., the model is easier to adjust)

  19. Continuation of Advantages
• Assertions can be constructed to represent illegal behaviors, whereas the monolithic model typically represents only "good behavior"
• It is much easier to trace the expected and actual behaviors of the target system to the required behaviors in the requirements space, and the formal assertions can be used directly as input to the verifiers in the verification dimension

  20. Continuation of Advantages
• The conjunction of all the assertions becomes a "single" formal model of a conceptualized system from the requirements space
  • Can be used to check for inconsistencies and other gaps in the specifications with the help of computer-aided tools

  21. Validation of Formal Assertions
• Formal assertions must be executable to allow the modelers to visualize the true meaning of the assertions via scenario simulations
• One way to do this is to use an iterative process that allows the modeler to
  • Write formal specifications using Statechart assertions
  • Validate the correctness of the assertions via simulated test scenarios within the JUnit test framework
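Scenario-based validation of an assertion can be sketched as follows, in the spirit of the JUnit tests the slide mentions (plain `assert`-style checks are used here so the example stays dependency-free). The assertion under validation, a hypothetical "no more than 3 retries in a row" rule, and all names are invented for this sketch; the point is that each scenario encodes the modeler's expectation, and a mismatch reveals that the assertion does not mean what the modeler thought.

```java
// A sketch of scenario-based validation: the modeler runs hand-written
// scenarios against an executable assertion and compares the verdict
// with the intended meaning. Everything here is hypothetical.
public class AssertionScenarioTest {
    // The executable assertion being validated.
    static class MaxRetriesAssertion {
        private int consecutiveRetries = 0;
        private boolean failed = false;
        void retry()    { if (++consecutiveRetries > 3) failed = true; }
        void success()  { consecutiveRetries = 0; }
        boolean holds() { return !failed; }
    }

    // Scenario 1: the modeler expects three retries in a row to be legal.
    static boolean legalScenario() {
        MaxRetriesAssertion a = new MaxRetriesAssertion();
        a.retry(); a.retry(); a.retry(); a.success();
        return a.holds();
    }

    // Scenario 2: a fourth consecutive retry should violate the assertion.
    static boolean illegalScenario() {
        MaxRetriesAssertion a = new MaxRetriesAssertion();
        a.retry(); a.retry(); a.retry(); a.retry();
        return a.holds();
    }
}
```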

  22. Validation of a Statechart Assertion via Scenario-based Testing

  23. Process for Validating Assertions (Utilizing the Executable SRM)
• Start by testing individual assertions using the scenario-based test cases, to validate the correctness of the logical and temporal meaning of the assertions
• Next, test the assertions using the scenario-based test cases subjected to the constraints imposed by the objects in the SRM conceptual model
• Then use an automated tool to exercise all assertions together, to detect any conflicts in the formal specification

  24. A process for formal specification and computer-aided validation

  25. Runtime Verification (RV)
• Uses executable SRMs
• Monitors the runtime execution of a system and checks the observed runtime behavior against the system's formal specification
  • Serves as an automated observer of the program's behavior, comparing it with the expected behavior per the formal specification
• Requires that the software artifacts produced by the developer be instrumented
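The instrumentation requirement can be sketched concretely: each significant action of the system under test also reports an event to an observer, which updates its verdict as the trace unfolds. The SUT, the "no command before login" property, and all hook names below are illustrative assumptions, not part of the framework's actual API.

```java
// A sketch of runtime verification: the SUT is instrumented so that
// every significant event is also reported to an observer, which
// checks the observed trace against a formal property. The property
// and the Session SUT are hypothetical examples.
public class RvExample {
    interface Observer { void onEvent(String event); }

    // Observer encoding the property "no COMMAND before LOGIN".
    static class LoginFirstObserver implements Observer {
        boolean loggedIn = false;
        boolean violated = false;
        public void onEvent(String event) {
            if (event.equals("LOGIN")) loggedIn = true;
            else if (event.equals("COMMAND") && !loggedIn) violated = true;
        }
    }

    // Instrumented system under test: each public method emits its event
    // to the observer before doing its real work.
    static class Session {
        private final Observer observer;
        Session(Observer o) { observer = o; }
        void login()   { observer.onEvent("LOGIN");   /* real login logic */ }
        void command() { observer.onEvent("COMMAND"); /* real command logic */ }
    }
}
```

In practice the instrumentation is injected by a tool rather than written by hand, but the division of labor is the same: the SUT emits events, the assertion observes and judges.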

  26. Execution-based Model Checking (EMC)
• Can be used if state-based design models are available
• A combination of RV and automatic test generation (ATG)
  • Large volumes of automatically generated tests exercise the program or system under test (SUT), with RV on the other end checking the SUT's conformance to the formal specification
• Examples of ATG tools that can be used in combination with RV to conduct EMC
  • StateRover's white-box automatic test generator (WBATG)
  • NASA's Java PathFinder (JPF)
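The ATG-plus-RV combination can be sketched with the crudest possible generator, random event sequences, which is far weaker than the white-box generation described on the later slides but shows the loop: generate a run, execute it against the SUT, and let the observer check the property at every step. The buggy counter SUT and the "value is never negative" property are invented for this sketch.

```java
import java.util.Random;

// A sketch of execution-based model checking: an automatic test
// generator fires event sequences at the SUT while a runtime-
// verification check is applied after every step. The SUT is a
// deliberately buggy counter; everything here is hypothetical.
public class EmcSketch {
    // Buggy SUT: decrement() fails to guard against going below zero.
    static class Counter {
        int value = 0;
        void increment() { value++; }
        void decrement() { value--; }   // bug: no lower bound
    }

    // Generate `runs` random event sequences of up to `maxLen` steps,
    // checking the property "value is never negative" after each step.
    // Returns the step count of the first violating run, or -1 if none.
    static int search(long seed, int runs, int maxLen) {
        Random rnd = new Random(seed);  // seeded for reproducible runs
        for (int r = 0; r < runs; r++) {
            Counter sut = new Counter();
            for (int step = 0; step < maxLen; step++) {
                if (rnd.nextBoolean()) sut.increment();
                else sut.decrement();
                if (sut.value < 0) return step + 1;  // property violated
            }
        }
        return -1;
    }
}
```

Tools like WBATG improve on the random generator by firing only events the SUT or some assertion is sensitive to, which is what makes the search tractable for realistic statecharts.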

  27. Execution-based Model Checking of State-Based Design Models

  28. Three Ways in Which to Use the Auto-generated Tests
• To search for severe programming errors of the kind that induces a JUnit error status, such as a NullPointerException
• To identify test cases that violate temporal assertions
• To identify input sequences that lead the statechart under test to particular states of interest

  29. Example
• A StateRover-generated WBTestCase creates sequences of events and conditions for the statechart under test
• It generates only sequences consisting of events that the SUT or some assertion is sensitive to: it repeatedly observes all events that potentially affect the SUT in its current configuration state, selects one of those events, and fires the SUT using that event

  30. Hybrid Model- and Specification-based WBATG
• StateRover's WBTestCase auto-generates
  • Events
  • Time-advance increments, for the correct generation of timeoutFire events
  • External data objects of the types that the statechart prototype refers to
• The WBATG observes all entities, namely the SUT and all embedded assertions
  • It collects all possible events from all of those entities

  31. Verification of Target Code
• If only executable code is available, the IV&V team can use the StateRover white-box tester in tandem with the executable assertions of the SRM to automate the testing of the target code produced by the developer
• The executable assertions of the SRM
  • Keep track of the set of possible next events to drive the SUT
  • Serve as the observer for the RV during the test

  32. Automated testing using the system reference model

  33. Manual Examination of the Developer-Generated Requirements
• The IV&V team can use the SRM to validate the textual descriptions of the requirements produced by the developer
• Start by associating the developer-generated requirements with the use cases, to obtain the context for assessing the requirements
• Next, trace the developer-generated requirements to the other artifacts; for example, trace the requirements to the
  • Activity and sequence diagrams, to help identify the subsystems or components responsible for the system requirements
  • Domain model, to identify the correct naming of the objects and events
• Then use the traces to identify the critical components of the target system for more thorough testing

  34. Recap
• The IV&V team needs to capture its own understanding of the problem to be solved, and the expected behavior of any system for solving the problem, using SRMs
• Complex system sequencing behaviors can mainly be understood, and their formal specifications most effectively validated, via execution-based techniques
• We advocate the use of assertion-oriented specification
• We presented a framework for incorporating computer-aided validation into the IV&V of complex reactive systems
• We described how the SRM can be used to automate the testing of the software artifacts produced by the developer of the system

  35. Challenge for NASA's Software Engineering Community
• Taking the proposed validation framework from being exotic to being ubiquitous while harnessing
  • "Creative destruction," coined by the late Joseph Schumpeter
    • Reallocate resources to new, productive business practices (the antithesis of catering to the human need for stability and permanence)
  • "Disruptive innovation," coined by Clayton Christensen
    • Cause a technological innovation, product, or service to overturn the existing dominant technology or status-quo product in the market
