
Assertion-Based Requirements Validation


Presentation Transcript


  1. Assertion-Based Requirements Validation September 8, 2008 Patrick Theeke (PMP, CSQE) John Ryan cmg@ivv.nasa.gov

  2. Assertion-Based Requirements Validation: Objectives • Describe the process of using executable models for requirements validation • Discuss experiences and characterize outcomes from a pilot project employing the technique • Identify lessons learned to date and possible next steps

  3. Assertion-Based Requirements Validation: Agenda • Briefly describe Model-Based Requirements Validation • Define the process for Assertion-Based Requirements Validation • Present experiences of the pilot project • Present observations • Discuss additional details via panel

  4. Requirements Validation: Goals of Requirements Validation • Assure that the documented requirements fully specify the capabilities and characteristics necessary for the system being defined to meet its goals • In particular, NASA IV&V is interested in critical capabilities • Requirements defects include • Missing • Superfluous • Ambiguous • Incomplete • Inconsistent • Unverifiable • Incorrect

  5. Requirements Validation: Validation Approach • Build a standard of reference for comparison against the documented requirements of the system of interest • This standard is the System Reference Model (SRM), which represents IV&V's understanding of the system of interest [Diagram: software requirements are correlated against SRM behaviors to determine whether the SRM is correct and whether the requirements are in scope, valid, complete, correct, consistent, unambiguous, and verifiable]

  6. Requirements Validation: SRM Components • Behaviors • Use cases and UML diagrams (e.g., activity diagrams) • Structure • Placeholder for named physical and logical components of the system • Identify component roles and events important to understanding the system • Independent Modeling Assertions (IMA) • Formalized IMAs extracted from the other SRM components • Collection of unit tests to validate the IMAs • System Scenarios • Scenarios that describe various sequences of events • Nominal sequences (what the system is supposed to do) • Unwanted behaviors (what the system is not supposed to do) • Detection and response to adverse conditions (how the system responds to adverse conditions)

  7. Assertion-Based Requirements Validation Pilot: Assertion Statechart Definition • Each statechart assertion is a formal specification of a single requirement • A statechart assertion is fundamentally a monitoring device that observes system behavior and determines whether that behavior is valid • Observed behavior is valid when it matches the behavior specification coded into the assertion, and invalid when it violates the specification • An assertion is run against observable behavior, typically supplied by some executable artifact running under a test scenario
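
The deck does not show what such a monitor looks like in code. Purely as an illustration of the idea, with a made-up requirement and made-up names (nothing here is taken from the pilot), a statechart-assertion-style monitor can be sketched in Java as a small state machine that consumes events and reports whether the observed behavior still satisfies its specification:

  // Illustrative sketch only (hypothetical requirement and names): after a command is
  // observed, an acknowledgment must follow within MAX_TICKS time units.
  public class AckWithinDeadlineAssertion {
      private enum State { IDLE, AWAITING_ACK, FAILED }

      private static final int MAX_TICKS = 3;   // assumed deadline, for the example only
      private State state = State.IDLE;
      private int ticksSinceCommand = 0;

      // Events supplied by an executable artifact running under a test scenario.
      public void commandIssued() { state = State.AWAITING_ACK; ticksSinceCommand = 0; }

      public void ackReceived() { if (state == State.AWAITING_ACK) state = State.IDLE; }

      public void tick() {
          if (state == State.AWAITING_ACK && ++ticksSinceCommand > MAX_TICKS) {
              state = State.FAILED;   // observed behavior no longer matches the specification
          }
      }

      // Valid as long as the observed behavior has not violated the coded specification.
      public boolean isSuccess() { return state != State.FAILED; }
  }

A test scenario drives events such as commandIssued(), tick(), and ackReceived() into the monitor and then checks isSuccess(), which is the same usage pattern the pilot's JUnit tests follow later in this transcript.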

  8. Requirements Validation: Correlation-Based Requirements Validation • Capture an independent understanding of system behavior or characteristics in an SRM • Correlate project requirements with SRM behaviors • Analyze results to identify • Ambiguous, incorrect, incomplete, inconsistent, or non-verifiable requirements that correspond to SRM behaviors • Requirements without matching SRM behavior • SRM behavior without matching requirements [Process diagram: the SRM and the project requirements feed a correlation step, and the correlation results are analyzed to produce findings]

  9. Assertion-Based Requirements Validation: Motivation • Formalization of natural language requirements reveals errors in the specification • Leveraging formal assertions for requirements validation increases the utility of formalized requirements • Formalization of requirements is a necessary step in leveraging the SRM for verification • Utilizing assertions more closely mimics the highly parallel threads of execution of real-time, reactive systems, including timing and synchronization, allowing more in-depth examination of complex behavioral interactions • Discovering requirements and representing them as assertions is more readily done early in the modeling process while the SRM is under development, so we might as well use them!

  10. Assertion-Based Requirements Validation: Dependencies • Assertion-based validation depends on: • The SRM being built to the appropriate level of depth and detail • The requirements being traced to the SRM • The set of system scenarios providing appropriate coverage of the system behaviors • The naming convention for assertion events matching the events of the formalized unit tests, system scenarios, and assertion statecharts

  11. Assertion-Based Requirements Validation Process: Overview • Formally capture the developer's documented requirements as Project-Based Assertions (PBAs) • Exercise PBAs and IMAs using validated system scenarios • Analyze results [Process diagram: from the SRM, develop and validate IMAs, develop system scenarios, and develop and validate PBAs from the project requirements; execute the scenarios in the context of the PBAs, then analyze the results together with the correlations with the SRM to produce findings]

  12. Assertion-Based Requirements Validation Process: Develop and Validate Project-Based Assertions (Sub-steps: Identify relevant documented requirements, Interpret requirements, Formalize requirements as PBAs, Validate PBAs) • Translate the documented requirements from the development project into a formal representation that is conducive to comparison with the SRM • Requires detailed understanding of the requirements • Exposes errors in the requirements specifications

  13. Assertion-Based Requirements Validation Process: Execute Scenarios in Context of PBAs (Sub-steps: Include PBAs in the executable environment, Execute system scenarios, Collect results for further analysis) • We are interested in how the PBAs react to the system scenarios • Which system scenarios ended as expected? • Which system scenarios ended differently than expected? • Which PBAs were not exercised?
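
The deck does not show the harness that gathers these outcomes. As a rough sketch only, with every type and method name invented for illustration (none of them come from the pilot), the bookkeeping behind this step could look like the following:

  import java.util.ArrayList;
  import java.util.HashSet;
  import java.util.List;
  import java.util.Set;

  // Hypothetical sketch: run each system scenario against the set of assertions and
  // record which scenarios ended differently than expected and which assertions were
  // never exercised, so the analysis step has something concrete to work from.
  public class ScenarioHarness {

      public interface Assertion {
          String id();
          boolean isSuccess();      // has the coded specification been violated so far?
          boolean wasExercised();   // did any event of the last scenario reach this assertion?
          void reset();
      }

      public interface Scenario {
          String name();
          boolean expectedOutcome();              // the outcome the validated SRM scenario predicts
          void run(List<Assertion> assertions);   // feeds the scenario's events to every assertion
      }

      public List<String> execute(List<Scenario> scenarios, List<Assertion> pbas) {
          List<String> resultsToAnalyze = new ArrayList<>();
          Set<String> exercised = new HashSet<>();
          for (Scenario s : scenarios) {
              pbas.forEach(Assertion::reset);
              s.run(pbas);
              for (Assertion a : pbas) {
                  if (a.wasExercised()) exercised.add(a.id());
                  if (a.isSuccess() != s.expectedOutcome()) {
                      resultsToAnalyze.add("Scenario " + s.name() + " ended differently than expected for " + a.id());
                  }
              }
          }
          for (Assertion a : pbas) {
              if (!exercised.contains(a.id())) {
                  resultsToAnalyze.add("PBA " + a.id() + " was not exercised by any scenario");
              }
          }
          return resultsToAnalyze;
      }
  }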

  14. Assertion-Based Requirements Validation Process: Analyze Results (Sub-steps: Ensure the system scenario set is correct and complete, Ensure each PBA is correct, Ensure the PBA set is complete, Ensure each PBA is in scope) • Examine each assertion failure for: • Errors in constructing the SRM (system scenarios, IMAs, etc.) • Errors in constructing the PBA(s) • Errors in a requirement • Fix the errors in the SRM • Rerun the tests until no SRM or PBA construction errors are discovered; the unexpected results that remain then indicate findings

  15. Assertion-Based Requirements Validation Pilot: Develop and Validate IMAs (Sub-steps: Identify candidate IMAs, Analyze candidate IMAs, Represent candidate IMAs in natural language, Formalize IMAs as assertion statecharts, Validate IMAs) • Examine behavior in UML activity diagrams • Begin with preconditions, triggers, and constraints • Assert important sequences that must occur in order (under all conditions or adverse conditions) • Constraints on looping (will continue to try until ...) • Important executions if a critical behavior is unavailable • Any possible/impossible executions that will occur during off-nominal events • Look at the whole picture and observe modeling issues as well

  16. Assertion-Based Requirements Validation Pilot: Identify Candidate IMAs • Actively Damp Nutation [after a burn maneuver, nutation must be damped to a stable level before Idle Mode can be selected to downlink data] • Activity sequence: Select Damping Mode, Determine Attitude, Configure Damping Filter, Estimate Nutation Angle • If the angle exceeds the target: compute the damping torque prior to controlling the fired thruster burn • If the angle is below the target: continue the process (estimate the target angle, check that the angle stays below the target, track time passing) • Once stabilized for the minimal time, the GNC subsystem can select Idle Mode
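
The activity diagram behind this candidate IMA is not reproduced in the transcript. Read as code purely for illustration (all method names below are hypothetical, and the pilot captured this behavior in UML, not in Java), the sequence amounts to the following loop:

  // Illustrative sketch of the "Actively Damp Nutation" behavior described above.
  public class ActiveNutationDamping {
      private final double targetAngle;
      private final int tMin;   // minimal stable time before Idle Mode may be selected

      public ActiveNutationDamping(double targetAngle, int tMin) {
          this.targetAngle = targetAngle;
          this.tMin = tMin;
      }

      public void activelyDampNutation() {
          selectDampingMode();
          determineAttitude();
          configureDampingFilter();
          int stableTime = 0;
          while (stableTime < tMin) {
              double angle = estimateNutationAngle();
              if (angle > targetAngle) {
                  computeDampingTorqueAndFireThrusters();   // correct the attitude
                  stableTime = 0;                           // the stability timer restarts
              } else {
                  stableTime++;                             // track time passing at or below target
              }
          }
          selectIdleMode();   // once stabilized for the minimal time, GNC may select Idle Mode
      }

      // Stubs standing in for the GN&C actions named in the activity diagram.
      private void selectDampingMode() {}
      private void determineAttitude() {}
      private void configureDampingFilter() {}
      private double estimateNutationAngle() { return 0.0; }
      private void computeDampingTorqueAndFireThrusters() {}
      private void selectIdleMode() {}
  }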

  17. Assertion-Based Requirements Validation Pilot: Analyze Candidate IMAs • Analysis note: The nutation angle must be less than or equal to the target nutation angle for the minimum amount of stable time to maintain an acceptable and stable nutation angle [P <= R and T1 >= T2 (Tmin)] • Q1. Must (or can?) the nutation angle be less than or equal to the target nutation angle for the minimum amount of stable time in order to select Idle Mode? Domain answer: both Ground and Fault Protection Mode can actually select Idle Mode at any time, even if the above is not true ([P <= R and T1 >= T2 (Tmin)]). Assumption: if a satellite timeout occurred, and fault protection mode has not been entered (due to the number of attempts or a timeout), then the above is true. • Q1b. The Reference Orientation (GN&C subsystem) must maintain a nutation angle that is less than or equal to the target nutation angle for a minimal amount of stable time to select Idle Mode (see Figures 16 and 20). • Q2. What occurs if the goal is met (nutation angle stabilized and maintainable) and something causes the angle to exceed the target nutation angle again? Domain answer: the issue is monitored and the Actively Damp Nutation behavior is called again.

  18. Assertion-Based Requirements Validation Pilot: Analyze Candidate IMAs • Analyze each candidate assertion and create good and bad scenarios to represent the requirements • Use the diagrams in the SRM to derive scenarios or unit tests that will cause the requirement to succeed and to fail • Use the scenarios to challenge the assertion against • redundancy of events • improper sequencing • unknowns • misuse of system boundaries • dependencies • and constraints (i.e., time) • Determine whether the sequencing, repetition, looping, constraints, and concurrence always ("must") occur this way or only conditionally throughout the entire SRM

  19. Assertion-Based Requirements Validation Pilot: Analyze Candidate IMAs • Good and bad scenarios for Q1b: The Reference Orientation must maintain a nutation angle that is less than or equal to the target nutation angle for a minimal amount of stable time to select Idle Mode • Timer Tmin is the minimal amount of stable time required to reach a maintainable angle state and allow the ability to select Idle Mode (Tmin = 5) [Figure: nutation-angle timelines for a good scenario and a bad scenario, built from Reach Target, Exceed Target, Time Passed, and Select Idle Mode events]

  20. Assertion-Based Requirements Validation Pilot: Represent the Candidate IMAs as Natural Language Requirements (NLR) • Natural language requirement for Q1b • The Reference Orientation (GN&C subsystem) shall achieve a stable nutation angle that is less than or equal to an acceptable target nutation angle for a minimal amount of time prior to selecting Idle Mode.

  21. Assertion-Based Requirements Validation Pilot: Formalize Each Natural Language Requirement
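
The formalized statechart for this requirement does not survive in the transcript. As an approximation only, the sketch below reacts to the same events that the unit tests on the next slide call (reachTargetNutationAngle, exceedTargetNutationAngle, incrSimTime, ROselectIdleMode, isSuccess), with Tmin assumed to be 5 from the earlier scenario slide; the pilot's actual formalization was a statechart assertion, not this plain Java class:

  // Sketch only: a plain-Java stand-in for the formalized IMA001 assertion for Q1b.
  // It fails if Idle Mode is selected before the nutation angle has been at or below
  // the target for at least T_MIN units of simulated time.
  public class IMA001Assertion {
      private static final int T_MIN = 5;   // minimal stable time (assumed from the scenario slide)

      private boolean atOrBelowTarget = false;
      private int stableTime = 0;
      private boolean failed = false;

      public void reachTargetNutationAngle()  { atOrBelowTarget = true;  stableTime = 0; }
      public void exceedTargetNutationAngle() { atOrBelowTarget = false; stableTime = 0; }

      public void incrSimTime(int dt) {
          if (atOrBelowTarget) stableTime += dt;
      }

      public void ROselectIdleMode() {
          // Selecting Idle Mode before the angle has been stable for T_MIN violates the requirement.
          if (!atOrBelowTarget || stableTime < T_MIN) failed = true;
      }

      public boolean isSuccess() { return !failed; }
  }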

  22. Assertion-Based Requirements Validation Pilot: Validate Each Formalized IMA • Validate the IMA is unambiguous, correct, and consistent with the SRM behavior and scenarios • Validate each model-based formalized requirement has been modeled correctly to represent the intended requirement • Create formalized unit tests for the scenarios • Validate the IMA against the formalized unit tests

  public void testDampNutation_IMA001_sc001(){
      // this is testing a good scenario for RO selecting Idle Mode
      printState(IMA001);
      IMA001.reachTargetNutationAngle();
      printState(IMA001);
      IMA001.incrSimTime(1);
      printState(IMA001);
      IMA001.exceedTargetNutationAngle();
      printState(IMA001);
      IMA001.reachTargetNutationAngle();
      printState(IMA001);
      IMA001.incrSimTime(5);
      printState(IMA001);
      IMA001.incrSimTime(1);
      printState(IMA001);
      IMA001.ROselectIdleMode();
      printState(IMA001);
      assertTrue(IMA001.isSuccess());
      System.out.println("Completed testDampNutation_IMA001_sc001");
      System.out.println();
      System.out.println(" ------------------------- ");
      System.out.println();
  }

  public void testDampNutation_IMA001_sc002(){
      // this is testing a bad scenario for RO selecting Idle Mode
      printState(IMA001);
      IMA001.reachTargetNutationAngle();
      printState(IMA001);
      IMA001.incrSimTime(1);
      printState(IMA001);
      IMA001.exceedTargetNutationAngle();
      printState(IMA001);
      IMA001.reachTargetNutationAngle();
      printState(IMA001);
      IMA001.incrSimTime(4);
      printState(IMA001);
      IMA001.ROselectIdleMode();
      printState(IMA001);
      assertFalse(IMA001.isSuccess());
      System.out.println("Completed testDampNutation_IMA001_sc002");
  }

  23. Assertion-Based Requirements Validation Pilot: Develop and Validate Project-Based Assertions (Sub-steps: Identify relevant documented requirements, Interpret requirements, Formalize requirements as PBAs, Validate PBAs) • Translate the documented requirements from the development project into a formal representation that is conducive to comparison with the SRM • Requires detailed understanding of the requirements • Exposes errors in the requirements specifications

  24. Assertion-Based Requirements Validation Pilot: Develop and Validate Project-Based Assertions

  25. Assertion-Based Requirements Validation Pilot: Validate Each Formalized PBA • Validate the PBA is unambiguous, correct, and consistent • Validate each project-based formalized requirement has been modeled correctly to represent the intended requirement • Create formalized unit tests for the scenarios

  public void testTorqueVector_IMA001(){
      printState(pba001);
      pba001.beginJOIPhase();
      printState(pba001);
      pba001.SRUUpdate();
      printState(pba001);
      pba001.incrSimTime(4);
      pba001.beginFireThrusters();
      printState(pba001);
      pba001.endFireThrusters();
      printState(pba001);
      assertTrue(pba001.isSuccess());
      System.out.println("Completed testTorqueVector_pba001");
      System.out.println();
      System.out.println("-------------------------");
      System.out.println();
  }
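
The pba001 assertion class exercised by this unit test is not shown in the deck. Purely to illustrate the shape such a formalization might take, the sketch below accepts the same events; the monitored property (thruster firing must begin within a bounded time of an SRU update during the JOI phase) and the bound itself are assumptions made for this sketch, not the project's documented requirement:

  // Sketch only: a plain-Java stand-in for a pba001-style statechart assertion.
  public class PBA001Assertion {
      private static final int MAX_DELAY = 5;   // assumed bound, for illustration only

      private boolean inJOIPhase = false;
      private boolean awaitingThrusters = false;
      private int elapsedSinceUpdate = 0;
      private boolean failed = false;

      public void beginJOIPhase() { inJOIPhase = true; }

      public void SRUUpdate() {
          if (inJOIPhase) { awaitingThrusters = true; elapsedSinceUpdate = 0; }
      }

      public void incrSimTime(int dt) {
          if (awaitingThrusters) {
              elapsedSinceUpdate += dt;
              if (elapsedSinceUpdate > MAX_DELAY) failed = true;   // deadline missed
          }
      }

      public void beginFireThrusters() { awaitingThrusters = false; }

      public void endFireThrusters() { /* no constraint modeled in this sketch */ }

      public boolean isSuccess() { return !failed; }
  }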

  26. Assertion-Based Requirements Validation Pilot: Add System Scenarios to the SRM (Sub-steps: Identify events, Define system scenarios, Implement system scenarios, Validate system scenarios) • Derive system scenarios • Nominal, expected behavioral sequences • Behavioral sequences that should not be permitted • Detection and responses to adverse conditions

  27. Assertion-Based Requirements Validation Pilot: Identify Events • Identify events in the activity diagrams • Activities model the control of execution of one or more actions; they do not model sequences of event occurrences • Activities often do not model timing constraints • Action names and control-flow guards are defined in natural language and are thus prone to ambiguity

  28. Assertion-Based Requirements Validation Pilot: Define System Scenarios • "Good" and "bad" scenarios • Used the semantics of UML sequence diagrams to define a scenario by modeling sequences of event occurrences (i.e., action execution start/finish events) • Used guard conditions in activity diagrams that should not hold true in the context of the goal as a means of creating "bad" scenarios

  29. Assertion-Based Requirements Validation Pilot: Implement System Scenarios • Scenarios are implemented in JUnit and reference events in the assertions
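
None of the scenario tests themselves appear in the transcript. Building on the illustrative classes sketched earlier (which are assumptions, not the pilot's code), a system scenario written in the same JUnit style as the deck's unit tests might look like the following; the point is that a single scenario drives events into more than one assertion at once:

  // Sketch only: a system scenario in the deck's JUnit style, assumed to live in a
  // TestCase subclass so that assertTrue is available, as in the earlier test methods.
  public void testSystemScenario_JOIBurnThenDampAndIdle() {
      PBA001Assertion pba001 = new PBA001Assertion();
      IMA001Assertion ima001 = new IMA001Assertion();

      // JOI burn portion of the scenario (events observed by the PBA).
      pba001.beginJOIPhase();
      pba001.SRUUpdate();
      pba001.incrSimTime(2);
      pba001.beginFireThrusters();
      pba001.endFireThrusters();

      // Post-burn damping portion of the scenario (events observed by the IMA).
      ima001.reachTargetNutationAngle();
      ima001.incrSimTime(5);
      ima001.incrSimTime(1);
      ima001.ROselectIdleMode();

      // A nominal scenario is expected to complete without popping either assertion.
      assertTrue(pba001.isSuccess());
      assertTrue(ima001.isSuccess());
  }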

  30. Assertion-Based Requirements Validation Pilot: Validate System Scenarios • Use the IMAs to ensure that the scenarios return the expected results • Q1 scenarios complete without errors, as expected • Q2 scenarios do not cause any assertions to "pop" • Q3 scenarios result in detection of an adverse condition and a correct response, as expected • Scenarios represent a test oracle

  31. Assertion-Based Requirements Validation Pilot: Execute Scenarios in Context of PBAs (Sub-steps: Include PBAs in the executable environment, Execute system scenarios, Collect results for further analysis) • Interested in how the PBAs react to the system scenarios • Which system scenarios ended as expected? • Which system scenarios ended differently than expected? • Which PBAs were not exercised?

  32. Assertion-Based Requirements Validation Process: Analyze Results (Sub-steps: Ensure the system scenario set is correct and complete, Ensure each PBA is correct, Ensure the PBA set is complete, Ensure each PBA is in scope) • Examine each assertion failure for: • Errors in constructing the SRM (system scenarios, IMAs, etc.) • Errors in constructing the PBA(s) • Errors in a requirement • Fix the errors in the SRM • Rerun the tests until no SRM or PBA construction errors are discovered; the unexpected results that remain then indicate findings

  33. Assertion-Based Requirements Validation Pilot: Observations • Developing IMAs helps identify defects and deficiencies in the SRM; doing this early in the SRM development process yields early benefits • The process of examining the documented requirements helps to identify omissions in the correlation process • The depth of understanding required to create IMAs and PBAs results in more rapid understanding of the system of interest • Assertions best represent requirements of specific behaviors • When creating PBAs for high-level requirements such as "The system shall have the capability to do X", these behavioral details need to be supplied from domain knowledge and analysis

  34. Assertion-Based Requirements Validation Pilot: Observations • It was easier than expected to: • Discover early questions and concerns that could lead to good assertions • Develop a high-level understanding of what the system is supposed to do • Create higher-level system scenarios that describe what the system is supposed to do • Develop formal assertion statecharts from natural language • Observe the lack of model depth and details • Discover possible missing behavioral IMAs • Write unit tests for IMAs and PBAs (although easier for IMAs if the SRM provides more information) • Write formalized JUnit tests • Test assertions against the system scenarios

  35. Assertion-Based Requirements Validation Pilot: Observations • It was harder than expected to: • Vet good/worthwhile natural language (informal) assertions from the list of answered questions • Create system scenarios given the lack of model depth and details • Create system scenarios that describe what the system is not supposed to do, and what the system is supposed to do under adverse conditions • Correlate system scenario events, IMA events, and PBA events without a data dictionary or complete domain model • Discover worthwhile PBAs from documented requirements that define and monitor a single event and/or provide limited details and constraints • Discover matching IMAs and PBAs when there are limited requirement details • Discover matching IMAs and PBAs when the model is not developed to the needed level of coverage/detail • Validate requirements when the model is not developed to the needed level of coverage/detail • Set up the test rig for the first time

  36. Assertion-Based Requirements Validation: Recommendations • Develop IMAs early in the SRM development process • Consider developing IMAs as part of the process for reviewing activity diagrams • Continue refining techniques for using assertion-based methods for requirements validation

  37. Assertion-Based Requirements Validation: Summary • Model-Based Requirements Validation is part of the SRM validation process • The purposes for using assertions include: • The ability to test conflicts and validate complex requirements • The advantage of formal requirements over natural language • Groundwork for automated testing • Use assertions to validate that the requirements corresponding to SRM behaviors and IMAs are unambiguous, correct, complete, consistent, and verifiable • Use assertions to ensure that the right behaviors have been defined and that the behaviors are of high quality; the right behaviors are those that adequately describe: • What the system is supposed to do • What the system is not supposed to do • What the system is supposed to do under adverse conditions • Assertion-based validation leverages our independent modeling to find issues earlier in the life cycle

  38. Assertion-Based Requirements Validation: Panel Discussion • Michael Facemire • John Ryan • Bill Stanton • Patrick Theeke

  39. References • D. Drusinsky, "From AD to Assertions"; informal presentation, 2008. • D. Drusinsky, J. B. Michael, and M. Shing, "A Framework for Computer-aided Validation"; Naval Postgraduate School, NPS-CS-07-008, 08/2007. • K. Hurley and D. Kranz, "Executable Reference Model"; informal presentation, 2008. • Juno IV&V Team, "Juno Level-4 Flight Software Requirements Validation Report"; NASA IV&V, 2008. • D. Kranz and J. Ryan, "Juno and Assertion Based Validation"; NASA IV&V, white paper, 2008. • NASA IV&V Facility, "System Level Procedure 09-1"; 2007. • S. Raque, "SRM Example Overview"; informal presentation, 2008. • P. Theeke, "A Process Architecture for Reference Model Based Validation"; NASA IV&V, white paper, 2007. • P. Theeke, "System Reference Models and Executable Assertions for Requirements Validation"; NASA IV&V, white paper, 2007.
