
Validating Dependability and Safety Requirements




  1. Validating Dependability and Safety Requirements. Stephen B. Driskell, NASA IV&V Facility; Man-Tak Shing, Naval Postgraduate School. Third NASA IV&V Workshop on Validation and Verification.

  2. Outline
  • Dependability Requirements
  • Defining System Qualities: Utility Tree, Quality Attribute Scenarios
  • Validating Dependability Requirements: Expected Outcomes
  • Dependability Modeling and Validation Example – JWST
  • Validation Metric Framework for System Software Safety Requirements
  • Conclusions

  3. Acknowledgements • Dr. Butch S. Caffell, Visionary • Marcus S. Fisher, Chief Engineer • Dependability Tiger Team members • Dr. Man-Tak Shing • Dr. James Bret Michael • Kenneth A. Costello • Jeffrey R. Northey • Stephen B. Driskell • SRMV Example Contributors • Dr. Khalid Latif • Jacob Cox • Dr. Karl Frank

  4. Dependability Requirements (1/3). Defined by a set of system quality attributes: availability, reliability, safety, security (confidentiality and integrity), and maintainability.

  5. Dependability Requirements (2/3). Dependability requirements are modifiers of the functional requirements:
  • Cross-cutting: they affect many different functions of a system
  • Architectural drivers: they have substantial influence on the choice among architectural alternatives and are the basis for prioritizing the design goals that guide design tradeoffs

  6. Dependability Requirements (3/3). Attributes are system characteristics defined in the context of the operating environment. Some can be measured directly (e.g. performance, availability); others (e.g. safety and security) are expressed in terms of quantifiable proxy attributes. All attributes must be traced from requirements to model to implementation in order to accomplish IV&V analysis and testing.

  7. Two Types of Quality Attribute Requirements
  • Requirements that define quality attributes and how and when to measure them. E.g.: The system shall measure and report ‘reservation completion time’, starting from the display of the first flight query screen and ending with the display of the screen giving the reservation confirmation number.
  • Requirements that specify what values of the quality attribute measure indicate sufficient quality. E.g.: The ‘reservation completion time’ for an experienced tester on a lightly loaded system making a Type 3 reservation shall be less than two minutes.

  8. Defining Dependability Requirements via Utility Trees. A top-down vehicle for characterizing the quality attribute requirements: quality goals are refined into concrete quality attribute scenarios, each annotated with its prioritization and difficulty.
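
The utility tree's structure lends itself to a simple data model. Below is a minimal sketch (Python, with illustrative class and field names that are not part of the original presentation) of a tree whose leaves are prioritized quality attribute scenarios; the example leaf borrows the loss-of-downlink behavior mentioned later in the deck.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class QualityScenario:
    """Leaf of the utility tree: a concrete quality attribute scenario."""
    description: str
    priority: str    # e.g. "High", "Medium", "Low" (importance to the mission)
    difficulty: str  # e.g. "High", "Medium", "Low" (architectural difficulty)

@dataclass
class QualityGoal:
    """Refinement of a quality attribute into a specific goal."""
    name: str
    scenarios: List[QualityScenario] = field(default_factory=list)

@dataclass
class QualityAttribute:
    """Top-level node under 'Utility', e.g. Availability or Safety."""
    name: str
    goals: List[QualityGoal] = field(default_factory=list)

# Illustrative tree fragment: attribute -> goal -> prioritized scenario
availability = QualityAttribute(
    name="Availability",
    goals=[QualityGoal(
        name="Recover from Loss of Downlink",
        scenarios=[QualityScenario(
            description="Ground re-establishes the downlink after an outage",
            priority="High", difficulty="Medium")])])
```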

  9. Quality Attribute Scenario (1/4). A textual description of how a system responds to a stimulus, made up of:
  • A stimulus
  • A stimulus source
  • An artifact being stimulated
  • An environment in which the stimulus occurs
  • A response to the stimulus
  • A response measure (to quantitatively define a satisfactory response)

  10. Quality Attribute Scenario (2/4). Example – Configurability
  • Stimulus: Request support for a new type of sensor
  • Stimulus source: The customer
  • Artifact: The system and the customer support organization
  • Environment: After the software has been installed and activated
  • Response: The customer support engineer reconfigures the system to support the new sensor
  • Response measure: No new source code, no extraordinary downtime, and commencing operation within one calendar week

  11. Quality Attribute Scenario (3/4). Quality measures: the amount of new source code, the amount of downtime, and the amount of calendar time to bring a new sensor on line. Quality requirement:
  • Zero new source code
  • No extra downtime (need to define the meaning of “extra”)
  • Less than one calendar week to bring the new sensor on line

  12. Quality Attribute Scenario (4/4). Implications:
  • Zero new source code => a limit on the range of new sensors
  • No extra downtime => reconfiguration without shutting down the system, and it has to be performed by an expert from the installation organization (instead of the customer organization)
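
To make the configurability example concrete, the sketch below (Python; the type and field names are invented for illustration, not taken from the slides) encodes the quality measures from slide 11 and checks them against the quality requirement from slides 11 and 12.

```python
from dataclasses import dataclass

@dataclass
class ConfigurabilityMeasure:
    """Observed values for the quality measures named on slide 11."""
    new_source_lines: int        # amount of new source code
    extra_downtime_hours: float  # downtime beyond what "extra" is defined to exclude
    calendar_days: float         # calendar time to bring the new sensor on line

def meets_quality_requirement(m: ConfigurabilityMeasure) -> bool:
    """Check the response measure: no new code, no extra downtime, under one week."""
    return (m.new_source_lines == 0
            and m.extra_downtime_hours == 0.0
            and m.calendar_days < 7.0)

# Example: reconfiguration done through parameters only, during normal operations
print(meets_quality_requirement(
    ConfigurabilityMeasure(new_source_lines=0,
                           extra_downtime_hours=0.0,
                           calendar_days=4.0)))  # True
```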

  13. Quality Attribute Scenario Types
  • Normal operations: these are the most obvious scenarios
  • System-as-object scenarios: treat the system as a passive object being manipulated by an actor (e.g. programmer, installer, administrator)
  • Growth scenarios: deal with likely or plausible changes to the requirements in the future (e.g. a 50% increase in the capacity requirement); help future-proof the system under development
  • Exploratory scenarios: improbable scenarios, useful to stimulate thinking about implicit assumptions underpinning the architecture (e.g. loss of power from an uninterruptible power supply)

  14. Quality Attribute Scenario vs. Use Case Scenarios. Use case scenarios focus on the functional behaviors (normal operation and exceptions) of the system; quality attribute scenarios focus on satisfaction of a response measure. Need to establish trace links between quality attribute scenarios and the use cases/use case scenarios they correspond to.

  15. Validating Dependability Requirements (1/2) • Sufficiency of the requirements • Do the dependability requirements adequately describe the desired attributes of a system that meets the operational needs? • Are the dependability requirements correctly prioritized? • Are the dependability requirements verifiable? Or are they traceable to derived system requirements that are verifiable?

  16. Validating Dependability Requirements (2/2) • Expected outcomes • Evidence of correct IV&V understanding and prioritization of the quality attributes via the SRMs • Evidence of properly specified system capabilities to achieve the required quality attributes • Dependability and Safety requirements are clearly defined in terms of dependability/safety cases • Evidence is formally captured in Validation Reports

  17. Dependability Modeling Example - JWST (1/2). Focus on availability and safety. NASA IV&V uses capabilities and limitations, along with Portfolio Based Risk Assessment, to prioritize behaviors for model development.
  • Availability: ensures that the modeled system has preventative behaviors to avoid the “undesired behaviors (Butch Q2)”; models the responsive behaviors to handle “adverse conditions (Butch Q3)”
  • Safety: models the safety criteria by generating model elements that focus on fault detection, fault response, and fault recovery; focuses on fault response in terms of fail-safing in case of high-severity faults (Level 1) or fault isolation in case of low-severity faults (Level 3)

  18. Dependability Modeling Example - JWST (2/2). Fault management elements include use cases and activity diagrams for:
  • Perform Onboard Fault Management (POBFM)
  • Perform Ground Based Fault Management for Observatory faults (PGBFMO)
  • Perform Ground Based Fault Management for Ground faults (PGBFMG)
  • Recover from Loss of Uplink
  • Recover from Loss of Downlink

  19. Use Case, JWST SRM: Mission Level UC diagram

  20. Activity Diagram, JWST - NIRSpec DS. Test limitations related to the command processing behavior “Availability” (TIM-2419): test script CX6 is not testing the invalid parameter values for the dwell command. This may result in unexpected results from the command execution, which in turn will result in providing an incorrect set of telemetry or, in the worst case, crashing the NIRSpec DS, in turn impacting the JWST science mission.

  21. Dependability Validation Example – JWST (1/4)
  • Dependability criteria: Recoverability, Robustness, Consistency, Correctness
  • Safety criteria: Fault Avoidance, Fault Warning, Fault Correction, Fault Tolerance, Fail Operational, Fail Safe

  22. Dependability Validation Example – JWST (2/4)
  • Recoverability validation: by correlating the requirements that allow ground controllers or onboard systems to recover from the failure; most of the JWST observatory recovery is performed by ground commands
  • Robustness validation: by correlating the safety behaviors to the requirements that allow the onboard systems to detect and respond to faults
  • Consistency validation: by ensuring that the multiple requirements for a given behavior are consistent within the SRD (System Requirements Document) and that the constraints for a given behavior correlate with the requirements for preventative and responsive behaviors
  • Correctness validation: by ensuring that the requirements fully specify the behaviors modeled in the SRM

  23. Dependability Validation Example – JWST (3/4)
  • Fault Avoidance: the SRM includes the “preventative behaviors”; the requirements correlated to these behaviors allow JWST SRMV to validate fault avoidance
  • Fault Warning: the SRM includes the desired behavior for the “detection of faults”; the requirements correlated to these behaviors allow JWST SRMV to validate fault warning
  • Fault Correction: the SRM includes the “responsive behaviors” that model how the JWST observatory will self-correct, such as resetting a faulty component in order to continue science operations; the requirements correlated to these behaviors allow JWST SRMV to validate fault correction

  24. Dependability Validation Example – JWST (4/4)
  • Fault Tolerance: the SRM includes the “responsive behaviors” that model how JWST will select alternative paths in response to a fault or adverse condition; the requirements correlated to these behaviors allow JWST SRMV to validate fault tolerance
  • Fail Operational: the SRM includes the “responsive behaviors” that model how JWST will respond to a single fault, such as a Level 3 fault with a science instrument, and still stay operational, continuing to operate with the remaining instruments; the requirements correlated to these behaviors allow JWST SRMV to validate fail-operational behavior
  • Fail Safe: the SRM includes the “responsive behaviors” that model how JWST will respond to a Level 1 or Level 2 fault and transition to a safe state; the requirements correlated to these behaviors allow JWST SRMV to validate fail-safe behavior
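
Slides 22 through 24 all rest on the same mechanic: correlate SRD requirements with the behaviors modeled in the SRM, one dependability or safety criterion at a time. The sketch below is a hypothetical illustration (Python; the behavior names and requirement IDs are invented, and this is not the actual SRM/SRMV tooling) of the basic coverage check implied by that mechanic: flag any modeled behavior that has no correlated requirement.

```python
# Each modeled behavior is tagged with its safety criterion and the IDs of the
# SRD requirements correlated to it; behaviors with no correlated requirement
# are flagged for follow-up during validation. All IDs below are hypothetical.
behaviors = [
    {"name": "Isolate faulty science instrument", "criterion": "Fault Tolerance",
     "requirements": ["SRD-1203", "SRD-1204"]},
    {"name": "Transition to safe state on Level 1 fault", "criterion": "Fail Safe",
     "requirements": ["SRD-0981"]},
    {"name": "Detect out-of-limits telemetry", "criterion": "Fault Warning",
     "requirements": []},  # a gap to investigate
]

uncorrelated = [b["name"] for b in behaviors if not b["requirements"]]
for name in uncorrelated:
    print(f"No correlated requirement for behavior: {name}")
```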

  25. Validating Software Safety Requirements. Software safety requirements cannot be validated in the traditional way of matching stakeholder requirements and expectations (“the system behaves safely”) to system behavior. Need to develop a safety case:
  • Sufficiency of hazard identification
  • Adequacy of hazard analysis to identify software's role in causing these hazards
  • Traceability from the derived software safety requirements to the identified system hazards
  • Indication of the completeness of the set of software safety requirements

  26. A Metric Framework for Software Safety Requirements Validation. Utilize the Goal-Question-Metric (GQM) approach:
  • Measurement goals
  • Questions to characterize the object of measurement
  • Metrics to answer the questions in a quantitative way
  Augment with Goal Structuring Notation to highlight the context, justification, assumptions, strategies, and solutions.
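
A minimal sketch of how the GQM structure might be recorded for this framework (Python; the question wording is a paraphrase built around the metrics named on later slides, not the framework's actual text).

```python
# Illustrative GQM fragment; only the metric names (M1, M2, M3) come from the deck.
gqm = {
    "goal": "Assess the validity of the software safety requirements",
    "questions": [
        {"question": "Is hazard identification sufficient?",
         "metrics": ["M1: percent software hazards (PSH)"]},
        {"question": "Is the hazard analysis deep enough to expose software's role?",
         "metrics": ["M2: software hazard analysis depth (SHAD)"]},
        {"question": "Are enough software safety requirements being derived?",
         "metrics": ["M3: percent software safety requirements (PSSR)"]},
    ],
}

for q in gqm["questions"]:
    print(q["question"], "->", ", ".join(q["metrics"]))
```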

  27. Validation Metric Framework (cont’d)

  28. Validation Metric Framework (cont’d)

  29. Validation Metric Framework (cont’d)

  30. Sample Application (1/9). Demonstrate the application of the framework with a fictitious safety-critical, software-intensive Rapid Action Surface-to-Air Missile (RASAM) system. EPSH (expected percent of software hazards) = 49.1%, σ = 9.1%.

  31. Sample Application (2/9). M1: % S/W Hazards. Sample 4 shows a reduction in PSH, with |PSH – EPSH| = |37.8 – 49.1| = 11.3% > σ. The results of M1 indicate that further investigation is needed to determine the sufficiency of hazard identification.
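
The M1 decision rule on this slide, and the M3 rule on slide 35, are the same one-sigma band check. A small sketch (Python; a hypothetical helper, not part of the framework's tooling) using the numbers from slides 30, 31, 34, and 35:

```python
def outside_expected_band(sample_pct: float, expected_pct: float, sigma: float) -> bool:
    """True when |sample - expected| exceeds one sigma, flagging follow-up."""
    return abs(sample_pct - expected_pct) > sigma

# M1 (slide 31): PSH for sample 4 vs. EPSH = 49.1%, sigma = 9.1%
print(outside_expected_band(37.8, 49.1, 9.1))  # True  -> investigate hazard identification
# M3 (slide 35): PSSR for sample 4 vs. EPSSR = 38.2%, sigma = 6.5%
print(outside_expected_band(41.9, 38.2, 6.5))  # False -> safety requirements look sufficient
```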

  32. Sample Application (3/9). M2: Software Hazard Analysis Depth (SHAD).

  33. Sample Application (4/9). Sample 1 shows:
  HAA_H = ∑HAA = 3 + 2 + 3 + 3 + 2 + 1 = 14
  HAA_MED = ∑HAA = 2 + 3 + 2 + 2 + 3 = 12
  C_H = [HAA_H / HAS_H] * 100% = [14/18] * 100% = 78%
  C_MED = [HAA_MED / HAS_MED] * 100% = [12/15] * 100% = 80%
  SHAD = (C_H + C_MED) / 2 = (78% + 80%) / 2 = 79%
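
Re-deriving the slide-33 arithmetic as code (Python; a hypothetical sketch whose per-hazard scores and totals are taken directly from the slide):

```python
def coverage(addressed, expected_total):
    """Percent of the expected hazard-analysis content actually addressed."""
    return 100.0 * sum(addressed) / expected_total

# Sample 1 numbers from slide 33
c_high = coverage([3, 2, 3, 3, 2, 1], expected_total=18)  # HAA_H / HAS_H     -> ~77.8%
c_med  = coverage([2, 3, 2, 2, 3],    expected_total=15)  # HAA_MED / HAS_MED -> 80%
shad   = (c_high + c_med) / 2                              # ~78.9%, reported as 79%
print(round(c_high), round(c_med), round(shad))            # 78 80 79
```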

  34. Sample Application (5/9). M3: Percent Software Safety Requirements. EPSSR = 38.2%, σ = 6.5%.

  35. Sample Application (6/9). M3: PSSR. Sample 4 shows |PSSR – EPSSR| = |41.9 – 38.2| = 3.7% < σ. The results indicate that a sufficient number of software safety requirements are being developed, thus instilling confidence that the safety requirements are indeed valid.

  36. Sample Application (7/9). M4, M5 and M6: percent software hazards with safety requirements.

  37. Sample Application (8/9). M4, M5 and M6: all high- and medium-risk software hazards have associated software safety requirements, but some moderate-risk software hazards are not mitigated through software safety requirements. • Further investigation is required to determine either the validity of the software hazard or the validity of the set of software safety requirements.

  38. Sample Application (9/9). M7: Percent software safety requirements traceable to hazards. • All software safety requirements are traceable to software hazards at the end of the design phase • This strengthens the case that the derived software safety requirements are valid
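
M7 reduces to a simple ratio over the trace matrix. A hypothetical sketch (Python; the requirement and hazard IDs are invented for illustration):

```python
def percent_traceable(requirement_to_hazards: dict) -> float:
    """M7: percent of software safety requirements that trace to at least one hazard."""
    traced = sum(1 for hazards in requirement_to_hazards.values() if hazards)
    return 100.0 * traced / len(requirement_to_hazards)

# Illustrative trace matrix (requirement ID -> hazard IDs); all IDs are hypothetical
trace = {
    "SSR-001": ["HZ-12"],
    "SSR-002": ["HZ-12", "HZ-17"],
    "SSR-003": ["HZ-03"],
}
print(percent_traceable(trace))  # 100.0 -> strengthens the validity argument
```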

  39. Conclusion (1/3)
  • Dependability requirements are modifiers of the functional requirements, addressed by Butch’s Q2 and Q3
  • Focused model analysis enables IV&V improvements to dependability and safety
  • Additional model elaboration and approach refinement improves the dependability and safety of the missions analyzed

  40. Conclusion (2/3)
  • System software safety requirements cannot be validated directly; their sufficiency must be validated via proxies
  • NASA safety standards recommend a “What, Why, How” ConOps approach for establishing goals and delivering products
  • Validation metrics help identify potential problems early in the development lifecycle

  41. Conclusion (3/3)
  • Additional dependability studies will continue to improve the contributions of requirements to implementation for NASA IV&V
  • Development of a master glossary and dictionary for IV&V would support the approach

  42. Questions?

  43. Dependability Criteria

  44. Safety Criteria

  45. JWST SRM: Mission Level UC diagram

  46. MSL Example - Dependability of Behaviors. Dependability using Behavior and Domain Modeling. Jacob Cox, jacob.cox@ngc.com. The models included are for illustrative purposes only.

  47. Objective • Show how behavior and domain models can be used to derive dependability • Behavioral models • Activity Diagrams • State Diagrams • Sequence Diagrams • Domain models • Class Diagrams containing hardware and software components • Dependability is the probability of success of a behavior

  48. Simplifying Assumption
  • Focus is on the Main Success Scenario because, in general, NASA IV&V has adopted the concept of Extension Scenarios as outside the norm.
  • Decisions rarely matter: most decisions between the initial node and the main success final node are loops.

  49. Concept. The dependability (probability of success) of an activity (a behavior) is the product of the success probabilities of all the actions in the Main Success Scenario.
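
A minimal sketch of the slide-49 concept (Python; the action success probabilities are invented for illustration):

```python
from math import prod

def scenario_dependability(action_success_probs):
    """Dependability of a behavior = product of the success probabilities of
    all actions on its Main Success Scenario."""
    return prod(action_success_probs)

# Illustrative per-action probabilities for one main success scenario
print(scenario_dependability([0.999, 0.995, 0.98, 0.999]))  # ~0.973
```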

  50. Entry to Mars MSS
