
Requirements Verification



  1. Requirements Verification
  Bill Fournier, billfournier4@gmail.com, 202 255 0760
  Oct 2014

  2. Agenda
  • Definitions
  • Confidence
  • Top-level planning concepts
  • Requirements understanding & traces
  • Relationship to other topics
  • Similarities, differences, and synergy with other types of verification
  • Methods
  • Events
  • Redo / back-ups and flexibility
  • Closure
  • Lessons learned

  3. Verification
  • Definition
    • Confirm or prove by Verifiable Objective Evidence (VOE) that the requirement has been met with the final configuration
    • Answers the question, "Did we build it right?"
  • Verifiable Objective Evidence
    • Definition: factual proof that a requirement has been met or exceeded
    • Documented trace and path to the evidence
    • As objective as possible
    • Evidence beyond a reasonable doubt
    • Legalese: "shall"s, not "will"s, "may"s, or "should"s
  • Perspective: a rigorous four-step quality process of plan, do, analyze, close
    • Planned and influenced early; executed on the right side of the verification "Vee"
  • Types of verification
    • GMD (Ground-based Midcourse Defense) example

  4. Requirements
  • What the system must do
  • Not the Statement of Work (SOW)
  • Stakeholder vs. technical requirements
  • Requirements Understanding (RU) or context statements when there is more than one reasonable interpretation of a baselined requirement
  • Parallel specification sections 3 and 4
  • RVTM
  • Accuracy
  • Confidence
  • States, modes
    • Normal, stressing, both, all
  • Compound requirements

  5. Requirements Tracing
  • Up and down
    • Requirements trace from parent to children and from children to parents (a trace-checking sketch follows this slide)
  • Parents
    • Requirements which spawn other lower-level requirements
  • Children
    • Requirements spawned from a higher-level requirement
  • Orphans
    • Requirements at the second level and below without parents
  • Tight
    • All children and grandchildren completely make up a parent requirement
    • Large implication for reusing lower-level (subsystem) verification closures
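As an illustration of up/down tracing, orphan detection, and the bidirectional links a tight trace depends on, here is a minimal Python sketch. The Requirement structure, IDs, and helper names are illustrative assumptions, not anything defined in the presentation.

```python
# Minimal requirement-trace sketch: orphan detection and link checking.
from dataclasses import dataclass, field
from typing import Dict, List, Set, Tuple

@dataclass
class Requirement:
    req_id: str
    text: str
    parent_ids: List[str] = field(default_factory=list)  # trace up
    child_ids: List[str] = field(default_factory=list)   # trace down

def find_orphans(reqs: Dict[str, Requirement], top_level: Set[str]) -> List[str]:
    """Requirements at the second level and below with no parent."""
    return [r.req_id for r in reqs.values()
            if r.req_id not in top_level and not r.parent_ids]

def broken_links(reqs: Dict[str, Requirement]) -> List[Tuple[str, str]]:
    """Parent/child links not recorded in both directions."""
    bad = []
    for r in reqs.values():
        for c in r.child_ids:
            if c not in reqs or r.req_id not in reqs[c].parent_ids:
                bad.append((r.req_id, c))
    return bad

# Hypothetical three-requirement trace
reqs = {
    "SYS-1": Requirement("SYS-1", "System shall ...", child_ids=["SUB-1", "SUB-2"]),
    "SUB-1": Requirement("SUB-1", "Subsystem shall ...", parent_ids=["SYS-1"]),
    "SUB-2": Requirement("SUB-2", "Subsystem shall ...", parent_ids=[]),  # orphan
}
print(find_orphans(reqs, top_level={"SYS-1"}))  # ['SUB-2']
print(broken_links(reqs))                       # [('SYS-1', 'SUB-2')]
```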

  6. Validation
  • Definition
    • Assessing that we meet what the user needs
  • Perspective
    • Did we do the right thing (big picture), not did we do what we planned in the specification
    • Differences from verification: more nebulous, more sampling, and done in the user environment as really used
  • Types
    • Requirements - system
    • System
    • Software Independent Verification & Validation (IV&V)
    • Model & Simulation Verification, Validation & Accreditation (VV&A)

  7. Relationship to Validation
  • Verification covers thousands of requirements versus tens for validation
  • Verification assumes the systems engineers' translation of stakeholder/user needs into the specifications was correct
  • Verification assumes the user will use the system the way the SE assumed, and with the other related systems
  • Validation does not make these translation, user, and related-system assumptions; it tries to see the system really do the job at a high level of detail

  8. Accreditation
  • Definition: approval by an authority that an M&S is adequate for its intended usage
  • DoD requires accreditation of any M&S used for verification
  • Analyzing the data pedigree of the M&S for its intended use
    • Ideally physics-based
    • Test data
    • Correlation with a more credible M&S

  9. Assessment
  • Definition
    • Assess the ability and progress to meet requirements with less than the final configuration and/or verification environment
  • Perspective
    • Risk reduction and "what if"
  • Types
    • Risk reduction, like a TPM (Technical Performance Measure), as an early-warning indicator for the system
    • "Verification light" to reduce the risk that full verification will not close well

  10. Configuration
  • Definition
    • An item that is controlled
  • Types: baselines with increasing levels of control
    • Functional
    • Allocated
    • Production…
  • No one size fits all
    • Depends on technology, maturity, supportability, and acquisition/contracting strategy

  11. Confidence
  • Definition
    • Degree of comfort that it is real
  • Conditions
    • Temperature, threat, states, modes
  • Operator
    • Field, engineer, scientist
  • Pedigree
    • Comfort in the data's applicability
  • With how much precision should you state results?
  • M&S fidelity: low to high
  • Missile Defense Agency examples
  • Interrelated requirements
  • Runs
    • Design of Experiments (DOE)
    • Normal / exponential distributions
    • Bayesian statistics (a worked sketch follows this slide)
  • The key verification issue: confidence per $
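One concrete way to put a number on confidence, suggested by the Bayesian statistics bullet above, is a Beta-binomial update over pass/fail verification runs. This is a hedged sketch only: the prior, the 0.85 threshold, and the run counts are illustrative assumptions, and it uses SciPy's standard beta distribution.

```python
# Bayesian confidence from pass/fail runs via a Beta-binomial update.
from scipy.stats import beta

passes, failures = 18, 2        # hypothetical verification run results
prior_a, prior_b = 1.0, 1.0     # uniform prior on the success probability

# Posterior over the probability that the system meets the requirement
post_a, post_b = prior_a + passes, prior_b + failures

# Confidence that the true success probability exceeds an assumed 0.85 threshold
confidence = beta.sf(0.85, post_a, post_b)
print(f"P(reliability > 0.85 | data) = {confidence:.3f}")
```

More runs narrow the posterior, which is one way to frame the "confidence per $" trade: each additional run buys a quantifiable amount of confidence.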

  12. Rigor
  • Definition
    • Repeatability, exactness, reality, with stated assumptions and conditions
  • Rigor
    • Requires planning and being methodical
    • May require more money up front
    • May save money by preventing rework
    • Should allow an identical redo by another organization, with the same results of method within randomness and measurement error (a repeatability sketch follows this slide)
    • Repeatability: can an outside organization repeat it?
    • Assumptions agreed with the customer as reasonable and acceptable
    • The assumed rigor could relate to the confidence the customer is willing to pay for
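A minimal sketch of what repeatability can mean in practice, assuming a seeded Monte Carlo analysis: record the seed and the customer-agreed assumptions alongside the results so another organization can redo the run identically rather than approximately. All names and values here are illustrative assumptions.

```python
# Record seed and assumptions so an outside organization can redo the run.
import json
import random

run_record = {
    "assumptions": ["nominal temperature", "threat model agreed with customer"],
    "seed": 42,
    "n_runs": 1000,
}

random.seed(run_record["seed"])
results = [random.gauss(0.0, 1.0) for _ in range(run_record["n_runs"])]

# Persist the record alongside the results; the redo is then exact.
with open("run_record.json", "w") as f:
    json.dump(run_record, f, indent=2)
```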

  13. Verification Cost
  • Method choice (a confidence-per-dollar ranking sketch follows this slide)
  • Tools and Computer-Aided Systems Engineering (CASE) tools
  • Environment
  • Standard accuracies
  • Confidence
  • Not process
  • Event efficiencies
  • Data reuse / trace
  • Envelope assumption
  • Requirements understanding
  • Environment: normal vs. 3 standard deviations vs. worst case
  • Regression / reverification
  • Navy five-interrelated-requirements example
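The "confidence per $" trade from slide 11 can be made concrete by ranking candidate verification approaches by added confidence per unit cost. This sketch is illustrative only; the requirements, methods, and numbers are invented for the example.

```python
# Rank candidate verification approaches by added confidence per dollar.
candidates = [
    # (requirement, method, added_confidence, cost_in_dollars)
    ("REQ-101", "flight test",       0.30, 2_000_000),
    ("REQ-101", "anchored M&S runs", 0.20,   150_000),
    ("REQ-102", "inspection",        0.05,     5_000),
]

# Highest confidence-per-dollar first
for req, method, conf, cost in sorted(
        candidates, key=lambda c: c[2] / c[3], reverse=True):
    print(f"{req:8s} {method:18s} confidence/$ = {conf / cost:.2e}")
```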

  14. Process Choice, Timing & Training
  • The INCOSE Handbook's Verification process task equates to the verification closure process, applied after requirements are done and a good RVTM exists
  • Factors that influence the choice
    • Experience
    • Schedule
    • Cost
    • Quality
    • Confidence
  • Process training

  15. Approaches, Perspectives, Themes
  • Trust but verify
  • Communication
    • Clear, consistent, one definition, requirements understanding
  • Quality and configuration management perspective
  • Tailoring
    • System, testability, rigor, cost, variability, stressing, and confidence
  • Risk-based prioritization
  • Plan early, at appropriate levels, a lead time away; levels of planning relate to design baseline levels
  • Maximize risk-based confidence per $

  16. Other Types of Verification
  • Requirements verification vs. SW IV&V vs. M&S VV&A
  • Similarities
    • Trying to answer the same questions for the system, software, or M&S
  • Differences
    • Narrowing scope, degree of formality, and all versus a sampling of requirements
  • Synergy
    • With care, the events and data can sometimes be shared between the three types of verification

  17. Why Verification Methods?
  • Four main reasons for using verification methods
    • Plan events; lump similar requirements together
    • Identify long-lead tool items (realize early that you need a new M&S or test tool)
    • Price / schedule the effort (early estimates of the number of requirements per verification method)
    • Take advantage of specialization (for example, test engineers or performance analysts)

  18. Methods
  • Definition
    • A top-level technique to provide data
    • The point here is to reduce method overlaps
  • Reuse with integration and validation
  • Decisions involved in selecting methods: cost, schedule, confidence, feasibility
  • Test events may include both test and demonstration methods
  • An old definition is still acceptable: tests are objective, demos are subjective
  • INCOSE Handbook definitions, paraphrased (a compact sketch follows this slide):
    • Test: active test with heavy instrumentation; quantitative requirements
    • Demonstration: active test with limited instrumentation; qualitative requirements
    • Analysis: mathematical formula or M&S
    • Inspection (Examination): passive visual of product, process, or data
    • Certification
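As a compact reference, here is a small sketch pairing each method with the paraphrased description above, plus a deliberately crude first-guess chooser. The chooser logic is an assumption for illustration, not an INCOSE rule; real selection weighs cost, schedule, confidence, and feasibility as the slide notes.

```python
# Verification methods as an enum, with a naive first-guess chooser.
from enum import Enum

class Method(Enum):
    TEST = "active, heavy instrumentation, quantitative requirements"
    DEMONSTRATION = "active, limited instrumentation, qualitative requirements"
    ANALYSIS = "mathematical formula or M&S"
    INSPECTION = "passive visual of product, process, or data"
    CERTIFICATION = "formal approval by an authority"

def default_method(quantitative: bool, physical_article: bool) -> Method:
    """Crude starting point only; refine against cost, schedule,
    confidence, and feasibility."""
    if not physical_article:
        return Method.ANALYSIS
    return Method.TEST if quantitative else Method.DEMONSTRATION

print(default_method(quantitative=True, physical_article=True))  # Method.TEST
```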

  19. Verification Plan
  • Early, at the top level; include the verification environment
  • Relationship to other plans
  • Maturity
  • A useful document
  • Tailored
  • Consistent
  • Assumptions and resources
  • Standards
  • Process and closure process
  • Dependencies, long-lead items, critical items
  • At least three levels of planning

  20. Planning for Methods
  • Confidence required vs. cost
  • Priority of requirements
  • Some obvious choices
  • Some families of requirements can be easily combined
  • Some requirements need multiple methods
  • Requirements Verification Traceability Matrix (RVTM); an example row follows this slide
  • Organizational preferences
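A minimal sketch of what an RVTM row might look like as a simple table. The column set varies by organization, and everything here (columns, IDs, file name) is an illustrative assumption.

```python
# Write a one-row RVTM as a CSV table.
import csv

RVTM_COLUMNS = ["req_id", "requirement_text", "method", "event",
                "success_criteria", "status"]

rows = [
    {"req_id": "REQ-101", "requirement_text": "The system shall ...",
     "method": "Test", "event": "Qualification test 3",
     "success_criteria": "Measured value within tolerance", "status": "Open"},
]

with open("rvtm.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=RVTM_COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
```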

  21. Process Selection Criteria
  • Quality of the process
  • Other examples
    • Most processes are of adequate quality
  • Experience with the process
  • Tools / databases for the process
  • Cost and fit for the system being verified
  • Verification environment
  • Verification risks of the top few percent of requirements

  22. Tools
  • Planning, requirements, and lead time for:
    • M&S and analysis tools
    • Test tools (screen captures, instrumentation on the test unit while flying, outside sensors, truth sources)
    • Inspection tools
    • Requirements / trace and version tools
    • CAD/CAM
    • CASE tools
  • Tool calibration to other tools, physics, or more credible events
  • Version management tools
  • Metrics

  23. Test
  • Used beyond verification for assessment and risk reduction
  • Test method and test events are different but overlap
  • Test vs. demonstration
  • Quantitative requirements
  • Most popular method and gives the best comfort
  • Can be very expensive or impossible
  • Challenge: normal, stressing, and concerns envelope
  • Challenge: requirements may interrelate, and stressing all requirements at once may be impossible

  24. Analysis
  • Probability distribution
  • Mathematical analysis
  • M&S analysis: allows lots of runs and unsafe conditions
  • Repeatable
  • Assumptions
  • Calibration / anchoring
  • M&S fidelity
  • Intended-usage accreditation
  • Cost is usually primarily the tool cost
  • Bayesian and/or Design of Experiments (DOE); a run-matrix sketch follows this slide
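To make the run-matrix and repeatability points concrete, here is a hedged sketch of a full-factorial DOE over illustrative conditions with a fixed seed. The factors, levels, and stand-in model are assumptions invented for the example, not anything from the slides.

```python
# Full-factorial DOE run matrix for an M&S analysis, with a fixed seed.
import itertools
import random

random.seed(2014)  # repeatability: same seed, same runs

temperatures = ["cold", "nominal", "hot"]
threat_levels = ["low", "high"]
modes = ["normal", "degraded"]

def simulate(temp: str, threat: str, mode: str) -> float:
    """Stand-in for the real M&S; returns a hypothetical miss distance."""
    return random.gauss(10.0, 2.0)

run_matrix = list(itertools.product(temperatures, threat_levels, modes))
results = {case: simulate(*case) for case in run_matrix}
print(f"{len(run_matrix)} runs, worst case = {max(results.values()):.1f}")
```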

  25. Detailed Analysis Plans
  • Intermediate level
  • Assumptions
  • Tools
  • Run matrix
  • Analysis method
  • Ability for someone else to repeat it
  • Resources
  • Tie to verification

  26. Inspection
  • Product or data examination
  • Preplan the qualification of the inspector and whether the data will be easily available
  • Lower-level data roll-up
  • Lowest-cost method, tied with mathematical analysis
  • More common at higher levels of the system
  • More common for design-consideration requirements
  • Key question in the trace roll-up: do the children equal the parents?

  27. Events
  • Definition
    • A specific formal set of activities to generate data
  • Reasons
    • Lump similar requirements together, use specialization, and plan events
    • Reuse with integration and validation
  • "Test" causes definitional confusion: events vs. methods vs. test/verification
    • Test serves verification, assessment, and risk-reduction purposes
  • Common types for verification
    • Flight, sea, ground, space tests
    • Qualification test
    • Integration test
    • Environmental test
    • Analysis events
      • Performance, communication, Reliability, Availability, Maintainability (RAM)
    • Inspection events
      • Quality, design, standards compliance, and lower-level data
    • Maintenance demonstration

  28. Event Types
  • Test / demonstration
    • Development
    • Qualification
    • Environment
    • RMA
    • Acceptance
    • Operational
  • Part / subsystem / system / system-of-systems levels
  • Analysis events
  • Inspection events
  • Tag-on events

  29. Verification Closure Schedule
  • Dependent on event reports and configuration
  • Key to picking the right events
  • Back-up events if you must redo
  • All requirement verifications are not created equal
    • 80/20 rule
    • Not linear
  • What is your long pole?

  30. Closure Forms / Topics for the Closure Report
  • Method(s)
  • Tool(s)
  • Success criteria
  • Test cases / scenarios
  • Test conditions
  • Assumptions / requirements understanding
  • Summary
  • Individuals involved
  • Data source
  • Configuration
  • Rationale
  • Events
  • Constraints

  31. Verification Closure Roll-Up / Trace
  • Databases
  • Tools
  • Requirements wording & context
  • Verification method definition
  • Trace (tight); a roll-up sketch follows this slide
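A minimal sketch of closure roll-up over a tight trace, under the assumption that a parent requirement closes only when its own evidence is approved and every child is closed. The trace and IDs are hypothetical.

```python
# Roll-up: a requirement is closed only if it and all descendants are closed.
from typing import Dict, List, Set

def rolled_up_closed(req_id: str,
                     children: Dict[str, List[str]],
                     closed: Set[str]) -> bool:
    """children: req_id -> child req_ids; closed: req_ids whose own
    closure report is approved."""
    if req_id not in closed:
        return False
    return all(rolled_up_closed(c, children, closed)
               for c in children.get(req_id, []))

# Hypothetical trace: SYS-1 decomposes into SUB-1 and SUB-2
children = {"SYS-1": ["SUB-1", "SUB-2"]}
closed = {"SYS-1", "SUB-1"}
print(rolled_up_closed("SYS-1", children, closed))  # False: SUB-2 still open
```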

  32. Cost
  • Automation with quality control
  • Sampling with independent repeats
  • Automated test
  • Tolerances and conditions
  • Number of runs
  • Trace and verification closure reuse
  • Combining requirements into events
  • Multipurpose events and their cost risk

  33. Redo / Reverify
  • Triggered by failure (failed or no event data) or change
  • More than just software regression testing
    • Some people call this regression
    • Cannot 100% test software
  • Portion changed or failed
  • Impact / risk
  • Replan
  • Rerun / redo
    • Rerun one run, something in between, or all original runs (a trace-based scoping sketch follows this slide)
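One way to scope a redo is to select requirements for reverification through their trace to the changed portion. This sketch assumes a simple requirement-to-component mapping, which is invented for illustration.

```python
# Select requirements to reverify based on which components changed.
from typing import Dict, List, Set

req_to_components: Dict[str, Set[str]] = {
    "REQ-101": {"guidance_sw", "nav_sensor"},
    "REQ-102": {"airframe"},
    "REQ-103": {"guidance_sw"},
}

def reverify_set(changed_components: Set[str]) -> List[str]:
    """Requirements whose trace touches any changed component."""
    return sorted(r for r, comps in req_to_components.items()
                  if comps & changed_components)

print(reverify_set({"guidance_sw"}))  # ['REQ-101', 'REQ-103']
```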

  34. Lessons Learned
  • Good requirements
  • All requirements are not equal
  • Flexibility of methods and events
  • Willingness to trade off a weighted summation of confidence and cost per requirement
  • Importance of clear, consistent communication
  • Long-term planning with the appropriate level of detail and long-lead items for verification
