
Requirements Engineering



  1. Requirements Engineering Quality Management

  2. Software Quality Management • What is Quality in Software? • The end product should meet the specification • Issues • Bad or imperfect specification • Non-functional requirements • What is Quality Management? • Defining standards and policies to be followed in development • Checking to see that they are followed • In addition, developing a quality culture • Functions of Quality Management • Quality Assurance – establishing the framework • Quality Planning – the use of the framework in planning specific projects • Quality Control – the process by which compliance with the standards and processes is ensured

  3. Software Quality • Quality Management is a separate process • Needs independence • From budget • From schedule • From project management chain • From Product Development Groups • ISO 9000 a guide to quality process • ISO 9001 general applicability to product development • ISO 9000-3 interprets ISO 9000 for software development • Deliverables from the software process are submitted to QC for review

  4. ISO 9000 and Quality • ISO 9000 supplies quality models • A subset is developed as the organizational quality manual • which documents the organization's quality process • The subset is the basis used to develop the project quality plan • Project quality management uses the plan to enforce the organizational standards • See text for references to ISO materials

  5. ISO 9001:2000 Standard • ISO 9001:2000 is the quality assurance standard that applies to software engineering. • The standard contains 20 requirements that must be present for an effective quality assurance system. • The requirements delineated by ISO 9001:2000 address topics such as • management responsibility, quality system, contract review, design control, document and data control, product identification and traceability, process control, inspection and testing, corrective and preventive action, control of quality records, internal quality audits, training, servicing, and statistical techniques.

  6. Quality assurance and standards • QA – defines the framework for achieving quality software • Defines the standards • Product • Process • Provides • Best practices – makes the success of others available • A checklist to judge whether standards have been followed • Continuity – institutional memory • Sources • IEEE • ANSI • US DoD • NATO

  7. Basic definitions • A failure is an unacceptable behaviour exhibited by a system • The frequency of failures measures the reliability • An important design objective is to achieve a very low failure rate and hence high reliability. • A failure can result from a violation of an explicit or implicit requirement • A defect is a flaw in any aspect of the system that contributes, or may potentially contribute, to the occurrence of one or more failures • It might take several defects to cause a particular failure • An error is a slip-up or inappropriate decision by a software developer that leads to the introduction of a defect

  8. Effective and Efficient Testing • To test effectively, you must use a strategy that uncovers as many defects as possible. • To test efficiently, you must find the largest possible number of defects using the fewest possible tests • Testing is like detective work: • The tester must try to understand how programmers and designers think, so as to better find defects. • The tester must not leave anything uncovered, and must be suspicious of everything. • It does not pay to take an excessive amount of time; tester has to be efficient.

  9. Standards • Documents require standards for • Process – production, i.e. from creation through final print • Documents – structure and presentation • Identification • Structure • Presentation – fonts, styles, logos, etc. • Update – version control • Interchange – exchange compatibility • Process • How do we improve quality in the product? • Feedback • Standardization

  10. Quality Planning and Control • Plan • Developed early • Addresses the most important software quality attributes • Safety • Security • Reliability • Etc. • Control • Quality reviews • Design or program inspection – errors in requirements, design or code • Progress reviews – schedule, budget • Quality reviews cover the whole deliverable package

  11. Measurement and Metrics • Metrics • Not widely used in the software industry • Lack of standards • Lack of standard processes • Control – measures associated with the process • Time to repair defects • Time to modify or enhance • Predictor – measures associated with the product • Cyclomatic count • Fog index • Size • Measurement process • Choose measurements • Select components • Measure • Identify anomalous measurements • Analyze the anomalous components
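As a rough illustration of a predictor metric, the sketch below estimates a cyclomatic count for a piece of Python source by walking its syntax tree and counting decision points. The choice of node types and the +1 base are simplifying assumptions; it is not a full McCabe implementation.

```python
import ast

# Node types counted as decision points (a simplifying assumption; a full
# McCabe count also handles comprehensions, assertions, etc.).
_DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                   ast.IfExp, ast.BoolOp)

def cyclomatic_complexity(source: str) -> int:
    """Rough cyclomatic count: 1 + number of decision points in the code."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, _DECISION_NODES) for node in ast.walk(tree))
    return 1 + decisions

if __name__ == "__main__":
    snippet = (
        "def classify(x):\n"
        "    if x < 0:\n"
        "        return 'negative'\n"
        "    elif x == 0:\n"
        "        return 'zero'\n"
        "    return 'positive'\n"
    )
    print(cyclomatic_complexity(snippet))  # prints 3: two branches plus the base path
```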

  12. Product Metrics • Product Metrics • Concerned with the software itself • Dynamic – measurements made during execution • Static – made of the representations • Design • Program • Documentation • Dynamic assess • Efficiency • Reliability • Relatively easy to measure • Static • Complexity • Understandability • Maintainability

  13. Defect testing • Goal – expose latent defects in the system before it is delivered • A successful defect test causes the system to perform incorrectly • Demonstrates the presence of program faults • Test case • Specification of input and expected output • Statement of what is being tested • Test data • Inputs devised to test the code • Test thoroughness • Exhaustive testing is not possible • Set by the policies of the organization, not the development team
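To make the slide's terms concrete, here is a minimal, hypothetical example in Python's unittest style: the test case states what is being tested, the test data are the devised inputs, and the assertions encode the expected outputs. The `discount` function is invented purely for illustration.

```python
import unittest

def discount(price: float, loyalty_years: int) -> float:
    # Hypothetical unit under test: 10% off for customers of 5+ years.
    rate = 0.10 if loyalty_years >= 5 else 0.0
    return round(price * (1 - rate), 2)

class DiscountTests(unittest.TestCase):
    """What is being tested: the discount calculation."""

    def test_loyal_customer_gets_ten_percent_off(self):
        # Test data: boundary input (exactly 5 years); expected output: 90.00
        self.assertEqual(discount(100.0, 5), 90.0)

    def test_new_customer_pays_full_price(self):
        # Test data: 1 year of loyalty; expected output: unchanged price
        self.assertEqual(discount(100.0, 1), 100.0)

if __name__ == "__main__":
    unittest.main()
```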

  14. Documentation defects • Defect: • The software has a defect if the user manual, reference manual or on-line help: • gives incorrect information • fails to give information relevant to a problem. • Testing strategy: • Examine all the end-user documentation, making sure it is correct. • Work through the use cases, making sure that each of them is adequately explained to the user.

  15. Writing Formal Test Cases and Test Plans • A test case is an explicit set of instructions designed to detect a particular class of defect in a software system. • A test case can give rise to many tests. • Each test is a particular running of the test case on a particular version of the system.

  16. Test plans • A test plan is a document that contains a complete set of test cases for a system • Along with other information about the testing process. • The test plan is one of the standard forms of documentation. • If a project does not have a test plan: • Testing will inevitably be done in an ad-hoc manner. • Leading to poor quality software. • The test plan should be written long before the testing starts. • You can start to develop the test plan once you have developed the requirements.

  17. Information to include in a formal test case A. Identification and classification: • Each test case should have a number, and may also be given a descriptive title. • The system, subsystem or module being tested should also be clearly indicated. • The importance of the test case should be indicated. B. Instructions: • Tell the tester exactly what to do. • The tester should not normally have to refer to any documentation in order to execute the instructions. C. Expected result: • Tells the tester what the system should do in response to the instructions. • The tester reports a failure if the expected result is not encountered. D. Cleanup (when needed): • Tells the tester how to make the system go ‘back to normal’ or shut down after the test.
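One possible way to capture the fields listed above in a test-management tool is a simple record type. The sketch below is only an illustrative data structure, with field names and the example values assumed for illustration, not a prescribed format.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FormalTestCase:
    # A. Identification and classification
    case_id: str                      # e.g. "TC-042"
    title: str
    system_under_test: str            # system, subsystem or module
    importance: int                   # 1 = critical, 2 = general, 3 = detailed
    # B. Instructions the tester follows verbatim
    instructions: List[str] = field(default_factory=list)
    # C. What the system should do; a mismatch is reported as a failure
    expected_result: str = ""
    # D. Optional cleanup to return the system to normal after the test
    cleanup: Optional[str] = None

example = FormalTestCase(
    case_id="TC-042",
    title="Login rejects an unknown user",
    system_under_test="Authentication subsystem",
    importance=2,
    instructions=["Open the login page", "Enter user 'nobody' with password 'x'"],
    expected_result="An 'unknown user or password' message is shown",
    cleanup="Close the browser session",
)
```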

  18. Levels of importance of test cases • Level 1: • First-pass critical test cases. • Designed to verify the system runs and is safe. • If these fail, no further testing is possible. • Level 2: • General test cases. • Verify that day-to-day functions work correctly. • Failures here still permit testing of other aspects of the system. • Level 3: • Detailed test cases. • Test requirements that are of lesser importance. • If these fail, the system still functions most of the time but has not yet met its quality objectives.

  19. Determining test cases by enumerating attributes • It is important that the test cases test every aspect of the requirements. • Each detail in the requirements is called an attribute. • An attribute can be thought of as something that is testable. • A good first step when creating a set of test cases is to enumerate the attributes. • A way to enumerate attributes is to circle all the important points in the requirements document. • However there are often many attributes that are implicit.

  20. Software Inspections • Software Inspections • Program inspections • 1970’s • Line by line code review • Defect detection not enhancement • Team of four to six usually • Author • Reader • Tester • Moderator • Maybe scribe and chief moderator • Requires • Precise spec • Members familiar with standards • Up to date set of code • About 2 hours

  21. Reviews & Inspections ... there is no particular reason why your friend and colleague cannot also be your sternest critic. Jerry Weinberg

  22. What Are Reviews? • a meeting conducted by technical people for technical people • a technical assessment of a work product created during the software engineering process • a software quality assurance mechanism • a training ground

  23. What Reviews Are Not • A project summary or progress assessment • A meeting intended solely to impart information • A mechanism for political or personal reprisal!

  24. The Players • Review leader • Standards bearer (SQA) • Producer • Maintenance oracle • Reviewer • Recorder • User representative

  25. Conducting the Review 1. Be prepared – evaluate the product before the review 2. Review the product, not the producer 3. Keep your tone mild; ask questions instead of making accusations 4. Stick to the review agenda 5. Raise issues, don't resolve them 6. Avoid discussions of style – stick to technical correctness 7. Schedule reviews as project tasks 8. Record and report all review results

  26. Sample-Driven Reviews (SDRs) • SDRs attempt to quantify those work products that are primary targets for full FTRs. To accomplish this … • Inspect a fraction a_i of each software work product i. Record the number of faults f_i found within that sample. • Develop a gross estimate of the number of faults within work product i by multiplying f_i by 1/a_i. • Sort the work products in descending order according to the gross estimate of the number of faults in each. • Focus available review resources on those work products that have the highest estimated number of faults.
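A minimal sketch of the SDR arithmetic, assuming invented sample data: each sampled fault count f_i is scaled by 1/a_i to give a gross estimate for work product i, and the products are then sorted so review effort goes to the likely worst offenders first.

```python
# Hypothetical sample data: (work product, fraction inspected a_i, faults found f_i)
samples = [
    ("payment_module", 0.20, 6),
    ("report_generator", 0.50, 2),
    ("login_screen", 0.25, 1),
]

# Gross estimate of total faults in each product: f_i * (1 / a_i)
estimates = [(name, faults / fraction) for name, fraction, faults in samples]

# Descending order: the entries at the top get full technical reviews first.
for name, estimate in sorted(estimates, key=lambda e: e[1], reverse=True):
    print(f"{name}: ~{estimate:.0f} estimated faults")
```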

  27. Metrics Derived from Reviews • Inspection time per page of documentation • Inspection time per KLOC or FP (or use case) • Inspection effort per KLOC or FP (or use case) • Errors uncovered per reviewer hour • Errors uncovered per preparation hour • Errors uncovered per SE task (e.g., requirements) • Number of minor errors (e.g., typos) • Number of major errors (e.g., nonconformance to the user wants/needs vision) • Number of errors found during preparation
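A small worked example, using made-up numbers, of how a few of these metrics fall out of raw inspection records:

```python
# Hypothetical numbers from a single inspection meeting.
pages_inspected = 40
kloc_inspected = 2.5
reviewer_hours = 3 * 2.0        # three reviewers, two hours each in the meeting
preparation_hours = 3 * 1.5     # individual preparation before the meeting
errors_found = 18

print("inspection time per page  :", reviewer_hours / pages_inspected, "hours")
print("inspection effort per KLOC:", (reviewer_hours + preparation_hours) / kloc_inspected, "hours")
print("errors per reviewer hour  :", errors_found / reviewer_hours)
print("errors per preparation hr :", errors_found / preparation_hours)
```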

  28. Statistical SQA • Applies to both product and process • Collect information on all defects • Find the causes of the defects • Move to provide fixes for the process • Measurement … leads to an understanding of how to improve quality …
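Finding the causes of defects is often done with a Pareto-style tally; the sketch below counts a hypothetical defect log by assigned cause and reports cumulative percentages.

```python
from collections import Counter

# Hypothetical defect log: (defect id, assigned cause)
defect_log = [
    (1, "incomplete specification"), (2, "misinterpreted requirements"),
    (3, "incomplete specification"), (4, "coding error"),
    (5, "incomplete specification"), (6, "misinterpreted requirements"),
]

counts = Counter(cause for _, cause in defect_log)
total = sum(counts.values())

# Causes in descending order, with cumulative percentage (Pareto view).
cumulative = 0
for cause, n in counts.most_common():
    cumulative += n
    print(f"{cause}: {n} defects ({100 * cumulative / total:.0f}% cumulative)")
```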

  29. Six-Sigma for Software Engineering • The term “six sigma” is derived from six standard deviations – 3.4 defects per million opportunities – implying an extremely high quality standard. • The Six Sigma methodology defines these core steps: • Define customer requirements, deliverables and project goals via well-defined methods of customer communication • Measure the existing process and its output to determine current quality performance (collect defect metrics) • Analyze defect metrics and determine the vital few causes • Improve the process by eliminating the root causes of defects • Control the process to ensure that future work does not reintroduce the causes of defects
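The 3.4-per-million figure corresponds to a six-sigma process under the conventional 1.5-sigma shift. The sketch below computes defects per million opportunities (DPMO) and the corresponding sigma level using only the standard library; the 1.5 shift and the sample figures are assumptions for illustration.

```python
from statistics import NormalDist

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value: float, shift: float = 1.5) -> float:
    """Short-term sigma level, assuming the conventional 1.5-sigma shift."""
    yield_fraction = 1 - dpmo_value / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + shift

if __name__ == "__main__":
    value = dpmo(defects=34, units=10_000, opportunities_per_unit=1)
    print(round(value), "DPMO ->", round(sigma_level(value), 2), "sigma")
    # 3.4 DPMO would come out at roughly 6.0 sigma, matching the slide.
```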

  30. Inspecting compared to testing • Both testing and inspection rely on different aspects of human intelligence. • Testing can find defects whose consequences are obvious but which are buried in complex code. • Inspecting can find defects that relate to maintainability or efficiency. • The chances of mistakes are reduced if both activities are performed.

  31. Testing or inspecting, which comes first? • It is important to inspect software before extensively testing it. • The reason for this is that inspecting allows you to quickly get rid of many defects. • If you test first, and inspectors recommend that redesign is needed, the testing work has been wasted. • There is a growing consensus that it is most efficient to inspect software before any testing is done. • Even before developer testing

  32. Quality Assurance in General • Root cause analysis • Determine whether problems are caused by such factors as • Lack of training • Schedules that are too tight • Building on poor designs or reusable technology

  33. Measure quality and strive for continual improvement • Things you can measure regarding the quality of a software product, and indirectly the quality of the process • The number of failures encountered by users. • The number of failures found when testing a product. • The number of defects found when inspecting a product. • The percentage of code that is reused. • More is better, but don’t count clones. • The number of questions posed by users to the help desk. • As a measure of usability and the quality of documentation.

  34. Post-mortem analysis • Looking back at a project after it is complete, or after a release, • You look at the design and the development process • Identify those aspects which, with benefit of hindsight, you could have done better • You make plans to do better next time

  35. Meaning of “V&V” (Boehm) • Verification: are we building the thing right? • Validation: are we building the right thing?

  36. Process standards • The personal software process (PSP): • Defines a disciplined approach that a developer can use to improve the quality and efficiency of his or her personal work. • One of the key tenets is personally inspecting your own work. • The team software process (TSP): • Describes how teams of software engineers can work together effectively. • The software capability maturity model (CMM): • Contains five levels; organizations start at level 1 and, as their processes improve, can move up towards level 5. • ISO 9000-2: • An international standard that lists a large number of things an organization should do to improve their overall software process.

  37. The PSP Evolution (Humphrey) • Each stage adds skills to the prior stage; the .1 variants add capability at the same level (adapted from [Hu1]) • PSP0 – current personal process; basic measurements • PSP0.1 – coding standards; process improvement proposal; size measurement • PSP1 – size estimation; test report • PSP1.1 – task planning; schedule planning • PSP2 – code reviews; design reviews • PSP2.1 – design templates • PSP3 – cyclic development • PSP0 through PSP2.1 target programs of 100’s of lines; PSP3 scales to 1000’s of lines

  38. TSP Objectives 1 (Humphrey) • Build self-directed teams • 3-20 engineers • establish own goals • establish own process and plans • track work • Show managers how to manage teams • coach • motivate • sustain peak performance

  39. TSP Objectives 2 (Humphrey) • Accelerate CMM improvement • make CMM 5 “normal” • “Provide improvement guidelines to high-maturity organizations” • “Facilitate university teaching of industrial-grade teams”

  40. Background - Capability Maturity Model for Software • 1986 Software Engineering Institute, and the Mitre Corp. begin to develop a process maturity framework to improve software processes • 1987 description of the framework • Assessment • Evaluation • 1991 evolved to the Capability Maturity Model for Software (CMM v1.0) • Recommended practices in key process areas (KPA’s) • Gain control of processes for developing and maintaining software • Evolve to a culture of software engineering and management excellence • Current

  41. What is the CMM? • Concept: • The application of process management and quality improvement concepts to software development and maintenance • Model: • A model for organizational improvement • Guidelines: • A guide for evolving toward a culture of engineering excellence • Basis for Measurement: • The underlying structure for reliable and consistent software process assessments, software capability evaluations, and interim profiles

  42. Maturity Levels are a Framework for Process Improvement • Based on Continuous Process Improvement: • based on many small, evolutionary steps rather than revolutionary innovations. • Plateau: • A maturity level is a well-defined evolutionary plateau toward achieving a mature software process. • Foundation: • Each maturity level provides a layer in the foundation for continuous process improvement. • Priority Order: • The levels also help an organization prioritize its improvement efforts.

  43. Symptoms of Process Failure • Commitments consistently missed • Late delivery • Last-minute crunches • Spiraling costs • No management visibility into progress • You’re always being surprised. • Quality problems • Too much rework • Functions do not work correctly. • Customer complaints after delivery • Poor morale • People frustrated • Is anyone in charge?

  44. Settling for Less • Do these statements sound familiar? If they do, your organization may be settling for less than it is capable of and may be a good candidate for process improvement. • A senior software manager (industry): “I'd rather have it wrong than have it late. We can always fix it later.” • A program manager (government): “The bottom line is schedule. My promotions and raises are based on meeting schedule first and foremost.”

  45. The Process Management Premise • The quality of a system is highly influenced by the quality of the process used to acquire, develop, and maintain it. • This premise implies a focus on processes as well as on products. • This is a long-established premise in manufacturing (and is based on TQM principles as taught by Shewhart, Juran, Deming, and Humphrey). • Belief in this premise is visible worldwide in quality movements in manufacturing and service industries (e.g., ISO standards).

  46. What is a CMM?

  47. CMM (Software) Overview • SEI’s Vision: to bring engineering discipline to the development and maintenance of software products • Desired Result: higher quality – better products for a better price; predictability – function/quality, on time, within budget • Methodology to achieve that desired result: 1. Identify Current State: know your current Capability Maturity Level 2. Identify Desired State: understand the description of the next level 3. Reduce the Gap: plan, implement, and institutionalize the key practices of the next level; repeat until continuous optimization is part of the culture

  48. Assessment vs Evaluation • A software process assessment is an appraisal by a trained team of software professionals to determine • the state of an organization's current software process, • the high-priority software process-related issues facing an organization, • and to obtain the organizational support for software process improvement. • A software capability evaluation is an appraisal by a trained team of professionals to identify contractors who are qualified to perform the software work or to monitor the state of the software process used on an existing software effort.

  49. A Foundation, Not a Destination • The optimizing level (Level 5) is not the destination of process management. • The destination is better products for a better price: economic survival. • The optimizing level is a foundation for building an ever-improving capability.
