Presentation Transcript


  1. Lecture 14: Quality Management • Based on: Software Engineering, A Practitioner’s Approach, 6/e, R.S. Pressman • Software Engineering Fall 2005

  2. What is quality? • Quality, simplistically, means that a product should meet its specification. • This is problematic for software systems: • There is a tension between customer quality requirements (efficiency, reliability, etc.) and developer quality requirements (maintainability, reusability, etc.); • Some quality requirements are difficult to specify unambiguously; • Software specifications are usually incomplete and often inconsistent.

  3. Quality • The American Heritage Dictionary defines quality as • “a characteristic or attribute of something.” • For software, measures of a program’s characteristics include: cyclomatic complexity, cohesion, number of function points, and lines of code.
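As a concrete illustration of the last bullet, the following sketch (not from the slides; names and the complexity heuristic are illustrative) uses Python's standard ast module to compute two of the listed measures for a module: non-blank lines of code and a rough approximation of McCabe's cyclomatic complexity per function, counted as 1 plus the number of decision points.

    import ast

    # Node types treated as decision points for the rough complexity estimate
    DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                      ast.BoolOp, ast.IfExp)

    def rough_cyclomatic_complexity(func_node):
        """Approximate McCabe complexity: 1 + number of decision points."""
        return 1 + sum(isinstance(n, DECISION_NODES)
                       for n in ast.walk(func_node))

    def size_and_complexity(source: str):
        """Return (non-blank LOC, {function name: complexity}) for a module."""
        loc = sum(1 for line in source.splitlines() if line.strip())
        tree = ast.parse(source)
        complexities = {
            node.name: rough_cyclomatic_complexity(node)
            for node in ast.walk(tree)
            if isinstance(node, ast.FunctionDef)
        }
        return loc, complexities

    if __name__ == "__main__":
        sample = (
            "def classify(x):\n"
            "    if x < 0:\n"
            "        return 'negative'\n"
            "    elif x == 0:\n"
            "        return 'zero'\n"
            "    return 'positive'\n"
        )
        print(size_and_complexity(sample))   # (6, {'classify': 3})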

  4. Quality Concepts • For software, two kinds of quality may be encountered: • Quality of design encompasses requirements, specifications, and the design of the system. • Quality of conformance is an issue focused primarily on implementation.

  5. Quality Concepts • Difference between "quality of design" and "quality of conformance": • Quality of design refers to the characteristics that designers specify for the product being constructed. • Quality of conformance is the degree to which the requirements and design specifications are followed during the manufacturing of the product.

  6. Quality Concepts • Quality is important, but if the user is not satisfied, nothing else really matters. user satisfaction = compliant product + good quality + delivery within budget and schedule

  7. Software Quality • Conformance to explicitly stated functional and performance requirements, explicitly documented development standards, and implicit characteristics that are expected of all professionally developed software.

  8. Quality Concepts • Quality control - series of inspections, reviews, and tests used to ensure conformance of a work product to its specifications • Quality assurance - consists of the auditing and reporting procedures used to provide management with data needed to make proactive decisions

  9. Cost of Quality • Prevention costs include: quality planning, formal technical reviews, test equipment, training. • Appraisal costs include: in-process and inter-process inspection, equipment calibration and maintenance, testing. • Internal failure costs include: rework, repair, failure mode analysis. • External failure costs are: complaint resolution, product return and replacement, help line support, warranty work.

  10. Software quality management • Concerned with ensuring that the required level of quality is achieved in a software product. • Involves defining appropriate quality standards and procedures and ensuring that these are followed. • Should aim to develop a ‘quality culture’ where quality is seen as everyone’s responsibility.

  11. Quality Management System (QMS) • A quality management system (QMS) is an organization-wide mechanism for building quality into projects and for managing the quality control process. Figure 1 illustrates these basic elements. • A quality management system will include a quality manual, which in turn references a number of standards, guidelines and procedures that are applied within the organization. The quality manual will require that each project develops its own quality plan, which must comply with the guidelines laid down in the quality manual. The QMS itself will usually claim conformance to an external national or international quality standard.

  12. Figure 1. The architecture of a quality management system

  13. The quality compromise • We cannot wait for specifications to improve before paying attention to quality management. • We must put quality management procedures into place to improve quality in spite of imperfect specification.

  14. Scope of quality management • Quality management is particularly important for large, complex systems. The quality documentation is a record of progress and supports continuity of development as the development team changes. • For smaller systems, quality management needs less documentation and should focus on establishing a quality culture.

  15. Role of the Software Quality Assurance (SQA) Group-I • Prepares an SQA plan for a project. • The plan identifies • evaluations to be performed • audits and reviews to be performed • standards that are applicable to the project • procedures for error reporting and tracking • documents to be produced by the SQA group • amount of feedback provided to the software project team • Participates in the development of the project’s software process description. • The SQA group reviews the process description for compliance with organizational policy, internal software standards, externally imposed standards (e.g., ISO-9001), and other parts of the software project plan.

  16. Role of the SQA Group-II • Reviews software engineering activities to verify compliance with the defined software process. • identifies, documents, and tracks deviations from the process and verifies that corrections have been made. • Audits designated software work products to verify compliance with those defined as part of the software process. • reviews selected work products; identifies, documents, and tracks deviations; verifies that corrections have been made • periodically reports the results of its work to the project manager. • Ensures that deviations in software work and work products are documented and handled according to a documented procedure. • Records any noncompliance and reports to senior management. • Noncompliance items are tracked until they are resolved.

  17. Why SQA Activities Pay Off? [Figure: relative cost to find and fix a defect, plotted on a log scale, rising from about 0.75-1.00 at requirements and design, to 1.50 at coding, 3.00 at developer testing, 10.00 at system testing, and 60.00-100.00 during field use.]

  18. Software Reviews • Purpose is to find errors before they are passed on to another software engineering activity or released to the customer. • Software engineers (and others) conduct formal technical reviews (FTRs) for software engineers. • Using formal technical reviews (walkthroughs or inspections) is an effective means for improving software quality.

  19. Bugs, Errors, Defects: Terminology • The general consensus within the software engineering community is that bugs, errors, faults, and defects are synonymous. • We will make a distinction between an error and a defect: - error – a quality problem found before the software is released to end-users; - defect – a quality problem found only after the software has been released to end-users.

  20. What Are Reviews • A meeting conducted by technical people for technical people • A technical assessment of a work product created during the software engineering process • A software quality assurance mechanism • A training ground

  21. What Reviews Are Not • A project summary or progress assessment • A meeting intended solely to impart information • A mechanism for political or personal reprisal!

  22. The Players • review leader • standards bearer (SQA) • producer • maintenance oracle • reviewer • recorder • user representative

  23. Conducting the Review 1. Be prepared: evaluate the product before the review. 2. Review the product, not the producer. 3. Keep your tone mild; ask questions instead of making accusations. 4. Stick to the review agenda. 5. Raise issues, don't resolve them. 6. Avoid discussions of style; stick to technical correctness. 7. Schedule reviews as project tasks. 8. Record and report all review results.

  24. Formal Technical Reviews (FTRs) • What is an FTR and why is one conducted? • An FTR is a software quality control activity performed by software engineers (and others). • The purpose of an FTR is to have a group of software engineers examine a discrete work product and determine whether or not the product is free of errors, omissions, or inconsistencies. • Software specifications and standards are used as the review criteria.

  25. FTR Objectives The objectives of an FTR are: • To uncover errors in function, logic, or implementation for any representation of the software; • To verify that the software under review meets its requirements; • To ensure that the software has been represented according to predefined standards; • To achieve software that is developed in a uniform manner; • To make projects more manageable. • The FTR also serves to promote backup and continuity because a number of people become familiar with parts of the software that they may not have otherwise seen.

  26. FTR Meeting Tasks I • Involves 3 to 5 people (including reviewers) • Advance preparation (no more than 2 hours per person) required • Duration of review meeting should be less than 2 hours

  27. FTR Meeting Tasks II • Focus of the review is on a discrete work product. • The review leader organizes the review meeting at the producer's request. • Reviewers ask questions that enable the producer to discover his or her own errors (the product is under review, not the producer). • The producer of the work product walks the reviewers through the product (alternative: in inspections, a "reader" who is not the producer presents the work product). • The recorder writes down any significant issues raised during the review. • Reviewers decide whether to accept or reject the work product and whether or not to require additional reviews of the product.

  28. FTR Reporting and Record Keeping I • During the FTR, a reviewer (the recorder) actively records all issues that have been raised. These are summarised at the end of the review meeting and a review issues list is produced. • In addition, a formal technical review summary report (a single page form) is completed, which answers: - What was reviewed? - Who reviewed it? - What were the findings and conclusions?

  29. FTR Reporting and Record Keeping II • It is important to establish a follow-up procedure to ensure that items on the issues list have been properly corrected, otherwise, issues raised can ‘fall between the cracks’. • One approach is to assign the responsibility for follow-up to the review leader.

  30. FTR Guidelines • Review the product, not the producer. • Set an agenda and maintain it. • Limit rebuttal and debate. • Enunciate problem areas, but don't attempt to solve every problem noted. • Take written notes. • Limit the number of participants and insist on advance preparation. • Develop a checklist for each product that is likely to be reviewed. • Allocate resources and schedule time for all reviewers. • Conduct meaningful training for all reviewers. • Review your early reviews.

  31. FTR – Should the Programming Style be Reviewed? • Assessing style is tricky and can lead to bad feelings if a reviewer is not careful when he/she makes comments concerning style. • If the producer gets the feeling that the reviewer is saying, "Do it like I do," it is likely that some resentment will arise. • In general, the review should focus on correctness.

  32. Sample Driven Reviews (SDRs) • Samples of all software engineering work products are reviewed to determine the most error-prone • Full FTR resources are focused on the likely error-prone work products based on sampling results

  33. SDRs • The fraction of the work product that is sampled must: - be representative of the work product as a whole, and - be large enough to be meaningful to the reviewer(s) doing the sampling.

  34. SDRs • The SDRs must attempt to quantify those work products that are primary targets for full FTRs. To accomplish this, the following steps are suggested (a small sketch follows this list): 1. Inspect a fraction ai of each software work product i. Record the number of faults fi found within ai. 2. Develop a gross estimate of the number of faults within work product i by multiplying fi by 1/ai. 3. Sort the work products in descending order according to the gross estimate of the number of faults in each. 4. Focus available review resources on those work products that have the highest estimated number of faults.
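A minimal sketch of steps 1 through 3, assuming each sampled work product is described by its name, the inspected fraction ai, and the fault count fi found in that sample (the data structure and names are illustrative, not taken from the slides):

    from dataclasses import dataclass

    @dataclass
    class SampledProduct:
        name: str        # work product i
        fraction: float  # a_i: fraction of the product inspected (0 < a_i <= 1)
        faults: int      # f_i: faults found within the inspected fraction

    def estimated_faults(p: SampledProduct) -> float:
        # Step 2: gross estimate = f_i * (1 / a_i)
        return p.faults / p.fraction

    def rank_for_full_ftr(products):
        # Step 3: sort in descending order of estimated fault count
        return sorted(((p.name, estimated_faults(p)) for p in products),
                      key=lambda pair: pair[1], reverse=True)

    if __name__ == "__main__":
        samples = [
            SampledProduct("requirements spec", 0.20, 4),   # estimate: 20
            SampledProduct("design model",      0.25, 3),   # estimate: 12
            SampledProduct("test plan",         0.50, 2),   # estimate: 4
        ]
        for name, estimate in rank_for_full_ftr(samples):
            print(f"{name}: ~{estimate:.0f} estimated faults")

Work products at the top of the resulting list would receive full FTR attention first (step 4).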

  35. Metrics Derived from Reviews • inspection time per page of documentation • inspection time per KLOC or FP • inspection effort per KLOC or FP • errors uncovered per reviewer hour • errors uncovered per preparation hour • errors uncovered per SE task (e.g., design) • number of minor errors (e.g., typos) • number of major errors (e.g., nonconformance to requirements) • number of errors found during preparation
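To make the list concrete, here is a small sketch (the field names and figures are illustrative, not from the slides) that derives a few of these metrics from the raw numbers a recorder might capture for a single review:

    # Raw data a recorder might capture for one review (illustrative values)
    review = {
        "pages_reviewed": 12,
        "kloc_reviewed": 1.5,        # thousands of lines of code covered
        "inspection_hours": 2.0,     # duration of the review meeting
        "preparation_hours": 4.5,    # total preparation time, all reviewers
        "reviewers": 4,
        "minor_errors": 9,           # e.g., typos
        "major_errors": 3,           # e.g., nonconformance to requirements
    }

    errors = review["minor_errors"] + review["major_errors"]
    reviewer_hours = review["inspection_hours"] * review["reviewers"]

    metrics = {
        "inspection time per page": review["inspection_hours"] / review["pages_reviewed"],
        "inspection time per KLOC": review["inspection_hours"] / review["kloc_reviewed"],
        "errors per reviewer hour": errors / reviewer_hours,
        "errors per preparation hour": errors / review["preparation_hours"],
    }

    for name, value in metrics.items():
        print(f"{name}: {value:.2f}")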

  36. Statistical Quality Assurance • Statistical quality assurance implies the following steps (a sketch of steps 1 and 3 follows): 1. Information about software defects is collected and categorized. 2. Each defect is traced back to its cause. 3. Using the Pareto principle (80% of the defects can be traced to 20% of the causes), isolate the "vital few" defect causes. 4. Move to correct the problems that caused the defects in the "vital few".
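The sketch below (illustrative, not from the slides) shows steps 1 and 3 in code: defects are tallied by cause, and causes are selected in descending order until they account for roughly 80% of all recorded defects, yielding the "vital few".

    from collections import Counter

    def vital_few(defect_causes, threshold=0.80):
        """Return the smallest set of causes that accounts for `threshold`
        of all recorded defects (Pareto analysis)."""
        counts = Counter(defect_causes)       # step 1: categorize
        total = sum(counts.values())
        selected, covered = [], 0
        for cause, count in counts.most_common():   # step 3: largest causes first
            selected.append((cause, count))
            covered += count
            if covered / total >= threshold:
                break
        return selected

    if __name__ == "__main__":
        # Illustrative defect log: one entry per defect, tagged with its cause
        log = (["incomplete spec"] * 34 + ["misinterpreted requirement"] * 21 +
               ["logic error"] * 12 + ["interface mismatch"] * 8 +
               ["typo in constant"] * 3)
        for cause, count in vital_few(log):
            print(f"{cause}: {count} defects")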

  37. Statistical SQA [Figure: product & process → collect information on all defects → find the causes of the defects → move to provide fixes for the process; measurement throughout yields "... an understanding of how to improve quality ..."]

  38. Six-Sigma for Software Engineering I • Six Sigma is the most widely used strategy for statistical quality assurance in industry today. • Originally popularized by Motorola in the 1980s.

  39. Six-Sigma for Software Engineering II • The term “six sigma” is derived from six standard deviations—3.4 instances (defects) per million occurrences—implying an extremely high quality standard.

  40. Six-Sigma for Software Engineering Steps The Six Sigma methodology defines the following steps: • Define customer requirements and deliverables and project goals via well-defined methods of customer communication. • Measure the existing process and its output to determine current quality performance (collect defect metrics). • Analyze defect metrics and determine the vital few causes. • Improve the process by eliminating the root causes of defects. • Control the process to ensure that future work does not reintroduce the causes of defects. This is sometimes referred to as the DMAIC (define, measure, analyze, improve, and control) method.

  41. Software Reliability • Defined as the probability of failure-free operation of a computer program in a specified environment for a specified time period • Can be measured directly and estimated using historical and developmental data (unlike many other software quality factors) • Software reliability problems can usually be traced back to errors in design or implementation.

  42. Software Reliability Illustration • Program X is estimated to have a reliability of 0.96 over 8 elapsed processing hours. • In other words, if program X were to be executed 100 times and require a total of 8 hours of processing time (execution time), it is likely to operate correctly (without failure) 96 times.

  43. Failure • In the context of any discussion of software quality and reliability: Failure is nonconformance to software requirements.

  44. Software Reliability • A simple measure of reliability is mean-time-between-failure (MTBF), where MTBF = MTTF + MTTR MTTF - mean-time-to-failure MTTR - mean-time-to-repair • Many argue that MTBF is a far more useful measure than defects/KLOC or defects/FP. An end-user is concerned with failures, not with the total error count. Because each defect contained within a program does not have the same failure rate, the total defect count provides little indication of the reliability of a system.

  45. Software Availability • Software availability is the probability that a program is operating according to requirements at a given point in time and is defined as Availability = [MTTF/(MTTF + MTTR)] x 100% • The MTBF reliability measure is equally sensitive to MTTF and MTTR. The availability measure is somewhat more sensitive to MTTR, an indirect measure of the maintainability of software.
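A minimal sketch, assuming we have matching lists of observed time-to-failure and time-to-repair values (the data here is illustrative), that computes the quantities used in the two slides above:

    def mttf(times_to_failure):      # mean time to failure
        return sum(times_to_failure) / len(times_to_failure)

    def mttr(times_to_repair):       # mean time to repair
        return sum(times_to_repair) / len(times_to_repair)

    def mtbf(times_to_failure, times_to_repair):
        # MTBF = MTTF + MTTR
        return mttf(times_to_failure) + mttr(times_to_repair)

    def availability(times_to_failure, times_to_repair):
        # Availability = [MTTF / (MTTF + MTTR)] x 100%
        up = mttf(times_to_failure)
        down = mttr(times_to_repair)
        return up / (up + down) * 100.0

    if __name__ == "__main__":
        failures = [120.0, 95.0, 150.0]   # hours of operation before each failure
        repairs = [2.0, 4.0, 3.0]         # hours spent repairing after each failure
        print(f"MTBF: {mtbf(failures, repairs):.1f} h")
        print(f"Availability: {availability(failures, repairs):.1f} %")

Because availability divides uptime by total time, shortening repairs (MTTR) raises availability even when the failure rate is unchanged, which is why the slide calls it an indirect measure of maintainability.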

  46. MTBF Criticisms • For hardware the MTBF concept is based on statistical error data that occurs due to physical wear in a product. • In general, when a failure does occur in hardware, the failed part is replaced with a spare. However, when an error occurs for software, a design change is made to correct it. • The change may create side effects that generate other errors. Therefore, the statistical validity of MTBF for software is suspect.

  47. Software Safety • Software safety is a software quality assurance activity that focuses on the identification and assessment of potential hazards that may affect software negatively and cause an entire system to fail. • If hazards can be identified early in the software process, software design features can be specified that will either eliminate or control potential hazards. • Software reliability involves determining the likelihood that a failure will occur, while software safety examines the ways in which failures may result in conditions that can lead to a mishap.

  48. Mistake-Proofing I • Poka-yoke (mistake-proofing) devices—mechanisms that lead to • the prevention of a potential quality problem before it occurs or • the rapid detection of quality problems if they are introduced. • For example, the ignition switch for a motorcar will not work if an automatic transmission is in gear (a prevention device); a car’s warning beep will sound if the seat belts are not buckled (a detection device).

  49. Mistake-Proofing II • An effective poka-yoke device exhibits a set of common characteristics: • It is simple and cheap. If a device is too complicated or expensive, it will not be cost effective. • It is part of the process. That is, the poka-yoke device is integrated into an engineering activity. • It is located near the process task where the mistakes occur. Thus, it provides rapid feedback and error correction. • In software, poka-yoke devices often take the form of small scripts that automatically check a work product for a specific class of mistake (a sketch follows).
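As an illustrative sketch of such a script (the file names, layout, and helper names are assumptions, not taken from the slides), a detection-style poka-yoke might verify before release that every message key in a baseline resource file also appears in each localized resource file:

    import json
    import sys
    from pathlib import Path

    def missing_keys(baseline_path: str, locale_dir: str) -> dict:
        """Compare a baseline message catalog (e.g., messages_en.json) against
        every other *.json catalog in locale_dir; report keys that are missing."""
        baseline = set(json.loads(Path(baseline_path).read_text()).keys())
        report = {}
        for catalog in Path(locale_dir).glob("*.json"):
            if catalog.name == Path(baseline_path).name:
                continue
            keys = set(json.loads(catalog.read_text()).keys())
            gaps = sorted(baseline - keys)
            if gaps:
                report[catalog.name] = gaps
        return report

    if __name__ == "__main__":
        # Illustrative usage: python check_messages.py messages_en.json locales/
        problems = missing_keys(sys.argv[1], sys.argv[2])
        for name, gaps in problems.items():
            print(f"{name} is missing: {', '.join(gaps)}")
        sys.exit(1 if problems else 0)   # a non-zero exit blocks the release step

Run as part of the build, the script is simple, integrated into the process, and positioned close to where the mistake is made, matching the three characteristics listed above.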

  50. ISO 9000 Quality Standards • An international set of standards for quality management. • Applicable to a range of organisations from manufacturing to service industries. • ISO 9001 applicable to organisations which design, develop and maintain products. • ISO 9001 is a generic model of the quality process that must be instantiated for each organisation using the standard.
