
Software Quality



  1. Software Quality Chapter 20-21

  2. Software Quality • How can you tell if software has high quality? • How can we measure the quality of software? • How can we make sure software has high quality?

  3. Perspective on quality • Customer • system doesn't crash • system follows documentation • system is logical and easy to use • Developer • system is easy to change • system is easy to understand • system is pleasant to work on

  4. Characteristics of Software Quality • Some external characteristics that the user is aware of: • Correctness • Usability • Efficiency • Reliability • Integrity • extent to which it prevents unauthorized access to data • Adaptability • Accuracy • Robustness

  5. Characteristics of Software Quality • Some internal characteristics of software that are code-centered: • Maintainability • Flexibility • Portability • Reusability • Readability • Testability • Understandability

  6. Total Quality Management • Factories • Goal is for every item coming off the assembly line to be perfect • Management, production, engineering, QA • Everyone is involved in quality • Develop a reliable, repeatable process • Continuously improve the process

  7. Failure vs. flaw • Failure - program didn’t work right • Flaw - mistake in the text of the program • Failure analysis - what flaw caused this failure? • Flaw analysis - what is wrong with our process that allowed this flaw to be created and not detected?

  8. Failure costs • Internal • rework • repair • failure analysis • External • resolving complaints • returning and replacing product • help line

  9. Prevention costs • Prevention • planning • managing and collecting information • reviews • Appraisal • inspection • Testing

  10. Cost of fixing an error (relative cost of fixing the same flaw, by the phase in which it is found) • Requirements: 1 time • Design: 3-6 times • Code: 10 times • Dev. Test: 15-40 times • System Test: 30-70 times • Field operation: 40-1000 times
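Read as a multiplier table, the slide's point can be checked with quick arithmetic. A sketch (the ranges come from the slide; taking the midpoint of each range is an illustrative simplification of my own):

```python
# Relative cost of fixing the same flaw, by the phase in which it is found.
# Ranges are from the slide; midpoints are an illustrative simplification.
multipliers = {
    "Requirements": 1,
    "Design": (3 + 6) / 2,               # slide: 3-6 times
    "Code": 10,
    "Dev. Test": (15 + 40) / 2,          # slide: 15-40 times
    "System Test": (30 + 70) / 2,        # slide: 30-70 times
    "Field operation": (40 + 1000) / 2,  # slide: 40-1000 times
}

for phase, cost in multipliers.items():
    print(f"{phase:16} {cost:7.1f}x")
```

Even with the conservative midpoints, a flaw that escapes to the field costs hundreds of times what it would have cost to fix during requirements.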

  11. Johnson’s Law If you don’t test for it, your system doesn’t have it. Is it easy to use? Easy to maintain? Does it crash? Does it match the documentation? Does it make customers happy?

  12. Ways not to improve quality • Say “Be more careful!” • Say “Quality is important.” • Find out whose fault it is and fire him.

  13. How to improve quality • Measure and compare • Determine root cause of problems • Create ways to eliminate problems

  14. If you don’t see it, it doesn’t exist • Measure quality over time (metrics) • Display in a public place • Make quality goals, then check to see if you meet them.

  15. How to appraise quality • Requirements • reviews by customers • prototyping • Analysis and design models • formal reviews, inspections • Current system • bug reports • user tests • surveys

  16. Bug tracking • Keep track of • who reported the bug (the failure) • description of the failure • severity • the flaw that caused this failure • who is repairing it • the repair
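The fields listed above map naturally onto a small record type. A minimal sketch in Python (the class and field names are my own, chosen to mirror the slide, not taken from any particular tracker):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BugReport:
    """One tracked failure, with the fields the slide lists."""
    reporter: str                   # who reported the bug (the failure)
    description: str                # description of the failure
    severity: int                   # e.g. 1 = critical ... 4 = cosmetic
    flaw: Optional[str] = None      # the flaw that caused this failure
    assignee: Optional[str] = None  # who is repairing it
    repair: Optional[str] = None    # the repair

# Severity is known at report time; flaw and repair are filled in later.
bug = BugReport(reporter="alice", description="crash on empty input", severity=1)
bug.assignee = "bob"
```

Making the flaw and repair fields optional reflects the workflow: a failure is reported first, and its root-cause flaw is only identified during failure analysis.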

  17. Bug tracking • Use information about failures to estimate reliability • Compare • the critical nature of the failure • the iteration in which the failure was discovered • the module that had the flaw

  18. Use quality information to make decisions • “Must repair all level 1 failures before shipping” • “Half of all level 1 and 2 failures in the alpha release were in the Call Processing module; we should rewrite it.” • “Half of all level 1 and 2 defects found in the design reviews were in Call Processing; we should rewrite it.”

  19. Bug tracking • Discover the flaw (defect) that caused each bug • Categorize flaws • Look at categories with the most flaws and improve your process to eliminate them.
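Categorizing flaws and finding the worst category, as the slide suggests, amounts to counting a multiset. A sketch (the category names and data are invented for illustration):

```python
from collections import Counter

# Hypothetical flaw categories recorded in the bug tracker.
flaws = [
    "off-by-one", "missing null check", "missing null check",
    "wrong requirement", "missing null check", "off-by-one",
]

# Tally flaws per category and pick the most common one.
by_category = Counter(flaws)
worst_category, count = by_category.most_common(1)[0]
print(f"Most common flaw category: {worst_category} ({count} occurrences)")
```

The point of the exercise is the last step the slide names: once the dominant category is known, change the process (a checklist item, a lint rule, a design review question) so that category stops recurring.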

  20. SQA Manager • Responsible for SQA policy • Develops testing plan • Checks that plan is being followed • Develops review process • Trains reviewers • Monitors reviews • Develops customer survey plan • …

  21. Technical Reviews • A way to evaluate the quality of requirements, designs, and software • A way to improve the quality of requirements, designs, and software • A way to educate new developers and ensure that developers are consistent • Proven to be cost-effective!

  22. Main goal: evaluate quality • Produce a report describing • potential problems • summary of overall quality • pass/fail • Evaluated by expert outsiders • must know enough • shouldn’t know too much

  23. Secondary goal: improve quality • Find flaws • Enforce standards • Improve standards • Provide feedback to management

  24. Review methods • The most important step in improving software engineering performance • Various review methods • Inspection • Walk-throughs • Personal reviews

  25. Inspections • A structured procedure for the team review of a software product • Has a formal structure, and each participant has a defined role • Typical inspection process • Preparation • Inspection meeting • Repair and report • To be more effective, inspections are measured, a report is produced, and the action items are tracked to closure.

  26. Walk-Throughs • A less formal process that usually follows a presentation format • A developer walks through how the program performs, while the audience raises issues and asks questions. • There is generally no advance preparation or follow-up.

  27. Personal Reviews • You examine your own products. • The goal is to find and fix as many defects as possible before implementing, inspecting, compiling, or testing the program. • As the PSP data show, a little time spent reviewing code can save the much longer time that would otherwise be spent debugging and fixing during compile and test.

  28. Code Reviews are More Efficient than Testing (1) • In reviews, you find the defects directly. • When you review a program, you know where you are and the results its logic is supposed to produce. • You establish logical relationships and construct a mental context of the program’s behavior. • In testing, you get only symptoms. • You start with some unexpected system behavior. • You then spend a lot of time figuring out what caused those symptoms, i.e. debugging.

  29. Code Reviews are More Efficient than Testing (2) • A debugger's principal advantage is that it helps you to step through the program logic and check the important parameter values. • This process is effective only if you know what the parameter values are supposed to be.

  30. Review Principles • Establish defined review goals. • Follow a defined review process. • Measure and improve your review process.

  31. Establish Review Goals • To find and fix all defects before the first compile or test • When engineers start doing reviews, they find only about 1/3 to 1/2 of defects. • By analyzing review data and making appropriate changes to the review process, this rate can be improved significantly. • With care and practice, some find they can reach 80%, or even 95%.

  32. Follow a Defined Review Process • Code review script • Code review guideline and checklist (C++)

  33. Measure and Improve Your Review Process • To improve review quality • Generally, a high-quality review is one that finds the greatest number of defects in the least amount of time. • Need to measure the time spent on review, and track all defects found in review and later

  34. Review Measures

  35. Who finds more defects?

  36. Review Measures Four explicit review measures • The size of the program being reviewed • LOC • If no code is available, use text lines or pages of design, or estimated LOC. • The review time in minutes • The number of defects found • The number of defects in the program that were found later • these are called escapes

  37. Derived Review Measures Several measures can be derived from the basic measures. • The review yield • the percentage of defects in the program that were found during the review • The defect found per KLOC of design or code reviewed • The defects found per hour of review time • The LOC reviewed per hour • Defect removal leverage (DRL), or the relative rate of defect removal for any two process phases
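The derived measures follow directly from the four explicit measures. A sketch with assumed numbers (all the values, including the unit-test removal rate used for DRL, are illustrative):

```python
# The four explicit review measures (assumed values for illustration).
loc = 250            # size of the program reviewed, in LOC
review_minutes = 90  # review time in minutes
found = 8            # defects found during the review
escapes = 2          # defects in the program found later ("escapes")

# Derived measures.
yield_pct = 100 * found / (found + escapes)   # review yield, %
defects_per_kloc = 1000 * found / loc         # defects found per KLOC
defects_per_hour = found / (review_minutes / 60)
loc_per_hour = loc / (review_minutes / 60)    # review rate

# DRL: relative defect-removal rate of the review vs. another phase,
# here a unit-test phase with an assumed removal rate.
unit_test_defects_per_hour = 2.0
drl = defects_per_hour / unit_test_defects_per_hour

print(f"yield = {yield_pct:.0f}%, {defects_per_kloc:.0f} defects/KLOC, "
      f"{defects_per_hour:.1f} defects/hr, DRL(review/UT) = {drl:.1f}")
```

Note that the yield can only be computed after the escapes are known, which is why the next slide treats early yield figures as an approximation.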

  38. Review Yield • Yield • Cannot be precisely calculated until the reviewed program has been extensively tested and used • Early approximations are still useful, and give an upper bound • A high yield is good; a low yield is poor.

  39. Result of review • Review summary • who, what, when, and the conclusion • Issues list • Can result in more detailed reports • Give priority to issues • There can be disagreement on issues • Most issues are about the product, but they can also be about process or standards

  40. In real life • Usually there are several meetings until all issues are resolved • Project has a policy that determines what would be reviewed • “Passing review” is a measure of progress • Reviews improve checklists, not just the product under review

  41. Summary • Reviews are a key quality control technique • Use reviews to improve your process, not just your product
