
Software Engineering “Software Quality”



  1. Software Engineering “Software Quality” Leiden Institute of Advanced Computer Science Lecture Series for BSc. “Computer Science” year 2 (Fall semester 2011)

  2. Agenda • What is software quality? • How can we measure quality? How can we ensure quality? • Process and product quality • Software metrics

  3. What is quality? • Transcendent (“I really like this program”) • User-based (“fitness for use”) • Value-based (balancing time and cost versus profits) • Manufacturing-based (conformance to specs) • Product-based (based on attributes of the software)

  4. Software quality • Absence of defects? • Program does not crash • Computes correct output • We cannot establish the absence of defects, only their presence. • We can count the number of defects we find after X hours of testing

  5. Why is software quality difficult? • Quality is problematic for software systems • There is a tension between customer quality requirements (efficiency, reliability, etc.) and developer quality requirements (maintainability, reusability, etc.); • Some quality requirements are difficult to specify in an unambiguous way; • Software specifications are usually incomplete and often inconsistent.

  6. What is Quality? • Absence of defects? • program does not crash • computes correct output • We cannot establish the absence of defects, only their presence. • We can count the number of defects we find after X hours of testing • Exception: formal methods aim to prove the absence of certain classes of defects

  7. Approaches to quality • Quality of the product versus quality of the process • Check whether (product or process) conforms to certain norms • Improve quality by improving the product or process

  8. Software quality management • Concerned with ensuring that the required level of quality is achieved in a software product. • Involves defining appropriate quality standards and procedures and ensuring that these are followed. • Should aim to develop a ‘quality culture’ where quality is seen as everyone’s responsibility.

  9. Quality management activities • Quality assurance • Establish organisational procedures and standards for quality. • Quality planning • Select applicable procedures and standards for a particular project and modify these as required. • Quality control • Ensure that procedures and standards are followed by the software development team. Quality management should be separate from project management to ensure independence.

  10. Process-based quality • In manufactured goods, there is a straightforward relation between process and product-quality. • This is more complex for software • The relationship between software processes and product quality is very complex and poorly understood. • The application of individual skills and experience is particularly important in software development; • External factors such as the novelty of an application or the need for an accelerated development schedule may impair product quality.

  11. Process-based quality • The quality of a product depends on the quality of the people, process and technology! • [Diagram: Define process → Develop product → Assess product quality → Quality ok? If no, improve process and repeat; if yes, standardize the process.]
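To make the loop in the diagram concrete, here is a minimal Python sketch of the feedback cycle; the functions and numbers are toy stand-ins for real engineering activities, not part of the original slides:

```python
import random

def develop_product(process_maturity: float) -> float:
    """Toy stand-in: product quality fluctuates around the maturity of the process."""
    return process_maturity + random.uniform(-0.1, 0.1)

def quality_loop(target_quality: float = 0.8) -> float:
    process_maturity = 0.5                                     # "define process"
    while True:
        product_quality = develop_product(process_maturity)   # "develop product"
        if product_quality >= target_quality:                 # "assess product quality: quality ok?"
            return process_maturity                           # yes: "standardize process"
        process_maturity += 0.05                              # no: "improve process" and iterate
```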

  12. Quality assurance and standards • Standards are the key to effective quality management. • They may be international, national, organizational or project standards. • Product standards define characteristics that all components should exhibit e.g. a common programming style. • Process standards define how the software process should be enacted.

  13. Product and process standards

  14. ISO 9001 • Model for quality assurance in design, development, production, installation and servicing • Basic premise: • confidence in product conformance can be obtained by adequate demonstration of supplier’s capabilities in processes (design, development, …) • ISO registration by an officially accredited body, re-registration every three years

  15. Basic Quality Definitions • A failure is an unacceptable behaviour exhibited by a system • The frequency of failures measures the reliability • An important design objective is to achieve a very low failure rate and hence high reliability. • A failure can result from a violation of an explicit or implicit requirement • A defect is a flaw in any aspect of the system that contributes, or may potentially contribute, to the occurrence of one or more failures • could be in the requirements, the design and the code • It might take several defects to cause a particular failure • An error is a slip-up or inappropriate decision by a software developer that leads to the introduction of a defect

  16. More Basic Quality Definitions

  17. Software Quality Hazards

  18. Relation & Examples • A programmer deletes an important line of code. (error) • On 1st Jan 2008 the system reports the date as 1st Jan 1908. (failure) • No design documentation is present for a complex algorithm. (defect)

  19. Measuring Defects • Defect Density: Standard quality measure • Number of defects per KLOC or FP • Defect Arrival Rates/Fix Rates: Standard Process and Progress Measurements • Defects detected/fixed per unit of time (or effort) • Removed Defects: Defects which are identified and then taken out of the product • Due to some defect removal activity, such as code reviews
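A minimal Python sketch of the two standard measures named above, defect density and a defect arrival rate; the sample figures are invented for illustration:

```python
def defect_density(defects_found: int, size_kloc: float) -> float:
    """Defects per thousand lines of code (KLOC); use function points analogously."""
    return defects_found / size_kloc

def arrival_rate(defects_per_week: list[int]) -> float:
    """Average number of defects detected per unit of time (here: per week)."""
    return sum(defects_per_week) / len(defects_per_week)

print(defect_density(defects_found=46, size_kloc=12.5))  # 3.68 defects/KLOC
print(arrival_rate([9, 14, 11, 7, 5]))                   # 9.2 defects per week
```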

  20. Measuring the effectiveness of defect detection • Develop program • Inject (or label) defects • Detect defects (testing, reviewing) • Count how many of the injected defects you found • Count how many non-injected defects you found • Estimate: number of remaining defects ≈ number of non-injected defects found × (% of injected defects not found) / (% of injected defects found), as sketched below
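A minimal Python sketch of that estimate; the function name and the sample counts are illustrative assumptions, not from the slides:

```python
def estimate_remaining_defects(injected_total: int,
                               injected_found: int,
                               real_found: int) -> float:
    """Seeded-defect (capture-recapture style) estimate.

    Assumes the detection activity finds injected and non-injected defects
    with roughly the same probability, so:
        estimated total real defects ≈ real_found * injected_total / injected_found
        estimated remaining defects  ≈ estimated total - real_found
    """
    if injected_found == 0:
        raise ValueError("no injected defects found; the estimate is undefined")
    estimated_total = real_found * injected_total / injected_found
    return estimated_total - real_found

# Example: 20 defects injected, 16 of them found, 24 non-injected defects found.
print(estimate_remaining_defects(20, 16, 24))  # 6.0 defects estimated to remain
```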

  21. Measure quality for continuous improvement • Measures regarding • the quality of a software product, • the quality of the process • The number of defects found when inspecting a product. • The number of failures found when testing a product. • The number of failures encountered by users. • The number of questions posed by users to the help desk. • As a measure of usability and the quality of documentation. • Root cause analysis • Determine the source of problems & Improve

  22. Post-mortem analysis • Looking back at a project after it is complete, or after a release, • You look at the development process • Identify those aspects which, with benefit of hindsight, you could have done better • You make plans to do better next time

  23. Predicted Software Failure Arrival Rates • [Chart: predicted software failure arrivals (ticks at 25 and 50 failures) vs. time for System B, over a horizon of roughly 6 months.]

  24. Defects Detected tends to be similar to Staffing Curves – time lag for error detection • [Chart: people and defects detected vs. time; the defect curve lags the staffing curve.] • Source: Industrial Strength Software, Putnam & Myers, IEEE, 1997

  25. Which is related to Code Production Rate • And all tend to follow Rayleigh curves • [Chart: people, defects and code production rate vs. time, with the test period marked.] • Note: the period during test is similar to an exponential curve • Source: Putnam & Myers

  26. Software Reliability as a function of time in testing & #defects found • Software reliability growth models: exponential and S-shaped curves • [Chart: reliability Rel(t) growing from Rel(t0) at T = 0 to Rel(t1) at T = 1.] • Final word not said: software is not continuous
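As a sketch of the two curve shapes named on the slide, two standard reliability growth models with those shapes are the exponential (Goel-Okumoto) and the delayed S-shaped model; the parameter values below are illustrative assumptions:

```python
import math

def mu_exponential(t: float, a: float, b: float) -> float:
    """Goel-Okumoto model: expected cumulative number of defects found by time t."""
    return a * (1.0 - math.exp(-b * t))

def mu_s_shaped(t: float, a: float, b: float) -> float:
    """Delayed S-shaped model: slow start, rapid growth, then saturation."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

a, b = 100.0, 0.3   # a = total expected defects, b = detection rate (made-up values)
for t in (0.0, 5.0, 10.0, 20.0):
    print(t, round(mu_exponential(t, a, b), 1), round(mu_s_shaped(t, a, b), 1))
```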

  27. What is Quality? • The System satisfies the stated requirements • what about missing and incorrect requirements?

  28. ISO 9126 Quality Model

  29. McCall's Quality Factors and Criteria

  30. Boehm’s Quality tree

  31. Software Quality Attributes http://satc.gsfc.nasa.gov/support/STC_APR96/qualtiy/stc_qual.html

  32. Some More Examples of “*ilities” Accessibility, Administrability, Understandability, Generality, Operability, Simplicity, Mobility, Nomadicity, Portability, Accuracy, Efficiency, Footprint, Responsiveness, Scalability, Schedulability, Timeliness, CPU utilization, Latency, Throughput, Concurrency, Flexibility, Changeability, Evolvability, Extensibility, Modifiability, Tailorability, Upgradeability, Expandability, Consistency, Adaptability, Composability, Interoperability, Openness, Integrability, Accountability, Completeness, Conciseness, Correctness, Testability, Traceability, Coherence, Analyzability, Modularity, Reusability, Configurability, Distributeability, Availability, Confidentiality, Integrity, Maintainability, Reliability, Safety, Security, Affordability, Serviceability, …

  33. If you cannot measure it, then it is not science! In physical science the first essential step in the direction of learning any subject is to find principles of numerical reckoning and practicable methods for measuring some quality connected with it. I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; But when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the state of Science, whatever the matter may be. — Sir William Thomson, Lord Kelvin (1824-1907) From 'Electrical Units of Measurement', a lecture delivered at the Institution of Civil Engineers, London (3 May 1883), Popular Lectures and Addresses (1889), Vol. 1, 73. Quoted in American Association for the Advancement of Science, Science (Jan-Jun 1892), 19, 127.

  34. If you cannot measure it, then it is not science? “Not everything that is important can be measured, and not everything that can be measured is important.” Albert Einstein

  35. Why measure? • Gilb’s principle of fuzzy targets: Projects without clear goals will not achieve goals clearly Tom Gilb (1940), an American systems engineer, consultant, and author, known for the development of software metrics, software inspection, and evolutionary processes. • You can neither predict nor control what you cannot measure Tom DeMarco (1940), an American software engineer, author, teacher and speaker on software engineering topics. Best known as one of the developers of Structured analysis in the 1980s.

  36. Why measure? • Controlling • Understanding • Comparing • Predicting • Metrics are • objective • (often) automatically collectable • Why measure software? • Determine the quality of the current product or process • Predict qualities of a product/process • Improve quality of a product/process

  37. Representation Condition • A measure M is valid if it satisfies the representation condition, i.e. if A>B in the real world, then M(A)>M(B) • E.g. if we measure complexity as the number of if-statements, then: • Two programs with the same number of if-statements are equally complex • If program A has more if-statements than program B, then A is more complex than B • Example: height as a measure, M(Jan) = 200 cm > M(Joep) = 150 cm, and Jan is indeed taller than Joep
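A minimal Python sketch of the if-statement count used above as a complexity measure, which makes it easy to check what the representation condition demands; the two example programs are invented:

```python
import ast

def count_if_statements(source: str) -> int:
    """A naive complexity measure: the number of if-statements in the code."""
    tree = ast.parse(source)
    return sum(isinstance(node, ast.If) for node in ast.walk(tree))

program_a = """
def classify(x):
    if x < 0:
        return "negative"
    if x == 0:
        return "zero"
    return "positive"
"""

program_b = """
def double(x):
    return 2 * x
"""

# The representation condition requires: if A is really more complex than B,
# then the measure must agree, i.e. M(A) > M(B).
print(count_if_statements(program_a))  # 2
print(count_if_statements(program_b))  # 0
```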

  38. Motivation for Metrics • Estimate the cost & schedule of future projects (bidding) • Evaluate the productivity impacts of new tools and techniques • Establish productivity trends over time • Monitor/Improve software quality • Forecast future staffing needs • Anticipate and reduce future maintenance needs

  39. Software measurement and metrics • Software measurement is concerned with deriving a numeric value for an attribute of a software product or process. • This allows for • objective comparisons between techniques and processes • predictions • management control • Most organisations don’t make systematic use of software measurement.

  40. Goal-Question-Metric-Approach • Victor Basili • Approach to select metrics • Avoids “let’s collect a lot of data and decide afterwards what we do with the values” (fishing for results) • Approach • Express goals • Generate questions to meet goals • Analyze questions and define metrics • Check whether metrics can be collected

  41. GQM: Example • Goal: Evaluate effectiveness of the coding standard • Questions: What is code quality? Who is using the standard? What is coder productivity? • Metrics: Proportion of coders using the standard / using the language; Experience of coders with standards, with the language, with the environment, …; Code size (LOC, Function Points, …); Errors, Effort, …
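One way to keep a GQM plan explicit is to record it as data, so the goal, questions and metrics stay linked; a minimal sketch, where the field names and the exact metric wording are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class GQMPlan:
    goal: str
    questions: dict[str, list[str]] = field(default_factory=dict)  # question -> metrics

plan = GQMPlan(
    goal="Evaluate effectiveness of the coding standard",
    questions={
        "Who is using the standard?": [
            "proportion of coders using the standard",
            "experience of coders with the standard, language and environment",
        ],
        "What is code quality?": ["errors found", "effort spent on fixes"],
        "What is coder productivity?": ["code size (LOC, function points) per unit effort"],
    },
)

# Last GQM step: check whether the metrics can actually be collected;
# at minimum, every question should have at least one candidate metric.
assert all(plan.questions.values())
```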

  42. Literature on Software Metrics • Metrics and Models in Software Quality Engineering (2nd edition), Stephen H. Kan, Addison Wesley, 2002 • Software Metrics: A Rigorous & Practical Approach, Norman E. Fenton & Shari Lawrence Pfleeger, 2nd ed., International Thomson Computer Press, 1997

  43. Measurement, Metrics, Indicators (Dutch: “maat” = measure, “meten” = to measure, “meting” = measurement) • Measure: A quantitative indication of the extent, amount, dimension, capacity or size of some attribute of a product or process. • A single data point (e.g. number of defects from a single review) • Measurement: The act of determining a measure • Metric: A measure of the degree to which a system, component or process possesses a given attribute. • Metrics relate measures (e.g. average number of defects found in reviews) • Relate data points to each other • Indicator: A metric or series of metrics that provide insight into a process, project or product.
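A minimal sketch of the distinction, using the review example from the slide; the numbers are invented:

```python
# Measures: single data points, here the defect count from each individual review.
defects_per_review = [4, 7, 2, 5]

# Metric: relates the data points, e.g. the average number of defects found per review.
average_defects_per_review = sum(defects_per_review) / len(defects_per_review)

# Indicator: the metric interpreted against a context, e.g. a target value.
target = 3.0
print(average_defects_per_review)            # 4.5
print(average_defects_per_review <= target)  # False: reviews find more defects than targeted
```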

  44. Levels of Metrics • Project metrics • Process metrics • Product metrics

  45. Project Metrics • Effort/time per SE task • Defects detected per review hour • Scheduled vs. actual milestone dates • Changes (number) and their characteristics • Distribution of effort on SE tasks

  46. Product Metrics • focus on the quality of deliverables • measures of analysis model • complexity of the design • internal algorithmic complexity • architectural complexity • data flow complexity • code measures (e.g., Halstead) • measures of process effectiveness • e.g., defect removal efficiency
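As an example of the last bullet, defect removal efficiency (DRE) relates defects removed before release to the total found; a minimal sketch with invented counts:

```python
def defect_removal_efficiency(found_before_release: int, found_after_release: int) -> float:
    """DRE = defects removed before release / (defects found before + after release)."""
    total = found_before_release + found_after_release
    return found_before_release / total if total else 1.0

# Example: 90 defects removed during development, 10 reported by users after release.
print(defect_removal_efficiency(90, 10))  # 0.9, i.e. 90% removal efficiency
```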

  47. Quality Metrics What: Testability, extensibility, maintainability, error-proneness, … How: • Coupling, Cohesion • Complexity • Inheritance metrics Novel: • Measure Earlier: Design rather than code • Completeness & Consistency • Combining different views: Structure & Behaviour

  48. Measurement scales • Nominal scale • Just categories, no ordering, no magnitude • Example: specification fault, design fault,… • Ordinal scale • Ordered w.r.t. an attribute, ranking only • Example: preference, defect complexity (serious, moderate, simple) • Interval scale • Preserves order and differences, no ratios • Example: temperature in °C, calendar dates • Ratio scale • Order, difference and ratio; has a zero element • Example: development time, lines of code
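The scale matters because it determines which statistics are meaningful; a minimal sketch of that idea, where the table of permissible statistics follows standard measurement theory and the code structure is an assumption:

```python
# Which summary statistics are meaningful at each measurement scale.
MEANINGFUL_STATS = {
    "nominal":  {"mode", "frequency counts"},
    "ordinal":  {"mode", "frequency counts", "median", "percentiles"},
    "interval": {"mode", "frequency counts", "median", "percentiles", "mean", "differences"},
    "ratio":    {"mode", "frequency counts", "median", "percentiles", "mean", "differences", "ratios"},
}

def allows(scale: str, statistic: str) -> bool:
    return statistic in MEANINGFUL_STATS[scale]

print(allows("ordinal", "mean"))  # False: averaging ranks is not meaningful
print(allows("ratio", "ratios"))  # True: e.g. "module A has twice the LOC of module B"
```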

  49. Example: Before Refactoring

  50. Example: After Refactoring
