
Software Engineering



  1. Software Engineering Natallia Kokash email: nkokash@liacs.nl N. Kokash, Software Engineering

  2. Agenda Software quality Process and product quality Software metrics

  3. What is quality? Transcendent (“I really like this program”) User-based (“fitness for use”) Value-based (balancing time and cost vs. profits) Manufacturing-based (conformance to specs) Product-based (based on attributes of the software)

  4. Software quality • Absence of defects? • Program does not crash • Computes correct output • We cannot establish the absence of defects, only their presence. • We can count the number of defects we find after X hours of testing

  5. Why is software quality difficult? • Defining and agreeing on quality is problematic for software systems: • There is a tension between customer quality requirements (efficiency, reliability, etc.) and developer quality requirements (maintainability, reusability, etc.); • Some quality requirements are difficult to specify in an unambiguous way; • Software specifications are usually incomplete and often inconsistent.

  6. Approaches to quality • Quality of the product versus quality of the process • Check whether the product or process conforms to certain norms • Improve quality by improving the product or process

  7. Software quality management • Concerned with ensuring that the required level of quality is achieved in a software product. • Involves defining appropriate quality standards and procedures and ensuring that these are followed. • Should aim to develop a ‘quality culture’ where quality is seen as everyone’s responsibility.

  8. Quality management activities • Quality assurance • Establish organisational procedures and standards for quality. • Quality planning • Select applicable procedures and standards for a particular project and modify these as required. • Quality control • Ensure that procedures and standards are followed by the software development team. • Quality management should be separate from project management to ensure independence.

  9. Process-based quality • In manufactured goods, there is a straightforward relation between process and product quality. • This is more complex for software: • The relationship between software processes and product quality is very complex and poorly understood; • The application of individual skills and experience is particularly important in software development; • External factors such as the novelty of an application or the need for an accelerated development schedule may impair product quality.

  10. Process-based quality The quality of a product depends on the quality of the people, the process and the technology! Improvement loop: define process → develop product → assess product quality → quality OK? If yes, standardize the process; if no, improve the process and repeat.

  11. Quality assurance and standards • Standards are the key to effective quality management. • They may be international, national, organizational or project standards. • Product standards define characteristics that all components should exhibit, e.g. a common programming style. • Process standards define how the software process should be enacted.

  12. Product and process standards

  13. ISO 9001 • Model for quality assurance in design, development, production, installation and servicing • Basic premise: • confidence in product conformance can be obtained by adequate demonstration of supplier’s capabilities in processes (design, development, …) • ISO registration by an officially accredited body, re-registration every three years

  14. Software Process Improvement (SPI) (people, process, technology) Premise: “The quality of a product is largely determined by the quality of the process that is used to develop and maintain it.” • Approach to SPI • Formulate hypotheses • Carefully select metrics • Collect data • Interpret data • Initiate improvement actions • Iterate

  15. Capability Maturity Model (CMM) • Initial level • software development is ad-hoc • Repeatable level • basic processes are in place • Defined level • there are standard processes • Quantitatively managed level • data is gathered and analyzed routinely • Optimizing level • stable base, data is gathered to improve the process

  16. Initial → repeatable level • Requirements management • Project planning • Project monitoring and control • Supplier agreement management • Measurement and analysis • Process and product quality assurance • Configuration management

  17. Repeatable → defined level • Requirements development • Technical solution • Product integration • Verification • Validation • Organization process focus • Organization process definition • Organizational training • Integrated project management • Risk management • Decision analysis and resolution

  18. CMM: critical notes • Most appropriate for big companies • Pure CMM approach may stifle creativity • Focus mostly on activities and supporting artifacts associated with a conventional waterfall process • Crude 5-point scale (now: CMMI)

  19. Capability Maturity Model Integration (CMMI) http://www.cdainfo.com/down/1-Desarrollo/CMM2.pdf

  20. CMMI: process areas by maturity level
  • Level 5 (Optimizing), focus: continuous process improvement. Process areas: Organizational Innovation and Deployment; Causal Analysis and Resolution
  • Level 4 (Quantitatively Managed), focus: quantitative management. Process areas: Organizational Process Performance; Quantitative Project Management
  • Level 3 (Defined), focus: process standardization. Process areas: Requirements Development; Technical Solution; Product Integration; Verification; Validation; Organizational Process Focus; Organizational Process Definition; Organizational Training; Integrated Project Management; Integrated Supplier Management; Risk Management; Decision Analysis and Resolution; Organizational Environment for Integration; Integrated Teaming; Integrated Product and Process Development (IPPD)
  • Level 2 (Managed), focus: basic project management. Process areas: Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Measurement and Analysis; Process and Product Quality Assurance; Configuration Management
  • Level 1 (Initial): no assigned process areas

  21. Documentation standards • Particularly important: documents are the tangible manifestation of the software. • Documentation process standards • Concerned with how documents should be developed, validated and maintained. • Document standards • Concerned with document contents, structure, and appearance. • Document interchange standards • Concerned with the compatibility of electronic documents.

  22. Problems with standards / QA • They may not be seen as relevant and up-to-date by software engineers. • They often involve too much bureaucratic form filling. • If they are unsupported by software tools, tedious manual work is often involved to maintain the documentation associated with the standards.

  23. Quality models • Decomposition of characteristics • Bottom level: metrics • Differences in • Relations between characteristics • Vocabulary • Examples: Boehm, McCall, ISO 9126, Dromey, … (figure: Boehm’s quality model)

  24. ISO 9126

  25. McCall’s quality factors and criteria

  26. Boehm’s quality tree

  27. Data collection • A metrics programme should be based on a set of product and process data. • Data should be collected immediately (not after a project has finished) and, if possible, automatically. • Don’t rely on memory • Don’t collect unnecessary data • The questions to be answered should be decided in advance and the required data identified. • Tell people why the data is being collected. • It should not be part of personnel evaluation.

  28. Data collection • Closed loop principle: result of data analysis must be useful to supplier of data • Do not use data collected for other purposes • Focus on continuous improvement • Only collect data you really need

  29. GQM-Approach • Goal – Question – Metric (Basili) • Approach to select metrics • Avoids “let’s collect a lot of data and decide afterwards what we do with the values” (fishing for results) • Approach • Classify the entities • Express goals of organization • Generate questions to meet goals • Analyze questions and define metrics • Finally, check whether metrics can be collected

  30. GQM: Example • Goal: evaluate the effectiveness of a coding standard • Questions: What is code quality? Who is using the standard? What is coder productivity? • Metrics: proportion of coders using the standard / using the language; experience of coders with the standard, with the language, with the environment, …; code size (LOC, function points, …); errors; effort; …
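  A goal–question–metric decomposition like the one in this example can be represented as a simple tree. The sketch below (hypothetical names, not part of any GQM tooling) derives the flat list of metrics to collect from such a tree:

  ```python
  # Minimal GQM sketch: one goal maps to questions, each question to metrics.
  # All names and metric lists are illustrative.
  gqm = {
      "goal": "Evaluate effectiveness of coding standard",
      "questions": {
          "What is code quality?": ["errors", "code size (LOC)"],
          "Who is using the standard?": ["proportion of coders using standard"],
          "What is coder productivity?": ["effort", "function points per month"],
      },
  }

  def metrics_to_collect(tree: dict) -> list[str]:
      """Flatten the GQM tree into the set of metrics the programme must collect."""
      return sorted({m for ms in tree["questions"].values() for m in ms})

  print(metrics_to_collect(gqm))
  ```

  Working top-down from the goal, rather than bottom-up from available data, is exactly what keeps the metric list short and avoids "fishing for results".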

  31. Cost of quality • Costs due to lack of quality: internal costs of failure; external costs of failure • Costs of achieving quality: appraisal costs; prevention costs

  32. Cost of quality

  33. Cost of repair • The cost of defect repair increases exponentially across phases (requirements, architecting, design, implementation, maintenance). • As a project progresses, more and more work depends on earlier decisions. • Defects should be eliminated as soon as possible after introduction.
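  A back-of-the-envelope model of this growth might look like the sketch below. The multipliers are purely illustrative (the slide gives no numbers, and published estimates vary widely):

  ```python
  # Illustrative relative repair-cost multipliers per phase.
  # Real multipliers differ between studies, domains and projects.
  REPAIR_COST_FACTOR = {
      "requirements": 1,
      "architecting": 3,
      "design": 10,
      "implementation": 30,
      "maintenance": 100,
  }

  def repair_cost(base_cost: float, phase_found: str) -> float:
      """Estimated cost to fix a defect, given the phase where it is found."""
      return base_cost * REPAIR_COST_FACTOR[phase_found]

  # A defect that costs 100 to fix during requirements may cost
  # two orders of magnitude more if only discovered in maintenance.
  print(repair_cost(100, "maintenance"))
  ```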

  34. Question • The highest probability of undetected defects is in modules which show… • …HIGHEST # of known defects • …LOWEST # of known defects ?

  35. If you cannot measure it, then it is not science! In physical science the first essential step in the direction of learning any subject is to find principles of numerical reckoning and practicable methods for measuring some quality connected with it. I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the state of Science, whatever the matter may be. — Sir William Thomson, Lord Kelvin (1824-1907) From ‘Electrical Units of Measurement’, a lecture delivered at the Institution of Civil Engineers, London (3 May 1883), Popular Lectures and Addresses (1889), Vol. 1, 73. Quoted in American Association for the Advancement of Science, Science (Jan-Jun 1892), 19, 127.

  36. If you cannot measure it, then it is not science? “Not everything that is important can be measured, and not everything that can be measured is important.” Albert Einstein

  37. Why measure? • Gilb’s principle of fuzzy targets: projects without clear goals will not achieve their goals clearly. Tom Gilb (1940), an American systems engineer, consultant, and author, known for the development of software metrics, software inspection, and evolutionary processes. • You can neither predict nor control what you cannot measure. Tom DeMarco (1940), an American software engineer, author, teacher and speaker on software engineering topics. Best known as one of the developers of structured analysis in the 1980s.

  38. Why measure? • Controlling • Understanding • Comparing • Predicting • Metrics are • objective • (often) automatically collectable • Why measure software? • Determine the quality of the current product or process • Predict qualities of a product/process • Improve quality of a product/process

  39. Representation condition • A measure M is valid if it satisfies the representation condition, i.e. if A > B in the real world, then M(A) > M(B) • Example: height, M(Jan) = 200 cm, M(Joep) = 150 cm • E.g. if we measure complexity as the number of if-statements, then: • Two programs with the same number of if-statements are equally complex • If program A has more if-statements than program B, then A is more complex than B
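  The if-statement measure can be made concrete. The sketch below (illustrative only, using Python's standard ast module) counts if-statements and checks the ordering the representation condition demands:

  ```python
  import ast

  def if_complexity(source: str) -> int:
      """Toy complexity measure from the slide: number of if-statements."""
      return sum(isinstance(node, ast.If) for node in ast.walk(ast.parse(source)))

  program_a = "x = 1\nif x > 0:\n    x += 1\nif x > 1:\n    x += 1"
  program_b = "x = 1\nif x > 0:\n    x += 1"

  # If A is 'more complex' than B in the real world, a valid measure
  # must preserve that ordering: M(A) > M(B).
  assert if_complexity(program_a) > if_complexity(program_b)
  ```

  Whether counting if-statements really captures "complexity" is of course the open question; the representation condition only tells us a measure that reverses the real-world ordering is invalid.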

  40. Motivation for metrics • Estimate the cost & schedule of future projects (bidding) • Evaluate the productivity impacts of new tools and techniques • Establish productivity trends over time • Monitor/improve software quality • Forecast future staffing needs • Anticipate and reduce future maintenance needs

  41. Software measurement and metrics • Software measurement is concerned with deriving a numeric value for an attribute of a software product or process. • This allows for • objective comparisons between techniques and processes • predictions • management control • Most organisations don’t make systematic use of software measurement.

  42. Measurement, Metrics, Indicators • Measure: a quantitative indication of the extent, amount, dimension, capacity or size of some attribute of a product or process. • A single data point (e.g. number of defects from a single review) • Measurement: The act of determining a measure • Metric: A measure of the degree to which a system, component or process possesses a given attribute. • Metrics relate measures (e.g. average number of defects found in reviews) • Relate data points to each other • Indicator: A metric or series of metrics that provide insight into a process, project or product.
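  The distinction fits in a few lines. In the sketch below (invented numbers, purely illustrative), each review's defect count is a measure, and the average is a metric relating those measures:

  ```python
  # Measures: single data points, e.g. defects found in each review.
  defects_per_review = [4, 7, 2, 5]  # invented numbers for illustration

  # Metric: relates measures to each other, e.g. the average
  # number of defects found per review.
  average_defects = sum(defects_per_review) / len(defects_per_review)
  print(average_defects)
  ```

  An indicator would then be this metric tracked over time, e.g. to judge whether review effectiveness is changing.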

  43. Has productivity improved over time? • 1000 systems completed between 1996 and 2011. • PI = Productivity Index • FP = Function Points • PM = Person Month • http://www.qsm.com/blog/2011/has-software-productivity-declined-over-time

  44. Productivity measures? The QSM (Quantitative Software Management, Inc.) methodology is based upon the use of a productivity parameter called the Productivity Index (PI). A PI is calculated using an empirical formula developed by Larry Putnam, Sr. in the 1970s. This index includes in its calculation the following measures related to a software project: size, duration, and effort. PI = size/(effort*duration)
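  The slide's simplified formula can be coded directly. Note this is only the slide's summary: QSM's actual PI is derived from Putnam's calibrated, nonlinear software equation, not this linear form.

  ```python
  def productivity_index(size: float, effort: float, duration: float) -> float:
      """Simplified PI per the slide: PI = size / (effort * duration).
      size: e.g. function points; effort: person-months; duration: months.
      (The real QSM PI comes from a calibrated empirical model.)"""
      return size / (effort * duration)

  # A hypothetical project: 1000 FP, 50 person-months, 10 months.
  print(productivity_index(1000, 50, 10))
  ```

  Even in this simplified form, the formula shows why comparing PIs across projects requires consistent units for size, effort and duration.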

  45. Has productivity improved over time?

  46. Literature on software metrics • Metrics and Models in Software Quality Engineering (2nd edition), Stephen H. Kan, Addison Wesley, 2002 • Software Metrics: A Rigorous & Practical Approach (2nd edition), Norman E. Fenton & Shari Lawrence Pfleeger, International Thomson Computer Press, 1997

  47. Measurement scales • Nominal scale • Just categories, no ordering, no magnitude • Example: specification fault, design fault,… • Ordinal scale • Ordered w.r.t. an attribute, ranking only • Example: preference, defect complexity (serious, moderate, simple) • Interval scale • Preserves order and differences, no ratios • Example: grades A, B, C… • Ratio scale • Order, difference and ratio; has a zero element • Example: development time, lines of code Measurement scales restrict the kind of allowed analysis
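  Which statistics are meaningful depends on the scale. The sketch below (invented data) contrasts the ordinal and ratio scales from the slide:

  ```python
  from statistics import median

  # Ordinal scale: defect complexity ranks (simple=1, moderate=2, serious=3).
  # Order is meaningful, so the median is a valid summary; the mean is not,
  # because rank differences are not guaranteed to be equal intervals.
  severities = [3, 1, 2, 2, 3]
  print(median(severities))

  # Ratio scale: lines of code has a true zero, so ratios are meaningful.
  loc_a, loc_b = 1200, 600
  print(loc_a / loc_b)  # "A is twice as large as B" is a valid claim
  ```

  The same check fails for interval scales: 20°C is not "twice as warm" as 10°C, because the Celsius zero is arbitrary.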

  48. Scales may not be unique

  49. Software metric • Any type of measurement which relates to a software system, process or related documentation • Lines of code in a program • Number of methods per class • Number of requirements • Number of components in a system • Number of person-days required to develop a component
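  Even "lines of code" needs a convention before it can be collected. The sketch below implements one common choice (counting non-blank, non-comment lines); other conventions count physical lines or statements:

  ```python
  def count_loc(source: str) -> int:
      """Count non-blank, non-comment lines: one common LOC convention.
      (Comment detection here is simplistic: '#'-prefixed lines only.)"""
      return sum(
          1
          for line in source.splitlines()
          if line.strip() and not line.strip().startswith("#")
      )

  sample = "# a comment\n\nx = 1\ny = x + 1\n"
  print(count_loc(sample))
  ```

  Two tools using different conventions can disagree substantially on the same codebase, which is why the convention must be fixed before comparing LOC-based metrics.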

  50. Applications of metrics • Bidding • feasibility • cost prediction • Tracking progress • Identifying fault-prone elements • Focus on QA-activities • Assessing the quality of a system • “Bad smells” → refactoring
