SENG 530: Software Verification and Validation V&V Processes and Techniques Prof. Bojan Cukic Lane Department of Computer Science and Electrical Engineering West Virginia University
Overview • Software Inspections: 02/14/2002. • Software Metrics: Today. • Software Reliability Engineering: 02/28/2002.
Agenda • Software Engineering Measurements. • Measurement Theory. • A Goal-Based Framework for Software Measurement. • Verification and Validation Metrics.
Measure? Why? • Developer's angle. • Completeness of requirements, quality of design, testing readiness. • Manager's angle. • Delivery readiness, budget and scheduling issues. • Customer's angle. • Compliance with requirements, quality. • Maintainer's angle. • Planning for upgrades and improvements.
Measurement • A process by which numbers (symbols) are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules. • The measurement process is difficult to define. • Measuring colors or intelligence is difficult. • Measurement accuracy, margins of error. • Measurement units, scales. • Drawing conclusions from measurements is difficult.
Measurement (2) • "What is not measurable make measurable" [Galileo, 1564-1642]. • Increased visibility, understanding, control. • Measurement: • Direct quantification of an attribute. • Calculation: • Indirect; a combination of measurements used to understand some attribute. • (e.g., the overall score in the decathlon).
Measurement in Software Engineering • Applicable to managing, costing, planning, modeling, analyzing, specifying, designing, implementing, verifying, validating, and maintaining. • Engineering implies understanding and control. • Computer science provides the theoretical foundations for building software; software engineering focuses on a controlled and scientifically sound implementation process.
Measurement in Software Engineering • Often considered somewhat of a luxury?! • Weakly defined targets: "The product will be user-friendly, reliable, maintainable." • Gilb's Principle of Fuzzy Targets: "Projects without clear goals will not achieve their goals clearly." • Estimation of costs. • Cost of design, cost of testing, cost of coding… • Predicting product quality. • Considering technology impacts.
Software Measurement Objectives • "You cannot control what you cannot measure." [DeMarco, 1982] • Manager's angle. • Cost: Measure the time and effort of various processes (elicitation, design, coding, test). • Staff productivity: Measure staff time and the size of artifacts; use these to predict the impact of a change. • Product quality: Record faults, failures, and changes as they occur; cross-compare different projects. • User satisfaction: Response time, functionality. • Potential for improvement.
Software Measurement Objectives (2) • Engineer's angle. • Are requirements testable? • Instead of requiring "reliable operation", state the expected mean time to failure. • Have all faults been found? • Use models of expected detection rates. • Meeting product or process goals. • Fewer than 20 failures per beta-test site per month. • No module contains more than x lines (standards). • What does the future hold? • Predict product size from specification size.
The scope of software metrics • Cost and effort estimation. • COCOMO, the function points model, etc. • Effort is a function of size (LOC, function points), developer capability, level of reuse, etc. • Productivity models and measures. • Simplistic approach: size/effort. • Can be quite misleading, even dangerous.
The Scope of Metrics (2) • Data collection. • Easier said than done. • Must be planned and executed carefully. • Use simple graphs and charts to present collected data. • Good experiments, surveys, and case studies are essential.
The Scope of Metrics (3) • Quality models and measurements. • Quality and productivity models are usually combined. • Advanced COCOMO (COCOMO II), McCall's model. • Usually constructed in a tree-like fashion. • Indirect factors appear at the higher levels; directly measurable factors at the lower levels.
The Scope of Metrics (4) • Reliability models. • Performance evaluation and modeling. • Structural and complexity metrics. • Readily available structural properties of code (or design) serve as surrogates for quality assessment, control, and prediction. • Management by metrics. • Many companies define standard tracking and project monitoring/reporting systems. • Capability maturity assessment.
Agenda • Software Engineering Measurements. • Measurement Theory. • A Goal-Based Framework for Software Measurement. • Verification and Validation Metrics.
The basics of measurement • Some pervasive measurement techniques are taken for granted. • A rising column of mercury for temperature measurement was not so obvious 150 years ago. • But we developed a measurement framework for temperature. • How well do we understand the software attributes we want to measure? • What is program "complexity", for example?
The basics of measurement (2) • Are we really measuring the attribute we want to measure? • Is the number of "bugs" found in system testing a measure of quality? • What statements can be made about an attribute? • Can we "double design quality"? • What operations can be applied to measurements? • What is the "average productivity" of the group? • What is the "average quality" of software modules? (See the sketch below.)
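A minimal sketch (not from the slides, with invented data) of why "average" depends on the measurement scale: the mean is meaningful for ratio-scale data such as LOC, but for ordinal data such as subjective quality ratings only the median is defensible.

```python
from statistics import mean, median

loc_per_module = [120, 340, 85, 410, 200]  # ratio scale: mean is a valid operation
quality_ratings = [1, 2, 2, 3, 5]          # ordinal scale: 1 = poor .. 5 = excellent

print("Mean module size (valid):   ", mean(loc_per_module))    # 231
print("Median quality (valid):     ", median(quality_ratings)) # 2
# The mean of ordinal ranks is questionable: the "distance" between
# rank 1 and 2 need not equal the distance between rank 4 and 5.
print("Mean quality (questionable):", mean(quality_ratings))   # 2.6
```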
Empirical relations • "Taller than" is an empirical relation. • Binary relation (x is taller than y). • Unary relation (x is tall). • Empirical relations need to be mapped from the real world into mathematics. • In this mapping, the real world is the domain and the mathematical world is the range. • The range can be the set of integers, real numbers, or even non-numeric symbols.
Representation condition • The mapping should preserve real-world relations. • A is taller than B iff M(A) > M(B). • The binary empirical relation "taller than" is replaced by the numerical relation >. • So, "x is much taller than y" may mean M(x) > M(y) + 15, as in the sketch below.
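A small illustrative sketch of the representation condition, using hypothetical heights: the mapping M (height in cm) represents the empirical relation "taller than" because A is taller than B iff M(A) > M(B).

```python
heights_cm = {"Ana": 181, "Ben": 165, "Cai": 172}  # M: people -> real numbers

def taller_than(a: str, b: str) -> bool:
    """Numerical relation mirroring the empirical 'taller than'."""
    return heights_cm[a] > heights_cm[b]

def much_taller_than(a: str, b: str) -> bool:
    """One possible reading of 'much taller': M(a) > M(b) + 15."""
    return heights_cm[a] > heights_cm[b] + 15

print(taller_than("Ana", "Ben"))       # True
print(much_taller_than("Ana", "Cai"))  # False: 181 is not greater than 172 + 15
```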
Agenda • Software Engineering Measurements. • Measurement Theory. • A Goal-Based Framework for Software Measurement. • Verification and Validation Metrics.
The framework • Classifying the entities to be examined • Determining relevant measurement goals • Identifying the level of maturity reached by the organization
Classifying software measures • Processes are collections of software related activities. • Associated with time, schedule. • Products are artifacts, deliverables or documents that result from a process activity. • Resources are entities required by a process activity.
Classifying software measures (2) • For each entity, we distinguish: • Internal attributes • Those that can be measured purely in terms of the product, process, or resource itself. • Size, complexity measures, dependencies. • External attributes • Those that can be measured only in terms of how the product, process, or resource relates to its environment. • Experienced failures, timing and performance.
Process • Measures include: • The duration of the process or one of its activities. • The effort associated with the process or its activities. • The number of incidents of a specific type arising during the process or one of its activities. • For example, average cost of an error = total cost / number of errors found (see the sketch below).
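A hedged worked example of the slide's indirect process measure; the cost and count below are invented.

```python
review_cost_dollars = 12_000   # effort spent on the inspection activity
errors_found = 48              # incidents of the relevant type during the process

# Average cost of an error = total cost / number of errors found.
avg_cost_per_error = review_cost_dollars / errors_found
print(f"Average cost per error: ${avg_cost_per_error:.2f}")  # $250.00
```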
Products • External attributes: • Reliability, maintainability, understandability (of documentation), usability, integrity, efficiency, reusability, portability, interoperability… • Internal attributes • Size, effort, cost, functionality, modularity, syntactic correctness.
Product measurements • Direct measure example: • Entity: Module design document (D1) • Attribute: Size • Measure: No. of bubbles (in flow diagram). • Indirect measure example: • Entity: Module design document (D1, D2,…) • Attribute: Average module size • Measure: Average no. of bubbles (in flow diagram).
Resources • Personnel (individuals or teams), materials (including office supplies), tools, methods. • Resource measurement may show which resource to blame for poor quality. • Cost is measured across all types of resources. • Productivity: • amount_of_output / effort_input (see the sketch below). • Combines a resource measure (input) with a product measure (output).
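A minimal sketch of the productivity measure with hypothetical figures. It combines a product measure (output size) with a resource measure (effort input), and says nothing about the quality of that output, which is why the slides call it potentially misleading.

```python
output_loc = 6_000          # product measure: lines of code delivered
effort_person_months = 10   # resource measure: effort consumed

# Productivity = amount_of_output / effort_input.
productivity = output_loc / effort_person_months
print(f"Productivity: {productivity:.0f} LOC/person-month")  # 600
```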
GQM Paradigm • Goal-Question-Metric • Steps: • List the major goals of the development effort. • Derive from each goal the questions that must be answered to determine whether the goals are being met. • Decide what must be measured to answer the questions adequately. • An illustrative decomposition follows below.
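An illustrative, not prescriptive, GQM decomposition for a single goal; the goal, questions, and metrics below are invented examples of the three steps the paradigm prescribes.

```python
gqm = {
    "goal": "Improve the reliability of release 2.0",
    "questions": {
        "Where do failures cluster?": [
            "failures per module in system test",
            "fault density (faults/KLOC) per module",
        ],
        "Is reliability growing during test?": [
            "inter-failure times",
            "estimated MTTF from a reliability growth model",
        ],
    },
}

# Walk the decomposition: each question is answered by concrete metrics.
for question, metrics in gqm["questions"].items():
    print(question, "->", ", ".join(metrics))
```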
Goal definition templates • Purpose: • To (characterize, evaluate, predict…) the (process, model, metric…) in order to (understand, assess, manage, learn, improve…) • Perspective • Examine the (cost, correctness, defects, changes…) from the viewpoint of (developer, manager, customer…) • Environment • The environment consists of process factors, people factors, methods, tools, etc.
Process improvement • Measurement is useful for understanding, establishing the baseline, assessing and predicting. • But the larger context is improvement. • SEI proposed five maturity levels, ranging from the least to the most predictable and controllable.
Maturity and measurement overview • Initial (ad hoc): baseline metrics. • Repeatable (process depends on individuals): project management metrics. • Defined (process defined and institutionalized): product metrics. • Managed (measured process): process metrics plus feedback. • Optimizing (improvement fed back to the process): process metrics plus feedback for changing the process.
Applying the framework • Cost and effort estimation. • E = a * S^b • E is effort (person-months), S is size (thousands of delivered source statements). • a, b are environment-specific constants (see the sketch after this slide). • Data collection. • Orthogonal defect classification.
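A sketch of the effort equation E = a * S^b, instantiated with the Basic COCOMO "organic mode" constants (a = 2.4, b = 1.05); for any real project the constants must be calibrated to the local environment, as the slide notes.

```python
def effort_person_months(size_kdsi: float, a: float = 2.4, b: float = 1.05) -> float:
    """Effort in person-months for size in thousands of delivered source statements."""
    return a * size_kdsi ** b

# Example: a hypothetical 32 KDSI organic-mode project.
print(f"{effort_person_months(32):.1f} person-months")  # about 91 PM
```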
Applying the framework (2) • Reliability models. • JM model: MTTF_i = a / (N - i + 1) • N: total number of faults; 1/a is the "fault size" (a sketch follows below). • Capability maturity assessment. • The maturity attribute is also viewed as an attribute of the contractor's process.
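A minimal sketch of the Jelinski-Moranda MTTF formula from the slide, MTTF_i = a / (N - i + 1), with invented values; it shows the expected time between failures growing as faults are removed.

```python
def jm_mttf(i: int, n_faults: int, a: float) -> float:
    """Expected time to the i-th failure, after i-1 faults have been removed."""
    return a / (n_faults - i + 1)

N, a = 25, 10.0   # hypothetical: 25 initial faults, fault size 1/a = 0.1
for i in (1, 10, 25):
    print(f"MTTF before failure {i}: {jm_mttf(i, N, a):.2f}")
# MTTF grows as faults are removed: 0.40 -> 0.62 -> 10.00
```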
Applying the framework (3) • Evaluation of methods and tools
Agenda • Software Engineering Measurements. • Measurement Theory. • A Goal-Based Framework for Software Measurement. • Verification and Validation Metrics.
V&V application of metrics • Applicable throughout the lifecycle. • Should be condensed for small projects. • Used to assess product, process, resources. • V&V metric characteristics: • Simplicity • Objectivity • Ease of collection • Robustness (insensitive to changes) • Validity