
CS567 Software Testing, Quality Assurance and Maintenance Spring 2005 Professor Sally Lee slee2@stevens.edu



Presentation Transcript


  1. CS567 Software Testing, Quality Assurance and Maintenance, Spring 2005. Professor Sally Lee, slee2@stevens.edu
  Text Book – Robert Culbertson, Chris Brown and Gary Cobb, “Rapid Testing”, Prentice-Hall, 2002. ISBN 0-13-091294-8
  Homework assignments – mostly reading assignments
  Case History Projects – 3 projects, 60% of the final grade
  • First case history project: 15% (2/3/05 – 2/17/05)
  • Second case history project: 20% (2/17/05 – 3/10/05)
  • Third case history project: 25% (3/17/05 – 4/7/05)
  End of Semester Evaluation – final exam (20%, 5/5/05) and project (20%, includes an in-class presentation 4/28/05), 40% of the final grade
  Grades – all exam, project, and final grades will be calculated according to the following scale:
  A ..... 85 and above
  B ..... 75 – 84
  C ..... 65 – 74
  D ..... 55 – 64
  F ..... 54 and below
  There will be absolutely no negotiations for grades.

  2. Software faults are common for the simple reason that the complexity in modern systems is often pushed into the software part of the system. • It is estimated that 60-90% of current computer errors are from software faults. • Software flaws that permit variables to take on values outside of their intended operating limits often cause software failures (see the sketch after this slide) • Software fails when it is coded correctly but the design is in error • Software fails when the hardware or operating systems are changed in a way unanticipated by the designer • Software often fails when users overload it • Most software failures are due to some human error
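
To make the out-of-range point concrete, here is a minimal Python sketch (an illustration, not from the course material; the throttle setting and its limits are hypothetical) showing how an unchecked variable can silently take a value outside its intended operating limits, and how a simple guard turns that latent fault into a detectable error:

```python
# Hypothetical example: a percentage setting with intended operating limits.
VALID_RANGE = (0, 100)

def set_throttle_unchecked(percent: int) -> int:
    # Latent fault: nothing stops percent from being 250 or -10.
    return percent

def set_throttle_checked(percent: int) -> int:
    # Guard: reject values outside the intended operating limits.
    lo, hi = VALID_RANGE
    if not lo <= percent <= hi:
        raise ValueError(f"throttle {percent} outside operating limits {VALID_RANGE}")
    return percent

if __name__ == "__main__":
    print(set_throttle_unchecked(250))   # silently propagates an invalid state
    try:
        set_throttle_checked(250)
    except ValueError as err:
        print("caught:", err)            # the fault is detected before it can cause a failure
```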

  3. December, 2000. A study by the Department of Commerce’s National Institute of Standards and Technology (NIST) found that “bugs have become so frequent and harmful that they cost the US economy an estimated $59.5 billion annually.” “In fact, 80% of software development costs are now allocated to testing activities, so expanding the amount of testing may not be a good objective or even a feasible one.” “Improving the efficiency of the testing infrastructure by developing best test methods, which can be adopted as standards, appears to be the logical direction of response.”

  4. In “Uniform Theory of Reliability Based Software Engineering”, Prof. L. Bernstein describes: A fault is an erroneous state of software, and fault tolerance is the ability of the software system to avoid executing the fault in a way that causes the system to fail. The reliability of a system as a function of time, R(t), is the conditional probability that the system has not failed in the interval [0, t], given that it was operational at t = 0. The most common reliability model is R(t) = e^(−λt), where λ is the failure rate. In a two-state continuous-time Markov chain, the parameters to be estimated are the failure rate λ and the repair rate µ. [Diagram: two states, 0 and 1; the operational state transitions to the failed state at rate λ, and the failed state transitions back at repair rate µ.]
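
As a quick illustration of the exponential reliability model above, the following Python sketch evaluates R(t) = e^(−λt) for a made-up failure rate (the value of λ below is an assumption chosen only for the example):

```python
import math

def reliability(t_hours: float, failure_rate: float) -> float:
    """Probability the system has not failed in [0, t], given it was operational at t = 0."""
    return math.exp(-failure_rate * t_hours)

lam = 0.001  # hypothetical failure rate: 1 failure per 1000 hours
for t in (100, 1000, 10000):
    print(f"R({t}) = {reliability(t, lam):.3f}")
```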

  5. The Mean Time To Failure (MTTF) = 1/λ. The Mean Time To Repair (MTTR) = 1/µ. The steady-state availability is: Availability = MTTF / (MTTF + MTTR) = 1 / [1 + λ/µ]. The goal of Software Fault Tolerance is to make Availability = 1.
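
A short worked example of these formulas, using hypothetical failure and repair rates (not course data):

```python
lam = 0.001   # hypothetical failure rate: 1 failure per 1000 hours -> MTTF = 1000 h
mu = 0.5      # hypothetical repair rate: 1 repair per 2 hours      -> MTTR = 2 h

mttf = 1 / lam
mttr = 1 / mu
availability = mttf / (mttf + mttr)          # equivalently 1 / (1 + lam / mu)

print(f"MTTF = {mttf:.0f} h, MTTR = {mttr:.0f} h")
print(f"Availability = {availability:.5f}")  # approaches 1 as MTTR -> 0
```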

  6. There are 2 major demands placed on today’s software test engineers: • We need to test quickly to meet aggressive product delivery schedules • We need to test well enough that damaging defects don’t escape to our customers

  7. What is Software Testing? Software testing is a process of analyzing or operating software for the purpose of finding bugs. The word “process” is used to emphasize that testing involves planned, orderly activities. A bug is a flaw in the development of the software that causes a discrepancy between the expected result of an operation and the actual result. The bug could be a coding problem, a problem in the requirements or the design, or it could be a configuration or data problem. It could also be something that is different from what the customer expected, which may or may not be in the product specifications.
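
The notion of a bug as a discrepancy between expected and actual results can be shown with a tiny, deliberately buggy example (the function and test below are hypothetical illustrations, not course code):

```python
import unittest

def average(values):
    # Bug: off-by-one in the divisor, so the actual result differs from the expected one.
    return sum(values) / (len(values) + 1)

class AverageTest(unittest.TestCase):
    def test_average_of_three_values(self):
        # Expected result is 4; the buggy implementation returns 3, so this test fails.
        self.assertEqual(average([2, 4, 6]), 4)

if __name__ == "__main__":
    unittest.main()
```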

  8. There are two basic functions of software testing: Verification and Validation. Schulmeyer and Mackenzie (2000) define them as follows: Verification is the assurance that the products of a particular phase in the development process are consistent with the requirements of that phase and the preceding phase. It focuses on the activities of a particular phase of the development process. Validation is the assurance that the final product satisfies the system requirements. It makes sure the right product is being built.

  9. Software testing encompasses many testing strategies. They include dynamic versus static testing, and white (glass) box versus black box testing. Dynamic testing is concerned with demonstrating the software’s run-time behavior in response to selected inputs and conditions. Static testing involves inspections, code walkthroughs, design reviews and program proving. Black box testing focuses on the inputs, outputs, and principal functions of a software module. White box testing looks into the structure of the code of a software module.
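
A small sketch of the black box versus white box distinction for dynamic testing (the triangle-classification function and its tests are hypothetical examples): black box tests are chosen from the specified inputs and outputs alone, while white box tests are chosen by reading the code so that every branch is exercised.

```python
import unittest

def classify_triangle(a, b, c):
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

class BlackBoxTests(unittest.TestCase):
    # Black box: derived purely from the specified input/output behavior.
    def test_equilateral(self):
        self.assertEqual(classify_triangle(3, 3, 3), "equilateral")

class WhiteBoxTests(unittest.TestCase):
    # White box: derived by inspecting the code so that every branch is executed.
    def test_each_branch(self):
        self.assertEqual(classify_triangle(3, 3, 4), "isosceles")
        self.assertEqual(classify_triangle(3, 4, 5), "scalene")

if __name__ == "__main__":
    unittest.main()
```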

  10. Testing strategies by category:
  Dynamic – Black box (Functional): Random testing, Domain testing, Cause-effect graphing, Data generation
  Dynamic – White box (Structural): Computation testing, Domain testing, Path-based testing, Mutation analysis
  Static – Black box (Functional): Specification proving
  Static – White box (Structural): Code walkthroughs, Inspections, Program proving, Symbolic execution, Anomaly analysis

  11. We need to analyze the current software development process for ways to improve testing efficiency. We need to look at every phase of the software process from the viewpoint of the test engineer to see if there is a way to speed up testing while maintaining or improving quality.

  12. Software Development Process – Waterfall Life Cycle Model. Phases: Requirements → Preliminary Design → Detailed Design → Code and Unit Test → Integration & System Test → Acceptance Test, Implementation → Operations & Maintenance. Characteristic: each phase of the model is completed before the next stage begins. The process that relates to the building of a product is often called a life cycle; the development of a software product is often called a software life cycle.

  13. In the traditional waterfall model, the role of the test organization is not made explicit until the system testing and acceptance testing phases. To produce a quality product, studies have shown that testing should start at an early stage of the development cycle. An example of a waterfall test process life cycle model is: Requirements Analysis → Test Planning → Test Design → Test Implementation → Test Debugging → System Testing → Acceptance Testing → Operations & Maintenance.

  14. Development and Test Processes (Parallel Waterfall Model) – tying testing and development together. The development thread (Requirements → System Design → Program Design → Code & Unit Test → Integration Test → Operations & Maintenance) runs in parallel with the test thread (Test Planning → Test Design → Test Implementation → Test Debug → System Testing → Acceptance Testing). Each test activity verifies the corresponding development phase, and the finished system is validated against the requirements.

  15. Development and test activities need to be closely integrated. This integration of development and test activities needs to begin at the front end of the development process – when requirements are elicited from the user. The development team needs a clear set of requirements in order to design the software system, and the test team needs requirements that are clear, unambiguous, and testable in order to develop a test plan and test designs. Every requirement must be testable!

  16. Studies performed at companies including GTE, TRW, IBM and HP have measured and assigned costs to errors occurring at various phases of the project life cycle. Davis (1993) summarized the relative cost to repair as follows:
  Requirements: 0.1 – 0.2
  Design: 0.5
  Coding: 1
  Unit Test: 2
  Acceptance Test: 5
  Maintenance: 20
  The cost to find and fix a defect increases exponentially after the requirements phase.

  17. The test team needs to be involved early in the requirements phase because: • The test team needs early and accurate requirements specifications to develop a test plan and test designs • The test team needs to apply static testing to the requirements specifications to prevent defects from escaping to later phases.

  18. Discussion on Requirements • IEEE Standard 830 defines an outline for a requirements document: • Introduction • Purpose • Scope • Overview • General Description • Product Perspective • Product Functions • User Characteristics • General Constraints • Assumptions and Dependencies • Specific Requirements • Functional Requirements • External Interface Requirements • Performance Requirements • Design Constraints • Attributes • Other Requirements

  19. Requirements Testing • Static Testing on the requirements documents • Inspections • Walkthrough • Peer Reviews

  20. Basic criteria that should be used in the static testing of requirements specifications – each requirement must be: • Complete • Unambiguous • Consistent • Traceable • Feasible • Testable • Understandable

  21. Test Planning • Define the test strategy • Define the scope of testing • Define the testing approach • Define testing criteria and quality checkpoints • Define the automation strategy • Define the test system (hardware and software): test architecture, test tools, test environment, test configurations • Estimate the test effort (resources and schedule): identify tasks, determine effort, determine duration and construct schedule, assess schedule risks and formulate mitigation plans • Prepare and review the test plan documents

  22. IEEE Standard 829 test plan template: • Introduction • Test Items • Features to be tested • Features not to be tested • Approach • Item pass/fail criteria • Suspension criteria and resumption requirements • Test deliverables • Testing tasks • Environmental needs • Responsibilities • Staffing and training needs • Schedule • Risks and contingencies • Approvals

  23. Homework (week 1, 1/20/05) • Read chapters 1-3 • Research on the web or in the library and find the advantages and disadvantages of the “Waterfall software development process”
