
Software Engineering Session 7

Presentation Transcript


  1. Software Engineering Session 7 Software Testing

  2. Recap • Fundamental development activities • Specification • Development • Validation • Maintenance • So far, you have • an implementation of the software product, AND • a pile of documents including requirements, models, designs, … • This week • Software testing

  3. Learning Objectives • Understand the stages of testing • From testing during development to acceptance testing by system customers • Learn techniques that help to choose test cases geared to discovering program defects • Know about three distinct types of testing: component testing, system testing and release testing • Understand the distinction between development testing and user testing

  4. Why? Software Defects and Failures • Software defects and failures cost billions of pounds annually. • They can cause inconvenience on a massive scale. • In some cases, they cost lives. • Defects and failures can never be entirely eradicated. • However, the incidence of defects and failures can be significantly reduced through systematic and scientific software testing. • (Dijkstra 1972) Testing can only show the presence of errors, not their absence.

  5. Software Testing • Software testing is a quality assurance activity that is designed to ensure that software is fit for purpose. • In traditional software development methodologies, testing has been viewed as a discrete phase of the development lifecycle. • In contemporary software development practices, (general) software testing is integrated into all phases of the development process. • In agile methods, tests are written before any coding takes place, and • a project only moves forward once tests have been carried out, and identified faults dealt with (i.e. Test-driven Development).
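
To make the test-driven idea concrete, here is a minimal sketch in Python; the function, its discount rule and the file name are invented purely for illustration and are not taken from any particular project. The test class is written first, watched to fail, and only then is the production code added.

```python
# test_discount.py -- in test-driven development this test is written BEFORE
# the production code exists. calculate_discount and its 10%-over-100 rule
# are hypothetical, chosen only to illustrate the workflow.
import unittest

def calculate_discount(order_total):
    """Production code, added only after the tests below were seen to fail."""
    return order_total * 0.10 if order_total >= 100 else 0.0

class TestDiscount(unittest.TestCase):
    def test_orders_of_100_or_more_get_ten_percent(self):
        self.assertAlmostEqual(calculate_discount(200), 20.0)

    def test_small_orders_get_no_discount(self):
        self.assertAlmostEqual(calculate_discount(50), 0.0)

if __name__ == "__main__":
    unittest.main()
```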

  6. Testing process goals Defect testing • To discover faults or defects in the software • where its behaviour is incorrect, undesirable, or not in conformance with its specification • Root out undesirable system behaviour • E.g. system crashes, unwanted interactions with other systems, incorrect computations and data corruption. • Test cases are designed to expose defects • can be deliberately obscure and need not reflect how the system is normally used. • A successful test makes the system perform incorrectly and so exposes a defect in the system. Validation testing • To demonstrate to the developer/customer that the software meets its requirements • For custom software, there should be at least one test for every requirement • For generic software products, there should be tests for (combinations of) all of the system features • You expect the system to perform correctly using a given set of test cases that reflect the system’s expected use. • A successful test shows that the system operates as intended.

  7. Software Validation and Verification • Software testing is part of a broader process of software validation and verification (V&V). • Note: validation is essential because statements of requirements do not always reflect the real wishes/needs of customers/users.

  8. Static Testing • V&V process involves more than software testing • Static “testing” is any testing in which the program code is not executed. • Static testing is part of validation (but sometimes also verification); it can take place at any point in the development lifecycle. • Examples of static testing activities include walkthroughs and inspections.

  9. Inspections • Inspections involve a systematic, formal scrutiny of development artefacts (code, requirements specs, etc.) • with the aim of discovering anomalies and defects earlier in the development lifecycle • Advantages over testing • Because inspection is a static process, there is no need to be concerned with interactions between errors. • Incomplete versions of a system can be inspected. • Can also be used to check non-functional attributes of a program (e.g. compliance with standards, portability and maintainability)

  10. (Dynamic) Testing • Dynamic testing is any testing which involves the execution of system code. • Dynamic testing is generally considered to be part of the software verification process. • Examples of dynamic testing techniques include: • Unit testing • Component testing • System/Integration testing • Release testing • User testing • We will be focusing on dynamic testing!

  11. Black-Box vs White-Box Testing These are general methodologies! • A black-box test is any test in which the inner structure (code) of the artefact being tested is not known to the tester. • System testing, acceptance testing. • Usually performed by software testers. • Test cases based on the requirements specification. • A white-box test is any test in which the inner structure (code) of the artefact being tested is known to the tester. • Unit tests, component tests, integration tests. • Usually performed by software developers. • Test cases based on the code base.

  12. Testing Process (and a definition) Antonia Bertolino, FOSE’07. Software testing research: achievements, challenges, dreams. Software testing consists of the dynamic verification of the behaviour of a program on a finite set of test cases, suitably selected from the usually infinite executions domain, against the specified expected behaviour. • The testing process, as used in plan-driven development • Test case: a specification of the input, the expected output, and what is being tested • Test data: the inputs that have been devised to exercise the system
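
One lightweight way to capture the distinction between a test case and test data in code is sketched below; the names (TestCaseSpec, is_even) are illustrative only. Each test case records what is being tested, the input (the test data) and the expected output.

```python
# Illustrative sketch: a test case = input + expected output + what is tested;
# the test data are just the inputs on their own.
from dataclasses import dataclass

@dataclass
class TestCaseSpec:
    description: str   # what is being tested
    test_input: int    # the test data (an input that has been devised)
    expected: bool     # the specified expected behaviour

def is_even(n: int) -> bool:
    """Trivial unit under test, invented for the example."""
    return n % 2 == 0

cases = [
    TestCaseSpec("even number is accepted", 4, True),
    TestCaseSpec("odd number is rejected", 3, False),
]

for case in cases:
    assert is_even(case.test_input) == case.expected, case.description
print("all test cases passed")
```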

  13. Test Planning • All software testing must be systematically planned in advance. • This involves the creation of a detailed test plan. • A test plan typically records what is to be tested, how and when it will be tested, who is responsible for each stage, and how the results will be recorded.

  14. Stages of Testing • Development testing • The system is tested during development to discover errors and defects. • Unit, component, integration and regression testing. • Release testing • A separate testing team test a complete version of the system before it is released to users. • Requirements testing, scenario based testing, performance testing • User testing • Users of a system test the system in their own environment. • Alpha testing, beta testing, acceptance testing

  15. Development Testing

  16. Development testing • Development testing includes all testing activities that are carried out (mostly) by the team developing the system. • Unit testing • individual program units or object classes are tested. Unit testing should focus on testing the functionality of objects or methods. • Component testing • several individual units are integrated to create composite components. • Component testing should focus on testing component interfaces. • System testing • some or all of the components in a system are integrated and the system is tested as a whole. • System testing should focus on testing component interactions.

  17. Unit Testing • Unit testing is the process of testing individual system components in isolation • It is a defect testing process. • In OO development, unit tests are used to test: • Object classes. • Object methods. • Object attributes. • The percentage of code which is exercised by unit tests is called test coverage. • Ideally, test coverage should be as close to 100% as is feasible within project constraints.
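
A minimal unit-test sketch using Python's standard unittest module; the Counter class is invented so that the unit under test is a single object class tested in isolation. Running such a suite under a coverage tool (e.g. coverage.py) reports the test coverage percentage mentioned above.

```python
import unittest

class Counter:
    """Deliberately small class: the unit under test, in isolation."""
    def __init__(self):
        self._value = 0

    def increment(self):
        self._value += 1

    def value(self):
        return self._value

class TestCounter(unittest.TestCase):
    def test_starts_at_zero(self):
        # Even seemingly trivial behaviour (the initial value) gets a test.
        self.assertEqual(Counter().value(), 0)

    def test_increment_adds_one(self):
        c = Counter()
        c.increment()
        self.assertEqual(c.value(), 1)

if __name__ == "__main__":
    unittest.main()
```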

  18. Testing Strategies: how to choose testing cases? • Partition testing • This involves identifying groups of test data that have common characteristics and should be processed in the same way. • Test data should be chosen from within each of the identified groups. • Guideline-based testing • This involves using existing test guidelines to choose test cases. • Test guidelines reflect previous experience of the kinds of errors that programmers often make when developing components.

  19. Partition Testing (black-box) • In partition testing, input data and output results often fall into different classes where all members of a class are related. • Each of these classes is an equivalence partition or domain where • the program behaves in an equivalent way for each class member. • Test cases should be chosen from each partition. • In many applications, errors occur at the boundaries of the input domain. Boundary value analysis is used to identify errors at these boundaries rather than at the centre of the input domain.
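
A sketch of equivalence partitioning and boundary value analysis; the validation rule (inputs are valid only between 1 and 99) is invented for the example. One test is drawn from each partition, and extra tests sit on and just outside each boundary.

```python
# Hypothetical rule under test: an input is valid iff 1 <= n <= 99.
def is_valid(n: int) -> bool:
    return 1 <= n <= 99

# Equivalence partitions: below the range, inside the range, above the range.
partition_cases = [(-5, False), (50, True), (150, False)]

# Boundary value analysis: values on and just outside each boundary.
boundary_cases = [(0, False), (1, True), (99, True), (100, False)]

for n, expected in partition_cases + boundary_cases:
    assert is_valid(n) == expected, f"is_valid({n}) should be {expected}"
print("all partition and boundary tests passed")
```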

  20. General testing guidelines • Run positive tests and negative tests, and test boundary cases. • Choose inputs that force the system to generate all error messages. • Design inputs that cause input buffers to overflow. • Repeat the same input or series of inputs numerous times. • Force invalid outputs to be generated. • Force computation results to be too large or too small.

  21. Specific Testing guidelines (sequences, arrays, lists) • Test each aspect of the unit under test, including seemingly trivial aspects (e.g. set and get methods). • Test only a single aspect at a time (e.g. a single method). • Keep tests as simple as possible. • Keep tests independent of each other. • Document each test carefully (e.g. name, description, tester). • Test software with sequences which have only a single value. • Use sequences of different sizes in different tests. • Derive tests so that the first, middle and last elements of the sequence are accessed. • Test with sequences of zero length.
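
A sketch applying the sequence-related guidelines; find_largest is a small function invented as the unit under test. The tests use a zero-length sequence, a single-value sequence, and sequences where the interesting element is first, in the middle and last.

```python
import unittest

def find_largest(seq):
    """Unit under test, invented for illustration: largest element of a non-empty sequence."""
    if not seq:
        raise ValueError("empty sequence")
    largest = seq[0]
    for x in seq[1:]:
        if x > largest:
            largest = x
    return largest

class TestFindLargest(unittest.TestCase):
    def test_zero_length_sequence(self):                 # sequences of zero length
        with self.assertRaises(ValueError):
            find_largest([])

    def test_single_value_sequence(self):                # a sequence with only a single value
        self.assertEqual(find_largest([7]), 7)

    def test_largest_at_first_middle_and_last(self):     # first, middle and last elements accessed
        self.assertEqual(find_largest([9, 1, 2]), 9)
        self.assertEqual(find_largest([1, 9, 2]), 9)
        self.assertEqual(find_largest([1, 2, 9]), 9)

if __name__ == "__main__":
    unittest.main()
```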

  22. Control Flow Testing (white-box) • Control-flow testing checks that program statements and decisions are exercised when the code is executed • The control flow graph is used for defining and analysing control-flow test criteria. • The basic control-flow criteria are • Statement Coverage (SC): requires that test data must execute every statement of the program at least once • branch coverage • condition coverage • path coverage • condition combination coverage • … • Zhu et al. Software unit test coverage and adequacy. ACM Computing Surveys, 29(4), 1997.
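
A small sketch of the difference between statement coverage and branch coverage; the absolute function is invented for the example. The single negative input executes every statement, but only the two-test set exercises both outcomes of the decision. A tool such as coverage.py can measure this (it reports branch coverage when run with its --branch option).

```python
def absolute(n: int) -> int:
    """Unit under test: one decision, hence two branches (n < 0 taken / not taken)."""
    if n < 0:
        n = -n
    return n

# Statement coverage: this single input executes every statement
# (the if-body and the return), yet never exercises the 'false' branch.
statement_coverage_tests = [(-3, 3)]

# Branch coverage: both outcomes of the decision must be exercised,
# so a non-negative input is needed as well.
branch_coverage_tests = [(-3, 3), (4, 4)]

for n, expected in branch_coverage_tests:
    assert absolute(n) == expected
print("branch-coverage test set passed")
```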

  23. Component Testing • Software components are often made up of several interacting objects. • Communication between these objects is enabled via defined component interfaces. • Component testing focuses on showing that a component interface behaves according to its specification. • Component testing should take place only when unit testing is complete.

  24. Interface testing • Objectives are to detect faults due to interface errors or invalid assumptions about interfaces. • Interface types • Parameter interfaces. Data passed from one method or procedure to another. • Shared memory interfaces. A block of memory is shared between procedures or functions. • Data is placed in the memory by one subsystem and retrieved by other subsystems. • Procedural interfaces. A sub-system encapsulates a set of procedures to be called by other sub-systems. • Message passing interfaces. Sub-systems request services from other sub-systems.

  25. Interface errors • Interface misuse • A calling component calls another component and makes an error in its use of its interface, e.g. passing parameters in the wrong order. • Common in parameter interfaces • Interface misunderstanding • A calling component embeds assumptions about the behaviour of the called component which are incorrect. • E.g. binary search is called on an unordered array • Timing errors • The called and the calling component operate at different speeds and out-of-date information is accessed. • Common in real-time systems using shared memory or message-passing interfaces
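
A small sketch of the "interface misunderstanding" error using Python's standard bisect module: the called component assumes a sorted list, the caller passes an unsorted one, and the result is silently wrong rather than an obvious failure. The function name is invented.

```python
import bisect

def contains_sorted(sorted_items, target):
    """Called component: a binary search that ASSUMES sorted_items is in ascending order."""
    i = bisect.bisect_left(sorted_items, target)
    return i < len(sorted_items) and sorted_items[i] == target

# Correct use of the interface: the precondition (sorted input) holds.
assert contains_sorted([1, 3, 5, 7], 5)

# Interface misunderstanding: the caller passes an UNSORTED list,
# so the search misses an element that is actually present.
print(contains_sorted([5, 9, 2, 7], 5))   # prints False, although 5 is in the list
```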

  26. Interface testing guidelines • Design tests so that parameters to a called procedure are at the extreme ends of their ranges. • Always test pointer parameters with null pointers. • Design tests which cause the component to fail. • Use stress testing in message passing systems. • Design tests that generate many more messages than usual • In shared memory systems, vary the order in which components are activated.
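
A sketch of the first two guidelines applied to a hypothetical parameter interface (record_reading is invented): parameters are pushed to the extreme ends of their ranges, and the pointer-like parameter is tested with null (None in Python).

```python
import sys
import unittest

def record_reading(buffer, value):
    """Hypothetical interface under test: appends a reading to a caller-supplied buffer."""
    if buffer is None:
        raise ValueError("buffer must not be None")
    buffer.append(value)
    return len(buffer)

class TestRecordReadingInterface(unittest.TestCase):
    def test_extreme_parameter_values(self):
        # Parameters at the extreme ends of their ranges.
        buf = []
        self.assertEqual(record_reading(buf, -sys.maxsize - 1), 1)
        self.assertEqual(record_reading(buf, sys.maxsize), 2)

    def test_null_pointer_parameter(self):
        # Always test pointer parameters with null (None in Python).
        with self.assertRaises(ValueError):
            record_reading(None, 42)

if __name__ == "__main__":
    unittest.main()
```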

  27. System Testing • System testing takes place during development and before user testing. • It involves integrating components to create a functioning version of the system, and then testing the integrated system. • System testing checks that modules and components are • compatible, • interact correctly and • transfer the right data at the right time across their interfaces. • System testing tests the emergent behaviours of a system • Some system functionality and characteristics only become obvious when put together

  28. System and component testing • During system testing, reusable components that have been separately developed and off-the-shelf systems may be integrated with newly developed components. • The complete system is then tested. • Components developed by different team members or sub-teams may be integrated at this stage. System testing is a collective rather than an individual process. • In some companies, system testing may involve a separate testing team with no involvement from designers and programmers.

  29. Use-case testing • System testing focuses on testing the interactions between the components and objects • The use cases developed to identify system interactions can be used as a basis for system testing. • Each use case usually involves several system components, so testing the use case forces these interactions to occur. • The sequence diagrams associated with the use case document the components and interactions that are being tested.

  30. Test cases derived from a sequence diagram • An input request for a report should have an associated acknowledgement, and a report should ultimately be returned from the request. • You should create summarised data that can be used to check that the report is correctly organised. • An input request for a report to WeatherStation results in a summarised report being generated. • This can be tested by • creating raw data corresponding to the summary that you have prepared for the test of SatComms, • and checking that the WeatherStation object correctly produces this summary. • This raw data is also used to test the WeatherData object. (Figure: Collect weather data sequence diagram)
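
A sketch of how such a test might look in code. The WeatherData and WeatherStation classes below are heavily simplified stand-ins for the wilderness weather station design, invented only to show the pattern: raw data is created to match a summary prepared by hand, and the test checks that the station produces exactly that summary.

```python
import unittest

class WeatherData:
    """Simplified stand-in: holds the raw readings used to drive the test."""
    def __init__(self, temperatures):
        self.temperatures = temperatures

class WeatherStation:
    """Simplified stand-in: produces a summarised report from raw weather data."""
    def __init__(self, data):
        self.data = data

    def report(self):
        temps = self.data.temperatures
        return {"max": max(temps), "min": min(temps), "avg": sum(temps) / len(temps)}

class TestCollectWeatherData(unittest.TestCase):
    def test_report_matches_prepared_summary(self):
        raw = WeatherData([10.0, 14.0, 12.0])           # raw data matching the prepared summary
        expected = {"max": 14.0, "min": 10.0, "avg": 12.0}
        self.assertEqual(WeatherStation(raw).report(), expected)

if __name__ == "__main__":
    unittest.main()
```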

  31. System Testing • There are several forms of system testing.

  32. Release Testing

  33. Release Testing • Release testing is also a form of system testing. • However, it differs from other forms of system testing in that: • The objective is to check that the system meets its requirements (validation) rather than to uncover code defects (verification). • A separate team that has not been involved in the system development should be responsible for release testing. • Examples of release testing include: • Scenario testing • Requirements-based testing • Performance testing

  34. Three main examples • Requirements-based testing involves examining each requirement and developing tests for it. • Scenario testing is an approach whereby • typical usage scenarios are devised and used to develop test cases for the system. • Part of release testing may involve testing the emergent properties of a system, such as performance and reliability.

  35. Requirements-based testing: example • Mentcare system requirements: • If a patient is known to be allergic to any particular medication, then prescription of that medication shall result in a warning message being issued to the system user. • If a prescriber chooses to ignore an allergy warning, they shall provide a reason why this has been ignored. • Tests: • Set up a patient record with no known allergies. Prescribe medication for allergies that are known to exist. Check that a warning message is not issued by the system. • Set up a patient record with a known allergy. Prescribe the medication that the patient is allergic to, and check that the warning is issued by the system. • Set up a patient record in which allergies to two or more drugs are recorded. Prescribe both of these drugs separately and check that the correct warning for each drug is issued. • Prescribe two drugs that the patient is allergic to. Check that two warnings are correctly issued. • Prescribe a drug that issues a warning and overrule that warning. Check that the system requires the user to provide information explaining why the warning was overruled.
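
A sketch of how the first two of these requirements-based tests might be automated. The PatientRecord class and prescribe function are invented stand-ins, not the real Mentcare system; they exist only to show how each test maps directly back to a requirement.

```python
import unittest

class PatientRecord:
    """Invented stand-in for a Mentcare patient record."""
    def __init__(self, allergies=()):
        self.allergies = set(allergies)

def prescribe(record, drug):
    """Invented prescribing function: returns a warning message if the drug matches a known allergy."""
    if drug in record.allergies:
        return f"WARNING: patient is allergic to {drug}"
    return None

class TestAllergyWarnings(unittest.TestCase):
    def test_no_warning_when_no_known_allergies(self):
        record = PatientRecord(allergies=[])
        self.assertIsNone(prescribe(record, "penicillin"))

    def test_warning_when_drug_matches_known_allergy(self):
        record = PatientRecord(allergies=["penicillin"])
        self.assertIn("WARNING", prescribe(record, "penicillin"))

if __name__ == "__main__":
    unittest.main()
```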

  36. A usage scenario for the Mentcare system George is a nurse who specializes in mental healthcare. One of his responsibilities is to visit patients at home to check that their treatment is effective and that they are not suffering from medication side effects. On a day for home visits, George logs into the Mentcare system and uses it to print his schedule of home visits for that day, along with summary information about the patients to be visited. He requests that the records for these patients be downloaded to his laptop. He is prompted for his key phrase to encrypt the records on the laptop. One of the patients that he visits is Jim, who is being treated with medication for depression. Jim feels that the medication is helping him but believes that it has the side effect of keeping him awake at night. George looks up Jim’s record and is prompted for his key phrase to decrypt the record. He checks the drug prescribed and queries its side effects. Sleeplessness is a known side effect so he notes the problem in Jim’s record and suggests that he visits the clinic to have his medication changed. Jim agrees, so George enters a prompt to call him when he gets back to the clinic to make an appointment with a physician. George ends the consultation and the system re-encrypts Jim’s record. After finishing his consultations, George returns to the clinic and uploads the records of patients visited to the database. The system generates a call list for George of those patients whom he has to contact for follow-up information and to make clinic appointments.

  37. Features tested by scenario • Authentication by logging on to the system. • Downloading and uploading of specified patient records to a laptop. • Home visit scheduling. • Encryption and decryption of patient records on a mobile device. • Record retrieval and modification. • Links with the drugs database that maintains side-effect information. • The system for call prompting.

  38. Performance testing • Testing the emergent properties of a system • performance and reliability. • Tests should reflect the operational profile of the system • A set of tests that reflect the actual mix of work that will be handled by the system • Performance tests usually involve planning a series of tests where • the load is steadily increased until the system performance becomes unacceptable. • Stress testing is a form of performance testing • where the system is deliberately overloaded to test its failure behaviour.
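
A minimal sketch of the "increase the load until performance becomes unacceptable" idea. handle_request, the worker pool size and the acceptability threshold are all placeholders; in a real performance test the requests would go to the deployed system and the mix of requests would follow its operational profile.

```python
import time
from concurrent.futures import ThreadPoolExecutor

MAX_WORKERS = 50            # stand-in for the system's fixed capacity

def handle_request(i):
    """Placeholder for a call into the system under test."""
    time.sleep(0.002)       # simulate a small, fixed amount of work per request
    return i

def measure(load):
    """Submit `load` requests concurrently and return the elapsed wall-clock time."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        list(pool.map(handle_request, range(load)))
    return time.perf_counter() - start

ACCEPTABLE_SECONDS = 0.5    # placeholder acceptability threshold
load = 10
while True:
    elapsed = measure(load)
    print(f"load={load:6d}  elapsed={elapsed:.3f}s")
    if elapsed > ACCEPTABLE_SECONDS:
        print(f"performance became unacceptable at load {load}")
        break
    load *= 2               # steadily increase the load
```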

  39. User Testing

  40. User Testing • User or customer testing is a stage in the testing process in which a system is tested by actual users. • User testing is essential, even when comprehensive system and release testing have already been carried out. • Influences from the user’s working environment have a major effect on the reliability, performance, usability and robustness of a system. These cannot be replicated in a testing environment. • User testing generally involves extensive usability testing. • In agile approaches, the user is involved in the development and testing of a system from the start.

  41. User Testing • Alpha testing • Users of the software work with the development team to test the software at the developer’s site. • Beta testing • A release of the software is made available to users to allow them to experiment and to raise problems that they discover with the system developers. • Acceptance testing • Customers test a system to decide whether or not it is ready to be accepted from the system developers and deployed in the customer environment.

  42. Stages in the acceptance testing process • Define acceptance criteria • Plan acceptance testing • Derive acceptance tests • Run acceptance tests • Negotiate test results • Reject/accept system

  43. A summary

  44. A very good reference
