
Advanced Software Engineering: Software Testing, 2007, COMP 3705 (Lecture 1)




  1. Advanced Software Engineering: Software Testing, 2007, COMP 3705 (Lecture 1) Sada Narayanappa; TA: Seif Azgandhi; Anneliese Andrews (Chair, DU); Thomas Thelin; Carina Andersson

  2. Facts about testing System development: • 1/3 planning • 1/6 coding • 1/4 component test • 1/4 system test [Brooks75]

  3. TMM Test process evaluation • Guided by the Capability Maturity Model (CMM) • Defines stages (levels) to evolve from and to • Each level except Level 1 has internal structure • Levels 2 and above have: • Structure • Goals • Organizational structure

  4. Internal structure of TMM Each level indicates maturity goals, which contain a testing capability. The goals are supported by maturity subgoals, which are achieved by activities, tasks, and responsibilities. These address implementation and organizational adaptation, and are organized by three critical views: manager, developer, and user/client.

  5. Test Maturity Model

  6. TMM Levels • Level 1 – No process defined; debugging and testing are treated as the same • Level 2 – Test and debug tools / test plan / basic test process • Level 3 – Test organization / technical training / lifecycle / control and monitoring; supports the V-model • Level 4 – Review process / test measurement / software quality evaluation; test logging with severity • Level 5 – Defect prevention / quality / process optimization

  7. Good enough quality To claim that any given thing is good enough is to agree with all of the following propositions: • It has sufficient benefits • It has no critical problems • The benefits sufficiently outweigh the problems • In the present situation, and all things considered, further improvement would be more harmful than helpful James Bach, IEEE Computer, 30(8):96-98, 1997.

  8. Quality attributes – ISO 9126

  9. Quality attributes

  10. Why use testing? • Risk mitigation • Faults are found early • Faults can be prevented • Reduce lead-time • Deliverables can be reused • …

  11. Why do faults occur in software? • Software is written by humans • Who know something, but not everything • Who have skills, but aren’t perfect • Who don’t usually use rigorous methods • Who do make mistakes (errors) • Under increasing pressure to deliver to strict deadlines • No time to check, assumptions may be wrong • Systems may be incomplete • Software is complex, abstract and invisible • Hard to understand • Hard to see if it is complete or working correctly • No one person can fully understand large systems • Numerous external interfaces and dependencies

  12. Fault model – origins of defects • Defect sources: lack of education, poor communication, oversight, transcription, immature process • Impact on software artifacts: errors lead to faults/defects, which lead to failures • Impact from the user's view: poor-quality software, user dissatisfaction

  13. Whoops, that’s my calculator

  14. Testing, Verification & Validation • Testing – the process of evaluating a program or a system • Definition 1: Verification – is the product right? Validation – is it the right product? • Definition 2: Verification – the product satisfies the conditions set at the start of the phase; Validation – the product satisfies the requirements

  15. Definitions • A failure is an event; a fault is a state of the software caused by an error • Error – a human mistake • Fault / Defect – an anomaly in the software • Failure – inability of the software to perform its required functions • Debugging / Fault localization – localizing, repairing, and retesting
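
The error → fault → failure chain above can be illustrated with a small sketch (the `average` function and its typo are invented for illustration):

```python
# A human mistake (error) while typing introduces a defect (fault) into
# the code; the fault is latent until an execution exercises it and the
# program produces a wrong result (failure).

def average(values):
    # Fault: the divisor should be len(values); the stray "+ 1" is the
    # result of a transcription error.
    return sum(values) / (len(values) + 1)

result = average([2, 4, 6])   # the correct answer would be 4.0
failure_observed = (result != 4.0)
```

Debugging (fault localization) would trace the wrong output back to the faulty divisor and repair it.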

  16. Definitions • A test case consists of: • A set of inputs • Execution conditions • Expected outputs • A test is: • A group of related test cases • The procedures needed to carry out those test cases (IEEE definition; an organization may define additional attributes)
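
As a sketch, the IEEE-style test-case structure from the slide might be represented like this (the class and field names are illustrative, not a standard API):

```python
from dataclasses import dataclass, field

@dataclass
class Case:
    inputs: dict                                    # a set of inputs
    conditions: dict = field(default_factory=dict)  # execution conditions
    expected: object = None                         # expected output

def run_case(case, func):
    # The procedure needed to carry out a test case: execute the function
    # under test on the inputs and compare with the expected output.
    return func(**case.inputs) == case.expected

# A "test" is a group of related test cases plus the procedure to run them.
cases = [Case({"a": 2, "b": 3}, expected=5), Case({"a": -1, "b": 1}, expected=0)]
results = [run_case(c, lambda a, b: a + b) for c in cases]
```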

  17. Scripted and non-scripted testing • In scripted testing test cases are pre-documented in detailed, step-by-step descriptions • Different levels of scripting possible • Scripts can be manual or automated • Non-scripted testing is usually manual testing without detailed test case descriptions • Can be disciplined, planned, and well documented exploratory testing • or ad-hoc testing

  18. Test oracle • An oracle is the principle or mechanism by which you recognize a problem • Test oracle provides the expected result for a test, for example • Specification document • Formula • Computer program • Person • In many cases it is very hard to find an oracle • Even the customer and end user might not be able to tell which is the correct behaviour
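
One way to picture an oracle is as an independent check rather than a precomputed expected value. The sketch below (sorting is an assumed system under test, not from the slide) uses a property-based oracle: the output must be ordered and contain the same elements as the input:

```python
def oracle_ok(inp, out):
    # Oracle property 1: the output is in non-decreasing order.
    ordered = all(out[i] <= out[i + 1] for i in range(len(out) - 1))
    # Oracle property 2: the output is a permutation of the input.
    same_elements = sorted(inp) == sorted(out)
    return ordered and same_elements

def system_under_test(xs):
    return sorted(xs)   # stand-in for the real implementation

verdict = oracle_ok([3, 1, 2], system_under_test([3, 1, 2]))
```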

  19. Test Bed • The environment containing • all the hardware and software needed to test a software component or system • Examples: • Simulators • Emulators • Memory checkers

  20. Other Definitions • Important to understand the following definitions • Quality – degree to which specified requirements are met • Metric – a quantitative measure • Quality metrics: • Correctness – performs the required functions • Reliability – performs under stated conditions • Usability – effort needed to use the system • Integrity – withstands attacks • Portability / maintainability / interoperability …
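
As a hedged example of a quantitative quality metric (defect density is a common choice, though the slide does not name one):

```python
def defect_density(defects_found, lines_of_code):
    # Defects per thousand lines of code (KLOC) — a simple, widely
    # used quantitative measure of code quality.
    return defects_found / (lines_of_code / 1000.0)

density = defect_density(12, 4000)   # 12 defects in 4 KLOC
```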

  21. Principle 1 – purpose of testing Testing is the process of: • exercising a software component using a selected set of test cases, with the intent of • Revealing defects • Evaluating quality

  22. Principles 2: A good test case – When the test objective is to detect defects, a good test case is one that has a high probability of revealing a yet-undetected defect 3: Test results – The results should be inspected meticulously 4: Expected output – A test case must contain the expected output

  23. Principles 5: Input – Test cases should be developed for both valid and invalid input conditions 6: Fault content estimation – The probability of the existence of additional defects in a software component is proportional to the number of defects already detected in that component 7: Test organization – Testing should be carried out by a group that is independent of the development group
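
Principle 5 can be sketched as follows (the `percent` function and its contract are invented for illustration): a valid-input case checks the computed value, while an invalid-input case checks that the error is reported as expected:

```python
def percent(part, whole):
    if whole == 0:
        raise ValueError("whole must be non-zero")
    return 100.0 * part / whole

# Valid-input condition: expect a numeric result.
valid_ok = (percent(1, 4) == 25.0)

# Invalid-input condition: the expected output is a raised error.
try:
    percent(1, 0)
    invalid_ok = False
except ValueError:
    invalid_ok = True
```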

  24. Principles 8: Repeatable – Tests must be repeatable and reusable 9: Planned – Testing should be planned 10: Life cycle – Testing activities should be integrated into the software life cycle 11: Creative – Testing is a creative and challenging task

  25. Goals of the course • Knowledge • Skills • Attitudes A test specialist – a trained engineer – has knowledge of test-related • principles, processes, measurements, standards, plans, tools, and methods, and • learns how to apply them to the testing tasks to be performed.

  26. www.swebok.org

  27. www.swebok.org

  28. Defect reports/analysis – defect classes and defect repository • Requirement/specification defect classes: functional description, feature, feature interaction, interface description • Design defect classes: algorithmic and processing; control, logic, and sequence; data; module interface description; external interface description • Coding defect classes: algorithmic and processing; control, logic, and sequence; typographical; data flow; data; module interface; code documentation; external hardware/software • Testing defect classes: test harness, test design, test procedure • The defect repository records defect classes, severity, and occurrences

  29. Lectures • Theory + discussions • Cover the basic parts of software testing • Introduction • Black-box, reliability, usability • Inspections, white-box testing • Lifecycle, documentation • Organization, tools • Metrics, TMM • Research presentation • Lecture perspectives: economic, overview, technical, technical/managerial, managerial

  30. Lab sessions Preparation, Execution, Report • Black-box testing • Usage-based testing and reliability • White-box testing • Inspection and estimation • Software process simulation

  31. Project: Option 1 • Learn a specific area of software testing • Collect and summarize research information • Critical thinking beyond the written information • Present information in a structured way • Peer review

  32. Examination • Written exam based on the book and lab sessions • Lab sessions (must be approved) • Projects/presentations are graded • See the class web site for the assignment policy

  33. Schedule • Read: • Course program • Projects in Software Testing • Check the homepage • Extra lab dates not yet decided

  34. This week • Read the course program • Project • Read Projects in Software Testing • Exercise on Thursday • Decide on a research subject • Discuss papers with me – describe why each is interesting • Lab • Prepare lab 1 • Read Burnstein 1–3 • Prepare Burnstein 4, 12

  35. Project: Option 1 • Research: solve a research problem; survey the state of the art and identify the research problems in some area; develop and justify an extension to an existing technique; etc. • Evaluation: apply and evaluate a technique, or evaluate a commercial testing or analysis tool. • Practical: use an existing technique to test a system, or design and implement a prototype for a system.

  36. Project: Option 1 • Read Projects in Software Testing • Divide in groups (2-3 persons) • Discuss with me • http://www.cs.du.edu/~snarayan/sada/teaching/COMP3705/FilesFromCD/Project/Project_SwTest.pdf

  37. Project: Option 2 • Research paper presentation • Find an interesting paper • We come across many research papers during class – pick one for presentation • There will be four paper presentations – choose your team and prepare • Reading a paper takes time – start early
