
IPA Lentedagen 2006

This presentation delves into the testing process and its dimensions, providing an overview of concepts, techniques, and implementation strategies. It covers topics such as software characteristics, test generation, test implementation and execution, and evaluation of results.


Presentation Transcript


  1. IPA Lentedagen 2006 Testing for Dummies Judi Romijn jromijn@win.tue.nl OAS, TU/e

  2. Outline • Terminology: What is... error/bug/fault/failure/testing? • Overview of the testing process • concept map • dimensions • topics of the Lentedagen presentations

  3. What is... • error/fault/bug: something wrong in the software • failure: the manifestation of an error, observable in the software's behaviour (the behaviour deviates from the requirements) Example: • requirements: for input i, give output 2*i^3 (so 6 yields 432) • software: i=input(STDIN); i=double(i); i=power(i,3); output(STDOUT,i); • output (verbose): input: 6 doubling input.. computing power.. output: 1728 • here the error (doubling before cubing) manifests as a failure: output 1728 instead of the required 432
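The slide's error/failure example can be reproduced in a few lines. A sketch in Python (the slide's pseudocode reads from STDIN; here the input is a parameter, and the function names are illustrative):

```python
def required(i):
    # requirement: for input i, give output 2 * i^3 (so 6 yields 432)
    return 2 * i ** 3

def software(i):
    # the slide's code doubles *before* cubing, so it computes (2*i)^3
    i = 2 * i    # doubling input..
    i = i ** 3   # computing power..  <- error: should cube first, then double
    return i

# failure: for input 6 the error becomes observable in the behaviour:
# software(6) yields 1728, while the requirement demands 432
```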

  4. What is... testing: by experiment, • find errors in software (Myers, 1979) • establish quality of software (Hetzel, 1988) a successful test: • finds at least one error (test-to-fail) • passes, i.e. the software works correctly (test-to-pass)

  5. What’s been said? • Dijkstra: Testing can show the presence of bugs, but not the absence • Beizer: 1st law: (Pesticide paradox) Every method you use to prevent or find bugs leaves a residue of subtler bugs, for which other methods are needed 2nd law: Software complexity grows to the limits of our ability to manage it • Beizer: Testers are not better at test design than programmers are at code design • Humphreys: Coders introduce bugs at the rate of 4.2 defects per hour of programming. If you crack the whip and force people to move more quickly, things get even worse. • ... • Developing software & testing are truly difficult jobs! • Let’s see what goes on in the testing process

  6. Concept map of the testing process  

  7. Dimensions of software testing • What is the surrounding software development process? (V-model/agile, unit/system/user level, planning, documentation, ...) • What is tested? • Software characteristics (design/code/binary, embedded?, language, ...) • Requirements (functional/performance/reliability/..., behaviour/data oriented, precision) • Which tests? • Purpose (kind of coding errors, missing/additional requirements, development/regression) • Technique (adequacy criterion: how to generate how many tests) • Assumptions (limitations, simplifications, heuristics) • How to test? (manual/automated, platform, reproducible) • How are the results evaluated? (quality model, priorities, risks) • Who performs which task? (programmer, tester, user, third party) • Test generation, implementation, execution, evaluation

  8. Dimensions + concept map [diagram: the dimensions, numbered 1-6, placed on the concept map]

  9. 1: Test process in software development V-model: requirements ↔ acceptance test, specification ↔ system test, detailed design ↔ integration test, implementation code ↔ unit test

  10. 1: Test process in software development Agile/spiral model:

  11. 1: Test process in software development Topics in the Lentedagen presentations: • Integration of testing in entire development process with TTCN3 • standardized language • different representation formats • architecture allowing for tool plugins • Test process management for manufacturing systems (ASML) • integration approach • test strategy

  12. 2: Software • (phase) Unit vs. integrated system • (language) imperative/object-oriented/hardware design/binary/… • (interface) data-oriented/interactive/ embedded/distributed/…

  13. 2: Requirements • functional: • the behaviour of the system should be correct • requirements can be precise, but often are not • non-functional: • performance, reliability, compatibility, robustness (stress/volume/recovery), usability, ... • requirements are possibly quantifiable, and always vague

  14. 2: Requirements Topics in the Lentedagen presentations: • models: • process algebra, automaton, labelled transition system, Spec# • coverage: • semantical: • by formal argument (see test generation) • by estimating potential errors, assigning weights • syntactical • risk-based (likelihood/impact)

  15. 3: Test generation: purpose What errors to find? Related to software development phase: • unit phase: typical typos, functional mistakes • integration: interface errors • system/acceptance: errors w.r.t. requirements • unimplemented required features: 'software does not do all it should do' • implemented non-required features: 'software does things it should not do'

  16. 3: Test generation: technique Dimensions: • black box: we don't have access to the software to be tested • white box: we have access to the software to be tested • techniques: data-based, structure-based, error seeding, typical errors, efficiency, ...
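Of the techniques listed, error seeding lends itself to a small numeric sketch: plant a known number of artificial faults, then estimate the real fault count from the recapture rate. The function name and this capture-recapture style estimator are illustrative, not taken from the slides:

```python
def estimate_real_errors(seeded, seeded_found, real_found):
    """Capture-recapture estimate: assume the tests find real errors at
    the same rate as seeded ones, i.e.
        real_found / total_real ~= seeded_found / seeded
    """
    if seeded_found == 0:
        raise ValueError("no seeded faults found; cannot estimate")
    return real_found * seeded / seeded_found

# e.g. 20 faults seeded, testing finds 10 of them plus 7 real errors:
# estimated real errors = 7 * 20 / 10 = 14 (so ~7 still undetected)
```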

  17. 3: Test generation Assumptions, limitations • single/multiple fault: clustering/dependency of errors • perfect repair • heuristics: • knowledge about usual programming mistakes • history of the software • pesticide paradox • ...

  18. 3: Test generation Topics in the Lentedagen presentations: • Mostly black box, based on behavioural requirements: • process algebra, automaton, labelled transition system, Spec# • Techniques: • assume model of software is possible • scientific basis: formal relation between requirements and model of software • Data values: constraint solving • Synchronous vs. asynchronous communication • Timing/hybrid aspects • On-the-fly generation
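A minimal sketch of black-box generation from a behavioural model: represent the requirements as a labelled transition system and derive test sequences by walking it. The toy model and names are illustrative; real on-the-fly tools also execute each step against the implementation as the walk proceeds:

```python
import random

# toy labelled transition system: state -> list of (label, next_state)
COFFEE_LTS = {
    "idle": [("coin", "paid")],
    "paid": [("coffee", "idle"), ("tea", "idle")],
}

def generate_test(lts, start, length, seed=None):
    """Derive one test sequence by a seeded (hence reproducible)
    random walk over the model."""
    rng = random.Random(seed)
    state, trace = start, []
    for _ in range(length):
        transitions = lts.get(state, [])
        if not transitions:        # deadlock in the model: stop here
            break
        label, state = rng.choice(transitions)
        trace.append(label)
    return trace
```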

  19. 4: Test implementation & execution • Implementation • platform • batch? • inputs, outputs, coordination, ... • Execution • actual duration • manual/interactive or automated • in parallel on several systems • reproducible?
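The implementation and execution concerns above (batch runs, automation, reproducibility) can be made concrete in a small harness sketch; the names and structure are illustrative:

```python
import random
import time

def run_batch(tests, seed=42):
    """Run (name, callable) tests in a shuffled but *reproducible* order:
    the fixed seed means a failing batch can be re-run identically."""
    rng = random.Random(seed)
    order = list(tests)
    rng.shuffle(order)
    results = []
    for name, fn in order:
        start = time.perf_counter()
        try:
            fn()
            verdict = "pass"
        except AssertionError:
            verdict = "fail"
        results.append((name, verdict, time.perf_counter() - start))
    return results
```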

  20. 4: Test implementation & execution Topics in the Lentedagen presentations: • Intermediate language: TTCN3 • Timing coordination • From abstract tests to concrete executable tests: • Automatic refinement • Data parameter constraint solving • On-the-fly: • automated, iterative

  21. 5: Who performs which task • Software producer • programmer • testing department • Software consumer • end user • management • Third party • testers hired externally • certification organization

  22. 6: Result evaluation • Per test: • pass/fail result • diagnostical output • which requirement was (not) met • Statistical information: • coverage (program code, requirements, input domain, output domain) • progress of testing (#errors found per test-time unit: decreasing?) • Decide to: • stop (satisfied) • create/run more tests (not yet enough confidence) • adjust software and/or requirements, create/run more tests (errors to be repaired)
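One of the statistical signals above, errors found per test-time unit, is easy to sketch; the interval bucketing chosen here is for illustration only:

```python
def errors_per_interval(failure_times, interval):
    """Bucket failure observation times into equal test-time intervals;
    a decreasing count per interval is one (rough) sign that testing
    may be approaching a stopping point."""
    if not failure_times:
        return []
    buckets = [0] * (int(max(failure_times) // interval) + 1)
    for t in failure_times:
        buckets[int(t // interval)] += 1
    return buckets

# failures observed at hours 1, 2, 3, 5, 9 with 4-hour intervals
# give counts [3, 1, 1]: a decreasing trend
```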

  23. 6: Result evaluation Topics in the Lentedagen presentations: • Translate output back to abstract requirements • possibly on-the-fly • Statistical information: • cumulative times at which failures were observed • fit statistical curve • quality judgement: X % of errors found • predict how many errors left, how long to continue • assumptions: total #errors, perfect repair, single fault
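Fitting a statistical curve to cumulative failure times, as the slide describes, might look like the following sketch. The exponential reliability-growth form mu(t) = N(1 - e^(-bt)) and the crude grid-search fit are my illustrative choices; a real analysis would use maximum likelihood:

```python
import math

def mu(t, N, b):
    # expected cumulative number of failures observed by test time t
    return N * (1.0 - math.exp(-b * t))

def fit_growth_curve(times, counts):
    """Least-squares grid search for (N, b); N is the predicted total
    number of errors, so N - counts[-1] errors are predicted to remain."""
    best = None
    for N in range(max(counts), 5 * max(counts) + 1):
        for b in (i / 100.0 for i in range(1, 101)):
            err = sum((mu(t, N, b) - c) ** 2 for t, c in zip(times, counts))
            if best is None or err < best[0]:
                best = (err, N, b)
    _, N, b = best
    return N, b
```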

  24. Dimensions + concept map [diagram: the dimensions, numbered 1-6, placed on the concept map]

  25. Hope this helps... Enjoy the Lentedagen!
