Black-Box Testing - PowerPoint PPT Presentation

Presentation Transcript

  1. Black-Box Testing ( “FÅP”: First-year Project Course, ITU, Denmark) Claus Brabrand [ brabrand@itu.dk ]

  2. Reminder: Learning & Exam Goals • ”Product”: • ”Oral Exam”:

  3. Outline • Introduction: • ”Bugs” and ”Black-box testing” • Testing the Specification: • [”static black-box testing”] • Testing the Code: • [”dynamic black-box testing”]: ”Equivalence partitioning” • Other techniques: • Model-checking & Static-analysis • Exercises: • Work on your group exercises

  4. Definition: ”bug” Main entry: 2bug Pronunciation: /ˈbəg/ Function: noun Etymology: origin unknown Date: 1622 1 a: an insect or other creeping or crawling invertebrate (as a spider or centipede) b: any of several insects (as the bedbug or cockroach) commonly considered obnoxious c: any of an order (Hemiptera and especially its suborder Heteroptera) of insects that have sucking mouthparts, forewings thickened at the base, and incomplete metamorphosis and are often economic pests (called also true bug) 2: an unexpected defect, fault, flaw, or imperfection <the software was full of bugs> 3 a: a germ or microorganism especially when causing disease b: an unspecified or nonspecific sickness usually presumed due to a bug 4: a sudden enthusiasm 5: enthusiast <a camera bug> 6: a prominent person 7: a crazy person 8: a concealed listening device 9: a weight allowance given apprentice jockeys

  5. Example of a Bug :-) Hi Claus, Apropos testing and date calculations: Friday morning my PDA calendar (Microsoft Windows CE) was convinced it was Saturday, March 1st. Naturally, I patiently set the clock back to Friday, February 29, 2008 (this took place at a meeting at Microsoft in Vedbæk ;-) Saturday morning it then started showing my (rather empty) calendar for August 2049. The built-in clock was set to January 1, 1601. Ahem. Peter

  6. Terminology (for ’Bugs’) • ”Generic terms”: Problem, Error, Bug (we’ll use this term) • ”Severe conditions”: Fault, Failure, Defect (typically imply blame; as in ”it was his fault”) • ”Unintended operation”: Anomaly, Incident, Variance • Feature (intended; famous quote: ”It’s not a bug, it’s a feature”)

  7. ”The Harvard Mark II Bug” “The first documented computer bug was a moth found trapped between points at Relay # 70, Panel F, of the Mark II Aiken Relay Calculator while it was being tested” Harvard University, Sep. 9, 1947 Photo of first actual ”bug”:

  8. Definition: ”Bug” • A ”bug” is a relation between: • Specification (aka. ’spec’) & • Implementation (aka. ’program’) • Whether or not the specification is: • Explicit or Implicit • Written or Oral • Formal or Informal (e.g., ”product spec” vs. ”back-of-envelope”)

  9. Specification • A spec states things an impl…: • Should do! • Shouldn’t do! • Unspecified? (’unclear’, ’unmentioned’, or ’left open’) • Specs are often intentionally under-specified. It’s often better not to ”prematurely commit” to a particular solution (by specifying exactly how a task should be done) and instead just state which overall tasks the system should do.

  10. Formal Definition: ”Bug” [cf. ”Software Testing”, R.Patton, p.15] • A (software) bug occurs when: • 1. Impl doesn’t do sth the spec says it should (or does it incorrectly) • 2. Impl does sth the spec says it shouldn’t • 3. Impl does sth the spec doesn’t mention • 4. Spec doesn’t mention sth, but should • 5. Impl is hard to understand/use by user(s)

  11. Consequences… • Consequently, …: • ”Additional functionality”? • e.g., a calculator which also does square root (but wasn’t supposed to) …is a bug! • ”Easter egg”? • e.g., hit [Alt+Shift+2] in Solitaire to win game …is a bug!

  12. Question • ”Because the software is already complete, you have the perfect specification – the product itself” [R.Patton, ”Software Testing”, p.31] • Question: what about ”an impl w/o a spec”? • A) R.Patton interprets this as ”spec := impl”: • ⇒ trivially no bugs in the impl! • (according to [his own] definitions #1, #2, #3) • B) More sensible to interpret it as ”no spec”: • ⇒ the entire impl is essentially one big bug! • (…according to definitions #3+#4 [p.15])

  13. Number One Cause of Bugs… [cf. ”Software Testing”, R.Patton, p.17] • …is the specification (chart: spec is the largest source of bugs, ahead of design, impl, and other; NB: no percentages given) • Often due to: • complexity • tight schedule • under-specification • under-documentation

  14. Cost of (Fixing) Bugs [cf. ”Software Testing”, R.Patton, p.18] • The cost of bugs increases exponentially over time (log scale): spec $1, design $100, code $10,000, test $1,000,000, release $10,000,000

  15. Tester’s Goal [cf. ”Software Testing”, R.Patton, p.19] • The goal of a software tester is: • 1) “to find bugs”; • 2) “find them as early as possible”; and • 3) “make sure they get fixed” [R.Patton, “Software Testing”, p. 19] • Testers (therefore) often aren’t the most popular members of a project team

  16. ”The Pesticide Paradox” • ”The pesticide paradox”: ”The more you test software, the more immune it becomes to your tests” B.Beizer, “Software Testing Techniques”, 1990

  17. Bugs • Bugs often occur in groups • If you find one, chances are there are more nearby • Not all bugs will be fixed • Too costly? • Too risky? • Not ”cost-effective”? • Bug unknown :-)

  18. Programming vs. Testing • Constructive thinking: ”Test-to-pass” • Destructive thinking: ”Test-to-fail” (aka. ”error-forcing”) • Often not a good idea to ”test” your own code :-( • Often better to test/break someone else’s code :-) • Advice: ”test-to-pass” before ”test-to-fail” (i.e., test ”normal usage” first)
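As a sketch of the two mindsets, consider a small division utility: the first assertion exercises normal usage (test-to-pass), the rest deliberately force the error case (test-to-fail). The `divide` method and its exception are illustrative assumptions, not from the slides.

```java
// Hypothetical unit under test: integer division that rejects zero divisors.
public class DivideTest {
    static int divide(int a, int b) {
        if (b == 0) throw new IllegalArgumentException("division by zero");
        return a / b;
    }
    public static void main(String[] args) {
        // Test-to-pass: normal usage first
        assert divide(10, 2) == 5;
        // Test-to-fail: deliberately provoke the error case
        boolean threw = false;
        try { divide(1, 0); } catch (IllegalArgumentException e) { threw = true; }
        assert threw : "expected an exception for division by zero";
        System.out.println("all tests passed");
    }
}
```

Run with `java -ea DivideTest` so the assertions are enabled.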

  19. Black-Box Testing • The goal of black-box testing is: • Make sure the impl solves the problem it is supposed to; i.e., impl ~ spec • Point of departure: the spec, not the impl • not a particular program which ”claims” to solve the problem • testing w/o insights into the code • Static (test the spec) vs. Dynamic (test the impl)

  20. Outline • Introduction: • ”Bugs” and ”Black-box testing” • Testing the Specification: • [”static black-box testing”] • Testing the Code: • [”dynamic black-box testing”]: ”Equivalence partitioning” • Other techniques: • Model-checking & Static-analysis • Exercises: • Work with your group exercises

  21. Spec ”Warning Words” (I/III) [cf. ”Software Testing”, R.Patton, p.61] • Unconditionals (universally): • ’Always’, ’for every’, ’for each’, ’for all’, … • Try to violate (i.e., find exceptions to the rule)! • Unconditionals (never): • ’None’, ’never’, ’not under any circumstances’, … • Try to violate (i.e., find exceptions to the rule)! • Tautologies: • ’Trivially’, ’certainly’, ’clearly’, ’obviously’, ’evidently’, … • Check assumptions (that nothing’s swept under the rug)!

  22. Spec ”Warning Words” (II/III) [cf. ”Software Testing”, R.Patton, p.61] • Unspecified conditionals: • ’Some(-times)’, ’often’, ’usually’, ’ordinarily’, ’mostly’, … • Unclear spec (under which circumstances)? • Continuations: • ’Etcetera’, ’and so forth’, ’and so on’, … • Check that the spec is comprehensive and unambiguous! • Examples: • ’E.g.’, ’for example’, ’such as’, … • Is the example representative (what about other examples)?

  23. Spec ”Warning Words” (III/III) [cf. ”Software Testing”, R.Patton, p.61] • Positive adjectives: • ’Good’, ’fast’, ’efficient’, ’small’, ’reliable’, ’stable’, … • Subjective (needs objectification if to be used for testing)! • Allegedly completed: • ’Handled’, ’processed’, ’taken care of’, ’eliminated’, … • Is something hidden? • Incomplete: • ’Skipped’, ’unnecessary’, ’superfluous’, ’rejected’, … • Is something forgotten?

  24. [cf. ”Software Testing”, R.Patton, p.61] Spec ”Warning Words” (if-then) • Finally, watch out for: • ”If … Then” (with missing ”Else”): • Check what happens in the ”Else-case” • IF … • THEN … • ELSE … ?!

  25. Test Sequencing • Question: which is ”best” (…and why)? • A) white-box testing; black-box testing (i.e., white-box testing first) …or… • B) black-box testing; white-box testing (i.e., black-box testing first) • Answer: ’B’ • Settle overall impl ~ spec problems first • Before zooming in on the impl [as in white-box testing]

  26. Outline • Introduction: • ”Bugs” and ”Black-box testing” • Testing the Specification: • [”static black-box testing”] • Testing the Code: • [”dynamic black-box testing”]: ”Equivalence partitioning” • Other techniques: • Model-checking & Static-analysis • Exercises: • Work on your group exercises

  27. Test ”Boundaries” • Programs are vulnerable ”around the edges”: • e.g. testing legal inputs (time, in hours):

      Property:        Minimum-1  Minimum  Middle repr.  Maximum  Maximum+1
      Input:           -1         0        11 (e.g.)     23       24
      Expected output: invalid    valid    valid         valid    invalid
      Actual output:

  • e.g. testing legal inputs (dates, in April):

      Property:        Minimum-1  Minimum  Middle repr.  Maximum  Maximum+1
      Input:           00/4       01/4     12/4 (e.g.)   30/4     31/4
      Expected output: invalid    valid    valid         valid    invalid
      Actual output:
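The hours table translates directly into code. A minimal sketch, assuming a hypothetical `isValidHour` validator; the five probe values come straight from the table:

```java
// Boundary tests for a (hypothetical) hour validator: probe Min-1, Min,
// a middle representative, Max, and Max+1.
public class HourBoundaryTest {
    static boolean isValidHour(int h) { return 0 <= h && h <= 23; }
    public static void main(String[] args) {
        assert !isValidHour(-1); // Minimum-1: invalid
        assert isValidHour(0);   // Minimum:   valid
        assert isValidHour(11);  // Middle:    valid
        assert isValidHour(23);  // Maximum:   valid
        assert !isValidHour(24); // Maximum+1: invalid
        System.out.println("all boundary tests pass");
    }
}
```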

  28. Test ”Powers-of-Two” • Programs are vulnerable ”around powers-of-two”: • e.g. years of age (assume held in a byte):

      Property:        Minimum-1  Minimum  Middle repr.  Maximum  Maximum+1
      Input:           -1         0        27 (e.g.)     255      256
      Expected output: invalid    valid    valid         valid    invalid
      Actual output:

  • e.g. #game-spectators (assume held in a 16-bit word):

      Property:        Minimum-1  Minimum  Middle repr.  Maximum  Maximum+1
      Input:           -1         0        12345 (e.g.)  65535    65536
      Expected output: invalid    valid    valid         valid    invalid
      Actual output:
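To see why these boundaries matter, here is a small sketch of what happens in Java when a value just past the 8-bit or 16-bit limit is stored in a narrow type (the byte and 16-bit choices mirror the slide's two examples):

```java
// Demonstrates wrap-around at powers-of-two: 256 does not fit in 8 bits
// and 65536 does not fit in 16 bits, so both silently wrap to 0.
public class PowerOfTwoBoundary {
    public static void main(String[] args) {
        byte age = (byte) 256;            // one past the 8-bit maximum
        System.out.println(age);          // prints 0 (wrapped around)
        char spectators = (char) 65536;   // one past the 16-bit maximum
        System.out.println((int) spectators); // prints 0 (wrapped around)
    }
}
```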

  29. Test ”Empty Input” • Default / empty / blank / null / zero / none: • e.g., ’any program’:

      Property:        No input
      Input:           (none)
      Expected output: Error message
      Actual output:

  30. Test ”Invalid Input” • Invalid / illegal / wrong / garbage / bogus data: • e.g., calculator:

      Property:        Invalid input  Bogus data!!!
      Input:           +*31           #$+~´?=!
      Expected output: Error message  Error message
      Actual output:
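The empty- and invalid-input rows can be scripted against a real API: `Integer.parseInt` signals bad input with a `NumberFormatException`. The `rejects` helper below is a sketch for illustration, not from the slides:

```java
// Checks that empty, invalid, and bogus inputs are rejected with an
// error (here: NumberFormatException), and that valid input still works.
public class InvalidInputTest {
    static boolean rejects(String input) {
        try { Integer.parseInt(input); return false; }
        catch (NumberFormatException e) { return true; }
    }
    public static void main(String[] args) {
        assert rejects("");        // empty input
        assert rejects("+*31");    // invalid input
        assert rejects("#$+~?=!"); // bogus data
        assert !rejects("42");     // sanity check: valid input accepted
        System.out.println("all input tests passed");
    }
}
```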

  31. Equivalence Partitioning Partitions and Equivalence Relations

  32. Testing: Infinite Process • Recall: ”testing is an incomplete process” • (i.e., ”testing can’t prove absence of bugs”) • There are infinitely many possible inputs, e.g. for sum(x,y): sum(-1,-1) = -2, sum(0,-1) = -1, sum(-1,0) = -1, sum(0,0) = 0, sum(1,0) = 1, sum(0,1) = 1, sum(1,1) = 2, … • (hence, exhaustive testing would take an infinite amount of time)

  33. Crash course on Relations Relations Equivalence Relations

  34. Relations • Example 1: the “even” relation: even ⊆ Z • Written as: even(4) as a short-hand for: 4 ∈ even • … and as: ¬even(5) as a short-hand for: 5 ∉ even • Example 2: the “equals” relation: ’=’ ⊆ Z × Z • Written as: 2 = 2 as a short-hand for: (2,2) ∈ ’=’ • … and as: 2 ≠ 3 as a short-hand for: (2,3) ∉ ’=’ • Example 3: the “dist-btwn” relation: ’↦’ ⊆ CITY × Z × CITY • Written as: Aarhus ↦310 Copenhagen as a short-hand for: (Aarhus, 310, Copenhagen) ∈ ’↦’

  35. Equivalence Relation • Let ‘~’ be a binary relation over a set A: ‘~’ ⊆ A × A • ~ is an equivalence relation iff it is: • Reflexive: ∀x ∈ A: x ~ x • Symmetric: ∀x,y ∈ A: x ~ y ⇒ y ~ x • Transitive: ∀x,y,z ∈ A: x ~ y ∧ y ~ z ⇒ x ~ z

  36. Exercise: Eq. Rel. • Which relations are equivalence relations?: • …and which are not (and why not)?: • a) The "less-than-or-equal-to" relation '≤': • { (n,m) | n,m ∈ Z, n ≤ m } • b) The ”almost-total-relation-on-integers” (relating all numbers except 42, but relating 42 with 42): • { (n,m) | n,m ∈ ( Z \ {42} ) } ∪ { (42,42) } • c) The "is-congruent-modulo-three" relation: • { (n,m) | n,m ∈ Z, (n % 3) = (m % 3) } • d) The "have-talked-together" relation: • { (p,q) | p,q ∈ PEOPLE, p and q have talked together } • e) The "is-in-the-same-group-as" relation: • { (p,q) | p,q ∈ PEOPLE, p and q are in the same FÅP group }
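For a finite slice of the domain, the three equivalence-relation laws (reflexive, symmetric, transitive) can be checked mechanically. A sketch for relation (c), is-congruent-modulo-three, using `Math.floorMod` so negative numbers are handled correctly:

```java
// Brute-force check of the three equivalence-relation laws for
// mod-3 congruence over a small finite domain.
public class EquivRelCheck {
    static boolean related(int n, int m) {
        return Math.floorMod(n, 3) == Math.floorMod(m, 3);
    }
    public static void main(String[] args) {
        int[] domain = {-4, -3, -2, -1, 0, 1, 2, 3, 4};
        for (int x : domain) assert related(x, x);              // reflexive
        for (int x : domain) for (int y : domain)
            if (related(x, y)) assert related(y, x);            // symmetric
        for (int x : domain) for (int y : domain) for (int z : domain)
            if (related(x, y) && related(y, z))
                assert related(x, z);                           // transitive
        System.out.println("mod-3 congruence passes all three laws");
    }
}
```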

  37. Partition ~ Eq. Rel. • Equivalence relation: ’~’ = { (A,A), (B,B), (A,B), (B,A), (P,P), (X,X), (Y,Y), (Z,Z), (X,Y), (Y,X), (X,Z), (Z,X), (Y,Z), (Z,Y) } • E.g.: A ~ B, P ~ P, X ~ X, X ~ Z; but not A ~ P, B ~ Y, P ~ Z • Corresponding partition: { {A, B}, {P}, {X, Y, Z} } • ”Canonical representatives”: [A] = [B] = { A, B }; [P] = { P }; [X] = [Y] = [Z] = { X, Y, Z } • A partition and an equivalence relation capture the same information; i.e., a notion of ”equivalence”

  38. Testing: Infinite Process • Recall: ”testing is an incomplete process” • (i.e., ”testing can’t prove absence of bugs”) • There are infinitely many possible inputs, e.g. for sum(x,y): sum(-1,-1) = -2, sum(0,-1) = -1, sum(-1,0) = -1, sum(0,0) = 0, sum(1,0) = 1, sum(0,1) = 1, sum(1,1) = 2, … • (hence, exhaustive testing would take an infinite amount of time)

  39. Equivalence Partitioning • Partition input: • Finitary partition: • If finite # categories (aka. ”equivalence classes”) • Here 3: { ”zero”, ”pos”, ”neg” } • We can now test all equivalence classes • Using representative elements from each category zero … -3 -2 -1 1 2 3 … 0 neg pos

  40. Test Sum (cont’d) • We can now test all equivalence classes • Using representative input from each category • Sum (testing all equivalence classes):

      Property:        Pos,Pos  Neg,Pos  Zero,Pos  Pos,Neg  Neg,Neg  Zero,Neg  Pos,Zero  Neg,Zero  Zero,Zero
      Input:           (1,2)    (-3,4)   (0,5)     (6,-7)   (-8,-9)  (0,-10)   (11,0)    (-12,0)   (0,0)
      Expected output: 3        1        5         -1       -17      -10       11        -12       0
      Actual output:
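The table translates into one assertion per equivalence class. A minimal sketch, with the trivial `sum` as the implementation under test:

```java
// One representative test per equivalence class of sum(x,y):
// all nine sign combinations of {pos, neg, zero} x {pos, neg, zero}.
public class SumPartitionTest {
    static int sum(int x, int y) { return x + y; }
    public static void main(String[] args) {
        int[][] inputs   = {{1,2},{-3,4},{0,5},{6,-7},{-8,-9},{0,-10},{11,0},{-12,0},{0,0}};
        int[]   expected = {3, 1, 5, -1, -17, -10, 11, -12, 0};
        for (int i = 0; i < inputs.length; i++)
            assert sum(inputs[i][0], inputs[i][1]) == expected[i]
                : "class " + i + " failed";
        System.out.println("all 9 equivalence classes pass");
    }
}
```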

  41. Frequent Partitions for Testing • Numbers: • Positive, Negative, Zero • Zero, One, Two, Many (aka. ”Greenlandic Numbers”) • Lists: • Length-0, Length-1, Length-2, Length-3+ • Ascending-elements, Descending-elements, AscDesc-elements • … • Advice: • consider how the problem might be solved • partition into qualitatively different categories such that (likely): ”same category ⇒ same error”
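For the list partitions, here is a sketch using Java's built-in sort as a stand-in implementation under test, with one representative list per category; the specific lists are illustrative choices, not from the slides:

```java
import java.util.*;

// One representative list per partition: length 0/1/2/3+, and
// ascending / descending / mixed element order.
public class ListPartitionTest {
    static List<Integer> sorted(List<Integer> xs) {
        List<Integer> copy = new ArrayList<>(xs);
        Collections.sort(copy);
        return copy;
    }
    public static void main(String[] args) {
        assert sorted(List.of()).equals(List.of());               // length 0
        assert sorted(List.of(7)).equals(List.of(7));             // length 1
        assert sorted(List.of(2, 1)).equals(List.of(1, 2));       // length 2
        assert sorted(List.of(1, 2, 3)).equals(List.of(1, 2, 3)); // ascending
        assert sorted(List.of(3, 2, 1)).equals(List.of(1, 2, 3)); // descending
        assert sorted(List.of(1, 3, 2)).equals(List.of(1, 2, 3)); // asc-desc mix
        System.out.println("all list partitions pass");
    }
}
```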

  42. Outline • Introduction: • ”Bugs” and ”Black-box testing” • Testing the Specification: • [”static black-box testing”] • Testing the Code: • [”dynamic black-box testing”]: ”Equivalence partitioning” • Other techniques: • Model-checking & Static-analysis • Exercises: • Work on your group exercises

  43. Model-checking Modelling Verification Model-checking

  44. Program vs. Model World • Program world: concrete (Java) • Model world: abstract (modeling lang) • abstraction: P → M • concretization: M → P

  45. Methodology: Model-based Design • Design an abstract model • Decompose the model • Reason/Test/Verify the model • individual parts and whole • Recompose insights • make the model safe • Implement the concrete program • (diagram: REAL PROBLEM, abstract ⇒ MODEL, reason/test/verify ⇒ SAFE MODEL, concretize ⇒ SAFE PROGRAM)

  46. Model-checking • Program world (Java, concrete) vs. Model world (modeling lang, abstract): • 1. Does P satisfy the property? • 2. abstract: P → M • 3. Does M satisfy the property? • 4. verify!! • 5. M satisfies it! • 6. concretize • 7. P satisfies it! • If models are “finite”, we can have a computer test all possibilities. • “Never send a human to do a machine’s job” -- A.Smith (’99)

  47. Static Analysis Static Analysis Undecidability

  48. Undecidability • Most interesting properties are undecidable: • e.g., can program P have a type error (when run)? • no program can ”decide” this automatically (in all cases) • Compilers use safe (over-)approximations (computed via ”static analyses”): every program reported ”correct” really is correct, but some correct programs are conservatively reported ”incorrect”

  49. Undecidability (self-referentiality) • Consider "The Book-of-all-Books": • This book contains the titles of all books that do not have a self-reference (i.e. don't contain their own title inside) • e.g.: "The Bible", "War and Peace", "Programming Languages, An Interp.-Based Approach", … • Finitely many books; i.e.: • We can sit down & figure out, for each book, whether to include it or not... • Q: What about "The Book-of-all-Books" itself; should it be included or not? • "Self-referential paradox" (many guises): • e.g. "This sentence is false"

  50. Termination Undecidable! • Assume termination is decidable (in Java); • i.e. ∃ some program halts: PROG → BOOL:

      // P0.java
      bool halts(Program p) { ... }
      Program p0 = read_program("P0.java");
      if (halts(p0)) loop(); else halt();

  • Q: Does P0 loop or terminate...? :) • If halts(p0) is true, P0 loops; if false, P0 halts: a contradiction either way. • Hence: "Termination is undecidable" • ...for Java, C, C++, Pascal, λ-Calculus, Haskell, ...