Relaxing the Testcase Assumptions in GenProg

Jonathan Dorn

Presentation Transcript
GenProg

[Diagram: the GenProg loop. INPUT → MUTATE → EVALUATE FITNESS → ACCEPT (→ OUTPUT) or DISCARD (back to MUTATE)]
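The loop in the diagram can be sketched in Python. Here `mutate`, `fitness`, and the toy integer "program" are hypothetical stand-ins for illustration only; actual GenProg mutates abstract syntax trees and evolves a population of variants rather than a single candidate.

```python
import random

def genprog_loop(program, tests, mutate, fitness, max_iters=1000):
    """Sketch of the slide's loop: INPUT -> MUTATE -> EVALUATE FITNESS
    -> ACCEPT or DISCARD -> OUTPUT.  `mutate` and `fitness` are
    hypothetical callables standing in for GenProg's genetic operators."""
    best, best_fit = program, fitness(program, tests)
    for _ in range(max_iters):
        candidate = mutate(best)            # MUTATE
        fit = fitness(candidate, tests)     # EVALUATE FITNESS
        if fit > best_fit:                  # ACCEPT the improvement...
            best, best_fit = candidate, fit
        # ...otherwise DISCARD the candidate.
        if best_fit == len(tests):          # every testcase passes
            break
    return best                             # OUTPUT

# Toy "program": an integer that should end up even and greater than 5.
tests = [lambda p: p % 2 == 0, lambda p: p > 5]
fitness = lambda p, ts: sum(t(p) for t in ts)
mutate = lambda p: p + random.choice([-2, 2])
random.seed(0)
print(genprog_loop(4, tests, mutate, fitness))  # a value passing both tests
```

Fitness here is simply the number of passing testcases, which is the role testcases play in guiding the search.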


Testcases in GenProg

  • Guide the search.
  • Validate the repair.

[Diagram highlight: EVALUATE FITNESS]
Testcases in GenProg

  • Negative testcases indicate which functionality to repair.
  • Positive testcases indicate which functionality to retain.
Testcase Assumptions

  • Testcases exist.
  • They are correct.
  • They are comprehensive.
Testcase Difficulties

  • Human-written testcases are expensive.
    • Development and maintenance costs.
    • Legacy code or remote deployments.
  • "Complete coverage" is approached but rarely achieved.
Agenda

  • Automatic Testcase Generation
  • Evaluation
  • Preliminary Results
Automatic Testcases

  • "Competent Programmer" assumption
    • Correct behavior is already encoded in the program.
    • Extract it and re-encode it as testcases.
The Oracle Comparator Model

[Diagram: an Input is fed to the program, producing an Output; an Oracle supplies the expected result; a Comparator checks the two and reports Pass / Fail]
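A minimal sketch of the model, with hypothetical `program`, `oracle`, and `comparator` callables standing in for the boxes in the diagram:

```python
def run_test(program, oracle, comparator, test_input):
    """Oracle Comparator Model sketch: the program and the oracle each map
    the input to an output; the comparator decides PASS or FAIL."""
    actual = program(test_input)
    expected = oracle(test_input)
    return "PASS" if comparator(actual, expected) else "FAIL"

# Toy usage: a buggy absolute value checked against a trusted oracle.
buggy_abs = lambda x: x   # forgets to negate negative inputs
print(run_test(buggy_abs, abs, lambda a, e: a == e, 5))    # prints PASS
print(run_test(buggy_abs, abs, lambda a, e: a == e, -3))   # prints FAIL
```

The hard part in practice is obtaining the oracle: for the slide's buggy program there is no trusted reference implementation to call.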
Structure of a Testcase

  • Test Setup
    • Network connections, opened files, etc.
    • Argument values
  • Run Function Under Test
  • Check Results
    • Return value
    • Side-effects
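The three parts map directly onto a unit-test skeleton. The function under test (`count_lines`) and the temporary file are hypothetical examples chosen to exercise setup, execution, and result checking:

```python
import tempfile
import unittest

def count_lines(path):
    """Hypothetical function under test: counts the lines in a text file."""
    with open(path) as f:
        return sum(1 for _ in f)

class TestCountLines(unittest.TestCase):
    def setUp(self):
        # Test Setup: argument values and an opened (temporary) file.
        tmp = tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False)
        tmp.write("a\nb\nc\n")
        tmp.close()
        self.path = tmp.name

    def test_counts_three_lines(self):
        result = count_lines(self.path)   # Run Function Under Test
        self.assertEqual(result, 3)       # Check Results: return value
```

Automating a testcase means automating each of these parts: the setup and arguments (test input generation) and the result check (an oracle).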
Test Input Generation

Long-established research area:

  • DART
  • CUTE
  • CREST
  • Symstra
  • Austin
  • Pex
Concolic Execution

  • Generate an initial input.
  • Run the test; record constraints at branches.
  • Negate one constraint.
  • Solve for a new input.
  • Repeat.
Concolic Execution

    int f(int x) {
      if (x < 10)
        return 1;
      else
        return 2;
    }

Input: x = 456123
pred = {}
Concolic Execution

    int f(int x) {
      if (x < 10)
        return 1;
      else
        return 2;
    }

Input: x = 456123
pred = {x >= 10}
Concolic Execution

    int f(int x) {
      if (x < 10)
        return 1;
      else
        return 2;
    }

Input: x = 9    (a solution of the negated constraint, x < 10)
pred = {}
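The walk-through can be condensed into a toy loop specialized to `f`'s single branch. A real concolic engine (DART, CUTE, CREST, ...) records the path constraints symbolically and asks a constraint solver for the new input; the hand-coded "solution" below is an illustrative shortcut:

```python
def f(x):
    # The function from the slides.
    if x < 10:
        return 1
    return 2

def concolic_inputs(initial_x):
    """Toy concolic loop for f: run once, record the branch predicate,
    negate it, and 'solve' by hand for an input on the other path."""
    inputs = [initial_x]
    took_then = initial_x < 10          # pred = {x < 10} or {x >= 10}
    # Negate the recorded constraint and pick any satisfying input.
    inputs.append(9 if not took_then else 10)
    return inputs

print(concolic_inputs(456123))   # [456123, 9] -- together they cover both paths
```

Starting from the arbitrary input 456123, one negate-and-solve step yields x = 9, and the two inputs exercise both return statements.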
Automatic Testcases

  • Test Setup
    • Network connections, opened files, etc.
    • Argument values (addressed by test input generation)
  • Run Function Under Test
  • Check Results
    • Return value
    • Side-effects
μTEST

Oracle: check for invariants.

  • What are the interesting invariants?
    • Checking that 1 = 1 is not useful.
    • Invariants are true for all runs of the program but violated for some runs of a not-quite-the-same program.
μTEST

f = min(a, b)

  • Identify predicates that are true for all inputs.
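A sketch of that first step for `f = min(a, b)`. The candidate predicates and sample inputs are illustrative; invariant-inference tools (e.g. Daikon) instantiate predicate templates and check them over observed runs automatically:

```python
# Candidate predicates over f = min(a, b); the list itself is illustrative.
candidates = {
    "f <= a": lambda f, a, b: f <= a,
    "f <= b": lambda f, a, b: f <= b,
    "f == a or f == b": lambda f, a, b: f == a or f == b,
    "f == a": lambda f, a, b: f == a,   # true only when a happens to be smaller
}

inputs = [(1, 2), (2, 1), (7, 7), (-3, 5)]

# Keep the predicates that hold on every observed run of min.
invariants = {name for name, pred in candidates.items()
              if all(pred(min(a, b), a, b) for a, b in inputs)}
print(sorted(invariants))   # ['f <= a', 'f <= b', 'f == a or f == b']
```

The input (2, 1) falsifies "f == a", so it is filtered out; the other three predicates survive as invariant candidates.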
μTEST

  • Mutate the function.
  • Identify predicates that fail in the mutants.
μTEST

  • Take the intersection as the oracle invariants.
  • May miss invariants if mutants do not fail.
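A sketch of the mutation-filtering step, continuing the `min(a, b)` example. The mutants and predicates below are hand-written illustrations; a mutation tool would generate the mutants automatically:

```python
def min_impl(a, b):
    """The original function under test."""
    return a if a <= b else b

# Hand-written mutants standing in for automatically generated ones.
mutants = [
    lambda a, b: a if a >= b else b,   # flipped comparison (computes max)
    lambda a, b: a,                    # ignores the second argument
]

# Predicates that held on all observed runs of the original function,
# including a useless tautology that mutation should filter out.
predicates = {
    "f <= a": lambda f, a, b: f <= a,
    "f <= b": lambda f, a, b: f <= b,
    "1 == 1": lambda f, a, b: True,    # true everywhere, so it never fails
}

inputs = [(1, 2), (2, 1), (-3, 5)]

# Sanity check: every candidate already holds on the original function.
assert all(pred(min_impl(a, b), a, b)
           for pred in predicates.values() for a, b in inputs)

def fails_on(pred, mutant):
    # A predicate distinguishes a mutant if some run of the mutant violates it.
    return any(not pred(mutant(a, b), a, b) for a, b in inputs)

# Intersection step: keep predicates violated by at least one mutant.
oracle = {name for name, pred in predicates.items()
          if any(fails_on(pred, m) for m in mutants)}
print(sorted(oracle))   # ['f <= a', 'f <= b'] -- the tautology is dropped
```

If no mutant violates a genuine invariant (the mutants happen not to fail), that invariant is lost from the oracle, which is the weakness noted on the slide.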
Research Questions

  • Does augmenting the human-generated test suite enable more fixes?
  • Do automatic testcases miss desired functionality?
Evaluation

  • Only generated testcases.
    • Pretend human testcases do not exist.
  • Generated testcases + X% of the original suite.
    • How much is the human burden reduced?
Testcase Assumptions

  • Testcases exist. (concolic execution + μTEST)
  • They are correct. (competent programmer assumption)
  • They are comprehensive. (may need a small number of human tests)
Conclusion

  • We can create testcases automatically to augment human-created tests.
  • Initial results suggest these tests permit repairs without comprehensive test suites.