
Specification-Based Error Localization

Presentation Transcript


  1. Specification-Based Error Localization Brian Demsky Martin Rinard Laboratory for Computer Science Massachusetts Institute of Technology

  2. Problem
  • Have to trace the symptom back to its cause
  • Corruption may not cause a visible error in the test suite
  [Timeline: Error Introduced → Execution with Broken Data Structure → Crash or Unexpected Result]

  3. Solution
  • Discover bugs when they corrupt data, not when the effect becomes visible
  • Perform frequent consistency checks
  • Bug is localized between the last successful check and the first unsuccessful check
  [Timeline: Error Introduced → Execution with Broken Data Structure → Crash or Unexpected Result]
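
A minimal C sketch of what an instrumented check site might look like under this scheme; the names (check_consistency, consistency_check_site) and the failure handling are illustrative assumptions, not the tool's actual API:

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical checker entry point: returns nonzero when every
     * consistency constraint holds on the current data structures. */
    extern int check_consistency(void);

    static const char *last_ok_site = "program start";

    /* Called at each instrumented program point.  A failing check
     * brackets the bug between the last passing site and this one. */
    void consistency_check_site(const char *site)
    {
        if (check_consistency()) {
            last_ok_site = site;   /* remember the last passing check */
        } else {
            fprintf(stderr, "consistency violated between '%s' and '%s'\n",
                    last_ok_site, site);
            abort();               /* stop before the corruption spreads */
        }
    }

The check itself evaluates the data structures against the specification; the point here is only that a failing site and the preceding passing site bracket the error.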

  4. Architecture
  [Diagram: model definition rules translate the concrete data structure (objects and their field values) into an abstract model of sets and relations, against which the consistency constraints are evaluated]

  5. Architecture Rationale
  Why use the abstract model?
  • Model construction separates objects into sets, based on:
    • Reachability properties
    • Field values
  • Different constraints for objects in different sets
  • Appropriate division of complexity
    • Data structure representation complexity encapsulated in model construction rules
    • Consistency property complexity encapsulated in (clean, uniform) model constraint language

  6. Simplified Freeciv Example
  tile grid[EDGE][EDGE];
  structure tile {
    int terrain;
    city *city;
  }
  structure city {
    int population;
  }
  Terrain Grid (O = Ocean, P = Plain, M = Mountain):
    O P M M
    O O P M
    O P M M
    P P P M
  [Figure also shows city structures attached to some tiles]
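
The structure declarations above are written in the tool's structure definition language rather than C. For the sketches that follow, here is one possible plain-C rendering of the same example data; the enum values and EDGE = 4 are assumptions read off the 4x4 grid:

    #define EDGE 4                   /* grid dimension, from the 4x4 example */

    enum terrain { OCEAN, PLAIN, MOUNTAIN };

    struct city {
        int population;
    };

    struct tile {
        int terrain;                 /* one of the enum terrain values */
        struct city *city;           /* NULL when no city occupies the tile */
    };

    struct tile grid[EDGE][EDGE];    /* the terrain grid being checked */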

  7. Sets and Relations in Model
  • Sets of objects
    set TILE of tile;
    set CITY of city;
  • Relations between objects – values of object fields, referencing relationships between objects
    relation CITYMAP : TILE -> CITY;
    relation TERRAIN : TILE -> integer;

  8. Model Translation
  Bits are translated to sets and relations in the abstract model using statements of the form:
    Quantifiers, Condition ⇒ Inclusion Constraint
  • for x in 0..EDGE*EDGE, true ⇒ grid[x] in TILE
  • for t in TILE, true ⇒ <t, t.terrain> in TERRAIN
  • for t in TILE, !t.city=NULL ⇒ <t, t.city> in CITYMAP
  • for t in TILE, !t.city=NULL ⇒ t.city in CITY
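
Building on the C rendering above, here is one way these four rules could be evaluated with a single traversal; the set_add and rel_add helpers are hypothetical stand-ins for the checker's model representation:

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical helpers that record set members and relation tuples
     * in the abstract model. */
    void set_add(const char *set, void *obj);
    void rel_add(const char *rel, void *from, void *to);

    void build_model(void)
    {
        struct tile *tiles = &grid[0][0];

        for (int x = 0; x < EDGE * EDGE; x++) {
            struct tile *t = &tiles[x];

            /* for x in 0..EDGE*EDGE, true => grid[x] in TILE */
            set_add("TILE", t);

            /* for t in TILE, true => <t, t.terrain> in TERRAIN */
            rel_add("TERRAIN", t, (void *)(intptr_t)t->terrain);

            /* for t in TILE, !t.city=NULL => <t, t.city> in CITYMAP
             * and t.city in CITY */
            if (t->city != NULL) {
                rel_add("CITYMAP", t, t->city);
                set_add("CITY", t->city);
            }
        }
    }

One pass suffices here because the rule dependencies are acyclic: each tile enters TILE before any rule quantified over TILE consults it. This is the "fixed point elimination" idea that slide 13 quantifies.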

  9. Model in Example
  [Figure: the tiles grid[0]..grid[3] in set TILE, the city object in set CITY, and CITYMAP edges from tiles to the city they reference]

  10. Consistency Properties
  Properties have the form: Quantifiers, Body
  • Body is a first-order property of basic propositions:
    • Inequality constraints on numeric fields
    • Cardinality constraints on sizes of sets
    • Referencing relationships for each object
    • Set and relation inclusion constraints
  • Example:
    for t in TILE, MIN <= t.TERRAIN and t.TERRAIN <= MAX
    for c in CITY, size(CITYMAP.c) = 1
    for c in CITY, !(CITYMAP.c).TERRAIN = OCEAN
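
As an illustration only, the three example properties can also be evaluated directly against the C structures sketched earlier, without materializing the model; TERRAIN_MIN and TERRAIN_MAX are assumed bounds:

    #include <stdbool.h>
    #include <stddef.h>

    #define TERRAIN_MIN OCEAN      /* assumed lower bound for the sketch */
    #define TERRAIN_MAX MOUNTAIN   /* assumed upper bound for the sketch */

    bool check_properties(void)
    {
        struct tile *tiles = &grid[0][0];

        for (int i = 0; i < EDGE * EDGE; i++) {
            struct tile *t = &tiles[i];

            /* for t in TILE, MIN <= t.TERRAIN and t.TERRAIN <= MAX */
            if (t->terrain < TERRAIN_MIN || t->terrain > TERRAIN_MAX)
                return false;

            /* for c in CITY, !(CITYMAP.c).TERRAIN = OCEAN */
            if (t->city != NULL && t->terrain == OCEAN)
                return false;

            /* for c in CITY, size(CITYMAP.c) = 1: reject any later tile
             * that references the same city object */
            if (t->city != NULL)
                for (int j = i + 1; j < EDGE * EDGE; j++)
                    if (tiles[j].city == t->city)
                        return false;
        }
        return true;
    }

The duplicate-reference scan is enough for size(CITYMAP.c) = 1 because, in this example, cities enter the model only through tile references, so at least one reference is guaranteed.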

  11. Consistency Violations
  Evaluate the consistency properties and find violations:
    for c in CITY, size(CITYMAP.c) = 1
  [Figure: the example model with the violated single-reference constraint highlighted]

  12. Slide about checks • TODO

  13. Optimized Implementation
  • Compilation (4.7x speedup)
  • Fixed point elimination (210x speedup)
    • Evaluate model definition rules using a simple traversal
  • Relation construction elimination (500x speedup)
    • Evaluate uses of relations directly on the data structures
  • Set construction elimination (3900x speedup)
    • Evaluate constraints while traversing the data structures
  • Bottom line
    • Interpreted version X times slower than uninstrumented
    • Optimized version Y times slower than uninstrumented
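
For contrast with the single-pass build_model sketch earlier, a naive interpreter would re-apply the model definition rules until the model stops changing; fixed point elimination replaces this loop with one dependency-ordered traversal. The apply_rule_* helpers here are hypothetical:

    #include <stdbool.h>

    /* Hypothetical per-rule evaluators; each returns true if it added
     * anything new to the model on this pass. */
    bool apply_rule_tiles(void);    /* grid[x] in TILE           */
    bool apply_rule_terrain(void);  /* <t, t.terrain> in TERRAIN */
    bool apply_rule_cities(void);   /* CITYMAP tuples, CITY set  */

    /* Naive interpreter: re-run every rule until a fixed point. */
    void naive_build_model(void)
    {
        bool changed;
        do {
            changed = apply_rule_tiles();
            changed |= apply_rule_terrain();
            changed |= apply_rule_cities();
        } while (changed);
    }

Likewise, the earlier check_properties sketch shows set and relation construction elimination in spirit: constraints are evaluated during the traversal, so TILE, CITY, and CITYMAP are never built.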

  14. Freeciv Case Study
  • Multiplayer client/server online game
  • Available at www.freeciv.org
  • Case study looked at the server
    • Server contains 73,000 lines of code
  • Added 750 instrumented check sites
  • 20,000 consistency checks performed in our sample execution

  15. Case Study
  • Created three buggy versions of Freeciv
  • Two groups of three developers
    • One group used conventional tools
    • One group used specification-based consistency checking
  • Each participant was asked to spend at least one hour on each version
  • Both populations were given a pre-instrumented version of Freeciv

  16. Consistency Properties
  • Map exists
    size(MAP) = 1
  • Grid of tiles exists
    size(GRID) = 1
  • Tiles have valid terrain values
    for t in TILE, MIN <= t.TERRAIN and t.TERRAIN <= MAX
  • Cities are not in the ocean
    for c in CITY, !(CITYMAP.c).TERRAIN = OCEAN
  • Each city has exactly one reference from the grid
    for c in CITY, size(CITYMAP.c) = 1

  17. Bugs Introduced
  • Actual errors in the buggy versions
    • The first error creates invalid terrain values (violates the valid-terrain property)
    • The second causes two tiles to refer to the same city (violates the single-reference property)
    • The third causes a city to be placed on ocean (violates the cities-not-in-ocean property)

  18. Results
  • User study shows the benefit of the approach
  • With the tool
    • All developers found and fixed all bugs
    • Mean of 11 minutes required
  • Without the tool
    • The three developers found a total of one bug (out of nine developer/bug combinations)
    • In the other cases they spent the hour debugging, unsuccessfully

  19. Repair for Deployed Systems
  • Consistency specifications for repair
    • Input: an inconsistent data structure
    • Output: a consistent data structure
  • Technique enables programs to recover from data structure corruption
    • And continue to execute successfully
  • An OOPSLA '03 paper describes this technique
    • Demsky and Rinard, "Automatic Detection and Repair of Errors in Data Structures", OOPSLA 2003

  20. Related Work
  • Specification languages such as UML or Alloy
  • Specification-based testing
    • Korat (Boyapati et al., ISSTA 2002)
    • TestEra (Marinov and Khurshid)
    • Eiffel (Meyer)
  • Invariant inference and checking
    • Daikon (Ernst et al.)
    • DIDUCE (Hangal and Lam)
    • Carrot (Pytlik et al.)

  21. Conclusion
  • Consistency checking to localize data structure corruption bugs
  • Good experimental results
    • With the checker, bugs fixed in minutes
    • Without the checker, bugs not fixed in an hour
  • Optimizations for good performance
  • Data structure repair
