
Aliases in a bug finding tool



Presentation Transcript


  1. Aliases in a bug finding tool Benjamin Chelf Seth Hallem June 5th, 2002

  2. Overview • Observation • Bug-finding tools can be sound or unsound • What distinguishes their results? • Goal • Evaluate varying levels of precision and soundness in how tools handle aliases. • Hypothesis • Sound analysis is not necessary to detect the types of bugs currently found by state-of-the-art tools.

  3. Alias analysis in Metal • No alias analysis in Metal • What must be done to add it? • When transitioning a variable to a new state, the tool must also transition anything that may alias it. • The alias analysis must provide the potential aliases given a program point and context. • Caveat: when transitioning aliases, Metal must also insert identity transition edges.

  4. Target Alias Analyses • Sound analyses • from the literature: Steensgaard, CLA, Wilson/Lam, Instantiation Constraints, One-Level Flow (Das) • Unsound analyses • no analysis (Metal as is) • non-conservative single-level/double-level

  5. Non-conservative analysis • Reverse the standard assumptions • Conservative: unless we can guarantee that two pointers are not aliased, they could be aliased • Non-conservative: assume pointers are not aliased, and add aliasing relationships that we can identify easily • One-level: p = q ⇒ p, q are aliased • assume all pointers are one-level • Two-level: p = q ⇒ p, q, *p, *q are aliased • assume all pointers are at most two-level

  6. One-level Aliasing: Example
  • Tracked aliasing:
    int main (void) {
      int *p, *q;
      p = q;     // p and q are aliased.
      free (p);  // p and q are both freed.
    }
  • Missed aliasing (caught with two-level):
    int main (void) {
      int **p, **q;
      p = q;     // p and q are aliased.
      free (*p); // *p is freed, *q is not.
    }

  7. One-level Aliasing: Example
  • Tracked aliasing:
    int main (void) {
      int *p, *q;
      p = q;     // p and q are aliased.
      free (p);  // p and q are both freed.
    }
  [Diagram: p and q each point to an "unk" alias node]

  8. One-level Aliasing: Example
  • Tracked aliasing:
    int main (void) {
      int *p, *q;
      p = q;     // p and q are aliased.
      free (p);  // p and q are both freed.
    }
  [Diagram: after p = q, p and q point to an "unk" alias node]

  9. One-level Aliasing: Example
  • Tracked aliasing:
    int main (void) {
      int *p, *q;
      p = q;     // p and q are aliased.
      free (p);  // p and q are both freed.
    }
  [Diagram: p and q share a single alias node in the "freed" state]

  10. One-level algorithm: basics • Initially, assign each program object (typed expression) a “fresh” alias node • the node holds the state attached to that object • assignment: the left-hand side inherits the alias node of the right-hand side • function call: formals = actuals • save the alias set at the call, restore it at the return • track struct fields and pointer arithmetic

  11. Optimizations • Kill sets • Flow-insensitively track second-level assignments. • Track another level • Alias relationships can pass up. • Where is the 90% boundary?

  12. Evaluation of analysis • Who and what to check • Buggy, real systems code (varying sizes) • Null pointers, free errors, lock and unlock, etc. • Metrics • Running time of analysis and extensions • # of bugs / # of false positives • Traditional alias analysis metrics • Reasons for imprecision • Want to know why we have a false positive/negative

  13. Research costs • Difficult parts • Implementation of analyses • Bug finding interface • Error report inspection (wading through FPs) • Good source of future research • Categorize FPs for simple elimination

  14. Related work • Bug finding • Metal • RHS • Heine / Lam • Alias analyses • CLA (Andersen’s) • Steensgaard • Instantiation Constraints • Wilson / Lam • One-Level Flow (Das)
