

  1. CS527: (Advanced) Topics in Software Engineering (Software Testing and Analysis) Darko Marinov August 23, 2011

  2. Course Overview • Graduate seminar on program analysis (for bug finding), emphasis: systematic testing (for concurrent code) • Focus on a (research) project • Papers: reading in advance, writing reports, presenting, and discussing • Project: initial proposal, progress report, [final presentation, based on enrollment size], paper • One homework assignment (maybe two) to help with projects

  3. Administrative Info • Meetings: TR 2-3:15pm, 1302 SC • Credit: 4 graduate hours • Auditors welcome for discussions • Can come to any lecture, mostly self-contained • Prerequisites: some software engineering and programming languages

  4. Evaluation • Grading • Project [40%] • Presentation [20%] • Participation (reports and discussion) [20%] • Homework assignment(s) [20%] • Distribution • Grades will be A- centered • No guarantee for A (or even a passing grade)! • Project is the most important

  5. Project • Proposal (due in late September) • Progress report (in November) • [Presentation (after Thanksgiving Break)] • Paper (last day of classes) • 15 students who took similar classes published 12 papers based on their projects • I’m happy to help, no co-authorship required

  6. Fair Warnings • This class will differ from most you take • Seminar style, reading papers • Centered around (research) projects • Projects are NOT easy • Require that you explore a topic in great depth • The topic can/should be fairly narrow

  7. Repeated Warning • Project matters the MOST • If you like open-ended projects, do take this course • If you don’t like open-ended projects, please drop this course now • If you’re unsure, please discuss with me • The worst scenario: take the course but realize you don’t like projects

  8. Project Overview • Testing/analysis of some open-source code • We will use mostly Java this semester • You can use C#/C/C++/C--/Clojure/Cecil… http://en.wikipedia.org/wiki/List_of_programming_languages#C • Sample topics • Automated unit testing • Test coverage and adequacy criteria • Test-input generation, test oracles • Test maintenance • …

  9. Course Communication • Wiki: http://wiki.engr.illinois.edu/display/cs527fa11 • Mailing list: cs527-fa11 AT cs.illinois.edu

  10. Personnel • Instructor: Darko Marinov • Office: 3116 SC, hours: by appointment • Phone number: 217-265-6117 • NetID: marinov • TA: Shin Hwei Tan • NetID: stan6 • Please use your netid@illinois.edu email addresses for communication

  11. Signup Sheet • Name • NetID (please write it legibly) • Program/Year • Group • Interests • Please also sign up on Wiki (if possible, as there are some delays with netids)

  12. This Lecture: Overview • Why look for bugs? • What are bugs? • Where do they come from? • How to detect them?

  13. Some Costly “Bugs” • NASA Mars space missions • Priority inversion (2004) • Different metric systems (1999) • BMW airbag problems (1999) • Recall of >15,000 cars • Ariane 5 crash (1996) • Uncaught numerical-overflow exception • http://www.youtube.com/watch?v=kYUrqdUyEpI • Your own (in)famous example?

  14. Some “Bugging” Bugs • An example bug on my laptop • “Jumping” file after changing properties • Put a read-only file on the desktop • Change properties: rename and make not read-only • Your own favorite example?

  15. Economic Impact • “The Economic Impact of Inadequate Infrastructure for Software Testing”, NIST Report, May 2002 • $59.5B annual cost of inadequate software testing infrastructure • $22.2B annual potential cost reduction from feasible infrastructure improvements

  16. Estimates • Extrapolated from two studies (5% of total) • Manufacturing: transportation equipment • Services: financial institutions • A number of simplifying assumptions • “…should be considered approximations” • What is important for you? • Correctness, performance, functionality

  17. Terminology • Anomaly • Bug • Crash • Defect • Error • Failure, fault • Glitch • Hangup • Incorrectness • J…

  18. Dynamic vs. Static • Incorrect (observed) behavior • Failure, fault • Incorrect (unobserved) state • Error, latent error • Incorrect lines of code • Fault, error

  19. “Bugs” in IEEE 610.12-1990 • Fault • Incorrect lines of code • Error • Faults cause incorrect (unobserved) state • Failure • Errors cause incorrect (observed) behavior • Not used consistently in literature!
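
To make the IEEE terminology concrete, here is a minimal Java sketch (not from the slides; the program is purely illustrative): the fault is a wrong line of code, executing it produces an incorrect internal state (the error), and the wrong output that is actually observed is the failure.

    public class Average {
        // Fault: the divisor should be values.length, not values.length + 1.
        static double average(int[] values) {
            int sum = 0;
            for (int v : values) {
                sum += v;
            }
            return sum / (values.length + 1);  // executing the fault yields an incorrect value: the error
        }

        public static void main(String[] args) {
            // Failure: prints 1.0 instead of the expected 2.0, i.e. incorrect observed behavior.
            System.out.println(average(new int[] {1, 2, 3}));
        }
    }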

  20. Correctness • Common (partial) properties • Segfaults, uncaught exceptions • Resource leaks • Data races, deadlocks • Statistics based • Specific properties • Requirements • Specification
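
To make one of the partial properties above concrete, here is a minimal Java sketch (not from the slides) of a data race: two threads perform unsynchronized read-modify-write updates on a shared counter, so increments can be lost and the printed total is often below 200000.

    public class RaceExample {
        static int counter = 0;  // shared, unsynchronized state

        public static void main(String[] args) throws InterruptedException {
            Runnable increment = () -> {
                for (int i = 0; i < 100_000; i++) {
                    counter++;  // data race: unsynchronized read-modify-write
                }
            };
            Thread t1 = new Thread(increment);
            Thread t2 = new Thread(increment);
            t1.start(); t2.start();
            t1.join(); t2.join();
            System.out.println(counter);  // often less than 200000
        }
    }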

  21. Traditional Waterfall Model • Requirements (Analysis) • Design (Checking) • Implementation (Unit Testing) • Integration (System Testing) • Maintenance (Verification)

  22. Phases (1) • Requirements • Specify what the software should do • Analysis: eliminate/reduce ambiguities, inconsistencies, and incompleteness • Design • Specify how the software should work • Split software into modules, write specifications • Checking: check conformance to requirements

  23. Phases (2) • Implementation • Specify how the modules work • Unit testing: test each module in isolation • Integration • Specify how the modules interact • System testing: test module interactions • Maintenance • Evolve software as requirements change • Verification: test changes, regression testing
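
To illustrate “test each module in isolation”, here is a minimal JUnit 4 sketch; the IntStack class and its methods are hypothetical stand-ins for whatever module is under test.

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class IntStackTest {
        @Test
        public void pushThenPopReturnsSameValue() {
            IntStack stack = new IntStack();  // IntStack is the (hypothetical) unit under test
            stack.push(42);
            assertEquals(42, stack.pop());    // oracle: expected value hard-coded in the test
        }

        @Test
        public void newStackIsEmpty() {
            assertEquals(0, new IntStack().size());
        }
    }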

  24. Testing Effort • Reported to be >50% of development cost [e.g., Beizer 1990] • Microsoft: 75% time spent testing • 50% testers who spend all time testing • 50% developers who spend half time testing

  25. When to Test • The later a bug is found, the higher the cost • Orders of magnitude increase in later phases • Also the smaller chance of a proper fix • Old saying: test often, test early • New methodology: test-driven development (write tests before code)

  26. Software is Complex • Malleable • Intangible • Abstract • Solves complex problems • Interacts with other software and hardware • Not continuous

  27. Software Still Buggy • Folklore: 1-10 (residual) bugs per 1000 non-blank, non-comment (NBNC) lines of code (after testing) • Consensus: total correctness impossible to achieve for (complex) software • Risk-driven finding/elimination of bugs • Focus on specific correctness properties

  28. Approaches for Detecting Bugs • Software testing • Model checking • (Static) program analysis

  29. Software Testing • Dynamic approach • Run code for some inputs, check outputs • Checks correctness for some executions • Main questions • Test-input generation • Test-suite adequacy • Test oracles
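
A small sketch (not from the slides) that separates the first and third questions: test inputs are generated randomly, and the oracle checks a property the output must satisfy rather than comparing against a hard-coded expected result. The code under test is the library sort, used here only as a stand-in.

    import java.util.Arrays;
    import java.util.Random;

    public class RandomSortTest {
        public static void main(String[] args) {
            Random random = new Random(42);
            for (int trial = 0; trial < 1000; trial++) {
                // Test-input generation: random arrays of random length.
                int[] input = random.ints(random.nextInt(20), -100, 100).toArray();
                int[] output = input.clone();
                Arrays.sort(output);  // code under test
                // Test oracle: the output must be sorted (a partial property; a full
                // oracle would also check that it is a permutation of the input).
                for (int i = 1; i < output.length; i++) {
                    if (output[i - 1] > output[i]) {
                        throw new AssertionError("Not sorted for " + Arrays.toString(input));
                    }
                }
            }
            System.out.println("All generated inputs passed the oracle.");
        }
    }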

  30. Other Testing Questions • Maintenance • Selection • Minimization • Prioritization • Augmentation • Evaluation • Fault Characterization • …

  31. Model Checking • Typically hybrid dynamic/static approach • Checks correctness for all executions • Some techniques • Explicit-state model checking • Symbolic model checking • Abstraction-based model checking
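
A minimal sketch of explicit-state model checking (not from the slides): breadth-first exploration of every reachable state of a toy transition system, checking an invariant in each state. Real model checkers explore program or design states rather than a hand-written transition relation, but the search loop has the same shape.

    import java.util.ArrayDeque;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Queue;
    import java.util.Set;

    public class ExplicitStateSketch {
        // Toy system: a counter that can step by 1 or by 2, modulo 10.
        static List<Integer> successors(int state) {
            return List.of((state + 1) % 10, (state + 2) % 10);
        }

        // Invariant checked in every reachable state (chosen for this sketch).
        static boolean invariantHolds(int state) {
            return state >= 0 && state < 10;
        }

        public static void main(String[] args) {
            Set<Integer> visited = new HashSet<>();
            Queue<Integer> frontier = new ArrayDeque<>();
            visited.add(0);   // initial state
            frontier.add(0);
            while (!frontier.isEmpty()) {
                int state = frontier.remove();
                if (!invariantHolds(state)) {
                    System.out.println("Counterexample: state " + state);
                    return;
                }
                for (int next : successors(state)) {
                    if (visited.add(next)) {  // explore each state only once
                        frontier.add(next);
                    }
                }
            }
            System.out.println("Invariant holds in all " + visited.size() + " reachable states.");
        }
    }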

  32. Static Analysis • Static approach • Checks correctness for all executions • Some techniques • Abstract interpretation • Dataflow analysis • Verification-condition generation
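
Unlike a test, a static analysis reasons about all executions without running the code. A minimal sketch (not from the slides) of the kind of defect a nullness dataflow analysis can report: on the path where no command-line arguments are passed, the dereference below fails, and the analysis finds that path without ever executing the program.

    public class NullDerefExample {
        public static void main(String[] args) {
            String name = null;
            if (args.length > 0) {
                name = args[0];
            }
            // Possible NullPointerException when args is empty; a test catches this
            // only if it happens to run with no arguments, while a static analysis
            // reports it by considering both branches.
            System.out.println(name.length());
        }
    }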

  33. Comparison • Level of automation • Push-button vs. manual • Type of bugs found • Hard vs. easy to reproduce • High vs. low probability • Common vs. specific properties • Type of bugs (not) found

  34. Soundness and Completeness • Do we detect all bugs? • Impossible for dynamic analysis • Are reported bugs real bugs? • Easy for dynamic analysis • Most practical techniques and tools are both unsound and incomplete! • False positives • False negatives
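
A concrete illustration of a false positive (not from the slides): an imprecise but sound static analysis may warn about a path that can never actually execute.

    public class FalsePositiveExample {
        // A path-insensitive analysis sees a path on which buffer is null at the
        // dereference (skip the first if, enter the second) and reports a possible
        // NullPointerException. Since flag does not change between the two tests,
        // that path is infeasible: the warning is a false positive.
        static int length(boolean flag) {
            String buffer = null;
            if (flag) {
                buffer = "data";
            }
            if (flag) {
                return buffer.length();  // safe, but may be flagged by an imprecise tool
            }
            return 0;
        }
    }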

  35. Analysis for Performance • Static compiler analysis, profiling • Must be sound • Correctness of transformation: equivalence • Improves execution time • Programmer time is more important • Programmer productivity • Not only finding bugs

  36. Combining Dynamic and Static • Dynamic and static analyses equal in limit • Dynamic: try exhaustively all possible inputs • Static: model precisely every possible state • Synergistic opportunities • Static analysis can optimize dynamic analysis • Dynamic analysis can focus static analysis • More discussions than results

  37. Current Status • Testing remains the most widely used approach for finding bugs • A lot of recent progress (within last decade) on model checking and static analysis • Model checking: from hardware to software • Static analysis: from sound to practical • Vibrant research in the area • Gap between research and practice

  38. Topics Related to Finding Bugs • How to eliminate bugs? • Debugging • How to prevent bugs? • Programming language design • Software development processes • How to show absence of bugs? • Theorem proving • Model checking, program analysis

  39. Next Lecture • Thursday, August 25, at 2pm, 1302 SC • Texts to read (listed on Wiki) • How to Read an Engineering Research Paper by William G. Griswold • Writing Good Software Engineering Research Papers by Mary Shaw (ICSE 2003) • If you have already read that paper, read one on another area • You don’t have to write any report now • Assignment 0: Modify “People” on Wiki
