Automated Testing of System Software (Virtual Machine Monitors)

Tao Xie

Department of Computer Science

North Carolina State University

http://www.csc.ncsu.edu/faculty/xie/

Automated System Software Testing
  • Purpose: automated testing of system code bases (e.g., virtual machine monitors) for robustness, security, functionality, and coverage
  • Often highly environment-dependent software
  • Challenges
    • Code bases are complex
      • Heavily interact with system APIs; system behavior depends on environment state, e.g., open("/dev/tty", O_WRONLY) (see the sketch after this list)
    • Testing requires sophisticated system setup
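
For example, the same call can succeed or fail depending on external state that the test does not control. A minimal C# sketch of such environment dependence (the ConfigLoader name and its logic are illustrative assumptions, not code from the studied systems):

using System.IO;

public static class ConfigLoader
{
    // Which branch executes depends on file-system state, not on arguments
    public static string Load(string path)
    {
        if (File.Exists(path))          // environment-dependent branch
            return File.ReadAllText(path);
        return "<default>";             // reached only when the file is absent
    }
}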
Testing Environment-Dependent Software
  • Test inputs: method arguments, receiver object state & input environment state
  • Test outputs: method return values, receiver object state & output environment state
  • Sufficient & safe testing of software (a test sketch follows this list)
    • generate high-covering tests
    • cause no threat to the environment
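
Concretely, a conventional unit test for such code must both establish the input environment state and leave the environment clean afterwards. An NUnit-style sketch, reusing the hypothetical ConfigLoader above (names and paths are assumptions):

using System.IO;
using NUnit.Framework;

[TestFixture]
public class ConfigLoaderTests
{
    [Test]
    public void LoadReturnsFileContentsWhenFileExists()
    {
        // Arrange: set up the input environment state
        string path = Path.Combine(Path.GetTempPath(), "cfg.txt");
        File.WriteAllText(path, "hello");

        // Act + assert: check the method's return value
        Assert.AreEqual("hello", ConfigLoader.Load(path));

        // Clean up: restore the output environment state
        File.Delete(path);
    }
}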
Application of Automated Testing Tools
  • Dynamic symbolic execution tools generate input method arguments and receiver object state
    • Microsoft Pex (C#) and CREST (C)
  • Empirical study: applied these tools to test Xen, CodePlex Client, and the NUnit framework
    • Heavily interact with the file-system environment
  • Observed results: the test-generation tools failed to generate high-covering test inputs
  • Identified problem: the tools cannot construct the required input environment states
Problems
  • P1: Generating input environment state is beyond the scope of test-generation tools.
  • P2: Arbitrary program inputs generated by test-generation tools can pollute or damage the environment state.

Example code under test

// Code coverage: many environment-dependent branches
public void Add(string localPath,
    bool recursive, SourceItemCallback callback)
{
    Guard.ArgumentNotNullOrEmpty(localPath, "localPath");
    if (fileSystem.DirectoryExists(localPath))
        AddFolder(localPath, recursive, callback, true);
    else if (fileSystem.FileExists(localPath))
        AddFile(localPath, callback, true);
    ......
}

// Safe-testing concern: arbitrary inputs can delete real files and directories
bool OnBeforeAddItem(SourceItem item)
{
    ..........
    if (item.ItemType == ItemType.File)
        File.Delete(item.LocalName);
    else
        Directory.Delete(item.LocalName);
    ......
    return (answer == "y" || answer == "a");
}

Outline
  • Mock objects as a solution to the identified problems
  • Challenges
  • Proposed approach
  • Preliminary results
  • Future work
Mock Objects
  • Used to simulate the required environment, avoiding interaction with the real environment
  • Benefits:
    • Enable unit testing
    • Increase code coverage
    • Ensure safe testing
  • Challenge: non-trivial to implement a mock object

// Code under test in the mock-object-based testing approach
public void Add(string localPath,
    bool recursive, SourceItemCallback callback)
{
    Guard.ArgumentNotNullOrEmpty(localPath, "localPath");
    if (mockFileSystem.DirectoryExists(localPath))
        AddFolder(localPath, recursive, callback, true);
    else if (mockFileSystem.FileExists(localPath))
        AddFile(localPath, callback, true);
    ......
}
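
The snippet above assumes the code under test reaches the file system only through an abstraction that a test can swap out. A minimal sketch of such an abstraction (the interface name IFileSystem and the constructor-injection style are assumptions for illustration, not CodePlex Client's actual design):

// Hypothetical file-system abstraction; names are illustrative only
public interface IFileSystem
{
    bool DirectoryExists(string path);
    bool FileExists(string path);
}

// Production implementation delegating to the real file system
public class RealFileSystem : IFileSystem
{
    public bool DirectoryExists(string path) { return System.IO.Directory.Exists(path); }
    public bool FileExists(string path) { return System.IO.File.Exists(path); }
}

// The code under test receives either the real or the mock implementation
public class SourceControlClient
{
    private readonly IFileSystem fileSystem;
    public SourceControlClient(IFileSystem fileSystem) { this.fileSystem = fileSystem; }
    ......
}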

Mock Objects (cont.)
  • An incomplete or incorrect mock implementation causes false alarms
  • Implementing a sophisticated mock object is non-trivial and tedious
  • Our solution: a systematic approach to build a mock object that passes given tests that previously failed due to an insufficient mock object

// Incorrect implementation: DirectoryExists ignores directories
// recorded by CreateDirectory, so tests fail spuriously
public bool DirectoryExists(string path)
{ return false; }

public void CreateDirectory(string path)
{ listOfCreatedDir.Add(path); }

// Correct implementation: the two methods share state,
// mirroring the real file system's behavior
public bool DirectoryExists(string path)
{
    if (listOfCreatedDir.Contains(path))
        return true;
    return false;
}

public void CreateDirectory(string path)
{
    if (listOfCreatedDir.Contains(path))
        return;
    else
        listOfCreatedDir.Add(path);
}

Approach
  • Moca (MOCk Assistant) follows Test-Driven Development (TDD) to systematically build a high-quality mock object, one sufficient for effective testing of the code under test without causing false alarms
  • Moca makes use of PUTs (Parameterized Unit Tests)
    • PUTs generalize conventional unit tests (TUTs) with parameters and serve as functional specifications
    • Pex, a Microsoft Research tool, accepts PUTs and generates high-covering tests (a PUT sketch follows below)
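
As an illustration, a minimal PUT for the Add method might look like the following. The [PexClass] and [PexMethod] attributes and PexAssume come from the Pex framework; the class names (SourceControlClient, MockFileSystem) and the test body are assumptions carried over from the earlier sketches:

using Microsoft.Pex.Framework;

[PexClass(typeof(SourceControlClient))]
public partial class AddTests
{
    // Parameterized unit test: Pex explores the code's paths and
    // generates concrete argument values for localPath and recursive
    [PexMethod]
    public void AddAcceptsAnyNonEmptyPath(string localPath, bool recursive)
    {
        PexAssume.IsTrue(!string.IsNullOrEmpty(localPath));
        var client = new SourceControlClient(new MockFileSystem());
        client.Add(localPath, recursive, null);
        // Assertions over the mock's recorded state would go here
    }
}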
Approach (cont.)
  • Input to Moca
    • A set of conventional unit tests (failing due to an insufficient mock object)
    • An environment that needs to be mocked
    • A set of PUTs
  • Moca assists developers in building a mock object that can replace interactions with the real environment, as in the example below
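
For instance, a conventional unit test given to Moca might look like the following sketch; it fails under the naive mock shown earlier because DirectoryExists always returns false, and passes once the mock tracks created directories (test and class names are hypothetical):

// NUnit-style conventional unit test (TUT)
[Test]
public void CreatedDirectoryShouldExist()
{
    var mock = new MockFileSystem();
    mock.CreateDirectory(@"c:\work\proj");
    // Fails with the naive mock; passes with the corrected one
    Assert.IsTrue(mock.DirectoryExists(@"c:\work\proj"));
}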
Approach (cont.)

Abbreviations used in the approach:

  • TUT – conventional unit test
  • CUM – class to mock
  • ENV – environment
  • MUM – method to mock
  • CUT – code under test

Preliminary Results
  • Applied Moca to a real-world application, CodePlex Client
  • Results: Moca can assist developers in building a mock object that is
    • effective in achieving high coverage, without false alarms,
    • sufficient compared to a naive implementation, and
    • less complex, and thus less effort, than a manually written sophisticated implementation
Summary
  • Identified problems with automated testing of environment-dependent software
  • Conducted an empirical study to show the benefits of using mock objects and identify challenges in building mock objects
  • Proposed an approach based on TDD methodology to build mock objects
  • Demonstrated the feasibility and benefits of the proposed approach
  • Also developed new techniques for test generation
Key Outcomes

Relevance to military/DoD:

  • An undergraduate student, Justin Gorham, is working as a summer intern with the Fort Hood Army Electronic Proving Ground (EPG) team, applying Pex and our extensions to Army code bases.
  • A PhD student, Kunal Taneja, is working as a summer intern at the FDA, applying Pex and our extensions to a DoD code base for regulatory purposes (mocking databases).

Publications:

  • [AST 09] Madhuri R Marri, Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. An Empirical Study of Testing File-System-Dependent Software with Mock Objects. In Proceedings of the 4th International Workshop on Automation of Software Test (AST 2009), Business and Industry Case Studies, pp. 149-153, May 2009.   
  • [Mutation 09] Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. Mutation Analysis of Parameterized Unit Tests. In Proceedings of the 4th International Workshop on Mutation Analysis (Mutation 2009), pp. 177-181, April 2009. 
  • [SUITE 09] Madhuri R Marri, Suresh Thummalapenta, and Tao Xie. Improving Software Quality via Code Searching and Mining. In Proceedings of the First International Workshop on Search-Driven Development – Users, Infrastructure, Tools and Evaluation (SUITE 2009), pp. 33-36, May 2009. 
  • [DSN 09] Tao Xie, Nikolai Tillmann, Peli de Halleux, and Wolfram Schulte. Fitness-Guided Path Exploration in Dynamic Symbolic Execution. To appear in Proceedings of the 39th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN 2009), June-July 2009.
  • [ESEC/FSE 09] Suresh Thummalapenta, Tao Xie, Nikolai Tillmann, Peli de Halleux, and Wolfram Schulte. MSeqGen: Object-Oriented Unit-Test Generation via Mining Source Code. To appear in Proceedings of the 7th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering (ESEC/FSE 2009), August 2009.
Other Related Funding

Related new funding over the SOSI project period:

  • NSF CAREER Award: 5 years (Aug 09-July 14), $425,000
    “Cooperative Developer Testing with Test Intentions”

Other ongoing support:

  • ARO Award: 3 years (Sept 08-Aug 11), $300,000
    “Mining Program Source Code for Improving Software Quality”
  • NSF SoD Award: 3 years (Jan 08-Dec 10), $245,000
    “Collaborative Research: SoD-TEAM: Designing Tests for Evolving Software Systems”
  • NSF CyberTrust Award: 3 years (Aug 07-July 10), $227,275
    “CT-ISG: Collaborative Research: A New Approach to Testing and Verification of Security Policies”

Future Directions
  • Test generation
    • Guided exploration of paths [DSN 09]
    • Method-sequence generation [ESEC/FSE 09]
    • Security (attack)/access control test generation (w/ NIST)
    • Performance testing
    • Embedded/network/db/SOA-app test generation
  • Dealing with environments
    • Fully automate Moca
    • Domain-specific mock object tools/libraries (e.g., file system, database, network, hardware environments)
  • Test oracles
    • Detection of insufficiency of assertions
    • Inference of normal behavior as approximate oracles
Questions?

More info on the research of the NCSU Automated Software Engineering Group:

  • http://www.csc.ncsu.edu/faculty/xie/research.htm
  • http://www.csc.ncsu.edu/faculty/xie/publications.htm
  • Recent industry impact
    • Our Fitnex strategy [DSN 09] is integrated into Microsoft Research's Pex as its default strategy (since the second half of 2008; 5,600 downloads)