Choosing SATE Test Cases Based on CVEs

Sue Wang

[email protected]

October 1, 2010

The SAMATE Project

http://samate.nist.gov/

SATE 2010 Workshop


Purpose and Motivation

  • Provide test cases with exploitable vulnerabilities

    • In an ideal world, a tool detects significant bugs

  • Also provide a fixed version of each test case

    • To confirm a low false positive rate

  • Mentioned by SATE organizers and detailed proposal by Paul Anderson (SATE 2009 workshop)

  • Brought up by tool makers and supported by users (SATE 2010 organization meeting)


Selection Criteria

  • Open source software in C/C++ and Java

  • AND with known security-related bugs

  • AND get older versions

  • AND manually pinpoint the bugs

  • AND find a fixed version

  • AND compile the source code
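The criteria above are conjunctive: a candidate project must pass every check to be selected. A minimal sketch of that filter chain, with hypothetical field names and example data (not the actual SAMATE tooling):

```python
# Illustrative only: apply conjunctive selection criteria to candidate
# projects. The record fields and sample projects are made up.

CANDIDATES = [
    {"name": "projA", "language": "C", "has_known_bugs": True,
     "older_version_available": True, "bugs_pinpointed": True,
     "fixed_version_available": True, "compiles": True},
    {"name": "projB", "language": "Python", "has_known_bugs": True,
     "older_version_available": True, "bugs_pinpointed": False,
     "fixed_version_available": False, "compiles": True},
]

CRITERIA = [
    lambda p: p["language"] in ("C", "C++", "Java"),
    lambda p: p["has_known_bugs"],
    lambda p: p["older_version_available"],
    lambda p: p["bugs_pinpointed"],
    lambda p: p["fixed_version_available"],
    lambda p: p["compiles"],
]

def select(candidates):
    """Keep only candidates that satisfy every criterion (logical AND)."""
    return [p for p in candidates if all(check(p) for check in CRITERIA)]

print([p["name"] for p in select(CANDIDATES)])  # projB fails two checks
```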


Primary Sources

  • Brainstorm and exchange ideas within SAMATE team and with others

  • Search for open sources, for instance

    • java-source.net

    • sourceforge.net

    • Other lists of scanned projects

  • Search for related vulnerabilities

    • CVE – Common Vulnerabilities and Exposures (cve.mitre.org)

    • NVD – National Vulnerability Database (nvd.nist.gov)

    • CVE Details – enhanced CVE data (www.cvedetails.com)

    • OSVDB – The Open Source Vulnerability Database (osvdb.org)
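Searching these sources amounts to matching CVE records against a chosen product and version. A simplified sketch of that lookup over locally cached records; the record layout, product names, and CVE ids below are illustrative, not real NVD data:

```python
# Illustrative only: find CVEs affecting a given product/version from a
# simplified, made-up set of CVE records.

CVE_RECORDS = [
    {"id": "CVE-2009-0001", "product": "prodA",
     "versions": ["1.2.0", "1.2.1"], "cwe": "CWE-119"},
    {"id": "CVE-2010-0002", "product": "prodB",
     "versions": ["5.5.29"], "cwe": "CWE-22"},
]

def cves_for(product, version, records):
    """Return ids of CVEs that list the given product/version as affected."""
    return [r["id"] for r in records
            if r["product"] == product and version in r["versions"]]

print(cves_for("prodA", "1.2.1", CVE_RECORDS))
```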


Selection Process

Narrowed down to 12 open source programs





Observations

  • Took far more time and effort than expected

    • CVEs are not created equal

      • Newer CVEs have higher quality info

      • Some CVEs required large amounts of research

    • Locating the path and sink is much harder than finding the fix

  • Reasons for low CVE selection rate

    • Not present in the selected version

    • Could not locate the source code or could not locate the sink

  • Useful resources and tips

    • The project’s patches, bug tracking, and version control info

    • Combine information from multiple resources

      (e.g., version -> bug # -> tracking -> patches)
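The tip above is a chain of lookups across resources: release notes give bug ids, the tracker maps a bug to a fixing change, and the change reveals the affected files. A sketch of that chain with entirely made-up data:

```python
# Illustrative only: follow version -> bug -> commit -> files to localize
# where a CVE was fixed. All identifiers below are fabricated.

RELEASE_NOTES = {"1.2.2": ["BUG-101"]}          # version -> bug ids fixed
BUG_TRACKER = {"BUG-101": "commit-abc123"}      # bug id -> fixing commit
COMMITS = {"commit-abc123": ["src/parser.c"]}   # commit -> touched files

def files_fixed_in(version):
    """Chain the three lookups to find candidate flaw locations."""
    files = []
    for bug in RELEASE_NOTES.get(version, []):
        commit = BUG_TRACKER.get(bug)
        if commit:
            files.extend(COMMITS.get(commit, []))
    return files

print(files_fixed_in("1.2.2"))
```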


Possible Future Work?

  • Re-use the 3 test cases

    • Pinpoint more CVE flaws

    • Involve developers to confirm some of the pinpointed flaws

    • Invite tool makers to map warnings to CVEs

    • Analyze the warning and CVE mappings among different tool makers and SATE findings

    • Store well-understood CVE-related test cases in the SRD

  • Other suggestions?

