Improving software security with precise static and runtime analysis

Improving Software Security with Precise Static and Runtime Analysis

Benjamin Livshits

SUIF Compiler Group

Computer Systems Lab

Stanford University

http://suif.stanford.edu/~livshits/work/griffin/


Security Vulnerabilities: Last 10 Years

*Source: NIST/DHS Vulnerability DB


Which Security Vulnerabilities are Most Prevalent?

  • Analyzed 500 vulnerability reports, one week in November 2005

294 vulnerabilities, or 58%

*Source: securityfocus.com


Focusing on Input Validation Issues

Web application vulnerabilities

*Source: securityfocus.com


SQL Injection Example

  • Web form allows user to look up account details

  • Underneath: Java J2EE Web application serving requests

String username = req.getParameter("user");
String password = req.getParameter("pwd");
String query = "SELECT * FROM Users WHERE username = '" + username +
               "' AND password = '" + password + "'";
con.executeQuery(query);
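The effect of this concatenation can be seen in a small self-contained sketch (the `QuerySketch` class and `buildQuery` helper are illustrative names, not code from the application on the slide):

```java
// Illustrative sketch of the vulnerable query construction above.
public class QuerySketch {
    // Mirrors the string concatenation from the slide.
    public static String buildQuery(String username, String password) {
        return "SELECT * FROM Users WHERE username = '" + username
             + "' AND password = '" + password + "'";
    }

    public static void main(String[] args) {
        // Benign input produces the intended query.
        System.out.println(buildQuery("bob", "secret"));
        // Malicious input rewrites the statement: the quote closes the
        // literal and "--" comments out the rest of the query.
        System.out.println(buildQuery("bob'; DROP Users--", "anything"));
    }
}
```

Running it shows that the attacker's input becomes part of the SQL statement itself, which is exactly what the next three slides demonstrate step by step.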


Injecting Malicious Data (1)

...
String query = "SELECT * FROM Users WHERE username = 'bob' AND password = '********'";
...


Injecting Malicious Data (2)

...
String query = "SELECT * FROM Users WHERE username = 'bob' --' AND password = ''";
...


Injecting Malicious Data (3)

...
String query = "SELECT * FROM Users WHERE username = 'bob'; DROP Users --' AND password = ''";
...



1. Sources (inject)

Parameter manipulation

Hidden field manipulation

Header manipulation

Cookie poisoning

Second-level injection

2. Sinks (exploit)

SQL injections

Cross-site scripting

HTTP request splitting

Path traversal

Command injection

Attack Techniques for Taint-Style Vulnerabilities

1. Parameter manipulation + 2. SQL injection = vulnerability


Goals of the Griffin Software Security Project

  • Financial impact

    • Cost per incident: $300,000+

    • Total cost of online fraud: $400B/year

  • Griffin Project goals

    • Address vulnerabilities in Web applications

    • Focus on large Java J2EE Web applications


Griffin Project Contributions

  • Effective solution addressing a wide range of real-life problems in the domain of Web application security

  • Pushed the state of the art in global static/pointer analysis; precisely handles large modern applications

  • Designed efficient dynamic techniques that recover from security exploits at runtime

  • Comprehensive, large-scale evaluation of problem complexity and analysis features; discovered many previously unknown vulnerabilities



Overview of the Griffin Project

Static

Extensions

Dynamic

Experiments

Conclusions

Future


Griffin Project: Framework Architecture

Provided by user: vulnerability specification, application bytecode

Static analysis [Livshits and Lam, Usenix Security ’05] → vulnerability warnings

Pros:

  • Find vulnerabilities early

  • Explores all program executions

  • Sound: finds all vulnerabilities of a particular kind

Dynamic analysis [Martin, Livshits, and Lam, OOPSLA ’05] → instrumented application

Pros:

  • Keeps vulnerabilities from doing harm

  • Can recover from exploits

  • No false positives, but has overhead


Following Unsafe Information Flow / Taint Flow

Servlet.getParameter("user")  (source)
  → “…” + “…”  (derivation)
  → String.substring  (derivation)
  → Statement.executeQuery(...)  (sink)

An unsanitized flow from a source to a sink is a security violation; a sanitizer on the path makes it safe. How do we know what the sources, sinks, and sanitizers are?
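The source → derivation → sanitizer → sink flow can be sketched as a minimal runtime taint tracker (the `TaintDemo` class and its methods are our illustration, not the Griffin instrumentation):

```java
import java.util.HashSet;
import java.util.Set;

// Minimal taint-propagation sketch: sources mark data, derivations
// propagate the mark, sanitizers clear it, sinks check it.
public class TaintDemo {
    static Set<String> tainted = new HashSet<>();

    // Plays the role of Servlet.getParameter: output is tainted.
    public static String source(String userInput) {
        tainted.add(userInput);
        return userInput;
    }

    // Plays the role of concatenation: taint propagates to the result.
    public static String derive(String a, String b) {
        String result = a + b;
        if (tainted.contains(a) || tainted.contains(b)) tainted.add(result);
        return result;
    }

    // Plays the role of a sanitizer: escapes quotes and clears the mark.
    public static String sanitize(String s) {
        String clean = s.replace("'", "''");
        tainted.remove(s);
        return clean;
    }

    // Plays the role of Statement.executeQuery: reject tainted arguments.
    public static boolean sinkIsSafe(String query) {
        return !tainted.contains(query);
    }

    public static void main(String[] args) {
        String q = derive(source("bob'; DROP Users--"), " AND ...");
        System.out.println(sinkIsSafe(q));                        // tainted data reaches the sink
        System.out.println(sinkIsSafe(sanitize(source("bob"))));  // sanitized before the sink
    }
}
```

The real analyses track taint on heap objects rather than string values, but the four roles are the same.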


Vulnerability Specification

  • User needs to specify

    • Source methods

    • Sink methods

    • Derivation methods

    • Sanitization methods

  • PQL: Program Query Language [Martin, Livshits, and Lam OOPSLA’05]

    • General language for describing events on objects

  • Real queries are longer

100+ lines of PQL

    • Captures all vulnerabilities

    • Suitable for all J2EE applications

query simpleSQLInjection
returns
  object String param, derived;
uses
  object HttpServletRequest req;
  object Connection con;
  object StringBuffer temp;
matches {
  param = req.getParameter(_);
  temp.append(param);
  derived = temp.toString();
  con.executeQuery(derived);
}



Static Analysis

Static

Extensions

Dynamic

Experiments

Conclusions

Future


Motivation: Why Pointer Analysis?

  • What objects do username and str point to?

  • Question answered by pointer analysis

    • A classic compiler problem for 20 years+

    • Rely on context-sensitive inclusion-based pointer analysis [Whaley and Lam PLDI’04]

String username = req.getParameter("user");
list1.addFirst(username);
...
String str = (String) list2.getFirst();
con.executeQuery(str);


Pointer Analysis Precision

  • Imprecision of pointer analysis → false positives

  • Precision-enhancing pointer analysis features

    • Context sensitivity [Whaley and Lam, PLDI’04] (not enough)

    • Object sensitivity

    • Map sensitivity

(Diagram: runtime heap objects o1, o2, o3 are statically approximated by a single abstract heap object h.)


Importance of Context Sensitivity

Imprecision → excessive tainting → false positives

String id(String str) {
  return str;
}

Call site c1 passes a tainted string; call site c2 passes an untainted one. A context-insensitive analysis, points-to(v : Var, h : Heap), merges the two calls and reports c2’s result as tainted. A context-sensitive analysis, points-to(vc : VarContext, v : Var, h : Heap), keeps the calls apart.


Handling Containers: Object Sensitivity

String s1 = new String(); // h1
String s2 = new String(); // h2

Map map1 = new HashMap();
Map map2 = new HashMap();

map1.put(key, s1);
map2.put(key, s2);

String s = (String) map2.get(key);

Relation forms from the slide:

  Context sensitivity: points-to(vc : VarContext, v : Var, h : Heap)
  Object sensitivity: points-to(vo : Heap, v : Var, h : Heap)
  1-level object sensitivity: points-to(vo1 : Heap, vo2 : Heap, v : Var, ho : Heap, h : Heap)
  1-level object sensitivity + context sensitivity: points-to(vc : VarContext, vo1 : Heap, vo2 : Heap, v : Var, ho : Heap, h : Heap)

Without object sensitivity the analysis concludes both points-to(*, s, h1) and points-to(*, s, h2); with object sensitivity, only points-to(*, s, h2).
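The runtime behavior the object-sensitive analysis is trying to match can be checked directly (`ObjectSensitivityDemo` and `lookup` are illustrative names):

```java
import java.util.HashMap;
import java.util.Map;

// Runtime counterpart of the slide: map1 and map2 are distinct objects, so
// the value read from map2 can only be the one stored in map2 (h2). An
// object-insensitive analysis merges the two maps and also reports h1.
public class ObjectSensitivityDemo {
    public static String lookup() {
        String s1 = "h1";  // stands in for the allocation at site h1
        String s2 = "h2";  // stands in for the allocation at site h2
        Map<String, String> map1 = new HashMap<>();
        Map<String, String> map2 = new HashMap<>();
        String key = "key";
        map1.put(key, s1);
        map2.put(key, s2);
        return map2.get(key);
    }

    public static void main(String[] args) {
        System.out.println(lookup());  // only the value from map2
    }
}
```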


Inlining: Poor Man’s Object Sensitivity

  • Call graph inlining is a practical alternative

    • Inline selected allocations sites

      • Containers: HashMap, Vector, LinkedList,…

      • String factories: String.toLowerCase(), StringBuffer.toString(), ...

    • Generally, gives precise object sensitive results

  • Need to know what to inline: determining that is hard

    • Inlining too much → doesn’t scale

    • Inlining too little → false positives

    • Iterative process

  • Can’t always do inlining

    • Recursion

    • Virtual methods with >1 target


Map Sensitivity

...
String username = request.getParameter("user");
map.put("USER_NAME", username);
...
String query = (String) map.get("SEARCH_QUERY");
stmt.executeQuery(query);
...

  • Maps with constant string keys are common

  • Map sensitivity: augment pointer analysis:

    • Model HashMap.put/get operations specially

“USER_NAME” ≠ “SEARCH_QUERY”
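The difference between a map-insensitive and a map-sensitive model can be sketched as follows (`MapSensitivityDemo` and its two methods are our illustration of the two abstractions, not Griffin code):

```java
import java.util.HashMap;
import java.util.Map;

// Two toy models of a HashMap for taint analysis.
public class MapSensitivityDemo {
    // Map-insensitive model: the whole map is one abstract cell, so if any
    // tainted value was ever put, every get returns tainted.
    public static boolean mapInsensitiveGet(boolean anyValueTainted) {
        return anyValueTainted;
    }

    // Map-sensitive model: one abstract cell per constant string key.
    public static boolean mapSensitiveGet(Map<String, Boolean> taintPerKey, String key) {
        return taintPerKey.getOrDefault(key, Boolean.FALSE);
    }

    public static void main(String[] args) {
        Map<String, Boolean> taintPerKey = new HashMap<>();
        taintPerKey.put("USER_NAME", true);  // tainted username stored
        // "SEARCH_QUERY" was never written with tainted data:
        System.out.println(mapInsensitiveGet(true));                       // false positive
        System.out.println(mapSensitiveGet(taintPerKey, "SEARCH_QUERY"));  // precise
    }
}
```

This is exactly why "USER_NAME" ≠ "SEARCH_QUERY" removes the false positive in the slide's example.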


Analysis Hierarchy

Each analysis dimension forms a hierarchy of increasing precision:

  Context sensitivity: None → k-CFA → ∞-CFA
  Object sensitivity: None → 1-OS → k-OS → ∞-OS
  Map sensitivity: None → constant keys → symbolic analysis of keys
  Flow sensitivity: None → local flow → predicate-sensitive → interprocedural predicate-sensitive

Griffin uses: ∞-CFA, partial 1-OS, constant string keys, local flow.


PQL into Datalog Translation

PQL Query → Datalog Query [Whaley, Avots, Carbin, Lam, APLAS ’05]

query simpleSQLInjection
returns
  object String param, derived;
uses
  object ServletRequest req;
  object Connection con;
  object StringBuffer temp;
matches {
  param = req.getParameter(_);
  temp.append(param);
  derived = temp.toString();
  con.executeQuery(derived);
}

simpleSQLInjection(hparam, hderived) :–
  ret(i1, v1),
  call(c1, i1, "ServletRequest.getParameter"),
  pointsto(c1, v1, hparam),
  actual(i2, v2, 0), actual(i2, v3, 1),
  call(c2, i2, "StringBuffer.append"),
  pointsto(c2, v2, htemp),
  pointsto(c2, v3, hparam),
  actual(i3, v4, 0), ret(i3, v5),
  call(c3, i3, "StringBuffer.toString"),
  pointsto(c3, v4, htemp),
  pointsto(c3, v5, hderived),
  actual(i4, v6, 0), actual(i4, v7, 1),
  call(c4, i4, "Connection.execute"),
  pointsto(c4, v6, hcon),
  pointsto(c4, v7, hderived).

The translation also yields the relevant instrumentation points; the Datalog solver produces the vulnerability warnings.


Eclipse Interface to Analysis Results

  • Vulnerability traces are exported into Eclipse for review

    • source → o1 → o2 → … → on → sink


Importance of a Sound Solution

  • Soundness:

    • only way to provide guarantees on application’s security posture

    • allows us to remove instrumentation points for runtime analysis

  • Soundness claim

Our analysis finds all vulnerabilities in statically analyzed code that are captured by the specification



Static Analysis Extensions

Static

Extensions

Dynamic

Experiments

Conclusions

Future


Towards Completeness

  • Completeness goal:

    • analyze all code that may be executed at runtime

specify roots

discover the rest


Generating a Static Analysis Harness

public class Harness {
  public static void main(String[] args) {
    processServlets();
    processActions();
    processTags();
    processFilters();
  }
  ...
}

<servlet>
  <servlet-name>blojsomcommentapi</servlet-name>
  <servlet-class>org.blojsom.extension.comment.CommentAPIServlet</servlet-class>
  <init-param>
    <param-name>blojsom-configuration</param-name>
  </init-param>
  <init-param>
    <param-name>smtp-server</param-name>
    <param-value>localhost</param-value>
  </init-param>
  <load-on-startup>3</load-on-startup>
</servlet>

(Diagram: several applications, 500,000+ lines of code, each with its own web.xml deployment descriptor, running on the JBoss application server, 1,500,000+ lines of code; 2M+ lines of code in total.)


Reflection Resolution

Reflective calls are resolved using constants and specification points:

String className = ...;
Class c = Class.forName(className);
Object o = c.newInstance();
T t = (T) o;

Q: what object does this create? After resolution, the reflective creation is replaced by the possible concrete allocations:

String className = ...;
Class c = Class.forName(className);
Object o = new T1();
Object o = new T2();
Object o = new T3();
T t = (T) o;

  • [Livshits, Whaley, and Lam, APLAS’05]
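The pattern on the slide is easy to reproduce; when the class name is a constant, as below, the analysis can resolve the call directly (`ReflectionDemo` and `create` are illustrative names):

```java
// Runnable version of the slide's reflective-creation pattern.
public class ReflectionDemo {
    // className is a compile-time constant in the caller below, which is
    // the case constant-based reflection resolution handles directly.
    public static Object create(String className) {
        try {
            Class<?> c = Class.forName(className);
            return c.getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        Object o = create("java.lang.StringBuilder");
        StringBuilder t = (StringBuilder) o;  // the slide's "T t = (T) o"
        t.append("resolved");
        System.out.println(t);
    }
}
```

The cast to `T` is what lets the analysis bound the possible targets even when the string is not a constant: only subtypes of `T` can flow through it.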


Reflection Resolution Results

  • Applied to 6 large Java apps, 190,000 lines combined

(Chart: call graph sizes compared, in methods.)


Dynamic Analysis

Static

Extensions

Dynamic

Experiments

Conclusions

Future


Runtime Vulnerability Prevention

[Martin, Livshits, and Lam, OOPSLA’05]

  • Detect and stop

  • Detect and recover

(Diagram: a vulnerability specification drives instrumentation of the applications deployed on the JBoss application server.)


Runtime Instrumentation Engine

  • PQL spec → state machines

    • Run alongside program, keep track of partial matches

    • Run recovery code before match

(State-machine diagram: edges are labeled with events such as y := x, t.append(x), t = x.toString(), and y := derived(t), plus ε-transitions; states carry partial-match bindings such as {x = o3} and {x = y = o3}; a sanitizer edge discards the partial match.)
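The matching machinery can be sketched as a tiny state machine that advances on observed events and resets on a sanitizer (the `Matcher` class is our illustration, not the actual PQL engine):

```java
// Tiny state-machine sketch of a PQL query running alongside the program:
// each observed event advances a partial match; reaching the final state
// means the match completed and recovery code should run before the sink.
public class Matcher {
    int state = 0; // 0:start 1:param 2:appended 3:derived 4:match

    public void event(String op) {
        switch (state) {
            case 0: if (op.equals("getParameter")) state = 1; break;
            case 1: if (op.equals("append"))       state = 2; break;
            case 2: if (op.equals("toString"))     state = 3; break;
            case 3: if (op.equals("executeQuery")) state = 4; break;
        }
        if (op.equals("sanitize")) state = 0;  // sanitizer discards the partial match
    }

    public boolean matched() { return state == 4; }

    public static void main(String[] args) {
        Matcher m = new Matcher();
        for (String op : new String[]{"getParameter", "append", "toString", "executeQuery"})
            m.event(op);
        System.out.println(m.matched());  // full match: recovery code would fire here
    }
}
```

The real engine tracks one such partial match per combination of bound objects, which is why removing instrumentation points (next slide) matters so much for overhead.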


Reducing Instrumentation Overhead

query simpleSQLInjection
returns
  object String param, derived;
uses
  object ServletRequest req;
  object Connection con;
  object StringBuffer temp;
matches {
  param = req.getParameter(_);
  temp.append(param);
  derived = temp.toString();
  con.executeQuery(derived);
}

  • Instrument events on objects that may be part of a match

  • Soundness allows us to remove instrumentation points

String name = req.getParameter("name");
StringBuffer buf1 = new StringBuffer();
StringBuffer buf2 = new StringBuffer("def");
buf2.append("abc");
buf1.append(name);
con.executeQuery(buf1.toString());
con.executeQuery(buf2.toString());



Experimental Results

Static

Extensions

Dynamic

Experiments

Conclusions

Future


Experimental Evaluation

  • Comprehensive evaluation:

    • SecuriBench Macro [SoftSecTools ’05]

    • SecuriBench Micro

    • Google: SecuriBench

  • Compare Griffin to a commercially available tool

    • Griffin vs. Secure Software CodeAssure

    • CodeAssure: March 2006 version


Benchmark Statistics

(Chart: benchmark sizes, in lines of code.)

Vulnerability Classification

  • Reported issues back to program maintainers

    • Most maintainers responded; most issues were confirmed as exploitable

    • Vulnerability advisories issued



A Study of False Positives in blojsom

Q: How important are analysis features for avoiding false positives?

False positives reported:

  Base: 114
  With context sensitivity: 84
  With object sensitivity: 43
  With map sensitivity: 5
  With sanitizers added: 0


Griffin vs. CodeAssure

Q: What is the relationship between false positives and false negatives?

(Chart: results on SecuriBench Macro, 80+, and SecuriBench Micro, 40+.)


Deep vs. Shallow Vulnerability Traces

Q: How complex are the vulnerabilities we find?


Analyzing personalblog

Q: What is the connectivity between sources and sinks?

(Diagram: sources and sinks in application code are connected through objects in Hibernate library code; sf.hibernate.Session.find(…) acts as a sink.)

1 falsely tainted object → 100+ false positives


Runtime Analysis Results

  • Experimental confirmation

    • Blocked exploits at runtime in our experiments

  • Naïve implementation

    • Instrument every string operation → high overhead

  • Optimized implementation

    • 82-99% of all instrumentation points are removed

Resulting overhead: < 1%



Related Work & Conclusions

Static

Extensions

Dynamic

Experiments

Conclusions

Future


Lessons Learned

  • Context sensitivity is good. Object sensitivity is great, but hard to scale. Scaling it: important open problem

  • Can’t ignore reflection in large programs; reflection makes the call graph much bigger

  • Many of the bugs are pretty shallow; there are, however, complex bugs, especially in library code

  • Practical tools tend to introduce false negatives to avoid false positives; not necessarily a good choice

  • Automatic recovery from vulnerabilities is a very attractive approach; overhead can be reduced


Related Work

  • Web application security work

    • Penetration testing tools (black box testing)

    • Application firewalls (listen on the wire and find patterns)

  • Practical automatic static error detection tools

    • WebSSARI (static and dynamic analysis of PHP) [Huang,... ’04]

    • JDBC checker (analysis of Java strings) [Wasserman, Su ’04]

    • SQLRand (SQL keyword randomization) [Boyd and Keromytis ’04]

[Usenix ’05]

[OOPSLA ’05]

*Source: http://suif.stanford.edu/~livshits/work/griffin/lit.html


Future Work

Applying Model-checking to Web applications (Michael Martin)

Learning Specification from Runtime Histories (with Naeim)

Partitioned BDDs to Scale bddbddb Better (with Jean-Gabriel/Prof. Dill)

Analyze Sources of Imprecision in Datalog

Analyzing Sanitization Routines

Attack Vectors in Library Code

Type Qualifiers in Java (with Dave Greenfieldboyce at UMD)

Using Model Checking to Break Sanitizers


Special Thanks

Stella, my parents, my sister

Monica

Alex, Dan, Dawson, Elizabeth

Ramesh Chandra, Darlene Hadding, David Heine, Michael Martin, Brian Murphy, Joel Sandin, Constantine Sapuntzakis, Chris Unkel, John Whaley, Kolya Zeldovich

Dzintars Avots, Ron Burg, Mark Dilman, Craig Foster, Chris Kaelin, Amit Klein, Ted Kremenek, Iddo Lev, John Mitchell, Carrie Nielsen, David Pecora, Ayal Pincus, Jai Ranganathan, Noam Rinetzky, Mooly Sagiv, Elena Spector, Jeff Ullman, Eran Yahav, Gaylin Yee, Andreas Zeller, Tom Zimmerman

National Science Foundation


The End.

Griffin Security Project http://suif.stanford.edu/~livshits/work/griffin/

Stanford SecuriBench http://suif.stanford.edu/~livshits/securibench/

Stanford SecuriBench Micro http://suif.stanford.edu/~livshits/work/securibench-micro/

PQL language http://pql.sourceforge.net/

  • Finding Security Vulnerabilities in Java Applications with Static Analysis, Livshits and Lam, 2005.

  • Finding Application Errors and Security Flaws Using PQL, Martin, Livshits, and Lam, 2005.

  • Defining a Set of Common Benchmarks for Web Application Security, Livshits, 2005.

  • Reflection Analysis for Java, Livshits, Whaley and Lam, 2005.

  • DynaMine: Finding Common Error Patterns by Mining Software Revision Histories, Livshits and Zimmermann, 2005.

  • Locating Matching Method Calls by Mining Revision History Data, Livshits and Zimmermann, 2005.

  • Turning Eclipse Against Itself: Finding Bugs in Eclipse Code Using Lightweight Static Analysis, 2005.

  • Context-Sensitive Program Analysis as Database Queries, Lam, Whaley, Livshits, Martin, Avots, Carbin, Unkel, 2005.

  • Improving Software Security with a C Pointer Analysis, Avots, Dalton, Livshits, M.S. Lam, 2005.

  • Finding Security Errors in Java Applications Using Lightweight Static Analysis, Livshits, 2004.

  • Tracking Pointers with Path and Context Sensitivity for Bug Detection in C Programs, Livshits and Lam, 2003.

  • Mining Additions of Method Calls in ArgoUML, Zimmerman, Breu, Lindig, and Livshits, 2006.