Software Security Assessment
COEN 225
Code Auditing vs. Black Box Penetration Testing
  • Security audits of software:
    • White box testing
      • Auditors work with source code
        • Manual code inspection
        • Automated tools such as
          • RATS, ITS4, Splint, Flawfinder, Jlint, Codespy
      • +: Complete code coverage is possible
      • -: Complexity:
        • Tools are imperfect and need to be supported by manual review
      • -: Occasional lack of availability of source code
    • Black box testing
      • Auditors provide input to program / service under audit.
      • +: Black box testing is always possible
      • +: Portability
        • Can test several applications with the same test suite
      • +: Simplicity
      • -: Coverage
      • -: Lack of intelligence
Code Auditing vs. Black Box Penetration Testing
  • Black Box Testing
    • Manual Testing
      • E.g.: Provide single quotes to various parameters in a form to probe for an SQL injection or XSS possibility
    • Automated Testing or Fuzzing (a minimal fuzz loop is sketched below)
    • Pros:
      • Availability: Fuzzing is always possible
      • Reproducibility: Fuzzing tests can be ported to similar applications
      • Simplicity: Fuzzing replaces analysis with extensive trials
    • Cons:
      • Coverage: Good coverage usually requires code inspection
      • Intelligence: Fuzzing is unlikely to find complicated attack patterns
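A minimal fuzz loop sketch (illustrative, not from the slides; parse_input() is a hypothetical stand-in for the code under test):

#include <stdlib.h>

/* stub standing in for the real parser under test */
static int parse_input(const char *buf, size_t len) { (void)buf; return (int)len; }

int main(void)
{
    char buf[4096];
    srand(12345);                          /* fixed seed keeps trials reproducible */
    for (int trial = 0; trial < 100000; trial++) {
        size_t len = (size_t)(rand() % sizeof(buf));
        for (size_t i = 0; i < len; i++)
            buf[i] = (char)(rand() % 256);
        parse_input(buf, len);             /* a crash here is a candidate finding */
    }
    return 0;
}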
Code Auditing vs. Black Box Penetration Testing
  • Gray box testing
    • Combines black box testing with some Reverse Engineering (RE)
    • RE is used to identify possible vulnerabilities
    • Manual gray box testing
      • Use IDA Pro or a similar disassembler to generate assembly code from the binary
      • Identify possible vulnerabilities
      • Generate sample input
    • Automated gray box testing
      • A number of tools automate the process:
        • BugScam
        • Inspector
        • Bin Audit
        • LogiScan
        • SecurityReview
Code Auditing vs. Black Box Penetration Testing
  • Gray box testing
    • Pro:
      • Availability: Can always be done
      • Coverage: Better than black box testing
    • Contra:
      • Complexity: Reverse Engineering is very difficult
Code Auditing vs. Black Box Penetration Testing: Example

struct keyval {
    char *key;
    char *value;
};

int handle_query_string(char *query_string)
{
    struct keyval *qstring_values, *ent;
    char buf[1024];

    if (!query_string)
        return 0;

    qstring_values = split_keyvalue_pairs(query_string);

    if ((ent = find_entry(qstring_values, "mode")) != NULL)
    {
        sprintf(buf, "MODE=%s", ent->value);
        putenv(buf);
    }
}

Vulnerability: the programmer assumes that ent->value fits into the buffer, yet ent->value is controlled by input. (A hardened sketch follows below.)
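A hardened sketch of the routine above (it reuses the example's assumed helpers split_keyvalue_pairs() and find_entry(); the bounded snprintf() is the fix):

int handle_query_string_fixed(char *query_string)
{
    struct keyval *qstring_values, *ent;
    char buf[1024];

    if (!query_string)
        return 0;

    qstring_values = split_keyvalue_pairs(query_string);

    if ((ent = find_entry(qstring_values, "mode")) != NULL)
    {
        /* snprintf() truncates instead of overflowing buf */
        snprintf(buf, sizeof(buf), "MODE=%s", ent->value);
        putenv(buf);  /* note: putenv() keeps the pointer; a static or heap buffer would be safer still */
    }
    return 1;
}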

Code Auditing vs. Black Box Penetration Testing: Example
  • Web server behaves differently if the query string contains
    • mode=xxx
      • Places string xxx into buffer
      • buffer can overflow
  • Black box testing will have difficulties finding this possible vulnerability
  • Gray box testing needs to find the if statement
  • Code inspection can find the faulty sprintf statement and check for existence of an actual vulnerability
System Development Life Cycle
  • Feasibility study
  • Requirements definition
  • Design
  • Implementation
  • Integration and Testing
  • Operation and Maintenance
Trust Relationships
  • Trust relationships:
    • Different components in a system place varying degrees of trust in each other.
    • Trust relationships need to be made explicit and investigated.
  • Transitive Nature of Trust
Trust Relationships: Misplaced Trust
  • Misplaced Trust = Making an unfounded assumption
    • Input:
      • Most vulnerabilities are triggered by malicious input
        • Developer assumes that no one enters a 5000-character telephone number (sketched below)
        • Developer assumes that a related software module sanitizes input to the module under development
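A minimal sketch of the telephone-number assumption (my illustration, not from the slides):

#include <stdio.h>

int main(void)
{
    char phone[32];

    /* Unfounded trust: scanf("%s", phone) would write far past
       phone[31] on a 5000-character input. */

    /* Explicit bound: fgets() reads at most 31 characters plus the terminator. */
    if (fgets(phone, sizeof(phone), stdin) == NULL)
        return 1;
    printf("phone: %s", phone);
    return 0;
}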
Trust Relationships: Misplaced Trust
  • Misplaced Trust = Making an unfounded assumption
    • Interfaces:
      • Mechanisms by which software components communicate with each other and the outside world.
      • Developers
        • choose a method of exposing the interface that does not provide enough protection from external attackers,
        • choose a reliable method of exposing the interface, but configure it incorrectly, or
        • assume that the interface is too difficult for an attacker to access.
      • Example:
        • Custom network interface with custom encryption.
        • Attacker needs to reverse engineer a client
Trust Relationships: Misplaced Trust
  • Misplaced Trust = Making an unfounded assumption
    • Environment
      • Software does not run in a vacuum
      • Developer trusts the environment, but the attacker manipulates it
      • Classic example: the /tmp race (sketched below)
        • Application creates a file in /tmp or /var/tmp
        • Attacker creates a symbolic link of the same name while the app is running
        • Application writes to the symbolic link's target instead

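A sketch of the race in code (illustrative; the file name is a placeholder). The vulnerable variant opens a predictable path that an attacker may already have replaced with a symbolic link; mkstemp() creates a fresh file with O_EXCL semantics and avoids the race:

#include <stdlib.h>
#include <unistd.h>

int write_temp_data(const char *data, size_t len)
{
    /* Vulnerable pattern (do not use):
       fd = open("/tmp/app.tmp", O_WRONLY | O_CREAT | O_TRUNC, 0600);
       If /tmp/app.tmp is an attacker-planted symlink, the write lands
       wherever the link points. */

    char path[] = "/tmp/appXXXXXX";
    int fd = mkstemp(path);          /* fails instead of following a planted link */
    if (fd < 0)
        return -1;
    ssize_t n = write(fd, data, len);
    close(fd);
    unlink(path);
    return n < 0 ? -1 : 0;
}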

Trust Relationships: Misplaced Trust
  • Misplaced Trust = Making an unfounded assumption
    • Exceptions
      • Attacker causes an unexpected change in the application's program flow by external measures
      • Example (sketched below):
        • App writes to an (attacker-controlled) pipe
        • Attacker causes the pipe to close just before the write
        • Results in a SIGPIPE signal (on Unix-like systems)
        • App aborts, possibly leaving data incoherent

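A defensive sketch (illustrative, POSIX): ignoring SIGPIPE turns the closed-pipe condition into a checkable EPIPE error instead of an abort, so the app can roll back coherently.

#include <errno.h>
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

int safe_pipe_write(int fd, const void *buf, size_t len)
{
    signal(SIGPIPE, SIG_IGN);            /* a closed pipe no longer kills the process */
    if (write(fd, buf, len) < 0) {
        if (errno == EPIPE)
            fprintf(stderr, "reader went away; rolling back cleanly\n");
        return -1;
    }
    return 0;
}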

Design Review
  • Algorithms
    • E.g.: A sorted list poses a DoS risk if an attacker can cause it to grow beyond reasonable bounds
  • Problem Domain Logic – Business Logic
    • Banking Example:
      • Person can make one monthly transaction with their money market account to or from checking.
      • Can make unlimited transfers to checking account.
      • If checking account is below limit, money is transferred from money market account to checking
Design Review
  • Trust Relationships
    • Trust boundary
      • Reflects limitation of trust between modules
    • Trust Domains
      • Regions of shared trust, limited by trust boundaries
    • Trust Model
      • Abstraction that presents these relationships
Design Review: Trust Relationship
  • Win98 Trust Model
    • Users are not protected from each other
    • If not networked, need to get physical access to machine

[Figure: Win98 trust model. A single trust domain contains the Administrator, User 1, and User 2; an administrative privilege boundary and a physical access boundary separate this domain from the rest of the world.]

Design Review: Examples of Design Flaws
  • Exploiting Strong Coupling
    • Application is not decomposed along trust boundaries
  • Example: Windows Shatter
    • Windows WM_TIMER message handling could enable privilege elevation (12/2002)
      • Interactive processes need to react to user events
      • One mechanism is WM_TIMER, sent at the expiration of a timer
      • Causes the process to execute a timer callback function
      • One process can cause another process to execute a timer callback function (of its choosing), even if the second process did not set a timer.
      • Several processes run with LocalSystem privileges
      • Attacker logs onto the system interactively and executes a program that levies a WM_TIMER request upon a LocalSystem-privileged process, causing it to take any action the attacker specifies (a sketch follows this slide)
    • Fixed by Microsoft in 2003
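A heavily hedged sketch of the message pattern (the window title and callback address are hypothetical placeholders; a real attack also had to place code at that address inside the target):

#include <windows.h>

void shatter_sketch(void)
{
    /* Find a window owned by a LocalSystem process on the same desktop. */
    HWND hwnd = FindWindowA(NULL, "Some Privileged Service");   /* hypothetical title */
    if (hwnd == NULL)
        return;

    /* Before the 2003 fix, WM_TIMER with a non-NULL lParam made the
       receiver call lParam as a timer callback in its own context. */
    LPARAM fake_callback = (LPARAM)0x00401000;                  /* placeholder address */
    PostMessageA(hwnd, WM_TIMER, 1 /* timer id */, fake_callback);
}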
Design Review: Examples of Design Flaws
  • Exploiting transitive trusts
    • Solaris Example:
      • Solaris contains automountd
        • Runs as root
        • Allows root to specify a command as part of a mounting operation
        • Does not listen on an IP network
        • Available only through three protected loopback transports
      • Solaris contains rpc.statd
        • Runs as root
        • Listens on TCP and UDP interfaces
        • Monitors NFS servers to send out notification if they go down
        • Clients tell rpc.statd which host to contact and what RPC program number to call on host
Design Review: Examples of Design Flaws
  • Exploiting transitive trusts
    • Solaris Example continued:
      • Attacker registers local automountd with rpc.statd
      • Attacker tells rpc.statd that NFS server has crashed
      • rpc.statd contacts automountd daemon through loopback device
      • automountd trusts message since it comes from root through loopback device and carries out a command of the attacker’s choice.
        • Some work needed to make request a valid automountd request.
Design Review: Examples of Design Flaws
  • Failure Handling
    • User friendly:
      • Recovers from problem
      • Generates assistance in solving problems
    • Security conscious:
      • Assumes that failure conditions are the result of an attack
      • Closes down the app without explanation
Design Review: Examples of Design Flaws
  • Authentication
    • Lack of authentication
      • Attacker can get access to a (presumably) private interface between modules in an app
        • Example: Web site authenticates on a main page, but then does not re-check when links from the main site are followed.
    • Untrustworthy credentials
      • Versions of sadmind shipped with weak default authentication (1999)
      • Use of the source IP address as a credential
Design Review: Examples of Design Flaws
  • Authorization
    • Omitting authorization checks
    • Allowing users to set up authorization themselves
Design Review: Examples of Design Flaws

  • Accountability
    • Logging Failure
      • Example: log of login attempts
      • The user name allows newlines

200801091536 Logon Failure: Bob
200801091539 Logon Success: Alice
200801091547 Logout: Alice

      • An attacker supplies the user name "Alice\n200801091559 Logon Success: Alice"; the failed attempt then forges a success entry (a sanitizing sketch follows):

200801091536 Logon Failure: Bob
200801091539 Logon Success: Alice
200801091559 Logon Failure: Alice
200801091559 Logon Success: Alice
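One mitigation sketch (illustrative): neutralize line breaks in user-controlled fields before they reach the log.

#include <stdio.h>

void log_logon_failure(FILE *log, const char *timestamp, const char *user)
{
    fprintf(log, "%s Logon Failure: ", timestamp);
    for (const char *p = user; *p; p++) {
        if (*p == '\n' || *p == '\r')
            fputc('?', log);             /* line breaks cannot forge new entries */
        else
            fputc(*p, log);
    }
    fputc('\n', log);
}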

Design Review: Examples of Design Flaws
  • Confidentiality & Integrity
    • Obfuscation instead of encryption
    • Insecure Use of Encryption
      • Example: XOR-encryption
    • Storing Sensitive Data Unnecessarily
      • Example: Storing a password
        • Instead store (1-way) salted hash of password
          • Without salt, can use rainbow tables to crack password
    • Bait & Switch Attacks
      • Example: Using an insecure hash (MD5, SHA1) to validate
        • Application signs hash of request
        • If hash is insecure, can generate request with the same hash
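A small sketch of the XOR weakness (my illustration): one known plaintext/ciphertext pair reveals the repeating key directly.

#include <stdio.h>

int main(void)
{
    const unsigned char key[] = { 0x13, 0x37 };      /* the "secret" repeating key */
    const char plain[] = "ADMIN";                    /* header known to the attacker */
    unsigned char cipher[sizeof(plain)];

    for (size_t i = 0; plain[i]; i++)
        cipher[i] = (unsigned char)plain[i] ^ key[i % 2];

    /* Attacker XORs known plaintext with observed ciphertext: */
    printf("recovered key: %02x %02x\n",
           cipher[0] ^ (unsigned char)plain[0],
           cipher[1] ^ (unsigned char)plain[1]);
    return 0;
}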
Design Review: Threat Modeling
  • Michael Howard and David LeBlanc: Writing Secure Code, Microsoft Press, 2002
  • Frank Swiderski and Window Snyder: Threat Modeling, Microsoft Press, 2004
Design Review: Threat Modeling
  • Process during design phase, updated in later development phases
    • Information Collection
    • Application Architecture Modeling
    • Threat Identification
    • Documentation of Findings
    • Prioritizing of Implementation Review
Design Review: Threat Modeling: Information Collection
  • Goal: Get understanding of application
    • Assets:
      • What has value for an attacker?
    • Entry points:
      • Path through which an attacker can access the system.
    • External entities:
      • Those that communicate with process through the entry points
    • External trust levels
    • Major components
    • Use scenarios
Design Review: Threat Modeling: Information Collection
  • Developer Interviews
    • Keep in mind:
      • Developers have put a lot of effort into their work
      • Avoid any judgmental or condescending overtones
  • Developer Documentation
    • Often incomplete
    • Often no longer representative of implementation
  • Standards Documentation
  • Source Profiling (Not Source Inspection)
Design Review: Threat Modeling: Information Collection
  • System Profiling
    • Requires access to a functioning installation
    • Approaches:
      • File system layout
      • Code reuse
      • Imports and Exports
      • Sandboxing
        • Determine all objects touched and all activities performed
        • Use sniffer and application proxies to record any network activity
        • Use tools such as FileMon, RegMon, WinObj, Process Explorer
      • Scanning
Design Review: Threat Modeling: Application Architecture Modeling
  • Create Data Flow Diagrams

[Figure: top-level DFD. The User sends http and https requests to the Web Application and receives http and https answers; the Web Application sends database queries to the Database and receives database responses.]
Design Review: Threat Modeling: Application Architecture Modeling
  • DFD level-0 diagram of the login process

[Figure: the User sends a login to the Login process and receives a login status; the Login process exchanges database queries and responses with the Database. Beyond the authenticated user boundary, the User sends operational requests to Authenticated Operations, which exchanges database queries and responses with the Database and returns operational responses.]
Design Review: Threat Modeling: Application Architecture Modeling

[Figure: login flow. The User submits a login request; Check for HTTPS redirects to https if necessary and accepts the https connection. Look-up user queries the Database for the user's password salt; an invalid salt value leads to Access denied (login failed). If the salt is valid, Check password queries the Database for the username with the salted password; an invalid password leads to Access denied, while a returned user record means the login is accepted.]
Design Review: Threat Modeling: Application Architecture Modeling

[Figure: the same login flow without a separate Access denied element: Look-up user reports an invalid user name directly, and Check password reports an invalid password or accepts the login when the user record is returned.]
Design Review: Threat Identification
  • Process of determining an application’s security exposure
  • Uses attack trees
Design Review: Threat Identification

1. Adversary gains access to a user's personal information
   1.1. Gain direct access to the database
        1.1.1. Exploit a hole in system application or kernel
   1.2. Login as target user
        1.2.1. Brute force login
               1.2.1.1. Identify user name
               1.2.1.2. Identify user password
        1.2.2. Steal user credentials
   1.3. Hijack user session
        1.3.1. Steal user session cookie
   1.4. Passively intercept personal data
        1.4.1. Identify user connection initiation
        1.4.2. Sniff network traffic for personal data
Design Review: Threat Identification
  • Textual representation:

1. Adversary gains access to a user's personal information
   OR 1.1 Gain direct access to the database
          1.1.1 Exploit a hole in system application or kernel
      1.2 Log in as target user
          OR 1.2.1 Brute-force login
                 AND 1.2.1.1 Identify user name
                     1.2.1.2 Identify user password
             1.2.2 Steal user credentials
      1.3 Hijack user session
          1.3.1 Steal user session cookie
      1.4 Passively intercept personal data
          AND 1.4.1 Identify user connection initiation
              1.4.2 Sniff network traffic for personal data
Design Review: Threat Mitigation
  • Adorn attack tree with threat mitigation measures
  • Dashed lines indicate improbable attack vectors
Design Review: Threat Mitigation

[Figure: the attack tree from the preceding slides, adorned with mitigation measures: "System patches up to date" on 1.1.1 (exploit a hole in system application or kernel), and "https required" on the session hijacking branch (1.3.1, steal user session cookie) and the passive interception branch (1.4.2, sniff network traffic for personal data). Improbable attack vectors are drawn with dashed lines.]
Design Review: Documentation of Findings
  • Threat summary structure:
    • Threat: Brute-force login
    • Affected component: Web application login
    • Description: Clients can brute-force usernames and passwords by repeatedly connecting and attempting to log in. This threat is increased because the application returns different error messages for invalid usernames and passwords, making usernames easier to guess.
    • Result: Untrusted clients can gain access to a user account and therefore read or modify sensitive information.
    • Mitigation Strategies: Make error messages ambiguous so that an attacker does not know whether the username or the password is invalid. Lock the account after repeated failed login attempts.
Design Review: DREAD Risk Ratings

Brute force login
  • Damage potential: 6
  • Reproducibility: 8
  • Exploitability: 4
  • Affected users: 5
  • Discoverability: 8
  • Overall: 6.2
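With equal weights, the overall rating is the mean of the five component scores: (6 + 8 + 4 + 5 + 8) / 5 = 6.2.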
Operational Review
  • Operational vulnerabilities
    • result from application’s configuration
    • result from deployment environment
  • Operational vulnerabilities can result from
    • configuration options
    • failure to use platform security mechanisms properly
    • insecure deployment
    • insecure base platform
  • Hence, responsibility is split between developers and administrative personnel
Operational Review
  • Attack surface reductions
    • Minimizing attack surface = Hardening platform
    • Get rid of unnecessary services
      • Use virtualization
    • Example:
      • IIS HTR vulnerabilities
        • HTR scripting technology was not widely used, having been supplanted by ASP
        • Yet IIS enabled the HTR service by default
        • 1999-2002: a number of HTR vulnerabilities were found
  • Insecure Defaults
    • In order to make installation simple
    • Pay attention to
      • Application’s default settings
      • Platform’s default settings
Operational Review
  • Access Control
    • Externally, the application depends completely on the access control of the host OS or platform
      • Example:
        • Python on Windows installed in C:\Python25
        • By default, Windows grants write permission on any direct subdirectory of the C: drive
        • python.exe uses msvcr71.dll
        • An attacker can plant a rogue msvcr71.dll in the Python25 directory
        • python.exe will load msvcr71.dll from its own directory by preference
Operational Review
  • Secure Channel Vulnerabilities:
    • Simply not using a secure channel
      • Example: Web site sends session cookie in the clear
        • Acceptable for web-based email
        • Not acceptable for banking
  • Spoofing and Identification
    • Trusting TCP or UDP source addresses
  • Network profiles
    • NFS or Server Message Block (SMB) are acceptable inside a firewall, but not outside it
Operational Review
  • HTTP request methods:
    • Question honoring TRACE, OPTIONS, and CONNECT requests
      • OPTIONS: lists the methods the server accepts
      • TRACE: echoes the request body back to the client
        • Worry about cross-site scripting attacks
      • CONNECT: provides a way for proxies to establish SSL connections
  • Directory Indexing
Operational Review
  • Protective Measures
    • Stack protection
      • Non-executable stack
      • Canaries
    • Address space layout randomization
    • Registered function pointers
      • Long-lived function pointer calls are wrapped by protective checks
    • Virtual machines
Operational Review
  • Host-based measures
    • Object and file system permissions
    • Restricted accounts
    • Chroot jails
    • System virtualization
      • One virtual system per service
    • Enhanced kernel protection
      • SELinux, Core Force
    • Host-based firewalls
    • Antimalware applications
    • File and object change monitors
    • Host-based intrusion detection systems
Operational Review
  • Network-based measures
    • Network address translation
    • Virtual private networks
    • Network Intrusion Detection Systems (NIDS)
Application Review Process
  • Process Outline
    • Preassessment
      • Planning and scoping an application review
      • Setting up a time line
    • Application Review
    • Documentation and Analysis
    • Remediation Support
Application Review Process
  • Application Access can be
    • Source only
    • Binary only
    • Both binary and source
    • Checked build
      • Binary with additional debug information
    • Strict black box
Application Review Process
  • Application Review
    • It is natural to follow the waterfall model, starting with design
    • However, a design review needs a thorough understanding of the code, which only comes with exposure
    • Hence, postpone the design review
Application Review Process
  • Methodology is constrained by the code reviewer's ability to concentrate
    • Application review process
      • Initial preparation
        • Without documentation, derive the design from the implementation:
          • top-down, bottom-up, or hybrid
      • Iterate through 2-3 hr cycles:
        • Plan step
        • Work
        • Reflect
        • Break
      • Documentation, Analysis, and Recommendations
Application Review Process: Code Navigation
  • Described in terms of
    • External flow sensitivity
      • Control flow vs. data flow
    • Tracing direction
      • Forward vs. backward
Application Review Process: Code Navigation

int foo(int c) {
    if (c == 4) bar(c);
    if (c == 72) fubar();
    for (; c; c--)
        updateGlobalState();
}

  • Ignoring external control flow and data flow: read the code from top to bottom.
  • Control flow sensitive: start at the top, then inspect bar, fubar, and updateGlobalState as they are called.
  • Data flow sensitive: start at the top, then follow into bar because it receives c; do not follow fubar and updateGlobalState because they do nothing with c.
  • Control and data flow sensitive: you would also have some idea of the range of c; for example, if c is always larger than 40, you would not bother following bar.

Application Review Process: Code Auditing Strategies
  • Three basic categories of code auditing strategies:
    • Code Comprehension
      • Analyze source code directly
        • to discover vulnerabilities
        • to improve understanding of code
    • Candidate Point Strategies
      • Create list of potential issues
      • Examine source code to determine relevance of issues
    • Design Generalization Strategies
      • Analyze potential medium- to high-level logic and design flaws
Application Review Process: Code Comprehension Strategies
  • CC1: Tracing malicious input
    • Start at entry point to the system
    • Trace flow of code forward, while performing limited data flow analysis
      • Basically, keep a set of "malicious inputs" in the back of your mind as you read the code
      • Focus effort on any type of behavior that fits into a vulnerability class that you know

Strengths:
  • Inherent focus on security-relevant code
  • Can sometimes identify subtle flaws
  • Difficult to go off track
Weaknesses:
  • Code and data paths balloon up quickly
  • Easy to overlook issues
  • Requires focus and experience

Application Review Process: Code Comprehension Strategies
  • CC2: Analyze a module
    • Read a single source file from top to bottom
    • Look at each function in a vacuum and document potential issues

Strengths:
  • You learn the way the application is programmed
  • Easier to analyze cohesive modules
  • Can find subtle and abstract flaws
Weaknesses:
  • Mentally taxing
  • Easy to mismanage time
  • Constant documentation requires discipline

Application Review Process: Code Comprehension Strategies
  • CC3: Analyze an algorithm
    • Look at the algorithm and identify any possible weakness in the design of the algorithm
    • Pick security relevant algorithms
      • Crypto
      • Security model enforcement
      • Input processing

Strengths:
  • You cannot go off track
  • Can find subtle and abstract flaws
Weaknesses:
  • Mentally taxing
  • Lacks context

Application Review Process: Code Comprehension Strategies
  • CC4: Analyze a class / object
    • Focus on class instead of module (CC2)

Strengths:
  • Less likely to go off track than for module analysis
Weaknesses:
  • Mentally taxing
  • Might lack context
  • More likely to go off track than algorithm analysis

Application Review Process: Code Comprehension Strategies
  • CC5: Trace black box hits
    • Start out with a list of possible problems obtained by fuzzing or manual black box testing
      • Problems: program crashes or the program displays useful information
    • Trace input to vulnerabilities

Strengths:
  • Traces some form of known issue
  • Easy to stay on track
  • Simple
Weaknesses:
  • Ignores many other issues
  • May have to deal with many false positives

Application Review Process: Candidate Point Strategies
  • CP1: General Candidate Point Approach
    • Start with low level routines that grant access to application assets or that could harbor vulnerabilities
    • Trace backward to the code to see whether these routines expose any vulnerabilities accessible from an application entry point

Strengths:
  • Good coverage of known vulnerability classes
  • Not too difficult
  • Hard to get off track
Weaknesses:
  • Biases towards a limited set of potential issues
  • Comprehensive impact is much lower than with code comprehension strategies
  • The results are only as good as your candidate points

Application Review Process: CP1 Example
  • Assume a tool reports: util.c line 143: sprintf() used on a stack buffer
  • You cannot determine whether this bug is exploitable
    • unless you can control either argument
    • and can get them to be long enough to overflow the buffer
  • Need to check all calls to the function

int construct_email(char *name, char *domain)
{
    char buf[1024];

    sprintf(buf, "%s@%s", name, domain);
    ...
}

Application Review Process: Candidate Point Strategies
  • CP2: Automated source analysis tool
    • Early source analysis tools were simply lexical analyzers
      • Search for patterns matching potentially vulnerable source code
    • Newer systems do a more extensive analysis job
      • Helpful in identifying candidate points
      • Offer some level of analysis to speed up manual review

Strengths:
  • Good coverage for easily identified vulnerabilities
  • Not too difficult
  • Hard to get off track
Weaknesses:
  • Biases towards confirming only a limited set of potential issues
  • Comprehensive impact is much lower than with code comprehension strategies
  • The results are only as good as your tool

Application Review Process: Candidate Point Strategies
  • CP3: Simple lexical candidate points
    • A wide range of vulnerabilities can be identified easily
      • SQL injection
      • format string vulnerabilities
    • Simple tools can find all instances of a certain vulnerability class (a minimal scanner is sketched after this slide)
    • Eliminate from this list everything that cannot be a vulnerability, based on whether a module handles any potentially malicious input
    • After paring down, use the candidate point strategy

Strengths:
  • Good coverage for known vulnerability classes
  • Not too difficult
  • Hard to get off track
Weaknesses:
  • Confirms only a limited set of issues
  • Does not lead to comprehension
  • Search results depend on the pattern used
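A minimal illustration of such a lexical scanner (a toy, not a substitute for the tools named earlier):

#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    const char *patterns[] = { "sprintf(", "strcpy(", "gets(", "system(" };
    char line[4096];
    int lineno = 0;
    FILE *f;

    if (argc != 2 || (f = fopen(argv[1], "r")) == NULL) {
        fprintf(stderr, "usage: %s file.c\n", argv[0]);
        return 1;
    }
    while (fgets(line, sizeof(line), f)) {
        lineno++;
        for (size_t i = 0; i < sizeof(patterns) / sizeof(patterns[0]); i++)
            if (strstr(line, patterns[i]))
                printf("%s:%d: candidate point: %s\n", argv[1], lineno, patterns[i]);
    }
    fclose(f);
    return 0;
}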

Application Review Process: Candidate Point Strategies
  • CP4: Simple binary candidate points
    • Certain classes of vulnerabilities can also be found in binary code
      • e.g., sign-extension vulnerabilities, by looking for the MOVSX instruction (a C-level sketch follows this slide)
    • Trace backward from these candidates

Strengths:
  • Good coverage for known vulnerability classes
  • Not too difficult
  • Hard to go off track
Weaknesses:
  • Confirms only a limited set of issues
  • Does not lead to comprehension
  • Search results depend on the pattern used
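A C-level sketch of the sign-extension bug class that the MOVSX search targets (function and field names are hypothetical):

#include <string.h>

void handle_packet(const unsigned char *pkt)
{
    char buf[64];
    signed char len = (signed char)pkt[0];   /* compilers load this with MOVSX */

    /* len is promoted to int: 0x80..0xFF become negative, so the check passes */
    if (len < (int)sizeof(buf))
        /* the negative length converts to a huge size_t: massive overflow */
        memcpy(buf, pkt + 1, (size_t)len);
}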

Application Review Process: Candidate Point Strategies
  • CP5: Black box generated candidate points
    • Use fuzzing or manual black boxing to find issues
    • Trace them back to user-malleable input

Strengths:
  • Good coverage for known vulnerability classes
  • Not too difficult, after training and depending on the trace
  • Hard to go off track
Weaknesses:
  • Confirms only a limited set of issues
  • Does not lead to comprehension
  • Results depend on the tool

Application Review Process: Example

movzx ecx, word ptr [eax+0Ah]
dec ecx
mov edx, ecx
shr ecx, 2
lea edi, [eax+19h]
rep movsd
mov ecx, edx
and ecx, 3
rep movsb
pop edi
pop esi

  • A huge copy occurs if we control the short integer at [eax+0Ah] and set it to zero.
  • The dec ecx then results in an integer underflow.
  • The crash would occur on the rep movsd instruction (which gets the number of moves from the ecx register).
  • Once the crash is identified, we need to figure out where [eax+0Ah] is populated.

Application Review Process: Example
  • A crash is analyzed by:
    • Finding the instruction where the program crashed
    • Examining why it crashed there:
      • Invalid source operand?
      • Invalid destination written to?
      • Index into memory too large?
      • Loop counter not a sane value?
    • Working backward to determine where the invalid operand came from
    • Connecting the invalid operand with some data fed to the program at the fuzz-tested entry point, and determining the part of the data that causes the exception to occur
Application Review Process: Example
  • Dealing with faults where the application seems to crash at a random location:
    • Usually memory corruption, where the corrupted portion is not used immediately
  • In our example:
    • You determined that [eax+0Ah] was set to 10h at initialization and never changed
    • But obviously it now contains 0.
    • Two possibilities:
      • Memory corruption in the structure to which eax points
      • Corruption of another buffer on the heap has overwritten the structure eax points at.
    • In the first case: fuzzing with same input should result in an identical crash
    • In the second case: application might crash somewhere else or not at all
Application Review Process: Candidate Point Strategies
  • CP6: Application specific candidate points
    • Sometimes after working with an application, you find recurring vulnerability patterns
    • Search for the resulting patterns – the application specific candidate points

Strengths:
  • Good balance of speed and depth of coverage
  • Not too difficult
  • Hard to go off track
Weaknesses:
  • Requires a thorough understanding of the code base
  • Deals only with a limited set of issues

Application Review Process: Design Generalization Strategies
  • DG1: Model the system
    • Start with implementation
    • Reverse engineer design
    • Limit yourself to security-critical components

Strengths:
  • Most effective method for identifying logic and design vulnerabilities
  • Can identify even some operational vulnerabilities
  • Provides detailed knowledge of the application's design and architecture
Weaknesses:
  • Requires a thorough understanding of the system implementation
  • Requires focus and experience
  • Can be extremely time consuming

Application Review Process: Design Generalization Strategies
  • DG2: Hypothesis testing
    • Determine the design of smaller elements of code by making a hypothesis and testing it
    • If you are correct, you have reverse engineered a part of the application and can investigate the consequences
    • If not, you probably understand the code element better and can make a better guess

Strengths:
  • Faster method to identify issues with the design
  • Helps build a good understanding of the design
  • Well suited to identifying more complex and subtle issues
Weaknesses:
  • Easy to go off track
  • Poor assumptions can derail the design analysis
  • Mentally taxing

Application Review Process: Design Generalization Strategies
  • DG3: Deriving purpose and function
    • Try to directly identify the abstraction that the implementation represents
    • Pick key programmatic elements and summarize them
    • Should lead to a good understanding of the programmatic idioms responsible for the components of the trust model
    • Derive and identify design and architectural issues

Strengths:
  • Focuses on areas known to be security relevant
  • Builds a good understanding of the application design and architecture
  • Builds a good understanding of individual design aspects
Weaknesses:
  • Poor assumptions can derail later elements of the review
  • Mentally taxing

Application Review Process: Design Generalization Strategies
  • DG4: Design conformity check
    • Focuses on vulnerabilities arising from differences between implementation and design
      • Design is typically underspecified
      • Implementation can also just deviate from design
    • Method tries to find “policy breaches”
      • These are then analyzed for security consequences

Strengths:
  • Hard to go off track
  • Provides a good balance between implementation and design understanding
  • Much easier than deriving function without a design
Weaknesses:
  • Misinterpretation of the design could result in overlooking vulnerabilities
  • Quality of result depends on the quality of the original design

Application Review Process: Code Auditing Tactics
  • Purpose:
    • Make errors, such as skipping a line of code, less likely
  • A set of simple tricks
Application Review Process: Code Auditing Tactics
  • Internal Flow Analysis
    • Follow all control and data flows in a given module
    • Overlooked, but potentially relevant code:
      • Error-checking branches
      • Pathological code paths:
        • Functions with many small and non-terminating branches
  • Example:

char *ReadString(int fd, int maxlength)
{
    int length;
    char *data;

    if (read_integer(fd, &length) < 0)
        return NULL;
    data = (char *) malloc(length + 1);
    if (data == NULL)
        return NULL;
    if (read(fd, data, length) < 0)
    {
        free(data);
        return NULL;
    }
    data[length] = '\0';
    return data;
}

If read_integer fails, the effective path is:

    read_integer(fd, &length);
    return NULL;

Not very exciting. If read fails, the effective path is:

    read_integer(fd, &length);
    data = (char *) malloc(length + 1);
    read(fd, data, length);
    free(data);
    return NULL;

There is a major difference in how a failure is handled between these two function calls.

Application Review Process: Code Auditing Tactics
  • Subsystem and dependency analysis
    • Often, security sensitive code is spread over several modules
  • Rereading code
    • With different emphasis and vulnerability class targets
Application Review Process: Code Auditing Tactics
  • Desk-Checking
    • Create a table of all variables in a code fragment
    • Populate them with initial values
    • Follow execution by stepping through each line of code (a worked example follows)
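A small worked desk-check (my own example, not from the slides) of a fragment that swaps two variables without a temporary:

int a = 3, b = 8;
a = a + b;   /* line 1 */
b = a - b;   /* line 2 */
a = a - b;   /* line 3 */

Line    a    b
init    3    8
1       11   8
2       11   3
3       8    3

Each row records the variable values after the line executes, which is exactly the table a desk-check maintains.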
Application Review Process: Code Auditing Tactics
  • Test Cases
    • For a program or a small isolated part of code
  • Implemented by:
    • Writing software to interact with the program and provide input
    • Entering values manually into a program using a debugger
    • Entering values manually using desk-checking
  • Choosing test values:
    • Boundary values (examples follow below)
    • Several inputs quickly lead to too many cases; counter this with:
      • Constraint establishment
        • The practice of verifying that certain test values cannot reach the program fragment
      • Extraneous input thinning
        • Eliminate test inputs that are not a concern
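As an illustration (my own, building on the earlier ReadString() example), boundary test values for a length field might include:

#include <limits.h>

static const int length_test_values[] = {
    0,                  /* empty string: malloc(1), data[0] = '\0' */
    1,                  /* smallest non-empty case */
    1023, 1024, 1025,   /* around a typical internal buffer size */
    INT_MAX,            /* length + 1 overflows */
    -1,                 /* malloc(0), then read(fd, data, -1) */
};

Constraint establishment would then prune any values that a preceding protocol check provably rejects.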