SRA // RABA CENTER

SRS II Winter PI Meeting

December 18, 2007


Agenda

  • Background

  • SRA’s Infra-Red Team Approach

  • Rules of Engagement

  • What’s Next?


SRA SRS Phase II Quad Chart


Background

  • RABA was founded in 1994 as a Boutique Technology Company

    • Acquired by SRA International, Inc. in October 2006

  • Related Past Performance:

    • 6 Red Team evaluations on SRS Phase I prototypes

      • AWDRAT (MIT), Dawson (GITI), Genesis (UVa)

      • LRTSS (MIT), QuickSilver (Cornell), Steward (Johns Hopkins University)

    • Red Team review of Maryland's Direct Recording Electronic (DRE) voting system

      • Only one of its kind in the nation

      • Recommendations were adopted, in part, by Maryland, Ohio, and California

    • Trusted contractor to the Intelligence Community for over 10 years designing and performing Red Team exercises for national systems

      • Penetration testing in a classified environment both as government employees and private contractors

      • Hardware and software systems

  • Our Assessment Team:

    • Unique composition for each assessment depending on the technology

    • All TS/SCI-cleared individuals

    • All have extensive experience in US Gov, DoD, and Intel Community

    • All have extensive experience in Information Warfare and Information Operations

    • All have extensive systems / software development experience


SRA’s Infra-Red Team Approach

[Diagram: distribution of effort across the approach (85%, 5%, and 10%)]


Value and Mindset of the Red Team Approach

[Diagram: the existing system’s perspective contrasted with the Red Team’s perspective]


Keys to Success

  • Tailor Each Assessment Completely

    • to the technology under test and with respect to DARPA’s goals

  • Unearth the Assumptions

    • of the designers / developers to clearly describe the current state of the technology

  • Focus

    • on the most important and innovative aspects

    • on the goal of providing a thorough evaluation of the technology

  • Add value

    • to the development through a collaborative relationship with the Blue Team

    • by not gaming the assessment in an effort to rack up points

  • Extensively Study the Technologies

    • In-depth study of performer’s briefings, technical papers, source code, and relevant publications on state-of-the-art


What We Have Done…

  • Reviewed all four programs

    • Performers’ Briefings and related publications provided by the Blue Teams

  • Researched the state-of-the-art in each technology

  • Performed additional research into your specific solutions and innovations

    • Publications by PIs available on the Internet

  • Developed DRAFT RoEs for your review and feedback

    • Delivered 11/30/2007

    • Participated in 3 conference calls and exchanged several emails to resolve disputes


Rules of Engagement

  • Collaborating with Blue / White Teams to produce a mutually-agreeable RoE:

    • Based on lessons learned from prior IPTO Red Teams; there is great benefit in thinking through all the potential issues and scoring early rather than during the exercise itself

  • 6 Specific Areas of Interest (a structural sketch follows this list):

    • Red Team Start Position

    • Allowed and Disallowed Red Team Activities

    • Victory Conditions

    • Events Counted for Score

    • Test Script

  • Common Concerns voiced by Blue Teams:

    • No fair beating my dead horse!

    • That’s outside my scope – you can’t test that!

    • Scored vs. In-Scope vs. Out-of-Scope

    • My system doesn’t fit the metrics exactly, so the scoring diagram doesn’t work for me.
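
A minimal sketch of how these areas of interest might be captured in one structured RoE record per prototype. The field names and example values are illustrative assumptions, not content from the actual RoE documents.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RulesOfEngagement:
    """Illustrative container for the RoE areas of interest listed above."""
    red_team_start_position: str
    allowed_activities: List[str] = field(default_factory=list)
    disallowed_activities: List[str] = field(default_factory=list)
    victory_conditions: List[str] = field(default_factory=list)
    scored_events: List[str] = field(default_factory=list)   # events counted for score
    test_script: List[str] = field(default_factory=list)     # ordered test steps

# Hypothetical example for one prototype (placeholder values, not real RoE content)
roe = RulesOfEngagement(
    red_team_start_position="Authenticated user on one node of the test network",
    allowed_activities=["malformed inputs", "process termination"],
    disallowed_activities=["physical access to hardware"],
    victory_conditions=["undetected corruption of protected state"],
    scored_events=["detection", "effective response", "correct thwart", "attribution"],
    test_script=["baseline run", "attack run", "replay with defenses enabled"],
)
```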


Attack Distribution

  • Uniqueness

    • Concerns regarding finding one attack that works, then performing that attack multiple times to score more points

    • Our goal is not to score points. Our goal is to fully and rigorously evaluate the prototype.

  • Irrelevant Attacks

    • Concerns that an attack may not do anything malicious with respect to the system under test

    • Carefully define each attack so that it affects the system under test in a relevant way (see the sketch below)
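
A minimal sketch of the uniqueness and relevance screening described above, assuming a simple per-attack record format of our own invention: each attack technique is counted once, and attacks with no relevant effect on the system under test are dropped.

```python
# Assumed record format (illustrative only): each candidate attack is tagged with
# its underlying technique and whether it has a relevant effect on the prototype.
attacks = [
    {"technique": "malformed-message", "relevant_effect": True},
    {"technique": "malformed-message", "relevant_effect": True},   # repeat of the same technique
    {"technique": "replay",            "relevant_effect": False},  # no relevant effect; excluded
    {"technique": "byzantine-node",    "relevant_effect": True},
]

# Count each technique once so repeating one working attack cannot inflate the tally.
unique_relevant = {a["technique"] for a in attacks if a["relevant_effect"]}
print(f"{len(unique_relevant)} unique, relevant attack techniques")  # -> 2
```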


Attack Distribution Cont.

  • Issues with Not-So-Random Sampling and % Metrics

    • Concerns that the selected attacks represent not the entire population of attacks, but only the subset that the prototype cannot protect against

    • A biased sample produces biased calculations

  • Validation, Conceded, and Indeterminate Test Sets (partitioning sketched below)

    • Validation set to balance the population somewhat and to verify that the prototype meets the Blue Team’s claims

    • Conceded set to prevent expending resources developing attacks whose outcome is known

    • Indeterminate set to address those attacks whose outcome is not known
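
A minimal sketch of this partitioning, with hypothetical attacks and outcomes: conceded attacks count toward the metrics without being executed, so assessment effort is spent on the validation and indeterminate sets.

```python
# Hypothetical attack population (labels and outcomes are illustrative).
attack_population = [
    {"id": 1, "set": "validation",    "thwarted": True},   # verifies a Blue Team claim
    {"id": 2, "set": "conceded",      "thwarted": True},   # outcome known in advance; not executed
    {"id": 3, "set": "indeterminate", "thwarted": False},  # outcome unknown until executed
    {"id": 4, "set": "indeterminate", "thwarted": True},
]

to_execute = [a for a in attack_population if a["set"] != "conceded"]
thwart_rate = sum(a["thwarted"] for a in attack_population) / len(attack_population)
print(f"{len(to_execute)} attacks to execute, overall thwart rate {thwart_rate:.0%}")  # -> 3, 75%
```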


Scored vs. In-Scope vs. Out-of-Scope


Metrics and Scoring Diagrams

  • Multiple interpretations of the written-word metrics

    • compound sentences and distribution of percentages

  • Various definitions for the key performance indicators

    • detection, effective response, correct thwart, attribution

  • In some cases, only partial compliance with the metrics as stated in the BAA

    • As DARPA’s advocates, we look to verify the prototypes meet the criteria to the satisfaction of the PM

    • Prototypes are somewhat specialized, and Blue Team research is often narrower than what the metrics measure

    • Working to find a compromise that is fair to the Blue Team and still a satisfactory application of the metrics (a hypothetical scoring sketch follows below)
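
A hypothetical scoring sketch for the key performance indicators named above; the per-test records and the 90% target are illustrative assumptions, not the BAA’s actual metrics.

```python
# Illustrative per-test outcomes for one prototype (not real assessment data).
tests = [
    {"detected": True,  "effective_response": True,  "correct_thwart": True,  "attributed": False},
    {"detected": True,  "effective_response": False, "correct_thwart": False, "attributed": False},
    {"detected": False, "effective_response": False, "correct_thwart": False, "attributed": False},
]

for kpi in ("detected", "effective_response", "correct_thwart", "attributed"):
    rate = sum(t[kpi] for t in tests) / len(tests)
    status = "meets" if rate >= 0.90 else "below"   # assumed threshold for illustration
    print(f"{kpi:20s} {rate:4.0%}  ({status} the illustrative 90% target)")
```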


What’s Next?

  • Continue to study your technology

  • Visit your site

    • Get hands-on experience with your prototype (when possible)

    • Engage in white-board discussions

  • Collaborate and communicate attack strategies

    • Until the Go Adversarial Date

  • Obtain the baseline version of your source code

  • Develop the Assessment Plan

  • Perform the Assessments

    • Day 1: Logistics, Set-up and Assessment Plan Review

    • Day 2: Test Execution Event

    • Day 3: Execute Remaining Tests; Give the Assessment Out-Brief (last hour of the event)

  • Analyze the Assessment Results and develop the Assessment Report

  • Close out the program with a Final Report


Questions?

