Overview: Cyber Defense Technology Experimental Research (DETER) Testbed. Terry V. Benzel and C. Neuman, Information Sciences Institute, University of Southern California. DETER Team: USC-ISI (Terry Benzel, Bob Braden, Dongho Kim, Cliff Neuman); UC Berkeley (Eric Fraser, Anthony Joseph, Keith Sklower).


Presentation Transcript


  1. Overview: Cyber Defense Technology Experimental Research (DETER) Testbed • Terry V. Benzel, C. Neuman • Information Sciences Institute, University of Southern California

  2. DETER Team • USC – ISI • Terry Benzel, Bob Braden, Dongho Kim, Cliff Neuman • UC Berkeley • Eric Fraser, Anthony Joseph, Keith Sklower • Sparta • Ron Ostrenga, Steve Schwab

  3. AGENDA • 8:30 Welcome: Dr. Joseph Evans, NSF, and Dr. Douglas Maughan, DHS • 8:45 DETER/EMIST Overview: Terry Benzel, USC-ISI; George Kesidis, Penn State • 9:30 DETER Workbench: Bob Braden, USC-ISI • ESVT: A Toolkit Facilitating Use of DETER: Peng Liu, Penn State • [10:00 – 4:00] Parallel hands-on demos and experiments in conference room • 10:00 Using Source Models to Evaluate Enterprise Worm Defenses: Nick Weaver, Vern Paxson, Scott Crosby, ICSI • 10:45 Break • 11:00 Requirements and Tools for Routing Experiments: Sandy Murphy, Sparta; S. Felix Wu, UC Davis • 11:45 Trace-Driven Modeling of Internet-Scale BGP Attacks and Countermeasures: Patrick McDaniel, Penn State • 12:15 Lunch

  4. AGENDA (continued) • 1:15 Evaluation of Worm Defense Systems Through Experimentation: Karl Levitt, Jeff Rowe, UC Davis; Phil Porras, SRI • 2:30 DDoS Defense Experiment Methodology -- Impact of Traffic Generation Selection on Precision of Detection and Response Characterization: Stephen Schwab, Sparta Inc. • 3:15 Break • 3:30 Methodology and Tools for High-Fidelity Emulation of DDoS Attacks: Sonia Fahmy, Purdue University • 4:15 Cyber Early Warning System (CEWAS): Abhrajit Ghosh, Rajesh Talpade, Sudha Ramesh, Telcordia • 4:45 General Discussion and Q&A • 5:00 Adjourn

  5. Project Background • DETER and EMIST are two companion efforts • George Kesidis will overview EMIST • Period of performance: Sept. 03 – Aug. 06 • Funded by NSF and DHS HSARPA • PMs: Joe Evans (NSF) and Doug Maughan (DHS) • Operating as one unified project

  6. DETER+EMIST Vision • ... to provide the scientific knowledge required to enable the development of solutions to cyber security problems of national importance • Through the creation of an experimental infrastructure network -- networks, tools, methodologies, and supporting processes -- to support national-scale experimentation on research and advanced development of security technologies.

  7. DETER Testbed Goals • Facilitate scientific experimentation • Establish baseline for validation of new approaches • Provide a safe platform for experimental approaches that involve breaking network infrastructure • Create researcher- and vendor-neutral environment • Provide access for wide community of users

  8. Experiment Methodology and Security Workbench • Experimenters select from a palette of predefined elements: topology, background and attack traffic, and packet capture and instrumentation • Our methodology frames standard, systematic questions that guide an experimenter in selecting and combining the right elements • Experiment automation increases repeatability and efficiency by integrating the process with the DETER testbed environment • [Diagram: topology, traffic, attack, and data-capture palettes feed into methodology & guidance and experiment automation]
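The palette-and-composition idea on this slide can be sketched in a few lines of Python. This is purely illustrative: the palette contents, class names, and `select`/`is_complete` methods are invented for the sketch and are not the actual DETER workbench API.

```python
from dataclasses import dataclass, field

# Hypothetical palettes of predefined experiment elements; the element
# names are invented for illustration.
PALETTES = {
    "topology": {"dumbbell", "tree", "isp-like"},
    "traffic": {"web-background", "dns-background"},
    "attack": {"syn-flood", "worm-scan"},
    "capture": {"pcap-all", "netflow"},
}

@dataclass
class Experiment:
    elements: dict = field(default_factory=dict)

    def select(self, palette: str, element: str) -> "Experiment":
        """Pick one predefined element from a palette, rejecting unknown choices."""
        if element not in PALETTES.get(palette, set()):
            raise ValueError(f"{element!r} is not in the {palette!r} palette")
        self.elements[palette] = element
        return self

    def is_complete(self) -> bool:
        """A runnable experiment combines one element from every palette."""
        return set(self.elements) == set(PALETTES)

exp = (Experiment()
       .select("topology", "dumbbell")
       .select("traffic", "web-background")
       .select("attack", "syn-flood")
       .select("capture", "pcap-all"))
print(exp.is_complete())  # True
```

Restricting selection to predefined, validated elements is what makes the methodology repeatable: two experimenters choosing the same palette entries get the same configuration.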

  9. DETER Architectural Plan • Construct homogeneous emulation clusters based upon University of Utah’s Emulab • Implement network services – DNS, BGP • Add containment, security, and usability features to the software • Add (controlled) hardware heterogeneity • Specialized devices – Routers, IDP, IPS, black boxes

  10. DETER Experimental Network Based on Emulab • Cluster of N (~160) nearly identical experimental nodes, interconnected dynamically into arbitrary topologies using VLAN switches • [Diagram: pool of N processors (160 PCs), each with 4 x 1000bT data ports, connected through a programmable patch panel (VLAN switch) with a switch control interface]
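The "programmable patch panel" can be modeled as assigning each experiment link its own VLAN containing the node interfaces on that link. The following sketch is a toy model under stated assumptions: the `PatchPanel` class, VLAN numbering, and (node, port) naming are all invented here, not Emulab's actual switch-management interface.

```python
from itertools import count

class PatchPanel:
    """Toy model of a VLAN switch wiring nodes into arbitrary topologies."""

    def __init__(self, first_vlan=100):
        self._next_vlan = count(first_vlan)   # hypothetical VLAN ID range
        self.vlans = {}                        # vlan_id -> set of (node, port)

    def wire_link(self, endpoints):
        """Create one experiment link by placing its endpoints in a fresh VLAN."""
        vlan = next(self._next_vlan)
        self.vlans[vlan] = set(endpoints)
        return vlan

panel = PatchPanel()
# A dumbbell topology: two LANs joined through a router node
# that has one interface on each side.
left = panel.wire_link({("pc1", 0), ("pc2", 0), ("router", 0)})
right = panel.wire_link({("pc3", 0), ("router", 1)})
print(sorted(panel.vlans))  # [100, 101]
```

Because links are just VLAN memberships, tearing down one experiment and wiring up another is a matter of reprogramming the switch, with no physical recabling.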

  11. Example DETER Topologies

  12. Testbed Software • Begin with Utah’s Emulab software. • Add containment, security, and usability features • Collaborate with Utah on new development • User access: Web interface [Emulab] to define, request, control an experiment. • Encrypted tunnels across Internet (SSL/SSH/IPsec) • No direct IP path into experimental network.

  13. DETER Testbed Cluster Architecture • [Diagram: users reach the cluster over the Internet through a router with firewall; an Ethernet bridge with firewall fronts the control side; a 'User' server holds user files, accounts, and data logging, and a Master server runs web/DB/SNMP and switch management; Boss, Users, Control Hardware, and External VLANs sit on the Control Network VLAN; node serial-line servers and a power controller manage the 160 PCs (139 control ports), whose 139 x 4 @1000bT data ports connect through the switch control interface to the programmable patch panel (VLAN switch)]

  14. Interconnecting Clusters • Two clusters: USC-ISI and UCB • One control site (ISI) • One user entry point for accounts and control • Connection: CENIC CalREN-HPR • VLAN switches interconnected using IPsec tunnels • The two clusters form one pool of nodes to be allocated • Users can control whether an experiment spans multiple clusters
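The "one pool of nodes, user controls spanning" policy amounts to a small allocation decision. The sketch below is a hypothetical illustration of that decision only; the function name, the `allow_span` flag, and the free-node counts are assumptions, not DETER's actual allocator.

```python
def allocate(pool, n, allow_span=True):
    """Return a cluster->count assignment for n nodes, or None if impossible.

    pool: dict mapping cluster name -> free node count.
    If allow_span is False, all n nodes must come from a single cluster.
    """
    if not allow_span:
        for cluster, free in pool.items():
            if free >= n:
                return {cluster: n}
        return None
    if sum(pool.values()) < n:
        return None
    assignment, needed = {}, n
    for cluster, free in pool.items():
        take = min(free, needed)
        if take:
            assignment[cluster] = take
            needed -= take
    return assignment

# Illustrative free-node counts, not the real cluster sizes.
pool = {"ISI": 30, "UCB": 20}
print(allocate(pool, 40))                    # spans both clusters
print(allocate(pool, 40, allow_span=False))  # None: no single cluster has 40
```

Keeping the experiment inside one cluster avoids sending experiment traffic over the inter-site IPsec tunnels, at the cost of a smaller usable pool.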

  15. [Diagram: users reach the ISI cluster over the Internet through a firewall; ISI hosts the 'Boss', 'User' (user files), and download servers; each cluster has its own control network, node serial-line servers, power controllers, and PCs; the ISI (Cisco and Nortel) and UCB (Foundry and Nortel) switch trunks are joined across CENIC by IPsec tunnels]

  16. Hardware Status and Plan • ISI: Juniper IDP-200 (8 x 1 Gbps), 2 x 64 x Dell pc3000, 64 x IBM pc733, 11 x Sun pc2800, 80 x Dell ?, Juniper M7i (5 x 4 x 1 Gbps), Nortel 5510 and Cisco 6509 switches at 1 Gbps (4 later), CloudShield 2200 (~150 Mbps with IPsec, 1 x 4 x 1 Gbps) • UCB: McAfee IntruShield 2600 (2 x 2 x 1 Gbps), 64 x Dell ?, 40 x HP, 32 x Dell bpc3000, 30 x Sun bpc2800, Nortel 5510 and Foundry 1500 switches at 1 Gbps (4 later)

  17. Handling Scary Code • Objective: variable-safety testbed • Adaptable to the threat level of the experiment • Supports shared, remote experimenter access for low-threat code; varying degrees of isolation • Research question: can we design DETER to safely handle the entire range of threats, or will really scary stuff have to run in some other isolated containment facility? • [Diagram: usability/security spectrum running from Emulab through DETER to an isolated containment facility]

  18. DETER Testbed Security is Critical • Defenses employed by the testbed must balance the requirements of containment, isolation, and confidentiality with the need for remote management of experiments. • Experiments will be categorized according to the consequences of loss of containment, and procedures applied according to that categorization.

  19. Achieving Security • Operational • Procedures for proposing and reviewing experiments • Guidelines for categorizing the safety of experiments • Vetting of investigators and experiments • Procedures used by investigators • Technical • Firewall, routing, intrusion detection, and network isolation techniques • Data protection, system protection, and state destruction techniques

  20. Experiment Safety Panel • Experiment description provided by investigator: • Identify containment, isolation, confidentiality, and other security considerations. • Panel assesses proposed category: • Determines safety category, level of isolation required • Assesses if isolation can be maintained • Imposes technical measures to assure isolation requirements are met.

  21. Security Consideration Matrix • Management requirements • M1 -- Disconnected from Internet • M2 -- Self-contained batch mode acceptable • M3 -- Remote submission and collection • M4 -- Remote real-time monitoring and control of parameters • M5 -- Full remote control needed for interactive experimentation with high bandwidth needs • Threat from experimental agents • A1 -- Attack traffic trace analysis • A2 -- Simulations that generate attack traffic • A3 -- DDoS attacks and tools in circulation • A4 -- Previously released viruses and worms (still in circulation, eradicated, or defenses deployed) • A5 -- Current/new viruses, worms, DDoS, etc. (moderate to severe threat; defenses not broadly deployed) • Sensitivity of experiment data • S1 -- Results eventually to be open • S2 -- Results commercially sensitive • S3 -- Results extremely commercially sensitive -- disclosure gives away proprietary approaches • S4 -- Results extremely safety sensitive -- instructive to those trying to breach security • Impact of the experiment • I1 -- Minimal traffic generation or performance degradation • I2 -- High but bounded traffic • I3 -- Flooding of a single link or site • I4 -- Floods the network and severely degrades performance
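The slide defines the four category axes but not how they combine into a containment decision. A minimal sketch of one plausible mapping follows; the thresholds and level names are invented for illustration, and the real safety panel applies judgment rather than a fixed formula.

```python
def isolation_level(m, a, s, i):
    """Map a (management, agent-threat, sensitivity, impact) tuple to a
    required isolation level.

    Each argument is the numeric category from the matrix, e.g. a=5 for A5.
    The thresholds below are hypothetical, not DETER policy.
    """
    worst = max(a, s, i)
    if worst >= 5:
        return "physical"       # e.g. A5: novel malware, defenses not deployed
    if worst >= 3 or m <= 2:
        return "firewalled"     # e.g. A3/I3, or disconnected/batch experiments
    return "virtual"            # low-threat, remotely driven experiments

# Trace analysis with full remote control: low threat, virtual separation.
print(isolation_level(m=5, a=1, s=1, i=1))  # 'virtual'
# A current worm, even with modest data sensitivity: physical separation.
print(isolation_level(m=3, a=5, s=2, i=2))  # 'physical'
```

The key property is monotonicity: raising any of the threat, sensitivity, or impact categories can only tighten the required isolation, never loosen it.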

  22. Containment Mechanisms • Physical separation • Transfers in and out only when experiments are not running, perhaps via physical media • Virtual separation • VPNs and network overlays allow secure connectivity over the Internet • Firewalled separation • Connectivity may be reduced while experiments are running • Decontamination procedures • After an experiment runs • When data is removed from the testbed

  23. DETER Experimenters Community • The DETER testbed is itself an important research and engineering problem • Do not expect a ready-made "turnkey" platform • Expect to become active participants in the community of security researchers • Help shape the development of the DETER testbed

  24. Access to Testbed • Open to community – request via email: deterinfo@isi.edu • Important addresses: • www.isi.edu/deter • www.isi.deterlab.net • http://emist.ist.psu.edu • www.emulab.net • Hiring – email tbenzel@isi.edu
