Presentation Transcript


  1. CSCE 727 Awareness and Training Secure System Development and Monitoring

2. Reading
• Reading for this lecture:
  • Denning: Chapter 14
• Recommended:
  • Rainbow Series Library, http://www.fas.org/irp/nsa/rainbow.htm
  • Common Criteria, http://www.commoncriteriaportal.org/

3. System Certification

4. Building It Secure
• 1960s: US Department of Defense (DoD) recognizes the risk posed by unsecured information systems
• 1970s:
  • 1977: DoD Computer Security Initiative
  • US Government and private concerns
  • National Bureau of Standards (NBS – now NIST)
    • Responsible for standards for acquisition and use of federal computing systems
    • Federal Information Processing Standards (FIPS PUBs)

5. NBS
• Two initiatives for security:
  • Cryptography standards
    • 1973: invitation for technical proposals for ciphers
    • 1977: Data Encryption Standard
    • 2001: Advanced Encryption Standard (NIST)
  • Development and evaluation processes for secure systems
    • Conferences and workshops
    • Involve researchers, constructors, vendors, software developers, and users
• 1979: MITRE Corporation entrusted to produce an initial set of criteria for evaluating the security of a system handling classified data

6. National Computer Security Center
• 1981: National Computer Security Center (NCSC) established within the NSA:
  • To provide technical support and reference for government agencies
  • To define a set of criteria for the evaluation and assessment of security
  • To encourage and perform research in the field of security
  • To develop verification and testing tools
  • To increase security awareness in both the federal and private sectors
• 1985: Trusted Computer System Evaluation Criteria (TCSEC), known as the Orange Book

7. Orange Book
• Orange Book objectives:
  • Guidance on what security features to build into new products
  • A metric for evaluating the security of systems
  • A basis for specifying security requirements
• Security features and assurances
• Trusted Computing Base (TCB): the security-relevant components of the system (hardware, software, and firmware) plus the reference monitor (sketched below)
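The reference monitor concept is easier to see in code. Below is a minimal, illustrative Python sketch (not from the slides or the Orange Book) of a monitor that mediates every access request against a policy and records each decision for audit; all class and function names are invented for the example.

```python
# Minimal reference-monitor sketch: every access request is mediated by a
# single, small check against the policy, and every decision is audited.
# Names (Policy, ReferenceMonitor, etc.) are illustrative only.

class Policy:
    def __init__(self):
        # (subject, object, right) triples that are explicitly allowed
        self.allowed = set()

    def permit(self, subject, obj, right):
        self.allowed.add((subject, obj, right))

    def allows(self, subject, obj, right):
        return (subject, obj, right) in self.allowed


class ReferenceMonitor:
    def __init__(self, policy):
        self.policy = policy
        self.audit_log = []          # accountability: record every decision

    def access(self, subject, obj, right):
        decision = self.policy.allows(subject, obj, right)
        self.audit_log.append((subject, obj, right, decision))
        if not decision:
            raise PermissionError(f"{subject} may not {right} {obj}")
        return True


policy = Policy()
policy.permit("alice", "report.txt", "read")
rm = ReferenceMonitor(policy)
rm.access("alice", "report.txt", "read")      # permitted and logged
# rm.access("bob", "report.txt", "read")      # would raise PermissionError
```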

8. Orange Book Support
• Users: evaluation metrics to assess how reliably a security system protects classified or sensitive information, whether it is:
  • A commercial product
  • An internally developed system
• Developers/vendors: a design guide showing the security features to be included in commercial systems
• Designers: a guide for the specification of security requirements

9. Orange Book
• Set of criteria and requirements
• Three main categories:
  • Security policy – protection level offered by the system
  • Accountability – of the users and user operations
  • Assurance – of the reliability of the system

10. Security Policy
• Concerns the definition of the policy regulating users' access to information
• Discretionary Access Control
• Mandatory Access Control
  • Labels: for objects and subjects (see the label-dominance sketch below)
• Reuse of objects: basic storage elements must be cleaned before being released to a new user
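To make labels and object reuse concrete, here is a small, hypothetical Python sketch of a Bell-LaPadula-style dominance check (the usual model behind mandatory access control labels) and of scrubbing a storage element before it is released to a new user; the level names and helper functions are invented for illustration.

```python
# Illustrative MAC sketch: a label is (classification level, set of categories).
# A subject may read an object only if the subject's label dominates the
# object's label (simple-security property, Bell-LaPadula style).
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def dominates(subject_label, object_label):
    s_level, s_cats = subject_label
    o_level, o_cats = object_label
    return LEVELS[s_level] >= LEVELS[o_level] and o_cats <= s_cats

def can_read(subject_label, object_label):
    return dominates(subject_label, object_label)

# Object reuse: scrub a storage element before handing it to a new user.
def release_for_reuse(buffer: bytearray) -> bytearray:
    for i in range(len(buffer)):
        buffer[i] = 0                 # overwrite residual data
    return buffer

alice = ("SECRET", {"NATO"})
report = ("CONFIDENTIAL", {"NATO"})
print(can_read(alice, report))        # True: SECRET/{NATO} dominates CONFIDENTIAL/{NATO}
print(can_read(report, alice))        # False: CONFIDENTIAL does not dominate SECRET
buf = release_for_reuse(bytearray(b"old secret data"))   # now all zero bytes
```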

11. Accountability
• Identification/authentication
• Audit (see the sketch below)
• Trusted path: assures users that they are communicating directly with the TCB (e.g., via a secure attention key), so that no one can access the system fraudulently
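A minimal sketch, assuming a toy in-memory password store, of how identification/authentication and auditing fit together; the hashing choice and helper names are illustrative only, not a mechanism prescribed by the TCSEC.

```python
import hashlib
import hmac
from datetime import datetime, timezone

# Illustrative accountability sketch: authenticate a user and record
# every attempt, successful or not, in an append-only audit trail.
USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}
AUDIT_LOG = []

def authenticate(user: str, password: str) -> bool:
    stored = USERS.get(user)
    supplied = hashlib.sha256(password.encode()).hexdigest()
    ok = stored is not None and hmac.compare_digest(stored, supplied)
    AUDIT_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "event": "login",
        "success": ok,
    })
    return ok

authenticate("alice", "correct horse")   # recorded as a successful login
authenticate("mallory", "guess")         # recorded as a failed attempt
print(AUDIT_LOG)
```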

12. Assurance
• Reliable hardware/software/firmware components that can be evaluated separately
• Operation reliability
• Development reliability

13. Operation reliability
• Reliability during system operation:
  • System architecture: TCB isolated from user processes; security kernel isolated from non-security-critical portions of the TCB
  • System integrity: correct operation (verified with diagnostic software)
  • Covert channel analysis (illustrated below)
  • Trusted facility management: separation of duties
  • Trusted recovery: security features restored after TCB failures
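To show what covert channel analysis is looking for, here is a small, self-contained Python simulation (invented for this note) of a storage covert channel: a high-level party leaks bits by toggling a shared, legitimately observable resource that a low-level party polls.

```python
# Simulated storage covert channel: the "high" party never sends data
# directly, but modulates an observable shared resource (here, a flag
# standing in for file existence or a lock) that the "low" party can poll.
# Purely illustrative; real analysis also measures the channel's bandwidth.

shared_resource_present = False          # stands in for "file exists"

def high_send_bit(bit: int) -> None:
    global shared_resource_present
    shared_resource_present = bool(bit)  # create or remove the resource

def low_observe_bit() -> int:
    return int(shared_resource_present)  # legitimately observable state

secret = [1, 0, 1, 1, 0, 0, 1, 0]        # a byte leaked one bit at a time
received = []
for bit in secret:
    high_send_bit(bit)                   # high process encodes a bit
    received.append(low_observe_bit())   # low process decodes it

assert received == secret
print("leaked bits:", received)
```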

14. Development reliability
• Reliability of the system during the development process; relies on formal methods
• System testing: security features tested and verified
• Design specification and verification: design and implementation are correct with respect to the security policy; the TCB's formal specifications are proved
• Configuration management: control of the configuration of the system components and its documentation
• Trusted distribution: no unauthorized modifications during distribution

15. Documentation
• Defined set of documents
• Minimal set:
  • Trusted facility manual
  • Security features user’s guide
  • Test documentation
  • Design documentation
• Personnel covered: operators, users, developers, maintainers

16. Orange Book Levels
Highest security
• A1 Verified Protection
• B3 Security Domains
• B2 Structured Protection
• B1 Labeled Security Protection
• C2 Controlled Access Protection
• C1 Discretionary Security Protection
• D Minimal Protection
No security

17. Orange Book
• C1, C2: simple enhancement of existing systems; does not break applications
• B1: relatively simple enhancement of existing systems; may break some applications
• B2: major enhancement of existing systems; will break many applications
• B3: essentially a "failed A1": an A1-style design without the full formal verification
• A1: top-down design and implementation of a new system from scratch

18. NCSC Rainbow Series
• Orange: Trusted Computer System Evaluation Criteria
• Yellow: Guidance for applying the Orange Book
• Red: Trusted Network Interpretation
• Lavender: Trusted Database Interpretation

19. Evaluation Process
• Preliminary technical review (PTR)
  • Preliminary technical report: assesses whether the architecture can reach the target rating
• Vendor assistance phase (VAP)
  • Review of the documentation needed for the evaluation process, e.g., security features user’s guide, trusted facility manual, design documentation, test plan; for B or higher, additional documentation is needed, e.g., covert channel analysis, formal model
• Design analysis phase (DAP)
  • Initial product assessment report (IPAR): 100-200 pages of detailed information about the hardware, software architecture, security-relevant features, team assessments, etc.
  • Technical Review Board
  • Recommendation to the NCSC

20. Evaluation Process
• Formal evaluation phase (FEP)
  • Product Bulletin: formal and public announcement
  • Final Evaluation Report: information from the IPAR and testing results, additional tests, code review (B2 and up), formal policy model, proof
  • Recommends a rating for the system
  • National Computer Security Center (NCSC) decides the final rating
• Rating maintenance phase (RAMP)
  • Minor changes and revisions
  • Reevaluation
  • Rating maintenance plan

21. European Criteria
• German Information Security Agency: German Green Book (1988)
• British Department of Trade and Industry and Ministry of Defence: several volumes of criteria
• Canada, Australia, France: own work on evaluation criteria
• 1991: Information Technology Security Evaluation Criteria (ITSEC)
  • For the European Community
  • Decoupled features from assurance
  • Introduced new functionality requirement classes
  • Accommodated commercial security requirements

22. Common Criteria
• January 1996: Common Criteria
  • Joint work with Canada and Europe
  • Separates functionality from assurance
• Nine classes of functionality: audit, communications, user data protection, identification and authentication, privacy, protection of trusted functions, resource utilization, establishing user sessions, and trusted path
• Seven classes of assurance: configuration management, delivery and operation, development, guidance documents, life cycle support, tests, and vulnerability assessment

23. Common Criteria
• Evaluation Assurance Levels (EAL), from lowest to highest assurance:
  • EAL1: functionally tested
  • EAL2: structurally tested
  • EAL3: methodologically tested and checked
  • EAL4: methodologically designed, tested, and reviewed
  • EAL5: semi-formally designed and tested
  • EAL6: semi-formally verified design and tested
  • EAL7: formally verified design and tested

24. National Information Assurance Partnership (NIAP)
• 1997: National Institute of Standards and Technology (NIST), National Security Agency (NSA), and industry
• Aims to improve the efficiency of evaluation
• Transfers methodologies and techniques to private-sector laboratories
• Functions: developing tests, test methods, and tools for evaluating and improving security products; developing protection profiles and associated tests; establishing a formal, international scheme for the Common Criteria (CC)

25. National Security Issues
Interesting reads:
• B. Baer Arnold, "Cyber war in Ukraine – business as usual for the Russian bear," Homeland Security News Wire, March 13, 2014, http://www.homelandsecuritynewswire.com/dr20140313-cyber-war-in-ukraine-business-as-usual-for-the-russian-bear
• Roger C. Molander, Peter A. Wilson, B. David Mussington, Richard Mesic, "What is Strategic Information Warfare?", RAND, 1996, http://www.rand.org/content/dam/rand/pubs/monograph_reports/2005/MR661.pdf

26. National Security and IW
• U.S. agencies responsible for national security rely on a large, complex information infrastructure
• 1990: Defense Information Infrastructure (DOD) supports:
  • Critical war-fighting functions
  • Peacetime defense planning
  • Information for logistical support
  • Defense support organizations
• Proper functioning of the information infrastructure is essential: the "digitized battlefield"

27. National Security and IW
• Increased reliance on the information infrastructure
• Heavily connected to commercial infrastructure
  • 95% of DOD's unclassified communication travels over public networks
• No boundaries, cost effectiveness, ambiguity

28. National Security and IW
• Vital human services
  • Law enforcement
  • Firefighters
  • Emergency telephone system
  • Federal Emergency Management Agency
• Other government services and public utilities
  • Financial sector
  • Transportation
  • Communications
  • Power
  • Health system

29. Information Warfare
• Persian Gulf War: first "information war"
• After the war:
  • U.S. concern about its own vulnerability to IW
  • "Strategic" level of information warfare
  • No clear understanding of objectives, actors, and types of activities
• What is IW? Debated by academia, the national security community, the intelligence community, etc.

30. Strategic Warfare
• Cold War: "single class of weapons delivered at a specific range" (Rattray)
  • E.g., use of nuclear weapons with intercontinental range
• Current: "variety of means … can create 'strategic' effects, independent of considerations of distance and range"
• Center of gravity: those characteristics, capabilities, or sources of power from which a military force derives its freedom of action, physical strength, or will to fight (DOD)

31. Strategic IW
"…means for state and non-state actors to achieve objectives through digital attacks on an adversary’s center of gravity." (Rattray)

32. SIW Operating Environment
• Man-made environment
• Increased reliance on information infrastructure → new center of gravity

33. Strategic Warfare vs. SIW
• Similar challenges
• Historical observation: centers of gravity are difficult to damage because of
  • Resistance
  • Adaptation

34. Dimensions of Strategic Analysis
• Threads:
  • Need to engage multiple related means to achieve desired results
  • Interaction with an opponent capable of independent action
• Distinction between:
  • "Grand strategy": achievement of the political object of the war (includes economic strength, manpower, financial pressure, etc.)
  • "Military strategy": gaining the object of the war (via battles as means)

35. Waging Strategic Warfare
• Creates new battlefields and realms of conflict
• Requires identification of the center of gravity
• WWI:
  • German submarines: strangling the U.K. economy
  • Airplanes: tactical use for reconnaissance and artillery spotting; strategic use from 1915, when German zeppelins struck cities in England

36. Strategic Air Power
• Targets the center of gravity
• WWI:
  • Deliver devastating strikes
  • Civilian morale
• WWII:
  • U.S. strikes against German economic targets
  • Massive bombing campaigns
    • Crushing civilian morale
    • Paralyzing the economy
• Problems:
  • Difficulty of achieving general industrial collapse
  • Grossly overestimated damage

37. Other Weapons – Cold War
• Military capacity as a means to achieve political leverage through strategic attacks:
  • E.g., nuclear weapons, ballistic missiles, satellite capability, WMD
• Massive retaliation
• Ability to use is limited, e.g., the 1956 Soviet invasion of Hungary

38. SW – Past
• Focused on offensive actions
• Largely ignored:
  • Interaction between adversaries → difficult to determine the utility of offensive action
  • Defense capabilities, vulnerabilities, and commitment

39. Necessary Conditions for SW
• Offensive freedom of action
• Significant vulnerability to attack
• Prospects for effective retaliation and escalation are minimized
• Vulnerabilities can be identified and targeted, and damage can be assessed

40. SIW
• Growing reliance → new target of concern
• Commercial networks used for crucial functions
• Rapid change
• Widely available tools
• Significant uncertainties:
  • Determining political consequences
  • Predicting damage, including cascading effects

41. SIW
• Complexity and openness:
  • Weakness
  • Strength
• Difficult to distinguish offensive from defensive
• Public information:
  • Vulnerabilities
  • Incentives

42. Next class: Midterm exam
