  1. Lecture 8 Security Evaluation

  2. Contents • Introduction • The Orange Book • TNI-The Trusted Network Interpretation • Information Technology Security Evaluation Criteria • The Common Criteria • Security Analysis

  3. What is an Evaluation? • Independent Verification and Validation (IV&V) by an accredited and competent Trusted Third Party • Provides a basis for international Certification against specific formal standards (i.e. CC) by national authorities

  4. Independent evaluations and assurance techniques produce formal evidence of assurance, giving information asset owners confidence that privacy requirements are properly managed and privacy rights are protected.

  5. The target of the evaluation • Products, e.g., operating systems, which will be used in a variety of applications and therefore have to meet generic security requirements. • Systems, i.e., a collection of products assembled to meet the specific requirements of a given application.

  6. The purpose of the evaluation • Evaluation: assessing whether a product has the security properties claimed for it. • Certification: assessing whether a product is suitable for a given application. • Accreditation: deciding that a product will be used in a given application.

  7. The method of the evaluation • The method must guard against two failure modes: • The product is later found to contain a serious flaw • Different evaluations of the same product disagree in their assessment • Product-oriented evaluation: • Examine and test the product itself. • Different evaluations may give different results. • Process-oriented evaluation: • Look at the documentation and the process of product development. • Easier to achieve repeatable results, but may not be very useful

  8. The structure of the evaluation criteria • Functionality: • The security features of a system, e.g., discretionary access control (DAC), mandatory access control (MAC), authentication, auditing • Effectiveness: • Are the mechanisms used appropriate for the given security requirements? • Assurance: • The thoroughness of the evaluation
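
The difference between discretionary and mandatory access control can be made concrete with a short sketch. The Python example below is illustrative only and not part of the slides; the ACL layout, label names and clearance ordering are assumptions.

```python
# Illustrative sketch (not from the slides): contrasting DAC and MAC decisions.

# DAC: the object's owner decides who may access it, recorded in an ACL.
def dac_allows(acl: dict, subject: str, obj: str, right: str) -> bool:
    """Discretionary check: is the right listed for this subject on this object?"""
    return right in acl.get(obj, {}).get(subject, set())

# MAC: a system-wide policy compares sensitivity labels; here, a simple
# Bell-LaPadula style "no read up" rule over a hypothetical label ordering.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top-secret": 3}

def mac_allows_read(subject_clearance: str, object_label: str) -> bool:
    """Mandatory check: the subject's clearance must dominate the object's label."""
    return LEVELS[subject_clearance] >= LEVELS[object_label]

if __name__ == "__main__":
    acl = {"report.txt": {"alice": {"read", "write"}, "bob": {"read"}}}
    print(dac_allows(acl, "bob", "report.txt", "write"))   # False: not granted by the owner
    print(mac_allows_read("confidential", "secret"))       # False: "no read up"
```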

  9. Organization of the evaluation process • Government agency: backs the evaluation process and issues the certification • Accredited private enterprises: perform evaluations; consistency of evaluations (repeatability and reproducibility) must be enforced

  10. What Do CC Evaluations Give Us? - Benefits • Confidence and trust in the privacy and security characteristics of products and in the processes used to develop and support them (full product life cycle) • Build official assurance arguments • Prove technologies are indeed privacy enhancing as claimed, using formal, independently verifiable and repeatable methods • Provide a basis for international certification • Provide a Certification Report • Differentiate products • Formally support demonstrable due diligence/care

  11. The costs of evaluation • Costs: • Fees paid to the evaluation facility • Time to collect the required evidence • Time and money spent training evaluators • Time and effort spent liaising with the evaluation team

  12. The Orange Book • Trusted Computer System Evaluation Criteria (1985) • A yardstick for users to assess the degree of trust that can be placed in computer security systems • Guidance for manufacturers of computer security systems • A basis for specifying security requirements when acquiring a computer security system

  13. Security and evaluation categories • Security policy: • mandatory and discretionary access control policies expressed in terms of subjects and objects • Marking of objects: • labels specify the sensitivity of objects • Identification of subjects: • individual subjects must be identified and authenticated • Accountability: • audit logs of security-relevant events have to be kept.
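
The accountability category requires audit logs of security-relevant events. As a loose illustration (not part of the slides, with hypothetical field names), such a log can be as simple as append-only, timestamped records of each access decision.

```python
# Illustrative sketch (not from the slides): a minimal audit trail for
# security-relevant events, as required under the accountability category.
import json
import time

def audit(log_path: str, subject: str, obj: str, action: str, granted: bool) -> None:
    """Append one timestamped, security-relevant event to an append-only log."""
    record = {
        "time": time.time(),    # when the event occurred
        "subject": subject,     # the authenticated identity (identification category)
        "object": obj,          # the labeled resource involved (marking category)
        "action": action,       # e.g. "read", "write", "login"
        "granted": granted,     # the access decision that was enforced
    }
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(record) + "\n")

if __name__ == "__main__":
    audit("audit.log", "alice", "payroll.db", "read", True)
    audit("audit.log", "bob", "payroll.db", "write", False)
```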

  14. Security and evaluation categories (Cont’d) • Assurance: • Operational: architecture • Life cycle: design, test, and configuration management • Documentation: • Required by system managers, users and evaluators • Continuous protection: • Security mechanisms cannot be tampered with.

  15. Four security divisions • D: Minimal protection • C: Discretionary protection • B: Mandatory protection • A: Verified protection

  16. Orange Book Ratings • D: Minimal Protection • Evaluated but did not qualify for a higher division • C1: Discretionary Security Protection • Resources protected by ACLs, memory overwrites prevented • C2: Controlled Access Protection • Access control at user level, memory cleared when released • B1: Labeled Security Protection • Users, files, processes, etc., must be labeled • B2: Structured Protection • Big step: covert channels, secure kernel, etc. • B3: Security Domains • Auditing, secure crash recovery • A1: Verified Design • Same functional requirements as B3, with a more rigorous, formally verified design and implementation
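
The C2 requirement to clear memory when it is released is an instance of the object reuse problem: storage must be scrubbed before it is handed to another subject. The sketch below is a toy illustration of that idea only (not part of the slides, and not a real operating system mechanism).

```python
# Illustrative sketch (not from the slides): the C2-style object-reuse idea of
# clearing storage before it is reallocated, shown with a toy buffer pool.
class BufferPool:
    def __init__(self, count: int, size: int) -> None:
        self._free = [bytearray(size) for _ in range(count)]

    def allocate(self) -> bytearray:
        """Hand out a buffer; it was zeroed when it was last released."""
        return self._free.pop()

    def release(self, buf: bytearray) -> None:
        """Scrub residual data so the next subject cannot read the previous contents."""
        for i in range(len(buf)):
            buf[i] = 0
        self._free.append(buf)

if __name__ == "__main__":
    pool = BufferPool(count=2, size=16)
    b = pool.allocate()
    b[:6] = b"secret"
    pool.release(b)
    print(pool.allocate())   # all zeros: the old contents are no longer visible
```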

  17. TNI - The Trusted Network Interpretation - The Red Book • Two kinds of networks: • Networks of independent components, with different jurisdictions, policies, management, etc. • Centralized networks with a single accreditation authority, policy and network trusted computing base • The Red Book only considers the second type. • New concerns: the vulnerability of the communication paths, and the concurrent and asynchronous operation of the network components

  18. Red Book Policy • Security policies deal with secrecy and integrity. • Node names can serve as DAC group identifiers at C1. • Audit trails should log the use of cryptographic keys. • In the Red Book, integrity refers to • the protection of data and labels against unauthorized modification • the correctness of message transmission, and authentication of the source and destination of a message. • Labels indicate whether an object has ever been transmitted between nodes
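
The Red Book's notion of communication integrity (correct message transmission plus authentication of source and destination) is commonly realized with a keyed message authentication code. The following sketch is illustrative only and not part of the original material; it uses Python's standard hmac module with a hypothetical pre-shared key.

```python
# Illustrative sketch (not from the slides): protecting a message in transit with
# an HMAC, so the receiver can detect modification and authenticate the source.
import hashlib
import hmac

SHARED_KEY = b"hypothetical pre-shared key"   # assumed known only to the two nodes

def send(message: bytes) -> tuple[bytes, bytes]:
    """Sender attaches a MAC computed over the message."""
    tag = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    return message, tag

def receive(message: bytes, tag: bytes) -> bool:
    """Receiver recomputes the MAC; a mismatch means modification or a forged source."""
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

if __name__ == "__main__":
    msg, tag = send(b"transfer 100 to account 42")
    print(receive(msg, tag))          # True: unmodified, sent by a key holder
    print(receive(msg + b"0", tag))   # False: tampering is detected
```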

  19. Other Security Services in the Red Book • Services are described by • Functionality • Strength: how well the service is expected to meet its objective • Assurance: derived from theory, testing, software engineering practice, validation and verification • Rating • None • Minimum (C1) • Fair (C2) • Good (B2) • Some services are rated only as "not offered" or "present"

  20. Services in the Red Book • Communication integrity: • Authentication • Communication field integrity • Non-repudiation • Denial of service: • Continuity of operation • Protocol-based protection • Network management • Compromise protection: • Data confidentiality • Traffic confidentiality

  21. Windows NT security rating • Windows NT is only secure for such purposes (e.g., "C2 certified") if you: • run the particular Compaq or Digital hardware models specified by NIST, • run the particular version of Windows NT (3.50) specified by NIST, • remove the floppy drive from the computer, • remove network connectivity, and • configure Windows NT as specified by NIST.

  22. UNIX security rating • A standard Unix system meets only the C1 criterion • it provides only discretionary security protection (DSP) against browsers or non-programmer users • AT&T, Gould and Honeywell made substantial changes to the kernel and file system in order to produce C2-rated Unix operating systems. • These changes sacrifice some of the portability of the Unix system. It is hoped that in the near future a Unix system with an A1 classification will be realized, though not at the expense of losing its valued portability. • http://secinf.net/unix_security/Unix_System_Security_Issues.html

  23. ITSEC: Information Technology Security Evaluation Criteria • Harmonized European criteria (1991) that address: • Effectiveness: how well a system is suited for countering the threats envisaged • Correctness: assurance aspects relating to the development and operation of a system

  24. The Evaluation Process • TOE (Target of Evaluation) • An IT system, part of a system, or product that has been identified as requiring security evaluation; i.e., that which is being evaluated. • The term is particularly relevant to, and part of the standard terminology of, the Common Criteria and ITSEC. • A Security Target (ST) contains the IT security objectives and requirements for a specific target of evaluation, together with the definition of its functional and assurance measures. • http://www.itsecurity.com/papers/border.htm

  25. Security target • All aspects of the TOE that are relevant for evaluation • Security objectives • Statements about the system environment • Assumptions about the TOE environment • Security functions • Rationale for security functions • Required security mechanisms • Required evaluation level • Claimed rating of the minimum strength of mechanisms • http://www.rycombe.com/itsec.htm • Definition (Matt Bishop, 2003): • A security target is a set of security requirements and specifications to be used as the basis for evaluation of an identified product or system.
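
The security target listed above is essentially a structured document. Purely as an illustration (not an official ITSEC or CC format, and with hypothetical field names and values), it can be sketched as a record.

```python
# Illustrative sketch (not from the slides, not an official ITSEC/CC schema):
# the contents of a security target captured as a simple record.
from dataclasses import dataclass, field

@dataclass
class SecurityTarget:
    toe_name: str                          # the product or system under evaluation
    security_objectives: list[str]         # why the functionality is wanted
    environment_assumptions: list[str]     # statements about the TOE environment
    security_functions: list[str]          # what the TOE actually does
    function_rationale: str                # why these functions meet the objectives
    required_mechanisms: list[str] = field(default_factory=list)
    evaluation_level: str = "E3"           # claimed assurance (evaluation) level
    min_mechanism_strength: str = "medium" # claimed minimum strength of mechanisms

if __name__ == "__main__":
    st = SecurityTarget(
        toe_name="Example firewall appliance",
        security_objectives=["Restrict traffic between internal and external networks"],
        environment_assumptions=["Administered by trained, trusted staff"],
        security_functions=["Packet filtering", "Administrator authentication", "Audit"],
        function_rationale="Filtering and audit together counter the identified threats.",
    )
    print(st.toe_name, st.evaluation_level)
```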

  26. Security Functionality • Security functionality description: • Security objectives: • Why is the functionality wanted? • Security functions: • What is actually done? • Security mechanisms: • How is it done?

  27. Security functions in ITSEC • Identification and authentication • Access control • Accountability: record the exercise of rights • Audit: detect and investigate events that might represent threats to security • Object reuse • Accuracy: correctness and consistency of data • Reliability: consistency and availability of service • Data exchange: referring to the International standard ISO 7498-2
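
The first of these functions, identification and authentication, is the gate on which access control and accountability depend. Below is a minimal, illustrative sketch (not from the slides) using salted password hashing; the user names and the in-memory store are hypothetical.

```python
# Illustrative sketch (not from the slides): identification and authentication
# via salted password hashes, the precondition for access control and accountability.
import hashlib
import hmac
import os

USERS: dict[str, tuple[bytes, bytes]] = {}   # name -> (salt, password hash)

def register(name: str, password: str) -> None:
    """Identification: establish a unique identity, storing a verifier rather than the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    USERS[name] = (salt, digest)

def authenticate(name: str, password: str) -> bool:
    """Authentication: prove the claimed identity by reproducing the stored verifier."""
    if name not in USERS:
        return False
    salt, digest = USERS[name]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

if __name__ == "__main__":
    register("alice", "correct horse battery staple")
    print(authenticate("alice", "correct horse battery staple"))   # True
    print(authenticate("alice", "guess"))                          # False
```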

  28. Functionality classes in ITSEC • F1: C1 Discretionary Security Protection • F2: C2 Controlled Access Protection • F3: B1 Labeled Security Protection • F4: B2 Structured Protection • F5: B3 & A1 Security Domains and Verified Design • F6: high integrity • F7: high availability • F8: data integrity during communication • F9: high confidentiality (cryptographic devices) • F10: networks with high demands on confidentiality and integrity

  29. Assurance of Effectiveness • An assessment of effectiveness should examine • Suitability of functionality • Binding of functionality (compatibility) • Strength of mechanism • Ease of use • Assessment of security vulnerabilities within the construction of the TOE, e.g., ways of bypassing or corrupting security enforcing functions • Assessment of security vulnerabilities within the operation of the TOE.

  30. Assurance of Correctness • Seven levels, E0-E6, specify the list of documents that have to be provided by the sponsor and the actions to be performed by the evaluator. • Development process: • Following the stages of a top-down methodology, security requirements, architectural design, detailed design, and implementation are considered. • Development environment: • includes configuration control and, from class E2 upwards, developer security, e.g., the confidentiality of design documents. • Operation: • refers to operational documentation, including delivery, configuration, start-up and operation.

  31. Seven Evaluation Classes • E0: fail • E1: a security target and an informal description of the architectural design of the target • E2: + an informal description of the detailed design, configuration control and a controlled distribution process • E3: + the detailed design and the source code corresponding to the security functions shall be provided • E4: + a formal model of the security policy; a rigorous approach and notation for architectural and detailed design; vulnerability analysis • E5: + close correspondence between detailed design and source code; vulnerability analysis based on the source code • E6: + a formal description of the security architecture of the TOE, consistent with the formal model of the security policy; it must be possible to relate portions of the executable form of the TOE to the source code

  32. Correspondence between Orange Book and ITSEC

  33. Common Criteria ISO 15408 • International ISO standard for IT security that formally specifies IT security requirements and how these are to be independently evaluated and tested, so that products may be formally certified as trustworthy (CC v1 1996; adopted as ISO/IEC 15408 in 1999) • 3-part standard, plus an evaluation methodology • http://www.corsec.com/ccc_faq.php

  34. CC Evaluations Involve: • ANALYSIS • Product Documentation • Product Design (Security & Privacy Focus) • Development Processes & Procedures • Operation & Administration Guidance and Procedures • Vulnerability Assessments • TESTING • Independent & Witnessed • Fully Documented & Repeatable • REPORTS • Lead to International Certification

  35. Types of Common Criteria evaluations • Categories of evaluations: • Protection Profile • Security Target (typically evaluated as the first step of an EAL evaluation) • Evaluation Assurance Levels (EALs)

  36. Scope • Interviews • Full Documentation Review • Independent Testing • Witness of Developer Testing • Observation Reports When Required • Deliverables: • Security/Privacy Target or Protection Profile • Evaluation Technical Report • Certification Report (published by CSE, and recognized by NSA and other Certification Bodies)

  37. Protection Profiles • A Protection Profile (PP) is an implementation-independent statement of security requirements that is shown to address threats that exist in a specified environment. • A PP would be appropriate in the following cases: • A consumer group wishes to specify security requirements for an application type • A government wishes to specify security requirements for a class of security products • An organization wishes to purchase an IT system to address its security requirements • A certified protection profile is one that a recognized Certification Body asserts as having been evaluated by a laboratory competent in the field of IT security evaluation to the requirements of the Common Criteria and Common Methodology for Information Technology Security Evaluation.

  38. The evaluation process • Work not necessarily performed by the CCTL: • Documentation preparation • Writing the Security Target • Other consulting • The evaluation itself must be performed by lab personnel • Steps and the parties involved: • Develop Security Target: vendor, consultant, or lab • Documentation preparation: vendor or consultant • Conduct evaluation: lab, vendor, NIAP CCEVS • Issue certificate: NIAP CCEVS • CCTL: Common Criteria Testing Laboratory • CCEVS: Common Criteria Evaluation and Validation Scheme • The National Information Assurance Partnership (NIAP) is the governing body for all CCTLs in the U.S.

  39. Required evaluation materials • Security Target • TOE (target of evaluation) • Configuration Management documentation • Functionality Specification • High and low level design documentation • User and Administrator’s guides • Life-cycle documentation • Development tool documentation • Security Policy model • Correspondence analyses • Installation and start-up procedures • Delivery procedures

  40. Steps in the evaluation process [figure: steps mapped to evaluation assurance levels (EAL)]

  41. Results of the evaluation process • Outcomes of Common Criteria testing • In the U.S. this follows approval of lab test results • Public posting of the ST, the validation report, and the validation certificate

  42. How the Process Works • Privacy (and security) requirements for a technology and associated claims are precisely specified using the CC • Technology is built, documented and tested to these requirements • Technology is submitted to nationally accredited labs for evaluation against the standards • Evaluation is conducted under the oversight of national authority

  43. Process (Continued) • Once vendor claims are proven, national authority confers certification and publishes a Certification Report • Results are internationally recognized under a Mutual Recognition Arrangement

  44. Evaluation assurance levels (EAL) • To meet the great variation in required levels of security within and between both government and commercial interests, there are seven levels of evaluation (EAL-1 through EAL-7). • Only the first four levels can be evaluated by commercial laboratories.

  45. EAL (Cont’d) • EAL-1 examines the product and its documentation for conformity, establishing that the Target does what its documentation claims. • EAL-2 tests the structure of the product through an evaluation that includes the product’s design history and testing. • EAL-3 evaluates a product at the design stage, with independent verification of the developer’s testing results, and evaluates the developer’s checks for vulnerabilities, the development environment controls, and the Target’s configuration management. • EAL-4 provides an even more in-depth analysis of the development and implementation of the Target and may require more significant security engineering costs.

  46. EAL (Cont’d) • EALs 5-7 require even more formality in the design process and implementation, analysis of the Target’s ability to handle attacks and prevent covert channels, for products in high-risk environments. • In the United States, evaluation to EALs 5-7 must be done by the National Security Agency (NSA) for the U.S. government.

  47. Correspondences

  48. International evaluations history • TCSEC (1985) • Trusted Computer System Evaluation Criteria (U.S.) • ITSEC (1991) • Information Technology Security Evaluation Criteria (Europe) • CTCPEC (1993) • Canadian Trusted Computer Product Evaluation Criteria

  49. International evaluations history (timeline) • TCSEC - U.S. (Orange Book), 1985 • U.K. Confidence Levels, 1989 • German Criteria • French Criteria • ITSEC, 1991 • Federal Criteria Draft, 1993 • Canadian Criteria, 1993 • Common Criteria V1, 1996 • Common Criteria V2, 1998 • www.commoncriteria.org

  50. Common Criteria participating countries • Certificate consuming countries • Austria • Finland • Greece • Israel • Italy • Netherlands • Norway • Spain • Sweden • Certificate producing countries • Australia • New Zealand • Canada • France • Germany • United Kingdom • United States
