Presentation Transcript


  1. Computer Security: The Security Kernel

  2. The Security Kernel Layers of an IT system: • Applications • Services • Operating System • OS kernel • Hardware

  3. OS integrity Orange Book Glossary – DoD Trusted Computer System Evaluation Criteria (TCSEC) • Reference monitor • An abstract machine that mediates all accesses to objects by subjects. • Security Kernel • The hardware, firmware & software that implement the reference monitor. • Trusted computing base (TCB) • The protection mechanisms within a computer system (hardware, firmware & software) that enforce the security policy.

  4. OS integrity Generic security policies • Users should not be able to modify the operating system. • Users should be able to use (invoke) the operating system. • Users should not be able to misuse the operating system. To achieve these goals, two mechanisms are used: • status information • controlled invocation (restricted privilege)

  5. OS integrity Modes of operation The OS should be able to distinguish between computations • in supervisor (system) mode: on behalf of the OS • in user mode: on behalf of the users. This prevents users from writing directly to memory and corrupting the logical file structure. If a user wants to execute an operation requiring supervisor mode, then the processor has to switch modes – this process is called controlled invocation.

  6. OS integrity Controlled invocation Example A user wants to execute an operation requiring supervisor mode, e.g., write to a memory location. To deal with this, the processor has to switch between modes, but this is a problem. Simply changing the status bit to supervisor mode would give the user all privileges associated with this mode, without any control over what the user actually does.

  7. OS integrity Controlled invocation Example – continued It is therefore desirable that the system performs only a certain predefined set of operations in supervisor mode and then returns to user mode before handing back control to the user. We refer to this as controlled invocation.
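As a minimal sketch of this pattern, the following C fragment models a hypothetical one-bit status register, a hypothetical privileged operation write_physical_memory, and a controlled-invocation wrapper syscall_write_word; the names and the mode-switch mechanics are illustrative assumptions, not taken from any real processor.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical status register with a single supervisor-mode bit. */
#define STATUS_SUPERVISOR 0x1u
static unsigned status_register = 0;             /* start in user mode */

static bool in_supervisor_mode(void) {
    return (status_register & STATUS_SUPERVISOR) != 0;
}

/* A privileged operation: refused unless the mode bit is set. */
static int write_physical_memory(unsigned addr, unsigned value) {
    if (!in_supervisor_mode()) {
        fprintf(stderr, "denied: user-mode write to 0x%x\n", addr);
        return -1;
    }
    printf("wrote 0x%x to 0x%x\n", value, addr); /* stands in for the real write */
    return 0;
}

/* Controlled invocation: enter supervisor mode, perform exactly one
 * predefined operation, and drop back to user mode before handing
 * control back to the caller. */
static int syscall_write_word(unsigned addr, unsigned value) {
    status_register |= STATUS_SUPERVISOR;        /* switch to supervisor mode */
    int rc = write_physical_memory(addr, value); /* the one allowed action */
    status_register &= ~STATUS_SUPERVISOR;       /* switch back before returning */
    return rc;
}

int main(void) {
    write_physical_memory(0x1000, 42);           /* direct call: denied */
    syscall_write_word(0x1000, 42);              /* via controlled invocation: allowed */
    return 0;
}
```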

  8. OS integrity Hardware security features Diagram: a schematic description of a computer – CPU, bus, memory.

  9. Computer architecture • The Central Processing Unit • The Arithmetic Logic Unit • Registers • General purpose • Dedicated • Program counter • Stack pointer • Status register

  10. Computer architecture • Memory structure • Random Access Memory • Security concerns: integrity, confidentiality • Read-Only Memory • Security concerns: confidentiality • Erasable & Programmable ROM • Security concerns: more sophisticated attacks • Write-once ROM • Security: good for recording audit trails, storing crypto keys, etc.

  11. Computer architecture Processes and Threads A process is a program in execution. It consists of: • executable code • data • the execution context. A process works in its own address space and can communicate with other processes only through the primitives provided by the OS. The logical separation between processes is a useful basis for security. On the other hand, a context switch between processes is an expensive operation as the OS has to save the whole execution context on the stack.

  12. Computer architecture Processes and Threads Threads are strands of execution within a process. As threads share an address space they avoid the overhead of a full context switch, but they also avoid control by potential security mechanisms.

  13. Computer architecture Controlled Invocation – interrupts Processors are equipped to deal with interruptions of execution, caused by errors in the program, user requests, hardware failure, etc. The mechanisms to do this are called, variously, interrupts, exceptions and traps. We shall use the term trap. When a trap occurs, the system saves its current state on the stack and then executes the interrupt handler.

  14. Computer architecture Controlled Invocation – interrupts Diagram: a TRAP #n instruction indexes into the interrupt vector table held in memory; the selected interrupt vector points to the corresponding interrupt handler.
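The dispatch through the interrupt vector table can be simulated in plain C as below; the table size, handler names and the trap() function are hypothetical, and the state save/restore is only indicated in comments.

```c
#include <stdio.h>

/* Simulated interrupt dispatch: TRAP #n indexes an interrupt vector
 * table held in memory; each vector is the address of a handler. */

#define NUM_VECTORS 8

typedef void (*interrupt_handler_t)(void);

static void default_handler(void) { printf("unhandled trap\n"); }
static void syscall_handler(void) { printf("system call handler running\n"); }

static interrupt_handler_t interrupt_vector_table[NUM_VECTORS] = {
    default_handler, default_handler, default_handler, default_handler,
    default_handler, default_handler, default_handler, default_handler,
};

static void trap(unsigned n) {               /* TRAP #n */
    if (n < NUM_VECTORS) {
        /* ...save current state on the stack, raise supervisor bit... */
        interrupt_vector_table[n]();         /* jump through the vector */
        /* ...restore state, clear supervisor bit, return to user... */
    }
}

int main(void) {
    interrupt_vector_table[3] = syscall_handler;  /* install a handler */
    trap(3);                                      /* simulate TRAP #3 */
    return 0;
}
```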

  15. Computer architecture Controlled Invocation – interrupts The interrupt handler has to make sure that the system is restored to a proper state, e.g., by clearing the supervisor status bit before returning control to the user program. It is possible for a further interrupt to arrive while the processor deals with a current interrupt.

  16. Computer architecture Controlled Invocation – interrupts The processor may then have to interrupt the current handler. This may allow a user to enter supervisory mode by interrupting the execution of an OS call.

  17. Computer architecture Reference Monitor Operating systems manage access to data and are usually not involved with the interpretation of data. They must protect their own integrity and prevent users from accidentally/intentionally accessing other users' data.

  18. Computer architecture Reference Monitor – integrity of OS This is achieved by separating user space from OS space. Logical separation of users protects against accidental/intentional interference by users. Separation can take place at two levels: • file management – logical memory objects • memory management – physical memory objects

  19. Reference Monitor * Memory structure • segmentation • paging Segmentation divides data into segments, i.e. logical units. Each segment has a name and items within a segment are addressed by an offset. The OS maintains a table of segment names with their true addresses. Segmentation is used for logical access control and is a good basis for enforcing security policies; however, segments have variable length, so memory management is harder.
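A sketch of segment-based translation under these assumptions; the segment names, base addresses and lengths are invented for illustration.

```c
#include <stdio.h>

/* The OS keeps a table mapping segments to their true base address and
 * length; an access is (segment, offset) and the offset is checked
 * against the segment length. */
struct segment {
    const char *name;
    unsigned    base;     /* true start address */
    unsigned    length;   /* segment length (variable) */
};

static struct segment segment_table[] = {
    { "code", 0x1000, 0x0400 },
    { "data", 0x2000, 0x0100 },
};

/* Translate (segment index, offset) to a true address, or -1 on a
 * bounds violation. */
static long translate(unsigned seg, unsigned offset) {
    if (seg >= sizeof segment_table / sizeof segment_table[0])
        return -1;
    if (offset >= segment_table[seg].length)
        return -1;                          /* access outside the segment */
    return (long)(segment_table[seg].base + offset);
}

int main(void) {
    printf("%ld\n", translate(1, 0x20));    /* valid: 0x2020 */
    printf("%ld\n", translate(1, 0x200));   /* out of bounds: -1 */
    return 0;
}
```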

  20. Reference Monitor * Memory structure – paging Paging divides memory into pages of equal size. Addresses consist of two parts, the page number and an offset (within the page). Paging allows for more efficient memory management, but is not a good basis for access control: • a page may contain objects that require different protection, • logical objects can be stored across the boundary of a page – this allows for a covert channel.
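For example, with an assumed page size of 4 KiB, splitting an address into page number and offset is a simple shift and mask; the page size and the sample address are illustrative.

```c
#include <stdio.h>

#define PAGE_SHIFT 12u                         /* 4 KiB pages (assumption) */
#define PAGE_MASK  ((1u << PAGE_SHIFT) - 1u)   /* 0xFFF */

int main(void) {
    unsigned addr   = 0x0001A7C4u;
    unsigned page   = addr >> PAGE_SHIFT;      /* page number: 0x1A */
    unsigned offset = addr & PAGE_MASK;        /* offset within page: 0x7C4 */
    printf("page 0x%x, offset 0x%x\n", page, offset);
    return 0;
}
```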

  21. Reference Monitor * Memory protection The OS must protect its own integrity and confine each process to a separate address space. This means that the OS must control the data objects in memory. This can be achieved: • by modifying the addresses it receives • Address sandboxing: the address consists of a segment identifier and an offset; the OS sets the segment identifier to the correct value. • by constructing effective addresses from relative addresses it receives • Relative addressing: the address is specified by an offset relative to a given base address. • by checking that the addresses it receives are within given bounds.
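A sketch of address sandboxing under an assumed address layout (the top 8 bits of a 32-bit address name the segment, the rest is the offset); the layout and segment ids are illustrative.

```c
#include <stdio.h>

/* The OS overwrites the segment bits of every address with the
 * process's own segment id, so even a forged address cannot leave the
 * process's segment. */
#define SEGMENT_SHIFT 24u
#define OFFSET_MASK   ((1u << SEGMENT_SHIFT) - 1u)

static unsigned sandbox(unsigned addr, unsigned segment_id) {
    return (segment_id << SEGMENT_SHIFT) | (addr & OFFSET_MASK);
}

int main(void) {
    unsigned my_segment = 0x05;
    unsigned forged     = 0x07FFF010u;           /* points into segment 0x07 */
    printf("0x%08x -> 0x%08x\n", forged, sandbox(forged, my_segment));
    return 0;
}
```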

  22. Kernel Primitives * These are based on the Multics operating system, which closely follows the BLP model. • subjects = processes • Each process has a descriptor segment that contains information about the process, including the objects the process has access to; each such object is described by a segment descriptor word. Diagram: a Multics segment descriptor word, containing the segment id, a pointer to the segment, and access flags (read: on, execute: off, write: on).
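One way to picture such a segment descriptor word is the C struct below; the field names and the access check are illustrative and do not reproduce the real Multics layout.

```c
#include <stdbool.h>
#include <stdio.h>

/* Illustrative segment descriptor word: segment id, pointer to the
 * segment, and per-segment access flags, roughly as in the diagram. */
struct segment_descriptor_word {
    unsigned segment_id;
    void    *pointer;
    bool     read, write, execute;
};

enum access_mode { READ, WRITE, EXECUTE };

static bool access_allowed(const struct segment_descriptor_word *sdw,
                           enum access_mode mode) {
    switch (mode) {
    case READ:    return sdw->read;
    case WRITE:   return sdw->write;
    case EXECUTE: return sdw->execute;
    }
    return false;
}

int main(void) {
    struct segment_descriptor_word sdw = {
        .segment_id = 7, .pointer = NULL,
        .read = true, .write = true, .execute = false   /* as in the diagram */
    };
    printf("execute allowed: %d\n", access_allowed(&sdw, EXECUTE));  /* 0 */
    return 0;
}
```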

  23. Kernel Primitives * • objects • These are memory segments, I/O devices, etc. • They are organized hierarchically in a directory tree. • Information about an object, such as its security level or its access control list (ACL), is kept in its parent directory. • To access an object, a process has to traverse the tree from the root to the target object. • If any node on the path is not accessible then the target object is not accessible – we require that the security level of an object dominates that of its directory.
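A sketch of this path check, with an invented node structure and a boolean standing in for the per-directory access decision (in a real system this would be an ACL or label comparison).

```c
#include <stdbool.h>
#include <stdio.h>

/* An object is reachable only if every directory on the path from the
 * root down to it is accessible to the process. */
struct node {
    const char  *name;
    bool         accessible;   /* stands in for an ACL / label check */
    struct node *parent;       /* NULL for the root */
};

static bool can_reach(const struct node *target) {
    for (const struct node *n = target; n != NULL; n = n->parent)
        if (!n->accessible)
            return false;      /* any inaccessible node blocks the path */
    return true;
}

int main(void) {
    struct node root = { "/",        true,  NULL  };
    struct node dir  = { "secret",   false, &root };
    struct node file = { "plan.txt", true,  &dir  };
    printf("reachable: %d\n", can_reach(&file));   /* 0: blocked at "secret" */
    return 0;
}
```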

  24. Kernel Primitives * Finally, a set of kernel primitives has to be specified. These are the state transitions in an abstract BLP-type model. We must then show that these transitions preserve the BLP security properties.
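As an illustration, a kernel primitive could guard reads and writes with BLP-style checks like the following; the total ordering of levels and the function names are simplifying assumptions (real BLP uses a lattice of labels with compartments).

```c
#include <stdbool.h>
#include <stdio.h>

/* A read is granted only if the subject's level dominates the
 * object's (no read up); a write only if the object's level dominates
 * the subject's (no write down). */
enum level { UNCLASSIFIED = 0, CONFIDENTIAL, SECRET, TOP_SECRET };

static bool may_read(enum level subject, enum level object)  {
    return subject >= object;      /* simple-security property */
}
static bool may_write(enum level subject, enum level object) {
    return object >= subject;      /* *-property */
}

int main(void) {
    printf("SECRET reads CONFIDENTIAL:  %d\n", may_read(SECRET, CONFIDENTIAL));  /* 1 */
    printf("SECRET writes CONFIDENTIAL: %d\n", may_write(SECRET, CONFIDENTIAL)); /* 0 */
    return 0;
}
```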

  25. Computer Security: Security Evaluation

  26. Security Evaluation • How do you get assurance that your computer systems are adequately secure? • You could trust your software providers. • You could check the software yourself, but you would have to be a real expert. • You could rely on an impartial security evaluation by an independent body. • Security evaluation schemes have evolved since the 1980s; currently the Common Criteria are used internationally. www.wiley.com/go/gollmann

  27. Objectives • Examine the fundamental problems any security evaluation process has to address. • Propose a framework for comparing evaluation criteria. • Overview of the major evaluation criteria. • Assess the merits of evaluated products and systems.

  28. Agenda • History • Framework for the comparison of criteria • Orange Book • ITSEC • Federal Criteria • Common Criteria • Quality Standards? • Summary

  29. Security Evaluation – History • TCSEC (Orange Book): criteria for the US defense sector; predefined evaluation classes linking functionality and assurance • ITSEC: European criteria separating functionality and assurance so that very specific targets of evaluation can be specified and commercial needs can be better addressed • TCSEC and ITSEC no longer in use; replaced by the Common Criteria (CC): http://www.commoncriteria.org/, http://niap.nist.gov/cc-scheme

  30. Framework for Security Evaluation • What is the target of the evaluation? • What is the purpose of an evaluation? • What is the method of the evaluation? • What is the organizational framework for the evaluation process? • What is the structure of the evaluation criteria? • What are the costs and benefits of evaluation?

  31. Target & Purpose • Target of evaluation • Product: “off-the-shelf” software component to be used in a variety of applications; has to meet generic security requirements • System: collection of products assembled to meet the specific requirements of a given application • Purpose of evaluation • Evaluation: assesses whether a product has the security properties claimed for it • Certification: assesses the suitability of a product (system) for a given application • Accreditation: the decision to use a certain system

  32. Method • Evaluations should not miss problems, and different evaluations of the same product should give the same result. • Product oriented: examine and test the product; better at finding problems. • Process oriented: check documentation & the product development process; cheaper and better for repeatable results. • Repeatability and reproducibility are often desired properties of an evaluation methodology. www.wiley.com/go/gollmann

  33. Organizational Framework • Public service: evaluation by government agency; can be slow, may be difficult to retain qualified staff. • Private service: evaluation facilities usually accredited by a certification agency. • How to make sure that customer pressure does not influence evaluation results? • Contractual relationship between evaluation sponsor, product manufacturer, evaluation facility? • Interpretation drift (criteria creep): meaning of criteria may change over time and differ between evaluators.

  34. Structure • Structure of evaluation criteria: • Functionality: security features • Effectiveness: are the mechanisms used appropriate? • Assurance: thoroughness of analysis • Orange Book: evaluation classes for a given set of typical DoD requirements; considers all three aspects simultaneously. • ITSEC: flexible evaluation framework that can deal with new security requirements; the three aspects are addressed independently.

  35. Costs and Benefits • Direct costs: fees paid for evaluation. • Indirect costs: employee time, training evaluators in the use of specific analysis tools, impact on development process. • When evaluating a product, the cost of evaluation may be spread over a large number of customers. • Benefits: evaluation may be required, e.g. for government contracts; marketing argument; better security?

  36. Orange Book • Developed for the national security sector, but intended to be more generally applicable; provides • a yardstick for users to assess the degree of trust that can be placed in a computer security system, • guidance for manufacturers of computer security systems, • a basis for specifying security requirements when acquiring a computer security system. • Security evaluation of the Trusted Computing Base (TCB); assumes that there is a reference monitor. • Developed for systems enforcing multi-level security. • High assurance linked to formal methods, simple TCBs, and structured design methodologies; complex systems tend to fall into the lower evaluation classes.

  37. Evaluation Classes • Designed to address typical security requirements; combine security feature and assurance requirements: • Security Policy: mandatory and discretionary access control; • Marking of objects: labels specify the sensitivity of objects; • Identification of subjects: authentication of individual subjects; • Accountability: audit logs of security relevant events; • Assurance: operational assurance refers to security architecture, life cycle assurance refers to design methodology, testing, and configuration management; • Documentation: users require guidance on installation and use; evaluators need test and design documentation; • Continuous Protection: security mechanisms cannot be tampered with.

  38. Security Classes • Four security divisions: • D – Minimal Protection • C – Discretionary Protection (‘need to know’) • B – Mandatory Protection (based on labels) • A – Verified Protection • Security classes defined incrementally; all requirements of one class automatically included in the requirements of all higher classes. • Class D for products submitted for evaluation that did not meet the requirements of any Orange Book class. • Products in higher classes provide more security mechanisms and higher assurance through more rigorous analysis.

  39. C1: Discretionary Security Protection • Intended for environments where cooperating users process data at the same level of integrity. • Discretionary access control based on individual users and/or groups. • Users have to be authenticated. • Operational assurance: TCB has its own execution domain; features for periodically validating the correct operation of the TCB. • Life-cycle assurance: testing for obvious flaws. • Documentation: User’s Guide, Trusted Facility Manual (for system administrator), test and design documentation.

  40. C2: Controlled Access Protection • Users individually accountable for their actions. • DAC at the granularity of single users. • Propagation of access rights has to be controlled and object reuse has to be addressed. • Audit trails of the security relevant events that are specified in the definition of C2. • Testing and documentation: covers the newly added security features; testing for obvious flaws only. • C2 was regarded to be the most reasonable class for commercial applications. • C2-evaluated versions of most major operating systems or database management systems.

  41. B1: Labelled Security Protection • Division B for products that handle classified data and enforce mandatory MLS policies (based on security labels). • Class B1 for system high environments with compartments. • Issue: export of labelled objects to other systems or a printer; e.g. human-readable output has to be labelled. • Higher assurance: informal or formal model of the security policy. • Design documentation, source code, and object code have to be analysed; all flaws uncovered in testing must be removed. • No strong demands on the structure of the TCB. • B1 rating for System V/MLS (from AT&T), operating systems from Hewlett-Packard, DEC, and Unisys; database management systems: Trusted Oracle 7, INFORMIX-Online/Secure, Secure SQL Server.

  42. B2: Structured Protection • Class B2 increases assurance by adding design requirements. • MAC governs access to physical devices. • Users notified about changes to their security levels. • Trusted Path for login and initial authentication. • Formal model of the security policy and a Descriptive Top Level Specification (DTLS). • Modularization as an important architectural design feature. • TCB provides distinct address spaces to isolate processes. • Covert channel analysis required; events potentially creating a covert channel have to be audited. • Security testing establishes that the TCB is relatively resistant to penetration. • B2 rating for Trusted XENIX operating system.

  43. B3: Security Domains • B3 systems are highly resistant to penetration. • New requirements on security management: support for a security administrator; auditing mechanisms monitor the occurrence or accumulation of security relevant events and issue automatic warnings. • Trusted recovery after a system failure. • More system engineering effort to minimize the complexity of the TCB. • A convincing argument for the consistency between the formal model of the security policy and the informal Descriptive Top Level Specification. • B3 rating for versions of Wang's XTS-300 (and XTS-200) operating system.

  44. A1: Verified Design • Functionally equivalent to B3; achieves the highest assurance level through the use of formal methods. • Evaluation for class A1 requires: • a formal model of the security policy, • a Formal Top Level Specification (FTLS), • consistency proofs between model and FTLS (formal, where possible), • the TCB implementation (in)formally shown to be consistent with the FTLS, • formal covert channel analysis; the continued existence of covert channels has to be justified, and bandwidth may have to be limited. • More stringent configuration management and distribution control. • A1 rating for network components: MLS LAN (from Boeing) and the Gemini Trusted Network Processor; SCOMP operating system.

  45. Rainbow Series • The Orange Book is part of a collection of documents on security requirements, security management, and security evaluation published by the NSA and NCSC (US National Security Agency and National Computer Security Center). • The documents in this series are known by the colour of their covers as the Rainbow Series. • Concepts introduced in the Orange Book were adapted to the specific aspects of computer networks (Trusted Network Interpretation, Red Book), database management systems (Trusted Database Management System Interpretation, Lavender/Purple Book), etc.

  46. Information Technology Security Evaluation Criteria • ITSEC: harmonization of Dutch, English, French, and German national security evaluation criteria; endorsed by the Council of the European Union in 1995. • Builds on lessons learned from using the Orange Book; intended as a framework for security evaluation that can deal with new security requirements. • Breaks the link between functionality and assurance. • Apply to security products and to security systems. • The sponsor of the evaluation determines the operational requirements and threats. www.wiley.com/go/gollmann

  47. ITSEC • The security objectives for the Target of Evaluation (TOE) further depend on laws and regulations; they establish the required security functionality and evaluation level. • The security target specifies all aspects of the TOE that are relevant for evaluation: security functionality of the TOE, envisaged threats, objectives, and details of security mechanisms to be used. • The security functions of a TOE may be specified individually or by reference to a predefined functionality class. • Seven evaluation levels E0 to E6 express the level of confidence in the correctness of the implementation of security functions.

  48. US Federal Criteria • Evaluation of products, linkage between function and assurance in the definition of evaluation classes. • Protection profiles to overcome the rigid structure of the Orange Book; five sections of a protection profile: • Descriptive Elements: ‘name’ of protection profile, description of the problem to be solved. • Rationale: justification of the protection profile, including threat, environment, and usage assumptions, some guidance on the security policies that can be supported. • Functional Requirements: protection boundary that must be provided by the product. • Development Assurance Requirements. • Evaluation Assurance Requirements: type and intensity of the evaluation.

  49. Common Criteria • Criteria for the security evaluation of products or systems, called the Target of Evaluation (TOE). • Protection Profile (PP): a (re-usable) set of security requirements, including an EAL; should be developed by user communities to capture typical protection requirements. • Security Target (ST): expresses security requirements for a specific TOE, e.g. by reference to a PP; basis for any evaluation. • Evaluation Assurance Level (EAL): define what has to be done in an evaluation; there are seven hierarchically ordered EALs.
