
Trusted Operating Systems


Presentation Transcript


  1. Trusted Operating Systems
  CSCI283/172, Fall 2006, GWU
  Draws extensively from Memon’s notes (Brooklyn Poly) and the Pfleeger text, Chapter 5

  2. Need
  • Policy: description of requirements
  • Model: representation of the policy; used to check whether the policy can be enforced
  • Design: implementation of the policy
  • Trust: based on features and assurance

  3. Design Principles for Secure Systems
  • Two basic themes:
    • Simplicity – KISS
      • Makes design and interactions easy
      • Easy to prove its safety
    • Restriction
      • Minimize the power of entities
      • Compartmentalization
  • Common sense!

  4. Principles of design
  • Principle of least privilege
  • Principle of fail-safe defaults
  • Principle of economy of mechanism
  • Principle of complete mediation
  • Principle of open design
  • Principle of separation of privilege
  • Principle of least common mechanism
  • Principle of psychological acceptability

  5. Principle of least privilege
  An entity should be given only those privileges needed to finish its task.
  • Function/role should control the rights
  • Temporary elevation of privilege should be relinquished immediately
  • Granularity of privileges
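The point about relinquishing temporary elevation can be made concrete. Below is a minimal sketch, assuming a POSIX system and the conventional "nobody" uid/gid of 65534 (both assumptions for illustration; a real service would use a dedicated account):

```python
import os

def drop_privileges(uid=65534, gid=65534):
    """Relinquish elevated privileges once the privileged step is done.

    uid/gid 65534 are the conventional 'nobody'/'nogroup' IDs on many Linux
    systems -- an assumption for illustration only.
    """
    if os.getuid() != 0:
        return                 # already unprivileged; nothing to drop
    os.setgroups([])           # clear supplementary groups first
    os.setgid(gid)             # drop the group before the user,
    os.setuid(uid)             # otherwise setgid would need root and fail

# Typical use: do the one privileged step (e.g. bind port 80), then drop:
#   sock.bind(("", 80)); drop_privileges(); serve(sock)
```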

  6. Principle of fail-safe defaults
  Unless a subject is given explicit access to an object, it should be denied access to that object.
  • Default access to an object is none
  • If a subject is unable to complete its task before it terminates, it should undo any changes made to the state of the system
  • Restrict privileges at the time of creation
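A small sketch of the default-deny idea; the subjects, objects, and rights in the ACL are invented for illustration:

```python
# Access rights are looked up in an explicit allow list; anything not listed
# is denied.
ACL = {
    ("alice", "payroll.db"): {"read"},
    ("bob",   "payroll.db"): {"read", "write"},
}

def is_allowed(subject, obj, right):
    # Fail-safe default: a missing entry means "no access" -- the deny path
    # needs no special-case code that could be gotten wrong.
    return right in ACL.get((subject, obj), set())

assert is_allowed("bob", "payroll.db", "write")
assert not is_allowed("carol", "payroll.db", "read")   # never listed => denied
```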

  7. Principle of economy of mechanism
  Security mechanisms should be as simple as possible.
  • Fewer errors
  • Testing and verification are easier
  • Fewer assumptions

  8. Principle of complete mediation
  All accesses to objects should be checked to ensure they are allowed. Illegitimate access attempts should be expected and protected against.
  • Security vs. performance issues

  9. Principle of open design
  The security of a mechanism should not depend upon secrecy of its design or implementation.
  • Secrecy != security
  • Complexity != security
  • “Security through obscurity”
  • Cryptography and openness
    • AES

  10. Principle of separation of privilege
  A system should not grant permission based on a single condition.
  • Example: company checks over $75,000 must be signed by two officers
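A sketch of the two-officer rule as a permission check; the officer names are made up, and the $75,000 threshold is the slide's example:

```python
def check_may_issue(amount, approvals, officers):
    """Separation of privilege: a large check requires two distinct conditions
    (here, signatures from two different officers) before it is issued."""
    signatures = approvals & officers          # only officers' signatures count
    required = 2 if amount > 75_000 else 1
    return len(signatures) >= required

officers = {"cfo", "controller", "treasurer"}
print(check_may_issue(100_000, {"cfo"}, officers))                # False: one signature
print(check_may_issue(100_000, {"cfo", "controller"}, officers))  # True: two signatures
```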

  11. Principle of least common mechanism
  Mechanisms used to access resources should not be shared.
  • Why?

  12. Principle of psychological acceptability
  • Security mechanisms should not make the resource difficult to access
  • Recognizes the most important element in computer security: the human

  13. Design Principles for Privacy
  Fair information practices – US Privacy Act of 1974
  • Openness and transparency: No secret record keeping. This covers publication of both the existence of records and their contents.
  • Individual participation: The subject of a record should be able to see and correct the record.
  • Collection limitation: Data collection should be proportional to the purpose of the collection.

  14. Design Principles for Privacy – 2
  • Data quality: Data should be relevant to the purposes for which they are collected and should be kept up to date.
  • Use limitation: Data should only be used for their specific purpose by authorized personnel.
  • Reasonable security: Adequate security safeguards should be put in place, according to the sensitivity of the data collected.
  • Accountability: Record keepers must be accountable for compliance with the other principles.

  15. Need
  • Policy: description of requirements
  • Model: representation of the policy; used to check whether the policy can be enforced
  • Design: implementation of the policy
  • Trust: based on features and assurance

  16. Assurance Methods
  • Testing
    • Problems:
      • Cannot check for all possible problems
      • Difficult to characterize exactly what is going on
      • Testing involves modification, which might change the system: observation changes the observed
      • Budget and time limitations
  • Ethical hacking / penetration testing
  • Formal verification

  17. Formal Verification
  minimum(A, n)
    min := A[1];
    for i = 1:n
      if A[i] < min
        min := A[i];
  Assertion P (initial conditions): n > 0
  Assertion Q (true for all loops): n > 0; 1 ≤ i ≤ n; min ≤ A[1]
  Assertion R (for a particular loop i): n > 0; 1 ≤ i ≤ n; ∀j, 1 ≤ j ≤ i-1, min ≤ A[j]
  Assertion S (on loop exit): n > 0; i = n+1; ∀j, 1 ≤ j ≤ n, min ≤ A[j]
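The same algorithm can be written as runnable code with the assertions checked at run time (a sketch only; this tests the assertions on a given input rather than proving them):

```python
def minimum(A, n):
    """Minimum of A[0..n-1], with the slide's assertions checked at run time.

    The slide's pseudocode is 1-indexed; Python lists are 0-indexed, so the
    bounds in the assertions are shifted by one.
    """
    assert n > 0                                   # Assertion P: initial condition
    m = A[0]
    for i in range(n):
        # Assertion R (loop invariant): m <= every element inspected so far
        assert all(m <= A[j] for j in range(i))
        if A[i] < m:
            m = A[i]
    # Assertion S (on loop exit): m <= every element of A[0..n-1]
    assert all(m <= A[j] for j in range(n))
    return m

print(minimum([7, 3, 9, 3], 4))   # prints 3
```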

  18. Trusted OS

  19. Typical OS Flaws
  • I/O
    • Talks to independent, intelligent systems/devices
    • Code is complex and dependent on the h/w interface
    • Might bypass OS functions, and so might end up bypassing security
    • Some OSs eliminate the security checks associated with transferring a single character
  • Not enough isolation (shared libraries etc.)
  • Incomplete mediation (e.g., access policy checked only at every I/O operation or process execution etc., not continuously)
  • OS hooks for external s/w provide access powers identical to those of the OS

  20. I/O Exploits
  • Time-of-check to time-of-use mismatch: access permission is checked for a user to access a particular object X. Between the time the access is approved and the time the access occurs, the user changes the designation of the object, so she now accesses an unapproved object.
  • I/O source/destination addresses (which often reside in user memory) can be changed after checking, while the I/O is in progress
  • Common system buffers can contain data accessible to multiple users
  • Better to use simple security mechanisms with clearly defined access control policies
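A sketch of the time-of-check/time-of-use gap and a common mitigation, assuming a POSIX system (the helper names are invented; this is an illustration, not hardened code):

```python
import os, stat

# Vulnerable pattern: the check and the use are separate system calls on a
# *path*, so what the path names can be swapped in between (e.g. for a link
# to a file the user should not be able to read).
def read_if_regular_racy(path):
    if stat.S_ISREG(os.stat(path).st_mode):        # time of check
        with open(path, "rb") as f:                # time of use: path may now
            return f.read()                        # refer to a different object

# Safer pattern: open once, then check the object that was actually opened.
def read_if_regular(path):
    fd = os.open(path, os.O_RDONLY | os.O_NOFOLLOW)
    try:
        if not stat.S_ISREG(os.fstat(fd).st_mode): # fstat inspects the open fd,
            raise PermissionError("not a regular file")
        return os.read(fd, os.fstat(fd).st_size)
    finally:
        os.close(fd)
```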

  21. Trusted OS Features
  • ID and Authentication
  • Mandatory and Discretionary Access Control
  • Object Reuse Protection
  • Complete Mediation
  • Trusted Path (makes a Trojan horse intercepting data / a man-in-the-middle attack more difficult)
  • Accountability and Audit Log
  • Audit Log Reduction – issues: too much, too little, how to make sense of it?
  • Intrusion Detection – statistical analysis of logs

  22. Security Kernel
  • Enforces the security mechanisms of the entire OS; provides security interfaces among the h/w, the OS, and other parts
  • Covers all object access
  • Can isolate security mechanisms
  • Compactness – sometimes
  • Modularity: easier to change, test, etc.
  • Can be verified, analyzed through formal methods, etc.

  23. Parts of the security kernel
  • Reference Monitor: controls access. Needs to be small and tamperproof. Part of the TCB.
  • Trusted Computing Base (TCB): everything necessary to enforce the security policy.
  • TCB constituted of:
    • h/w
    • processes and interprocess communication
    • primitive files
    • protected memory
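A toy reference monitor that forces every access through one small checkpoint; the policy entries are invented, and the isolation/tamperproofing a real monitor needs is out of scope here:

```python
class ReferenceMonitor:
    """Toy reference monitor: every object access goes through one small check.

    A real reference monitor must also be tamperproof and isolated from the
    code whose accesses it mediates.
    """
    def __init__(self, policy):
        self._policy = policy                      # {(subject, object): {rights}}

    def access(self, subject, obj, right, operation):
        if right not in self._policy.get((subject, obj), set()):
            raise PermissionError(f"{subject} may not {right} {obj}")
        return operation()                         # perform the access only if allowed

rm = ReferenceMonitor({("alice", "audit.log"): {"read"}})
print(rm.access("alice", "audit.log", "read", lambda: "log contents"))  # allowed
# rm.access("alice", "audit.log", "write", lambda: None)  # raises PermissionError
```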

  24. Parts of TCB
  • TCB constituted of:
    • h/w
    • processes and interprocess communication
    • primitive files
    • protected memory
  • Not in the TCB: user apps, user environment, directories, procedures, memory management

  25. TCB function
  • Process Activation
    • Requires a complete change of registers, file access lists, process status info
  • Execution domain switch when processes in one domain invoke processes in other domains
  • Memory Protection
  • I/O

  26. Market Need
  • Problem: current systems are insecure because of a vast code base, many vulnerabilities, and lay users
  • Examples: financial systems vulnerable to attacks; users do not know who they are talking to
  • Need: a secure, interoperable, open system
  • Examples of secure closed systems: games, set-top boxes, smart cards, cell phones. The PC will never be replaced by these?

  27. Commercial Need
  • Open architecture
    • Allows addition of arbitrary s/w and h/w without requiring a central authority
  • Needs to operate in a legacy environment
  • Low cost for modifications

  28. NGSCB: Next Generation Secure Computing Base
  • Can be used in both trusted and untrusted form
  • Isolation among OSs and processes, particularly from I/O, using virtual machine monitors. Normal/Trusted OS – each protected from surveillance by the other.
  • h/w and s/w security primitives; no tamperproof h/w, because attacks are mostly s/w attacks (and s/w attacks are those that are Break Once Run Everywhere (BORE))
  • Authenticated operation: all entities state their identity and prove it using cryptographic techniques; a program’s executable code hash is its ID. If anything changes, the hash changes.
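A sketch of using a hash of the executable as a program's identity; SHA-256 is an assumption here, since the slide does not name a specific hash function:

```python
import hashlib

def code_identity(path):
    """Hash a program's executable file; the digest serves as the program's ID.

    Any change to the code changes the digest, so a verifier comparing it
    against a known-good value detects modification.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# e.g. code_identity("/usr/bin/python3") -> a 64-hex-character program ID
```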

  29. Sealed storage/attestation
  • Sealed Storage:
    • Long-lived secrets are “sealed” by their owner, with a list of those allowed to “unseal” them. The owner’s ID is associated with the secret.
    • Example of sealing:
    • Why is the owner’s ID associated with the secret?
  • Attestation
    • Use the public key of the generating platform to authenticate code
    • Equivalent to the code bearing a certificate issued by the platform
    • The platform itself bears a certificate provided by a CA
    • Can use other, anonymous methods
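A toy model of the sealing idea, with placeholder code-identity strings and an in-memory store; real sealed storage encrypts the secret under a hardware-protected key:

```python
# Toy model of sealed storage: a secret is stored together with the identity
# of the sealer and the code identities allowed to unseal it.
_sealed = {}

def seal(name, secret, owner_id, allowed_ids):
    _sealed[name] = {"secret": secret, "owner": owner_id,
                     "allowed": set(allowed_ids) | {owner_id}}

def unseal(name, caller_id):
    entry = _sealed[name]
    if caller_id not in entry["allowed"]:
        raise PermissionError("this code identity may not unseal the secret")
    # Returning the owner's ID lets the caller check *who* sealed the secret,
    # not just obtain its value.
    return entry["secret"], entry["owner"]

seal("db-password", "s3cret", owner_id="hash-of-app-v1", allowed_ids=["hash-of-app-v2"])
print(unseal("db-password", "hash-of-app-v2"))   # allowed: v2 is on the owner's list
```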

  30. Authenticated Boot
  • The OS also has to identify itself in this manner
  • h/w must authenticate the boot kernel
  • Cryptographic keys are stored securely
    • In a cryptographic co-processor, which performs authentication operations in h/w for the OS
    • The OS performs similar operations for other applications
  • Crypto co-processor: also boots the virtual machine monitor; has a PRNG; TCPA’s TPM is an example
  • (What is TCPA?)
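A sketch of chained boot measurements in the spirit of a TPM PCR extend; the hash algorithm and register width are simplified assumptions:

```python
import hashlib

def extend(register, measurement):
    """One 'extend' step of a boot-time measurement chain:
    new register value = hash(old value || new measurement)."""
    return hashlib.sha256(register + measurement).digest()

# Each boot stage measures (hashes) the next stage before handing over control.
register = b"\x00" * 32
for stage in [b"firmware image", b"boot loader", b"os kernel"]:
    register = extend(register, hashlib.sha256(stage).digest())

# The final value summarizes the whole chain; changing any stage changes it,
# so a verifier can decide whether to trust the booted software.
print(register.hex())
```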

  31. Upgrades and Openness
  • Upgrades
    • What happens when the OS is upgraded?
    • A single sealed secret needs to allow access by more than one kernel; the new kernel can then reseal.
  • Openness
    • Manferdelli has announced that the kernel code will be available for review
    • Will his management continue to support this?

  32. Provides security and authenticity of data
  • Provides privacy only insofar as privacy is the security of personal information
  • Protects against malware because?
  • Applications:
    • Secure shopping, banking, taxes
    • Rights management of enterprise data (email rules, document rules)
    • Entertainment media distribution: too widespread (a single media asset in too many places at a given time) to benefit from this, but technically it can be done

  33. Trusted by whom?
  • Trusted by the authenticator, but …
    • The public key provides a tracking means
  • Suggested fixes:
    • Pseudonyms issued by a trusted third party (trusted parties might collude; there are usually few of them)
    • Secret sharing
    • Anonymous credentials (e-cash-like)
  • The privacy community? Conflicts?
  • What else is required for a Trusted OS?
