
Software and Security






Presentation Transcript


  1. Software and Security • Butler Lampson, Microsoft

  2. Real-World Security • It’s about value, locks, and punishment. • Locks good enough that bad guys don’t break in very often. • Police and courts good enough that the bad guys who do break in get caught and punished often enough. • Less interference with daily life than the value of the loss it prevents. Security is expensive—buy only what you need. • People do behave this way. • We don’t tell them this—a big mistake. • Perfect security is the worst enemy of real security.

  3. Dangers and Vulnerabilities • Dangers • Vandalism or sabotage that damages information (integrity) or disrupts service (availability) • Theft of money (integrity) • Theft of information (secrecy) • Loss of privacy (secrecy) • Vulnerabilities • Bad (buggy or hostile) programs • Bad (careless or hostile) people giving instructions to good programs

  4. Defensive strategies • Control the bad guys • Coarse: Isolate—keep everybody out • Medium: Exclude—keep the bad guys out • Fine: Restrict—keep them from doing damage • Recover—undo the damage • Catch the bad guys and punish them • Auditing, police

  5. The Access Control Model • Guards control access to valued resources. • [Diagram: a Principal (source) sends a request (“do operation”) to a Guard (reference monitor), which decides whether it reaches the Object (resource).]
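The guard model on this slide can be sketched in a few lines of Python. This is an illustrative sketch, not Lampson's implementation; the `Guard` class, the policy table, and the names in it are all made up for the example.

```python
# Sketch of the access-control model: a guard (reference monitor)
# checks each request from a principal against a policy before the
# operation reaches the resource. All names here are illustrative.

class Guard:
    def __init__(self, policy):
        # policy maps (principal, operation, resource) -> allowed?
        self.policy = policy

    def request(self, principal, operation, resource):
        """Perform the operation only if the policy allows it."""
        if self.policy.get((principal, operation, resource), False):
            return f"{operation} on {resource} performed for {principal}"
        raise PermissionError(f"{principal} may not {operation} {resource}")

guard = Guard({("Alice", "read", "Spectra"): True})
```

The point of the model is that every request passes through the guard; the policy itself can be as simple as this table or as elaborate as the speaks-for chains on the later slides.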

  6. Mechanisms—The Gold Standard • Authenticating principals • Mainly people, but also channels, servers, programs (encryption implements channels, so a key is a principal) • Authorizing access • Usually for groups—principals that have some property, such as “type-safe” or “safe for scripting” • Auditing • Assurance • Trusted computing base

  7. Assurance: Making Security Work • Trusted computing base • Limit what has to work to ensure security • Ideally, TCB is small and simple • Includes hardware and software • Also includes configuration, usually overlooked • What software has privileges • Database of users, passwords, privileges, groups • . . . • The unavoidable price of reliability is simplicity.—Hoare

  8. Why We Don’t Have “Real” Security • A. People don’t buy it: • Danger is small, so it’s OK to buy features instead. • Security is expensive. • Configuring security is a lot of work. • Secure systems do less because they’re older. • Security is a pain. • It stops you from doing things. • Users have to authenticate themselves. • B. Systems are complicated, so they have bugs. • Especially the configuration

  9. End-to-End Security • Be explicit about trust • Audit all security decisions • Take account of channels, machines, and software • Delegate authority (to groups or systems) • Work uniformly between organizations • Microsoft can securely accept Intel’s authentication • Groups can cross organization boundaries

  10. End-to-End example • Alice is at Intel, working on Atom, a joint Intel-Microsoft project • Alice connects to Spectra, Atom’s web page, with SSL • Chain of responsibility: KSSL ⇒ Ktemp ⇒ KAlice ⇒ Alice@Intel ⇒ Atom@Microsoft ⇒ (r/w) Spectra • [Diagram: Alice’s smart card holds KAlice, her login system holds Ktemp, and the SSL channel is KSSL; Intel says KAlice ⇒ Alice@Intel, Microsoft says Alice@Intel ⇒ Atom@Microsoft, and Spectra’s ACL says Atom@Microsoft may r/w the Spectra web page.]

  11. Principals • Authentication: Who sent a message? • Authorization: Who is trusted? • Principal — abstraction of “who”: • People Alice, Bob • Services microsoft.com, Exchange • Groups UW-CS, MS-Employees • Secure channels key #678532E89A7692F, console • Principals say things: • “Read file foo” • “Alice’s key is #678532E89A7692F”

  12. Speaks For • Principal A speaks for B about T: A ⇒T B • Meaning: if A says something in set T, B says it too. • Thus A is stronger than B, or responsible for B, about T • Examples • Alice ⇒ Atom (a group of people) • Key #7438 ⇒ Alice (a key for Alice) • We trust A to delegate its own authority. • Delegation rule: if A says “B ⇒ A” then B ⇒ A • Why should A delegate to B? Analyze case by case. • Next: four examples of “speaks for”.
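Because speaks-for is transitive, checking whether one principal speaks for another reduces to reachability over the recorded delegations. A minimal sketch, with a hypothetical function name and a delegation list taken from the slide's examples:

```python
# Sketch: delegations are (stronger, weaker) pairs meaning
# "stronger ⇒ weaker"; speaks-for is the reflexive-transitive
# closure of these edges. Principal names are illustrative.

def speaks_for(a, b, delegations):
    """True if principal a speaks for principal b."""
    if a == b:
        return True
    frontier, seen = {a}, set()
    while frontier:
        p = frontier.pop()
        seen.add(p)
        for stronger, weaker in delegations:
            if stronger == p and weaker not in seen:
                if weaker == b:
                    return True
                frontier.add(weaker)
    return False

# Key #7438 ⇒ Alice, and Alice ⇒ Atom (group membership):
delegations = [("K7438", "Alice"), ("Alice", "Atom")]
```

Note the relation is deliberately one-way: the key speaks for Alice and Alice for the group, never the reverse.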

  13. Authenticating Channels • Chain of responsibility: KSSL ⇒ Ktemp ⇒ KAlice ⇒ Alice@Intel ⇒ … • Ktemp says KSSL ⇒ Ktemp (SSL setup) • KAlice says Ktemp ⇒ KAlice (via smart card)

  14. Authenticating Names: SDSI/SPKI • A name is in a name space, defined by a principal P • P is like a directory. The root principals are keys. • P speaks for any name in its name space: KIntel ⇒ KIntel / Alice (which is just Alice@Intel) • KIntel says KAlice ⇒ Alice@Intel • Chain so far: … Ktemp ⇒ KAlice ⇒ Alice@Intel …

  15. Authenticating Groups • A group is a principal; its members speak for it • Alice@Intel ⇒ Atom@Microsoft • Bob@Microsoft ⇒ Atom@Microsoft • … • Evidence for groups: just like names and keys. • Chain so far: … KAlice ⇒ Alice@Intel ⇒ Atom@Microsoft ⇒ (r/w) …

  16. Authorization with ACLs • View a resource object O as a principal • An ACL entry for P means P can speak for O • Permissions limit the set of things P can say for O • If Spectra’s ACL says Atom can r/w, that means Spectra says Atom@Microsoft ⇒ (r/w) Spectra • Chain so far: … Alice@Intel ⇒ Atom@Microsoft ⇒ (r/w) Spectra
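In speaks-for terms, an ACL entry grants a principal the right to speak for the object about a limited set of operations. A small sketch of that check, with an illustrative ACL table (the names follow the slides' example, the code shape is an assumption):

```python
# Sketch: an ACL entry for principal P on object O with permission
# set T means P ⇒T O, i.e. P may speak for O about operations in T.

acl = {
    "Spectra": {"Atom@Microsoft": {"read", "write"}},
}

def check(principal, operation, obj):
    """Does the ACL let principal perform operation on obj?"""
    perms = acl.get(obj, {}).get(principal, set())
    return operation in perms
```

In a full system the `principal` argument would be the end of a verified speaks-for chain, not a bare name.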

  17. End-to-End Example: Summary • Request on SSL channel: KSSL says “read Spectra” • Chain of responsibility: KSSL ⇒ Ktemp ⇒ KAlice ⇒ Alice@Intel ⇒ Atom@Microsoft ⇒ (r/w) Spectra

  18. Authenticating Systems: Loading • A digest X can authenticate a program SQL: • KMicrosoft says “If image I has digest X then I is SQL”; formally X ⇒ KMicrosoft / SQL • This is just like KAlice ⇒ Alice@Intel • But a program isn’t a principal: it can’t say things • To become a principal, a program must be loaded into a host H • Booting is a special case of loading • X ⇒ SQL makes H willing to run I if H likes SQL • It also makes H assert that the running I is SQL
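Identifying an image by its digest comes down to hashing the image and comparing against a certified value. A sketch with Python's `hashlib`; the image bytes, digest table, and function name are made up for illustration:

```python
import hashlib

# Sketch: "KMicrosoft says: if image I has digest X then I is SQL"
# reduces, at load time, to comparing the image's hash against the
# certified digest X. The image bytes here are illustrative.

certified = {"SQL": hashlib.sha256(b"sql-image-bytes").hexdigest()}

def identify(image_bytes):
    """Return the certified program name for this image, if any."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    for name, x in certified.items():
        if digest == x:
            return name
    return None
```

A real host would also check the signature on the certificate itself, i.e. that the digest table really was issued by KMicrosoft.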

  19. Authenticating Systems: Quoting • A loaded program depends on the host it runs on. • We write H | SQL for SQL running on H • H | SQL says s = H says SQL says s • H can’t prove that it’s running SQL • But H can be trusted to run SQL: • KMicrosoft says H | SQL ⇒ KMicrosoft / SQL • This lets H convince others that it’s running SQL
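The quoting operator is purely a rewriting rule: "H | P says s" unfolds to "H says (P says s)". A toy sketch that models statements as nested tuples (the representation and function names are assumptions, not from the talk):

```python
# Sketch of quoting: the compound principal H | P, and the rule
# H | P says s  =  H says (P says s).

def quote(host, program):
    """The compound principal host | program."""
    return (host, "|", program)

def says(principal, statement):
    return (principal, "says", statement)

def unfold(stmt):
    """Rewrite (H | P) says s into H says (P says s), recursively."""
    p, _, s = stmt
    if isinstance(p, tuple) and p[1] == "|":
        host, _, prog = p
        return says(host, unfold(says(prog, s)))
    return stmt

stmt = says(quote("H", "SQL"), "read f")
```

Unfolding makes explicit that the host is accountable for everything its loaded program says.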

  20. Certifying Properties • Need a trusted authority: CA ⇒ “type-safe” • Actually KMS says CA ⇒ KMS / “type-safe” • Usually done manually • Can also be done by a program P • A compiler • A class loader • A more general proof checker • Logic is the same: P ⇒ “type-safe” • Someone must authorize the program: • KMS says P ⇒ KMS / “type-safe”

  21. Compound Principals • A ∧ B says s = (A says s) ∧ (B says s) • H | P says s = H says P says s • A ∨ B says s = (A says s) ∨ (B says s) • Conjunction is useful for weakening a principal: • A ∧ B says “read f” needs both A ⇒r f and B ⇒r f • Example: Java rule—callee ⇒ caller ∧ callee-code • Example: NT restricted tokens—if process P is running untrusted-code for blampson then P ⇒ blampson ∧ untrusted-code
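The two definitions above evaluate directly once we model "what each principal has said" as a set of statements. A sketch under that assumption (the table and function names are illustrative):

```python
# Sketch of compound principals: A ∧ B says s iff both say s;
# A ∨ B says s iff either does. "said" records each principal's
# statements; all names are illustrative.

said = {
    "A": {"read f"},
    "B": {"read f", "write f"},
}

def conj_says(a, b, s):
    """A ∧ B says s."""
    return s in said.get(a, set()) and s in said.get(b, set())

def disj_says(a, b, s):
    """A ∨ B says s."""
    return s in said.get(a, set()) or s in said.get(b, set())
```

This is why conjunction weakens: `conj_says` grants only requests that both constituents back, which is exactly what a restricted token wants.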

  22. Auditing • Checking access: • Given a request KAlice says “read Spectra” and an ACL entry “Atom may r/w Spectra” • Check that KAlice speaks for Atom (KAlice ⇒ Atom) and that the rights suffice (r/w covers read) • Auditing: each step is justified by • A signed statement (certificate), or • A delegation rule
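An auditor replays the chain of responsibility and demands a justification for every link. A sketch in which plain tuples stand in for signed certificates; the step names follow the slides' running example, the code itself is an assumption:

```python
# Sketch of audit checking: each adjacent step in the chain of
# responsibility must be backed by a certificate (here a tuple
# standing in for a signed statement).

certificates = {
    ("KSSL", "Ktemp"),                  # SSL setup
    ("Ktemp", "KAlice"),                # via smart card
    ("KAlice", "Alice@Intel"),          # Intel's name certificate
    ("Alice@Intel", "Atom@Microsoft"),  # group membership
    ("Atom@Microsoft", "Spectra"),      # Spectra's ACL entry (r/w)
}

def audit_chain(chain):
    """Return the first unjustified step, or None if every step
    in the chain is covered by a certificate."""
    for step in zip(chain, chain[1:]):
        if step not in certificates:
            return step
    return None

chain = ["KSSL", "Ktemp", "KAlice", "Alice@Intel",
         "Atom@Microsoft", "Spectra"]
```

Because every certificate is a durable signed statement, the same replay works after the fact, which is what makes the decision auditable.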

  23. Assurance: NGSCB (Palladium) • A cheap, convenient, physically separate machine • A high-assurance OS stack (we hope) • A systematic notion of program identity • Identity = digest of (code image + parameters) • Can abstract this: KMS says digest ⇒ KMS / SQL • Host certifies the running program’s identity: H says K ⇒ H | P • Host grants the program access to sealed data • H seals (data, ACL) with its own secret key • H will unseal for P if P is on the ACL
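The seal/unseal behavior on this slide can be sketched with an HMAC standing in for the host's sealing primitive. This is a toy model, not NGSCB's actual mechanism; the key, program names, and data are all made up:

```python
import hashlib
import hmac

# Sketch of sealed storage: the host seals (data, ACL) under its
# own secret and unseals only for a program on the ACL. HMAC is an
# illustrative stand-in for the real sealing primitive.

HOST_SECRET = b"host-secret-key"  # hypothetical host key

def seal(data, acl):
    blob = repr((data, sorted(acl))).encode()
    tag = hmac.new(HOST_SECRET, blob, hashlib.sha256).hexdigest()
    return (data, sorted(acl), tag)

def unseal(sealed, program):
    """Release the data only if the blob is intact and the
    requesting program is on the sealed-in ACL."""
    data, acl, tag = sealed
    blob = repr((data, acl)).encode()
    expect = hmac.new(HOST_SECRET, blob, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("sealed blob was tampered with")
    if program not in acl:
        raise PermissionError(f"{program} is not on the ACL")
    return data

blob = seal("db-password", ["SQL"])
```

The key property is that the ACL is bound into the sealed blob, so neither the data nor the list of programs allowed to read it can be changed without the host's secret.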

  24. Learn more • Computer Security in the Real World • at research.microsoft.com/lampson • (slides, paper; earlier papers by Abadi, Lampson, Wobber, Burrows) • Ross Anderson – www.cl.cam.ac.uk/users/rja14 • Bruce Schneier – Secrets and Lies
