Proving Security Protocols Correct— Correctly
Jonathan Herzog, 21 March 2006

Presentation Transcript


  1. Proving Security Protocols Correct— Correctly Jonathan Herzog 21 March 2006 The author's affiliation with The MITRE Corporation is provided for identification purposes only, and is not intended to convey or imply MITRE's concurrence with, or support for, the positions, opinions or viewpoints expressed by the author.

  2. Introduction • This talk: soundness of symbolic proofs for security protocols • Think: Are proofs in an ‘ideal’ world meaningful in the real world? Even when national secrets are on the line? • Answer: mostly ‘yes,’ but sometimes ‘no’ • But first: what are security protocols? • Scenario: A and B want to create shared secret key • Must communicate over unsecured network

  3. Needham-Schroeder protocol (Prev: A and B obtain each other’s public encryption keys)
  A → B: EKB(A || Na)
  B → A: EKA(Na || Nb)
  A → B: EKB(Nb)
  A outputs (B, K); B outputs (A, K)

  4. Security goals • Authentication of A to B: “If B outputs (A, K), then A outputs (B, K’)” • Mutual authentication: both A to B and B to A • Key agreement: if A outputs (X, K) and B outputs (Y, K’), then K = K’ • Secrecy: surprisingly tricky to define • Intuition: the only people who can know K should be A and B. Does Needham-Schroeder achieve any of these?

  5. Needham-Schroeder: broken (Lowe, 1995) • A = Alice, B = Alice’s bank, M = on-line merchant • Alice buys goods from the merchant • The merchant masquerades as Alice to her bank:
  A → M: EKM(A || Na)
  M (as A) → B: EKB(A || Na)
  B → A: EKA(Na || Nb)
  A → M: EKM(Nb)
  M (as A) → B: EKB(Nb)
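Lowe’s man-in-the-middle relay can be replayed at the symbolic level. The following toy Python sketch (illustrative names and message encoding, not a real implementation) walks through the attack and checks that B ends up accepting a session it believes is with A:

```python
def enc(key, payload):
    """Symbolic encryption: just a tagged tuple."""
    return ("enc", key, payload)

def dec(key, ct):
    """Symbolic decryption: succeeds only with the matching key."""
    assert ct[0] == "enc" and ct[1] == key, "wrong key"
    return ct[2]

# Msg 1: A willingly initiates with merchant M
m1 = enc("KM", ("A", "Na"))
# M decrypts with its own key and re-encrypts for bank B, posing as A
m1_relayed = enc("KB", dec("KM", m1))
# Msg 2: B challenges whoever sent msg 1; note the reply names no one
a_name, na = dec("KB", m1_relayed)
m2 = enc("KA", (na, "Nb"))
# A decrypts m2 (it looks like a valid reply from M) and answers M...
na_back, nb = dec("KA", m2)
m3 = enc("KM", nb)
# ...and M relays A's answer to B
m3_relayed = enc("KB", dec("KM", m3))
# B now believes it shares Nb with A, though A never spoke to B
final = dec("KB", m3_relayed)
```

The attack hinges on message 2 containing no indication of who sent it, which is exactly what Lowe’s fix repairs.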

  6. Needham-Schroeder-Lowe protocol: ‘fix’ by Lowe (1995)
  A → B: EKB(A || Na)
  B → A: EKA(Na || Nb || B)
  A → B: EKB(Nb)
  • Added B’s name to the 2nd message • Is this secure? Is TLS? Kerberos? SSH? • More importantly: how to analyze?

  7. The symbolic model • Analysis framework for security protocols • Originally proposed by Dolev & Yao (1983) • General philosophy: be as high-level as possible • Three general intuitions: • Axiomatize the messages • Axiomatize the adversary • Security is unreachability

  8. Axiomatize the message space Messages are parse trees • Use symbols to represent atomic messages • Countable symbols for keys (K, K’, KA, KB, KA-1, KB-1, …) • Countable symbols for nonces (N, N’, Na, Nb, …) • Countable symbols for names (A, B, …) Just symbols: no a priori relationships or structure • Helper functions: keyof(A) = KA, inv(KA) = KA-1 • Encryption (EK(M)) and pairing (M || N) are constructors • Protocols described (mostly) by messages sent/received
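A minimal sketch of messages-as-parse-trees, using hypothetical Python class names:

```python
from dataclasses import dataclass

# Atomic symbols carry only a name: no internal structure
@dataclass(frozen=True)
class Key:
    name: str

@dataclass(frozen=True)
class Nonce:
    name: str

@dataclass(frozen=True)
class Name:
    name: str

# Constructors build parse trees over the atoms
@dataclass(frozen=True)
class Enc:        # E_K(M)
    key: Key
    body: object

@dataclass(frozen=True)
class Pair:       # M || N
    left: object
    right: object

# First Needham-Schroeder message, E_KB(A || Na), as a parse tree:
msg1 = Enc(Key("KB"), Pair(Name("A"), Nonce("Na")))
```

Because the atoms are just symbols, two messages are equal exactly when their trees match; there is no bit-level structure to exploit.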

  9. Axiomatize the adversary Described by explicitly enumerated powers • Interact with countable number of participants • Each participant can play any role • Adversary also legitimate participant • Knowledge of all public values, non-secret keys • Limited set of re-write rules: Adversary can (non-deterministically) compose atomic abilities
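The decomposition half of these rewrite rules (un-pairing, and decrypting under a known inverse key) can be sketched as a fixed-point computation; the names and tagged-tuple message encoding here are illustrative, not from the talk:

```python
def close(know, inv):
    """Decomposition closure of the adversary's knowledge under the
    Dolev-Yao rewrite rules: split known pairs, and open a known
    ciphertext when the inverse key is also known.  Messages are
    tagged tuples ("pair", l, r) / ("enc", key, body) or atoms (str);
    `inv` maps each encryption key to its decryption key."""
    know = set(know)
    while True:
        new = set()
        for m in know:
            if isinstance(m, tuple) and m[0] == "pair":
                new.update((m[1], m[2]))
            elif isinstance(m, tuple) and m[0] == "enc" and inv.get(m[1]) in know:
                new.add(m[2])
        if new <= know:            # fixed point reached
            return know
        know |= new

# With B's private key, the adversary learns Na from E_KB(A || Na):
known = close({("enc", "KB", ("pair", "A", "Na")), "KB-1"}, {"KB": "KB-1"})
```

The composition rules (pairing and encrypting known terms) generate infinitely many messages, so a checker computes only this decomposition closure and handles composition on demand.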

  10. Security is unreachability Some state is unreachable via any chain of adversary actions • Secrecy (symbolic model): “If A or B output (X, K), then no composition of adversary actions can result in K” • Authentication of A to B: “If B outputs (A, K), then no composition of adversary actions can result in A outputting (X, K’) where X ≠ B” Main advantage of symbolic model: security proofs are simple • Automatable, in fact! • Demo 1: NSL provides all three: • Mutual authentication • Key agreement • Secrecy for both Na, Nb

  11. A biased sample of previous work (symbolic model) • Analysis methods/mathematical frameworks • Many, many, many proposed • Two main survivors: spi calculus [AG] & strand spaces [THG] • Automation • Undecidable in general [EG, HT, DLMS] but: • Decidable with bounds [DLMS, RT] • Also, general case can be automatically verified in practice • Cryptographic Protocol Shape Analyzer [DHGT] • Many others • Extensions • Diffie-Hellman [MS, H] • Trust-management / higher-level applications [GTCHRS] • Compilation • Cryptographic Protocol Programming Language (CPPL) [GHRS]

  12. Central issue of this talk So what? • Symbolic model has weak adversary, strong assumptions • No a priori guarantees about stronger adversaries • Real adversaries can make up new “ciphertexts” • Real adversaries can try decrypting with wrong key • Real adversaries can exploit relationships between nonces/keys Symbolic proofs may not apply! This talk: ways in which symbolic proofs are (and are not) meaningful in the computational model Can we trust symbolic security proofs in the ‘real world’?

  13. The computational model Outgrowth of complexity theory

  14. Example: semantic security [GM] Described as a game between referee R and adversary A: • Referee generates a fresh key-pair (K, K-1) ← G • Referee gives the public key K to the adversary • Adversary provides two messages m0 and m1 • Referee chooses b ← U(0,1) and encrypts: c = EK(mb) • Adversary gets the resulting ciphertext c • Adversary outputs a guess g of which was encrypted Semantic security: no adversary can do better than chance ∀ poly-time A: Pr[b = g] ≈ .5
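The game structure can be phrased as a small harness. This illustrative sketch (toy names, no real cryptography) plugs in a deliberately broken ‘identity’ scheme to show how a distinguishing adversary wins every round:

```python
import secrets

def semantic_security_game(keygen, enc, adversary):
    """One round of the semantic-security game from the slide:
    referee generates a key-pair, adversary picks (m0, m1), referee
    encrypts a uniformly chosen one, adversary guesses which.
    Returns True iff the adversary's guess g equals the bit b."""
    pk, sk = keygen()
    m0, m1 = adversary.choose(pk)
    b = secrets.randbelow(2)              # b ~ U(0,1)
    g = adversary.guess(enc(pk, (m0, m1)[b]))
    return g == b

# Against an obviously broken 'identity' scheme, the ciphertext
# IS the plaintext, so this adversary distinguishes every time:
class TrivialAdversary:
    def choose(self, pk):
        return b"0", b"1"
    def guess(self, c):
        return 1 if c == b"1" else 0

wins = sum(semantic_security_game(lambda: (None, None),
                                  lambda pk, m: m,
                                  TrivialAdversary())
           for _ in range(200))
```

For a semantically secure scheme, no polynomial-time adversary plugged into this harness should win noticeably more than half the rounds.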

  15. Example II: real-or-random secrecy (‘universally composable’ version) Another game, between the adversary and protocol participants P1, P2, P3: • Participants engage in the protocol • Adversary has control over the network • When any participant finishes the protocol, it outputs either the real key or a random key • Other participants continue the protocol and output the same key • Adversary guesses ‘real’ or ‘random’ Real-or-random secrecy: no adversary can do better than chance ∀ poly-time A: Pr[A is correct] ≈ .5

  16. Soundness • Computational properties are strong, but complex and hard to prove • Symbolic proofs are much easier, but unconvincing • Soundness: symbolic proofs imply computational properties Result: automated proof-methods yield strong properties! Protocol → symbolic property: easy. Protocol → computational property: hard. Symbolic property ⇒ computational property: hard, but done once.

  17. Previous work (soundness) • [AR]: soundness for indistinguishability • Passive adversary • [MW, BPW]: soundness for general trace properties • Includes mutual authentication; active adversary • Many, many others Remainder of talk: 2 non-soundness results • Key-cycles (joint work with Adao, Bana, Scedrov) • Secrecy (joint work with Canetti)

  18. Key cycles • When a key is used to encrypt itself: EK(K) • More generally: K1 encrypts K2, K2 encrypts K3, … until Kn encrypts K1: EK1(…K2…), EK2(…K3…), …, EKn(…K1…) • Problem for soundness • Symbolic model: key-cycles are like any other encryption • Computational model: standard security defs don’t apply
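Finding such a cycle in a protocol’s “K1 encrypts K2” relation is ordinary graph-cycle detection; a small illustrative sketch (hypothetical function name and input format):

```python
def has_key_cycle(encrypts):
    """Detect a key cycle, given the 'encrypts' relation as a dict
    mapping each key to the set of keys appearing under it.
    Plain depth-first search with three colors."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}

    def dfs(k):
        color[k] = GRAY
        for k2 in encrypts.get(k, ()):
            c = color.get(k2, WHITE)
            if c == GRAY:                  # back edge: cycle found
                return True
            if c == WHITE and dfs(k2):
                return True
        color[k] = BLACK
        return False

    return any(color.get(k, WHITE) == WHITE and dfs(k) for k in encrypts)
```

The self-loop EK(K) and the longer chain K1 → K2 → … → Kn → K1 are both just cycles in this graph.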

  19. Semantic security, revisited • In the game, the adversary generates m0 and m1 based on the public key only! • The definition doesn’t talk about messages based on private keys • Easy to devise semantically secure schemes that fail in the presence of key-cycles

  20. Counter-example • Let E be a semantically-secure encryption algorithm • Let E’ be: E’K(M) = EK(M) if M ≠ K, and K if M = K • E’ is semantically secure, unless it encounters a key-cycle • Contrived example, but a valid counterexample • Symbolic encryption is stronger than semantic security Soundness requires a new computational security definition
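A toy rendering of the counterexample’s shape (the ‘underlying scheme’ is a placeholder, not real crypto): the wrapper behaves like the underlying scheme on every plaintext except the key itself, which it returns in the clear:

```python
import secrets

def toy_enc(key: bytes, msg: bytes) -> bytes:
    """Placeholder standing in for a semantically-secure scheme E.
    NOT real crypto: a real scheme would actually hide msg."""
    return b"ct:" + secrets.token_bytes(8) + msg

def contrived_enc(key: bytes, msg: bytes) -> bytes:
    """The slide's E': identical to E on every plaintext except the
    key itself.  Harmless in ordinary use, catastrophic on the
    key cycle E'_K(K)."""
    return key if msg == key else toy_enc(key, msg)

k = b"secret-key"
leak = contrived_enc(k, k)            # the key cycle leaks the key
normal = contrived_enc(k, b"hello")   # everything else looks like E
```

No semantic-security adversary ever queries the key itself, so the bad branch is never exercised in the standard game; only a key cycle triggers it.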

  21. Resolution: ‘KDM security’ ‘Key-dependent message security’ • Proposed by [BRS/AC] • Implies soundness in presence of key cycles [ABHS] Future work • Devise a KDM-secure encryption algorithm • Find a non-contrived non-KDM algorithm • Define & implement KDM-secure hashing • Note: hash-based key-cycles occur in TLS and SSH!

  22. Soundness for secrecy • Does symbolic secrecy imply computational secrecy? • Implies weakened notion [CW], but… • Unfortunately, not the UC definition • Counter-example: • Demo: NSL satisfies symbolic secrecy for Nb • Cannot provide UC real-or-random secrecy

  23. The ‘Rackoff attack’ (on NSL)
  A → B: EKB(A || Na)
  B → A: EKA(Na || Nb || B)
  A → B: EKB(Nb)
  Adv → B: EKB(K), testing whether K = Nb
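The test itself is trivial to express. In this toy sketch (hypothetical helpers standing in for B’s behavior, symbolic messages as tuples), the adversary encrypts its candidate key under B’s public key and watches whether B accepts it as the third NSL message, distinguishing a real session key from a random one:

```python
def b_third_step(expected_nb, ct):
    """B's final check in NSL: accept iff the ciphertext decrypts
    (symbolically) to the nonce Nb from this session."""
    return ct == ("enc", "KB", expected_nb)

def rackoff_test(candidate_key, oracle):
    """Encrypt the candidate under B's public key and see whether B
    accepts it as message 3.  The real key (here the session secret
    Nb) is accepted; a random candidate is not."""
    return oracle(("enc", "KB", candidate_key))

# Session in which the agreed secret is Nb:
oracle = lambda ct: b_third_step("Nb", ct)
real_accepted = rackoff_test("Nb", oracle)
random_accepted = rackoff_test("some-random-key", oracle)
```

Symbolic secrecy of Nb is untouched (the adversary never learns Nb), yet real-or-random secrecy fails, which is exactly the soundness gap.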

  24. Achieving soundness • Every single symbolic secrecy proof has been weak (not wrong) • Symbolic secrecy implies only weak computational properties • ‘Real’ soundness requires a new symbolic definition of secrecy • [BPW]: ‘traditional’ secrecy + ‘non-use’ • Thm: new definition implies secrecy • But: must analyze infinite concurrent sessions and all resulting protocols • Here: ‘traditional’ secrecy + symbolic real-or-random • Non-interference property; close to ‘strong secrecy’ [B] • Thm: new definition equivalent to UC real-or-random • Demonstrably automatable (Demo 2)

  25. Decidability of secrecy Side effect of proof method: • Computational crypto automagically prevents cross-session interaction • Thus, suffices to analyze single session in isolation

  26. More future work • Soundness • Implement decision procedure for symbolic real-or-random • Extend result past public-key encryption (e.g., hashing, symmetric encryption) • Apply analysis to real-world protocols (TLS, SSH, etc.) • What is traditional symbolic secrecy good for? • Symbolic model • Apply methods to new problems (crypto APIs) • Unify compilation, analysis tools • Symbolic notions for new properties (e.g., anonymity)

  27. Conclusion • Want to prove protocols secure • Easy to prove security in ‘ideal’ setting (symbolic model) • Meaningful to prove security in ‘real’ setting (computational model) • Soundness: ‘ideal’ proof implies ‘real’ security • Two aspects of symbolic model are not sound • Key-cycles: must strengthen computational encryption • Secrecy: must strengthen symbolic definition • Important side-effect: soundness for new definition implies decidability

  28. Thanks!

  29. KDM-secure encryption (oversimplified) ‘Key-dependent message’ game between referee R and adversary A: • Referee generates (K, K-1) ← G and gives K to the adversary • Adversary provides two functions f0 and f1 • Referee chooses b ← U(0,1), applies fb to the private key, and encrypts the result: c = EK(fb(K-1)) • Adversary gets c and outputs a guess g • KDM security: no adversary can do better than chance • Strictly stronger than semantic security

  30. Overview • This talk: symbolic analysis can guarantee universally composable (UC) key exchange • (Paper also includes mutual authentication) • Symbolic (Dolev-Yao) model: high-level framework • Messages treated symbolically; adversary extremely limited • Despite (general) undecidability, proofs can be automated • Result: symbolic proofs are computationally sound (UC) • For some protocols • For strengthened symbolic definition of secrecy • With UC theorems, suffices to analyze single session • Implies decidability!

  31. Two approaches to analysis • Standard (computational) approach: reduce attacks to weakness of encryption • Alternate approach: apply methods of the symbolic model • Originally proposed by Dolev & Yao (1983) • Cryptography without: probability, security parameter, etc. • Messages are parse trees • Countable symbols for keys (K, K’, …), names (A, B, …) and nonces (N, N’, Na, Nb, …) • Encryption (EK(M)) and pairing (M || N) are constructors • Participants send/receive messages • Output some key-symbol

  32. The symbolic adversary • Explicitly enumerated powers • Interact with countable number of participants • Knowledge of all public values, non-secret keys • Limited set of re-write rules:

  33. ‘Traditional’ symbolic secrecy • Conventional goal for symbolic secrecy proofs: “If A or B output K, then no sequence of interactions/rewrites can result in K” • Undecidable in general [EG, HT, DLMS] but: • Decidable with bounds [DLMS, RT] • Also, general case can be automatically verified in practice • Demo 1: analysis of both NSLv1, NSLv2 • So what? • Symbolic model has weak adversary, strong assumptions • We want computational properties! • …But can we harness these automated tools?

  34. Two challenges • Traditional secrecy is undecidable for: • Unbounded message sizes [EG, HT] or • Unbounded number of concurrent sessions (Decidable when both are bounded) [DLMS] • Traditional secrecy is unsound • Cannot imply standard security definitions for computational key exchange • Example: NSLv2 (Demo)

  35. Prior work: BPW • New symbolic definition (theory and practice) • Implies UC key exchange • Covers public-key & symmetric encryption, signatures

  36. Our work • New symbolic definition: ‘real-or-random’ (theory and practice) • Equivalent to UC key exchange (public-key encryption [CH], signatures [P]) • Automated verification! • UC + finite system: suffices to examine a single protocol run • Decidability? • Demo 3: UC security for NSLv1

  37. Our work: solving the challenges • Soundness: requires new symbolic definition of secrecy • Ours: purely symbolic expression of ‘real-or-random’ security • Result: new symbolic definition equivalent to UC key exchange • UC theorems: sufficient to examine single protocol in isolation • Thus, bounded numbers of concurrent sessions • Automated verification of our new definition is decidable!… Probably

  38. Summary • Summary: • Symbolic key-exchange sound in UC model • Computational crypto can now harness symbolic tools • Now have the best of both worlds: security and automation! • Future work

  39. Secure key-exchange: UC Answer: yes, it matters • Negative result [CH]: traditional symbolic secrecy does not imply universally composable key exchange

  40. Secure key-exchange: UC • Adversary gets the key when it is output by participants • Does this matter? (Demo 2)

  41. Secure key-exchange [CW] • Adversary interacts with participants • Afterward, receives the real key and a random key (K, K’) • Protocol secure if adversary unable to distinguish them • NSLv1, NSLv2 satisfy the symbolic def of secrecy • Therefore, NSLv1, NSLv2 meet this definition as well

  42. UC key exchange (ideal functionality FKE, simulator S) • Adversary unable to distinguish real/ideal worlds • Effectively: real or random keys • Adversary gets a candidate key at the end of the protocol • NSL1, NSL2 secure by this defn.

  43. Analysis strategy • Concrete protocol → Dolev-Yao protocol: natural translation for a large class of protocols • Dolev-Yao protocol → Dolev-Yao key-exchange: simple, automated • Would like: concrete protocol → UC key-exchange functionality • Main result of talk (need only be done once): Dolev-Yao key-exchange implies the UC key-exchange functionality

  44. Proof overview (soundness) Chain of reductions: symbolic key-exchange ⇒ (info-theoretic) single-session UC KE (ideal crypto) ⇒ (UC w/ joint state [CR]) multi-session UC KE (ideal crypto) ⇒ (UC theorem) multi-session KE (CCA-2 crypto) • Construct simulator • Information-theoretic • Must strengthen notion of UC public-key encryption • Intermediate step: trace properties (as in [MW, BPW]) • Every activity-trace of the UC adversary could also be produced by the symbolic adversary • Rephrase: UC adversary no more powerful than symbolic adversary

  45. “Simple” protocols • Concrete protocols that map naturally to the Dolev-Yao framework • Two cryptographic operations: randomness generation, encryption/decryption • (This talk: asymmetric encryption) • Example: Needham-Schroeder-Lowe:
  P1 → P2: {P1, N1}K2
  P2 → P1: {P2, N1, N2}K1
  P1 → P2: {N2}K2

  46. UC Key-Exchange Functionality FKE • P1 sends (P1 → P2) and P2 sends (P2 → P1) to FKE • FKE chooses k ← {0,1}n • FKE outputs ‘Key k’ to P1 and P2 (adversary A interacts with FKE as well)

  47. The Dolev-Yao model • Participants, adversary take turns • Participant turn: receive a message M1, send a message M2, and produce a local output L (not seen by the adversary)

  48. The Dolev-Yao adversary • Adversary turn: applies its deduction rules to the set Know and delivers the resulting message to a participant

  49. Dolev-Yao adversary powers • Always in Know: • Randomness generated by adversary • Private keys generated by adversary • All public keys

  50. The Dolev-Yao adversary • Sends any message M derivable from Know to any participant
