
Symbolic methods for cryptography



Presentation Transcript


  1. Symbolic methods for cryptography Bogdan Warinschi University of Bristol Computational Soundness

  2. Toy example (A and B share a long-term key K) • A → B: A, N1 • B → A: {N1, N2, Ks}K • A → B: {B, N2}Ks, {D}Ks • Is the data D secret? Computational Soundness

  3. Security Models • Mathematical model • Security property • Proof method Computational Soundness

  4. Abstraction Levels Computational Soundness

  5. Abstraction Levels Insecurity Computational Soundness

  6. Abstraction Levels Security Computational Soundness

  7. Two types of security models [diagram: two parallel worlds, each with its own security property, model, and proof method] Computational Soundness

  8. Outline • A gap between models for encryption: • security definitions • proofs • Bridging the gap: • The passive adversaries case: • the Abadi-Rogaway logic • extensions • The active adversaries case (tomorrow) Computational Soundness

  9. Two views of security for encryption schemes Computational Soundness

  10. Symbolic treatment of encryption • Messages are elements from a term algebra: • Data = {D1,D2,…}, • Keys = {K1,K2,…}, • Random nonces = {N1,N2,…}, • Identities = {A,B,…} • BASIC := Data | Keys | Random nonces | Identities • TERM := BASIC | (TERM, TERM) | {TERM}Keys • Messages are terms, e.g. N2 , {((B, N1), Ks)}K Computational Soundness
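
As a rough illustration (not part of the slides, and with naming of my own), the term algebra above can be modeled with tagged tuples; the later sketches in this transcript reuse this representation.

```python
# A minimal sketch of the term algebra: basic terms are tagged atoms, and
# compound terms are pairs and symmetric encryptions under a (basic) key.
def pair(t1, t2):
    return ("pair", t1, t2)

def enc(key, t):
    # the term {t}_key; the grammar only allows basic keys in key position
    assert key[0] == "key"
    return ("enc", key, t)

# Basic terms: data, keys, nonces, identities
B, N1, Ks, K = ("id", "B"), ("nonce", "N1"), ("key", "Ks"), ("key", "K")

# Example from the slide: the term {((B, N1), Ks)}_K
example = enc(K, pair(pair(B, N1), Ks))
print(example)
```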

  11. Symbolic treatment of encryption • Security for encryption is axiomatized • Given {M}K, the adversary can compute M only if it has K • Deduction rules: from (M1, M2) derive M1 and M2; from {M}K and K derive M; from M and K derive {M}K; from M1 and M2 derive (M1, M2) Computational Soundness
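
To make the deduction rules concrete, here is a sketch (my own code, not from the talk) of the derivability check as a fixed-point computation over the tagged-tuple terms above. Only the two destructor rules are applied, which suffices when asking whether an atomic secret (a key or a data item) can be learned; the pairing and encryption rules only build larger terms.

```python
def derivable(knowledge, goal):
    """Close `knowledge` under the destructor rules, then test membership."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            if t[0] == "pair":                     # (M1, M2) |- M1 and M2
                new = {t[1], t[2]}
            elif t[0] == "enc" and t[1] in known:  # {M}K, K |- M
                new = {t[2]}
            else:
                new = set()
            if not new <= known:
                known |= new
                changed = True
    return goal in known

# Example (anticipating the double-encryption slides below):
K, M = ("key", "K"), ("data", "M")
print(derivable({("enc", K, ("enc", K, M)), K}, M))   # True: decrypt twice
print(derivable({("enc", K, ("enc", K, M))}, M))      # False: K is unknown
```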

  12. Computational treatment for encryption • Messages are bitstrings • Symmetric encryption scheme Π = (Kg, Enc, Dec) • Kg(η) outputs a random bitstring k in {0,1}^η • Enc: {0,1}^η × {0,1}* → {0,1}* (more precisely, Enc(k, m) is a distribution on {0,1}*) • Dec: {0,1}^η × {0,1}* → {0,1}* • It holds that Dec(k, Enc(k, m)) = m • E.g. AES-CBC Computational Soundness
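
For concreteness, here is a toy Python sketch of the (Kg, Enc, Dec) interface (my own construction, not from the slides): a hash-based stream cipher with a fresh random nonce per encryption. It is only a stand-in; a real instantiation would use a vetted scheme such as AES-CBC or AES-GCM from a cryptographic library.

```python
import os, hashlib

def Kg(eta=16):
    """Key generation: a uniformly random eta-byte key."""
    return os.urandom(eta)

def _keystream(key, nonce, length):
    """Expand (key, nonce) into `length` pseudorandom bytes with SHA-256."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def Enc(key, msg):
    """Randomized encryption: pick a fresh nonce, XOR with the keystream."""
    nonce = os.urandom(16)
    body = bytes(a ^ b for a, b in zip(msg, _keystream(key, nonce, len(msg))))
    return nonce + body

def Dec(key, ct):
    """Recover the plaintext: re-derive the keystream from the stored nonce."""
    nonce, body = ct[:16], ct[16:]
    return bytes(a ^ b for a, b in zip(body, _keystream(key, nonce, len(body))))

k = Kg()
assert Dec(k, Enc(k, b"some message m")) == b"some message m"   # correctness
```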

  13. Computational treatment for encryption • The IND-CPA game: the adversary has oracle access to Enc(K, _), submits M0, M1 with |M0| = |M1|, receives Enc(K, Mb), and outputs a guess for b • The scheme Π = (Kg, Enc, Dec) is IND-CPA secure if for all adversaries, Pr[adversary guesses b] ≤ ½ + negligible(η) Computational Soundness
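
The game on this slide can be written out as a small experiment (a sketch with my own naming, not the talk's code). The demo instantiates it with an obviously broken "scheme" to show the game rejecting it; a secure scheme would leave every efficient adversary near a 1/2 win rate.

```python
import os, secrets

def ind_cpa_experiment(Kg, Enc, adversary):
    """One run of the IND-CPA game; returns True iff the adversary wins."""
    k = Kg()
    b = secrets.randbits(1)
    def oracle(m0, m1):
        assert len(m0) == len(m1)          # only equal-length message pairs
        return Enc(k, (m0, m1)[b])
    return adversary(oracle) == b

# Demo: identity "encryption" leaks the plaintext, so a peeking adversary
# wins every single run.
toy_Kg = lambda: os.urandom(16)
toy_Enc = lambda k, m: m                   # obviously not IND-CPA secure
def peeking_adversary(oracle):
    return 0 if oracle(b"left!", b"right") == b"left!" else 1

wins = sum(ind_cpa_experiment(toy_Kg, toy_Enc, peeking_adversary)
           for _ in range(100))
print(wins, "wins out of 100")             # 100
```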

  14. Security of double encryption • A and B share a key K; the message { {M}K }K is sent • Is the message M secret? Computational Soundness

  15. Security of double encryption: symbolically • Does there exist a derivation of M from {{M}K}K using only the deduction rules above (projection, pairing, encryption, and decryption with a known key)? Computational Soundness

  16. Security of double encryption: computationally • The adversary has oracle access to Enc(K, Enc(K, _)), submits M0, M1 with |M0| = |M1|, receives Enc(K, Enc(K, Mb)), and must guess b Computational Soundness

  17. Security of double encryption: computationally • Reduction to IND-CPA: query the oracle Enc(K, _) on (M0, M0) to get C0 = Enc(K, M0), on (M1, M1) to get C1 = Enc(K, M1), and then on (C0, C1) to get C = Enc(K, Enc(K, Mb)); hand C to the double-encryption adversary and output its guess for b Computational Soundness
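
Read as code, the reduction on this slide wraps a double-encryption adversary into an IND-CPA adversary (a sketch with my own naming; it assumes the scheme is length-regular, so that |C0| = |C1| and the third oracle query is legal).

```python
def double_enc_reduction(double_adversary):
    """Turn an adversary against Enc(K, Enc(K, _)) into an IND-CPA adversary."""
    def ind_cpa_adversary(oracle):              # oracle(m0, m1) = Enc(K, m_b)
        def double_oracle(m0, m1):
            c0 = oracle(m0, m0)                 # C0 = Enc(K, M0)
            c1 = oracle(m1, m1)                 # C1 = Enc(K, M1)
            return oracle(c0, c1)               # Enc(K, C_b) = Enc(K, Enc(K, M_b))
        return double_adversary(double_oracle)  # forward its guess of b
    return ind_cpa_adversary
```

Any advantage against the double encryption translates into the same advantage against IND-CPA, so IND-CPA security of Enc implies security of the double encryption.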

  18. Two Paradigms for Protocol Analysis • Symbolic Approach: abstract model; Dolev-Yao (D-Y) adversaries; unclear how to ensure security of primitives; proofs can potentially be automated (theorem provers, model checkers) • Computational Approach: concrete model; powerful PPT adversaries; clear definitions for the security of primitives; complex protocols are difficult to analyze Computational Soundness

  19. Two types of security models [diagram: the two worlds again, each with its own security property, model, and proof method] Computational Soundness

  20. Two ways of bridging the gap [diagram: the two worlds side by side] • Apply methods/techniques from the red world directly in the blue world: Bruno, Sylvain, Marion's talks • Show that security in the red world implies security in the blue world Computational Soundness

  21. Computational Soundness: Soundness Theorems • Prove security in the symbolic model • Apply the soundness theorem • Deduce security in the computational model [diagram: a security property with a symbolic proof in the symbolic model is carried over to a computational proof in the computational model] Computational Soundness

  22. Two types of security models [diagram: the two worlds again, now annotated with Security and Insecurity] Computational Soundness

  23. Toy example (A and B share a long-term key K) • A → B: A, N1 • B → A: {N1, N2, Ks}K • A → B: {B, N2}Ks, {D}Ks • Is the data D secret? Computational Soundness

  24. Passive adversaries
• A protocol run: A, N1, {N1, N2, Ks}K, {B, N2}Ks, {D1}Ks
• Two interleaved sessions: A, N1, {N1, N2, Ks}K, A, N3, {N3, N4, Ks’}K, {B, N4}Ks’, {D2}Ks’, {B, N2}Ks, {D1}Ks
• Two interleaved sessions with corruption: A, N1, {N1, N2, Ks}K, Ks, A, N3, {N3, N4, Ks’}K, {B, N4}Ks’, {D2}Ks’, {B, N2}Ks, {D1}Ks Computational Soundness

  25. Defining secrecy, symbolically • To each expression associate a pattern. For E = {N1}K1, {{K1}K2}K3, K3, {K3}K2, {{K1,N2}K3,K3}K2: patt(E) = ▓, {▓}K3, K3, ▓, ▓ (tentative definition); alternatively, patt(E) = {N}K1, {{K0}K2}K3, K3, {K0}K2, {{K0,N}K0,K0}K2 Computational Soundness
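
The box-based (tentative) pattern can be sketched as follows, over the tagged-tuple terms used earlier (my own code; the second, renaming-style variant on the slide is not implemented): first compute the keys recoverable from the expression, then replace every encryption under an unrecoverable key by an opaque box.

```python
BOX = ("box",)

def recoverable_keys(expr):
    """Fixed point: keys readable in the clear or under already-recovered keys."""
    keys, changed = set(), True
    while changed:
        changed = False
        def visit(term, readable):
            nonlocal changed
            if term[0] == "key" and readable and term not in keys:
                keys.add(term)
                changed = True
            elif term[0] == "pair":
                visit(term[1], readable)
                visit(term[2], readable)
            elif term[0] == "enc":
                visit(term[2], readable and term[1] in keys)
        for t in expr:
            visit(t, True)
    return keys

def pattern(term, keys):
    if term[0] == "pair":
        return ("pair", pattern(term[1], keys), pattern(term[2], keys))
    if term[0] == "enc":
        if term[1] in keys:
            return ("enc", term[1], pattern(term[2], keys))
        return BOX                         # encryption under an unrecoverable key
    return term

def patt(expr):
    keys = recoverable_keys(expr)
    return [pattern(t, keys) for t in expr]

# The example from the slide: patt(E) = ▓, {▓}K3, K3, ▓, ▓
K1, K2, K3 = ("key", "K1"), ("key", "K2"), ("key", "K3")
N1, N2 = ("nonce", "N1"), ("nonce", "N2")
E = [("enc", K1, N1),
     ("enc", K3, ("enc", K2, K1)),
     K3,
     ("enc", K2, K3),
     ("enc", K2, ("pair", ("enc", K3, ("pair", K1, N2)), K3))]
print(patt(E))
```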

  26. Defining secrecy, symbolically • Is D1 secret in A, N1, {N1, N2, Ks}K, {B, N2}Ks, {D1}Ks? • Definition: D is hidden in E if D does not occur in patt(E) Computational Soundness
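
On top of the `patt` sketch above, the symbolic secrecy check is a simple occurrence test (again my own code):

```python
def occurs(d, term):
    """Does the data symbol d occur anywhere inside the term?"""
    if term == ("data", d):
        return True
    if term[0] in ("pair", "enc"):
        return any(occurs(d, sub) for sub in term[1:])
    return False

def symbolically_hidden(d, expr, patt):
    """D is hidden in E iff D does not occur in patt(E)."""
    return not any(occurs(d, t) for t in patt(expr))

# Trivial demo with the identity "pattern" (everything left visible):
E = [("enc", ("key", "Ks"), ("data", "D1"))]
print(symbolically_hidden("D1", E, lambda e: e))   # False: D1 occurs
```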

  27. Defining secrecy, computationally • Given: a valuation f: {D1, D2, ...} → {0,1}^n and an encryption scheme Π = (Kg, Enc, Dec) • Define a map [[ _ ]]_f : Expressions → Distributions, applied e.g. to A, N1, {N1, N2, Ks}K, {B, N2}Ks, {D1}Ks Computational Soundness

  28. Mapping expressions to (distributions on) bitstrings [diagram: keys are sampled with Kg, nonces with Rand, data symbols are mapped through the valuation f, and encryptions are computed with Enc; e.g. the expression {D1, {K5, N}K1}K1 is mapped to a long bitstring] • [[ _ ]]_f : Expressions → Distributions Computational Soundness
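
A sketch of the interpretation map [[ _ ]]_f (my own code and naming): keys and nonces are sampled once and reused across the expression, data symbols go through the valuation f, and pairs/encryptions are evaluated with a concrete scheme. Each call produces one sample of the distribution.

```python
import os

def interpret(term, f, env, Kg, Enc):
    """One sample of the bitstring interpretation of a single term."""
    kind = term[0]
    if kind == "data":
        return f[term[1]]                   # data symbols go through f
    if kind == "id":
        return term[1].encode()             # identities are fixed public strings
    if kind in ("key", "nonce"):
        if term not in env:                 # sample once, reuse afterwards
            env[term] = Kg() if kind == "key" else os.urandom(16)
        return env[term]
    if kind == "pair":
        left = interpret(term[1], f, env, Kg, Enc)
        right = interpret(term[2], f, env, Kg, Enc)
        return len(left).to_bytes(4, "big") + left + right  # unambiguous pairing
    if kind == "enc":
        if term[1] not in env:
            env[term[1]] = Kg()
        return Enc(env[term[1]], interpret(term[2], f, env, Kg, Enc))
    raise ValueError(f"unknown term kind {kind!r}")

def sample(expr, f, Kg, Enc):
    """One sample of [[expr]]_f, where expr is a list of terms."""
    env = {}                                # key/nonce values shared across expr
    return [interpret(t, f, env, Kg, Enc) for t in expr]

# e.g. one sample of [[ {D1, {K5, N}K1}K1 ]]_f, with placeholder primitives
f = {"D1": b"blah...blah"}
K1, K5, N = ("key", "K1"), ("key", "K5"), ("nonce", "N")
E = [("enc", K1, ("pair", ("data", "D1"), ("enc", K1, ("pair", K5, N))))]
toy_Kg = lambda: os.urandom(16)
toy_Enc = lambda k, m: b"enc[" + m + b"]"   # placeholder, NOT real encryption
print(sample(E, f, toy_Kg, toy_Enc))
```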

  29. Defining secrecy, computationally [diagram: for E = {D1, {K5, N}K1}K1 and two valuations f0, f1, the adversary is given a sample of [[E]]_fb and must guess b] • [[ _ ]]_f : Expressions → Distributions Computational Soundness

  30. Defining secrecy, computationally • Let E be an expression and Π an encryption scheme. The set T ⊆ Data is computationally hidden in E if for any valuations f0, f1 : Data → {0,1}^n with f0(D) = f1(D) for all D ∈ Data \ T, we have [[E]]_f0 ~ [[E]]_f1, where "~" means computational indistinguishability Computational Soundness
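
The definition can be phrased as a game (a sketch; `sample_fn` is assumed to be the interpretation map above with a concrete scheme fixed): hiding means no efficient adversary guesses b noticeably better than 1/2.

```python
import secrets

def hiding_experiment(E, f0, f1, T, sample_fn, adversary):
    """One run of the game behind 'T is computationally hidden in E'."""
    # the two valuations must agree on every data symbol outside T
    assert all(f0[d] == f1[d] for d in f0 if d not in T)
    b = secrets.randbits(1)
    view = sample_fn(E, (f0, f1)[b])        # one sample of [[E]]_{f_b}
    return adversary(view) == b
```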

  31. Relation between two very different worlds? • Is there a relation between the two notions of secrecy? • More generally: what does security proved in the symbolic world mean for the computational world? • Many symbolic versions of the same notion (e.g. two notions of patterns). Which one is right? • Many security notions for the same primitive in the concrete world. Which one is right? Computational Soundness

  32. Main technical result • Let E be an acyclic expression, Π be an IND-CPA secure encryption scheme, and f: {D1, D2, …, Dn} → {0,1}^n arbitrary. Then [[E]]_f ~ [[patt(E)]]_f • ({K}K and {K1}K2, {K2}K1 are not acyclic expressions) Computational Soundness

  33. Proof idea • Standard (but very general) hybrid argument • Construct E1, E2, …, En such that • E1 = E • En = patt(E) • [[Ei]] ~ [[ Ei+1]] • It is essential that E is acyclic Computational Soundness

  34. Soundness Theorem (Abadi, Rogaway (2000)) • Let E be an acyclic expression and Π be an IND-CPA secure encryption scheme. Then: T symbolically hidden in E ⇒ T computationally hidden in E Computational Soundness

  35. Proof • Given: T is symbolically hidden in E (no D ∈ T occurs in the pattern of E). Want: for any f0, f1 : Data → {0,1}^n with f0(D) = f1(D) whenever D ∉ T, [[E]]_f0 is indistinguishable from [[E]]_f1 • By the main result, [[E]]_f0 ~ [[patt(E)]]_f0 and [[E]]_f1 ~ [[patt(E)]]_f1; since no D ∈ T occurs in patt(E) and f0, f1 agree outside T, [[patt(E)]]_f0 and [[patt(E)]]_f1 are the same distribution, hence [[E]]_f0 ~ [[E]]_f1 Computational Soundness

  36. The previous result is an instance of a soundness theorem [diagram: a security property with a symbolic proof in the symbolic model is carried over to a computational proof in the computational model] Computational Soundness

  37. (One) Hybrid argument • E0 = {K1}K2, {K3}K1, {D}K3 • E1 = {K0}K2, {K3}K1, {D}K3 • E2 = {K0}K2, {K0}K1, {D}K3 • E3 = {K0}K2, {K0}K1, {D0}K3 Computational Soundness

  38. (One) Hybrid argument An adversary that distinguishes between [[E0]] and [[E3]] must distinguish between [[Ei]] and [[Ei+1]] for some i • E0 = {K1}K2, {K3}K1, {D}K3 • E1 = {K0}K2, {K3}K1, {D}K3 • E2 = {K0}K2, {K0}K1, {D}K3 • E3 = {K0}K2, {K0}K1, {D0}K3 Computational Soundness

  39. (One) Hybrid argument • E0 = {K1}K2, {K3}K1, {D}K3 • E1 = {K0}K2, {K3}K1, {D}K3 • E2 = {K0}K2, {K0}K1, {D}K3 • E3 = {K0}K2, {K0}K1, {D0}K3 Computational Soundness

  40. (One) Hybrid argument • E0 = {K1}K2, {K3}K1, {D}K3 vs E1 = {K0}K2, {K3}K1, {D}K3 • Reduction to IND-CPA (the challenge oracle Enc(k, _) returns c = Enc(k, kb)): generate k0, k1, k3; send k0, k1; receive c; compute c1 = Enc(k1, k3); compute c2 = Enc(k3, d); output (c, c1, c2) Computational Soundness
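
The steps on this slide can be written out as the following wrapper (a sketch with my own naming): a distinguisher between samples of [[E0]] and [[E1]] becomes an IND-CPA adversary. The challenge key plays K2, `d` stands in for the interpretation of D, and Kg/Enc are any concrete scheme.

```python
def hybrid_step_reduction(distinguisher, Kg, Enc, d=b"the data D"):
    """E0 = {K1}K2, {K3}K1, {D}K3  vs  E1 = {K0}K2, {K3}K1, {D}K3."""
    def ind_cpa_adversary(oracle):          # oracle(m0, m1) = Enc(k, m_b); k plays K2
        k0, k1, k3 = Kg(), Kg(), Kg()       # generate k0, k1, k3
        c = oracle(k0, k1)                  # send k0, k1; receive c = Enc(k, k_b)
        c1 = Enc(k1, k3)                    # {K3}K1
        c2 = Enc(k3, d)                     # {D}K3
        guess = distinguisher((c, c1, c2))  # 0: looks like [[E0]], 1: like [[E1]]
        return 1 - guess                    # b = 1 means c encrypts k1 (the E0 case)
    return ind_cpa_adversary
```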

  41. Questions:
• Is D1 secret in: A, N1, {N1, N2, Ks}K, {B, N2}Ks, {D1}Ks
• Is D1 secret in: A, N1, {N1, N2, Ks}K, A, N3, {N3, N4, Ks’}K, {B, N4}Ks’, {D2}Ks’, {B, N2}Ks, {D1}Ks
• Are D1 and D2 secret in: A, N1, {N1, N2, Ks}K, Ks, A, N3, {N3, N4, Ks’}K, {B, N4}Ks’, {D2}Ks’, {B, N2}Ks, {D1}Ks Computational Soundness

  42. Some difficulties • The usefulness of a soundness theorem increases with its generality • Is D1 secret in: g^x, N1, g^y, {N1, Ks}g^xy, {D1}Ks; in g^x, N1, g^y, {N1, Ks}g^(x+y), {D1}Ks; in g^x, g^y, g^z, g^xy, {Ks}g^xyz, {D1}Ks? • Deal with protocols where g^(x1x2+x2x3+…+xnx1) occurs • How about in g^x, g^y, {N1, Ks}g^xy, {D1}Ks, H(N1, D1); in g^x, g^y, N1, {Ks}g^xy, {D1}Ks, H(N1, D1)? Computational Soundness

  43. Some difficulties • The intuition behind Dolev-Yao models may not always be right! • patt({D}K1, {D,D}K2) = ▓, ▓ = patt({D}K1, {D}K1) • There exist IND-CPA encryption schemes for which encryption with the same key can be observed • Strengthen the notion of security for encryption in the computational world • Refine the notion of patterns in the symbolic world Computational Soundness

  44. Acyclicity • The intuition behind Dolev-Yao models may be wrong! • Is D secret in {K}K, {D}K? • There exist IND-CPA encryption schemes which are completely insecure if used as above • Is D secret in {K1}K2, {K2}K1, {D}K? • …? • Solutions: declare the above use insecure, or define and construct key-dependent encryption Computational Soundness

  45. Computational soundness • Relates symbolic and computational models so that security results transfer • Why should we care? • For symbolic formalisms: gives insight into the models and justifies the use of symbolic models in a very strong sense • For cryptography: symbolic models are simpler and easier to understand; for large protocols with complex interactions life is simpler Computational Soundness
