
Computational Soundness for PCL


Presentation Transcript


  1. Computational Soundness for PCL Dilsun Kaynar Carnegie Mellon University Foundations of Security and Privacy October 11, 2007

  2. Security protocol analysis: an Analysis Tool takes as input a Protocol, a Property, and an Attacker model, and outputs either a security proof or an attack.

  3. PCL Methodology: Protocol: SSL. Property: authentication. Attacker model: complete control over the network, perfect crypto. Analysis Tool: our tool, Protocol Composition Logic (PCL). Output: an axiomatic security proof.

  4. Alternative view: Protocol: SSL. Property: authentication. Attacker model: any ppt algorithm, explicit crypto. Analysis Tool: an alternative formalism with its associated proof techniques. Output: a security proof or attack.

  5. Two worlds Can we get the best of both worlds?

  6. PCL Approach
  • Protocol Composition Logic (PCL), the subject of the lectures on PCL so far: Syntax; Proof System; Semantics over the symbolic "Dolev-Yao" model
  • Computational PCL: Syntax (± small changes); Proof System (± small changes); Semantics over the complexity-theoretic model
  • Leverage PCL success…

  7. PCL Approach: high-level proof principles
  • PCL: Syntax (properties); Proof System (proofs); PCL Semantics (meaning of formulas) over the Symbolic Model; Soundness Theorem proved by induction; unbounded # of concurrent sessions
  • Computational PCL: Syntax (±); Proof System (±); PCL Semantics (meaning of formulas) over the Cryptographic Model; Soundness Theorem proved by reduction [BPW, MW, …]; polynomial # of concurrent sessions

  8. Reminder: PCL Protocol Language
  • Terms: constants, variables, name, key, (t,t), sigK{t}, encK{t}
  • Actions: new t, send t, receive x, sign(t,K), verify(t,t,K), enc(t,K), dec(t,K)
  • Strand, cord: [a; …; a]P
  • Role, protocol
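The grammar above transcribes almost directly into a datatype. Below is a minimal Python sketch of the term and action syntax; the grammar itself is the slide's, while the NamedTuple encoding and the class names are choices made for this sketch.

```python
# A minimal Python transcription of the PCL protocol language on slide 8.
# The grammar (terms, actions, cords) is the slide's; the NamedTuple encoding
# and the class names are choices made for this sketch.
from typing import NamedTuple, Tuple, Union

# Terms: constants, variables, names, keys, pairs (t, t), sig_K{t}, enc_K{t}
class Const(NamedTuple): value: str
class Var(NamedTuple): name: str
class Name(NamedTuple): principal: str
class Key(NamedTuple): owner: str
class Pair(NamedTuple): fst: "Term"; snd: "Term"
class Sig(NamedTuple): key: Key; body: "Term"          # sig_K{t}
class Enc(NamedTuple): key: Key; body: "Term"          # enc_K{t}
Term = Union[Const, Var, Name, Key, Pair, Sig, Enc]

# Actions: new t, send t, receive x, sign(t,K), verify(t,t,K), enc(t,K), dec(t,K)
class New(NamedTuple): t: "Term"
class Send(NamedTuple): t: "Term"
class Receive(NamedTuple): x: Var
class SignAct(NamedTuple): t: "Term"; key: Key
class VerifyAct(NamedTuple): sig: "Term"; t: "Term"; key: Key
class EncAct(NamedTuple): t: "Term"; key: Key
class DecAct(NamedTuple): t: "Term"; key: Key
Action = Union[New, Send, Receive, SignAct, VerifyAct, EncAct, DecAct]

# A cord [a; ...; a]_P is a principal together with a sequence of actions;
# a role is such a cord schema, and a protocol is a finite set of roles.
class Cord(NamedTuple): principal: str; actions: Tuple[Action, ...]
```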

  9. Challenge-Response as Cords
  Message flow: A → B: m, A    B → A: n, sigB{m, n, A}    A → B: sigA{m, n, B}
  InitCR(A, X) = [ new m; send A, X, {m, A}; receive X, A, {x, sigX{m, x, A}}; send A, X, sigA{m, x, X}; ]
  RespCR(B) = [ receive Y, B, {y, Y}; new n; send B, Y, {n, sigB{y, n, Y}}; receive Y, B, sigY{y, n, B}; ]
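To make the cords concrete, here is a small, self-contained Python sketch that plays one honest run of the two roles above using purely symbolic messages (no real cryptography); the helpers sig, check_sig and run_cr are names invented for this sketch, not PCL notation.

```python
# One honest run of the Challenge-Response protocol on slide 9, with
# symbolic messages only.  Helper names are illustrative.

def sig(signer, payload):
    """Symbolic signature sig_signer{payload}."""
    return ("sig", signer, payload)

def check_sig(term, signer, payload):
    return term == ("sig", signer, payload)

def run_cr(A="A", B="B"):
    # InitCR(A, B): new m; send A, B, {m, A}
    m = "m"                                   # fresh nonce (symbolic)
    msg1 = (A, B, (m, A))

    # RespCR(B): receive Y, B, {y, Y}; new n; send B, Y, {n, sig_B{y, n, Y}}
    Y, _, (y, _) = msg1
    n = "n"                                   # fresh nonce (symbolic)
    msg2 = (B, Y, (n, sig(B, (y, n, Y))))

    # InitCR: receive B, A, {x, sig_B{m, x, A}}; send A, B, sig_A{m, x, B}
    _, _, (x, s) = msg2
    assert check_sig(s, B, (m, x, A)), "initiator rejects responder signature"
    msg3 = (A, B, sig(A, (m, x, B)))

    # RespCR: receive A, B, sig_A{y, n, B}
    _, _, s2 = msg3
    assert check_sig(s2, A, (y, n, B)), "responder rejects initiator signature"
    return "both roles complete"

if __name__ == "__main__":
    print(run_cr())
```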

  10. Attacker capabilities • Controls complete network • Can read, remove, inject messages • Fixed set of operations on terms • Pairing • Projection • Encryption with known key • Decryption with known key Commonly referred to as “Dolev-Yao” attacker

  11. Execution model
  • Initial configuration: the protocol is a finite set of roles; a set of principals and keys; an assignment of one role to each principal
  • Run (produces a symbolic trace). Example: A does new x, then send {x}B; C does new z, then send {z}B; B does receive {x}B, then receive {z}B

  12. Protocol Logic
  • Action formulas: a ::= Send(P,m) | Receive(P,m) | New(P,t) | Encrypt(P,t) | Decrypt(P,t) | Sign(P,t) | Verify(P,t)
  • Formulas: φ ::= a | a < a | Has(P,t) | Fresh(P,t) | Honest(N) | Contains(t, t) | φ ∧ φ | ¬φ | ∃x. φ | Start(P)
  • Modal formulas: ψ ::= φ [a; …; a]P φ

  13. Reasoning about knowledge
  • Pairing: Has(X, {m,n}) ⊃ Has(X, m) ∧ Has(X, n)
  • Encryption: Has(X, encK(m)) ∧ Has(X, K-1) ⊃ Has(X, m)
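The attacker operations of slide 10 and the Has axioms above amount to a closure computation over the set of observed terms. The sketch below is one possible Python rendering of that symbolic derivability check; the tuple encoding of terms and the function names are assumptions of this sketch, not the official definition of Has.

```python
# Symbolic ("Dolev-Yao") derivability: X has a term if it can be produced
# from observed messages by pairing, projection, encryption with a known key
# and decryption with a known key (slides 10 and 13).

def closure(observed, inverse_key):
    """Smallest set containing `observed` and closed under projection and
    decryption with known inverse keys; `inverse_key` maps K to K^{-1}."""
    known = set(observed)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            if isinstance(t, tuple) and t[0] == "pair":
                new = {t[1], t[2]} - known            # projections
            elif (isinstance(t, tuple) and t[0] == "enc"
                  and inverse_key.get(t[1]) in known):
                new = {t[2]} - known                  # decryption with known key
            else:
                new = set()
            if new:
                known |= new
                changed = True
    return known

def has(observed, inverse_key, t):
    """Has(X, t): t is in the closure, or can be rebuilt by pairing /
    encryption with a known key from derivable parts."""
    k = closure(observed, inverse_key)
    def derivable(u):
        if u in k:
            return True
        if isinstance(u, tuple) and u[0] == "pair":
            return derivable(u[1]) and derivable(u[2])
        if isinstance(u, tuple) and u[0] == "enc":
            return u[1] in k and derivable(u[2])      # encryption with known key
        return False
    return derivable(t)

# Example mirroring slide 13: Has(X, encK(m)) and Has(X, K^{-1}) give Has(X, m).
obs = {("enc", "K", "m"), "K_inv"}
assert has(obs, {"K": "K_inv"}, "m")
assert not has(obs, {"K": "K_inv"}, "n")
```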

  14. Semantics
  • Protocol Q defines a set of roles (e.g., initiator, responder)
  • A run R of Q is a sequence of actions by principals following roles, plus the attacker
  • Satisfaction: Q, R |= φ
  • Q, R |= Send(A, m): in R, thread A executed send(m)
  • Q, R |= θ [ actions ]P φ: if some role of P in R does exactly actions starting from a state where θ is true, then φ is true in the state after the actions complete
  • Q |= θ [ actions ]P φ: Q, R |= θ [ actions ]P φ for all runs R of Q
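A minimal sketch of the satisfaction relation for the simplest cases, assuming a run is encoded as an ordered list of (thread, action, argument) events; that encoding, and the function names, are assumptions of this sketch.

```python
# A sketch of Q, R |= Send(A, m) and of the ordering a1 < a2 from slide 14.
# A run R is modelled as an ordered list of (thread, action, argument) events;
# this event encoding is an assumption of the sketch, not PCL's definition.

def sat_send(run, A, m):
    """Q, R |= Send(A, m): thread A executed send(m) somewhere in R."""
    return (A, "send", m) in run

def sat_before(run, e1, e2):
    """a1 < a2: both events occur in R and e1 occurs strictly earlier."""
    return e1 in run and e2 in run and run.index(e1) < run.index(e2)

R = [("A", "new", "m"), ("A", "send", ("m", "A")), ("B", "receive", ("m", "A"))]
assert sat_send(R, "A", ("m", "A"))
assert sat_before(R, ("A", "new", "m"), ("B", "receive", ("m", "A")))
```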

  15. Proof System • Goal: formally prove security properties • Axioms • Simple formulas provable by hand • Inference rules • Proof steps • Theorem • Formula obtained from axioms by application of inference rules

  16. Soundness
  • Soundness Theorem: if Q ⊢ φ then Q |= φ
  • If φ is a theorem then φ is a valid formula
  • φ holds at any step in any run of protocol Q
  • Unbounded number of participants
  • Dolev-Yao intruder

  17. Computational PCL • Symbolic proofs about complexity-theoretic model of cryptographic protocols!

  18. Towards Computational PCL
  • Revisit the protocol language and execution model, and the protocol logic
  • New semantics for formulas: interpret formulas as operators on probability distributions on traces
  • The meaning of φ on a set of traces T is a subset T' in which φ holds; the probability that φ holds is |T'| / |T|

  19. Main result
  • Computational PCL: a symbolic logic for proving security properties of network protocols using public-key encryption
  • Soundness Theorem: if a property is provable in CPCL, then the property holds in the computational model with overwhelming asymptotic probability
  • Benefits: symbolic proofs about the computational model; computational reasoning in the soundness proof (only!); different axioms rely on different crypto assumptions

  20. Protocol execution model • Initialization • Set roles, identify honest principals, provide encryption keys (polynomially-bounded) and random coins • Execution phase • Adversary has complete control of the network, acts according to the standard cryptographic model • Running time is polynomially-bounded • Computational trace <e, λ> • e is the symbolic description of the trace • λ maps terms in e to bitstrings

  21. PCL → Computational PCL
  • Syntax and proof system mostly the same
  • Significant difference: the notion of "knowledge"
  • Symbolic "knowledge": Has(X,t): X can produce t from messages that have been observed, by a symbolic algorithm
  • Computational "knowledge": Possess(X,t): X can produce t by a ppt algorithm; Indistinguishable(X,t): X cannot distinguish t from random in ppt
  • More subtle system: some axioms rely on CCA2, some are information-theoretically true, etc.

  22. Example: Simple secrecy
  Init(A, X) = [ new m; y := enc(m, X); send (A, X, y); ]
  Resp(B, X) = [ receive z; match z / <X, B, z'>; z'' := dec(z', B); ]
  When A completes the protocol with B, m will be a shared secret:
  Start(A) [Init]A Honest(A) ∧ Honest(B) ∧ (Z ≠ A, B) ⊃ Indist(Z, m)

  23. Example: axiomatic proof
  • ⊤ [new m]X (Z ≠ X) ⊃ Indist(Z, m)
  • At the point m is generated by X, it is indistinguishable from random by everyone else
  • Source(Y, u, {m}X) ∧ Decr(X, {m}X) ∧ Honest(X, Y) ∧ (Z ≠ X) ⊃ Indist(Z, u)
  • u remains indistinguishable since it only ever appears encrypted under X's key

  24. Central axioms • Cryptographic security property of encryption scheme • Ciphertext indistinguishability • Cryptographic security property of signature scheme • Unforgeability (used for authentication)

  25. Sample axioms
  • ⊤ [new x]X (Y ≠ X) ⊃ Indist(Y, x)
  • Source(Y, u, {m}X) ∧ Decr(X, {m}X) ∧ Honest(X, Y) ∧ (Z ≠ X, Y) ⊃ Indist(Z, u)
  • Verify(X, sigY{m}) ∧ Honest(X, Y) ⊃ ∃Y. Sign(Y, m)

  26. IND-CCA2-Secure Encryption
  • Messages are bit-strings; the attacker is a PPT Turing machine
  • Game between a Challenger and an Attacker: the attacker may ask for encryptions {m}K of chosen messages m and for decryptions of chosen ciphertexts C; it submits m0, m1 and receives the challenge ciphertext {mb}K for a hidden bit b; after further queries it outputs a guess d; the attacker wins if d = b
  • IND-CCA2 security: ∀ PPT attackers A ∃ negligible function f ∃ n0 ∀ security parameters n ≥ n0: Prob[d = b | A plays by the rules] ≤ 1/2 + f(n)
  • Intuition: encryption reveals no information about the message
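The game can be written out as a short program. The sketch below uses a deliberately insecure placeholder "encryption" (XOR with the challenger's key) purely so that it runs with no dependencies; only the shape of the game (oracle access, the m0/m1 challenge, the guess d, and the winning condition d = b) reflects the slide.

```python
# Sketch of the IND-CCA2 game on slide 26 with a placeholder scheme.
import secrets

class Challenger:
    """IND-CCA2 challenger for a placeholder scheme (XOR with a secret key).
    The scheme is NOT secure; only the game structure matters here."""
    def __init__(self, nbytes=16):
        self.key = secrets.token_bytes(nbytes)
        self.challenge = None
        self.b = None

    def _xor(self, data):
        return bytes(x ^ k for x, k in zip(data, self.key))

    def encrypt(self, m):                        # encryption of chosen messages
        return self._xor(m)

    def decrypt(self, c):                        # decryption oracle
        if c == self.challenge:
            raise ValueError("decrypting the challenge ciphertext is not allowed")
        return self._xor(c)

    def challenge_for(self, m0, m1):             # returns {m_b}_K for a hidden bit b
        self.b = secrets.randbelow(2)
        self.challenge = self._xor([m0, m1][self.b])
        return self.challenge

def play(attacker, nbytes=16):
    """Run the game; the attacker wins if its guess d equals the hidden bit b."""
    ch = Challenger(nbytes)
    d = attacker(ch, nbytes)
    return d == ch.b

def guessing_attacker(ch, nbytes):
    ch.challenge_for(bytes(nbytes), b"\xff" * nbytes)    # submit m0, m1
    return secrets.randbelow(2)                          # guess d at random

# Against an IND-CCA2-secure scheme, every PPT attacker wins with
# probability at most 1/2 + f(n) for some negligible f.
print(play(guessing_attacker))
```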

  27. CMA-Secure Signatures
  • Game between a Challenger and an Attacker: the attacker submits messages mi and receives signatures Sig(Y, mi); it then outputs a message m together with a signature Sig(Y, m)
  • The attacker wins if the signature verifies and m ≠ mi for every queried mi
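For comparison, here is a sketch of the unforgeability game with a real signature scheme, assuming the third-party Python `cryptography` package and its Ed25519 implementation; the Signer and attacker_wins wrappers are illustrative, and only the generate/sign/verify calls are actual library API.

```python
# Sketch of the CMA (existential unforgeability) game on slide 27.
# Assumes the third-party `cryptography` package for Ed25519 signatures.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

class Signer:
    """Challenger: holds the signing key and answers signing queries."""
    def __init__(self):
        self._key = Ed25519PrivateKey.generate()
        self.public_key = self._key.public_key()
        self.queried = set()

    def sign(self, m: bytes) -> bytes:            # signing oracle Sig(Y, m_i)
        self.queried.add(m)
        return self._key.sign(m)

def attacker_wins(signer, m: bytes, sig: bytes) -> bool:
    """The attacker wins if the signature verifies and m was never queried."""
    if m in signer.queried:
        return False
    try:
        signer.public_key.verify(sig, m)
        return True
    except InvalidSignature:
        return False

# Replaying an answered query does not count as a forgery:
s = Signer()
t = s.sign(b"hello")
assert not attacker_wins(s, b"hello", t)
```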

  28. Computational Semantics
  • [[φ]](T, D, ε): T is a set of traces, D is a distinguisher, ε is a tolerance
  • Inductive definition
  • [[a(.,.)]](T, D, ε): for basic actions a, simply the collection of traces in which action a(.,.) occurs
  • [[Indist(X,u)]](T, D, ε): uses T, D, ε; it is T if D cannot distinguish u from random, and the empty set otherwise; not a trace property

  29. Inductive Semantics
  • [[φ1 ∧ φ2]](T, D, ε) = [[φ1]](T, D, ε) ∩ [[φ2]](T, D, ε)
  • [[φ1 ∨ φ2]](T, D, ε) = [[φ1]](T, D, ε) ∪ [[φ2]](T, D, ε)
  • [[¬φ]](T, D, ε) = T − [[φ]](T, D, ε)
  • Implication uses conditional probability: [[φ1 ⊃ φ2]](T, D, ε) = [[¬φ1]](T, D, ε) ∪ [[φ2]](T', D, ε) where T' = [[φ1]](T, D, ε)
  • A formula defines a transformation on probability distributions over traces
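These clauses are plain set operations, which the sketch below spells out in Python; the distinguisher D and tolerance ε are abstracted away except for the final probability check, and that simplification is an assumption made to keep the example self-contained.

```python
# Set-based semantics from slides 28-29: the meaning of a formula is a
# subset of the trace set T; the probability that the formula holds is
# |[[phi]](T)| / |T|.  D and epsilon are abstracted away in this sketch.

def sem_action(T, occurs):            # [[a(.,.)]]: traces where action a occurs
    return {t for t in T if occurs(t)}

def sem_and(S1, S2):                  # [[phi1 ^ phi2]] = intersection
    return S1 & S2

def sem_or(S1, S2):                   # [[phi1 v phi2]] = union
    return S1 | S2

def sem_not(T, S):                    # [[~phi]] = T - [[phi]]
    return T - S

def sem_implies(T, sem1, sem2):
    """[[phi1 => phi2]](T) = [[~phi1]](T) u [[phi2]](T') with T' = [[phi1]](T):
    implication is evaluated as a conditional probability."""
    T1 = sem1(T)
    return (T - T1) | sem2(T1)

def holds(T, S, eps):
    """phi holds on T (up to tolerance eps) if |S| / |T| >= 1 - eps."""
    return len(S) / len(T) >= 1 - eps

# Tiny example: traces are strings; phi1 = "trace contains s", phi2 = "contains r".
T = {"sr", "s", "r", ""}
S = sem_implies(T, lambda T: sem_action(T, lambda t: "s" in t),
                   lambda T: sem_action(T, lambda t: "r" in t))
assert S == {"sr", "r", ""} and not holds(T, S, 0.1)
```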

  30. Complexity-theoretic semantics
  • Q |= φ if ∀ adversary A ∀ distinguisher D ∃ negligible function f ∃ n0 ∀ n > n0: |[[φ]](T, D, f(n))| / |T| > 1 − f(n) (the fraction represents a probability)
  • Fix protocol Q and a PPT adversary A; choose a value of the security parameter n; vary the random bits used by all programs; obtain the set T = T(Q, A, n) of equi-probable traces

  31. Soundness of Proof System • Prove soundness of individual axioms • Show all rules preserve validity

  32. Soundness of proof system
  • Example axiom: Source(Y, u, {m}X) ∧ Decrypts(X, {m}X) ∧ Honest(X, Y) ∧ (Z ≠ X, Y) ⊃ Indistinguishable(Z, u)
  • Proof idea: crypto-style reduction
  • Assume the axiom is not valid: ∃ A ∃ D ∀ negligible f ∀ n0 ∃ n > n0 s.t. |[[φ]](T, D, f)| / |T| < 1 − f(n)
  • Construct an attacker A' that uses A and D to break the IND-CCA2-secure encryption scheme

  33. Correspondence theorems • If “secure” in abstract model, then “secure” in computational model. • Strong assumptions about cryptographic functions • No abstractions for Diffie-Hellman exponentiation • Negative results: symmetric encryption, hash functions, xor • CPCL can potentially sidestep these problems by appropriate axiomatizations of properties

  34. Logic and Cryptography: Big Picture Protocol security proofs using proof system Axiom in proof system Semantics and soundness theorem Complexity-theoretic crypto definitions (e.g., IND-CCA2 secure encryption) Crypto constructions satisfying definitions (e.g., Cramer-Shoup encryption scheme)
