
Provable Security – An Introduction



Presentation Transcript


  1. Provable Security – An Introduction Kenny Paterson kenny.paterson@rhul.ac.uk Information Security Group Royal Holloway, University of London Dagstuhl

  2. Overview • Some history • Digital signatures as a case study • Scope of application for provable security • Strengths and weaknesses • Concluding remarks

  3. The Ad Hoc Approach • The prevailing approach until quite recently: • Prof. A proposes a protocol (e.g. for key agreement). • He shows that some obvious attacks are ruled out. • Problem of “friendly cryptanalysis”. • After a while, PhD student B comes along and finds a clever attack not anticipated by Prof. A. • PhD student B also proposes a fix to the scheme of Prof. A (and gets her PhD). • A bit later, Dr. C shows another attack that applies to the fixed scheme. • Repeat ad nauseam. • Good for publications and PhD theses, probably bad science.

  4. Shannon and the One-time Pad • Claude Shannon, “Communication Theory of Secrecy Systems”, Bell System Technical Journal, Vol. 28, Oct. 1949, pp. 656-715. • Put symmetric key encryption on a firm theoretical foundation. • Analysed the security of the one-time pad: • M = m_1 m_2 … m_t – the message bits to be encrypted • K = k_1 k_2 … k_t – a sequence of random key bits • C = c_1 c_2 … c_t – the ciphertext, where c_i = m_i + k_i mod 2
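For concreteness (a sketch added here, not part of the original slides), the one-time pad is a one-liner in code; Python's `secrets` module stands in for the uniformly random, never-reused key that Shannon's proof assumes.

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """One-time pad: each output bit is m_i + k_i mod 2, i.e. bitwise XOR."""
    assert len(key) == len(message), "key must be exactly as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # Decryption is the same operation, since XOR is self-inverse.
    return otp_encrypt(ciphertext, key)

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))   # uniformly random key, used once only
ct = otp_encrypt(msg, key)
assert otp_decrypt(ct, key) == msg
```

The key management problem mentioned below is already visible here: the key is as long as the message and may never be reused.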

  5. Shannon and the One-time Pad • Shannon proved that the security of the one-time pad encryption system is unconditional • Provided K is uniformly random and is never re-used. • Security irrespective of the computing power of the attacker. • Proof uses concepts of entropy and information introduced by Shannon in 1948. • So provable security is not a new subject. • But use of one-time pad creates a serious key management problem…

  6. Washington-Moscow Hotline “The USSR shall provide for preparation and delivery of keying tapes to the terminal point of the link in the United States for reception of messages from the USSR. The United States shall provide for the preparation and delivery of keying tapes to the terminal point of the link in the USSR for reception of messages from the United States. Delivery of prepared keying tapes to the terminal points of the link shall be effected through the Embassy of the USSR in Washington (for the terminal of the link in the USSR) and through the Embassy of the United States in Moscow (for the terminal of the link in the United States).” Extracted from “Memorandum of Understanding Between the United States of America and the Union of Soviet Socialist Republics Regarding the Establishment of a Direct Communications Link; June 20, 1963”.

  7. From Shannon to the 1990s • Cryptography began to develop into a subject of academic study in the mid-to-late 1970s. • Key events: • Development of the US national standard for encryption (DES), 1976. • (Public) discovery of public key cryptography, Diffie and Hellman, 1976. • RSA algorithm, Rivest, Shamir, Adleman, 1977. • Until the late 1980s, most analysis was conducted using the ad hoc approach. • With exceptions such as Shannon's work, as noted above. • In the early 1990s, provable security came to the fore.

  8. Security Models and Proofs Typical approach: • Define (generically) the functionality of the cryptographic scheme. • Encryption, signature, message authentication, authenticated key exchange,… • Define the capabilities of the adversary. • Define the goal of the adversary. • Propose a concrete scheme (realising the functionality). • Provide a proof that any adversary against the scheme can be used to produce an algorithm to break some computational problem. • Assume that the computational problem and its hardness have been well-studied.

  9. Digital Signatures • We will use digital signatures as a case study to illustrate this process. • Initial work for signatures by Goldwasser, Micali and Rivest, 1988. • Informally: • Signer produces signatures using a private key, verifier can check signatures using a matching public key. • Anyone can verify, but only the legitimate signer with the private key can sign. • No-one other than the legitimate signer should be able to produce signatures.

  10. Functionality of Digital Signatures • Functionality of a digital signature scheme is described by three algorithms: • Key Generation: • Given a security parameter k, produces a key-pair (SK,PK). • Sign: • Given a message M and the private key SK, produces a signature S. • Verify: • Given a message M, signature S, and public key PK, outputs 1 or 0 (corresponding to valid or invalid). • Consistency requirement: if S was generated using the Sign algorithm on M and SK, then Verify on input S, M and PK outputs 1 (and 0 otherwise).
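This interface can be sketched directly in code (the class and method names are purely illustrative, not a standard API); the consistency requirement becomes a one-line test.

```python
from typing import Protocol, Tuple

class SignatureScheme(Protocol):
    """Generic functionality of a digital signature scheme."""
    def keygen(self, k: int) -> Tuple[object, object]:
        """Given security parameter k, return a key pair (SK, PK)."""
    def sign(self, sk: object, message: bytes) -> object:
        """Given message M and private key SK, return a signature S."""
    def verify(self, pk: object, message: bytes, sig: object) -> bool:
        """Return True (valid) or False (invalid)."""

def check_consistency(scheme: SignatureScheme, k: int, message: bytes) -> bool:
    # Consistency: signatures produced by Sign must be accepted by Verify.
    sk, pk = scheme.keygen(k)
    return scheme.verify(pk, message, scheme.sign(sk, message))
```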

  11. RSA Signatures • Key Generation(k): • N = pq, PK = (N, e), SK = d with ed = 1 mod (p-1)(q-1) • Assume N has k bits. • Sign: • S = H(M)^d mod N • Here H is a collision-resistant hash function mapping messages of arbitrary length onto messages of some fixed length. • Precise definition of H to follow. • Verify: • Compare H(M) with S^e mod N
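A toy instantiation of this hash-then-RSA scheme (a sketch added here, not from the slides): the primes are absurdly small and SHA-256 reduced mod N only stands in for a genuine hash onto Z_N, so this shows the mechanics rather than a secure implementation.

```python
import hashlib

# Toy parameters only: real RSA moduli are 2048+ bits and generated at random.
p, q = 61, 53
N, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))        # ed = 1 mod (p-1)(q-1)

def H(message: bytes) -> int:
    # Stand-in for the hash function: SHA-256 reduced modulo N.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % N

def sign(message: bytes) -> int:
    return pow(H(message), d, N)         # S = H(M)^d mod N

def verify(message: bytes, sig: int) -> bool:
    return pow(sig, e, N) == H(message)  # compare S^e mod N with H(M)

assert verify(b"hello", sign(b"hello"))  # consistency holds by RSA correctness
```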

  12. The Role of the Security Parameter • Key generation, sign and verify modelled by probabilistic, polynomial-time algorithms (in the security parameter k). • Sign and Verify may be deterministic (e.g. RSA). • Can think of them as Turing machines equipped with random tapes. • k can informally be thought of as defining the size (in bits) of the various parameters of the scheme. • More formally, provides a means to measure properties of algorithms and adversaries in the framework of polynomial-time algorithms. • Concrete security as an alternative.

  13. Security of Digital Signatures • We can model the capabilities of an adversary in various ways: • Adversary is given the public key only. • Adversary is given the public key and signatures on various messages. • Adversary is given the public key and access to a signature oracle giving him signatures for messages of his choice. • Adversary as above, and choice of messages adaptive. • We can define the adversarial goal in various ways: • Adversary has to find the private key. • Adversary has to produce a signature on any message (universal forgery). • Adversary has to produce a signature on some message (existential forgery).

  14. Security of Digital Signatures • Most conservative approach: take the strongest capabilities in combination with the weakest goal. • For signatures: EUF-CMA, Existential UnForgeability against (adaptive) Chosen-Message Attacks. • Model attack as a game between adversary and challenger: • Challenger supplies public key PK to adversary. • Adversary is an arbitrary algorithm which receives PK, makes signing oracle queries on messages of its choice and finally outputs a message M and a string S. • Adversary wins the game if S is a valid signature on M and M was not the subject of any signing query during the game.
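The game can be written down almost verbatim as code (an illustrative sketch; the function signatures are not part of any standard library).

```python
def euf_cma_game(keygen, sign, verify, adversary) -> bool:
    """Returns True iff the adversary wins the EUF-CMA game.

    keygen() -> (sk, pk); sign(sk, m) -> s; verify(pk, m, s) -> bool;
    adversary(pk, sign_oracle) -> (m, s), its candidate forgery.
    """
    sk, pk = keygen()
    queried = set()                      # messages submitted to the signing oracle

    def sign_oracle(m):
        queried.add(m)                   # record the query
        return sign(sk, m)

    m, s = adversary(pk, sign_oracle)    # queries may be chosen adaptively
    # Win: a valid signature on a message never submitted to the oracle.
    return verify(pk, m, s) and m not in queried
```

The scheme is EUF-CMA secure if no efficient adversary makes this function return True with non-negligible probability.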

  15. Security of Digital Signatures [Figure: the EUF-CMA game. The Challenger runs KeyGen and sends PK to the Adversary; the Adversary makes sign queries M1, M2, …, Mt and receives signatures S1, S2, …, St from the Challenger's Sign algorithm; finally the Adversary outputs a candidate forgery (M, S).]

  16. Security of Digital Signatures • Adversary wins this game if S is a valid signature for M and if the adversary did not make a sign query on M. • We say that a signature scheme is secure if there is no polynomial-time adversary having a non-negligible success probability in this game. • Polynomial time as a function of the security parameter k. • Probability of success measured over the randomness used by the challenger. • Negligible means smaller than 1/p(k) for every polynomial p, for all sufficiently large k (where the threshold may depend on p).
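In symbols (the standard formulation, added here for clarity), a success probability ε(k) is negligible when

```latex
\forall \text{ polynomials } p \;\; \exists k_0 \;\; \forall k \ge k_0 : \quad \varepsilon(k) < \frac{1}{p(k)} .
```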

  17. Security of RSA Signatures • Sign: S = H(M)^d mod N • Verify: Compare H(M) with S^e mod N • We cannot (currently) prove security without making an additional assumption about the hardness of some computational problem. • For RSA, the appropriate problem is the RSA inversion problem: • Given N, e and a value X, find X^(1/e) mod N, i.e. a Y with Y^e = X mod N. • Believed, but not known, to be as hard as factoring.

  18. Proving Security of RSA Signatures • Main ideas: • Replace the challenger with a simulator which tries to solve the RSA inversion problem. • Simulator is given N, e and a value X. • Model the hash function by a random oracle. • A truly random function, with access to it mediated by the challenger. • Simulator now provides the values of the hash function to the adversary. • Adversary makes hash queries in addition to the usual signing queries. • Simulator must provide a simulation of the challenger that is indistinguishable for the adversary. • It can do so in such a way that the simulator can answer all signing queries AND use the adversary's forgery to solve the RSA inversion problem! • Simulate responses to hash queries with values satisfying H(M) = S^e mod N, except for one response, which is set to X.
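The shape of this simulation can be sketched in code. The sketch below is a simplification added here (not the original argument in full): it guesses in advance which hash query the forgery will use and aborts on a wrong guess, which is where such reductions typically lose a factor of roughly the number of hash queries.

```python
import secrets

class FDHSimulator:
    """Plays the challenger for an adversary against hash-then-RSA signatures,
    while trying to solve an RSA inversion instance (N, e, X)."""

    def __init__(self, N, e, X, guess_index):
        self.N, self.e, self.X = N, e, X
        self.guess_index = guess_index    # which hash query we bet the forgery will use
        self.table = {}                   # M -> (hash value, known signature or None)
        self.count = 0

    def hash_oracle(self, M):
        if M not in self.table:
            self.count += 1
            if self.count == self.guess_index:
                # Embed the challenge: H(M) = X, so a valid signature on M is X^(1/e) mod N.
                self.table[M] = (self.X % self.N, None)
            else:
                # Programme H(M) = s^e mod N for random s; then s is a valid signature on M.
                s = secrets.randbelow(self.N - 1) + 1
                self.table[M] = (pow(s, self.e, self.N), s)
        return self.table[M][0]

    def sign_oracle(self, M):
        self.hash_oracle(M)
        h, s = self.table[M]
        if s is None:
            raise RuntimeError("abort: the adversary asked to sign the guessed message")
        return s

    def extract_root(self, M, S):
        """If (M, S) is a forgery on the guessed message, then S^e = X mod N."""
        if M in self.table and self.table[M][1] is None and pow(S, self.e, self.N) == self.table[M][0]:
            return S                      # S = X^(1/e) mod N, solving the instance
        return None
```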

  19. Security of Digital Signatures [Figure: the reduction. The simulator is given an RSA inversion instance (N, e, X), sets PK = (N, e), and simulates both the signing and hashing oracles, answering hash queries (e.g. H(M2)) and sign queries on M1, M2, …, Mt with responses S1, …, St; the Adversary's forgery (M, S) is then used to output X^(1/e) mod N.]

  20. Security Reductions • There’s really no such thing as a security proof in cryptography, only security reductions: • “If an adversary can break this scheme, then we can construct an algorithm to solve some computational problem.” • Since the computational problem is assumed to be hard, we conclude that the scheme is secure. • Reduction concept borrowed from computational complexity theory. • Information-theoretic basis for security is possible but less common for schemes to be used in practice.

  21. Scope of Application • Provable security approach has been widely applied to basic cryptographic primitives: • Encryption, signature, message authentication, key exchange. • Symmetric and asymmetric. • Plus many variations: threshold, proxy, blind, designated verifier, deniable,… • Identity-based and usual public key settings, certificateless setting. • Also used to study more complex primitives: • Group signatures, broadcast encryption, auction protocols, e-cash systems, and even general multi-party secure computation.

  22. Strengths and Weaknesses • Security model and proof now almost de rigueur in academic cryptography papers. • Provable security having increasing influence on cryptographic standards and practice. • ISO, PKCS, ANSI • DAA in Trusted Computing, IKE • The provable security paradigm has both strengths and weaknesses…

  23. Strengths • Builds security from the bottom up. • Security for high-level primitives phrased in terms of hardness of some low-level mathematical problem. • Obtain a clear, well-defined and self-contained statement of the basis for security. • e.g. hardness of integer factorisation, hardness of discrete logarithm problem in a particular class of groups. • Hard problems can then be studied in isolation by experts (e.g. computational number theorists). • A form of layering or “separation of concerns”.

  24. Strengths • Modelling steps clarify what is (and is not) expected of cryptographic primitives. • Clarify functionality of primitive AND its security. • Reduces possibility of unanticipated attack (cf. heuristic analysis). • Quantities in proof can be used to drive selection of parameters in real-world applications. • Can relate time and success probability for solving hard problem to that of adversary in breaking scheme. • Requires detailed accounting of adversarial actions and simulator’s responses. • Often ignored in practice, with proof used as guide only.

  25. Strengths • Provable security approach allows a degree of modularity. • Composition results have been slow in coming, but universal composability offers an interesting way forward. • Modular approach of Bellare, Canetti, Krawczyk for key exchange. • Allows re-use of protocol components and easy construction of protocols with “semi-automatic” security proofs. • Allows study of relationships between different security notions for a given primitive, and between different primitives.

  26. Weaknesses • The proof of security may not be correct. • Related to “cultural” effects: • Main venue for publication is conferences. • Tight reviewing schedules, little time for referees to check details of proofs. • Proofs often placed in appendices to meet page limits, or relegated to the “full version”. • Standards of rigour arguably not as high as in pure mathematics.

  27. Example: RSA-OAEP • RSA-OAEP: • RSA = RSA! • OAEP = Optimal Asymmetric Encryption Padding • A method for transforming “raw” RSA encryption into a method offering suitably strong security guarantees (IND-CCA security) • Solving a long-standing open problem. • Proposed and proved secure by Bellare and Rogaway (1994). • Widely standardised (e.g. in SET).

  28. Example: RSA-OAEP m 0 r s = (m||0) + G(r) Padding t = r + H(s) x s t Encryption xe modulo N Dagstuhl

  29. Example: RSA-OAEP • Bellare and Rogaway (1994) proved that an adversary who can break RSA-OAEP (in a well-defined and strong sense) can solve the RSA-inversion problem. • Proof actually works for any trapdoor one-way function. • The proof was well-written, the construction simple and the result was rightly celebrated.

  30. Example: RSA-OAEP • But Shoup (2001) discovered a flaw in Bellare and Rogaway’s proof. • The proof was in the literature for seven years before the problem was spotted. • Fortunately, Shoup and Fujisaki et al. were able to repair the proof. • Simpler constructions with security proofs were subsequently discovered.

  31. Weaknesses • The reduction from the adversary to the computational problem may not be “tight”. • Time and success probability of algorithm to solve underlying hard problem may not be closely related to time and success probability of adversary. • Can only get meaningful security for scheme by increasing security parameter k, leading to much less efficient schemes. • Or ignore this and work with usual sizes of cryptographic parameters and use proof only as a heuristic guide?
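To illustrate what a loose reduction costs (a generic, representative shape added here, not a bound from any particular paper): suppose an adversary running in time t with success probability ε yields a problem-solver with

```latex
t' \approx t, \qquad \varepsilon' \approx \frac{\varepsilon}{q_H},
```

where q_H is the number of random-oracle queries, perhaps around 2^60. Then ruling out adversaries with ε = 2^-20 requires the underlying problem to be infeasible even at success level about 2^-80, which pushes the security parameter k well beyond the sizes used when the proof is treated only as a heuristic guide.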

  32. Example: Blum-Blum-Shub • Blum-Blum-Shub pseudo-random bit generator: • N = pq is an RSA modulus with p, q = 3 mod 4. • Initial seed x_0 • x_i = (x_{i-1})^2 mod N • Output the j least significant bits of x_i • The larger j is, the faster we can generate bits. • Security result: assuming factoring N is intractable, j = O(log log N) bits can be securely extracted per iteration. • Vazirani and Vazirani; • Alexi, Chor, Goldreich and Schnorr; • Fischlin and Schnorr; • Sidorenko and Schoenmakers.
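A toy version of the generator (a sketch added here; the primes are far too small for any real use) makes the iteration and the j-bit extraction explicit.

```python
def bbs_bits(p: int, q: int, seed: int, j: int, iterations: int) -> str:
    """Blum-Blum-Shub: output the j least significant bits of each x_i = x_{i-1}^2 mod N."""
    assert p % 4 == 3 and q % 4 == 3          # Blum primes
    N = p * q
    x = (seed * seed) % N                     # derive x_0 from the seed
    bits = []
    for _ in range(iterations):
        x = (x * x) % N                       # x_i = (x_{i-1})^2 mod N
        bits.append(format(x & ((1 << j) - 1), f"0{j}b"))
    return "".join(bits)

# Toy example with small Blum primes (both = 3 mod 4) and j = 3 output bits per squaring.
print(bbs_bits(p=1019, q=1063, seed=2025, j=3, iterations=10))
```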

  33. Example: Blum-Blum-Shub • IETF RFC 1750 (Eastlake et al.) states: “If you use no more than the log2 log2(x_i) low order bits, then predicting any additional bits from a sequence generated in this manner is provable [sic] as hard as factoring N.” • Is this statement justified by the security proof?

  34. Example: Blum-Blum-Shub • Analysis by Koblitz and Menezes: • Take the best bounds on security and hardness of factoring known in the literature. • Apply them for j = 9 and N with 768 bits, extracting M = 10^9 bits from the generator. • Allowing a success probability of 0.01 for the adversary, what is the time bound on the adversary? • Answer: 2^(-264) • Yes, that is a negative sign in the exponent! • Concrete security analysis does not always give us results that are useful in practice. • In this instance, we need N with > 10000 bits for a useful security guarantee.

  35. Weaknesses • The underlying computational problem might turn out to be easier than expected. • Significant advances in algorithms for integer factorisation and discrete logs are rare, but do happen. • The pairing-based cryptography zoo of hard problems: • BDHP, BDHE, q-BDHI, q-SDH,… • Decisional variants, gap variants, multi-input variants. • Dozens of new problems, all assumed to be hard. • But these problems have a much shorter track record than factoring/DLP.

  36. Further Weaknesses • The model itself may not be correct. • The “right” models for apparently simple primitives like encryption have taken a long time to emerge. • Good models for more complex primitives are hard to establish. • Proxy signatures, certificateless encryption, intrusion-resilient cryptography as examples. • How do we know when the model is finally right?

  37. Further Weaknesses • The model of security may not be comprehensive enough to take into account all practical attacks. • Side-channel attacks on SSL/TLS.

  38. Side-channel Analysis of SSL/TLS • SSL/TLS uses symmetric cryptography as the workhorse for bulk data protection. • The plaintext data is integrity-protected first, then encrypted. • cf. the Horton Principle. • Typically using the HMAC algorithm and a block cipher in CBC-mode. • This combination was claimed to be proven secure in an appropriate model by Krawczyk (Crypto 2001).

  39. Side-channel Analysis of SSL/TLS • Vaudenay (Eurocrypt 2002) introduced the notion of a padding oracle attack. • CBC mode operates on blocks of data. • Plaintext first needs to be padded with redundant data to make it fit into blocks. • A padding oracle tells an attacker whether or not a ciphertext was correctly padded. • Vaudenay showed that an attacker can leverage such an oracle to decrypt arbitrary ciphertexts. • Provided the oracle is available. • For certain padding schemes in CBC mode.
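To make the oracle concrete, here is a sketch (a simplification of a TLS-style CBC record check added here, not OpenSSL's actual code) in which bad padding and a bad MAC fail in distinguishable ways; any means of telling the two failures apart, even a few milliseconds of timing, gives the attacker Vaudenay's padding oracle.

```python
import hmac, hashlib

MAC_LEN = 32   # HMAC-SHA-256 tag length (illustrative choice)

class BadPadding(Exception): pass
class BadMac(Exception): pass

def check_record(decrypted: bytes, mac_key: bytes) -> bytes:
    """Checks a CBC-decrypted record of the form data || MAC || padding,
    with TLS-style padding: a length byte L preceded by L bytes equal to L."""
    pad_len = decrypted[-1]
    if pad_len + 1 > len(decrypted) - MAC_LEN or \
       any(b != pad_len for b in decrypted[-(pad_len + 1):-1]):
        raise BadPadding()                     # failure mode 1: reported first
    body = decrypted[:-(pad_len + 1)]
    data, tag = body[:-MAC_LEN], body[-MAC_LEN:]
    if not hmac.compare_digest(tag, hmac.new(mac_key, data, hashlib.sha256).digest()):
        raise BadMac()                         # failure mode 2: only checked after padding
    return data
```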

  40. Side-channel Analysis of SSL/TLS • Canvel et al. (Crypto 2003) showed that SSL/TLS as implemented in OpenSSL reveals a padding oracle. • Time difference in generation of error messages for failure of padding and failure of MAC (checked later than padding). • Error messages are in encrypted form and only differ in time by a few milliseconds. • Still enough of a cryptanalytic toe-hold to allow recovery of static authentication credentials in SSL/TLS-protected sessions.

  41. Side-channel Analysis of SSL/TLS • We have a security proof, so what went wrong? • An example where the model in which the proof holds is not sufficiently broad to capture all practical attacks. • Padding oracle not part of security model. • (Worse, Krawczyk’s proof does not actually apply to the combination of MAC and CBC-mode used in SSL/TLS! • Proof assumes PAD, then MAC, then CBC • Needed to get MAC to sit inside a single block. • SSL/TLS mandates MAC, PAD, then CBC).

  42. Further Weaknesses • A security proof is no guarantee of correct implementation. • A protocol with a proof may not compose well with further protocols to produce a secure system. • A security proof using Random Oracles may not give security when the random function is instantiated with a real hash function. • Pathological examples of this have been produced. • But schemes with proofs in the standard model tend to be less efficient. • A scheme with a security proof may be less efficient than an ad hoc design.

  43. Concluding Remarks • Provable security provides a means to make cryptography rigorous, replacing the ad hoc approach. • It’s far from perfect, but it’s the best formal approach for cryptography that we have at the moment. • Many of the weaknesses are not unique to the provable security approach. • The scope of security proofs is increasing, to cover side-channel attacks of various kinds and to encompass more complex primitives and systems.
