
Cryptography



  1. Cryptography Lecture 2 Stefan Dziembowski www.dziembowski.net stefan@dziembowski.net

  2. Plan • Information-theoretic cryptography • Introduction to cryptography based on computational assumptions • Provable security • Pseudorandom generators

  3. The scenario from the previous lecture Alice, Bob, and the eavesdropper Eve. Shannon’s theorem: perfect secrecy is possible only if the key is as long as the plaintext. In real life this is completely impractical.

  4. What to do? Idea: limit the power of the adversary. How? Classical (computationally-secure) cryptography: bound his computational power. Alternative options exist (but are not very practical).

  5. Quantum cryptography Stephen Wiesner (1970s), Charles H. Bennett and Gilles Brassard (1984). Alice and Bob are connected by a quantum link. Quantum indeterminacy: quantum states cannot be measured without disturbing the original state. Hence Eve cannot read the bits in an unnoticeable way.

  6. Quantum cryptography Warning: quantum cryptography should not be confused with quantum computing. Advantage: security is based on the laws of quantum physics. Disadvantage: needs dedicated equipment. Practicality? Currently: successful transmissions over distances of around 150 km. Commercial products are available.

  7. Does it help? No... (Shannon’s theorem of course also holds in this case.) A satellite scenario A third party (a satellite) is broadcasting random bits. 000110100111010010011010111001110111 111010011101010101010010010100111100 001001111111100010101001000101010010 001010010100101011010101001010010101 Alice Bob Eve

  8. Ueli Maurer (1993): noisy channel. Some bits get flipped (because of the noise). Assumption: the data that the adversary receives is noisy. (The data that Alice and Bob receive may be even more noisy.)

  9. Bounded-Storage Model Another idea: bound the size of adversary’s memory 000110100111010010011010111001110111 111010011101010101010010010100111100 001001111111100010101001000101010010 001010010100101011010101001010010101 too large to fit in Eve’s memory

  10. Real (computationally-secure) cryptography starts here: Eve is computationally-bounded. But what does it mean? Ideas: • She can use at most 1000 Intel Core 2 Extreme X6800 Dual Core Processors for at most 100 years... • She can buy equipment worth 1 million euro and use it for 30 years... It’s hard to reason formally about it.

  11. A better idea ”The adversary has access to a Turing Machine that can make at most 10^30 steps.” More generally, we could have definitions of the type: “a system X is (t,ε)-secure if every Turing Machine that operates in time t can break it with probability at most ε.” This would be quite precise, but... we would need to specify exactly what we mean by a “Turing Machine”: • how many tapes does it have? • how does it access these tapes (maybe a “random access memory” is a more realistic model...)? • ... Moreover, this approach often leads to ugly formulas...

  12. “(t,ε)-security” What to do? Idea: • t steps of a Turing Machine = “efficient computation” • ε – a value “very close to zero”. How to formalize it? Use the asymptotics!

  13. Efficiently computable? “efficiently computable” = “polynomial-time computable on a Turing Machine”, that is: running in time O(n^c) (for some constant c). Here we assume that Turing Machines are the right model for real-life computation. Not true if a quantum computer is built...

  14. Very small? “very small” = “negligible” = approaches 0 faster than the inverse of any polynomial. Formally: a function ε is negligible if for every constant c there exists N such that ε(n) ≤ n^-c for all n ≥ N.

  15. Negligible or not? (The slide classifies example functions as negligible or not; the example functions themselves are not in the transcript.)
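The definition above can be checked numerically for candidate functions: multiply ε(n) by a large power of n and see whether the result is still tiny. This is only a rough sanity check, not a proof, and the candidate functions and threshold below are illustrative choices, not taken from the slides.

```python
import math

def is_plausibly_negligible(eps, c=10, n=10_000):
    # Heuristic check of the definition at a single large n:
    # if eps is negligible, n**c * eps(n) should still be tiny.
    return n**c * eps(n) < 1e-6

candidates = {
    "2^-n":       lambda n: 2.0**-n,                 # negligible
    "n^-log2(n)": lambda n: n**(-math.log2(n)),      # negligible (superpolynomial decay)
    "1/n^3":      lambda n: n**-3.0,                 # NOT negligible (inverse polynomial)
}

for name, eps in candidates.items():
    print(name, is_plausibly_negligible(eps))
```

A single sample point can of course mislead; the real definition quantifies over all constants c and all sufficiently large n.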

  16. Security parameter The terms “negligible” and “polynomial” make sense only if X (and the adversary) take an additional input n, called a security parameter. In other words: we consider an infinite sequence X(1), X(2), ... of schemes. Typically, we will say that a scheme X is secure if: for every polynomial-time Turing Machine M, P(M breaks the scheme X) is negligible.

  17. Example Consider the authentication scheme from last week:

  18. Nice properties of these notions • A sum of two polynomials is a polynomial: poly + poly = poly • A product of two polynomials is a polynomial: poly * poly = poly • A sum of two negligible functions is a negligible function: negl + negl = negl Moreover: • A negligible function multiplied by a polynomial is negligible negl * poly = negl
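The last property (negl * poly = negl) can be illustrated numerically; the particular ε, p, and inverse-polynomial bound below are illustrative choices, not from the slides.

```python
def eps(n):
    return 2.0**-n     # a negligible function

def p(n):
    return n**5        # a polynomial

# At n = 200, the product eps(n) * p(n) is still far below an
# (arbitrarily chosen) inverse polynomial n^-10, as the closure
# property negl * poly = negl predicts.
n = 200
product = eps(n) * p(n)
bound = n**-10
print(product < bound)
```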

  19. A new definition of an encryption scheme

  20. Is this the right approach? Advantages • All types of Turing Machines are “equivalent” up to a “polynomial reduction”. Therefore we do not need to specify the details of the model. • The formulas get much simpler. Disadvantage Asymptotic results don’t tell us anything about the security of concrete systems. However, usually one can prove an asymptotic result formally and then argue informally that “the constants are reasonable” (and they can be calculated if one really wants).

  21. Provable security We want to construct schemes that are provably secure. But... • why do we want to do it? • how to define it? • and is it possible to achieve it?

  22. Provable security – the motivation In many areas of computer science formal proofs are not essential. For example, instead of proving that an algorithm is efficient, we can just simulate it on a “typical input”. In cryptography it’s not true, because there cannot exist an experimental proof that a scheme is secure. Why? Because a notion of a “typical adversary” does not make sense.

  23. How did we define the perfect secrecy? Experiment (m – a message): • the key k is chosen randomly • the message m is encrypted using k: c := Enc_k(m) • c is given to the adversary. Idea 1: The adversary should not be able to compute k. Idea 2: The adversary should not be able to compute m. Idea 3: The adversary should not be able to compute any information about m. Idea 4: The adversary should not be able to compute any additional information about m. (This last idea makes more sense.)
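The experiment can be run concretely. The one-time pad from the previous lecture is used here only as an illustrative instantiation of Enc.

```python
import secrets

def enc(k: bytes, m: bytes) -> bytes:
    # One-time pad: c = m XOR k, with len(k) == len(m).
    return bytes(a ^ b for a, b in zip(m, k))

m = b"attack at dawn"
k = secrets.token_bytes(len(m))   # step 1: the key k is chosen randomly
c = enc(k, m)                     # step 2: m is encrypted using k
# step 3: c is given to the adversary; for the one-time pad, c is a
# uniformly random string and by itself reveals nothing about m.
print(len(c) == len(m))
```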

  24. Idea The adversary should not be able to compute any additional information about m.

  25. Towards the definition of computational secrecy... (In each equation the adversary chooses the messages and receives the ciphertext c.) • P(C = c) = P(C = c | M = m) • P(C = c | M = m0) = P(C = c | M = m1) • P(Enc(K,M) = c | M = m0) = P(Enc(K,M) = c | M = m1) • P(Enc(K,m0) = c | M = m0) = P(Enc(K,m1) = c | M = m1) • P(Enc(K,m0) = c) = P(Enc(K,m1) = c)

  26. Indistinguishability P(Enc(K,m0) = c) = P(Enc(K,m1) = c) In other words: the distributions of Enc(K,m0) and Enc(K,m1) are identical. IDEA: change it to: the distributions are indistinguishable by a polynomial-time adversary.

  27. A game (Gen,Enc,Dec) – an encryption scheme; security parameter 1^n. The adversary (a polynomial-time Turing machine) chooses m0, m1 such that |m0| = |m1| and sends them to the oracle. The oracle: • selects k := Gen(1^n) • chooses a random bit b = 0,1 • calculates c := Enc(k,mb) and sends c back. The adversary has to guess b. Security definition: We say that (Gen,Enc,Dec) has indistinguishable encryptions if every polynomial-time adversary guesses b correctly with probability at most 0.5 + ε(n), where ε is negligible. Alternative name: semantically secure (sometimes we will say: “is computationally secure”, if the context is clear).
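The game can be sketched in code. This is a minimal sketch: the one-time pad stands in (as an assumption) for a generic scheme (Gen, Enc, Dec), and the adversary class is a hypothetical stand-in for an arbitrary polynomial-time adversary.

```python
import secrets

def gen(n: int) -> bytes:
    return secrets.token_bytes(n)              # Gen(1^n): uniform n-byte key

def enc(k: bytes, m: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(m, k))  # one-time pad as stand-in Enc

def play_game(adversary, n: int) -> bool:
    """One round of the game; True iff the adversary guesses b correctly."""
    m0, m1 = adversary.choose_messages(n)      # adversary picks |m0| == |m1|
    assert len(m0) == len(m1) == n
    k = gen(n)                                 # oracle: k := Gen(1^n)
    b = secrets.randbits(1)                    # oracle: random bit b
    c = enc(k, m1 if b else m0)                # oracle: c := Enc(k, m_b)
    return adversary.guess(c) == b

class CoinFlipAdversary:
    # Against the one-time pad no adversary beats 0.5, so this one
    # simply flips a coin.
    def choose_messages(self, n):
        return b"\x00" * n, b"\xff" * n
    def guess(self, c):
        return secrets.randbits(1)

wins = sum(play_game(CoinFlipAdversary(), 16) for _ in range(1000))
print(wins / 1000)   # close to 0.5
```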

  28. Testing the definition • Suppose the adversary can compute k from some Enc(k,m). Can he win the game? YES! • Suppose the adversary can compute some bit of m from Enc(k,m). Can he win the game? YES!

  29. Is it possible to prove security? (Gen,Enc,Dec) – an encryption scheme. For simplicity suppose that: • for a security parameter n the key is of length n • Enc is deterministic. Consider the following language: (the slide’s definition of L is not in the transcript). Q: What if L is polynomial-time decidable? A: Then the scheme is broken (exercise). Is it really true? On the other hand: L is in NP (k is the NP-witness). So, if P = NP, then any semantically-secure encryption is broken.

  30. “If P = NP, then semantically-secure encryption is broken” Is it 100% true? Not really... This is because even if P = NP, we do not know what the constants are. Maybe P = NP in a very “inefficient way”...

  31. In any case, to prove the security of a cryptographic scheme we would need to show a lower bound on the computational complexity of some problem. In the “asymptotic setting” that would mean at least showing that P ≠ NP. Does the implication hold in the other direction? (That is: does P ≠ NP imply anything for cryptography?) No! (At least as far as we know.) Intuitively: because NP is a notion of “worst-case complexity”, whereas cryptography concerns “average-case complexity”. Therefore proving that an encryption scheme is secure is probably much harder than proving that P ≠ NP.

  32. What can we prove? We can prove conditional results. That is, we can show theorems of the type: • Suppose that some “computational assumption A” holds; then scheme X is secure. • Suppose that some scheme Y is secure; then scheme X is secure.

  33. Research program in cryptography Base the security of cryptographic schemes on a small number of well-specified “computational assumptions”. In the statement “if some computational assumption A holds, then scheme X is secure”, the assumption A is what we have to “believe” in; the rest is provable. Examples of A: the “decisional Diffie-Hellman assumption”, the “strong RSA assumption”.

  34. Example We are now going to show an example of such reasoning: suppose that G is a “cryptographic pseudorandom generator”; then we can construct a secure encryption scheme.

  35. Pseudorandom generators A pseudorandom generator G takes a short seed s and outputs a longer string G(s).

  36. If we use a “normal PRG”, this idea doesn’t work (exercise). It works only with cryptographic PRGs.
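The scheme being built here encrypts by XORing the message with the stretched output of the generator, i.e. Enc(k,m) = G(k) XOR m (this is the scheme the reduction later in the proof uses, where c := x xor mb). As a purely illustrative stand-in for a cryptographic PRG G, the sketch below uses SHAKE-128, an extendable-output hash function; treating it as a PRG here is an assumption for demonstration, not a claim from the lecture.

```python
import hashlib

def G(seed: bytes, out_len: int) -> bytes:
    # Stand-in PRG: stretch a short seed into out_len pseudorandom bytes.
    return hashlib.shake_128(seed).digest(out_len)

def enc(k: bytes, m: bytes) -> bytes:
    # Enc(k, m) = G(k) XOR m: a short key encrypts a longer message.
    pad = G(k, len(m))
    return bytes(a ^ b for a, b in zip(m, pad))

dec = enc  # XOR with the same pad is its own inverse

k = b"short key 16b!!!"   # 16-byte key
m = b"a message much longer than the key, encrypted with the stretched pad"
c = enc(k, m)
print(dec(k, c) == m)
```

Unlike the one-time pad, the key here is much shorter than the message; the price is that security now rests on the computational assumption that G is a cryptographic PRG.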

  37. “Looks random” What does it mean? Non-cryptographic applications: should pass some statistical tests. Cryptography: should pass all polynomial-time tests.

  38. Cryptographic PRG A polynomial-time distinguisher D receives either a random string R or G(S) (where S is random) and outputs: • 0 if he thinks it’s R • 1 if he thinks it’s G(S). He should not be able to distinguish the two cases...

  39. Constructions There exist constructions of cryptographic pseudorandom generators that are conjectured to be secure. Some of them are extremely efficient and widely used in practice. They are called “stream ciphers” (we will discuss them later).

  40. Theorem If G is a cryptographic PRG, then the encryption scheme constructed before is semantically secure (i.e. it has indistinguishable encryptions). In short: cryptographic PRGs ⇒ computationally-secure encryption. Proof (sketch) Suppose that it is not secure. Then there exists an adversary that wins the “guessing game” with probability 0.5 + δ(n), where δ(n) is not negligible.

  41. The distinguisher simulates the game: the adversary chooses m0, m1; the distinguisher • picks a random bit b = 0,1 • sets c := x xor mb • sends c to the adversary, who has to guess b. If the adversary guessed b correctly, output 1: “x is pseudorandom”. Otherwise output 0: “x is random”.

  42. • If x is a random string R: the adversary guesses b correctly with probability 0.5, so the distinguisher outputs 1 with probability 0.5 and 0 with probability 0.5. • If x = G(S): the adversary guesses b correctly with probability 0.5 + δ(n), so the distinguisher outputs 1 with probability 0.5 + δ(n) and 0 with probability 0.5 - δ(n). Hence the distinguisher distinguishes G(S) from R with advantage δ(n), which is not negligible. QED
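The reduction sketched above can be written out directly. The adversary interface and the coin-flipping adversary below are hypothetical stand-ins; any real adversary against the scheme would plug into the same slot.

```python
import secrets

def distinguisher(x: bytes, adversary) -> int:
    """PRG distinguisher built from an adversary against the scheme.

    x is either a truly random string R or G(S) for a random seed S.
    We simulate the guessing game with c := x XOR m_b and output 1
    ("x is pseudorandom") iff the adversary guesses b correctly.
    """
    m0, m1 = adversary.choose_messages(len(x))
    b = secrets.randbits(1)
    c = bytes(p ^ q for p, q in zip(x, m1 if b else m0))
    return 1 if adversary.guess(c) == b else 0

class CoinFlipAdversary:
    # A trivial adversary: its win rate is exactly 0.5, so the
    # distinguisher built from it has zero advantage.
    def choose_messages(self, n):
        return b"\x00" * n, b"\x01" * n
    def guess(self, c):
        return secrets.randbits(1)

out = distinguisher(secrets.token_bytes(8), CoinFlipAdversary())
print(out)
```

If instead the adversary wins with probability 0.5 + δ(n), then P[D(G(S)) = 1] - P[D(R) = 1] = δ(n), contradicting the pseudorandomness of G whenever δ is not negligible.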

  43. Moral To construct secure encryption, it suffices to construct a secure PRG: cryptographic PRGs ⇒ semantically-secure encryption.

  44. Outlook Cryptography is often divided into two branches: • “information-theoretic” (also called “unconditional”): the one-time pad, quantum cryptography, “the satellite scenario” • “computationally-secure”: based on 2 assumptions: some problems are computationally difficult, and our understanding of what “computational difficulty” means is correct.
