
Cryptography and Privacy Preserving Operations Lecture 1



Presentation Transcript


  1. Cryptography and Privacy Preserving Operations, Lecture 1. Lecturer: Moni Naor, Weizmann Institute of Science

  2. What is Cryptography? Traditionally: how to maintain secrecy in communication. Alice and Bob talk while Eve tries to listen. (figure: Alice and Bob at the two ends of a channel, Eve in the middle)

  3. History of Cryptography
  • Very ancient occupation
  • Many interesting books and sources, especially about the Enigma
  • David Kahn, The Codebreakers, 1967
  • Gaj and Orlowski, Facts and Myths of Enigma: Breaking Stereotypes, Eurocrypt 2003
  • Not the subject of this course

  4. Modern Times
  • Up to the mid 70’s - mostly classified military work
  • Exception: Shannon, Turing
  • Since then - explosive growth
  • Commercial applications
  • Scientific work: tight relationship with Computational Complexity Theory
  • Major works: Diffie-Hellman, Rivest, Shamir and Adleman (RSA)
  • Recently - more involved models for more diverse tasks: how to maintain secrecy, integrity and functionality in computer and communication systems

  5. Cryptography and Complexity
  Complexity Theory - study the resources needed to solve computational problems (computer time, memory); identify problems that are infeasible to compute.
  Cryptography - find ways to specify security requirements of systems; use the computational infeasibility of problems in order to obtain security.
  The development of these two areas is tightly connected! The interplay between them is the subject of the course.

  6. Key Idea of Cryptography
  Use the intractability of some problems for the advantage of constructing secure systems

  7. Short Course Outline
  First part of this short course:
  • Alice and Bob want to cooperate; Eve wants to interfere
  • One-way functions
  • Pseudo-random generators
  • Pseudo-random functions
  • Encryption
  Second part:
  • Alice and Bob do not quite trust each other
  • Zero-knowledge protocols
  • Secure function evaluation

  8. Three Basic Issues in Cryptography
  • Identification
  • Authentication
  • Encryption

  9. Example: Identification
  • When the time is right, Alice wants to send an ‘approve’ message to Bob
  • They want to prevent Eve from interfering
  • Bob should be sure that Alice indeed approves
  (figure: Alice, Bob and Eve on the channel)

  10. Rigorous Specification of Security
  To define security of a system one must specify:
  • What constitutes a failure of the system
  • The power of the adversary:
  • computational power
  • access to the system
  • what it means to break the system

  11. Specification of the Problem
  Alice and Bob communicate through a channel. Bob has two external states {N, Y}. Eve completely controls the channel.
  Requirements:
  • If Alice wants to approve and Eve does not interfere – Bob moves to state Y
  • If Alice does not approve, then for any behavior of Eve, Bob stays in N
  • If Alice wants to approve and Eve does interfere – no requirements on the external state

  12. Can we guarantee the requirements?
  • No – when Alice wants to approve she sends (and receives) a finite set of bits on the channel, and Eve can guess them
  • To the rescue - probability
  • We want Eve to succeed with only low probability
  • How low? Related to the length of the string that Alice sends…

  13. Example: Identification (figure: Alice sends a string X to Bob while Eve, who does not know X, watches the channel)

  14. Suppose there is a setup period
  • There is a setup phase where Alice and Bob can agree on a common secret
  • Eve only controls the channel; she does not see the internal state of Alice and Bob (only the external state of Bob)
  Simple solution:
  • Alice and Bob choose a random string X ∈R {0,1}^n
  • When Alice wants to approve – she sends X
  • If Bob gets any symbols on the channel – he compares them to X
  • If equal, he moves to Y
  • If not equal, he moves permanently to N
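The simple solution above can be sketched in a few lines of Python. This is a toy illustration only; the class name, byte-string encoding and `locked` flag are implementation choices of this sketch, not part of the lecture:

```python
import secrets

class Bob:
    """Bob's verifier: compares whatever arrives on the channel to X."""
    def __init__(self, secret: bytes):
        self.secret = secret
        self.state = "N"
        self.locked = False   # set on a mismatch: permanently N

    def receive(self, msg: bytes) -> None:
        if self.locked:
            return
        if secrets.compare_digest(msg, self.secret):
            self.state = "Y"
        else:
            self.state = "N"
            self.locked = True

# Setup phase: agree on a uniformly random X in {0,1}^n (n = 128 here).
x = secrets.token_bytes(16)
bob = Bob(x)

# Alice approves by sending X itself; Bob moves to Y.
bob.receive(x)
print(bob.state)   # "Y"
```

`secrets.compare_digest` is used instead of `==` as a matter of hygiene (constant-time comparison); the information-theoretic argument on the next slides does not depend on it.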

  15. Eve’s probability of success
  • If Alice did not send X and Eve put some string X′ on the channel, then Bob moves to Y only if X = X′:
  Prob[X = X′] ≤ 2^-n
  • Good news: we can make this as small as we wish
  • What to do if Alice and Bob cannot agree on a uniformly generated string X?

  16. Less than perfect random variables
  • Suppose X is chosen according to some distribution P_X over some set of symbols Γ
  • What is Eve’s best strategy?
  • What is her probability of success?

  17. (Shannon) Entropy
  Let X be a random variable over alphabet Γ with distribution P_X. The (Shannon) entropy of X is
  H(X) = − ∑_{x∈Γ} P_X(x) log P_X(x)
  where we take 0 log 0 to be 0. It represents how much we can compress X.

  18. Examples
  • If X = 0 (constant) then H(X) = 0
  • The only case where H(X) = 0 is when X is constant; in all other cases H(X) > 0
  • If X ∈ {0,1} with Prob[X=0] = p and Prob[X=1] = 1−p, then
  H(X) = −p log p − (1−p) log (1−p) ≡ H(p)
  • If X ∈ {0,1}^n is uniformly distributed, then
  H(X) = − ∑_{x∈{0,1}^n} 2^-n log 2^-n = 2^n · (n/2^n) = n
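These examples are easy to check numerically. A minimal Python sketch (the function name is an invention of this sketch; `dist` maps outcomes to probabilities):

```python
from math import log2

def shannon_entropy(dist):
    """H(X) = -sum over x of P(x) * log2(P(x)), taking 0 log 0 = 0."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Fair coin: H(1/2) = 1 bit.
print(shannon_entropy({0: 0.5, 1: 0.5}))                   # 1.0

# Uniform distribution on {0,1}^n has entropy n.
n = 8
print(shannon_entropy({x: 2**-n for x in range(2**n)}))    # 8.0
```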

  19. Properties of Entropy
  • Entropy is bounded: H(X) ≤ log |Γ|, with equality only if X is uniform over Γ

  20. Does High Entropy Suffice for Identification?
  • If Alice and Bob agree on X ∈ {0,1}^n where X has high entropy (say H(X) ≥ n/2), what are Eve’s chances of cheating?
  • They can be high. Say:
  • Prob[X = 0^n] = 1/2
  • For any x ∈ 1{0,1}^{n−1} (i.e., any x starting with 1), Prob[X = x] = 1/2^n
  Then H(X) = n/2 + 1/2, but Eve can cheat with probability at least 1/2 by guessing that X = 0^n

  21. Another Notion: Min Entropy
  Let X be a random variable over alphabet Γ with distribution P_X. The min entropy of X is
  H_min(X) = − log max_{x∈Γ} P_X(x)
  The min entropy reflects the probability of the most likely value of X.
  Property: H_min(X) ≤ H(X). Why?
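Min entropy is one line of Python, and the skewed distribution from slide 20 illustrates the gap between the two notions (function names are, again, choices of this sketch):

```python
from math import log2

def min_entropy(dist):
    """H_min(X) = -log2 of the largest point probability."""
    return -log2(max(dist.values()))

def shannon_entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Slide 20's distribution on {0,1}^n: 0^n with probability 1/2,
# each of the 2^(n-1) strings starting with 1 with probability 2^-n.
n = 8
dist = {"0" * n: 0.5}
for i in range(2 ** (n - 1)):
    dist["1" + format(i, "b").zfill(n - 1)] = 2**-n

print(shannon_entropy(dist))   # n/2 + 1/2 = 4.5 for n = 8
print(min_entropy(dist))       # 1.0: Eve guesses 0^n and wins half the time
```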

  22. High Min Entropy and Passwords
  Claim: if Alice and Bob agree on X such that H_min(X) ≥ m, then the probability that Eve succeeds in cheating is at most 2^-m
  Proof: make Eve deterministic by picking her best choice, X′ = x′. Then
  Prob[X = x′] = P_X(x′) ≤ max_{x∈Γ} P_X(x) = 2^(−H_min(X)) ≤ 2^-m
  Conclusion: passwords should be chosen to have high min-entropy!

  23. A good source on information theory: T. Cover and J. A. Thomas, Elements of Information Theory

  24. One-time vs. many times
  • This was good for a single identification. What about many identifications?
  • Later…

  25. A different scenario – now Charlie is involved
  • Bob has no proof that Alice indeed identified herself
  • If there are two possible verifiers, Bob and Charlie, each can pretend to the other to be Alice
  • They could each have their own string
  • But assume that they share the setup phase: whatever Bob knows, Charlie knows
  • Relevant when there are many possible verifiers!

  26. The new requirement
  • If Alice wants to approve and Eve does not interfere – Bob moves to state Y
  • If Alice does not approve, then for any behavior from Eve and Charlie, Bob stays in N
  • Similarly if Bob and Charlie are switched
  (figure: Alice, Bob, Charlie and Eve on the channel)

  27. Can we achieve the requirements?
  • Observation: what Bob and Charlie received in the setup phase might as well be public
  • Therefore we can reduce to the previous scenario (with no setup)…
  • To the rescue - complexity: Alice should be able to perform something that neither Bob nor Charlie (nor Eve) can do
  • Must assume that the parties are not computationally all-powerful!

  28. Functions and inversions
  • We say that a function f is hard to invert if, given y = f(x), it is hard to find x′ such that y = f(x′)
  • x′ need not be equal to x
  • We will use f^-1(y) to denote the set of preimages of y
  • To discuss hardness we must specify a computational model. We use two flavors:
  • Concrete
  • Asymptotic

  29. One-way functions – asymptotic
  A function f: {0,1}* → {0,1}* is called a one-way function if
  • f is a polynomial-time computable function
  • there is also a polynomial relationship between input and output length
  • for every probabilistic polynomial-time algorithm A, every positive polynomial p(·), and all sufficiently large n:
  Prob[A(f(x)) ∈ f^-1(f(x))] ≤ 1/p(n)
  where x is chosen uniformly in {0,1}^n and the probability is also over the internal coin flips of A

  30. One-way functions – concrete version
  A function f: {0,1}^n → {0,1}^n is called a (t, ε) one-way function if
  • f is a polynomial-time computable function (independently of t)
  • for every t-time algorithm A,
  Prob[A(f(x)) ∈ f^-1(f(x))] ≤ ε
  where x is chosen uniformly in {0,1}^n and the probability is also over the internal coin flips of A
  Can either think of t and ε as being fixed, or as t(n), ε(n)

  31. Complexity Theory and One-way Functions
  • Claim: if P = NP then there are no one-way functions
  • Proof: for any one-way function f: {0,1}^n → {0,1}^n consider the language L_f consisting of strings of the form (y, b_1, b_2, …, b_k) such that:
  • there is an x ∈ {0,1}^n with y = f(x), and
  • the first k bits of x are b_1, b_2, …, b_k
  L_f is in NP – guess x and check. If L_f is in P, then f is invertible in polynomial time: self-reducibility
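The self-reducibility step can be made concrete: with a membership oracle for L_f, a preimage is recovered one bit at a time. A Python sketch with a toy (certainly not one-way) f, where a brute-force search stands in for the hypothetical polynomial-time decider of L_f:

```python
def invert_with_oracle(f, n, y, in_Lf):
    """Recover a preimage of y bit by bit using an L_f membership oracle.
    in_Lf(y, prefix) answers: is there an x in {0,1}^n with f(x) = y
    whose first bits are `prefix`?"""
    prefix = ""
    for _ in range(n):
        prefix += "0" if in_Lf(y, prefix + "0") else "1"
    return prefix

# Toy f on 4-bit strings, for illustration only.
n = 4
def f(x):                     # x is a 4-character bit string
    v = int(x, 2)
    return format((v * 5 + 3) % 16, "04b")

# Brute-force oracle standing in for the assumed poly-time decider of L_f.
def in_Lf(y, prefix):
    return any(f(format(v, "04b")) == y
               for v in range(16)
               if format(v, "04b").startswith(prefix))

x = "1011"
x_rec = invert_with_oracle(f, n, f(x), in_Lf)
assert f(x_rec) == f(x)
```

Each of the n oracle queries halves the remaining search space, so n queries suffice: this is exactly why deciding L_f in polynomial time would invert f in polynomial time.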

  32. A few properties and questions concerning one-way functions
  • Major open problem: connect the existence of one-way functions to the P = NP? question
  • If f is one-to-one, it is called a one-way permutation. In what complexity class does the problem of inverting one-way permutations reside? (good exercise!)
  • If f is a one-way function, is f′ a one-way function, where f′(x) is f(x) with the last bit chopped?
  • If f is a one-way function, is f_L a one-way function, where f_L(x) consists of the first half of the bits of f(x)? (good exercise!)
  • If f is a one-way function, is g(x) = f(f(x)) necessarily a one-way function? (good exercise!)

  33. Solution to the password problem
  • Assume that
  • f: {0,1}^n → {0,1}^n is a (t, ε) one-way function
  • the adversary’s running time is bounded by t
  • Setup phase:
  • Alice chooses x ∈R {0,1}^n
  • computes y = f(x)
  • gives y to Bob and Charlie
  • When Alice wants to approve – she sends x
  • If Bob gets any symbols z on the channel – he computes f(z) and compares it to y
  • If equal, he moves to state Y
  • If not equal, he moves permanently to state N
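A sketch of this scheme in Python, using SHA-256 as a practical stand-in for the one-way function (SHA-256 is only conjectured hard to invert, not proven, and is not the function the slides assume; everything else follows the protocol above):

```python
import hashlib
import secrets

def f(x: bytes) -> bytes:
    # Stand-in one-way function: SHA-256 (an assumption of this sketch).
    return hashlib.sha256(x).digest()

# Setup phase: Alice picks x uniformly and gives y = f(x) to Bob and Charlie.
x = secrets.token_bytes(16)
y = f(x)

def bob_verify(z: bytes, y: bytes) -> str:
    # Bob recomputes f(z) and compares it to the stored y.
    return "Y" if f(z) == y else "N"

print(bob_verify(x, y))               # "Y": Alice approves by revealing x
print(bob_verify(b"a wrong guess!!", y))   # "N": Eve/Charlie only know y
```

Note that Bob and Charlie hold only y, so neither can impersonate Alice to the other without inverting f; this is exactly the point of the construction.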

  34. Eve’s and Charlie’s probability of success
  • If Alice did not send x and Eve (or Charlie) put some string x′ on the channel to Bob, then Bob moves to state Y only if f(x′) = y = f(x)
  • But we know that Prob[A(f(x)) ∈ f^-1(f(x))] ≤ ε, or else we could use Eve to break the one-way function
  Good news: if ε can be made as small as we wish, then we have a good scheme.
  • Can be used for monitoring
  • Similar to the Unix password scheme:
  • f(x) is stored in the login file
  • DES used as the one-way function

  35. Reductions
  • This is a simple example of a reduction: simulate Eve’s algorithm in order to break the one-way function
  • Most reductions are much more involved

  36. Cryptographic Reductions
  Show how to use an adversary for breaking primitive 1 in order to break primitive 2
  Important issues:
  • Run time: how does T_1 relate to T_2?
  • Probability of success: how does ε_1 relate to ε_2?
  • Access to the system: primitive 1 vs. primitive 2

  37. Are one-way functions essential to the two guards password problem?
  Precise definition of security:
  • for every probabilistic polynomial-time algorithm A controlling Eve and Charlie,
  • every polynomial p(·),
  • and all sufficiently large n:
  Prob[Bob moves to Y | Alice does not approve] ≤ 1/p(n)
  • Recall the observation: what Bob and Charlie received in the setup phase might as well be public
  • Claim: we can get rid of interaction – given an interactive identification protocol, it is possible to construct a noninteractive one. In the new protocol:
  • Alice′ sends Bob′ the random bits Alice used to generate the setup information
  • Bob′ simulates the conversation between Alice and Bob in the original protocol and accepts only if the simulated Bob accepts
  • The probability of cheating is the same

  38. One-way functions are essential to the two guards password problem
  • Are we done? Given a noninteractive identification protocol, we want to define a one-way function
  • Define the function f(r) as the mapping that Alice performs in the setup phase between her random bits r and the information y given to Bob and Charlie
  • Problem: the function f(r) is not necessarily one-way…
  • There can be unlikely ways to generate y, and they can be exploited to invert
  • Example: Alice chooses x, x′ ∈ {0,1}^n; if x′ = 0^n set y = x, otherwise set y = f(x)
  • The protocol is still secure, but with probability 1/2^n it is not complete
  • The resulting function f(x, x′) is easy to invert: given y ∈ {0,1}^n, set the inverse as (y, 0^n)

  39. One-way functions are essential to the two guards password problem…
  • However: it is possible to estimate the probability that Bob accepts on a given string from Alice
  • Second attempt: define the function f(r) as
  • the mapping that Alice performs in the setup phase between her random bits r and the information given to Bob and Charlie,
  • plus a bit indicating whether the probability that Bob accepts given r is greater than 2/3
  Theorem: the two guards password problem has a solution if and only if one-way functions exist

  40. Examples of One-way Functions
  Examples of hard problems:
  • Subset sum
  • Discrete log
  • Factoring (numbers, polynomials) into prime components
  How do we get a one-way function out of them? (The forward direction must be an easy problem.)

  41. Subset Sum
  • Subset sum problem: given
  • n numbers 0 ≤ a_1, a_2, …, a_n ≤ 2^m
  • a target sum T
  • find a subset S ⊆ {1,…,n} with ∑_{i∈S} a_i = T
  • (n, m)-subset sum assumption: for uniformly chosen a_1, a_2, …, a_n ∈R {0,…,2^m − 1} and S ⊆ {1,…,n}, for any probabilistic polynomial-time algorithm, the probability of finding S′ ⊆ {1,…,n} such that ∑_{i∈S} a_i = ∑_{i∈S′} a_i mod 2^m is negligible, where the probability is over the random choice of the a_i’s, S, and the inner coin flips of the algorithm
  • Subset sum one-way function f: {0,1}^{mn+n} → {0,1}^{mn+m}
  f(a_1, a_2, …, a_n, b_1, b_2, …, b_n) = (a_1, a_2, …, a_n, ∑_{i=1}^{n} b_i a_i mod 2^m)
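The subset-sum function is direct to write down. A Python sketch (parameter names follow the slide; the sizes n and m here are toy values for illustration):

```python
import secrets

def subset_sum_f(a, b, m):
    """f(a_1..a_n, b_1..b_n) = (a_1..a_n, sum of the a_i with b_i = 1, mod 2^m)."""
    total = sum(ai for ai, bi in zip(a, b) if bi) % (2**m)
    return tuple(a), total

n, m = 8, 16
a = [secrets.randbelow(2**m) for _ in range(n)]   # the public numbers
b = [secrets.randbelow(2) for _ in range(n)]      # indicator bits of the subset S
image = subset_sum_f(a, b, m)
# Inverting f on (a, T) means finding *some* subset of the a_i summing
# to T mod 2^m -- exactly the problem the assumption declares hard.
```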

  42. Exercise
  Show a function f such that
  • if f is polynomial-time invertible on all inputs, then P = NP
  • f is not one-way

  43. Discrete Log Problem
  • Let G be a group and g an element in G
  • Let y = g^x, with x the minimal non-negative integer satisfying the equation; x is called the discrete log of y to base g
  • Example: y = g^x mod p in the multiplicative group of Z_p
  • In general: exponentiation is easy via repeated squaring (consider the binary representation of x)
  • What about discrete log?
  • If it is difficult, f(g, x) = (g, g^x) is a one-way function
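The easy direction, exponentiation via repeated squaring, looks like this in Python (equivalent to the built-in three-argument `pow(g, x, p)`):

```python
def square_and_multiply(g, x, p):
    """Compute g^x mod p by scanning the bits of x (repeated squaring)."""
    result = 1
    base = g % p
    while x:
        if x & 1:                   # current bit of the exponent is 1
            result = result * base % p
        base = base * base % p      # square for the next bit position
        x >>= 1
    return result

p = 101
assert square_and_multiply(3, 77, p) == pow(3, 77, p)
```

This takes O(log x) multiplications, which is why exponentiation is easy; no comparably fast general method is known for recovering x from g^x mod p.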

  44. Integer Factoring
  • Consider f(x, y) = x · y
  • Easy to compute
  • Is it one-way?
  • No: if f(x, y) is even, we can set the inverse as (f(x, y)/2, 2)
  • If factoring a number into prime factors is hard – specifically, if given N = P · Q, the product of two random large (n-bit) primes, it is hard to factor –
  then f is somewhat hard: a non-negligible fraction of inputs, ~1/n^2 from the density of primes, yield such products
  • Hence f is a weak one-way function
  • Alternatively:
  • let g(r) be a function mapping random bits into random primes;
  • the function f(r_1, r_2) = g(r_1) · g(r_2) is one-way
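The easy-to-invert cases are easy to exhibit in code. A small sketch (the helper name is an invention of this sketch):

```python
def f(x: int, y: int) -> int:
    return x * y

def easy_inverse(z: int):
    """Inverting f is easy on most inputs: any even z factors as (z//2, 2)."""
    if z % 2 == 0:
        return (z // 2, 2)
    return None   # the hard cases hide here, e.g. products of two large primes

assert easy_inverse(12) == (6, 2)
assert f(*easy_inverse(12)) == 12
# f is only *weakly* one-way: it is hard to invert just on the ~1/n^2
# fraction of inputs where both factors are large random primes.
```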

  45. Weak One-way Functions
  A function f: {0,1}^n → {0,1}^n is called a weak one-way function if
  • f is a polynomial-time computable function
  • there exists a polynomial p(·) such that for every probabilistic polynomial-time algorithm A and all sufficiently large n:
  Prob[A(f(x)) ∈ f^-1(f(x))] ≤ 1 − 1/p(n)
  where x is chosen uniformly in {0,1}^n and the probability is also over the internal coin flips of A

  46. Exercise: weak exists if strong exists
  Show that if strong one-way functions exist, then there exists a function which is a weak one-way function but not a strong one

  47. What about the other direction?
  • Given a function f that is guaranteed to be weakly one-way, with p(n) such that Prob[A(f(x)) ∈ f^-1(f(x))] ≤ 1 − 1/p(n),
  can we construct a function g that is (strong) one-way?
  This is an instance of a hardness amplification problem.
  • Simple idea: repetition. For some polynomial q(n), define
  g(x_1, x_2, …, x_{q(n)}) = f(x_1), f(x_2), …, f(x_{q(n)})
  • To invert g one needs to succeed in inverting f in all q(n) places
  • If q(n) = p^2(n), this seems unlikely: (1 − 1/p(n))^{p^2(n)} ≈ e^{−p(n)}
  • But how do we show it? The sequential-repetition intuition is not a proof.
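The repetition construction itself is one line per coordinate. A Python sketch, with a toy stand-in for f (the real construction needs f to be weakly one-way, which this toy certainly is not):

```python
def amplify(f, q):
    """g(x_1,...,x_q) = (f(x_1),...,f(x_q)): hardness amplification by
    repetition. Inverting g requires inverting f at all q positions."""
    def g(xs):
        assert len(xs) == q
        return tuple(f(x) for x in xs)
    return g

# Toy stand-in for a weakly one-way f, for illustration only.
f = lambda x: (x * x) % 257
g = amplify(f, 3)
assert g((2, 3, 4)) == (4, 9, 16)
```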

  48. Want: inverting g with low probability implies inverting f with high probability
  • Given a machine A that inverts g, we want a machine A′ that
  • operates in similar time bounds
  • inverts f with high probability
  • Idea: given y = f(x), plug it into some place in g and generate the rest of the locations at random:
  z = (y, f(x_2), …, f(x_{q(n)}))
  • Ask machine A to invert g at point z
  • The probability of success should be at least (exactly) A’s probability of inverting g at a random point
  • Once is not enough. How to amplify?
  • Repeat while keeping y fixed
  • Put y at a random position (or sort the inputs to g)

  49. Proof of Amplification for Repetition of Two
  • Concentrate on repetition of two: g(x_1, x_2) = f(x_1), f(x_2)
  • Goal: show that the probability of inverting g is roughly the square of the probability of inverting f, just as it would be for sequential repetition
  • Claim: Let δ(n) be a function that for some p(n) satisfies 1/p(n) ≤ δ(n) ≤ 1 − 1/p(n), and let ε(n) be any inverse-polynomial function. Suppose that for every polynomial-time A and sufficiently large n,
  Prob[A(f(x)) ∈ f^-1(f(x))] ≤ δ(n)
  Then for every polynomial-time B and sufficiently large n,
  Prob[B(g(x_1, x_2)) ∈ g^-1(g(x_1, x_2))] ≤ δ^2(n) + ε(n)

  50. Proof of Amplification for Repetition of Two
  Suppose not. Then given an algorithm B that inverts g with probability better than δ^2 + ε, construct the following inversion algorithm B′(y) for f:
  • Repeat t times (the inner loop):
  • Choose x′ at random and compute y′ = f(x′)
  • Run B(y, y′)
  • Check the results; if correct, halt with success
  • Output failure
  (The inner loop is the part that is helpful for a constructive algorithm.)
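The inner loop of B′ can be sketched as follows. Everything concrete here is an assumption of the sketch: f is a toy quadratic map, and a brute-force search stands in for the assumed good inverter B of g:

```python
import secrets

def B_prime(y, f, B, t, n):
    """Try to invert f at y: embed y as the first coordinate of a
    g-instance, fill the second coordinate at random, and run B."""
    for _ in range(t):
        x2 = secrets.randbits(n)       # fresh random second input
        result = B((y, f(x2)))         # B attempts to invert g(x1, x2)
        if result is not None:
            x1_cand, _ = result
            if f(x1_cand) == y:        # verify before declaring success
                return x1_cand
    return None                        # failed in all t attempts

# Toy f, and a brute-force B standing in for the assumed inverter of g.
f = lambda x: (x * x) % 251
def B(ys):
    pre = [next((x for x in range(256) if f(x) == yi), None) for yi in ys]
    return tuple(pre) if all(p is not None for p in pre) else None

x1 = B_prime(f(7), f, B, 5, 8)
assert x1 is not None and f(x1) == f(7)
```

The verification step before returning is what makes the reduction safe to use: B may answer arbitrarily, and B′ only claims success when f(x1) = y actually holds.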
