
Simple Extractors for all Min-Entropies








  1. Simple Extractors for all Min-Entropies
R. Shaltiel and C. Umans

  2. Definitions
Def (min-entropy): The min-entropy of a random variable X over {0,1}^n is defined as
$$H_\infty(X) = \min_{x} \log_2 \frac{1}{\Pr[X=x]}.$$
Thus a random variable X has min-entropy at least k if Pr[X=x] ≤ 2^{-k} for all x. The maximum possible min-entropy for such a R.V. is n.
Def (statistical distance): Two distributions P, Q on a domain D are ε-close if the probabilities they give to any A ⊆ D differ by at most ε, i.e. max_{A⊆D} |P(A) − Q(A)| ≤ ε (equivalently, half the L1 norm: ½‖P − Q‖₁ ≤ ε).
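
To make the two definitions concrete, here is a minimal Python sketch (not from the paper) that computes both quantities for small explicit distributions given as dictionaries:

```python
from math import log2

def min_entropy(dist):
    """H_inf(X) = min_x log2(1 / Pr[X=x]) = -log2(max_x Pr[X=x])."""
    return -log2(max(dist.values()))

def statistical_distance(p, q):
    """max_A |p(A) - q(A)| = (1/2) * sum_x |p(x) - q(x)|."""
    support = set(p) | set(q)
    return sum(abs(p.get(x, 0) - q.get(x, 0)) for x in support) / 2

uniform2 = {"00": 0.25, "01": 0.25, "10": 0.25, "11": 0.25}
skewed = {"00": 0.5, "01": 0.25, "10": 0.125, "11": 0.125}
print(min_entropy(skewed))                     # 1.0, since the max prob is 2^-1
print(statistical_distance(skewed, uniform2))  # 0.25
```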

  3. Definitions
[Figure: a weak random source (n bits) and a seed (t bits) enter E, which outputs a random string (m bits).]
Def (extractor): A (k,ε)-extractor is a function E: {0,1}^n × {0,1}^t → {0,1}^m s.t. for any R.V. X with min-entropy ≥ k, E(X, U_t) is ε-close to U_m (where U_m denotes the uniform distribution over {0,1}^m).
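
As a sanity check on the definition, the following toy Python sketch (my own illustration, with made-up tiny parameters n = t = 4, m = 1, k = 2) measures the best ε for a candidate E against every flat k-source; flat sources suffice because any min-entropy-k distribution is a convex combination of them:

```python
from itertools import combinations

n, t, m, k = 4, 4, 1, 2   # toy sizes; here the seed is as long as the source

def E(x, y):
    """Candidate extractor: inner product of x and y over GF(2)."""
    return bin(x & y).count("1") % 2

# Check every flat k-source: the uniform distribution on 2^k source points.
worst_eps = 0.0
for support in combinations(range(2 ** n), 2 ** k):
    ones = sum(E(x, y) for x in support for y in range(2 ** t))
    p1 = ones / (len(support) * 2 ** t)        # Pr[E(X, U_t) = 1]
    worst_eps = max(worst_eps, abs(p1 - 0.5))  # distance to U_1 for one bit
print(f"E is a ({k}, {worst_eps:.3f})-extractor at these toy sizes")
```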

  4. Parameters
[Figure: the same extractor diagram as above.]
The relevant parameters are:
• Min-entropy of the weak random source input, k. Relevant values: log(n) ≤ k ≤ n (the seed length is t ≥ log(n), hence it is useless to consider lower min-entropy).
• Seed length t ≥ log(n).
• Quality of the output: ε.
• Size of the output: m = f(k). The optimum is m = k.

  5. Extractors
[Figure: a high min-entropy distribution over 2^n points and a uniformly distributed seed over 2^t points are mapped by E to an output over 2^m points that is close to uniform.]

  6. Next Bit Predictors
Claim: to prove E is an extractor, it suffices to prove that for all 1 ≤ i ≤ m and all predictors f: {0,1}^{i-1} → {0,1}:
$$\Pr\big[f\big(E(X,U_t)_{1,\dots,i-1}\big) = E(X,U_t)_i\big] \le \frac{1}{2} + \frac{\varepsilon}{m}.$$
Proof: Assume E is not an extractor; then there exists a R.V. X with min-entropy ≥ k s.t. E(X,U_t) is not ε-close to U_m, that is, there is a set A ⊆ {0,1}^m with:
$$\big|\Pr[E(X,U_t) \in A] - \Pr[U_m \in A]\big| > \varepsilon.$$

  7. Proof
Now define the following hybrid distributions: for 0 ≤ i ≤ m, let D_i be the distribution whose first i bits are drawn from E(X,U_t) and whose remaining m−i bits are uniform, so that D_0 = U_m and D_m = E(X,U_t).

  8. Proof
Summing the probabilities of the event corresponding to the set A over all the hybrid distributions yields:
$$\varepsilon < \big|\Pr[D_m \in A] - \Pr[D_0 \in A]\big| = \Big|\sum_{i=1}^{m} \big(\Pr[D_i \in A] - \Pr[D_{i-1} \in A]\big)\Big|.$$
And because |∑ a_i| ≤ ∑ |a_i|, there exists an index 1 ≤ i ≤ m for which:
$$\big|\Pr[D_i \in A] - \Pr[D_{i-1} \in A]\big| > \frac{\varepsilon}{m}.$$

  9. The Predictor
We now define a function f: {0,1}^{i-1} → {0,1} that can predict the i'th bit with probability at least ½ + ε/m (“a next bit predictor”):
On input x_1,…,x_{i-1}, the function f uniformly and independently draws the bits y_i,…,y_m, and outputs y_i if x_1 ⋯ x_{i-1} y_i ⋯ y_m ∈ A, and 1 − y_i otherwise.
Note: the above definition is not constructive, as A is not known! (A sketch of this predictor, given oracle access to A, appears below.)
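
A minimal Python sketch of this predictor, assuming oracle access to the (unknown) distinguishing set A; the set A below is a made-up stand-in, and m, i are illustrative:

```python
import random

m, i = 4, 3                           # predict bit i from bits 1..i-1
A = {s for s in range(2 ** m)         # toy distinguishing set: even-weight
     if bin(s).count("1") % 2 == 0}   # strings (stands in for the real A)

def f(prefix):
    """On input x_1..x_{i-1}: draw y_i..y_m uniformly; if the completed
    string lies in A, output y_i, else output its complement."""
    y = [random.randint(0, 1) for _ in range(m - i + 1)]
    full = int("".join(map(str, list(prefix) + y)), 2)
    return y[0] if full in A else 1 - y[0]

print(f((1, 0)))   # a randomized prediction for bit i = 3
```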

  10. Proof
And f is indeed a next bit predictor:
$$\Pr\big[f\big(E(X,U_t)_{1,\dots,i-1}\big) = E(X,U_t)_i\big] \ge \frac{1}{2} + \frac{\varepsilon}{m}$$
(see the computation below). Q.E.D.
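
The omitted computation, reconstructed under the standard hybrid-argument conventions (the index i and the sign of the difference are chosen so that the advantage below is positive; this is my reconstruction, not the slide's own text):

```latex
% Condition on whether the random guess y_i equals the true bit (each w.p. 1/2).
\begin{align*}
\Pr[f \text{ correct}]
  &= \tfrac{1}{2}\Pr[D_i \in A] + \tfrac{1}{2}\big(1 - \Pr[\bar D_i \in A]\big)
  && \bar D_i := D_i \text{ with bit } i \text{ flipped} \\
  &= \tfrac{1}{2} + \Pr[D_i \in A] - \Pr[D_{i-1} \in A]
  && \text{since } \Pr[D_{i-1} \in A] = \tfrac{1}{2}\Pr[D_i \in A] + \tfrac{1}{2}\Pr[\bar D_i \in A] \\
  &\ge \tfrac{1}{2} + \tfrac{\varepsilon}{m}.
\end{align*}
```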

  11. Basic Example – Safra, Ta-Shma, Zuckerman
Construction:
• Let BC: F → {0,1}^s be an (inefficient) binary code.
• Given x, a weak random source, interpreted as a polynomial x̂: F² → F, and s, a seed, interpreted as a random point (a,b) and an index j into the binary code,
• Def: E(x, ((a,b), j)) = BC(x̂(a,b))_j ∘ BC(x̂(a,b+1))_j ∘ ⋯ ∘ BC(x̂(a,b+m−1))_j.

  12. Basic Example – Illustration of Construction
[Figure: x ↦ x̂; the polynomial x̂ is evaluated at the points (a,b), (a,b+1), …, (a,b+m−1); each value is encoded by the (inefficient) binary code, and the j-th bit of each codeword is output.]
• x ↦ x̂, s = ((a,b), 2)
• E(x,s) = 01001
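
A toy Python sketch of this construction (my own illustration: F = GF(5), a Hadamard-style code standing in for the slide's unspecified binary code BC, and made-up small parameters):

```python
q, h, m = 5, 2, 3     # field size, degree bound, output length (toy values)

def BC(v):
    """Stand-in inefficient binary code: Hadamard encoding of v < 8,
    codeword bit j = <v, j> over GF(2)."""
    return [bin(v & j).count("1") % 2 for j in range(8)]

def x_hat(coeffs, a, b):
    """Evaluate the bivariate polynomial {(e1,e2): c} at (a, b) over GF(q)."""
    return sum(c * pow(a, e1, q) * pow(b, e2, q)
               for (e1, e2), c in coeffs.items()) % q

def E(coeffs, seed):
    """E(x, ((a,b), j)) = j-th bit of BC(x_hat(a, b+i)) for i = 0..m-1."""
    (a, b), j = seed
    return [BC(x_hat(coeffs, a, (b + i) % q))[j] for i in range(m)]

x = {(0, 0): 3, (1, 0): 1, (0, 1): 4}   # the source x as the polynomial 3 + a + 4b
print(E(x, ((2, 1), 2)))                # m bits, read along the b-direction
```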

  13. Basic Example – Proof Sketch
• Assume, by way of contradiction: there exists a next bit predictor function f.
• Next, show a reconstruction function R.
• Conclude: a contradiction! (to the min-entropy assumption on X)

  14. Basic Example – Reconstruction Function
[Figure: parameters h ~ n^{1/2}, j ~ lg n, m ~ desired entropy.]
• A random line, plus “few” red points as advice: a = m·j·O(h).
• List decoding by the predictor f.
• Resolve into one value on the line.
• Repeat using the new points, until all of F² is evaluated.

  15. Counting Argument
For Y ⊆ X, let μ(Y) = ∑_{y∈Y} Pr[y] (“the weight of Y”).
Let R: {0,1}^a → {0,1}^n, s.t. Pr_{x~X}[∃z: R(z) = x] ≥ 1/2.
• For a uniform X, |R(S)| ≥ |X|/2.
• For an arbitrary distribution X, μ(R(S)) ≥ μ(X)/2.
• Let X have min-entropy ≥ k; then μ(R(S)) ≤ 2^{a−k} (there are at most 2^a strings in R(S), and ∀x∈X: Pr[x] ≤ 2^{-k}),
• and therefore k ≤ a − log₂(1/2) = a + 1 (since 1 = μ(X) ≤ 2·μ(R(S)) ≤ 2·2^{a−k}, i.e. 2^{-1} ≤ 2^{a−k}, hence k ≤ a + 1).
[Figure: R maps S ⊆ {0,1}^a (2^a seeds) into R(S) ⊆ {0,1}^n, which must cover half the weight of X.]

  16. Problems with Safra, Ta-Shma, Zuckerman
• Curse of dimensionality – too many lines!
Solution: a generator matrix.

  17. Next-q-it List-Predictor
f is allowed to output a small list of ℓ possible next elements.

  18. q-ary Extractor
Def: Let F be a field with q elements. A (k, ℓ) q-ary extractor is a function E: {0,1}^n × {0,1}^t → F^m s.t. for all R.V. X with min-entropy ≥ k, all 1 ≤ i ≤ m, and all list-predictors f: F^{i-1} → F^ℓ:
$$\Pr\big[E(X,U_t)_i \in f\big(E(X,U_t)_{1,\dots,i-1}\big)\big] \le \frac{1}{2\sqrt{\ell}}.$$

  19. Generator Matrix
Def: A generator matrix for the vector space F^d is a d×d matrix A s.t. for any non-zero vector v ∈ F^d: {A^i v : 1 ≤ i ≤ q^d − 1} = F^d \ {0} (that is, any vector 0 ≠ v ∈ F^d, multiplied by all powers of A, generates the entire vector space F^d except for 0).
Lemma: Such a generator matrix exists and can be found in time q^{O(d)}.

  20. Construction
• Let F be a field with q elements.
• Let F^d be a vector space over F.
• Let h be the smallest integer s.t. the coefficients of a d-variate polynomial of total degree h−1 suffice to specify x, i.e. $\binom{h+d-2}{d-1} \log q \ge n$.
• For x ∈ {0,1}^n, let x̂ denote the unique d-variate polynomial of total degree h−1 whose coefficients are specified by x.
Note that for such a polynomial, the number of coefficients is exactly $\binom{h+d-2}{d-1}$ (“choosing where to put d−1 bars between h−1 balls”).

  21. Construction
[Figure: the seed is interpreted as a vector v ∈ F^d; the generator matrix A maps it through v, Av, …, A^m v, and x̂ is evaluated at these points.]
The definition of the q-ary extractor E: {0,1}^n × {0,1}^{d log q} → F^m:
$$E(x, v) = \big(\hat{x}(Av), \hat{x}(A^2 v), \dots, \hat{x}(A^m v)\big).$$
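
A toy Python sketch of the construction, assuming small made-up parameters (q = 2, d = 3, m = 4) and a hard-coded generator matrix A (the multiplication-by-x matrix of GF(2³) mod x³ + x + 1, which generates F₂³ \ {0} as the lemma requires, since x generates the order-7 cyclic group GF(2³)*):

```python
q, d, m = 2, 3, 4
A = [[0, 0, 1],        # multiplication-by-x in GF(2^3) mod x^3 + x + 1,
     [1, 0, 1],        # written as rows; its powers applied to any nonzero
     [0, 1, 0]]        # v sweep out all of F_2^3 \ {0}

def matvec(M, v):
    return tuple(sum(M[i][j] * v[j] for j in range(d)) % q for i in range(d))

def x_hat(coeffs, pt):
    """Evaluate the d-variate polynomial {exponent-tuple: coeff} at pt."""
    return sum(c * (pt[0] ** e0) * (pt[1] ** e1) * (pt[2] ** e2)
               for (e0, e1, e2), c in coeffs.items()) % q

def E(coeffs, v):
    """E(x, v) = (x_hat(Av), x_hat(A^2 v), ..., x_hat(A^m v))."""
    out, w = [], v
    for _ in range(m):
        w = matvec(A, w)
        out.append(x_hat(coeffs, w))
    return out

x = {(1, 0, 0): 1, (0, 1, 1): 1}     # toy source polynomial: X1 + X2*X3
print(E(x, (1, 0, 1)))               # m field elements (bits, since q = 2)
```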

  22. Main Theorem
Thm: For any n, q, d and h as previously defined, E is a (k, ℓ) q-ary extractor if: […]
Alternatively, E is a (k, ℓ) q-ary extractor if: […]

  23. What's Ahead
• Proving existence of a generator matrix
• How the counting argument works
• The reconstruction paradigm
• Basic example – Safra, Ta-Shma, Zuckerman
• Proof of the main theorem
• From extractors to PRGs

  24. Extension Fields
A field F₂ is called an extension of another field F if F is contained in F₂ as a subfield.
Thm: For every prime power p^k (p prime, k > 0) there is a unique (up to isomorphism) finite field containing p^k elements. These fields are denoted GF(p^k). Every finite field's cardinality is of this form.
Def: A polynomial is called irreducible in GF(p) if it does not factor over GF(p).
Thm: Let f(x) be an irreducible polynomial of degree k over GF(p). The finite field GF(p^k) can be constructed as the set of polynomials of degree at most k−1 over Z_p, with addition and multiplication carried out modulo f(x).

  25. Extension Fields – Example
Construct GF(2⁵) as follows:
• Let the irreducible polynomial be: […]
• Represent every degree-k polynomial as a vector of its k+1 coefficients.
• Addition over this field is coefficient-wise addition modulo 2.

  26. Extension Fields – Example
Multiplication is ordinary polynomial multiplication, followed by reduction modulo the irreducible polynomial.
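
A minimal Python sketch of this arithmetic, assuming the irreducible polynomial x⁵ + x² + 1 (the slide's actual choice is not reproduced in the transcript; this is one standard pick). Elements are 5-bit integers, bit i being the coefficient of x^i:

```python
IRRED = 0b100101          # assumed irreducible polynomial x^5 + x^2 + 1

def gf_add(a, b):
    return a ^ b          # coefficient-wise addition mod 2

def gf_mul(a, b):
    prod = 0
    while b:              # schoolbook polynomial multiplication over GF(2)
        if b & 1:
            prod ^= a
        a, b = a << 1, b >> 1
    for shift in range(4, -1, -1):       # reduce modulo IRRED
        if prod & (1 << (5 + shift)):
            prod ^= IRRED << shift
    return prod

# usage: (x^4 + x) * (x + 1) mod (x^5 + x^2 + 1) = x^4 + x + 1
print(bin(gf_mul(0b10010, 0b00011)))     # 0b10011
```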

  27. Generator Matrix – Existence Proof
Denote by GF*(q^d) the multiplicative group of the Galois field GF(q^d). This multiplicative group is cyclic, and thus has a generator g: GF*(q^d) = {g¹, g², …, g^{q^d−1}}.
Let φ be the natural isomorphism between the Galois field GF(q^d) and the vector space F^d, which matches a polynomial with its vector of coefficients.

  28. Generator Matrix – Existence Proof
Now define the generator matrix A of F^d as the linear transformation that corresponds to multiplication by the generator g in GF*(q^d): A·v = φ(g · φ⁻¹(v)).
A is a linear transformation because of the distributive property of both the vector space and the field GF(q^d), according to the isomorphism properties.

  29. Generator Matrix – Existence Proof
It remains to show that the generator matrix A of F^d can be found in time q^{O(d)}. And indeed:
• The Galois field GF(q^d) can be constructed in time q^{O(d)} using an irreducible polynomial of degree d over the field Z_q (and such a polynomial can also be found in time q^{O(d)} by exhaustive search).
• A generator of GF*(q^d) can be found in time q^{O(d)} by exhaustive search.
• Using the generator, for any basis of F^d, one can construct d independent equations so as to find the linear transformation A. This linear equation system is also solvable in time q^{O(d)}.
(A brute-force sketch of this procedure appears below.)
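
A brute-force Python sketch of this procedure at toy size (my own illustration with q = 2, d = 3 and the assumed irreducible polynomial x³ + x + 1); it mirrors the proof: build GF(q^d), find a multiplicative generator by exhaustive search, and read off the multiplication-by-g matrix in the coefficient basis:

```python
from itertools import product

q, d = 2, 3
irred = (1, 1, 0)     # f(x) = x^3 + x + 1: low coefficients (f0, f1, f2)

def polymul(a, b):
    """Multiply field elements (coefficient tuples), reduce mod f(x)."""
    prod = [0] * (2 * d - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            prod[i + j] = (prod[i + j] + ai * bj) % q
    for k in range(2 * d - 2, d - 1, -1):   # use x^d = -(f0 + ... + f_{d-1} x^{d-1})
        c, prod[k] = prod[k], 0
        for j in range(d):
            prod[k - d + j] = (prod[k - d + j] - c * irred[j]) % q
    return tuple(prod[:d])

def is_generator(g):
    """Do the powers of g enumerate all q^d - 1 nonzero elements?"""
    seen, x = set(), (1,) + (0,) * (d - 1)
    for _ in range(q ** d - 1):
        x = polymul(x, g)
        seen.add(x)
    return len(seen) == q ** d - 1

g = next(e for e in product(range(q), repeat=d) if any(e) and is_generator(e))

# Column j of A is g * x^j: the multiplication-by-g map in the basis 1, x, x^2.
cols = [polymul(g, tuple(int(i == j) for i in range(d))) for j in range(d)]
A = [[cols[j][i] for j in range(d)] for i in range(d)]
print("generator:", g, " generator matrix rows:", A)
```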

  30. “Reconstruction Proof Paradigm”
Proof sketch: For a given R.V. X with min-entropy at least k, assume a function f that violates the properties of a q-ary extractor; construct another function R: {0,1}^a → {0,1}^n, the “reconstruction function”. This function, using f as a procedure, has the property that Pr_{x~X}[∃z: R(z) = x] ≥ 1/2.
Applying the “counting argument”, this contradicts the assumption that X has min-entropy at least k.

  31. Proof Sketch
• Let X be a random variable with min-entropy at least k.
• Assume, by way of contradiction: there exists a next bit predictor function f.
• Next, show a reconstruction function R.
• Conclude: a contradiction! (to the min-entropy assumption on X)

  32. Main Lemma
Lemma: Let n, q, d, h be as in the main theorem. There exists a probabilistic function R: {0,1}^a → {0,1}^n with a = O(m·h·d·log q) such that for every x on which the list-predictor f succeeds with probability at least 1/(2√ℓ), the following holds (the probability is over the random coins of R):
$$\Pr[\exists z: R(z) = x] \ge \tfrac{1}{2}.$$

  33. The Reconstruction Function (R)
• Task: allow many strings x in the support of X to be reconstructed from very short advice strings.
• Outline:
  • Use f in a sequence of prediction steps to evaluate x̂ (denoted z below) on all points of F^d,
  • interpolate to recover the coefficients of z,
  • which gives x.
Next we show: there exists a sequence of prediction steps that works for many x in the support of X and requires few advice strings.

  34. Curves
• Let r = Θ(d).
• Pick random vectors and values:
  • 2r random points y_1,…,y_2r ∈ F^d, and
  • 2r values t_1,…,t_2r ∈ F, and
• define degree-(2r−1) polynomials p_1, p_2:
  • p_1: F → F^d defined by p_1(t_i) = y_i, i = 1,…,2r.
  • p_2: F → F^d defined by p_2(t_i) = A·y_i, i = 1,…,r, and p_2(t_i) = y_i, i = r+1,…,2r.
• Define vector sets P_1 = {p_1(z)}_{z∈F} and P_2 = {p_2(z)}_{z∈F}.
• For i > 0 define P_{2i+1} = A·P_{2i−1} and P_{2i+2} = A·P_{2i}.
({P_i}, the sequence of prediction steps, are low-degree curves in F^d, chosen using the coin tosses of R; a sketch of the interpolation appears below.)
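
A small Python sketch of building such a curve by coordinate-wise Lagrange interpolation (my own illustration over a toy prime field; in the actual construction F is an extension field and r, d are set as above):

```python
import random

q, d, r = 13, 2, 2       # toy prime field and curve parameters

def lagrange_eval(pts, z):
    """Evaluate the unique degree-(len(pts)-1) polynomial through
    pts = [(t, value), ...] at z, over Z_q."""
    total = 0
    for i, (ti, vi) in enumerate(pts):
        num, den = 1, 1
        for j, (tj, _) in enumerate(pts):
            if i != j:
                num = num * (z - tj) % q
                den = den * (ti - tj) % q
        total = (total + vi * num * pow(den, q - 2, q)) % q  # den^-1 mod q
    return total

ts = random.sample(range(q), 2 * r)                  # distinct t_1..t_2r
ys = [tuple(random.randrange(q) for _ in range(d))   # y_1..y_2r in F^d
      for _ in range(2 * r)]

def p1(z):   # the curve: interpolate each of the d coordinates separately
    return tuple(lagrange_eval([(t, y[c]) for t, y in zip(ts, ys)], z)
                 for c in range(d))

assert all(p1(t) == y for t, y in zip(ts, ys))       # passes through every y_i
```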

  35. Curves
[Figure: the curves in F^d through the points y_1,…,y_2r and their images A(y_i), A²(y_i), …, plotted against the interpolation values t_1,…,t_2r ∈ F.]

  36. Simple Observations
• A is a non-singular linear transform, hence for all i:
  • P_i is a 2r-wise independent collection of points,
  • P_i and P_{i+1} intersect at r random points.
• z|P_i is a univariate polynomial of degree at most 2hr.
• Given the evaluation of z on Av, A²v, …, A^m v, we may use the predictor function f to predict z(A^{m+1}v) to within ℓ values.
• We need as advice string: the 2hr coefficients of z|P_i for i = 1,…,m (length: at most m·h·r·log q ≤ a).

  37. Using N.B.P.
[Figure: list-prediction along a single curve in F^d.]
Cannot resolve into one value!

  38. Using N.B.P.
[Figure: list-prediction along the pair of intersecting curves in F^d.]
Can resolve into one value using the second curve!

  39. Using N.B.P.
[Figure: another frame of the same resolution step.]
Can resolve into one value using the second curve!

  40. Main Lemma Proof Cont.
• Claim: with probability at least 1 − 1/(8q^d) over the coin tosses of R, at least q/(4√ℓ) of the q predictions made on the i'th curve are successful.
• Proof: We use the following tail bound: let t > 4 be an even integer, and let X_1,…,X_n be t-wise independent R.V.s with values in [0,1]. Let X = ∑X_i, μ = E[X], and A > 0. Then:
$$\Pr[|X - \mu| \ge A] \le 8\left(\frac{t\mu + t^2}{A^2}\right)^{t/2}.$$

  41. Main Lemma Proof Cont.
• According to the next bit predictor, the probability of a successful prediction is at least 1/(2√ℓ).
• In the i'th iteration we make q predictions (as many points as there are on the curve).
• Using the tail bound provides the result. Q.E.D. (of the claim)
Main Lemma Proof (cont.): Therefore, w.h.p. there are at least q/(4√ℓ) evaluation points of P_i that agree with the degree-2hr polynomial on the i'th curve (out of a total of at most ℓq candidate pairs).

  42. Main Lemma Proof Cont.
• A list-decoding bound: given n distinct pairs (x_i, y_i) in a field F and parameters k and d with k > (2dn)^{1/2}, there are at most 2n/k degree-d polynomials g such that g(x_i) = y_i for at least k pairs. Furthermore, a list of all such polynomials can be computed in time poly(n, log|F|).
• Using this bound with the previous claim (n = ℓq pairs, k = q/(4√ℓ)), at most 2n/k = 8ℓ^{3/2} degree-2rh polynomials agree on this number of points (q/(4√ℓ)).

  43. Lemma Proof Cont.
• Now,
  • P_i intersects P_{i−1} at r random positions, and
  • we know the evaluation of z at the points in P_{i−1}.
• Two degree-2rh polynomials can agree on at most a 2rh/q fraction of their points,
• so the probability that an “incorrect” polynomial among our candidates agrees on all r random intersection points is at most (2rh/q)^r.

  44. Main Lemma Proof Cont.
• So, with probability at least 1 − 1/(4q^d), we learn the points of P_i successfully.
• After 2q^d prediction steps, we have learned z on F^d \ {0} (since A is a generator of F^d \ {0}).
• By the union bound, the probability that every step of the reconstruction is successful is at least ½ (see the arithmetic below). Q.E.D. (main lemma)
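
The omitted arithmetic, reconstructed under assumed per-step failure bounds of 1/(8q^d) each for the tail-bound event and the wrong-candidate event (the transcript does not show the slide's exact constants):

```latex
% Reconstructed accounting; the slide's own constants are not shown.
\Pr[\text{step } i \text{ fails}]
  \;\le\; \frac{1}{8q^{d}} + \frac{1}{8q^{d}} \;=\; \frac{1}{4q^{d}},
\qquad
\Pr[\exists \text{ failing step}]
  \;\le\; 2q^{d}\cdot\frac{1}{4q^{d}} \;=\; \frac{1}{2}.
```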

  45. Proof of Main Theorem Cont.
• First, by the main lemma, for every good x: Pr over the coins of R that ∃z with R(z) = x is at least ½.
• By an averaging argument, the same bound holds in expectation over the coins of R for x ~ X.
• Therefore, there must be a fixing of the coins of R such that Pr_{x~X}[∃z: R(z) = x] ≥ ½.

  46. Using N.B.P. – Take 2
[Figure: the prediction step applied along entire curves.]
Use the N.B.P. over all points in F, so that we get enough “good evaluations”.

  47. Proof of Main Theorem Cont.
• According to the counting argument, this implies that k ≤ a + 1 = O(m·h·d·log q).
• Recall that r = Θ(d).
• A contradiction to the parameter choice. Q.E.D. (main theorem)!

  48. From q-ary extractors to (regular) extractors
The simple technique – using error-correcting codes:
Lemma: Let F be a field with q elements. Let C: {0,1}^{log q} → {0,1}^n be a binary error-correcting code with distance at least ½ − O(r²). If E: {0,1}^n × {0,1}^t → F^m is a (k, O(r)) q-ary extractor, then E′: {0,1}^n × {0,1}^{t+log n} → {0,1}^m defined by
E′(x; (y,j)) = C(E(x;y)_1)_j ∘ C(E(x;y)_2)_j ∘ ⋯ ∘ C(E(x;y)_m)_j
is a (k, rm) binary extractor. (A sketch of this transformation appears below.)
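
A toy Python sketch of this transformation (my own illustration with q = 8 and a Hadamard code standing in for C, which has relative distance ½; E itself is abstracted to its output symbols):

```python
q = 8                  # toy field size; symbols are integers < q

def C(sym):
    """Stand-in binary code for a log(q)-bit symbol: Hadamard encoding,
    bit j = <sym, j> over GF(2); relative distance 1/2."""
    return [bin(sym & j).count("1") % 2 for j in range(q)]

def E_prime(qary_output, j):
    """E'(x; (y, j)): bit j of the encoding of every symbol of E(x; y)."""
    return [C(sym)[j] for sym in qary_output]

print(E_prime([3, 5, 0, 7], j=2))   # one binary output bit per q-ary symbol
```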

  49. From q-ary extractors to (regular) extractors
A more complex transformation from q-ary extractors to binary extractors achieves the following parameters:
Thm: Let F be a field with q < 2^m elements. There is a polynomial-time computable function B such that for any (k, r) q-ary extractor E, E′(x; (y,j)) = B(E(x;y), j) is a (k, r·log* m) binary extractor.
