
Simple Extractors for All Min-Entropies and a New Pseudo-Random Generator



  1. Simple Extractors for All Min-Entropies and a New Pseudo-Random Generator Ronen Shaltiel (Hebrew U) & Chris Umans (MSR) 2001

  2. Motivation • Good extractors exist, but are either: • Very complex (recursive, iterated, composed) • Or work only for high min-entropy (TZS) • Either Ω(n) min-entropy, or n^(1/c) min-entropy with a (log n + O(c^2·log m))-bit seed • All previously-known PRGs are based on the original NW construction • One other construction exists, but it requires stronger assumptions

  3. Contributions of This Paper • New extractor construction • Similar to TZS • Requires less min-entropy • New PRG construction • Based on the above extractor • No big improvement in parameters • Both match the current best • But a simpler, self-contained construction

  4. Overview of This Talk • Introduction • TZS Reminder • New extractors • New ideas • Construction • Proof • Introduction to PRGs • New PRGs

  5. TZS Extractors • Basic idea: view the input x as a bivariate polynomial x̂ ∈ F_q[y1, y2] • View the seed y as a pair <y1, y2> • The extractor output is a sequence of successive evaluations of x̂ starting at the seed point • This is a q-ary extractor (its output alphabet is F_q)
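A minimal sketch of this view, with illustrative parameters; the slide's exact output formula did not survive, so the choice of "m successive evaluations along the first coordinate" is an assumption:

```python
# Sketch (assumed details): view the n-bit input x as the coefficient vector
# of a bivariate polynomial over F_q with degree < h in each variable
# (h^2 >= n), and output m successive evaluations from the seed (y1, y2).
import random

q, h, m = 101, 4, 5          # illustrative parameters, q a small prime
n = h * h                    # one coefficient per bit of x

random.seed(1)
x = [random.randrange(2) for _ in range(n)]  # a sample from the weak source

def x_hat(y1, y2):
    """x viewed as a polynomial: bit (i, j) is the coefficient of y1^i y2^j."""
    return sum(x[i * h + j] * pow(y1, i, q) * pow(y2, j, q)
               for i in range(h) for j in range(h)) % q

def extract(y1, y2):
    """q-ary output: m successive evaluations along the first coordinate."""
    return [x_hat((y1 + t) % q, y2) for t in range(m)]

print(extract(10, 20))  # m symbols over F_q
```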

  6. Reconstruction Paradigm • Assume a next-symbol predictor f: F^(i-1) → F^c, for small c = ρ^(-2) • Show there exists a function R_f(z) s.t.: • For a large fraction of x ∈ X, • There exists z s.t. R_f(z) = x • If k > |z|, we get a contradiction.

  7. TZS Reconstruction • Let L be a random line in F^2 • x|L is a low-degree univariate polynomial: only deg(x|L) + 1 ≈ h points are needed to know the value of x on all of L • Get h(i-1) values from the advice string for i-1 successive parallel lines • Use the predictor f to predict the next line
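The line-restriction fact this step relies on can be checked directly. A sketch with illustrative parameters (the field size p, the degree bound h, and the particular line are arbitrary choices, not the paper's):

```python
# Restricting a low-degree bivariate polynomial over F_p to a line gives a
# low-degree univariate polynomial, so a few evaluations determine all of it.
import random

p = 101  # small prime field F_p (illustrative)
h = 3    # degree bound per variable

random.seed(0)
coeffs = [[random.randrange(p) for _ in range(h + 1)] for _ in range(h + 1)]

def x_hat(y1, y2):
    """Evaluate the bivariate polynomial at (y1, y2) over F_p."""
    return sum(coeffs[i][j] * pow(y1, i, p) * pow(y2, j, p)
               for i in range(h + 1) for j in range(h + 1)) % p

# A line L(t) = (a1 + t*d1, a2 + t*d2); x restricted to L has degree <= 2h.
a1, a2, d1, d2 = 7, 11, 3, 5
def on_line(t):
    return x_hat((a1 + t * d1) % p, (a2 + t * d2) % p)

def lagrange_eval(pts, t):
    """Evaluate the unique polynomial through pts = [(ti, vi)] at t, mod p."""
    total = 0
    for i, (ti, vi) in enumerate(pts):
        num, den = 1, 1
        for j, (tj, _) in enumerate(pts):
            if i != j:
                num = num * (t - tj) % p
                den = den * (ti - tj) % p
        total = (total + vi * num * pow(den, p - 2, p)) % p  # Fermat inverse
    return total

# 2h + 1 evaluations determine x on the entire line:
pts = [(t, on_line(t)) for t in range(2 * h + 1)]
assert all(lagrange_eval(pts, t) == on_line(t) for t in range(p))
print("2h+1 =", 2 * h + 1, "points determine all", p, "points on the line")
```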

  8. Details, Details… • The predictor f is often wrong • But points on L are pairwise independent • So Chebyshev bounds the probability that fewer than h predictions are correct • f predicts lists of ρ^(-2) possible values • Fix: add to the advice string the true value of x at a random point of L • W.h.p., it agrees only with the true candidate • Requires O(m) more advice values

  9. Last Comments • We described a bivariate extractor; it generalizes to d-variate • This reduces h (h^d > n suffices), which is good • However, we then need to predict hd values per step, so we end up losing more than we gain • We’ve already seen how to convert a q-ary extractor into a binary one.

  10. Pseudo-Random Generators • The computational analogue of extractors • Many (theoretical) applications

  11. PRG: Formal Definition • An ε-PRG for size s is a function G: {0,1}^t → {0,1}^m s.t. for any circuit C of size < s: |Pr[C(G(U_t)) = 1] - Pr[C(U_m) = 1]| ≤ ε • Equivalent to next-bit predictors: no function f of size s can satisfy Pr_y[f(G(y)_1, …, G(y)_(i-1)) = G(y)_i] ≥ 1/2 + ε/m

  12. q-ary PRGs • Analogous to q-ary extractors • A ρ-q-ary PRG has no next-symbol predictor f: F_q^(i-1) → F_q^c, where c = ρ^(-2), whose output list contains the next symbol too often • Like extractors, q-ary PRGs can be converted to binary ones.

  13. Main Idea • Basically the same as the extractor • Use a hard predicate x(i) instead of a weak random source • PRGs are built from hard predicates: poly-time computable functions that require large circuits • Prove via the reconstruction paradigm • A predictor implies we can compute the hard function with a small circuit

  14. Problem… and Solution • We need too many prediction steps • Need to compute x(i) for any i • This increases the circuit size • Solution: predict in jumps of growing sizes 1, m, m^2, …, m^(ℓ-1) • Use ℓ different PRG “candidates” • Each uses a different step size • If none is really a PRG, we can predict at every step size • The XOR of all candidates is a PRG.

  15. Some Definitions • Let x: {0,1}^(log n) → {0,1} be a hard function (no circuit smaller than s computes it) • Let F′ be a subfield of F with |F′| = h • Need h^d > n • Let A be the successor matrix of F^d, and A′ that of F′^d • Let 1 be the all-ones vector in F^d • Note 1 ∈ F′^d as well
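A minimal sketch of what a successor matrix looks like, in a tiny hypothetical case (h = 2, d = 3, not the paper's parameters): multiplication by a generator of F_{h^d}* is an F_h-linear map on F_h^d, so its matrix steps through every nonzero vector:

```python
# Hypothetical small successor matrix: the companion matrix of the primitive
# polynomial x^3 + x + 1 over F_2 realizes multiplication by a generator of
# F_8*, so repeatedly applying it visits all nonzero vectors of F_2^3.
h, d = 2, 3
A = [[0, 0, 1],
     [1, 0, 1],
     [0, 1, 0]]

def step(v):
    """Apply A to the column vector v over F_2."""
    return tuple(sum(A[r][c] * v[c] for c in range(d)) % h for r in range(d))

v = (1, 1, 1)  # the all-ones vector, as in the construction
seen = set()
for _ in range(h ** d - 1):
    seen.add(v)
    v = step(v)

assert v == (1, 1, 1)            # A has order h^d - 1 on nonzero vectors
assert len(seen) == h ** d - 1   # the orbit covers every nonzero vector
print("orbit size:", len(seen))
```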

  16. Construction • Define, for j = 1, …, ℓ, a candidate generator G_x^(j) • Each of these corresponds to one of the jump lengths m^(j-1) • To get a PRG, we XOR all of them.

  17. Proof • Need h to be a prime power, and q a power of h • We want a polynomial x̂ with x̂(A′^i · 1) = x(i) • F′^d is big enough to find one • x̂ has degree ≤ h in each variable, total degree ≤ hd • x only takes values in F′, and these have order h

  18. Proof (Cont.) • Assume none of the ℓ candidates is good • Let f^(j) be the predictor for G_x^(j) • We reconstruct x from these predictors using a small circuit (contradiction!) • The advice string contains the value of x at m consecutive places • Actually, on m consecutive curves • Use the same overlapped prediction process as before (almost…)

  19. Stepping Scheme • Denote the first advice value by A^a·1; we want to reach i = A^b·1 • First, predict A^(c1)·1, where c1 agrees with b in the lowest m-ary digit • Next, predict A^(c2)·1, where c2 agrees with b in the two lowest m-ary digits • Continue until we can predict i.
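The digit-matching schedule above can be sketched as follows, using the example values from the slides (m = 5, (a)_m = 134, (b)_m = 302); the helper names are illustrative:

```python
# Sketch of the stepping schedule: from start index a, reach b by matching
# b's lowest m-ary digit with unit steps, then the next digit with steps of
# size m, and so on. Adding m^j never disturbs digits below position j.
m, ell = 5, 3  # slide example: base 5, three digits

def digits(x):
    """The ell lowest base-m digits of x, lowest-order first."""
    return [(x // m**j) % m for j in range(ell)]

def schedule(a, b):
    """Intermediate targets c_1, ..., c_ell; c_j matches b's j lowest digits."""
    targets, cur = [], a
    for j in range(ell):
        while digits(cur)[j] != digits(b)[j]:
            cur += m**j  # one prediction step of size m^j
        targets.append(cur)
    return targets

a = int("134", 5)  # (a)_m = 134
b = int("302", 5)  # (b)_m = 302
c1, c2, c3 = schedule(a, b)
assert (c1, c2, c3) == (int("142", 5), int("202", 5), b)  # the slides' c1, c2
print("targets:", c1, c2, c3)
```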

  20.–33. Stepping Scheme: Example (animation frames; m = 5, (a)_m = 134, (b)_m = 302) • Unit steps with f^(0) from the advice window a, a+1, …, a+m-1 reach c1 with (c1)_m = 142, matching b’s lowest digit • Steps of size m with f^(1) then reach c2 with (c2)_m = 202, matching b’s two lowest digits • Finally, steps of size m^2 reach b

  34. One More Snag • We predict along curves in an interleaved fashion • The curves need to intersect at random points • But now we are changing step sizes • For every i and every step size S = m^j, we need the curves A^i·p1 and A^(i+S)·p2 to intersect in r random points • This can be done if the curve degree is ℓr.

  35. Results • Given a hard predicate on log n bits • Computable in poly(n) time • Minimum circuit size s • We construct a (1/m)-PRG for size m • m = s^Ω(1) • Seed length t = O(log^2 n / log s) • Output length m • Computable in poly(n) time
