
How to Grow Your Lower Bounds

Presentation Transcript


  1. How to Grow Your Lower Bounds Mihai Pătrașcu Tutorial, FOCS’10

  2. A Personal Story. MIT freshman, 2002 … half a year and no solution later. What problem could I work on? P vs. NP. How far did you guys get, anyway?

  3. What lower bounds can we prove? • “Partial Sums” problem: maintain an array A[n] under update(k, Δ): A[k] = Δ, and query(k): return A[1] + … + A[k] • (Augmented) binary search trees give tu = tq = O(lg n) • Open problem: max { tu, tq } = Ω(lg n) • Ω(lg n) is not known for any dynamic problem
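
To make the O(lg n) upper bound concrete, here is a minimal sketch of the classic solution via a Fenwick (binary indexed) tree, equivalent in spirit to the augmented BST mentioned above. The class and method names are just for illustration.

```python
class PartialSums:
    """Fenwick / binary indexed tree: update and query both cost O(lg n) word operations."""
    def __init__(self, n):
        self.n = n
        self.vals = [0] * (n + 1)   # current A[k], so "A[k] = delta" becomes an increment by a diff
        self.tree = [0] * (n + 1)   # Fenwick array over indices 1..n

    def update(self, k, delta):     # update(k, delta): A[k] = delta
        diff = delta - self.vals[k]
        self.vals[k] = delta
        while k <= self.n:          # climb the implicit tree
            self.tree[k] += diff
            k += k & (-k)

    def query(self, k):             # query(k): return A[1] + ... + A[k]
        s = 0
        while k > 0:
            s += self.tree[k]
            k -= k & (-k)
        return s
```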

  4. What kind of “lower bound”? Lower bounds you can trust.™ • Memory: array of S words • Word: w = Ω(lg S) bits • Unit-time operations: random access to memory; +, -, *, /, %, <, >, ==, <<, >>, ^, &, |, ~ • Internal state: O(w) bits • Hardware: TC0 [diagram: the CPU sends an address, the memory returns Mem[address]]

  5. What kind of “lower bound”? Lower bounds you can trust.™ The Cell-Probe Model: • Memory: array of S words • Word: w = Ω(lg S) bits • Unit-time operations: random access to memory; any function of two words (nonuniform) [diagram: the CPU sends an address, the memory returns Mem[address]]
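
As a toy illustration (not from the tutorial) of the cell-probe cost measure, only memory accesses are charged and all computation between probes is free. One can make this concrete by wrapping memory in a probe counter; the class name below is hypothetical.

```python
class ProbeCountingMemory:
    """Cell-probe accounting: only reads and writes of memory cells are counted."""
    def __init__(self, size):
        self.cells = [0] * size     # array of S words
        self.probes = 0             # number of cell probes so far

    def read(self, address):
        self.probes += 1
        return self.cells[address]

    def write(self, address, word):
        self.probes += 1
        self.cells[address] = word
```

Any data structure implemented on top of read/write is then measured purely by `probes`, matching a model in which each probe may be followed by an arbitrary (nonuniform) computation on the words seen so far.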

  6. Maintain an array A[n] under update(k, Δ): A[k] = Δ and query(k): return A[1] + … + A[k]. Theorem: max { tu, tq } = Ω(lg n) [Pătrașcu, Demaine SODA’04]. I will give the full proof.

  7. Maintain an array A[n] under update(k, Δ): A[k] = Δ and query(k): return A[1] + … + A[k]. The hard instance: π = random permutation; for t = 1 to n: query(π(t)); Δt = random() mod 2^w; update(π(t), Δt). [figure: the updates Δ1, …, Δ16 plotted on an array-index vs. time grid]
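
A short sketch of how one might generate this hard instance for an experiment, assuming a word size w and any structure with the update/query interface above (e.g. the Fenwick sketch earlier):

```python
import random

def hard_instance(ds, n, w=64):
    """Interleave queries and updates along a random permutation of the indices."""
    pi = list(range(1, n + 1))
    random.shuffle(pi)                           # pi = random permutation of 1..n
    answers = []
    for t in range(1, n + 1):
        answers.append(ds.query(pi[t - 1]))      # query(pi(t))
        delta = random.getrandbits(w)            # Delta_t = random() mod 2^w
        ds.update(pi[t - 1], delta)              # update(pi(t), Delta_t)
    return answers
```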

  8. Split the timeline: let Mac execute the earlier operations and PC the operations t = 9,…,12. W = cells written (by Mac), R = cells read (by PC). How can Mac help PC run t = 9,…,12? It is enough to send the address and contents of the cells in W ∩ R. [figure: the updates on the index/time grid, with the two periods highlighted]

  9. How much information needs to be transferred? In the example, the query answers are partial sums such as Δ1, Δ1+Δ5+Δ3, Δ1+Δ5+Δ3+Δ7+Δ2, and Δ1+Δ5+Δ3+Δ7+Δ2+Δ6+Δ4; from them PC learns Δ5, Δ5+Δ7, Δ5+Δ6+Δ7 ⇒ entropy ≥ 3 words. [figure: the query answers annotated on the index/time grid]

  10. The general principle: consider a window of 2k time steps, with the k updates of its first (beige) half and the k queries of its second (mauve) half. Message entropy ≥ w · #down arrows, and E[#down arrows] = (2k−1) · Pr[left endpoint in the beige half] · Pr[right endpoint in the mauve half] = (2k−1) · ½ · ½ = Ω(k). Hence #memory cells written during the beige period and read during the mauve period = Ω(k).
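
A rough Monte Carlo check of this counting step, under one natural reading of the figure: sort the 2k indices touched in the window, and call an adjacent pair a "down arrow" when its left index lands in the beige half and its right index in the mauve half.

```python
import random

def expected_down_arrows(k, trials=10000):
    """Estimate E[#down arrows] for a window with k beige and k mauve operations."""
    total = 0
    for _ in range(trials):
        # Assign each of the 2k (index-sorted) operations to a half uniformly at random.
        half = ['beige'] * k + ['mauve'] * k
        random.shuffle(half)
        total += sum(1 for i in range(2 * k - 1)
                     if half[i] == 'beige' and half[i + 1] == 'mauve')
    return total / trials

# For any k this concentrates around k/2, i.e. Theta(k); e.g. expected_down_arrows(8) ~ 4.
```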

  11. Putting it all together: build a balanced binary tree over the time axis and apply the principle at every node, so every read instruction is counted once, at lowest_common_ancestor(write time, read time). The nodes contribute Ω(n/2), 2 · Ω(n/4), 4 · Ω(n/8), …, for a total of Ω(n lg n).
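
Spelling out the total from the figure (level ℓ of the tree has 2^(ℓ−1) nodes, each contributing Ω of half its window size n/2^(ℓ−1)):

```latex
\sum_{\ell = 1}^{\lg n} \;\sum_{\text{nodes at level } \ell}
\Omega\!\Big(\frac{n}{2^{\ell}}\Big)
\;=\; \sum_{\ell = 1}^{\lg n} 2^{\ell - 1} \cdot \Omega\!\Big(\frac{n}{2^{\ell}}\Big)
\;=\; \lg n \cdot \Omega\!\Big(\frac{n}{2}\Big)
\;=\; \Omega(n \lg n).
```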

  12. Q.E.D. The optimal solution for maintaining partial sums = binary search trees

  13. What were people trying before? [Fredman, Saks STOC’89]: Ω(lg n / lg lg n). The hard instance: π = random permutation; for t = 1 to n: Δt = random() mod 2^w, update(π(t), Δt); then query(random() mod n). [figure: the n updates on the index/time grid, followed by the query]

  14. Epochs: build epochs of (lg n)^i updates each, and let Wi = cells last written in epoch i. Claim: E[#cells read by the query from Wi] = Ω(1) ⇒ E[tq] = Ω(lg n / lg lg n). [figure: the time axis partitioned into geometrically growing epochs, with the query at the end]
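
Filling in the arithmetic behind the claim: epoch i contains (lg n)^i updates, so the number of epochs r needed to cover all n updates satisfies

```latex
\sum_{i=1}^{r} (\lg n)^i \;\approx\; (\lg n)^r \;\ge\; n
\quad\Longrightarrow\quad
r \;=\; \Theta\!\big(\log_{\lg n} n\big) \;=\; \Theta\!\Big(\frac{\lg n}{\lg\lg n}\Big),
```

and if the query reads Ω(1) cells from each Wi in expectation, then E[tq] = Ω(r) = Ω(lg n / lg lg n).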

  15. Epochs: focus on some epoch i, and let x = E[#cells read by the query from Wi]. Generate (lg n)^i random queries. [figure: epoch i highlighted on the time axis]

  16. [figure only: the updates on the time axis with epoch i highlighted]

  17. Entropy = Ω(w · lg^i n) bits. A possible message: all of W0 ∪ W1 ∪ … ∪ Wi−1, which is Σ_{j<i} (lg n)^j · tu = O(lg^{i−1} n · tu) cells, plus the cells read by the queries from Wi, which is E[#cells] = x · lg^i n ⇒ x = Ω(1). Q.E.D.

  18. Dynamic Lower Bounds
    1989 [Fredman, Saks] partial sums, union-find
    1991 [Ben-Amram, Galil]
    1993 [Miltersen, Subramanian, Vitter, Tamassia]
    1996 [Husfeldt, Rauhe, Skyum]
    1998 [Fredman, Henzinger] dynamic connectivity; [Husfeldt, Rauhe] nondeterminism; [Alstrup, Husfeldt, Rauhe] marked ancestor
    1999 [Alstrup, Ben-Amram, Rauhe] union-find
    2001 [Alstrup, Husfeldt, Rauhe] dynamic 2D NN
    2004 [Pătrașcu, Demaine] partial sums Ω(lg n); [Pătrașcu, Demaine] dynamic connectivity
    2005 [Pătrașcu, Tarniță] Ω(lg n) by epochs
    2010 [Pătrașcu] proposal for n^Ω(1); [Verbin, Zhang] buffer trees
    2011 [Iacono, Pătrașcu] buffer trees; [Pătrașcu, Thorup] dynamic connectivity, union-find

  19. Marked Ancestor: maintain a perfect B-ary tree under mark(v) / unmark(v) and query(v): does v have a marked ancestor?
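
For concreteness, a naive baseline for marked ancestor (not the structure the lower bound is about): store a parent pointer and a mark bit per node, so mark/unmark take O(1) and a query walks to the root in O(tree height) time. A minimal sketch with hypothetical class names; whether v itself counts as an ancestor is a detail the slide leaves open, and here it does.

```python
class Node:
    """A node of the B-ary tree: parent pointer plus a mark bit."""
    def __init__(self, parent=None):
        self.parent = parent
        self.marked = False

def mark(v):
    v.marked = True

def unmark(v):
    v.marked = False

def query(v):
    """Does v or any of its ancestors carry a mark?  O(height) time."""
    while v is not None:
        if v.marked:
            return True
        v = v.parent
    return False
```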

  20. Marked Ancestor. [figure only]

  21. Marked Ancestor: P[marked] ≈ 1/log_B n; Wi = cells written by all versions. [figure: versions 2, …, lg n laid out on the time axis, with the query]

  22. Reductions from Marked Ancestor. Dynamic 1D stabbing: maintain a set of segments S = { [a1,b1], [a2,b2], … } under insert / delete and query(x): is x ∈ [ai, bi] for some [ai, bi] ∈ S? Marked ancestor ↦ dynamic 1D stabbing ↦ dynamic 2D range reporting.
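
A sketch of the first reduction as I read it (the slide only names the arrow): each tree node owns the contiguous interval of leaf positions in its subtree, so mark(v)/unmark(v) become insert/delete of v's leaf interval, and a marked-ancestor query at a leaf becomes a stabbing query at that leaf's position. Here `leaf_interval` and the attributes it uses are hypothetical helpers, and `stabbing` is any dynamic 1D stabbing structure.

```python
def leaf_interval(v):
    """Hypothetical helper: the inclusive range of leaf positions in v's subtree."""
    return (v.first_leaf, v.last_leaf)

class MarkedAncestorViaStabbing:
    """Marked ancestor reduced to dynamic 1D stabbing (queries at leaves)."""
    def __init__(self, stabbing):
        self.stabbing = stabbing            # supports insert(seg), delete(seg), query(x)
        self.interval_of = {}

    def mark(self, v):
        seg = leaf_interval(v)
        self.interval_of[id(v)] = seg
        self.stabbing.insert(seg)

    def unmark(self, v):
        self.stabbing.delete(self.interval_of.pop(id(v)))

    def query_leaf(self, leaf_position):
        # leaf_position lies in v's leaf interval  iff  v is an ancestor of that leaf
        return self.stabbing.query(leaf_position)
```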

  23. Dynamic Lower Bounds
    1989 [Fredman, Saks] partial sums, union-find
    1991 [Ben-Amram, Galil]
    1993 [Miltersen, Subramanian, Vitter, Tamassia]
    1996 [Husfeldt, Rauhe, Skyum]
    1998 [Fredman, Henzinger] dynamic connectivity; [Husfeldt, Rauhe] nondeterminism; [Alstrup, Husfeldt, Rauhe] marked ancestor
    1999 [Alstrup, Ben-Amram, Rauhe] union-find
    2001 [Alstrup, Husfeldt, Rauhe] dynamic 2D NN
    2004 [Pătrașcu, Demaine] partial sums Ω(lg n); [Pătrașcu, Demaine] dynamic connectivity
    2005 [Pătrașcu, Tarniță] Ω(lg n) by epochs
    2010 [Pătrașcu] proposal for n^Ω(1); [Verbin, Zhang] buffer trees
    2011? [Iacono, Pătrașcu] buffer trees; [Pătrașcu, Thorup] dynamic connectivity, union-find

  24. Dynamic Lower Bounds [Pătraşcu, Demaine’04]. Partial sums: max { tu, tq } = B·lg n, min { tu, tq } = log_B n. [plot: the tq vs. tu trade-off; axis labels lg n, n^ε, n^{1−o(1)}, with the curve max = Ω(lg n)]

  25. Dynamic Lower Bounds [Pătraşcu, Thorup’10]. Dynamic connectivity: maintain an acyclic undirected graph under insert / delete of edges and connected(u,v): is there a path from u to v? • tu = B·lg n, tq = log_B n • tu = o(lg n) ⇒ tq ≥ n^{1−o(1)} [plot: the same tq vs. tu trade-off picture]

  26. W = cells written by the k updates, R = cells read by the k queries; the entropy lower bound, Ω(k·w) bits, must be carried by R ∩ W. [diagram: the updates and queries with R ∩ W between them]

  27. Now suppose tu = o(lg n) and tq = n^{1−ε}: against k updates we can only afford k/n^{1−ε} queries, so the entropy lower bound is only ≤ k/n^{1−ε} words — too weak on its own. [diagram: k updates, then k more updates, then k/n^{1−ε} queries, with W = cells written and R = cells read]

  28. Partial sums: Mac doesn’t care about PC’s updates ⇒ communication complexity ≈ k/n^{1−ε}. Dynamic connectivity: nontrivial interaction between Mac’s and PC’s edges ⇒ communication complexity = Ω(k lg n), with public coins. [diagram: Mac’s k updates, PC’s k updates and k/n^{1−ε} queries, and the communication between them]

  29. We want some quantity of the data structure with ??? ≥ communication complexity ≥ Ω(k lg n). [diagram as on the previous slide, with W = cells written and R = cells read]

  30. Note: |R|, |W| = o(k·lg n). The trivial protocol sends O(|R|·lg n) bits ≥ communication complexity ≥ Ω(k lg n). [diagram as before]

  31. Note: |R|, |W| = o(k·lg n). Since |R∩W|·lg n + |R| + |W| ≥ nondeterministic communication complexity ≥ Ω(k lg n), it follows that |R ∩ W| = Ω(k). [diagram as before]

  32. Dynamic Lower Bounds
    1989 [Fredman, Saks] partial sums, union-find
    1991 [Ben-Amram, Galil]
    1993 [Miltersen, Subramanian, Vitter, Tamassia]
    1996 [Husfeldt, Rauhe, Skyum]
    1998 [Fredman, Henzinger] dynamic connectivity; [Husfeldt, Rauhe] nondeterminism; [Alstrup, Husfeldt, Rauhe] marked ancestor
    1999 [Alstrup, Ben-Amram, Rauhe] union-find
    2001 [Alstrup, Husfeldt, Rauhe] dynamic 2D NN
    2004 [Pătrașcu, Demaine] partial sums Ω(lg n); [Pătrașcu, Demaine] dynamic connectivity
    2005 [Pătrașcu, Tarniță] Ω(lg n) by epochs
    2010 [Pătrașcu] proposal for n^Ω(1); [Verbin, Zhang] buffer trees
    2011? [Iacono, Pătrașcu] buffer trees; [Pătrașcu, Thorup] dynamic connectivity, union-find

  33. The Multiphase Problem. Dynamic reachability: maintain a directed graph under insert / delete of edges and connected(u,v): ∃ a path from u to v? Hard-looking instance: S1, …, Sk ⊆ [m]; then T ⊆ [m]; then a query “Si ∩ T?”. [figure: a layered graph with nodes for S1, …, Sk, the m elements, and T, laid out along the time axis]

  34. The Multiphase Problem. Conjecture: if m·tu ≪ k, then tq = Ω(m^ε). The hard-looking instance in three phases: S1, …, Sk ⊆ [m] arrive (time O(k·m·tu)); then T ⊆ [m] arrives (time O(m·tu)); then the query “Si ∩ T?” (time O(tq)).

  35. The Multiphase Problem. Conjecture: if m·tu ≪ k, then tq = Ω(m^ε). This follows from the 3SUM Conjecture. 3SUM: given S = { n numbers }, is there a, b, c ∈ S with a + b + c = 0? Conjecture: 3SUM requires Ω*(n²) time. [figure: the three phases as before]
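
For context, the classic algorithm behind the Ω*(n²) conjecture: sort, then sweep with two pointers for each fixed first element, O(n²) time in total. A minimal sketch (triples at distinct positions, values may repeat):

```python
def has_3sum(numbers):
    """Is there a triple at distinct positions summing to zero?  O(n^2) time."""
    a = sorted(numbers)
    n = len(a)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = a[i] + a[lo] + a[hi]
            if s == 0:
                return True
            if s < 0:
                lo += 1        # need a larger sum
            else:
                hi -= 1        # need a smaller sum
    return False
```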

  36. The Multiphase Problem. Conjecture: if m·tu ≪ k, then tq = Ω(m^ε). Attack on an unconditional proof: 3-party number-on-forehead communication, with inputs T; S1, …, Sk; and the index i. [figure: the three phases as before]

  37. Static Data Structures

  38. Asymmetric Communication Complexity. Model the query as communication between the CPU (input: O(w) bits) and the memory (input: n bits): the CPU sends lg S bits per round, the memory replies with w bits. We can’t just look at total communication.

  39. Indexing: Alice holds i ∈ [m], Bob holds v ∈ {0,1}^m, and the goal is to compute v[i]. Theorem. Either Alice communicates a = Ω(lg m) bits, or Bob communicates b ≥ m^{1−ε} bits.

  40. Indexing, proof idea: pass to a large rectangle of the protocol, of density μ ≈ 2^{−a} on Alice’s side ([m]) and μ ≈ 2^{−b} on Bob’s side ({0,1}^m); on such a rectangle, m/2^a positions of v are fixed to 1 ⇒ b ≥ m/2^a.

  41. Lopsided Set Disjointness: Alice holds A (n elements) and Bob holds B, both subsets of an n × m universe; decide A ∩ B = ∅? Theorem. Either Alice communicates Ω(n·lg m) bits, or Bob communicates ≥ n·m^{1−ε} bits. Proof: direct sum on Indexing — deterministic: trivial; randomized: [Pătraşcu’08]. [figure: the n × m grid with Alice’s and Bob’s elements]

  42. A Data Structure Lower Bound. Partial match: preprocess a database D of strings in {0,1}^d; query(x ∈ {0,1,*}^d): does x match anything in D? Reduction from LSD, via a constant-weight code C : [m] → {0,1}^{3 lg m}: A = { n elements } = { (1,x1), …, (n,xn) } ↦ query(C(x1) ∘ … ∘ C(xn)); B = { ½mn elements } ↦ D = { ½mn strings }, with (i, xi) ↦ 0 ∘ … ∘ 0 ∘ C(xi) ∘ 0 ∘ … (the code placed in block i). [figure: the n × m grid]
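
A hedged sketch of how one might instantiate this reduction end to end. The slide does not say where the wildcards go; one placement that works is to put a * at every 1-position of C(x_j) in the query, so that a database string matches exactly when some (i, y) ∈ B has y = x_i. The code C below (length 2·lg m rather than 3·lg m) and all names are illustrative; any injective constant-weight code would do for this sketch. Indices are 0-based here.

```python
def encode(y, bits):
    """A simple constant-weight code of length 2*bits and weight bits:
    each bit of y maps to '01' if set and '10' otherwise."""
    return ''.join('01' if (y >> j) & 1 else '10' for j in range(bits))

def lsd_to_partial_match(A, B, n, m):
    """A = [x_0, ..., x_{n-1}] (Alice's element per row), B = iterable of (i, y) pairs.
    Returns (query, database) so that the query matches some database string
    iff A and B intersect, i.e. y == x_i for some (i, y) in B."""
    bits = max(1, (m - 1).bit_length())
    block = 2 * bits
    # Query: keep '0' where C(x_i) is 0, put '*' where C(x_i) is 1.
    query = ''.join(encode(x, bits).replace('1', '*') for x in A)
    # Database: the string for (i, y) is all zeros except C(y) in block i.
    database = ['0' * (block * i) + encode(y, bits) + '0' * (block * (n - i - 1))
                for (i, y) in B]
    return query, database

def matches(query, s):
    """Partial-match semantics: '*' matches anything."""
    return all(q == '*' or q == c for q, c in zip(query, s))
```

With these definitions, `any(matches(query, s) for s in database)` is True exactly when A ∩ B ≠ ∅, since equal-weight codewords with nested supports must be identical.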

  43. LSD(n, m) ↦ Partial Match with d = Θ(n·lg m). The LSD lower bound “Alice sends Ω(n·lg m) bits, or Bob sends ≥ n·m^{1−ε} bits” translates to “the CPU sends Ω(d) bits, or the memory sends ≥ |D|^{1−ε} bits” (same reduction as on the previous slide).

  44. A Data Structure Lower Bound, continued: t·lg S = Ω(d) or t·w ≥ |D|^{1−ε} ⇒ S = 2^{Ω(d/t)}. The upper bound is of the same flavor: either exponential space or near-linear query time. [plot: query time tq (from 1 up to n^{1−o(1)}, with O(d/lg n) marked) against space S (from O(n) up to 2^{O(d)})]

  45. Space Lower Bounds for tq = O(1)
    1995 [Miltersen, Nisan, Safra, Wigderson]
    1999 [Borodin, Ostrovsky, Rabani] partial match
    2000 [Barkol, Rabani] randomized exact NN: S = 2^Ω(d)
    2003 [Jayram, Khot, Kumar, Rabani] partial match
    2004 [Liu] deterministic O(1)-apx NN: S = 2^Ω(d)
    2006 [Andoni, Indyk, Pătraşcu] randomized (1+ε)-apx NN: S = n^Ω(1/ε²); [Pătraşcu, Thorup] direct sum for near-linear space
    2008 [Pătraşcu] partial match: S = 2^Ω(d); [Andoni, Croitoru, Pătraşcu] ℓ∞: apx = Ω(log_ρ log d) if S = n^ρ; [Panigrahy, Talwar, Wieder] c-apx NN: S ≥ n^{1+Ω(1/c)}
    2009 [Sommer, Verbin, Yu] c-apx distance oracles: S ≥ n^{1+Ω(1/c)}
    2010 [Panigrahy, Talwar, Wieder]

  46. Asymmetric Communication Complexity. In the CPU–memory game (CPU input: O(w) bits; memory input: n bits; rounds of lg S bits vs. w bits) there is no difference between S = O(n) and S = n^{O(1)}.

  47. Separation: S = n·lg^{O(1)} n vs. S = n^{O(1)}
    2006 [Pătrașcu, Thorup] predecessor search; [Pătrașcu, Thorup] exact near neighbor
    2007 [Pătrașcu] 2D range counting
    2008 [Pătrașcu] 2D stabbing, etc.; [Panigrahy, Talwar, Wieder] c-apx. near neighbor
    2009 [Sommer, Verbin, Yu] c-apx. distance oracles
    2010 [Panigrahy, Talwar, Wieder] c-apx. near neighbor; [Greve, Jørgensen, Larsen, Truelsen] range mode
    2011 [Jørgensen, Larsen] range median
    (= tight bounds for space n·lg^{O(1)} n)

  48. 2D Stabbing. Static 2D stabbing: preprocess D = { n axis-aligned rectangles }; query(x,y): is (x,y) ∈ R for some R ∈ D? Goal: if S = n·lg^{O(1)} n, the query time must be t = Ω(lg n / lg lg n). Remember dynamic 1D stabbing (maintain segments S = { [a1,b1], [a2,b2], … } under insert / delete; query(x): is x ∈ [ai, bi] for some segment?): we showed that if tu = lg^{O(1)} n, then tq = Ω(lg n / lg lg n).

  49. Persistence. Persistent: { dynamic problems } ↦ { static problems }. Given a dynamic problem ℙ with updateℙ(x) and queryℙ(y), Persistent(ℙ) is the static problem: preprocess (x1, x2, …, xT) to support query(y, t) = the answer of queryℙ(y) after updateℙ(x1), …, updateℙ(xt). [figure: versions v0, v1, v2, v3 created by x1, x2, x3, x4; queryℙ(y) runs on version t]

  50. Persistence. Persistent: { dynamic problems } ↦ { static problems }; Persistent(ℙ) preprocesses (x1, x2, …, xT) to support query(y, t) = the answer of queryℙ(y) after updateℙ(x1), …, updateℙ(xt). Static 2D stabbing ≤ Persistent(dynamic 1D stabbing). [figure: inserts and deletes of segments along the time axis, with the query run on a chosen version]
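
A sketch of that reduction, under the assumption of integer coordinates: sweep over x, turning each rectangle [x1,x2] × [y1,y2] into an insert of its y-segment at time x1 and a delete just after x2; query(x, y) then asks the 1D stabbing query y on the version reached after all updates with time ≤ x. The replay below is only the reference semantics; a persistent stabbing structure would jump to the version directly. Function names are illustrative.

```python
def rectangles_to_events(rectangles):
    """rectangles: list of (x1, x2, y1, y2) with integer coordinates, x1 <= x2.
    Build the update sequence of the dynamic 1D stabbing problem, sorted by time:
    the y-segment is inserted at time x1 and deleted at time x2 + 1."""
    events = []
    for (x1, x2, y1, y2) in rectangles:
        events.append((x1, 'insert', (y1, y2)))
        events.append((x2 + 1, 'delete', (y1, y2)))
    events.sort(key=lambda e: e[0])
    return events

def stab_query(events, x, y):
    """Reference semantics of the static query(x, y): replay the updates with
    time <= x, then run the 1D stabbing query at y."""
    active = []
    for (t, kind, seg) in events:
        if t > x:
            break
        if kind == 'insert':
            active.append(seg)
        else:
            active.remove(seg)
    return any(a <= y <= b for (a, b) in active)
```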
