CS151 Complexity Theory


Presentation Transcript


  1. CS151 Complexity Theory Lecture 5 April 13, 2004

  2. Introduction Power from an unexpected source? • we know P ≠ EXP, which implies no poly-time algorithm for Succinct CVAL • poly-size Boolean circuits for Succinct CVAL ??

  3. Introduction …and the depths of our ignorance: Does NP have linear-size, log-depth Boolean circuits ??

  4. Outline • Boolean circuits and formulae • uniformity and advice • the NC hierarchy and parallel computation • the quest for circuit lower bounds • a lower bound for formulae

  5. Boolean circuits • C computes a function f:{0,1}^n → {0,1} in the natural way • identify C with the function f it computes • circuit C: directed acyclic graph • nodes: AND (∧); OR (∨); NOT (¬); variables xi (inputs x1, x2, x3, …, xn)

  6. Boolean circuits • size = # gates • depth = longest path from input to output • formula (or expression): graph is a tree • every function f:{0,1}^n → {0,1} is computable by a circuit of size at most O(n·2^n) • AND of n literals for each x such that f(x) = 1 • OR of up to 2^n such terms
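
One way to see the O(n·2^n) bound concretely is to build the DNF directly from the truth table: one AND term per satisfying input, OR-ed together. The sketch below is illustrative (the helper name dnf_formula is not from the lecture); it also reports the literal count, which is at most n·2^n.

```python
from itertools import product

def dnf_formula(f, n):
    """Build the DNF described on the slide: an AND of n literals for every
    input x with f(x) = 1, OR-ed together. Returns the formula as a string
    and the number of literal occurrences (at most n * 2^n)."""
    terms = []
    for x in product([0, 1], repeat=n):
        if f(x):
            lits = [f"x{i+1}" if bit else f"~x{i+1}" for i, bit in enumerate(x)]
            terms.append("(" + " & ".join(lits) + ")")
    formula = " | ".join(terms) if terms else "0"   # constant 0 if f is never 1
    return formula, n * len(terms)

# example: 3-input parity (4 satisfying inputs, so 3 * 4 = 12 literals)
parity = lambda x: sum(x) % 2
formula, size = dnf_formula(parity, 3)
print(size, formula)
```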

  7. Circuit families • a circuit works for a specific input length • we’re used to f:Σ* → {0,1} • circuit family: a circuit for each input length C1, C2, C3, … = “{Cn}” • “{Cn} computes f” iff for all x, C|x|(x) = f(x) • “{Cn} decides L”, where L is the language associated with f

  8. Connection to TMs • TM M running in time t(n) decides language L • can build a circuit family {Cn} that decides L • size of Cn = O(t(n)²) • Proof: CVAL construction • Conclude: L ∈ P implies a family of polynomial-size circuits that decides L
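
The size bound comes from the tableau behind the CVAL construction; a sketch of the count (standard argument, not reproduced verbatim from the slides):

```latex
% Size of C_n from the computation tableau: a t(n) x O(t(n)) grid of cells,
% each cell determined by O(1) cells of the previous row, hence O(1) gates per cell.
\text{size}(C_n) \;=\; \underbrace{t(n)}_{\text{time steps}} \times
                       \underbrace{O(t(n))}_{\text{tape cells}} \times O(1)
                 \;=\; O(t(n)^2)
```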

  9. Connection to TMs • other direction? • A poly-size circuit family: • Cn = (x1 ∨ ¬x1) if Mn halts • Cn = (x1 ∧ ¬x1) if Mn loops • decides (unary version of) HALT! • oops…

  10. Uniformity • Strange aspect of a circuit family: • can “encode” (potentially uncomputable) information in the family specification • solution: uniformity – require that the specification be simple to compute • Definition: circuit family {Cn} is logspace uniform iff some TM M outputs Cn on input 1^n and runs in O(log n) space

  11. Uniformity Theorem: P = languages decidable by logspace uniform, polynomial-size circuit families {Cn}. • Proof: • already saw (⊆) • (⊇) on input x, generate C|x|, evaluate it, and accept iff the output = 1
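
The (⊇) direction only needs a circuit evaluator. A minimal sketch, assuming the circuit is given as a gate list in topological order (an encoding chosen for the sketch, not the lecture's):

```python
def eval_circuit(gates, x):
    """Evaluate a Boolean circuit on input bits x.

    gates: list of (name, op, args) in topological order, where op is one of
    'VAR' (args = [input index]), 'NOT', 'AND', 'OR' (args = gate names).
    The value of the last gate is the circuit's output.
    """
    val = {}
    for name, op, args in gates:
        if op == 'VAR':
            val[name] = x[args[0]]
        elif op == 'NOT':
            val[name] = 1 - val[args[0]]
        elif op == 'AND':
            val[name] = int(all(val[a] for a in args))
        elif op == 'OR':
            val[name] = int(any(val[a] for a in args))
    return val[gates[-1][0]]

# example: (x1 AND x2) OR (NOT x3)
gates = [('g1', 'VAR', [0]), ('g2', 'VAR', [1]), ('g3', 'VAR', [2]),
         ('g4', 'AND', ['g1', 'g2']), ('g5', 'NOT', ['g3']),
         ('g6', 'OR', ['g4', 'g5'])]
print(eval_circuit(gates, [1, 1, 0]))   # -> 1
```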

  12. TMs that take advice • family {Cn} without a uniformity constraint is called “non-uniform” • regard “non-uniformity” as a limited resource just like time and space, as follows: • add a read-only “advice” tape to TM M • M “decides L with advice A(n)” iff: M(x, A(|x|)) accepts ⟺ x ∈ L • note: A(n) depends only on |x|

  13. TMs that take advice • Definition: TIME(t(n))/f(n) = the set of those languages L for which: • there exists A(n) s.t. |A(n)| ≤ f(n) • TM M decides L with advice A(n) • most important such class: P/poly = ∪k TIME(n^k)/n^k

  14. TMs that take advice Theorem: L ∈ P/poly iff L is decided by a family of (non-uniform) polynomial-size circuits. • Proof: • (⇒) Cn from the CVAL construction; hardwire the advice A(n) • (⇐) define A(n) = description of Cn; on input x, the TM simulates Cn(x)
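
A toy rendering of the (⇐) direction, with the description of C_n represented as a Python callable rather than an encoded string (all names here are illustrative, not from the lecture):

```python
def decide_with_advice(M, x, A):
    """M "decides L with advice A(n)": M sees x together with A(|x|),
    and the advice depends only on the length of x."""
    return M(x, A[len(x)])

# advice for length n = the description of C_n; for the sketch a callable
# stands in for the circuit description used in slide 14's proof
A = {3: (lambda bits: (bits[0] & bits[1]) | (1 - bits[2]))}   # "C_3"
M = lambda bits, circuit: circuit(bits) == 1                  # M evaluates C_{|x|}
print(decide_with_advice(M, [1, 1, 0], A))   # -> True
```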

  15. Approach to P/NP • Believe NP ⊄ P • equivalent: “NP does not have uniform, polynomial-size circuits” • Even believe NP ⊄ P/poly • equivalent: “NP (or, e.g., SAT) does not have polynomial-size circuits” • implies P ≠ NP • many believe: best hope for proving P ≠ NP

  16. Parallelism • uniform circuits allow a refinement of polynomial time: • circuit depth ↔ parallel time • circuit size ↔ parallel work

  17. Parallelism • the NC (“Nick’s Class”) hierarchy (of logspace uniform circuits): NC^k = O(log^k n) depth, poly(n) size; NC = ∪k NC^k • captures “efficiently parallelizable problems” • not realistic? overly generous • OK for proving problems non-parallelizable

  18. Matrix Multiplication • given an n x n matrix A and an n x n matrix B, compute the n x n matrix AB • what is the parallel complexity of this problem? • work = poly(n) • time = log^k(n)? (which k?)

  19. Matrix Multiplication • two details • arithmetic matrix multiplication: A = (ai,k), B = (bk,j), (AB)i,j = Σk (ai,k × bk,j) … vs. Boolean matrix multiplication: A = (ai,k), B = (bk,j), (AB)i,j = ∨k (ai,k ∧ bk,j) • single output bit: to make matrix multiplication a language, on input A, B, (i, j) output (AB)i,j

  20. Matrix Multiplication • Boolean Matrix Multiplication is in NC1 • level 1: compute n ANDs: ai,k ∧ bk,j • next log n levels: tree of ORs • n² subtrees, one for each pair (i, j) • select the correct one and output
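
To make the depth claim concrete, one output bit can be computed with a single level of ANDs followed by a balanced OR tree. A sketch (illustrative function names) that also reports the depth used:

```python
def or_tree(bits):
    """Balanced tree of ORs; returns (value, depth of the tree)."""
    if len(bits) == 1:
        return bits[0], 0
    mid = len(bits) // 2
    left, dl = or_tree(bits[:mid])
    right, dr = or_tree(bits[mid:])
    return left | right, 1 + max(dl, dr)

def bool_matmul_entry(A, B, i, j):
    """(AB)_{i,j} for Boolean matrices: one level of ANDs, then an OR tree."""
    n = len(A)
    ands = [A[i][k] & B[k][j] for k in range(n)]   # level 1: n ANDs
    value, depth = or_tree(ands)                   # next ~log n levels: ORs
    return value, depth + 1                        # +1 for the AND level

A = [[1, 0], [1, 1]]
B = [[0, 1], [1, 0]]
print(bool_matmul_entry(A, B, 1, 1))   # -> (1, 2): depth 1 + ceil(log2 2)
```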

  21. Boolean formulas and NC1 • The previous circuit is actually a formula. This is no accident: Theorem: L ∈ NC1 iff L is decidable by a polynomial-size uniform family of Boolean formulas.

  22. Boolean formulas and NC1 • Proof: • (⇒) convert an NC1 circuit into a formula • recursively: duplicate the subcircuit feeding each gate with fan-out > 1, unwinding the DAG into a tree • note: logspace transformation (stack depth log n, stack record 1 bit – “left” or “right”)

  23. Boolean formulas and NC1 • (⇐) convert a formula of size n into a formula of depth O(log n) • note: size ≤ 2^depth, so the new formula has poly(n) size • key transformation: pick a subtree D of the formula C; let C1 and C0 be C with D replaced by the constants 1 and 0; then C ≡ (D ∧ C1) ∨ (¬D ∧ C0)

  24. Boolean formulas and NC1 • D = any minimal subtree with size at least n/3 • implies size(D) ≤ 2n/3 • define T(n) = maximum depth required for any size-n formula • C1, C0, D all have size ≤ 2n/3, so T(n) ≤ T(2n/3) + 3, which implies T(n) ≤ O(log n)
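
Unrolling the recurrence (a routine calculation the slide leaves implicit):

```latex
% T(n) <= T(2n/3) + 3: the size drops by a factor 2/3 each step, so it reaches
% O(1) after log_{3/2} n steps, each step adding depth at most 3.
T(n) \;\le\; 3\left\lceil \log_{3/2} n \right\rceil + O(1) \;=\; O(\log n)
```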

  25. Relation to other classes • Clearly NC ⊆ P • recall: P = logspace uniform poly-size circuits • NC1 ⊆ L: • on input x, compose logspace algorithms for: • generating C|x| • converting it to a formula • FVAL (formula evaluation)

  26. Relation to other classes • NL ⊆ NC2: S-T-CONN ∈ NC2 • given G = (V, E), vertices s, t • A = adjacency matrix (with self-loops) • (A²)i,j = 1 iff there is a path of length ≤ 2 from node i to node j • (A^n)i,j = 1 iff there is a path of length ≤ n from node i to node j • compute with a depth-(log n) tree of Boolean matrix multiplications, output entry s, t • log² n depth total
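
The repeated-squaring idea written out as a serial sketch (the point is only that ⌈log n⌉ rounds of Boolean matrix squaring suffice; helper names are illustrative):

```python
import math

def bool_matmul(A, B):
    """Boolean matrix product: (AB)[i][j] = OR_k (A[i][k] AND B[k][j])."""
    n = len(A)
    return [[int(any(A[i][k] & B[k][j] for k in range(n)))
             for j in range(n)] for i in range(n)]

def st_conn(adj, s, t):
    """S-T-CONN by repeated squaring of the adjacency matrix with self-loops."""
    n = len(adj)
    A = [[adj[i][j] | (i == j) for j in range(n)] for i in range(n)]
    for _ in range(math.ceil(math.log2(n))):   # log n rounds of squaring
        A = bool_matmul(A, A)                  # now A[i][j] = 1 iff path of length <= 2^round
    return A[s][t] == 1

# path 0 -> 1 -> 2, and no edge into 3
adj = [[0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
print(st_conn(adj, 0, 2), st_conn(adj, 0, 3))   # -> True False
```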

  27. NC vs. P • can every efficient algorithm be efficiently parallelized? NC = P ? • P-complete problems are the least likely to be parallelizable • if a P-complete problem is in NC, then P = NC • Why? • we use logspace reductions to show problems P-complete, and L ⊆ NC (previous slides)

  28. NC vs. P • can every uniform, poly-size Boolean circuit family be converted into a uniform, poly-size Boolean formula family? NC1 = P ?

  29. Lower bounds • Recall: “NP does not have polynomial-size circuits” (NP ⊄ P/poly) implies P ≠ NP • major goal: prove lower bounds on (non-uniform) circuit size for problems in NP • believe exponential • super-polynomial would be enough for P ≠ NP • best bound known: 4.5n • we don’t even have super-polynomial bounds for problems in NEXP

  30. Lower bounds • lots of work on lower bounds for restricted classes of circuits • we’ll see two such lower bounds: • formulas • monotone circuits

  31. Shannon’s counting argument • frustrating fact: almost all functions require huge circuits Theorem (Shannon): With probability at least 1 – o(1), a random function f:{0,1}^n → {0,1} requires a circuit of size Ω(2^n/n).

  32. Shannon’s counting argument • Proof (counting): • B(n) = 2^(2^n) = # functions f:{0,1}^n → {0,1} • # circuits with n inputs and size s is at most C(n, s) ≤ ((n+3)·s²)^s (s gates; n+3 gate types; 2 inputs per gate)

  33. Shannon’s counting argument • C(n, c·2^n/n) < ((2n)·c²·2^(2n)/n²)^(c·2^n/n) < o(1)·2^(2c·2^n) ≤ o(1)·2^(2^n) (if c ≤ ½) • probability a random function has a circuit of size s = (½)·2^n/n is at most C(n, s)/B(n) < o(1)
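
The same chain with the intermediate factor made explicit (editorial spacing of the slide's inequality; substitute s = c·2^n/n into C(n, s) ≤ ((n+3)s²)^s and use n+3 ≤ 2n):

```latex
C\!\left(n, \tfrac{c 2^n}{n}\right)
  \;\le\; \left(\tfrac{2 c^2}{n}\, 2^{2n}\right)^{c 2^n / n}
  \;=\; \underbrace{\left(\tfrac{2 c^2}{n}\right)^{c 2^n / n}}_{o(1)} \cdot\, 2^{2 c\, 2^n}
  \;\le\; o(1) \cdot 2^{2^n} \quad (c \le \tfrac{1}{2}),
\qquad\text{so}\qquad
\frac{C(n, \tfrac{1}{2} 2^n / n)}{B(n)} \;=\; o(1).
```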

  34. Shannon’s counting argument • frustrating fact: almost all functions require huge formulas Theorem (Shannon): With probability at least 1 – o(1), a random function f:{0,1}^n → {0,1} requires a formula of size Ω(2^n/log n).

  35. Shannon’s counting argument • Proof (counting): • B(n) = 2^(2^n) = # functions f:{0,1}^n → {0,1} • # formulas with n inputs and size s is at most F(n, s) ≤ 4^s · 2^s · (n+2)^s (4^s binary trees with s internal nodes; 2 gate choices per internal node; n+2 choices per leaf)

  36. Shannon’s counting argument • F(n, c·2^n/log n) < (16n)^(c·2^n/log n) = 16^(c·2^n/log n) · 2^(c·2^n) = 2^((1 + o(1))·c·2^n) < o(1)·2^(2^n) (if c ≤ ½) • probability a random function has a formula of size s = (½)·2^n/log n is at most F(n, s)/B(n) < o(1)
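
The middle equality uses n^(1/log n) = 2; written out (editorial elaboration of the slide's step):

```latex
(16n)^{\frac{c 2^n}{\log n}}
  \;=\; 16^{\frac{c 2^n}{\log n}} \cdot n^{\frac{c 2^n}{\log n}}
  \;=\; 2^{\frac{4 c 2^n}{\log n}} \cdot 2^{c 2^n}
  \;=\; 2^{(1+o(1))\, c\, 2^n}
  \;\le\; o(1) \cdot 2^{2^n} \quad (c \le \tfrac{1}{2})
```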

  37. Andreev function • best lower bound for formulas: Theorem (Andreev, Hastad ’93): the Andreev function requires (∧,∨,¬)-formulas of size at least Ω(n^(3-o(1))).

  38. Andreev function • the Andreev function A(x,y), A:{0,1}^(2n) → {0,1} • y is an n-bit string • x is split into log n blocks of n/log n bits each; the XOR of each block gives one bit • the resulting log n bits act as a selector: the output is the selected bit yi of y
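
A small evaluator matching the description above (function and variable names are illustrative; it assumes n is a power of two and log n divides n):

```python
def andreev(x, y):
    """Andreev function A(x, y) for |x| = |y| = n: split x into log n blocks
    of n/log n bits, XOR each block to get one selector bit, and output the
    bit of y selected by those log n bits."""
    n = len(y)
    k = n.bit_length() - 1                                      # k = log2 n
    block = n // k                                              # n/log n bits per block
    bits = [sum(x[b*block:(b+1)*block]) % 2 for b in range(k)]  # the k XORs
    i = int("".join(map(str, bits)), 2)                         # selector index
    return y[i]

# toy example with n = 4 (log n = 2, blocks of 2 bits)
x = [1, 0, 1, 1]       # block XORs: 1, 0 -> index 0b10 = 2
y = [0, 0, 1, 0]
print(andreev(x, y))   # -> 1
```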

  39. Random restrictions • key idea: given a function f:{0,1}^n → {0,1}, restrict it by ρ to get fρ • ρ sets some variables to 0/1; the others remain free • R(n, εn) = the set of restrictions that leave εn variables free • Definition: L(f) = size of the smallest (∧,∨,¬) formula computing f (measured as leaf-size)

  40. Random restrictions • observation: Eρ∈R(n, εn)[L(fρ)] ≤ ε·L(f) • each leaf survives with probability ε • may shrink even more… • propagate constants Lemma (Hastad 93): for all f, Eρ∈R(n, εn)[L(fρ)] ≤ O(ε^(2-o(1))·L(f))
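
To see why each leaf survives with probability ε, here is a sketch of sampling ρ from R(n, εn) and counting surviving leaves (helper names are illustrative; the extra shrinkage from propagating constants is not modeled):

```python
import random

def sample_restriction(n, eps):
    """Sample rho from R(n, eps*n): pick eps*n variables to stay free,
    set every other variable to a uniformly random bit."""
    free = set(random.sample(range(n), int(eps * n)))
    return {i: ('*' if i in free else random.randint(0, 1)) for i in range(n)}

def surviving_leaves(leaves, rho):
    """leaves: list of variable indices at the formula's leaves; a leaf
    survives iff its variable is left free by rho."""
    return sum(1 for v in leaves if rho[v] == '*')

n, eps = 1000, 0.1
leaves = [random.randrange(n) for _ in range(5000)]   # a formula with 5000 leaves
rho = sample_restriction(n, eps)
print(surviving_leaves(leaves, rho) / len(leaves))    # ~ eps = 0.1
```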

  41. Hastad’s shrinkage result • Proof of theorem: • Recall: there exists a function h:{0,1}^(log n) → {0,1} for which L(h) > n/(2 log log n) • hardwire the truth table of that function into y to get A*(x) • apply a random restriction from R(n, m = 2(log n)(ln log n)) to A*(x)

  42. The lower bound • Proof of theorem (continued): • probability a given XOR is killed by the restriction is the probability that we “miss it” m times: (1 – (n/log n)/n)^m = (1 – 1/log n)^m ≤ (1/e)^(2 ln log n) = 1/log²n • probability that even one of the XORs is killed by the restriction is at most: (log n)·(1/log²n) = 1/log n < ½

  43. The lower bound • (1): probability that even one of the XORs is killed by the restriction is at most: (log n)·(1/log²n) = 1/log n < ½ • (2): by Markov: Pr[ L(A*ρ) > 2·Eρ∈R(n, m)[L(A*ρ)] ] < ½ • Conclude: for some restriction ρ • all XORs survive, and • L(A*ρ) ≤ 2·Eρ∈R(n, m)[L(A*ρ)]

  44. The lower bound • Proof of theorem (continued): • if all the XORs survive, we can restrict the formula further so that it computes the hard function h • may need to add ¬’s • L(h) = n/(2 log log n) ≤ L(A*ρ) ≤ 2·Eρ∈R(n, m)[L(A*ρ)] ≤ O((m/n)^(2-o(1))·L(A*)) ≤ O( ((log n)(ln log n)/n)^(2-o(1))·L(A*) ) • implies Ω(n^(3-o(1))) ≤ L(A*) ≤ L(A).
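
Solving the final chain for L(A*) (editorial arithmetic; the polylogarithmic factors are absorbed into the n^(o(1))):

```latex
\frac{n}{2 \log\log n}
  \;\le\; O\!\left( \left(\tfrac{\log n \,\ln \log n}{n}\right)^{2-o(1)} L(A^*) \right)
\;\Longrightarrow\;
L(A^*) \;\ge\; \Omega\!\left( \frac{n^{3-o(1)}}{\operatorname{polylog}(n)} \right)
       \;=\; \Omega\!\left( n^{3-o(1)} \right)
```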
