
Isolation Technique


Presentation Transcript


  1. Isolation Technique. April 16, 2001. Jason Ku, Tao Li

  2. Outline
  • Show that we can reduce NP, with high probability, to US. That is: NP randomized reduces to detecting unique solutions.
  • PH ⊆ P^PP

  3. Isolation Lemma • Definitions • Isolation Lemma • Example of using Isolation Lemma

  4. Definition of weight functions
  A weight function W maps a finite set U into the positive integers. For a set S ⊆ U, W(S) = Σ_{x∈S} W(x). Let F be a family of non-empty subsets of U. A weight function W is "good for" F if there is a unique minimum-weight set in F, and "bad for" F otherwise.
  Ex: let U = {u1, u2, u3}, let F = {{u1}, {u2}, {u3}, {u1,u2}}, and define
    W1(u1)=1  W1(u2)=2  W1(u3)=3
    W2(u1)=1  W2(u2)=1  W2(u3)=2
  W1 is good for F (the unique minimum-weight set is {u1}, with weight 1), while W2 is bad for F ({u1} and {u2} tie at weight 1).
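  These definitions translate directly into code. A minimal Python sketch (the helper names weight_of and is_good_for are ours, not from the slides):

    # A weight function is a dict mapping elements of U to positive integers.
    def weight_of(W, S):
        """Total weight W(S) of a subset S under weight function W."""
        return sum(W[x] for x in S)

    def is_good_for(W, F):
        """W is good for F iff exactly one set in F attains the minimum weight."""
        weights = [weight_of(W, S) for S in F]
        return weights.count(min(weights)) == 1

    F = [{"u1"}, {"u2"}, {"u3"}, {"u1", "u2"}]
    W1 = {"u1": 1, "u2": 2, "u3": 3}
    W2 = {"u1": 1, "u2": 1, "u3": 2}
    print(is_good_for(W1, F))  # True:  {u1} is the unique minimum (weight 1)
    print(is_good_for(W2, F))  # False: {u1} and {u2} tie at weight 1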

  5. Isolation Lemma
  Let U be a finite set with D = ||U||, and let F1, F2, …, Fm be families of non-empty subsets of U. Let R > mD, and let Z be the set of all weight functions assigning each element of U a weight of at most R (so ||Z|| = R^D). Let α, 0 < α < 1, be such that α > mD/R. Then more than (1-α)||Z|| of the weight functions in Z are good for all of F1, F2, …, Fm.

  6. Proof of Isolation Lemma
  Proof sketch: By definition, a weight function W is bad if some Fi contains at least 2 different minimum-weight sets. Let S1 and S2 be 2 different sets of the same minimum weight; then ∃ x ∈ S1 s.t. x ∉ S2. Call x ambiguous. If we fix the weights of all other elements of U, then either x cannot be ambiguous, or there is exactly one weight for x that makes x ambiguous.

  7. Let's Count
  So, for an x ∈ U, there are at most R^(D-1) weight functions under which x is ambiguous: the other D-1 weights can be chosen in R^(D-1) ways, and each such choice leaves at most one bad weight for x. There are R^D weight functions in all, m choices of Fi, and D choices of x. Thus the fraction of weight functions that are bad for some Fi is at most mD·R^(D-1)/R^D = mD/R < α, so the fraction good for all of F1, …, Fm is more than 1-α.

  8. Example of the Isolation Lemma
  Let U = {u1, u2, u3}, so D = 3. Let F1 = {{u1}, {u1,u3}, {u1,u2,u3}, {u2}}, so m = 1. Take R = 4 > mD = 3; then ||Z|| = R^D = 64. With α = 3/4, at least (1 - 3/4)·64 = 16 weight functions are good for F1. Four of them:
          W1  W2  W3  W4
    u1     1   2   1   1
    u2     2   3   2   3
    u3     3   4   2   3
  (6, 6, 3, and 3 variations of these four patterns respectively: 18 good weight functions, and more besides.)
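  Since ||Z|| is only 64 here, the lemma's bound can be checked exhaustively. A short self-contained Python check (our illustration, not part of the slides):

    from itertools import product

    U = ["u1", "u2", "u3"]
    F1 = [{"u1"}, {"u1", "u3"}, {"u1", "u2", "u3"}, {"u2"}]
    R = 4

    good = 0
    for values in product(range(1, R + 1), repeat=len(U)):
        W = dict(zip(U, values))                      # one of the 4^3 = 64 weight functions
        weights = [sum(W[x] for x in S) for S in F1]  # weight of each set in F1
        if weights.count(min(weights)) == 1:          # unique minimum-weight set?
            good += 1
    print(good)  # the lemma guarantees this is at least (1 - 3/4) * 64 = 16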

  9. Definition of US
  US = {L | (∃ NPTM M)(∀x) x ∈ L ⟺ #acc_M(x) = 1}

  10. NP randomized reduces to US
  NP ⊆ RP^US. Proof map:
  1) Definitions I, II
  2) Apply the Isolation Lemma to get a probability bound
  3) Construct an oracle B ∈ US
  4) Construct a machine N that uses oracle B
  5) Show N is an RP machine
  6) Show that for L ∈ NP, x ∈ L ⟺ x ∈ L(N), so L ∈ RP^US

  11. Definitions I
  For L ∈ NP, let A = {<x,y> | NPTM M_L(x) accepts on path y}. There is a polynomial p s.t. for all x ∈ Σ*: x ∈ L ⟺ there is at least one y, |y| = p(|x|), with <x,y> ∈ A.
  Encode a path y = y1 y2 … yn as the set {i | 1 ≤ i ≤ p(n) and yi = 1} (1 = take the right branch, 0 = take the left branch).
  Ex: y = 1001 encodes to {1, 4}.
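  The encoding is a one-liner in code; a tiny Python illustration (path_to_set is a hypothetical helper name):

    def path_to_set(y):
        """Encode a 0/1 path string as the set of 1-indexed positions of its 1-bits."""
        return {i for i, bit in enumerate(y, start=1) if bit == "1"}

    print(path_to_set("1001"))  # {1, 4}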

  12. Definitions II
  Let U(x) = {1, 2, …, p(|x|)}, so D = ||U|| = p(|x|).
  Let F(x) = {y | <x,y> ∈ A}, the collection of accepting paths viewed as subsets of U(x), so m = 1.
  Let Z(x) = the weight functions that assign weights no greater than 4p(|x|), so R = 4p(|x|).

  13. Applying the Isolation Lemma
  Here mD/R = p(|x|)/(4p(|x|)) = 1/4, so by the Isolation Lemma:
  if x ∈ L, at least 3/4 of the weight functions assign F(x) a unique minimum-weight set;
  if x ∉ L, there are no accepting paths y ∈ F(x), so no weight function assigns F(x) a unique minimum-weight set.

  14. Construct an oracle B ∈ US
  Let B = {<x, W, j> | W ∈ Z(x), 1 ≤ j ≤ 4p²(|x|), and there is a unique y ∈ F(x) s.t. W(y) = j}.
  NPTM M_B on input u:
  1) if u is not of the form <x, W, j>, reject;
  2) else, using p(|x|) non-deterministic moves, select a path y and accept iff <x,y> ∈ A and W(y) = j.
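  The oracle machine M_B is nondeterministic; for intuition only, here is a brute-force (exponential-time) Python simulation of membership in B. The predicate accepts(x, y) stands in for <x,y> ∈ A and p for the polynomial bound, both hypothetical stand-ins:

    from itertools import product

    def in_B(x, W, j, p, accepts):
        """Brute-force test of <x, W, j> in B: is there exactly one accepting
        path y of length p(|x|) whose weight under W equals j?
        W maps {1, ..., p(|x|)} to positive integer weights."""
        n = p(len(x))
        count = 0
        for bits in product("01", repeat=n):
            y = "".join(bits)
            w = sum(W[i] for i in range(1, n + 1) if y[i - 1] == "1")
            if accepts(x, y) and w == j:
                count += 1
        return count == 1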

  15. Why BUS If uB, there is a unique path y  F s.t. W(y)=j. Thus machine MB will only accept once. If uB, there are either zero, or more than 1 yF s.t. W(y)=j. Thus machine MB will have either zero, or more than 1 accepting path.

  16. Construct an RP machine with oracle B
  NPTM N on input x:
  1) Randomly pick a weight function W ∈ Z(x) (each weight at most 4p(|x|)).
  2) For each j, 1 ≤ j ≤ 4p²(|x|), ask oracle B whether <x, W, j> ∈ B. If the answer is yes for some j, accept; if it is no for every j, reject.
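  A minimal Python sketch of one run of N, reusing the hypothetical in_B simulation above in place of a true US oracle:

    import random

    def N(x, p, accepts):
        """One run of the randomized machine N. Over the random choice of W,
        it accepts with probability at least 3/4 when x is in L and never
        accepts otherwise."""
        n = p(len(x))
        W = {i: random.randint(1, 4 * n) for i in range(1, n + 1)}  # random weights
        for j in range(1, 4 * n * n + 1):  # every possible path weight
            if in_B(x, W, j, p, accepts):  # oracle query "<x, W, j> in B?"
                return True  # some j isolates a unique accepting path
        return False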

  17. NRPUS and xL high probability xN For every x*, - If xL, MB on <x, W, j> accepts with probability ¾, so N accepts with probability ¾. - If xL, MB on <x, W, j> rejects with probability 1, so N rejects with probability 1. So, - xL high probability xN - Since xL implies (N, x) = ¾ > .5 acceptance, and xL implies (N, x) = 1 rejecting, NRP

  18. Definition of P and #P P = {L | ( NPTM M) (x) xL  #accM(x) is odd} #P ={f | ( NPTM M) (x) f(x) = #accM(x) }

  19. Toda's Theorem
  PH ⊆ P^PP. Three major parts to prove it:
  • (Valiant & Vazirani) NP ⊆ BPP^⊕P
  • Theorem 4.5: PH ⊆ BPP^⊕P
  • Lemma 4.13: BPP^⊕P ⊆ P^#P, hence BPP^⊕P ⊆ P^PP

  20. (Valiant & Vazirani) NP ⊆ BPP^⊕P
  Proof:
  • Let A ∈ NP, A = L(M), where M runs in time p(|x|).
  • Let B = {(x, w, k) : M has an odd # of accepting paths on input x having weight k under w}, where w : {1, …, p(|x|)} → {1, …, 4p(|x|)}. Then B ∈ ⊕P.

  21. (Valiant & Vazirani) cont.
  • The BPP^⊕P algorithm:
    On input x:
      randomly pick w
      for k := 1 to 4p²(|x|):
        ask if (x, w, k) is in B
        if so, then halt and accept
      end-for
      if control reaches here, then halt and reject

  22. Note
  • The Valiant & Vazirani theorem relativizes. In other words, NP^A ⊆ BPP^(⊕P^A) for every oracle A.

  23. Theorem 4.5: PH ⊆ BPP^⊕P
  We prove it by induction on the levels of PH. Three steps in the induction step:
  • Apply Valiant & Vazirani to the base machine
  • Swap BPP and ⊕P in the middle
  • Collapse: BPP^BPP ⊆ BPP, ⊕P^⊕P ⊆ ⊕P

  24. Step 1 for Thm. 4.5
  • Induction hypothesis: Σ_k^p ⊆ BPP^⊕P.
  • Since NP^A ⊆ BPP^(⊕P^A) for every oracle A:
    Σ_{k+1}^p = NP^(Σ_k^p) ⊆ BPP^(⊕P^(Σ_k^p)) ⊆ BPP^(⊕P^(BPP^⊕P))

  25. Step 2: Swapping
  • By Lemma 4.9, ⊕P^(BPP^A) ⊆ BPP^(⊕P^A) for every oracle A. Hence
    BPP^(⊕P^(BPP^⊕P)) ⊆ BPP^(BPP^(⊕P^⊕P))

  26. Step 3: Collapse
  • Proposition 4.6: BPP^(BPP^A) = BPP^A
  • Proposition 4.8: ⊕P^⊕P = ⊕P
  • Hence BPP^(BPP^(⊕P^⊕P)) = BPP^⊕P, which completes the induction step.

  27. Toda's Theorem
  • Proposition 4.15: P^PP = P^#P
  • Toda's Theorem: PP is hard for the polynomial hierarchy: PH ⊆ P^PP = P^#P

  28. Proof for BPP^⊕P ⊆ P^#P
  • Let A ∈ BPP^⊕P, where A is accepted by BPP machine M with oracle B ∈ ⊕P, and let f be the #P function witnessing B ∈ ⊕P (so w ∈ B ⟺ f(w) is odd). Let n^k bound the running time of M.
  • Assume first that M makes only one query along any path. Then let g(x,y) be the #P function defined as the number of accepting paths of the following machine:

  29. Proof cont. 1
  On input x, y:
    run M(x) along path y
    when a query "w ∈ B?" is made,
      flip a coin c ∈ {0,1}, use c as the oracle answer,
      and continue simulating M(x)
    if the simulation accepts, then generate f(w)+(1-c) paths and accept on all of them
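  A quick parity check of this construction (our addition; the claim itself is stated on the next slide). Write a_c ∈ {0,1} for whether the simulation accepts when the coin gives answer c, and b = 1 if w ∈ B and b = 0 otherwise, so that f(w) ≡ b (mod 2). The accepting-path count is

    g(x,y) = a_1·f(w) + a_0·(f(w)+1) ≡ a_1·b + a_0·(b+1) (mod 2),

  which is a_1 when b = 1 and a_0 when b = 0. That is, g(x,y) ≡ a_b (mod 2): g(x,y) is odd exactly when M(x) accepts along y under the true oracle answer.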

  30. Proof cont. 2
  • g(x,y) is odd if and only if M^B(x) accepts along y.
  • For g(x,y), consider a #P function g'(x,y) (given by Fact 1 below) such that:
    if g(x,y) is odd, then g'(x,y) ≡ 1 (mod 2^(n^k));
    if g(x,y) is even, then g'(x,y) ≡ 0 (mod 2^(n^k)).
  • Define h(x) = Σ_{|y|=n^k} g'(x,y); by Fact 2 below, h ∈ #P.

  31. Proof cont. 3
  • The value h(x) (mod 2^(n^k)) represents the number of y's such that M^B(x) accepts along path y.
  • Our P^#P algorithm: on input x, query the oracle for h(x) and decide whether h(x) (mod 2^(n^k)) exceeds half of the 2^(n^k) possible y's; if so, x is accepted, and if not, x is rejected.
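  In outline, the final deterministic step looks as follows in Python, assuming the oracle value is available through a hypothetical function h:

    def decide_A(x, h, k):
        """Final P^#P step: accept iff h(x) mod 2^(n^k) reports that more than
        half of the 2^(n^k) strings y make the simulated M^B(x) accept.
        h stands in for a single query to a #P oracle."""
        modulus = 2 ** (len(x) ** k)
        accepting_ys = h(x) % modulus
        return accepting_ys > modulus // 2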

  32. Proof cont. 4
  If M makes more than one query, modify g(x,y) as follows:
  On input x, y:
    repeat:
      run M(x) along path y
      when a query "w ∈ B?" is made,
        flip a coin c ∈ {0,1}, generate f(w)+(1-c) paths,
        use c as the oracle answer, and continue simulating M(x)
    until no more queries are asked
    if the simulation of M(x) along path y accepts with this sequence of guessed oracle answers, then accept, else reject

  33. Proof cont. 5
  • Call the machine above N (not to be confused with the RP machine N of slide 16).
  • Claim: M^B accepts x along y if and only if #acc_N(x,y) = g(x,y) is odd.

  34. Fact 1
  • Let k ∈ N and f ∈ #P. Then there exists g ∈ #P such that, for all x with n = |x|:
    if f(x) is odd, then g(x) ≡ 1 (mod 2^(n^k));
    if f(x) is even, then g(x) ≡ 0 (mod 2^(n^k)).
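  One standard realization of Fact 1 (the slides leave the construction implicit, so treat the details here as our gloss): the polynomial τ(a) = 3a⁴ + 4a³ has non-negative coefficients, so #P is closed under it, and it sends a ≡ 0 (mod 2^j) to τ(a) ≡ 0 (mod 2^(2j)) and a ≡ -1 (mod 2^j) to τ(a) ≡ -1 (mod 2^(2j)), since (a+1)² divides τ(a)+1 and a³ divides τ(a). Iterating τ only O(k·log n) times doubles the modulus exponent each round, and one final squaring turns -1 into +1. A small numeric spot-check in Python:

    def tau(a):
        """Toda's amplifying polynomial: preserves 0 mod 2^j as 0 mod 2^(2j)
        and -1 mod 2^j as -1 mod 2^(2j)."""
        return 3 * a**4 + 4 * a**3

    for f_val in range(1, 40):  # pretend f(x) = f_val
        t = f_val
        for _ in range(4):      # 4 iterations reach modulus 2**(2**4) = 2**16
            t = tau(t)
        g_val = t * t           # one final squaring turns -1 into +1
        if f_val % 2 == 1:
            assert g_val % 2**16 == 1
        else:
            assert g_val % 2**16 == 0
    print("Fact 1 amplification verified for small values")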

  35. Fact 2
  • Let f(x,y) be a #P function. Then g(x) = Σ_{|y|=|x|^k} f(x,y) is also a #P function.
  • Proof: let M be the machine such that #acc_M(x,y) = f(x,y), and consider the following machine M':
    on input x:
      compute |x|^k
      guess y of length |x|^k
      run M(x,y)
    Then g(x) = #acc_{M'}(x).

  36. Discussions
  • UL/poly = NL/poly
  • Open: is UL = NL?
  • Open: is UP = NP?
  • NP^PSPACE = UP^PSPACE = PSPACE
  • There is an oracle relative to which NP ≠ UP.

  37. Conclusions
  We have shown, by use of the Isolation Lemma, that NP ⊆ RP^US ⊆ BPP^⊕P. This was the base case of an inductive proof showing PH ⊆ BPP^⊕P. From there we extended to Toda's Theorem: PH ⊆ P^PP = P^#P.
