Finding Almost-Perfect Graph Bisections
Venkatesan Guruswami (CMU), Yury Makarychev (TTI-C), Prasad Raghavendra (Georgia Tech), David Steurer (MSR), Yuan Zhou (CMU)

Presentation Transcript

  1. Finding Almost-Perfect Graph Bisections Venkatesan Guruswami (CMU) Yury Makarychev (TTI-C) Prasad Raghavendra (Georgia Tech) David Steurer (MSR) Yuan Zhou (CMU)

  2. Bipartite graph recognition • Depth-first search/breadth-first search • With some noise? • Given a bipartite graph with 1% noisy edges, can we remove a small fraction of edges (say 10%) to get a bipartite graph? I.e., can we divide the vertices into two parts so that 90% of the edges go across the two parts?

  3. MaxCut • G=(V,E) • cut(A, B) = edges(A, B) / |E|, where B = V − A • exactly one of i, j in A: edge (i, j) is "on the cut" • MaxCut: find A, B such that cut(A, B) is maximized • Bipartite graph recognition: is MaxCut = 1? • Robust bipartite graph recognition: given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9 • (figure: an example cut with B = V − A and cut(A, B) = 4/5)
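The cut-value definition above can be sketched in a few lines of Python (a minimal illustration; the 5-edge example graph and set choices are assumptions, not from the slides):

```python
def cut_value(edges, A):
    """Fraction of edges with exactly one endpoint in A, i.e., edges "on the cut"."""
    A = set(A)
    crossing = sum(1 for (i, j) in edges if (i in A) != (j in A))
    return crossing / len(edges)

# Hypothetical 5-edge example: a cut of value 4/5, as in the slide's figure.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(cut_value(edges, {0, 2}))  # 0.8
```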

  4. c vs. s approximation for MaxCut • Given a graph with MaxCut value at least c, can we find a cut of value at least s? • Robust bipartite graph recognition: given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9 • i.e., 0.99 vs. 0.9 approximation • "approximating almost perfect MaxCut"

  5. Robust bipartite graph recognition • Task: given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9 • We can always find cut(A, B) ≥ 1/2: • Assign each vertex x_i ∈ {−1, 1} uniformly at random • For any edge (i, j), E[(1 − x_i x_j)/2] = 1/2
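The random-assignment argument above can be checked empirically (a sketch; the complete graph K_10 and the trial count are assumptions for illustration):

```python
import random

def random_cut_fraction(n, edges, rng):
    """Assign each vertex -1 or +1 uniformly at random and return the fraction
    of cut edges, computed via the slide's identity (1 - x_i * x_j) / 2."""
    x = [rng.choice((-1, 1)) for _ in range(n)]
    return sum((1 - x[i] * x[j]) / 2 for (i, j) in edges) / len(edges)

rng = random.Random(0)
edges = [(i, j) for i in range(10) for j in range(i + 1, 10)]  # K_10
avg = sum(random_cut_fraction(10, edges, rng) for _ in range(2000)) / 2000
print(round(avg, 2))  # empirically close to the expected value 1/2
```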

  6. Robust bipartite graph recognition (cont'd) • Task: given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9 • We can always find cut(A, B) ≥ 1/2. • Better than 1/2? • DFS/BFS/greedy? No combinatorial algorithm was known until very recently [KS11] • Linear Programming? Natural LPs have big integrality gaps [VK07, STT07, CMM09]

  7. Robust bipartite graph recognition (cont'd) • Task: given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9 • We can always find cut(A, B) ≥ 1/2. • Better than 1/2? • The GW Semidefinite Programming relaxation [GW95] • 0.878-approximation • Given MaxCut ≥ 1 − ε, can find a cut of value ≥ 1 − O(√ε) • (1 − ε) vs. (1 − O(√ε)) approximation, tight under the Unique Games Conjecture [Kho02, KKMO07, MOO10]

  8. Robust satisfiability algorithms • Given an instance that becomes satisfiable after removing an ε fraction of constraints, find an assignment satisfying all but a g(ε) fraction of constraints • g(ε) → 0 as ε → 0 • Examples • (1 − ε) vs. (1 − O(√ε)) algorithm for MaxCut [GW95] • (1 − ε) vs. (1 − g(ε)) algorithm for Max2SAT [Zwick98] • (1 − ε) vs. (1 − g(ε)) algorithm for MaxHorn3SAT [Zwick98]

  9. MaxBisection • G = (V, E) • Objective: maximize cut(A, B), subject to B = V − A and |A| = |B|

  10. MaxBisection (cont'd) • Approximating MaxBisection? • No easier than MaxCut • Reduction: take two copies of the MaxCut instance G = (V, E)

  11. MaxBisection (cont'd) • Approximating MaxBisection? • No easier than MaxCut • Strictly harder than MaxCut? • Approximation ratios: 0.6514 [FJ97], 0.699 [Ye01], 0.7016 [HZ02], 0.7027 [FL06] • Approximating almost-perfect solutions? Not known

  12. Finding almost-perfect MaxBisection • Question. Is there a (1 − ε) vs. (1 − g(ε)) approximation algorithm for MaxBisection, where g(ε) → 0 as ε → 0? • Answer. Yes. • Our result. • Theorem. There is a (1 − ε) vs. (1 − g(ε)) approximation algorithm for MaxBisection. • Theorem. Given a satisfiable MaxBisection instance, it is easy to find a (.49, .51)-balanced cut of near-perfect value.

  13. Extension to MinBisection • MinBisection: minimize edges(A, B)/|V|, s.t. B = V − A, |A| = |B| • Our result • Theorem. There is a (1 − ε) vs. (1 − g(ε)) approximation algorithm for MaxBisection. • Theorem. Given a MinBisection instance of value ε, it is easy to find a (.49, .51)-balanced cut of comparably small value.

  14. The rest of this talk... • Previous algorithms for MaxBisection • Theorem. There is a (1 − ε) vs. (1 − g(ε)) approximation algorithm for MaxBisection.

  15. Previous algorithms for MaxBisection

  16. The GW algorithm for (almost perfect) MaxCut [GW95] • MaxCut objective: maximize (1/|E|) Σ_{(i,j)∈E} (1 − x_i x_j)/2, subject to x_i ∈ {−1, 1} • SDP relaxation: maximize (1/|E|) Σ_{(i,j)∈E} (1 − ⟨v_i, v_j⟩)/2, subject to ‖v_i‖ = 1 • SDP ≥ MaxCut • In the example on the slide: MaxCut = 2/3, SDP = 3/4 > MaxCut

  17. The "rounding" algorithm • Lemma. We can (in poly time) get a cut of value 1 − O(√ε) when SDP ≥ 1 − ε • Algorithm. Choose a random hyperplane through the origin; the hyperplane divides the vertices into two parts. • Analysis.

  18. The "rounding" algorithm (cont'd) • Lemma. We can (in poly time) get a cut of value 1 − O(√ε) when SDP ≥ 1 − ε • Algorithm. Choose a random hyperplane through the origin; the hyperplane divides the vertices into two parts. • Analysis. • SDP ≥ 1 − ε implies that for most edges (i, j), the SDP contribution (1 − ⟨v_i, v_j⟩)/2 is large • Claim. If (1 − ⟨v_i, v_j⟩)/2 ≥ 1 − δ, then the hyperplane separates v_i, v_j with probability ≥ 1 − O(√δ) • Therefore, the random hyperplane cuts many edges (in expectation)

  19. The "rounding" algorithm (cont'd) • Claim. If (1 − ⟨v_i, v_j⟩)/2 ≥ 1 − δ, then the hyperplane separates v_i, v_j with probability ≥ 1 − O(√δ) • Proof. Pr[v_i, v_j separated by the hyperplane] = arccos(⟨v_i, v_j⟩)/π, so Pr[v_i, v_j not separated by the hyperplane] = 1 − arccos(⟨v_i, v_j⟩)/π = O(√δ)
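The hyperplane-rounding step (not the SDP solve itself) can be sketched as follows; the two hand-picked unit vectors at 120° and the trial count are assumptions for illustration, and the empirical separation rate should match arccos(⟨u, w⟩)/π = 2/3:

```python
import math, random

def hyperplane_round(vectors, rng):
    """Pick a random hyperplane through the origin (via a Gaussian normal g)
    and assign each vector the side sign(<g, v>)."""
    d = len(vectors[0])
    g = [rng.gauss(0, 1) for _ in range(d)]
    return [1 if sum(gk * vk for gk, vk in zip(g, v)) >= 0 else -1 for v in vectors]

u, w = (1.0, 0.0), (-0.5, math.sqrt(3) / 2)  # unit vectors at 120 degrees
rng = random.Random(1)
trials = [hyperplane_round([u, w], rng) for _ in range(20000)]
sep = sum(1 for s in trials if s[0] != s[1]) / len(trials)
print(round(sep, 2))  # empirically close to arccos(-1/2)/pi = 2/3
```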

  20. Known algorithms for MaxBisection • The standard SDP (used by all the previous algorithms): the GW relaxation plus the bisection condition Σ_i v_i = 0 • Gives non-trivial approximation guarantees • But does not help find almost-perfect MaxBisections

  21. Known algorithms for MaxBisection (cont'd) • The standard SDP (used by all the previous algorithms) • The "integrality gap": instances with OPT < 0.9 but SDP = 1

  22. Known algorithms for MaxBisection (cont'd) • The standard SDP (used by all the previous algorithms) • The "integrality gap": instances where OPT < 0.9 but SDP = 1 • Why is this bad news for the SDP? • There are instances with OPT > 1 − ε, SDP > 1 − ε • and instances with OPT < 0.9, SDP > 1 − ε • So the SDP cannot tell whether an instance is almost satisfiable (OPT > 1 − ε) or not.

  23. Our approach

  24. Theorem. There is a (1 − ε) vs. (1 − g(ε)) approximation algorithm for MaxBisection.

  25. A simple fact • Fact. A (1/2 − ε, 1/2 + ε)-balanced cut of value c gives a bisection of value c − O(ε). • Proof. Get the bisection by moving an ε fraction of random vertices from the large side to the small side. • Fraction of cut edges affected: at most O(ε) in expectation • So we only need to find almost-bisections.
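The balancing step in the fact above can be sketched directly (a minimal illustration; the 60/40 split and function name are assumptions, not from the slides):

```python
import random

def balance_to_bisection(A, B, rng):
    """Move randomly chosen vertices from the larger side to the smaller side
    until both sides are equal; each moved vertex only affects its own incident
    cut edges, which is why the cut value drops by little in expectation."""
    A, B = set(A), set(B)
    big, small = (A, B) if len(A) >= len(B) else (B, A)
    excess = (len(big) - len(small)) // 2
    for v in rng.sample(sorted(big), excess):
        big.remove(v)
        small.add(v)
    return A, B

A, B = balance_to_bisection(range(0, 60), range(60, 100), random.Random(0))
print(len(A), len(B))  # 50 50
```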

  26. Almost perfect MaxCuts on expanders • λ-expander: for each S ⊆ V with vol(S) ≤ vol(V)/2, we have |edges(S, V − S)| ≥ λ · vol(S), where vol(S) is the total degree of the vertices in S

  27. Almost perfect MaxCuts on expanders (cont'd) • λ-expander: for each S ⊆ V with vol(S) ≤ vol(V)/2, we have |edges(S, V − S)| ≥ λ · vol(S) • Key Observation. The (volume of the) difference between two almost-perfect cuts on a λ-expander is small. • Proof. (figure: two cuts (A, B) and (C, D) with symmetric-difference pieces X and Y)

  28. Almost perfect MaxCuts on expanders (cont'd) • λ-expander: for each S ⊆ V with vol(S) ≤ vol(V)/2, we have |edges(S, V − S)| ≥ λ · vol(S) • Key Observation. The (volume of the) difference between two almost-perfect cuts on a λ-expander is small. • So approximating almost perfect MaxBisection on expanders is easy: • just run the GW algorithm to find the MaxCut.

  29. The algorithm (sketch) • Step 1: Decompose the graph into expanders • Discard all the inter-expander edges • Step 2: Approximate OPT's behavior on each expander by finding a MaxCut (GW) • Discard all the uncut edges • Step 3: Combine the cuts on the expanders • Take one side from each cut to get an almost-bisection (subset sum)

  30. Expander decomposition • Cheeger's inequality. One can (efficiently) find a cut of sparsity O(√λ) if the graph is not a λ-expander. • Corollary. A graph can be (efficiently) decomposed into λ-expanders by removing a small fraction of edges. • Proof. • If the graph is not an expander, divide it into smaller parts by a sparse cut (Cheeger's inequality). • Process the parts recursively.
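The recursive decomposition in the proof above can be sketched as follows. The subroutine `sparse_cut_or_none` is a hypothetical placeholder for the Cheeger-based sparse-cut finder (which would use the second eigenvector of the Laplacian); here a toy oracle stands in for it so the sketch runs:

```python
def expander_decompose(vertices, sparse_cut_or_none):
    """Recursively split off sparse cuts until every remaining piece is an
    expander (i.e., the oracle reports no sparse cut)."""
    cut = sparse_cut_or_none(vertices)
    if cut is None:
        return [vertices]          # this piece is already an expander
    S, T = cut                     # discard the crossing edges and recurse
    return (expander_decompose(S, sparse_cut_or_none)
            + expander_decompose(T, sparse_cut_or_none))

# Toy oracle (an assumption, not Cheeger): split any piece of more than 3 vertices.
def toy_oracle(vs):
    if len(vs) <= 3:
        return None
    return vs[: len(vs) // 2], vs[len(vs) // 2:]

pieces = expander_decompose(list(range(10)), toy_oracle)
print(pieces)  # [[0, 1], [2, 3, 4], [5, 6], [7, 8, 9]]
```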

  31. The algorithm • Decompose the graph into λ-expanders. • Lose a small fraction of edges. • Apply the GW algorithm on each expander to approximate OPT. • OPT(MaxBisection) induces a near-perfect cut on each expander • GW finds near-perfect cuts on these expanders • only slightly different from the behavior of OPT • Lose a small fraction of edges. • Combine the cuts on the expanders (subset sum). • an almost-balanced cut of high value • a bisection of high value
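The combination step can be sketched as a subset-sum style dynamic program: each piece i contributes a cut with side sizes (a_i, b_i), and we choose which side goes "left" so that the two global sides are as balanced as possible. The item sizes below are the ones from the deck's subset-sum example (their total, 1057, is odd, so the best achievable imbalance is one vertex):

```python
def combine_cuts(pieces):
    """Choose an orientation for each piece's cut to balance the two sides."""
    n = sum(a + b for a, b in pieces)
    reachable = {0: []}                      # left-side size -> orientation choice
    for a, b in pieces:
        nxt = {}
        for left, choice in reachable.items():
            for side in (a, b):              # put this side of the cut on the left
                nxt.setdefault(left + side, choice + [side])
        reachable = nxt
    best = min(reachable, key=lambda left: abs(2 * left - n))
    return best, reachable[best]

items = [(101, 304), (397, 201), (8, 0), (3, 5), (8, 0),
         (6, 2), (6, 0), (5, 1), (5, 0), (3, 2)]
left, choices = combine_cuts(items)
print(left)  # one side of a near-perfect bisection: off by at most one vertex
```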

  32. Proved (the short story): • Theorem. There is a (1 − ε) vs. (1 − g(ε)) approximation algorithm for MaxBisection. • Will prove: • Theorem. There is a (1 − ε) vs. (1 − g'(ε)) approximation algorithm for MaxBisection, with an improved guarantee g'(ε).

  33. Eliminating the factor • Recall. Only need to find almost bisections ( -close to a bisection) • Observation. Subset sum is "flexible with small items" • Making small items more biased does not change the solution too much. (101, 304) (397, 201) (8, 0) (3, 5) (8, 0) (6, 2) (6, 0) (5, 1) (5, 0) (3, 2) sum (515, 515)

  34. Eliminating the factor • Recall. Only need to find almost bisections ( -close to a bisection) • Observation. Subset sum is "flexible with small items" • Making small items more biased does not change the solution too much. (101, 304) (397, 201) (8, 0) (8, 0) (6, 0) (5, 0) sum (498, 505)

  35. Eliminating the factor • Recall. Only need to find almost bisections ( -close to a bisection) • Observation. Subset sum is "flexible with small items" • Making small items more biased does not change the solution too much. (101, 304) (397, 201) (8, 0) (8, 0) (6, 0) (5, 0) sum (506, 505)

  36. Eliminating the factor • Recall. Only need to find almost bisections ( -close to a bisection) • Observation. Subset sum is "flexible with small items" • Making small items more biased does not change the solution too much. (101, 304) (397, 201) (8, 0) (0, 8) (6, 0) (5, 0) sum (506, 513)

  37. Eliminating the factor • Recall. Only need to find almost bisections ( -close to a bisection) • Observation. Subset sum is "flexible with small items" • Making small items more biased does not change the solution too much. (101, 304) (397, 201) (8, 0) (0, 8) (6, 0) (5, 0) sum (512, 513)

  38. Eliminating the factor • Recall. Only need to find almost bisections ( -close to a bisection) • Observation. Subset sum is "flexible with small items" • Making small items more biased does not change the solution too much. (101, 304) (397, 201) (8, 0) (0, 8) (6, 0) (5, 0) sum (517, 513)

  39. Eliminating the factor • Recall. Only need to find almost bisections ( -close to a bisection) • Observation. Subset sum is "flexible with small items" • Making small items more biased does not change the solution too much. • However, making small items more balanced might be a bad idea. (200, 0) (0, 2) (0, 2) 100 copies (0, 2) sum (200, 200)

  40. Eliminating the factor • Recall. Only need to find almost bisections ( -close to a bisection) • Observation. Subset sum is "flexible with small items" • Making small items more biased does not change the solution too much. • However, making small items more balanced might be a bad idea. (200, 0) (1, 1) (1, 1) 100 copies (1, 1) sum (300, 100)

  41. Eliminating the factor (cont'd) • Idea. Terminate early in the decomposition process. Decompose the graph into • λ-expanders (large items), or • small subgraphs (small items). • Corollary. Only a small fraction of edges needs to be discarded. • Lemma. We can find an almost-bisection if the MaxCuts we get for the small sets are more biased than those in OPT.

  42. MaxBisection → Biased MaxCut: finding a biased MaxCut • Goal: find a cut that is as biased as OPT and as good as OPT (in terms of cut value). • Lemma. Given G=(V,E), if there exists a cut (X, Y) of value 1 − ε, then one can find a cut (A, B) of value 1 − g(ε), such that |A|/|V| ≥ |X|/|V| − o(1).

  43. The algorithm • Decompose the graph into λ-expanders or small parts. • Lose a small fraction of edges. • Apply the GW algorithm on each expander to approximate OPT. • Lose a small fraction of edges; slightly different from OPT • Find biased MaxCuts in the small parts. • Lose a small fraction of edges; at most slightly less biased than OPT • Combine the cuts on the expanders and small parts (subset sum). • an almost-balanced cut of high value • a bisection of high value

  44. Finding a biased MaxCut -- a simpler task • Lemma. Given G=(V,E), if there exists a cut (X, Y) of value 1 − ε, then one can find a cut (A, B) of value 1 − g(ε), such that |A|/|V| ≥ |X|/|V| − o(1). • SDP. maximize the bias, subject to a cut-value constraint • Claim. SDP ≥ |X|/|V|

  45. Rounding algorithm (sketch) • Goal: given the SDP solution, find a cut (A, B) satisfying the value and bias guarantees of the Lemma • For most edges (i, j), v_i and v_j are almost opposite to each other: v_i ≈ −v_j • Indeed, for most edges the SDP contribution (1 − ⟨v_i, v_j⟩)/2 is close to 1

  46. Rounding algorithm (sketch) (cont'd) • For most edges (i, j): v_i ≈ −v_j • Project all vectors onto v_0 • Divide the v_0 axis into intervals of small length • For most edges, the two incident vertices fall into opposite intervals (good edges) • Discard all bad edges • (figure: the v_0 axis divided into intervals I(−4), ..., I(−1), I(1), ..., I(4))

  47. Rounding algorithm (sketch) (cont'd) • Let the cut (A, B) be as follows: • for each pair of intervals I(k) and I(−k), let A include the one with more vertices, and B include the other • (A, B) cuts all good edges
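The interval step above can be sketched as follows. This is a simplification under stated assumptions: interval indices come from `math.floor(p / eps)` (so the mirror of bucket m is bucket −1−m, rather than the slides' I(k)/I(−k) labeling), and the sample projections are made-up toy data, not from the slides:

```python
import math

def interval_round(projections, eps):
    """Bucket each projection onto v0 into an interval of width eps, then for
    each mirrored pair of intervals put the larger bucket into A and the
    smaller into B, so vertices in opposite intervals end up on opposite sides."""
    buckets = {}
    for i, p in enumerate(projections):
        buckets.setdefault(math.floor(p / eps), []).append(i)
    A, B, done = [], [], set()
    for m in list(buckets):
        if m in done:
            continue
        mirror = -1 - m                       # the interval reflected about 0
        done.update({m, mirror})
        left, right = buckets.get(m, []), buckets.get(mirror, [])
        big, small = (left, right) if len(left) >= len(right) else (right, left)
        A.extend(big)
        B.extend(small)
    return set(A), set(B)

# Toy projections of five vectors onto v0; pairs (0,1) and (2,3) are near-opposite.
A, B = interval_round([0.9, -0.9, 0.6, -0.6, 0.95], 0.25)
print(sorted(A), sorted(B))  # the near-opposite pairs land on opposite sides
```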

  48. Rounding algorithm (sketch) (cont'd) • Let the cut (A, B) be as follows: • for each pair of intervals I(k) and I(−k), let A include the one with more vertices, and B include the other • (the slide then bounds the projections ⟨v_0, v_i⟩ for each i in I(k) and each i in I(−k) to control the bias of (A, B))

  49. Finding a biased MaxCut • Lemma. Given G=(V,E), if there exists a cut (X, Y) of value 1 − ε, then one can find a cut (A, B) of value 1 − g(ε), such that |A|/|V| ≥ |X|/|V| − o(1). • SDP. maximize the bias, subject to the cut-value constraint and the ℓ₂²-triangle inequalities

  50. Future directions • A better (1 − ε) vs. (1 − g(ε)) tradeoff? • "Global conditions" for other CSPs. • Balanced Unique Games?
