Approximations for Combinatorial Optimization Moses Charikar based on joint work with Amit Agarwal, Adriana Karagiozova, Konstantin Makarychev and Yury Makarychev
Optimization Problems • Shortest paths • Minimum cost network • Scheduling, Load balancing • Graph partitioning problems • Constraint satisfaction problems
Approximation Algorithms • Many optimization problems are NP-hard • Alternate approach: heuristics with provable guarantees • Guarantee: Alg(I) ≥ α·OPT(I) (maximization); Alg(I) ≤ α·OPT(I) (minimization) • Complexity theory gives bounds on the best approximation ratios possible
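As a concrete illustration of such a guarantee (this example is not from the talk), the classic maximal-matching heuristic for Vertex Cover achieves factor 2; a minimal Python sketch:

```python
def vertex_cover_2approx(edges):
    """Greedily pick a maximal matching; both endpoints of each matched
    edge go into the cover.  The cover is at most twice the optimum,
    since any cover must contain an endpoint of every matched edge."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Triangle plus a pendant edge; the optimum cover has size 2 (e.g. {0, 2}).
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
cover = vertex_cover_2approx(edges)
assert all(u in cover or v in cover for u, v in edges)   # valid cover
assert len(cover) <= 4                                   # within 2 * OPT
```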
Mathematical Programming approaches • Sophisticated tools from convex optimization • e.g. Linear programming: min c · x subject to Ax ≥ b • Can find optimum solution in polynomial time
Relax and Round • Express solution in terms of decision variables, typically {0,1} or {-1,+1} • Feasibility constraints on decision variables • Objective function • Relax variables to get mathematical program • Solve program optimally • Round fractional solution
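The relax-and-round template can be sketched end to end for Vertex Cover. Here the fractional LP optimum for a triangle (x_v = 1/2 for every vertex) is written down by hand rather than produced by an LP solver, and thresholding at 1/2 gives the standard rounding:

```python
# Sketch of relax-and-round for Vertex Cover on a triangle.
edges = [(0, 1), (1, 2), (0, 2)]
x = {0: 0.5, 1: 0.5, 2: 0.5}     # fractional LP optimum, cost 1.5

# Rounding: keep every vertex with x_v >= 1/2.  Each edge constraint
# x_u + x_v >= 1 forces some endpoint up to 1/2, so the rounded set is
# a valid cover of cost at most 2 * LP <= 2 * OPT.
cover = {v for v, val in x.items() if val >= 0.5}
assert all(u in cover or v in cover for u, v in edges)
```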
LP is a widely used tool in designing approximation algorithms • Interpret variable values as probabilities, distances, etc. • Integer solutions sit inside the larger set of fractional solutions
Distance function from LPs • [Leighton, Rao ’88] Distance function d • Triangle inequality: d(a, b) + d(b, c) ≥ d(a, c) • (rules out, e.g., d(a, b) = d(b, c) = 0 with d(a, c) = 1)
Quadratic programming • Linear expressions in products xi xj? • NP-hard • Workaround: Mij = xi xj • What can we say about M? • M is positive semidefinite (psd) • Can add the psd constraint • Semidefinite programming • Can solve to any desired accuracy
Positive Semidefinite Matrices • M is psd iff • xT M x ≥ 0 for all x • All eigenvalues of M are non-negative • M = VT V (Cholesky decomposition) • Mij = vi · vj
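A short numpy sketch (with made-up vectors) of the equivalent psd certificates, and of recovering vectors vi with Mij = vi · vj from M alone:

```python
import numpy as np

# A small psd matrix: the Gram matrix of three 2-dimensional vectors.
V = np.array([[1.0, 0.0],
              [0.6, 0.8],
              [0.0, 1.0]]).T           # columns are v_1, v_2, v_3
M = V.T @ V                             # M_ij = v_i . v_j, so M is psd

# Equivalent certificates of psd-ness:
assert np.all(np.linalg.eigvalsh(M) >= -1e-9)    # nonnegative eigenvalues
x = np.array([1.0, -2.0, 0.5])
assert x @ M @ x >= -1e-9                        # x^T M x >= 0

# Recover vectors u_i with M_ij = u_i . u_j from M alone.  We use an
# eigendecomposition; Cholesky would need M strictly positive definite.
w, Q = np.linalg.eigh(M)
U = (Q * np.sqrt(np.clip(w, 0, None))).T
assert np.allclose(U.T @ U, M)
```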
Vector Programming • Variables are vectors • Linear constraints on dot products • Linear objective on dot products
Relaxation for Max Cut • max Σ(i,j)∈E ||vi − vj||² / 4 • subject to vi · vi = 1 for all i
SDP solution • Geometric embedding of vertices • Hyperplane rounding
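Hyperplane rounding itself is only a few lines. The sketch below uses a hand-made unit-vector embedding of the 5-cycle in place of an actual SDP solver's output:

```python
import numpy as np

rng = np.random.default_rng(0)

def hyperplane_round(vectors):
    """Pick a random hyperplane through the origin and cut according
    to the side each unit vector falls on."""
    r = rng.standard_normal(vectors.shape[1])
    return np.sign(vectors @ r)

def cut_value(edges, signs):
    return sum(1 for i, j in edges if signs[i] != signs[j])

# Toy SDP-style solution for the 5-cycle: unit vectors on a circle with
# adjacent vertices 144 degrees apart (the optimal C_5 embedding).
n = 5
angles = 2 * np.pi * np.arange(n) * 2 / n
vecs = np.c_[np.cos(angles), np.sin(angles)]
edges = [(i, (i + 1) % n) for i in range(n)]

signs = hyperplane_round(vecs)
print(cut_value(edges, signs))   # at most 4 = OPT for C_5
```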
Can we do better? • Better analysis? Better rounding algorithm? • Add constraints to the relaxation • ℓ₂²-triangle-inequality constraints: • (vi − vj)² + (vj − vk)² ≥ (vi − vk)²
Sparsest Cut (uniform demands) • min over cuts (S, T) of δ(S, T) / (|S| · |T|)
Cut Metric • δ(i, j) = 1 if i and j are on opposite sides of the cut (S, T), and 0 otherwise • Use relaxations of cut metrics
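For tiny instances the sparsest cut can be found by brute force directly from the definition δ(S, T) / (|S| · |T|); a small illustrative sketch:

```python
from itertools import combinations

def sparsity(n, edges, S):
    """delta(S, T) / (|S| |T|) for the cut (S, T = complement of S)."""
    S = set(S)
    crossing = sum(1 for u, v in edges if (u in S) != (v in S))
    return crossing / (len(S) * (n - len(S)))

def sparsest_cut(n, edges):
    """Exhaustive search over all cuts -- only feasible for tiny n."""
    return min(
        (sparsity(n, edges, S), S)
        for r in range(1, n)
        for S in combinations(range(n), r)
    )

# A 4-cycle: cutting two adjacent vertices off crosses 2 edges.
n, edges = 4, [(0, 1), (1, 2), (2, 3), (3, 0)]
val, S = sparsest_cut(n, edges)
print(val, S)   # → 0.5 (0, 1)
```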
Approximate cut metrics • How well can relaxed metrics be mapped into cut metrics? • Metrics from LPs: log n distortion gives log n approximation [Bourgain ’85] [LLR ’95] [AR ’95] • SDP with ℓ₂²-triangle inequalities? • Geometry of ℓ₂² metrics • Goemans-Linial conjecture: ℓ₂² metrics embed into ℓ₁ with constant distortion
Arora-Rao-Vazirani • [ARV ’04] • Breakthrough for SDPs with ℓ₂²-triangle inequalities • O(√log n) approximation for balanced cut and sparsest cut
Unique Games • Example system: • x2 + x14 ≡ 13 (mod 17) • x16 + x32 ≡ 4 (mod 17) • ⋮ • x53 + x19 ≡ 9 (mod 17) • Linear equations mod p • 2 variables per equation • Maximize the number of satisfied constraints • In every constraint, for every value of one variable, a unique value of the other variable satisfies the constraint • If 99% of equations are satisfiable, can we satisfy 1% of them?
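A toy unique game (with hypothetical constraints, not the ones on the slide) can be checked by brute force; note how each constraint xi + xj ≡ c (mod p) determines one variable from the other, which is the uniqueness property:

```python
from itertools import product

# Tiny unique game: equations x_i + x_j = c (mod p), two variables each.
p = 17
# (i, j, c) triples; the last constraint conflicts with the first,
# so not all four can be satisfied simultaneously.
constraints = [(0, 1, 13), (1, 2, 16), (0, 2, 9), (0, 1, 5)]

def satisfied(assignment):
    return sum((assignment[i] + assignment[j]) % p == c
               for i, j, c in constraints)

# Exhaustive search over all p^3 assignments -- for illustration only;
# the optimization problem is hard in general.
best = max(satisfied(a) for a in product(range(p), repeat=3))
print(best, "of", len(constraints))   # → 3 of 4
```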
Unique Games Conjecture • [Khot ’02] Given a Unique Games instance where a (1 − ε)-fraction of constraints is satisfiable, it is NP-hard to satisfy even a δ-fraction of all constraints (for every constant positive ε and δ, and sufficiently large domain size k)
Implications of UGC • 2 is best possible for Vertex Cover [KR ’03] • 0.878 is best possible for Max Cut [KKMO ’04] [MOO ’05] • Superconstant ω(1) hardness for sparsest cut and for min 2CNF deletion [CKKRS ’05] [KV ’05]
Algorithms for Unique Games • Domain size k, OPT = 1 − ε • A random solution satisfies a 1/k fraction • Non-trivial results only for ε = 1/poly(k) [AEH ’01] [Khot ’02] [Trevisan ’05] [GT ’06]
New results for Unique Games • [CMM ’05] • Given an instance where a (1 − ε)-fraction of constraints is satisfiable, we satisfy a 1 − O(√(ε log k)) fraction • We can also satisfy a k^(−ε/(2−ε)) fraction
New results for Unique Games • Algorithms cover the entire range of ε: the 1 − O(√(ε log k)) guarantee is the stronger one when ε = O(1/log k), and the k^(−ε/(2−ε)) guarantee takes over for larger ε
Seems distant from the UGC setting? • Optimal if UGC is true! [KKMO ’05] [MOO ’05] • Any improvement will disprove UGC
Matching upper and lower bounds? • g: Gaussian random vector; u, v unit vectors with u · v = 1 − ε • Given ⟨g, u⟩ > √(log k), what is Pr[⟨g, v⟩ > √((1 − ε) log k)]? • The answer, roughly k^(−2ε), underlies the k^(−ε/(2−ε)) guarantee
If pigs could whistle … • UGC seems to predict limitations of SDPs correctly • UGC-based hardness for many problems matches the best SDP-based approximation • UGC-inspired constructions of gap examples for SDPs • Disproof of the Goemans-Linial conjecture: ℓ₂² metrics do not embed into ℓ₁ with constant distortion [KV ’05]
Is UGC true ? • Points to limitations of current techniques • Focuses attention on common hard core of several important optimization problems • Motivates development of new techniques
Approaches to disproving UGC • Lifting procedures for SDPs • Lovász-Schrijver, Sherali-Adams, Lasserre • Simulate products of k variables • Can we use them?
Moment matrices • SDP solution gives covariance matrix M • There exist normal random variables with covariances Mij • Basis for SDP rounding algorithms • There exist {+1,-1} random variables with covariances Mij/log n • Is something similar possible for higher order moment matrices ?
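The first point can be demonstrated directly: numpy will sample jointly normal random variables with any psd covariance matrix (the matrix below is made up for illustration, not an actual SDP solution):

```python
import numpy as np

rng = np.random.default_rng(1)

# Covariance matrix in the style of an SDP solution: psd, unit diagonal.
M = np.array([[1.0, 0.8, 0.3],
              [0.8, 1.0, 0.5],
              [0.3, 0.5, 1.0]])
assert np.all(np.linalg.eigvalsh(M) >= 0)    # psd check

# Jointly normal random variables with exactly these covariances exist,
# and we can sample them directly:
samples = rng.multivariate_normal(np.zeros(3), M, size=200_000)
print(np.round(np.cov(samples.T), 2))        # empirically close to M
```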
Glimpse of research directions • Whirlwind tour • Quick intro to variety of problems my group has looked at recently • No details !
Tighter local versus global properties of metric spaces Moses Charikar Konstantin Makarychev Yury Makarychev
Local versus Global • Local properties: properties of subsets • Global properties: properties of entire set • What do local properties tell us about global properties ? • Property of interest: embeddability in normed spaces
Motivations • Natural mathematical question • Many questions of similar flavor have been studied • Lift-and-project methods in optimization can guarantee local properties • We need guarantees on the global property
Local versus global distortion • Metric on n points • Property: embeddability into ℓ₁ • Dloc: distortion for embedding any subset of size k • Dglob: distortion for embedding the entire metric • What is the relationship between Dloc and Dglob?
Results • Nearly matching upper and lower bounds on Dglob in terms of Dloc, governed by powers of log(n/k) (with log log n factors)
Aspects of Network Design: Adriana Karagiozova • Multicommodity Buy at Bulk Network DesignCharikar, Karagiozova • Terminal Backup and Simplex MatchingAnshelevich, Karagiozova
Multicommodity Buy at Bulk Network Design • installing capacity in communication networks • Given graph, costs for installing capacity on edges • Pairs of communicating nodes with capacity demands • Goal: Install capacity on network to satisfy all users • Costs exhibit economy of scale
Algorithm Overview • Assume all unit demands • Simple greedy algorithm • Randomly permute the demand pairs • “Inflate” demands so that the kth demand is n/k • Install capacity greedily to route the kth demand • Intuition: • Inflated demands encourage large investments in the network that will be useful later • The first k pairs in the random permutation act like a random sample that suggests where investments should be made
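A simplified, hypothetical rendering of this greedy scheme (unit demands, random order, kth demand inflated to n/k, each routed along the cheapest incremental path under a concave edge-cost function f):

```python
import math, random, heapq

def buy_at_bulk_greedy(n, edges, demands, f=lambda c: math.sqrt(c), seed=0):
    """Illustrative sketch: permute unit demand pairs randomly, inflate
    the k-th pair's demand to n/k, and route each along the path of
    minimum incremental cost f(load + d) - f(load) (f concave)."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    load = {frozenset(e): 0.0 for e in edges}
    order = demands[:]
    random.Random(seed).shuffle(order)
    total = 0.0
    for k, (s, t) in enumerate(order, start=1):
        d = n / k                                  # inflated demand
        dist, prev, pq = {s: 0.0}, {}, [(0.0, s)]  # Dijkstra, incremental costs
        while pq:
            c, u = heapq.heappop(pq)
            if c > dist.get(u, math.inf):
                continue
            for w in adj[u]:
                e = frozenset((u, w))
                nc = c + f(load[e] + d) - f(load[e])
                if nc < dist.get(w, math.inf):
                    dist[w], prev[w] = nc, u
                    heapq.heappush(pq, (nc, w))
        total += dist[t]
        while t != s:                              # install capacity on the path
            e = frozenset((t, prev[t]))
            load[e] += d
            t = prev[t]
    return total

# 4-cycle with the two pairs of opposite corners as unit demands.
cost = buy_at_bulk_greedy(4, [(0, 1), (1, 2), (2, 3), (3, 0)],
                          [(0, 2), (1, 3)])
print(cost > 0)   # → True
```

Because f is concave, later demands prefer edges that already carry load, which is exactly the economy-of-scale behavior the inflation step is meant to exploit.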
Terminal Backup and Simplex Matching • Suggested by Mung Chiang’s group • Given a graph, a set of terminals, and edge costs • Find a minimum cost set of edges so that every terminal is connected to at least one other terminal • Generalization of classical matching • Algorithm: generalization of augmenting paths for matchings
New Results in Approximate Optimization:Amit Agarwal • Directed Cut Problems [Agarwal, Alon, Charikar] • Advantage of Network Coding for Improving Throughput [Agarwal, Charikar]
Part 1: Undirected Cut Problems • G = (V, E): undirected graph • Cost on edges: ce: E → R+ • Min-cut • Single source and sink • 2 source-sink pairs • Polynomial time [Hu]
Multicut: multiple source-sink pairs • k source-sink pairs si–ti, i ∈ [k] • NP-hard • O(log k) approximation [Garg-Vazirani-Yannakakis ’96] • Objective: cheapest E′ ⊆ E so that, in (V, E \ E′), no ti can be reached from its corresponding si • Alternate definition: at least one edge from every si–ti path should be removed
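On tiny graphs the multicut definition can be checked exhaustively; an illustrative sketch (exponential time, for intuition only):

```python
from itertools import combinations

def reachable(edges, s):
    """Vertices reachable from s using the given undirected edges."""
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        for a, b in edges:
            for v in ((b,) if a == u else (a,) if b == u else ()):
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
    return seen

def min_multicut(edges, pairs):
    """Smallest edge set whose removal separates every s_i from t_i."""
    for r in range(len(edges) + 1):
        for cut in combinations(edges, r):
            rest = [e for e in edges if e not in cut]
            if all(t not in reachable(rest, s) for s, t in pairs):
                return list(cut)

# Path 0-1-2-3 with pairs (0,2) and (1,3): one edge, (1,2), cuts both.
cut = min_multicut([(0, 1), (1, 2), (2, 3)], [(0, 2), (1, 3)])
print(cut)   # → [(1, 2)]
```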
Outline • Directed Cut Problems • Advantage of Network Coding for Improving Throughput
Part 2: Definitions • Graph G = (V, E) • Capacities ce: E → R+ • Source m0 and k terminals m1, …, mk • Send b bits of information to all terminals • A Steiner tree connects the source to all terminals • The min-cost Steiner tree is the cheapest such tree • Set of all Steiner trees
Advantage of Network Coding • All edges have capacity 1 • Butterfly network: source m0 sends bits A and B toward sinks m1 and m2; the bottleneck edge carries A xor B • Without coding: rate < 1 of A and B to m1 and m2 • With coding: rate 1 of A and B to m1 and m2 • What is the maximum increase in capacity from network coding? • Our result: the maximum increase is the same as the integrality gap of the Steiner tree relaxation
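The butterfly-network argument can be verified in a few lines: the bottleneck edge carries A xor B, and each sink combines it with the bit it hears directly to decode the other (node names m0, m1, m2 as on the slide):

```python
def butterfly(A, B):
    """Source m0 sends A down its left branch and B down its right
    branch; the unit-capacity middle edge carries A xor B."""
    middle = A ^ B
    # Sink m1 hears A directly plus the coded bit; m2 hears B directly.
    m1 = (A, middle ^ A)           # middle ^ A == B, so m1 gets (A, B)
    m2 = (middle ^ B, B)           # middle ^ B == A, so m2 gets (A, B)
    return m1, m2

# Both sinks recover both bits, for every input combination.
for A in (0, 1):
    for B in (0, 1):
        assert butterfly(A, B) == ((A, B), (A, B))
print("both sinks decode A and B")
```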