
Weighted Matching-Algorithms, Hamiltonian Cycles and TSP



  1. Weighted Matching-Algorithms, Hamiltonian Cycles and TSP (Graphs & Algorithms, Lecture 6)

  2. Weighted bipartite matching
  • Given: K_{n,n} (the complete bipartite graph on 2n vertices) and an n×n weight matrix with entries w_{i,j} ≥ 0.
  • Want: a perfect matching M maximizing the total weight.
  • Weighted cover (u, v): a choice of vertex labels u = u_1, …, u_n and v = v_1, …, v_n with u, v ∈ R^n such that u_i + v_j ≥ w_{i,j} for all 1 ≤ i, j ≤ n.
  • Cost: c(u, v) := Σ_i (u_i + v_i).

  3. Duality: max. weighted matching and min. weighted vertex cover
  • For any matching M and weighted cover (u, v) of a weighted bipartite graph, we have w(M) ≤ c(u, v).
  • Moreover, w(M) = c(u, v) if and only if M consists of edges {i, j} such that u_i + v_j = w_{i,j}. In this case, M and (u, v) are both optimal.
  • Equality subgraph G_{u,v} of a fixed cover (u, v): the spanning subgraph of K_{n,n} with {i, j} ∈ E(G_{u,v}) ⇔ u_i + v_j = w_{i,j}.
  • Idea: a perfect matching of G_{u,v} corresponds to a maximum weighted matching of K_{n,n}.
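The duality bound w(M) ≤ c(u, v) is easy to check directly on a small instance. A minimal Python sketch (the 2×2 weight matrix, cover, and matching below are illustrative, not from the slides):

```python
def is_cover(w, u, v):
    """Weighted-cover condition: u_i + v_j >= w_{i,j} for all i, j."""
    n = len(w)
    return all(u[i] + v[j] >= w[i][j] for i in range(n) for j in range(n))

def cover_cost(u, v):
    return sum(u) + sum(v)

w = [[3, 1], [1, 3]]           # weights on K_{2,2}
u, v = [3, 3], [0, 0]          # a feasible cover: row maxima and zeros
M = [(0, 0), (1, 1)]           # a perfect matching
wM = sum(w[i][j] for i, j in M)

# w(M) <= c(u, v) always holds; equality certifies that both are optimal.
assert is_cover(w, u, v)
assert wM == cover_cost(u, v) == 6
```

Here every matching edge satisfies u_i + v_j = w_{i,j}, so by the criterion on this slide both M and (u, v) are optimal.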

  4. Hungarian Algorithm (Kuhn 1955, Munkres 1957)
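The slide's pseudocode is not captured in the transcript. As a hedged substitute, here is a standard O(n³) formulation of the Kuhn–Munkres method for the minimization version of the assignment problem; a maximum weight matching, as in the slides, is obtained by negating the weights. This is a textbook formulation, not necessarily the lecture's exact pseudocode.

```python
def hungarian(cost):
    """O(n^3) Hungarian method for min-cost assignment on an n x n matrix.

    Returns match with match[i] = column assigned to row i.
    """
    n = len(cost)
    INF = float("inf")
    # Potentials u, v and matching p: p[j] = row matched to column j (1-indexed).
    u = [0] * (n + 1)
    v = [0] * (n + 1)
    p = [0] * (n + 1)
    way = [0] * (n + 1)
    for i in range(1, n + 1):
        p[0] = i                      # row i enters via virtual column 0
        j0 = 0
        minv = [INF] * (n + 1)        # smallest reduced cost seen per column
        used = [False] * (n + 1)
        while True:                   # Dijkstra-like search for an augmenting path
            used[j0] = True
            i0, delta, j1 = p[j0], INF, 0
            for j in range(1, n + 1):
                if not used[j]:
                    cur = cost[i0 - 1][j - 1] - u[i0] - v[j]
                    if cur < minv[j]:
                        minv[j], way[j] = cur, j0
                    if minv[j] < delta:
                        delta, j1 = minv[j], j
            for j in range(n + 1):    # adjust potentials, keeping reduced costs >= 0
                if used[j]:
                    u[p[j]] += delta
                    v[j] -= delta
                else:
                    minv[j] -= delta
            j0 = j1
            if p[j0] == 0:
                break
        while j0:                     # augment along the alternating path found
            j1 = way[j0]
            p[j0] = p[j1]
            j0 = j1
    match = [0] * n
    for j in range(1, n + 1):
        match[p[j] - 1] = j - 1
    return match

# Maximum weight matching on K_{3,3}: negate the weights and minimize.
w = [[7, 5, 11], [5, 4, 1], [9, 3, 2]]
m = hungarian([[-x for x in row] for row in w])
assert sum(w[i][m[i]] for i in range(3)) == 24   # best of all 6 assignments
```

The potentials u, v are exactly the weighted cover of slide 2 (up to sign), which is what the correctness proof on the next slide exploits.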

  5. Correctness of the Hungarian Method
  Theorem. The Hungarian Algorithm finds a maximum weight matching and a minimum cost cover.
  Proof
  • The statement is true if the algorithm terminates.
  • Loop invariant: consider (u, v) before and (u', v') after the while loop.
  • (u, v) is a cover of G ⇒ (u', v') is a cover of G.
  • c(u, v) ≥ c(u', v') + ε ≥ w(M).
  • For rational weights, ε is bounded from below by an absolute constant, so the cover cost can decrease only finitely often.
  • In the presence of irrational weights, a more careful selection of the minimum vertex cover is necessary.

  6. Hamiltonian Cycles
  • A graph on n vertices is Hamiltonian if it contains a simple cycle of length n.
  • The Hamiltonian-cycle problem is NP-complete (by reduction from the vertex-cover problem).
  • The naïve algorithm has running time Ω(n!) = ω(2^n).
  • What are sufficient conditions for Hamiltonian graphs?
  Theorem (Dirac 1952). Every graph with n ≥ 3 vertices and minimum degree at least n/2 has a Hamiltonian cycle.

  7. Proof of Dirac's Theorem (Pósa)
  [Figure: a Hamiltonian path x_1, x_2, …, x_{i−1}, x_i, …, x_n]
  • Suppose G is a graph with δ(G) ≥ n/2 that contains no Hamiltonian cycle.
  • Insert as many edges into G as possible ⇒ an embedding of G into a saturated graph G' that contains a Hamiltonian path x_1, …, x_n.
  • The neighbourhood N(x_1) yields n/2 forbidden neighbours for x_n in {x_1, …, x_{n−2}}.
  • Since x_n cannot connect to itself, there is not enough space for all of its n/2 neighbours.

  8. Weaker degree conditions
  • Let G be a graph on n vertices with degrees d_1 ≤ … ≤ d_n.
  • (d_1, …, d_n) is called the degree sequence of G.
  • An integer sequence (a_1, …, a_n) is Hamiltonian if every graph on n vertices with a pointwise greater degree sequence (d_1, …, d_n) is Hamiltonian.
  Theorem (Chvátal 1972). An integer sequence (a_1, …, a_n) with 0 ≤ a_1 ≤ … ≤ a_n < n and n ≥ 3 is Hamiltonian iff, for all i < n/2, we have: a_i ≤ i ⇒ a_{n−i} ≥ n − i.
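Chvátal's condition is purely arithmetic, so it can be tested mechanically. A small sketch (the 1-indexed condition translated to 0-indexed Python; the two example sequences are illustrative):

```python
def chvatal_hamiltonian(a):
    """True iff the nondecreasing sequence a (0 <= a_1 <= ... <= a_n < n, n >= 3)
    satisfies Chvátal's condition: for all i < n/2, a_i <= i implies a_{n-i} >= n - i."""
    n = len(a)
    for i in range(1, n):          # i is the 1-indexed position
        if 2 * i >= n:             # only i < n/2 matters
            break
        if a[i - 1] <= i and a[n - i - 1] < n - i:
            return False
    return True

# A Dirac-type sequence on 5 vertices (minimum degree >= n/2) is Hamiltonian-forcing.
assert chvatal_hamiltonian([3, 3, 3, 3, 3])
# (2, 2, 2, 2, 2) is not: the bowtie graph majorizes it pointwise
# (degrees 2, 2, 2, 2, 4) yet has a cut vertex, hence no Hamiltonian cycle.
assert not chvatal_hamiltonian([2, 2, 2, 2, 2])
```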

  9. Traveling Salesman Problem (TSP)
  • Given n cities and costs c(i, j) ≥ 0 for going from city i to city j (and vice versa).
  • Find a Hamiltonian cycle H* of minimum cost c(H*) = Σ_{e ∈ E(H*)} c(e).
  • Deciding the existence of a Hamiltonian cycle is a special case.
  • Brute force: n! = Θ(n^{n + 1/2} e^{−n}) time complexity (by Stirling's formula).
  • Better solutions:
  • a polynomial-time approximation algorithm with approximation ratio 2 for TSP with the triangle inequality
  • an optimal solution with running time O(n² 2^n)

  10. Approximation algorithms
  • Consider a minimization problem. An algorithm ALG achieves approximation ratio ρ(n) ≥ 1 if, for every problem instance P of size n, we have ALG(P) / OPT(P) ≤ ρ(n), where OPT(P) is the optimal value of P.
  • An approximation scheme takes one additional parameter ε > 0 and achieves approximation ratio (1 + ε) on every problem instance.
  • A polynomial-time approximation scheme (PTAS) runs in polynomial time for every fixed ε > 0.
  • The running time of a fully polynomial-time approximation scheme (FPTAS) is also polynomial in ε^{−1}.

  11. 2-approximation algorithm for TSP
  • Running time is dominated by the minimum-spanning-tree computation:
  • Prim's algorithm with Fibonacci heaps: O(E + V log V)
  • Kruskal's algorithm: O(E log V)
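The algorithm itself is not captured in the transcript; the running-time bullets refer to the standard construction for metric TSP: compute a minimum spanning tree, then output its preorder walk as the tour. A minimal sketch using Prim's algorithm on an adjacency matrix (binary heap rather than Fibonacci heap, which is fine on a complete graph; the 4-city Manhattan-distance instance is illustrative):

```python
import heapq

def mst_preorder_tour(c):
    """MST via Prim, then a preorder walk of the tree: the classic 2-approximation
    for TSP when the cost matrix c satisfies the triangle inequality."""
    n = len(c)
    parent = [0] * n
    in_tree = [False] * n
    dist = [float("inf")] * n
    dist[0] = 0
    heap = [(0, 0)]
    children = [[] for _ in range(n)]
    while heap:                       # lazy Prim on the complete graph
        d, x = heapq.heappop(heap)
        if in_tree[x]:
            continue
        in_tree[x] = True
        if x != 0:
            children[parent[x]].append(x)
        for y in range(n):
            if not in_tree[y] and c[x][y] < dist[y]:
                dist[y] = c[x][y]
                parent[y] = x
                heapq.heappush(heap, (c[x][y], y))
    tour, stack = [], [0]             # iterative preorder walk of the MST
    while stack:
        x = stack.pop()
        tour.append(x)
        stack.extend(reversed(children[x]))
    return tour

# Four cities at (0,0), (0,2), (3,2), (3,0) under Manhattan distance (a metric).
c = [[0, 2, 5, 3], [2, 0, 3, 5], [5, 3, 0, 2], [3, 5, 2, 0]]
tour = mst_preorder_tour(c)
length = sum(c[tour[i]][tour[(i + 1) % 4]] for i in range(4))
assert sorted(tour) == [0, 1, 2, 3]   # a valid tour
assert length <= 2 * 10               # within twice the optimum (OPT = 10 here)
```

The guarantee comes from w(MST) ≤ c(H*) together with the triangle inequality, which lets the preorder walk shortcut repeated tree edges without increasing the cost.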

  12. An optimal TSP algorithm
  • For each S ⊆ {2, …, n} and k ∈ S, define P(S, k) := minimum cost of a path on {1} ∪ S starting in 1, ending in k, and visiting every vertex of S exactly once.
  • Let V = {1, …, n} and positive costs c(i, j) be given. Then TSP = min{ P(V \ {1}, k) + c(k, 1) : k ∈ V \ {1} }.
  • Recursive computation of P(S, k):
  P({k}, k) = c(1, k)
  P(S, k) = min{ P(S \ {k}, j) + c(j, k) : j ∈ S \ {k} }
  • Compute P(S, k) bottom-up (dynamic programming).
  • The number of distinct P(S, k) values is (n − 1) 2^{n − 2}.
  • Each value takes at most n operations, giving O(n² 2^n) in total.
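The recurrence above translates directly into the O(n² 2ⁿ) dynamic program (Held–Karp). A minimal sketch with 0-indexed cities, city 0 playing the role of city 1; the 4-city instance is illustrative:

```python
from itertools import combinations

def held_karp(c):
    """O(n^2 * 2^n) dynamic program for TSP on a cost matrix c."""
    n = len(c)
    P = {}                                   # P[(S, k)]: min cost path 0 -> ... -> k visiting S
    for k in range(1, n):
        P[(frozenset([k]), k)] = c[0][k]     # base case P({k}, k) = c(0, k)
    for size in range(2, n):
        for S in combinations(range(1, n), size):
            fs = frozenset(S)
            for k in S:                      # minimize over the predecessor j of k
                P[(fs, k)] = min(P[(fs - {k}, j)] + c[j][k] for j in S if j != k)
    full = frozenset(range(1, n))            # close the cycle back to city 0
    return min(P[(full, k)] + c[k][0] for k in range(1, n))

# Symmetric 4-city instance; the optimal tour 0-1-3-2-0 has cost 2 + 4 + 8 + 9 = 23.
c = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 8], [10, 4, 8, 0]]
assert held_karp(c) == 23
```

The dictionary holds the (n − 1) 2^{n − 2} values P(S, k) from the slide, and each is computed with at most n additions and comparisons.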

  13. Example

  14. More positive and negative results on TSP
  • Slight modifications yield a 1.5-approximation algorithm for TSP with the Δ-inequality. (Exercise)
  • Arora (1996) gave a PTAS for Euclidean TSP with running time n^{O(1/ε)}.
  • For general TSP, there exists no polynomial-time approximation algorithm with any constant approximation ratio ρ ≥ 1, unless P = NP. (Exercise)
