
Design and Analysis of Approximation Algorithms

Presentation Transcript


  1. Design and Analysis of Approximation Algorithms Ding-Zhu Du

  2. Text Books • Ding-Zhu Du and Ker-I Ko, Design and Analysis of Approximation Algorithms (Lecture Notes). Chapters 1-8.

  3. Schedule • Introduction • Greedy Strategy • Restriction • Partition • Guillotine Cut • Relaxation • Linear Programming • Local Ratio • Semi-definite Programming

  4. Rules • You may discuss the 5 homework assignments with each other, but do not copy each other's work. • Each homework is worth 10 points; the 4 top scores will be counted. • The midterm exam (take-home) is worth 30 points. • The final exam (in class) is worth 30 points. • The final grade is based on the total points (A ≥ 80; 80 > B ≥ 60; 60 > C ≥ 40).

  5. Chapter 1 Introduction • Computational Complexity (background) • Approximation Performance Ratio • Early results

  6. Computability • Deterministic Turing Machine • Nondeterministic Turing Machine • Church-Turing Thesis

  7. Deterministic Turing Machine (DTM) [Figure: a tape, a tape head, and a finite control]

  8. The tape has a left end but is infinite to the right. It is divided into cells, and each cell contains a symbol from an alphabet Γ. A special symbol B represents an empty cell.

  9. The head scans a cell on the tape and can read, erase, or write a symbol in that cell. In each move, the head can move to the right cell, to the left cell, or stay in the same cell.


  11. The finite control has finitely many states, which form a set Q. At each move, the state is changed according to a transition function δ: Q × Γ → Q × Γ × {R, L}.

  12. δ(q, a) = (p, b, L) means that if the head reads symbol a while the finite control is in state q, then the next state is p, the symbol a is changed to b, and the head moves one cell to the left.

  13. δ(q, a) = (p, b, R) means that if the head reads symbol a while the finite control is in state q, then the next state is p, the symbol a is changed to b, and the head moves one cell to the right.

  14. There are two special states: an initial state s and a final state h. • Initially, the DTM is in the initial state and the head scans the leftmost cell. The tape holds an input string.

  15. When the DTM is in the final state, it stops. An input string x is accepted by the DTM if the DTM reaches the final state h. • Otherwise, the input string is rejected.

  16. The DTM can be represented by M = (Q, Σ, Γ, δ, s), where Σ is the alphabet of input symbols. • The set of all strings accepted by a DTM M is denoted by L(M). We also say that the language L(M) is accepted by M.

  17. The transition diagram of a DTM is an alternative way to represent the DTM. • For M = (Q, Σ, Γ, δ, s), the transition diagram of M is a symbol-labeled digraph G = (V, E) satisfying: V = Q (with s marked as the initial node and h as the final node), and E = { an edge p → q labeled a/b,D | δ(p, a) = (q, b, D) }.

  18. Example: M = (Q, Σ, Γ, δ, s) where Q = {s, p, q, h}, Σ = {0, 1}, Γ = {0, 1, B}, and δ is given by the table below. [Transition diagram: states s, p, q, h with edges labeled a/b,R according to δ]

      δ |     0           1           B
      s |  (p, 0, R)   (s, 1, R)      -
      p |  (q, 0, R)   (s, 1, R)      -
      q |  (q, 0, R)   (q, 1, R)   (h, B, R)

L(M) = (0+1)*00(0+1)*.
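To make the table concrete, here is a minimal Python sketch (my own illustration, not part of the slides) that simulates this particular DTM on an input string; the state and symbol names follow the table above.

# Minimal simulator for the DTM of slide 18 (a sketch, not from the slides).
# States: s, p, q, h; tape alphabet: 0, 1, B; the machine accepts exactly the
# strings over {0, 1} that contain the substring 00.

DELTA = {
    ('s', '0'): ('p', '0', 'R'),
    ('s', '1'): ('s', '1', 'R'),
    ('p', '0'): ('q', '0', 'R'),
    ('p', '1'): ('s', '1', 'R'),
    ('q', '0'): ('q', '0', 'R'),
    ('q', '1'): ('q', '1', 'R'),
    ('q', 'B'): ('h', 'B', 'R'),
}

def dtm_accepts(x, max_moves=10_000):
    tape = list(x) if x else ['B']
    state, head = 's', 0                      # start in state s at the leftmost cell
    for _ in range(max_moves):
        if state == 'h':                      # final state reached: accept
            return True
        if head == len(tape):                 # extend the tape with blanks on demand
            tape.append('B')
        key = (state, tape[head])
        if key not in DELTA:                  # no move defined: halt and reject
            return False
        state, tape[head], move = DELTA[key]  # apply delta(q, a) = (p, b, D)
        head += 1 if move == 'R' else -1
        if head < 0:                          # fell off the left end: reject
            return False
    return False                              # did not halt within the move bound

assert dtm_accepts('10100')                   # contains 00, so it is in L(M)
assert not dtm_accepts('10101')               # no 00, so it is rejected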

  19. Nondeterministic Turing Machine (NTM) [Figure: a tape, a tape head, and a finite control]

  20. As for the DTM, the tape has a left end but is infinite to the right. It is divided into cells, and each cell contains a symbol from an alphabet Γ. A special symbol B represents an empty cell.

  21. The finite control has finitely many states, which form a set Q. At each move, the state is changed according to a transition function δ: Q × Γ → 2^(Q × Γ × {R, L}), so a configuration may have several possible next moves.
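Since δ now maps each (state, symbol) pair to a set of moves, an input is accepted when some computation path reaches the final state. The following minimal sketch (a toy machine and helper names of my own, not from the slides) simulates an NTM by exploring its configurations breadth-first.

# Sketch: simulating an NTM by exploring all reachable configurations.
# delta maps (state, symbol) to a SET of moves, matching
# delta: Q x Gamma -> 2^(Q x Gamma x {R, L}).
from collections import deque

def ntm_accepts(delta, x, start='s', final='h', blank='B', max_configs=10_000):
    initial = (start, 0, tuple(x) if x else (blank,))
    queue, seen = deque([initial]), {initial}
    while queue and len(seen) <= max_configs:
        state, head, tape = queue.popleft()
        if state == final:                    # some computation path accepts
            return True
        if head == len(tape):                 # extend the tape with a blank
            tape = tape + (blank,)
        for nstate, nsym, move in delta.get((state, tape[head]), ()):
            ntape = tape[:head] + (nsym,) + tape[head + 1:]
            nhead = head + 1 if move == 'R' else head - 1
            if nhead < 0:                     # this path fell off the left end
                continue
            config = (nstate, nhead, ntape)
            if config not in seen:
                seen.add(config)
                queue.append(config)
    return False

# Toy NTM: in state s it may either keep moving right over a '1' or guess that
# this '1' is the witness and jump to the final state h.
NTM_DELTA = {('s', '0'): {('s', '0', 'R')},
             ('s', '1'): {('s', '1', 'R'), ('h', '1', 'R')}}
assert ntm_accepts(NTM_DELTA, '0010') and not ntm_accepts(NTM_DELTA, '0000')

Tracking every path deterministically may visit exponentially many configurations, which is the gap between DTMs and NTMs behind the P vs. NP question later in this chapter.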

  22. Church-Turing Thesis • Computability is Turing-Computability.

  23. Multi-tape DTM: an input tape (read only), several storage tapes, and an output tape (possibly write only).

  24. Time of TM • Time_M(x) = the number of moves that TM M takes on input x. • Time_M(x) < ∞ iff x ∈ L(M).

  25. Space • Space_M(x) = the number of cells that M visits on the work (storage) tapes during the computation on input x. • If M is a multitape DTM, then the work tapes do not include the input tape and the write-only output tape.

  26. Time Bound • M is said to have a time bound t(n) if for every x with |x| ≤ n, Time_M(x) ≤ max {n + 1, t(n)}.

  27. Complexity Class • A language L has a (deterministic) time-complexity t(n) if there is a multitape TM M accepting L, with time bound t(n). • DTIME(t(n)) = {L(M) | DTM M has a time bound t(n)} • NTIME(t(n)) = {L(M) | NTM M has a time bound t(n)}

  28. P = ∪_{c>0} DTIME(n^c) • NP = ∪_{c>0} NTIME(n^c)

  29. NP Class

  30. Earlier Results on Approximations • Vertex-Cover • Traveling Salesman Problem • Knapsack Problem

  31. Performance Ratio

  32. Constant-Approximation • A c-approximation is a polynomial-time approximation satisfying: 1 ≤ approx(input)/opt(input) ≤ c for a MIN problem, or 1 ≤ opt(input)/approx(input) ≤ c for a MAX problem.

  33. Vertex Cover • Given a graph G=(V,E), find a minimum subset C of vertices such that every edge is incident to a vertex in C.

  34. Vertex-Cover • The vertex set of a maximal matching gives a 2-approximation, i.e., approx / opt ≤ 2.
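A minimal sketch of this algorithm (the edge-list representation and names are mine): greedily build a maximal matching and put both endpoints of every matched edge into the cover.

# Sketch: 2-approximation for Vertex Cover via a greedily built maximal matching.
# The graph is given as a list of edges (u, v); any edge with both endpoints
# still uncovered is added to the matching, and both endpoints join the cover.

def vertex_cover_2approx(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:   # edge not yet covered: match it
            cover.add(u)
            cover.add(v)
    return cover

# Example: the path 1-2-3-4; the optimum cover {2, 3} has size 2,
# and the algorithm returns a cover of size at most 4.
print(vertex_cover_2approx([(1, 2), (2, 3), (3, 4)]))

Every matched edge must have at least one endpoint in any cover, and the matched edges share no endpoints, so opt ≥ |matching| = |cover| / 2, which is exactly the bound approx / opt ≤ 2.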

  35. Traveling Salesman • Given n cities with a distance table, find a minimum total-distance tour to visit each city exactly once.

  36. Traveling Salesman with triangular inequality • Traveling around a minimum spanning tree is a 2-approximation.
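A sketch of the tree-traversal idea (assuming a symmetric distance matrix satisfying the triangle inequality; the use of Prim's algorithm and the helper names are my own choices): build a minimum spanning tree and shortcut a preorder walk of it into a tour.

# Sketch: 2-approximation for metric TSP by traversing a minimum spanning tree.
# dist is a symmetric n x n matrix that satisfies the triangle inequality.

def mst_tsp_2approx(dist):
    n = len(dist)
    # Prim's algorithm; parent[v] is v's neighbor in the spanning tree.
    in_tree = [False] * n
    parent = [0] * n
    best = [float('inf')] * n
    best[0] = 0
    children = {v: [] for v in range(n)}
    for _ in range(n):
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: best[v])
        in_tree[u] = True
        if u != 0:
            children[parent[u]].append(u)
        for v in range(n):
            if not in_tree[v] and dist[u][v] < best[v]:
                best[v], parent[v] = dist[u][v], u
    # Preorder walk of the tree; shortcutting repeated vertices cannot increase
    # the length under the triangle inequality.
    tour, stack = [], [0]
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    return tour + [0]

print(mst_tsp_2approx([[0, 2, 3, 3],
                       [2, 0, 3, 3],
                       [3, 3, 0, 2],
                       [3, 3, 2, 0]]))

Walking around the tree uses every tree edge twice and costs at most 2 · MST, and the MST weighs no more than the optimal tour (drop one edge from the tour to get a spanning tree), so the shortcut tour is at most 2 · opt.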

  37. Traveling Salesman with Triangular Inequality • Minimum spanning tree + minimum-length perfect matching on its odd-degree vertices is a 1.5-approximation.

  38. The minimum perfect matching on the odd-degree vertices has weight at most 0.5 · opt.

  39. Lower Bound [Figure: an instance with edge lengths 1 and 1+ε]

  40. Traveling Salesman without Triangular Inequality • Theorem: For any constant c > 0, TSP has no c-approximation unless NP = P. • Proof: Given a graph G = (V, E) with n = |V|, define a distance table on V by setting d(u, v) = 1 if (u, v) ∈ E and d(u, v) = cn otherwise.

  41. Contradiction Argument • Suppose a c-approximation exists. Then we have a polynomial-time algorithm to solve Hamiltonian Cycle as follows: the c-approximation solution has length ≤ cn if and only if G has a Hamiltonian cycle.
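A small sketch of the distance table used in this reduction (the graph is given as an edge list; the function name is mine, and the non-edge distance cn matches the threshold used in the argument above).

# Sketch: distance table for the reduction from Hamiltonian Cycle to TSP.
# Edges of G get distance 1 and non-edges get distance c*n, so a c-approximation
# returns a tour of length at most c*n exactly when G has a Hamiltonian cycle.

def reduction_distance_table(n, edges, c):
    E = {frozenset(e) for e in edges}
    return [[0 if u == v else (1 if frozenset((u, v)) in E else c * n)
             for v in range(n)]
            for u in range(n)]

# Example: the 4-cycle on vertices 0..3 with c = 2.
print(reduction_distance_table(4, [(0, 1), (1, 2), (2, 3), (3, 0)], 2))

If G has a Hamiltonian cycle, the optimal tour has length n and the c-approximation returns at most cn; if not, every tour must use at least one non-edge and therefore costs more than cn.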

  42. Knapsack

  43. 2-approximation
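The details of this slide were formula images, so as one concrete instance of a knapsack 2-approximation, here is a sketch of the classical density-greedy algorithm (variable names are mine, and it is not necessarily the exact variant shown on the slide): sort by value/size, pack greedily, and return the better of the greedy packing and the single most valuable item.

# Sketch: classical 2-approximation for Knapsack.
# Items have values c[i] and sizes a[i]; the knapsack capacity is S.
# Assumes every item fits on its own (a[i] <= S); larger items can be discarded.

def knapsack_2approx(c, a, S):
    order = sorted(range(len(c)), key=lambda i: c[i] / a[i], reverse=True)
    greedy_value, used = 0, 0
    for i in order:                       # greedy by value density c[i]/a[i]
        if used + a[i] <= S:
            greedy_value += c[i]
            used += a[i]
    best_single = max(c)                  # the single most valuable item
    return max(greedy_value, best_single)

print(knapsack_2approx(c=[60, 100, 120], a=[10, 20, 30], S=50))   # optimum is 220

The density-greedy prefix plus the first item it cannot fit already reaches the fractional optimum, and that first rejected item is worth at most best_single, so max(greedy_value, best_single) ≥ opt / 2.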

  44. PTAS • A problem has a PTAS (polynomial-time approximation scheme) if for any ε > 0, it has a (1+ε)-approximation.

  45. Knapsack has PTAS • Classify: for i ≤ m, c_i ≤ a = c_G; for i ≥ m + 1, c_i > a. • Sort. • For ... [the remaining steps were given as formulas on the original slide]
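The construction on this slide is mostly lost to formula images, so as a stand-in illustration of what a knapsack PTAS looks like, here is a sketch of the classical subset-enumeration scheme (names and details are mine, not necessarily the slide's): enumerate every seed of at most ⌈1/ε⌉ items and complete each seed greedily by density.

# Sketch: a classical PTAS for Knapsack by seed enumeration plus greedy completion.
# For each subset ("seed") of at most ceil(1/eps) items that fits, the rest of the
# knapsack is filled greedily by value density; the best packing found is returned.
import math
from itertools import combinations

def knapsack_ptas(c, a, S, eps):
    n, k = len(c), math.ceil(1 / eps)
    best = 0
    for r in range(k + 1):
        for seed in combinations(range(n), r):
            size = sum(a[i] for i in seed)
            if size > S:
                continue
            value, used = sum(c[i] for i in seed), size
            rest = sorted((i for i in range(n) if i not in seed),
                          key=lambda i: c[i] / a[i], reverse=True)
            for i in rest:                # complete the seed greedily by density
                if used + a[i] <= S:
                    value += c[i]
                    used += a[i]
            best = max(best, value)
    return best                           # value within a factor (1 + eps) of opt

print(knapsack_ptas(c=[60, 100, 120], a=[10, 20, 30], S=50, eps=0.5))   # optimum is 220

For each fixed ε the number of seeds is O(n^⌈1/ε⌉), so the running time is polynomial in n; it is a PTAS but not a fully polynomial one, which is exactly the distinction drawn on slide 48.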

  46. Proof.

  47. Time

  48. Fully PTAS • A problem has a fully PTAS if for any ε > 0, it has a (1+ε)-approximation running in time poly(n, 1/ε).

  49. Fully PTAS for Knapsack
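The formulas on this slide were images; here is a hedged sketch of the standard fully PTAS for Knapsack by value scaling plus dynamic programming over total value (variable names are mine). It assumes every item fits on its own (a[i] ≤ S), so that c_max ≤ opt.

# Sketch: fully PTAS for Knapsack via value scaling + DP over total scaled value.
# Items have values c[i] and sizes a[i]; the capacity is S.

def knapsack_fptas(c, a, S, eps):
    n = len(c)
    K = eps * max(c) / n                   # scaling factor
    cs = [int(ci / K) for ci in c]         # scaled, rounded-down values
    V = sum(cs)
    INF = float('inf')
    # min_size[v] = smallest total size achieving scaled value exactly v
    min_size = [0] + [INF] * V
    for i in range(n):
        for v in range(V, cs[i] - 1, -1):  # standard 0/1-knapsack DP update
            if min_size[v - cs[i]] + a[i] < min_size[v]:
                min_size[v] = min_size[v - cs[i]] + a[i]
    best_scaled = max(v for v in range(V + 1) if min_size[v] <= S)
    return best_scaled * K                 # the packing found is worth at least this

print(knapsack_fptas(c=[60, 100, 120], a=[10, 20, 30], S=50, eps=0.1))   # optimum is 220

Rounding loses at most K per item, hence at most nK = ε·c_max ≤ ε·opt in total, so the value found is at least (1 − ε)·opt; the DP table has at most n·c_max/K = n²/ε entries, so the running time is polynomial in n and 1/ε, as required by the definition on the previous slide.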
