
Greedy Algorithms, Part 2 Accessible Set Systems


Presentation Transcript


  1. Greedy Algorithms, Part 2: Accessible Set Systems Andreas Klappenecker

  2. Greedy Algorithms A greedy algorithm obtains a solution to an optimization problem by making a sequence of choices, where each choice looks best at the time.

  3. Coin Change Suppose we have the denominations v[1] = 6, v[2] = 4, v[3] = 1. Does the greedy coin change algorithm in this case always produce a minimum number of coins? << Work it out! >> C = 1,2,3,4,5,6,7 => Yes! C = 8 => No!
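
To make the greedy rule concrete, here is a minimal Python sketch (the function name and interface are mine, not from the slides): it repeatedly takes as many coins of the largest remaining denomination as fit, which immediately exhibits the counterexample C = 8 for the denominations 6, 4, 1.

    def greedy_change(C, denoms):
        # denoms must be sorted in decreasing order and end with 1
        coins = 0
        for v in denoms:
            coins += C // v    # take as many coins of value v as possible
            C = C % v
        return coins

    # Greedy uses 3 coins for C = 8 (6+1+1), but 2 coins (4+4) suffice.
    print(greedy_change(8, [6, 4, 1]))   # -> 3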

  4. Another Coin Change Example Suppose we have the denominations v[1] = 6, v[2] = 4, v[3] = 2, v[4] = 1. Does the greedy coin change algorithm in this case always produce a minimum number of coins? << Work it out! >> C = 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16 => Yes! Wow, is there any counterexample?

  5. Counterexamples Let us call C a counterexample to the greedy coin-changing algorithm iff the greedy algorithm does not produce the minimal number of coins to represent C. Kozen and Zaks have shown that if there exists a counterexample C to the greedy coin-changing algorithm with denominations v[1]>v[2]>…>v[n]=1 then there must exist one in the range v[n-2]+1 < C < v[1]+v[2].

  6. Another Coin Change Example (2) Suppose we have the denominations v[1] = 6, v[2] = 4, v[3] = 2, v[4] = 1. According to the theorem by Kozen and Zaks, if a counterexample C exists, then one must exist in the range v[2]+1 = 5 < C < v[1]+v[2] = 10. Since we have verified that the algorithm gives optimal change for C = 6,…,9, we can conclude that the algorithm always produces optimal change.

  7. Finding Counterexamples Is it possible to write an efficient algorithm that checks whether there exists a counter-example to the greedy coin-changing algorithm for a given set of denominations?

  8. Hardness Results Lueker: When the coin values are large and represented in binary, then the problem of finding an optimal representation of a given C is NP-hard. [For smaller coin values, we might still succeed] Kozen and Zaks: It is coNP-complete to determine, given a system of denominations and a number C represented in binary, whether the greedy representation of C is optimal. [Here we ask whether G(C)=M(C) for a given C rather than G(C)=M(C) for all C.]

  9. Three Coins Suppose that the system contains 3 denominations: v[1] = d > v[2] = c > v[3] = 1. Let q = floor(d/c) be the integer quotient and r = d mod c the remainder. Then there exists a counterexample iff 0 < r < c-q. Example: v[1]=4, v[2]=3, v[3]=1. Then q = floor(4/3) = 1 and r = 4 mod 3 = 1. As 0 < r = 1 < c-q = 3-1 = 2, there must exist a counterexample.
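
This criterion is a one-line computation; a small Python sketch (names are mine):

    def three_coin_counterexample_exists(d, c):
        # System d > c > 1: a counterexample exists iff 0 < d mod c < c - floor(d/c).
        q, r = divmod(d, c)
        return 0 < r < c - q

    # The slide's example: d = 4, c = 3.
    print(three_coin_counterexample_exists(4, 3))   # -> True (e.g. C = 6: greedy gives 4+1+1, optimal is 3+3)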

  10. Minimum Change Given a system of coins, let M(C) denote the minimum size over all representations of the number C in that system. In other words, if v[1]>v[2]>…>v[n]=1 is a system of denominations, then M(C) is the solution to the optimization problem M(C) = min { m[1]+m[2]+…+m[n] | C = m[1]*v[1]+m[2]*v[2]+…+m[n]*v[n] }, where the multiplicities m[i] are nonnegative integers. A sequence (m[1],m[2],…,m[n]) with C = m[1]*v[1]+…+m[n]*v[n] is called a representation of C.
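
M(C) itself can be computed by a standard dynamic program over the values 1, …, C; a sketch assuming the smallest denomination is 1 (names are mine):

    def min_coins(C, denoms):
        # M[c] = minimum number of coins representing c; requires 1 in denoms.
        INF = float('inf')
        M = [0] + [INF] * C
        for c in range(1, C + 1):
            M[c] = 1 + min(M[c - v] for v in denoms if v <= c)
        return M[C]

    print(min_coins(8, [6, 4, 1]))   # -> 2 (namely 4 + 4)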

  11. Greedy Change If v[1]>v[2]>…>v[n]=1 is a system of denominations, then G(C) denotes the number of coins m[1]+m[2]+…+m[n] calculated by the greedy coin-change algorithm: procedure G(C) { for i = 1 to n do { m[i] := floor(C/v[i]); C := C mod v[i]; } return m[1]+m[2]+…+m[n]; }

  12. Canonical Denomination Systems A system v[1]>v[2]>…>v[n]=1 of denominations is called canonical if and only if M(C) = G(C) holds for all integers C>0. If a system of denominations is not canonical, then a value C such that M(C) ≠ G(C) is called a counterexample.

  13. Structure of Optimal Change Let v[1]>v[2]>…>v[n]=1 be a system of denominations. For all C and all coins v[i]<C, we have M(C) <= M(C-v[i])+1 with equality if and only if there exists an optimal representation of C that uses a coin with value v[i].

  14. Witnesses Let us call C a witness if and only if G(C) > G(C-v)+1 for some coin with value v < C. By the previous result, a witness yields a counterexample. Indeed, M(C) <= M(C-v)+1 <= G(C-v)+1 < G(C).

  15. Smallest Counterexample The smallest counterexample must be a witness. Seeking a contradiction, suppose that C is the smallest counterexample but is not a witness. Let v be the value of any coin that is used in an optimal representation of C. Then M(C-v) = M(C)-1 // optimal repr. uses v < G(C)-1 // C is a counterexample <= G(C-v) // C is not a witness However, this would imply that C-v is a counterexample, contradicting the minimality of C.

  16. Skipping one Value We can avoid checking the largest coin v<C in the witness test, as the greedy representation satisfies G(C) = G(C – v)+1 for the largest coin with value v<C.

  17. The Kozen and Zaks Criterion A given system v[1]>v[2]>…>v[n]=1 of denominations is canonical if and only if there does not exist any witness C in the range v[n-2]+1 < C < v[1]+v[2]. A nice feature of this criterion is that one simply needs to compute greedy representations and store the number of coins that are used. Therefore, this criterion is easy to implement and does not require finding optimal representations.
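
Putting slides 14-17 together, here is one possible implementation of this canonicity test; as noted above, it only ever computes greedy representations (function names are mine):

    def greedy_count(C, denoms):
        # Number of coins used by the greedy algorithm; denoms sorted decreasingly, ending in 1.
        coins = 0
        for v in denoms:
            coins += C // v
            C %= v
        return coins

    def is_canonical(denoms):
        # Kozen-Zaks: canonical iff there is no witness C with v[n-2]+1 < C < v[1]+v[2],
        # where C is a witness if G(C) > G(C - v) + 1 for some coin value v < C.
        # Assumes at least three denominations.
        lo = denoms[-3] + 1
        hi = denoms[0] + denoms[1]
        for C in range(lo + 1, hi):
            gC = greedy_count(C, denoms)
            if any(v < C and gC > greedy_count(C - v, denoms) + 1 for v in denoms):
                return False
        return True

    print(is_canonical([6, 4, 1]))      # -> False (C = 8 is a counterexample)
    print(is_canonical([6, 4, 2, 1]))   # -> True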

  18. Matroids

  19. Matroid Let S be a finite set, and F a nonempty family of subsets of S, that is, F ⊆ P(S). We call (S,F) a matroid if and only if M1) If B ∈ F and A ⊆ B, then A ∈ F. [The family F is called hereditary.] M2) If A, B ∈ F and |A| < |B|, then there exists x in B\A such that A ∪ {x} ∈ F. [This is called the exchange property.]

  20. Weight Functions A matroid (S,F) is called weighted if it is equipped with a weight function w: S -> R+, i.e., all weights are positive real numbers. If A is a subset of S, then w(A) := Σ_{a ∈ A} w(a).

  21. Greedy Algorithm for Matroids Greedy(M=(S,F),w) // maximizing version A := ∅; Sort S into monotonically decreasing order by weight w. for each x in S taken in monotonically decreasing order do if A ∪ {x} ∈ F then A := A ∪ {x}; fi; od; return A;

  22. Greedy Algorithm for Matroids (2) Greedy(M=(S,F),w) // minimizing version A := ∅; Sort S into monotonically increasing order by weight w. for each x in S taken in monotonically increasing order do if A ∪ {x} ∈ F then A := A ∪ {x}; fi; od; return A;
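
The two versions differ only in the sort order, so they can share one Python sketch that takes a membership oracle for F (this oracle-based interface is my choice, not something from the slides):

    def greedy(S, is_feasible, w, maximize=True):
        # Generic greedy: scan elements by weight and keep x whenever A ∪ {x} stays feasible.
        A = set()
        for x in sorted(S, key=w, reverse=maximize):
            if is_feasible(A | {x}):
                A.add(x)
        return A

    # Example: the uniform matroid of rank 2 on S (feasible = at most two elements).
    S = {'a', 'b', 'c', 'd'}
    w = {'a': 3, 'b': 1, 'c': 2, 'd': 5}
    print(greedy(S, lambda A: len(A) <= 2, w.get))   # -> {'d', 'a'} (total weight 8)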

  23. Matroid Terminology Let (S,F) be a matroid. The elements in F are called independent sets. An independent set in F that is maximal with respect to inclusion is called a basis. Since we assumed that the weight function is positive, the algorithm Greedy always returns a basis.

  24. Small Matroid Example Suppose that S={a,b,c,d}. Construct the smallest matroid (S,F) such that {a,b} and {c,d} are contained in F. F = { ∅, {a}, {b}, {c}, {d}, {a,b}, {c,d} (by the hereditary property), {a,c}, {b,c}, {a,d}, {b,d} (by the exchange property) }
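
For a system this small, both matroid axioms can be verified by brute force; a sketch (helper names are mine):

    from itertools import combinations

    F = [frozenset(s) for s in [(), ('a',), ('b',), ('c',), ('d',),
                                ('a', 'b'), ('c', 'd'), ('a', 'c'),
                                ('b', 'c'), ('a', 'd'), ('b', 'd')]]

    def is_matroid(F):
        Fset = set(F)
        # M1 (hereditary): every subset of a member of F is again in F.
        hereditary = all(frozenset(sub) in Fset
                         for A in F for k in range(len(A) + 1)
                         for sub in combinations(A, k))
        # M2 (exchange): whenever |A| < |B|, some x in B \ A extends A within F.
        exchange = all(any(A | {x} in Fset for x in B - A)
                       for A in F for B in F if len(A) < len(B))
        return hereditary and exchange

    print(is_matroid(F))   # -> True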

  25. Graphic Matroid Example Consider the graph K3. The graphic matroid M(K3) of this graph is given by the set of all subgraphs of K3 that are forests: the empty subgraph, the three single-edge subgraphs, and the three two-edge subgraphs (the triangle itself is excluded, since it contains a cycle). Subgraphs are specified by their edge sets alone. If B is contained in the graphic matroid, then so is any A ⊆ B (just delete edges). If A and B are subforests of K3 with |A| < |B|, then we can find an edge e in B such that A ∪ {e} is again a forest.

  26. Kruskal’s Algorithm Let G be a connected graph with positive edge weights. Use the minimizing greedy algorithm on the graphic matroid M(G). Then it will yield a minimum spanning tree. Example: G = K3 with edge weights 1, 2, 3. The algorithm first selects the edge of weight 1, then the edge of weight 2, and rejects the edge of weight 3, producing a minimum spanning tree of weight 3.

  27. Kruskal's MST algorithm Consider the edges in increasing order of weight; add an edge iff it does not cause a cycle. [Animation of the algorithm on an example graph with edge weights 2-18 taken from Prof. Welch’s lecture notes; figure omitted.]
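
A compact sketch of Kruskal's algorithm with a union-find structure for the cycle test (the edge-list encoding is mine, not taken from the animation):

    def kruskal(n, edges):
        # edges: list of (weight, u, v) with vertices numbered 0..n-1.
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x
        tree = []
        for w, u, v in sorted(edges):           # increasing order of weight
            ru, rv = find(u), find(v)
            if ru != rv:                        # adding (u, v) creates no cycle
                parent[ru] = rv
                tree.append((w, u, v))
        return tree

    # K3 with weights 1, 2, 3, as on the previous slide:
    print(kruskal(3, [(1, 0, 1), (2, 1, 2), (3, 0, 2)]))   # -> [(1, 0, 1), (2, 1, 2)]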

  28. Conclusion Matroids characterize a class of problems for which the greedy algorithm yields an optimal solution. Kruskal's minimum spanning tree algorithm fits nicely into this framework.

  29. Prim’s Algorithm for MST • We first pick an arbitrary vertex v1 to start with. • Maintain a set S = {v1}. • Over all edges from v1, find a lightest one. Say it’s (v1,v2). • S ← S ∪ {v2} • Over all edges from {v1,v2} to V-{v1,v2}, find a lightest one, say (v2,v3). • S ← S ∪ {v3} • … • In general, suppose we already have the subset S = {v1,…,vi}; then over all edges from S to V-S, find a lightest one (vj, vi+1). • Update: S ← S ∪ {vi+1} • … • Finally we get a tree; this tree is a minimum spanning tree. [Example graph figure omitted. Slide due to Prof. Shengyu Zhang]
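
A sketch of Prim's algorithm using a priority queue of edges leaving the current tree (the adjacency-list encoding and names are mine):

    import heapq

    def prim(adj, start):
        # adj: dict mapping each vertex to a list of (weight, neighbour) pairs.
        in_tree = {start}
        tree = []
        heap = [(w, start, u) for w, u in adj[start]]
        heapq.heapify(heap)
        while heap and len(in_tree) < len(adj):
            w, u, v = heapq.heappop(heap)    # lightest edge leaving the tree (stale entries skipped below)
            if v in in_tree:
                continue
            in_tree.add(v)
            tree.append((w, u, v))
            for w2, x in adj[v]:
                if x not in in_tree:
                    heapq.heappush(heap, (w2, v, x))
        return tree

    # Triangle with edge weights 1, 2, 3:
    adj = {0: [(1, 1), (3, 2)], 1: [(1, 0), (2, 2)], 2: [(3, 0), (2, 1)]}
    print(prim(adj, 0))   # -> [(1, 0, 1), (2, 1, 2)]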

  30. Critique Prim’s algorithm is a greedy algorithm that always produces optimal solutions. S = edges of the graph G. F = { A | the subgraph T of G spanned by the edge set A is a tree that contains the vertex v }. Apparently, Prim’s algorithm is essentially Greedy applied to (S,F). However, the set system (S,F) is not a matroid, since it is not hereditary! Why? Removing an edge from such a tree can disconnect it, so a subset of a feasible edge set need not again be a tree containing v.

  31. Goal Generalize the theory from matroids to more general set systems (so that, e.g., Prim’s MST algorithm can be explained). Allow weight functions with arbitrary (possibly negative) weights.

  32. Accessible Set Systems Let S be a finite set, F a non-empty family of subsets of S. Then (S,F) is called a set system. The elements in F are called feasible sets. A set system (S,F) satisfying the accessibility axiom: If A is a nonempty set in F, then there exists an element x in A such that A\{x} ∈ F, is called an accessible set system. [Greedy algorithms need to be able to construct feasible sets by adding one element at a time; hence the accessibility axiom is needed.]
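
Accessibility is easy to test directly on a small set system; a brute-force sketch (encoding feasible sets as frozensets is my choice), using a system that is accessible but not hereditary:

    def is_accessible(F):
        # Every nonempty feasible set must stay feasible after removing one of its own elements.
        Fset = set(F)
        return all(any(A - {x} in Fset for x in A) for A in F if A)

    # Accessible but not hereditary: {'b'} is a subset of a feasible set yet is missing from F.
    F = [frozenset(s) for s in [(), ('a',), ('a', 'b'), ('a', 'b', 'c')]]
    print(is_accessible(F))   # -> True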

  33. Weight Functions Let w: S -> R be a weight function (negative weights are now allowed!). For a subset A of S, define w(A) := Σ_{a ∈ A} w(a).

  34. Accessible Set Systems A feasible set B in an accessible set system M=(S,F) that is maximal with respect to inclusion is called a basis. Goal: Solve the optimization problem BMAX: maximize w(B) over all bases B of M.

  35. Greedy Algorithm Greedy(M=(S,F),w) A := ∅; Sort S into monotonically decreasing order by weight w. for each x in S taken in monotonically decreasing order do if A ∪ {x} ∈ F then A := A ∪ {x}; fi; od; return A;

  36. Question Characterize the accessible set systems such that the greedy algorithm yields an optimal solution for any weight function w. This question was answered by Helman, Moret, and Shapiro in 1993, after initial work by Korte and Lovasz (Greedoids) and Edmonds, Gale, and Rado (Matroids).

  37. Extensibility Axiom: Motivation Define ext(A) = { x in S\A | A ∪ {x} ∈ F }. Let A be in F such that ext(A) = ∅ and let B be a basis properly containing A. [This can happen, see homework.] Define w(x) = 2 if x in A, w(x) = 1 if x in B\A, w(x) = 0 otherwise. Then the Greedy algorithm incorrectly returns A instead of B.
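
This construction is easy to replay in code; a sketch of ext(A) and the adversarial weight function from the slide (the encoding and names are mine):

    def ext(A, S, F):
        # Feasible one-element extensions of the feasible set A.
        A = frozenset(A)
        return {x for x in S - A if A | {x} in F}

    def adversarial_weights(A, B, S):
        # Weights from the slide: 2 on A, 1 on B \ A, 0 elsewhere.
        return {x: 2 if x in A else (1 if x in B else 0) for x in S}

    # A feasible set with no feasible one-element extension, sitting inside a larger basis:
    S = {'a', 'b', 'c'}
    F = {frozenset(s) for s in [(), ('a',), ('b',), ('b', 'c'), ('a', 'b', 'c')]}
    print(ext({'a'}, S, F))   # -> set(); with the weights above, Greedy would stop at {'a'}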

  38. Extensibility Axiom Extensibility Axiom: For each basis B and every feasible set A properly contained in B, we have ext(A) ∩ B ≠ ∅. [This axiom is clearly necessary for the optimality of the greedy algorithm.]

  39. Matroid Embedding Axiom Let M=(S,F) be a set system. Define clos(M) = (S,F’), where F’ = { A | A ⊆ B for some B in F }. We call clos(M) the hereditary closure of M. Matroid embedding axiom: clos(M) is a matroid. [This axiom is necessary for the greedy algorithm to return optimal sets. Thus, matroids are always in the background, even though the accessible set system itself might not be a matroid.]
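
For small systems the hereditary closure can be computed by brute force, after which the matroid check from the earlier small example can be reused; a sketch (names are mine):

    from itertools import combinations

    def hereditary_closure(F):
        # clos(M): every subset of every feasible set.
        return {frozenset(sub) for A in F for k in range(len(A) + 1)
                for sub in combinations(A, k)}

    F = {frozenset(s) for s in [(), ('a',), ('b',), ('b', 'c'), ('a', 'b', 'c')]}
    closF = hereditary_closure(F)
    print(len(closF))   # -> 8: all of P({a,b,c}), i.e. the free matroid on {a,b,c},
                        # so this system satisfies the matroid embedding axiom even
                        # though it violates the extensibility axiom of the previous slide.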

  40. Congruence Closure Axiom Congruence closure axiom: For every feasible set A, all x,y in ext(A), and every subset X of S\(A ∪ ext(A)), A ∪ X ∪ {x} ∈ clos(M) implies A ∪ X ∪ {y} ∈ clos(M). [This axiom restricts the future extensions of the set system. Note that it is a property of the hereditary closure.]

  41. Theorem (Helman, Moret, Shapiro) Let M=(S,F) be an accessible set system. For each weight function w: S -> R, the optimal solutions to BMAX are exactly the bases of M generated by Greedy (assuming a suitable ordering of elements with the same weight) if and only if the accessible set system M satisfies the extensibility axiom, the congruence closure axiom, and the matroid embedding axiom.
