Lecture 1: The Greedy Method


  1. Lecture 1: The Greedy Method Presenter: 虞台文

  2. Content • What is it? • Activity Selection Problem • Fractional Knapsack Problem • Minimum Spanning Tree • Kruskal’s Algorithm • Prim’s Algorithm • Shortest Path Problem • Dijkstra’s Algorithm • Huffman Codes

  3. Lecture 1: The Greedy Method What is it?

  4. The Greedy Method • A greedy algorithm always makes the choice that looks best at the moment • For some problems, it always gives a globally optimal solution. • For others, it may only give a locally optimal one.

  5. Main Components • Configurations • different choices, collections, or values to find • Objective function • a score assigned to configurations, which we want to either maximize or minimize

  6. Example: Making Change • Problem • A dollar amount to reach and a collection of coin amounts to use to get there. • Configuration • A dollar amount yet to return to a customer plus the coins already returned • Objective function • Minimize the number of coins returned. • Greedy solution • Always return the largest coin you can Is the solution always optimal?
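
A minimal Python sketch (not from the slides) of the greedy change-making rule above; the coin systems are illustrative. With the usual denominations the greedy choice happens to be optimal, but with the hypothetical system {1, 3, 4} it is not, which answers the question on the slide.

```python
def greedy_change(amount, coins):
    """Repeatedly return the largest coin that still fits."""
    returned = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            returned.append(coin)
    return returned

print(greedy_change(6, [25, 10, 5, 1]))  # [5, 1] -- 2 coins, optimal
print(greedy_change(6, [4, 3, 1]))       # [4, 1, 1] -- 3 coins, but [3, 3] needs only 2
```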

  7. Example: Largest k-out-of-n Sum • Problem • Pick k numbers out of n numbers such that the sum of these k numbers is the largest. • Exhaustive solution • There are C(n, k) = n! / (k! (n − k)!) choices. • Choose the one whose subset sum is the largest • Greedy solution FOR i = 1 TO k: pick out the largest remaining number and delete it from the input ENDFOR Is the greedy solution always optimal?
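
A minimal Python sketch (not from the slides) of this greedy rule: repeatedly picking the largest remaining number is the same as summing the k largest numbers, and here the greedy solution is always optimal. The sample numbers are made up.

```python
import heapq

def largest_k_sum(numbers, k):
    """Sum of the k largest numbers, chosen greedily."""
    return sum(heapq.nlargest(k, numbers))

print(largest_k_sum([5, -2, 9, 1, 7], 3))  # 9 + 7 + 5 = 21
```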

  8. Example: Shortest Paths on a Special Graph • Problem • Find a shortest path from v0 to v3 • Greedy Solution

  9. Example: Shortest Paths on a Special Graph • Problem • Find a shortest path from v0 to v3 • Greedy Solution Is the solution optimal?

  10. Example: Shortest Paths on a Multi-stage Graph • Problem • Find a shortest path from v0 to v3 Is the greedy solution optimal?

  11. Example: Shortest Paths on a Multi-stage Graph • Problem • Find a shortest path from v0 to v3 • The optimal path Is the greedy solution optimal?

  12. Example: Shortest Paths on a Multi-stage Graph • Problem • Find a shortest path from v0 to v3 • The optimal path Is the greedy solution optimal? What algorithm can be used to find the optimum?

  13. Advantages and Disadvantages of the Greedy Method • Advantages • Simple • Works fast when it works • Disadvantages • Does not always work: short-term choices can be disastrous in the long term • Hard to prove correct

  14. Lecture 1: The Greedy Method Activity Selection Problem

  15. Activity Selection Problem (Conference Scheduling Problem) • Input: A set of activities S = {a1, …, an} • Each activity has a start time and a finish time: ai = [si, fi) • Two activities are compatible if and only if their intervals do not overlap • Output: a maximum-size subset of mutually compatible activities

  16. Example: Activity Selection Problem Assume that the fi's are sorted.

  17. Example: Activity Selection Problem [Figure: activities 1–11 shown as intervals on a time line from 0 to 16]

  18. Example: Activity Selection Problem [Figure: the greedy selection among activities 1–11 on a time line from 0 to 16] Is the solution optimal?

  19. Example: Activity Selection Problem [Figure: the greedy selection among activities 1–11 on a time line from 0 to 16] Is the solution optimal?

  20. Activity Selection Algorithm
      Greedy-Activity-Selector(s, f)
        // Assume that f1 ≤ f2 ≤ ... ≤ fn
        n ← length[s]
        A ← {1}
        j ← 1
        for i ← 2 to n
          if si ≥ fj then
            A ← A ∪ {i}
            j ← i
        return A
      Is the algorithm optimal?
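
A runnable Python sketch of Greedy-Activity-Selector under the same assumption as the pseudocode: activities are (start, finish) pairs already sorted by finish time. The sample data is hypothetical; the function returns 1-based indices to match the slides.

```python
def greedy_activity_selector(activities):
    """activities: list of (start, finish) pairs, sorted by finish time."""
    selected = [1]                        # the first activity is always taken
    last_finish = activities[0][1]
    for i, (start, finish) in enumerate(activities[1:], start=2):
        if start >= last_finish:          # compatible with the last selected activity
            selected.append(i)
            last_finish = finish
    return selected

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(greedy_activity_selector(acts))     # [1, 4, 8]
```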

  21. Proof of Optimality • Suppose A ⊆ S is an optimal solution whose first activity is k. • If k ≠ 1, one can easily show that B = A – {k} ∪ {1} is also optimal. (Why?) • This reveals that the greedy choice can be applied to the first selection. • Now the problem is reduced to activity selection on S’ = {2, …, n}, restricted to the activities that are compatible with activity 1. • By the same argument, we can show that, to retain optimality, the greedy choice can also be applied to the subsequent selections.

  22. Lecture 1: The Greedy Method Fractional Knapsack Problem

  23. The Fractional Knapsack Problem • Given: A set S of n items, with each item i having • bi - a positive benefit • wi - a positive weight • Goal: Choose items, allowing fractional amounts, to maximize total benefit but with weight at most W.

  24. The Fractional Knapsack Problem
      Knapsack capacity: 10 ml
      Item:              1       2       3       4       5
      wi:                4 ml    8 ml    2 ml    6 ml    1 ml
      bi:                $12     $32     $40     $30     $50
      Value ($ per ml):  3       4       20      5       50
      • Solution: 1 ml of item 5, 2 ml of item 3, 6 ml of item 4, 1 ml of item 2

  25. The Fractional Knapsack Algorithm
      • Greedy choice: Keep taking the item with the highest value
      Algorithm fractionalKnapsack(S, W)
        Input: set S of items, each with benefit bi and weight wi; maximum weight W
        Output: amount xi of each item i that maximizes the total benefit with total weight at most W
        for each item i in S
          xi ← 0
          vi ← bi / wi          {value}
        w ← 0                   {total weight}
        while w < W
          remove item i with highest vi
          xi ← min{wi, W − w}
          w ← w + min{wi, W − w}
      Does the algorithm always give an optimum?
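
A runnable Python sketch of the fractional knapsack algorithm, applied to the data of slide 24 (weights in ml, benefits in $, capacity W = 10 ml); it reproduces the solution given there, with a total benefit of $124.

```python
def fractional_knapsack(items, W):
    """items: list of (benefit, weight); returns (total benefit, amount taken of each item)."""
    # Greedy choice: consider items by decreasing value (benefit per unit of weight).
    order = sorted(range(len(items)),
                   key=lambda i: items[i][0] / items[i][1], reverse=True)
    amounts = [0.0] * len(items)
    total_benefit, remaining = 0.0, W
    for i in order:
        if remaining <= 0:
            break
        b, w = items[i]
        take = min(w, remaining)          # take as much of the best remaining item as fits
        amounts[i] = take
        total_benefit += b / w * take
        remaining -= take
    return total_benefit, amounts

items = [(12, 4), (32, 8), (40, 2), (30, 6), (50, 1)]   # (bi, wi) for items 1..5
print(fractional_knapsack(items, 10))    # (124.0, [0.0, 1.0, 2.0, 6.0, 1.0])
```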

  26. Proof of Optimality • Suppose there is a better solution than the greedy one. • Then there is an item i with higher value than some chosen item j, i.e., xi < wi, xj > 0 and vi > vj. • Replacing an amount min{wi − xi, xj} of item j with the same amount of item i changes the total benefit by min{wi − xi, xj}·(vi − vj) > 0, i.e., gives an even better solution. • This contradiction shows that there is no better solution than the greedy one.

  27. Recall: 0-1 Knapsack Problem Which boxes should be chosen to maximize the amount of money while still keeping the overall weight under 15 kg? Is the fractional knapsack algorithm applicable?

  28. Exercise • Construct an example showing that the fractional knapsack algorithm does not give an optimal solution when applied to the 0-1 knapsack problem.

  29. Lecture 1: The Greedy Method Minimum Spanning Tree

  30. What is a Spanning Tree? • A tree is a connected undirected graph that contains no cycles • A spanning tree of a graph G is a subgraph of G that is a tree and contains all the vertices of G

  31. Properties of a Spanning Tree • A spanning tree of an n-vertex undirected graph has exactly n – 1 edges • It connects all the vertices in the graph • A spanning tree has no cycles [Figure: an undirected graph on vertices A–E and some of its spanning trees]

  32. What is a Minimum Spanning Tree? • A spanning tree of a graph G is a subgraph of G that is a tree and contains all the vertices of G • A minimum spanning tree is the spanning tree with the lowest cost among all spanning trees

  33. Applications of MSTs • Computer Networks • To find how to connect a set of computers using the minimum amount of wire • Shipping/Airplane Lines • To find the fastest way between locations

  34. Two Greedy Algorithms for MST • Kruskal's Algorithm • merges forest components into a tree by repeatedly adding small-cost edges • Prim's Algorithm • attaches vertices to a partially built tree by repeatedly adding small-cost edges

  35. Kruskal's Algorithm [Figure: a weighted undirected graph on vertices a–i, used to trace Kruskal's algorithm]

  36. Kruskal's Algorithm [Figure: intermediate steps of Kruskal's algorithm on the weighted graph on vertices a–i]

  37. Kruskal's Algorithm
      G = (V, E) – graph
      w: E → R+ – weight
      T – tree
      MST-Kruskal(G, w)
        T ← Ø
        for each vertex v ∈ V[G]
          Make-Set(v)                    // Make a separate set for each vertex
        sort the edges of E by increasing weight w
        for each edge (u, v) ∈ E, in sorted order
          if Find-Set(u) ≠ Find-Set(v)   // If no cycle is formed
            T ← T ∪ {(u, v)}             // Add the edge to the tree
            Union(u, v)                  // Combine the two sets
        return T

  38. Time Complexity: O(|E| log |E|)
      G = (V, E) – graph
      w: E → R+ – weight
      T – tree
      MST-Kruskal(G, w)
        T ← Ø                                           // O(1)
        for each vertex v ∈ V[G]                        // O(|V|)
          Make-Set(v)
        sort the edges of E by increasing weight w      // O(|E| log |E|)
        for each edge (u, v) ∈ E, in sorted order       // O(|E|)
          if Find-Set(u) ≠ Find-Set(v)
            T ← T ∪ {(u, v)}
            Union(u, v)                                 // O(|V|) in total
        return T                                        // O(1)
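
A runnable Python sketch of MST-Kruskal. A simple dict-based union-find stands in for Make-Set / Find-Set / Union; edges are (weight, u, v) tuples and the sample graph is hypothetical.

```python
def kruskal(vertices, edges):
    """edges: iterable of (weight, u, v); returns the MST as a list of (u, v, weight)."""
    parent = {v: v for v in vertices}            # Make-Set for every vertex

    def find(v):                                 # Find-Set with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    tree = []
    for w, u, v in sorted(edges):                # edges in order of increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                             # adding (u, v) creates no cycle
            tree.append((u, v, w))
            parent[ru] = rv                      # Union
    return tree

verts = "abcd"
eds = [(1, "a", "b"), (4, "b", "c"), (3, "a", "c"), (2, "c", "d"), (5, "b", "d")]
print(kruskal(verts, eds))   # [('a', 'b', 1), ('c', 'd', 2), ('a', 'c', 3)]
```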

  39. Prim's Algorithm [Figure: a weighted undirected graph on vertices a–i, used to trace Prim's algorithm]

  40. Prim's Algorithm [Figure: intermediate steps of Prim's algorithm on the weighted graph on vertices a–i]

  41. Prim's Algorithm
      G = (V, E) – graph
      w: E → R+ – weight
      r – starting vertex
      Q – priority queue
      Key[v] – key of vertex v
      π[v] – parent of vertex v
      Adj[v] – adjacency list of v
      MST-Prim(G, w, r)
        Q ← V[G]                           // Initially Q holds all vertices
        for each u ∈ Q
          Key[u] ← ∞                       // Initialize all keys to ∞
        Key[r] ← 0                         // r is the first tree node
        π[r] ← Nil
        while Q ≠ Ø
          u ← Extract-Min(Q)               // Get the node with the minimum key
          for each v ∈ Adj[u]
            if v ∈ Q and w(u, v) < Key[v]  // If the weight is less than the key
              π[v] ← u
              Key[v] ← w(u, v)

  42. Time Complexity • MST-Prim (pseudocode as on the previous slide) runs in O(|E| log |V|) time when the priority queue Q is implemented as a binary heap.
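
A runnable Python sketch of MST-Prim using heapq as the priority queue. Instead of the decrease-key operation in the pseudocode it uses the common lazy-deletion trick (stale heap entries are skipped). The sample graph is hypothetical.

```python
import heapq

def prim(graph, root):
    """graph: adjacency dict {u: {v: weight}}; returns the MST as (parent, vertex, key) edges."""
    in_tree = set()
    heap = [(0, root, None)]                  # (key, vertex, parent)
    tree = []
    while heap:
        key, u, pi = heapq.heappop(heap)      # Extract-Min
        if u in in_tree:
            continue                          # stale entry, skip
        in_tree.add(u)
        if pi is not None:
            tree.append((pi, u, key))
        for v, w in graph[u].items():
            if v not in in_tree:
                heapq.heappush(heap, (w, v, u))   # stands in for decrease-key
    return tree

g = {"a": {"b": 1, "c": 3},
     "b": {"a": 1, "c": 4, "d": 5},
     "c": {"a": 3, "b": 4, "d": 2},
     "d": {"b": 5, "c": 2}}
print(prim(g, "a"))   # [('a', 'b', 1), ('a', 'c', 3), ('c', 'd', 2)]
```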

  43. Optimality Are the algorithms optimal? Yes. • Kruskal's Algorithm • merges forest components into a tree by repeatedly adding small-cost edges • Prim's Algorithm • attaches vertices to a partially built tree by repeatedly adding small-cost edges

  44. Lecture 1: The Greedy Method Shortest Path Problem

  45. Shortest Path Problem (SPP) • Single-Source SPP • Given a graph G = (V, E) and a weight function w: E → R+, find the shortest path from a source node s ∈ V to any other node v ∈ V. • All-Pairs SPP • Given a graph G = (V, E) and a weight function w: E → R+, find the shortest path between each pair of nodes in G.

  46. Dijkstra's Algorithm • Dijkstra's algorithm, named after its discoverer, Dutch computer scientist Edsger Dijkstra, is an algorithm that solves the single-source shortest path problem for a directed graph with nonnegative edge weights.

  47. Dijkstra's Algorithm • Start from the source vertex s • Take its adjacent nodes and update their current shortest distances • Select the vertex with the shortest distance from the remaining vertices • Update the current shortest distances of its adjacent vertices where necessary, • i.e. when the new distance is less than the existing value • Stop when all the vertices have been checked
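
A runnable Python sketch of the steps above, with heapq as the priority queue; the graph is an adjacency dict with nonnegative edge weights and the sample data is hypothetical.

```python
import heapq

def dijkstra(graph, source):
    """graph: adjacency dict {u: {v: weight}}; returns shortest distances from source."""
    dist = {source: 0}
    visited = set()
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)            # remaining vertex with the shortest distance
        if u in visited:
            continue
        visited.add(u)
        for v, w in graph[u].items():
            if d + w < dist.get(v, float("inf")):   # update when the new distance is smaller
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

g = {"s": {"u": 10, "x": 5},
     "u": {"v": 1, "x": 2},
     "v": {"y": 4},
     "x": {"u": 3, "v": 9, "y": 2},
     "y": {"s": 7, "v": 6}}
print(dijkstra(g, "s"))   # {'s': 0, 'u': 8, 'x': 5, 'v': 9, 'y': 7}
```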

  48. Dijkstra's Algorithm

  49. Dijkstra's Algorithm [Figure: a weighted directed graph with source s and vertices u, v, x, y; the source has distance 0 and all other distance estimates start at ∞]

  50. Dijkstra's Algorithm [Figure: the same graph during the run of Dijkstra's algorithm, with distance estimates being updated]
