
CSE 326: Data Structures Lecture #24 The Algorhythmics

This lecture covers greedy algorithms, divide & conquer, and memoization/dynamic programming, with examples and proofs of correctness, including the scheduling, closest-points, and optimal binary search tree problems.



Presentation Transcript


  1. CSE 326: Data Structures, Lecture #24: The Algorhythmics. Steve Wolfman, Winter Quarter 2000

  2. Today’s Outline • Greedy • Divide & Conquer • Dynamic Programming • Randomized • Backtracking

  3. Greedy Algorithms Repeat until the problem is solved: • Measure options according to marginal value • Commit to the maximum Greedy algorithms are normally fast and simple. They are sometimes appropriate as heuristics or as approximations to the optimal solution.

  4. Hill-Climbing (figure: a curve with a Local Maximum and the Global Maximum; a greedy climber can get stuck at the local one)

  5. Greed in Action • Best First Search • A* Search • Huffman Encodings • Kruskal’s Algorithm • Dijkstra’s Algorithm • Prim’s Algorithm • Scheduling

  6. Scheduling Problem • Given: • a group of tasks {T1, …, Tn} • each with a duration {d1, …, dn} • a single processor without interrupts • Select an order for the tasks that minimizes average completion time. Example durations: T1 = 1, T2 = 1.5, T3 = 0.5, T4 = 0.3, T5 = 2, T6 = 1.4. Average time to completion: (C1 + C2 + … + Cn) / n, where Ci is the time at which task Ti finishes.

  7. Greedy Solution Schedule the tasks in increasing order of duration (shortest job first): the task with the smallest duration is always the best marginal choice, since it delays the remaining tasks the least.
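The shortest-job-first greedy choice can be sketched in a few lines of C++ (the function name scheduleSJF is illustrative, not from the slides):

```cpp
#include <algorithm>
#include <vector>

// Greedy scheduling: run tasks in increasing order of duration
// (shortest job first) and return the average completion time.
double scheduleSJF(std::vector<double> durations) {
    std::sort(durations.begin(), durations.end());
    double finish = 0.0, total = 0.0;
    for (double d : durations) {
        finish += d;      // this task's completion time
        total += finish;  // accumulate all completion times
    }
    return total / durations.size();
}
```

Any other order can only raise the average, because a longer task scheduled earlier delays every task after it.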

  8. Proof of Correctness A common technique for proving the correctness of greedy algorithms is proof by contradiction: • assume there is a non-greedy best way • show that making that way more like the greedy solution improves the supposedly best way: contradiction! Proof of correctness for greedy scheduling: suppose a supposedly optimal schedule runs some longer task immediately before a shorter one. Swapping the two leaves every other completion time unchanged but lets the shorter task finish earlier, lowering the average completion time: contradiction.

  9. Divide & Conquer • Divide problem into multiple smaller parts • Solve smaller parts • Solve base cases directly • Otherwise, solve subproblems recursively • Merge solutions together (Conquer!) Often leads to elegant and simple recursive implementations.
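Mergesort is the cleanest instance of this pattern; a short C++ sketch (illustrative, not the slides' code):

```cpp
#include <algorithm>
#include <vector>

// Divide the vector in half, sort each half recursively,
// then merge the two sorted halves back together (conquer).
void mergeSort(std::vector<int>& v) {
    if (v.size() <= 1) return;  // base case: solved directly
    std::size_t mid = v.size() / 2;
    std::vector<int> left(v.begin(), v.begin() + mid);
    std::vector<int> right(v.begin() + mid, v.end());
    mergeSort(left);
    mergeSort(right);
    std::merge(left.begin(), left.end(),
               right.begin(), right.end(), v.begin());
}
```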

  10. Divide & Conquer in Action • Mergesort • Quicksort • buildHeap • buildTree • Closest points

  11. Closest Points Problem • Given: • a group of points {(x1, y1), …, (xn, yn)} • Return the distance between the closest pair of points (in the slide's example figure, that distance is 0.75)

  12. Closest Points Algorithm Closest pair is: • closest pair on left, or • closest pair on right, or • closest pair spanning the middle. runtime: T(n) = 2T(n/2) + O(n²) = O(n²) if all pairs spanning the middle are checked naively.

  13. Closest Points Algorithm Refined Closest pair is: • closest pair on left, or • closest pair on right, or • closest pair in a middle strip of width δ on each side, checking only points within one δ of each other vertically (δ is the smaller of the two recursive answers, so each strip point needs only a constant number of comparisons). runtime: T(n) = 2T(n/2) + O(n log n) = O(n log² n), or O(n log n) if the points are presorted by y.
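The refined algorithm can be sketched in C++ as follows (assuming the input is presorted by x; Point and closestPair are illustrative names, not from the slides):

```cpp
#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>

struct Point { double x, y; };

double dist(const Point& a, const Point& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Closest-pair distance among pts[lo..hi), which must be sorted by x.
double closestPair(const std::vector<Point>& pts, int lo, int hi) {
    if (hi - lo < 2) return std::numeric_limits<double>::infinity();
    if (hi - lo == 2) return dist(pts[lo], pts[lo + 1]);
    int mid = (lo + hi) / 2;
    double midX = pts[mid].x;
    // Closest pair entirely on the left or entirely on the right.
    double d = std::min(closestPair(pts, lo, mid), closestPair(pts, mid, hi));
    // Collect points within d of the dividing line and sort them by y.
    std::vector<Point> strip;
    for (int i = lo; i < hi; ++i)
        if (std::fabs(pts[i].x - midX) < d) strip.push_back(pts[i]);
    std::sort(strip.begin(), strip.end(),
              [](const Point& a, const Point& b) { return a.y < b.y; });
    // Compare each strip point only to neighbors within d vertically.
    for (std::size_t i = 0; i < strip.size(); ++i)
        for (std::size_t j = i + 1;
             j < strip.size() && strip[j].y - strip[i].y < d; ++j)
            d = std::min(d, dist(strip[i], strip[j]));
    return d;
}
```

Sorting the strip by y at every level is what gives the O(n log² n) bound; presorting once by y removes that extra log factor.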

  14. Memoizing/Dynamic Programming • Define problem in terms of smaller subproblems • Solve and record solution for base cases • Build solutions for subproblems up from solutions to smaller subproblems Can improve runtime of divide & conquer algorithms that have shared subproblems with optimal substructure. Usually involves a table of subproblem solutions.

  15. Dynamic Programming in Action • Sequence Alignment • Fibonacci numbers • All pairs shortest path • Optimal Binary Search Tree

  16. Memoized “Divide” & Conquer Fibonacci Numbers: F(n) = F(n - 1) + F(n - 2), F(0) = 1, F(1) = 1. Plain divide & conquer (exponential time, since shared subproblems are recomputed):

    int fib(int n) {
      if (n <= 1) return 1;
      else return fib(n - 1) + fib(n - 2);
    }

  Memoized version (each subproblem solved once); note the cache must be grown before it is indexed:

    int fib(int n) {
      static vector<int> fibs;
      if (n <= 1) return 1;
      if ((int)fibs.size() <= n) fibs.resize(n + 1, 0);
      if (fibs[n] == 0) fibs[n] = fib(n - 1) + fib(n - 2);
      return fibs[n];
    }

  17. Optimal Binary Search Tree Problem • Given: • a set of words {w1, …, wn} • probabilities of each word’s occurrence {p1, …, pn} • Produce a Binary Search Tree which includes all the words and has the lowest expected cost: Expected cost = p1d1 + p2d2 + … + pndn (where di is the depth of word i in the tree, counting the root as depth 1, so each lookup costs the number of nodes on its search path)

  18. The Optimal BST • Shared Subproblems • Optimal Substructure (figure: candidate BSTs over words such as And, Ever, Falcon, Meet, Millenium, to, Zaphod, with subtrees over the same word ranges appearing in several candidates) Can an optimal solution possibly have suboptimal subtrees?

  19. Optimal BST Cost Let CLeft,Right be the cost of the optimal subtree between wLeft and wRight. Then, CLeft,Right is: the minimum over Left ≤ root ≤ Right of (CLeft,root-1 + Croot+1,Right) + (pLeft + … + pRight), since placing a root pushes every word in the range one level deeper, adding each probability once. Let’s maintain a 2-D table to store values of Ci,j
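The table can be filled bottom-up, shortest word ranges first, so every subproblem a range needs is already available. A C++ sketch of the cost computation (the name optimalBSTCost and the half-open range indexing are mine; cost counts the root at depth 1, as in the expected-cost formula):

```cpp
#include <algorithm>
#include <limits>
#include <vector>

// Cost of the optimal BST over words 0..n-1 with occurrence
// probabilities p, counting each word once per level on its path.
double optimalBSTCost(const std::vector<double>& p) {
    int n = p.size();
    // C[l][r] = optimal cost over the half-open word range [l, r).
    std::vector<std::vector<double>> C(n + 1, std::vector<double>(n + 1, 0.0));
    std::vector<double> prefix(n + 1, 0.0);  // prefix sums of probabilities
    for (int i = 0; i < n; ++i) prefix[i + 1] = prefix[i] + p[i];
    // Fill by increasing range length, so smaller subproblems are ready.
    for (int len = 1; len <= n; ++len) {
        for (int l = 0; l + len <= n; ++l) {
            int r = l + len;
            double best = std::numeric_limits<double>::infinity();
            for (int root = l; root < r; ++root)
                best = std::min(best, C[l][root] + C[root + 1][r]);
            // Every word in [l, r) sinks one level under the chosen root.
            C[l][r] = best + (prefix[r] - prefix[l]);
        }
    }
    return C[0][n];
}
```

Recording the best root for each range alongside its cost would let the tree itself be reconstructed afterward.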

  20. Optimal BST Algorithm (figure: the 2-D table of Ci,j values, with rows and columns indexed by word ranges over a, am, and, egg, if, the, two; the table is filled diagonal by diagonal, from short ranges to long ones)

  21. To Do • Finish Project IV!!!!!! • Read Chapter 10 (algorithmic techniques) • Work on final review (remember you can and should work with others)

  22. Coming Up • Course Wrap-up • Project IV due (March 7th; that’s tomorrow!) • Final Exam (March 13th)
