
Steps in Algorithm Design: Stable Matching Problem and Dijkstra's Algorithm

This review covers the main steps in algorithm design and analyzes the Stable Matching problem using the Gale-Shapley algorithm. It also explains Dijkstra's algorithm for finding the shortest path in a graph.

Presentation Transcript


  1. CSE 331: Review August 1, 2013

  2. Main Steps in Algorithm Design: Problem Statement (real-world problem) → Problem Definition (precise mathematical definition) → Algorithm → “Implementation” (data structures) → Analysis (correctness / run time)

  3. Stable Matching Problem: Gale-Shapley Algorithm

  4. Stable Marriage problem. Input: a set of men M and a set of women W, each with preferences (a ranking of potential spouses). Output: a stable matching. A matching is a subset of M × W with no polygamy; a perfect matching is a matching in which everyone gets married. An instability: m is matched to w and m’ to w’, but m prefers w’ to w and w’ prefers m to m’. Stable matching = perfect matching + no instability.

  5. Gale-Shapley Algorithm (at most n² iterations; each iteration has an O(1)-time implementation)
     Initially all men and women are free
     While there exists a free woman who can propose
         Let w be such a woman and m the best man she has not yet proposed to
         w proposes to m
         If m is free
             (m, w) get engaged
         Else, with (m, w’) currently engaged
             If m prefers w’ to w
                 w remains free
             Else
                 (m, w) get engaged and w’ becomes free
     Output the engaged pairs as the final output
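
A minimal Python sketch of the woman-proposing version above, assuming complete preference lists; the function name and the dict-of-ranked-lists data layout are illustrative choices, not from the slides:

```python
def gale_shapley(women_prefs, men_prefs):
    """Woman-proposing Gale-Shapley (assumes complete preference lists).

    women_prefs[w] is w's list of men, best first;
    men_prefs[m] is m's list of women, best first.
    Returns a dict mapping each man to the woman he ends up engaged to.
    """
    # rank[m][w] = position of w in m's list (lower = more preferred)
    rank = {m: {w: i for i, w in enumerate(prefs)}
            for m, prefs in men_prefs.items()}
    next_proposal = {w: 0 for w in women_prefs}    # next man on w's list
    engaged_to = {}                                # man -> woman
    free_women = list(women_prefs)

    while free_women:
        w = free_women.pop()
        m = women_prefs[w][next_proposal[w]]       # best man w has not yet proposed to
        next_proposal[w] += 1
        if m not in engaged_to:                    # m is free
            engaged_to[m] = w
        elif rank[m][w] < rank[m][engaged_to[m]]:  # m prefers w to his current partner
            free_women.append(engaged_to[m])       # his partner becomes free
            engaged_to[m] = w
        else:                                      # m rejects w
            free_women.append(w)
    return engaged_to
```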

  6. GS algorithm: Firefly Edition (worked example with Inara, Mal, Zoe, Wash, Kaylee, and Simon; preference tables shown on the slide)

  7. GS algorithm outputs a stable matching. Lemma 1: GS outputs a perfect matching S. Lemma 2: S has no instability.

  8. Proof technique du jour: proof by contradiction. Assume the negation of what you want to prove; after some reasoning, derive a contradiction.

  9. Two observations. Obs 1: Once m is engaged, he keeps getting engaged to “better” women. Obs 2: If w proposes to m’ first and then to m (or never proposes to m), then she prefers m’ to m.

  10. Proof of Lemma 2, by contradiction. Assume there is an instability (m, w’): m is matched to w, m’ is matched to w’, m prefers w’ to w, and w’ prefers m to m’. Note that w’ last proposed to m’.

  11. Contradiction by case analysis, depending on whether w’ had proposed to m or not. Case 1: w’ never proposed to m. By Obs 2, w’ prefers m’ to m, contradicting the assumption that w’ prefers m to m’.

  12. Case 2: w’ had proposed to m. Case 2.1: m had accepted the proposal from w’. Since m is finally engaged to w, by Obs 1, m prefers w to w’. Case 2.2: m had rejected the proposal from w’. Then m was engaged to some w’’ whom he prefers to w’; by Obs 1, m is finally engaged to w and prefers w to w’’, so m prefers w to w’. Either case contradicts the assumption that m prefers w’ to w.

  13. Overall structure of the case analysis: Did w’ propose to m? If so, did m accept the proposal from w’? Every branch ends in a contradiction.

  14. Graph Searching: BFS/DFS

  15. O(m+n) BFS Implementation (input graph as an adjacency list; the version in KT also computes a BFS tree)
     BFS(s)
         CC[s] = T and CC[w] = F for every w ≠ s       (CC stored as an array)
         Set i = 0
         Set L0 = {s}                                   (each layer stored as a linked list)
         While Li is not empty
             Li+1 = Ø
             For every u in Li
                 For every edge (u, w)
                     If CC[w] = F then
                         CC[w] = T
                         Add w to Li+1
             i++
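
A minimal Python sketch of the layered BFS above; representing the graph as a dict of neighbor lists is an illustrative assumption:

```python
def bfs_layers(adj, s):
    """Layered BFS over an adjacency-list graph, O(m + n).

    adj maps every vertex to a list of neighbors; returns the list of
    layers L0, L1, ... of vertices reachable from s.
    """
    discovered = {s}          # plays the role of the CC[] array
    layers = [[s]]            # L0 = {s}
    while layers[-1]:
        next_layer = []
        for u in layers[-1]:
            for w in adj[u]:
                if w not in discovered:
                    discovered.add(w)
                    next_layer.append(w)
        layers.append(next_layer)
    return layers[:-1]        # drop the final empty layer
```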

  16. An illustration (BFS run on an 8-vertex example graph; figure shown on the slide)

  17. O(m+n) DFS implementation (the routine below mirrors the queue-based BFS; using a stack in place of the queue gives DFS)
     BFS(s)
         CC[s] = T and CC[w] = F for every w ≠ s        O(n)
         Initialize Q = {s}                              O(1)
         While Q is not empty                            (body repeated at most once per vertex u: Σu O(nu) = O(Σu nu) = O(m))
             Delete the front element u in Q
             For every edge (u, w)                       (repeated nu times)
                 If CC[w] = F then                       O(nu) total, O(1) each
                     CC[w] = T
                     Add w to the back of Q
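
For comparison with the stack-based run on the next slide, here is a minimal Python sketch of DFS with an explicit stack (a vertex is marked explored when it is popped, as in KT); the names and graph representation are illustrative:

```python
def dfs(adj, s):
    """Iterative DFS with an explicit stack, O(m + n) overall.

    adj maps every vertex to a list of neighbors; returns the vertices
    reachable from s in the order they are first explored.
    """
    explored = set()
    order = []
    stack = [s]                     # LIFO: the last vertex pushed is processed first
    while stack:
        u = stack.pop()
        if u not in explored:
            explored.add(u)
            order.append(u)
            for w in adj[u]:
                if w not in explored:
                    stack.append(w)
    return order
```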

  18. A DFS run using an explicit stack (worked example on the 8-vertex graph; the evolving stack contents are shown on the slide)

  19. Topological Ordering

  20. Run of TopOrd algorithm

  21. Greedy Algorithms

  22. Interval Scheduling: Maximum Number of Intervals (Schedule by Finish Time)

  23. End of Semester Blues. You can only do one thing on any given day: what is the maximum number of tasks you can do? (The slide shows example tasks as intervals over Monday–Friday: write up a term paper, party, exam study, 331 HW, project.)

  24. Schedule by Finish Time
     Sort intervals such that f(i) ≤ f(i+1)                     O(n log n) time
     Build array s[1..n] s.t. s[i] = start time of i            O(n) time
     Set A to be the empty set
     While R is not empty                                       (do the removal on the fly)
         Choose i in R with the earliest finish time
         Add i to A
         Remove all requests that conflict with i from R
     Return A* = A
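
A minimal Python sketch of the schedule-by-finish-time rule above; the (start, finish) tuple format and the single-scan, on-the-fly conflict removal are illustrative choices:

```python
def schedule_by_finish_time(intervals):
    """Greedy interval scheduling: repeatedly pick the request with the
    earliest finish time and skip requests that conflict with it.

    intervals is a list of (start, finish) pairs; returns the chosen subset.
    """
    chosen = []
    last_finish = float("-inf")
    # Sorting by finish time is the O(n log n) step; the scan below is O(n).
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:          # does not conflict with the last chosen request
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```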

  25. The final algorithm: order tasks by their END time. (The slide shows the resulting greedy schedule on the Monday–Friday example: term paper, party, exam study, 331 HW, project.)

  26. Proof of correctness uses “greedy stays ahead”

  27. Interval Scheduling: Maximum Intervals (Schedule by Finish Time)

  28. Scheduling to minimize lateness. All the tasks have to be scheduled. GOAL: minimize the maximum lateness. (The slide reuses the Monday–Friday task example.)

  29. The Greedy Algorithm (assume jobs sorted by deadline: d1 ≤ d2 ≤ … ≤ dn)
     f = s
     For every i in 1..n do
         Schedule job i from s(i) = f to f(i) = f + ti
         f = f + ti
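
A minimal Python sketch of the earliest-deadline-first schedule above; the (duration, deadline) input format is an assumption for illustration:

```python
def schedule_min_lateness(jobs, start_time=0):
    """Earliest-deadline-first scheduling with no idle time.

    jobs is a list of (duration, deadline) pairs; returns the list of
    (start, finish, deadline) triples and the maximum lateness.
    """
    f = start_time
    schedule = []
    max_lateness = 0
    # Sort by deadline, then pack the jobs back to back.
    for t_i, d_i in sorted(jobs, key=lambda job: job[1]):
        s_i, f = f, f + t_i
        schedule.append((s_i, f, d_i))
        max_lateness = max(max_lateness, f - d_i)
    return schedule, max_lateness
```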

  30. Proof of Correctness uses “Exchange argument”

  31. Proved the following: (i) any two schedules with 0 idle time and 0 inversions have the same max lateness; (ii) the greedy schedule has 0 idle time and 0 inversions; (iii) there is an optimal schedule with 0 idle time and 0 inversions.

  32. Shortest Path in a Graph with Non-negative Edge Weights: Dijkstra’s Algorithm

  33. Shortest Path problem. Input: directed graph G=(V,E), edge lengths le for e in E, a “start” vertex s in V. Output: shortest paths from s to all nodes in V. (The slide shows a small example graph with vertices s, u, w and edge lengths 5, 15, 100.)

  34. Dijkstra’s shortest path algorithm, using d’(w) = min over edges e = (u, w) in E with u in R of d(u) + le
     Input: directed G=(V,E), le ≥ 0, s in V
     R = {s}, d(s) = 0
     While there is an x not in R with (u, x) in E, u in R
         Pick the w that minimizes d’(w)
         Add w to R
         d(w) = d’(w)
     (The slide traces a run on a six-vertex example, ending with d(s) = 0, d(u) = 1, d(w) = 2, d(x) = 2, d(y) = 3, d(z) = 4.)

  35. Dijkstra’s shortest path algorithm (formal)
     Input: directed G=(V,E), le ≥ 0, s in V
     S = {s}, d(s) = 0
     While there is a v not in S with (u, v) in E, u in S       (at most n iterations)
         Pick the w that minimizes d’(w)                         (O(m) time)
         Add w to S
         d(w) = d’(w)
     The O(mn) time bound is trivial; an O(m log n) time implementation is possible.
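
One way to obtain the O(m log n) bound mentioned above is a binary heap of candidate d’ values; a minimal Python sketch, with the adjacency-dict representation as an illustrative assumption:

```python
import heapq

def dijkstra(adj, s):
    """Dijkstra's algorithm with a binary heap, O(m log n).

    adj maps each vertex u to a list of (w, length) pairs with length >= 0;
    returns a dict of shortest-path distances d from s.
    """
    d = {}                          # finalized distances (the set S)
    heap = [(0, s)]                 # candidate d'(v) values
    while heap:
        dist, u = heapq.heappop(heap)
        if u in d:                  # stale entry: u was already added to S
            continue
        d[u] = dist
        for w, length in adj.get(u, []):
            if w not in d:
                heapq.heappush(heap, (dist + length, w))
    return d
```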

  36. Proved that d’(v) is best when v is added

  37. Minimum Spanning Tree: Kruskal/Prim

  38. Minimum Spanning Tree (MST). Input: a connected graph G=(V,E), ce > 0 for every e in E. Output: a tree containing all of V that minimizes the sum of edge weights.

  39. Kruskal’s Algorithm
     Input: G=(V,E), ce > 0 for every e in E
     T = Ø
     Sort edges in increasing order of their cost
     Consider edges in sorted order:
         If an edge can be added to T without adding a cycle, then add it to T
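
A minimal Python sketch of Kruskal's algorithm; the union-find structure used to detect cycles and the (cost, u, v) edge format are illustrative choices not spelled out on the slide:

```python
def kruskal(n, edges):
    """Kruskal's MST algorithm with a simple union-find.

    n is the number of vertices (labeled 0..n-1) and edges is a list of
    (cost, u, v) triples; returns the list of MST edges.
    """
    parent = list(range(n))

    def find(x):
        while parent[x] != x:            # path halving keeps trees shallow
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for cost, u, v in sorted(edges):     # consider edges in increasing cost
        ru, rv = find(u), find(v)
        if ru != rv:                     # adding (u, v) creates no cycle
            parent[ru] = rv
            tree.append((cost, u, v))
    return tree
```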

  40. Prim’s algorithm (similar to Dijkstra’s algorithm)
     Input: G=(V,E), ce > 0 for every e in E
     S = {s}, T = Ø
     While S is not the same as V
         Among edges e = (u, w) with u in S and w not in S, pick one with minimum cost
         Add w to S and e to T
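
A minimal Python sketch of Prim's algorithm using a heap of crossing edges, echoing its similarity to Dijkstra; the adjacency-dict representation is an illustrative assumption:

```python
import heapq

def prim(adj, s):
    """Prim's MST algorithm with a binary heap.

    adj maps each vertex u to a list of (w, cost) pairs of an undirected,
    connected graph; returns the list of MST edges as (cost, u, w) triples.
    """
    in_tree = {s}
    tree = []
    heap = [(cost, s, w) for w, cost in adj[s]]   # edges crossing the cut (S, V \ S)
    heapq.heapify(heap)
    while heap:
        cost, u, w = heapq.heappop(heap)
        if w in in_tree:                 # edge no longer crosses the cut
            continue
        in_tree.add(w)
        tree.append((cost, u, w))
        for x, c in adj[w]:
            if x not in in_tree:
                heapq.heappush(heap, (c, w, x))
    return tree
```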

  41. Cut Property Lemma for MSTs. Condition: S and V \ S are non-empty. Assumption: all edge costs are distinct. Conclusion: the cheapest crossing edge is in all MSTs.

  42. Divide & Conquer

  43. Sorting: Merge-Sort

  44. Sorting. Given n numbers, order them from smallest to largest. (Works for any set of elements on which there is a total order.)

  45. Mergesort algorithm
     Input: a1, a2, …, an. Output: numbers in sorted order.
     MergeSort(a, n)
         If n = 2, return the order min(a1, a2); max(a1, a2)
         aL = a1, …, an/2
         aR = an/2+1, …, an
         return MERGE(MergeSort(aL, n/2), MergeSort(aR, n/2))

  46. An example run of MergeSort on the eight numbers 1, 2, 3, 4, 8, 19, 51, 100 (recursion tree shown on the slide), using the pseudocode from the previous slide.

  47. Correctness: by induction on n; the inductive step follows from the correctness of MERGE.
     MergeSort(a, n)
         If n = 1, return a1
         If n = 2, return the order min(a1, a2); max(a1, a2)
         aL = a1, …, an/2
         aR = an/2+1, …, an
         return MERGE(MergeSort(aL, n/2), MergeSort(aR, n/2))
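
A minimal Python sketch of MergeSort with the n = 1 base case included, as on this slide; MERGE is written out here since the slides leave it implicit:

```python
def merge_sort(a):
    """MergeSort: split, recurse on the two halves, then MERGE them."""
    n = len(a)
    if n <= 1:
        return list(a)
    left = merge_sort(a[: n // 2])
    right = merge_sort(a[n // 2:])
    return merge(left, right)

def merge(left, right):
    """Merge two sorted lists into one sorted list in O(n) time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```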

  48. Counting Inversions: Merge-Count

  49. Mergesort-Count algorithm
     Input: a1, a2, …, an. Output: numbers in sorted order + #inversions.
     Running time: T(2) = c, T(n) = 2T(n/2) + cn, which solves to O(n log n).
     MergeSortCount(a, n)
         If n = 1, return (0, a1)
         If n = 2, return (a1 > a2, min(a1, a2); max(a1, a2))
         aL = a1, …, an/2
         aR = an/2+1, …, an
         (cL, aL) = MergeSortCount(aL, n/2)
         (cR, aR) = MergeSortCount(aR, n/2)
         (c, a) = MERGE-COUNT(aL, aR)        (counts #crossing inversions + MERGE, O(n))
         return (c + cL + cR, a)
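
A minimal Python sketch of MergeSortCount; MERGE-COUNT is written out here (the slides leave it implicit), counting one crossing inversion per remaining left element whenever a right element is copied first:

```python
def merge_sort_count(a):
    """Sort a and count inversions: pairs i < j with a[i] > a[j]."""
    n = len(a)
    if n <= 1:
        return 0, list(a)
    c_left, left = merge_sort_count(a[: n // 2])
    c_right, right = merge_sort_count(a[n // 2:])
    c_cross, merged = merge_count(left, right)
    return c_left + c_right + c_cross, merged

def merge_count(left, right):
    """MERGE two sorted halves and count crossing inversions."""
    out, i, j, crossings = [], 0, 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
            crossings += len(left) - i   # right[j] precedes every remaining left element
    out.extend(left[i:])
    out.extend(right[j:])
    return crossings, out
```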

  50. Closest Pair of Points: Closest Pair of Points Algorithm
