Algorithm Design and Analysis (알고리즘 설계 및 분석) — Foundations of Algorithms — 유관우
Chap 4. Greedy Approach
• Grabs data items in sequence, each time making the "best" choice without considering the future.
• Efficient, but cannot solve many problems that dynamic programming can.
• G.A.: makes one locally optimal choice at a time.
• D.P.: combines optimal solutions to smaller problems into an overall optimal solution.
• (Eg) Coin change problem:
  • Goal: give correct change with as few coins as possible.
  • Approach: always choose the largest coin that still fits. The hope is that locally optimal choices yield an optimal solution. No reconsideration!
  • Q: Is the solution really optimal? (A proof is required!)
  • (Eg) The Korean coin system: the Greedy Approach works.
  • In a strange country with coins ⑧, ⑥, ②, how do we make 12? Greedy gives 8+2+2 (3 coins), but 6+6 (2 coins) is optimal ⇒ Dynamic Programming is needed. Digital Media Lab.
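The coin-change discussion above can be sketched in a few lines. This is a minimal illustration, not part of the original slides; the two coin systems are the slide's examples (Korean-style denominations are shown with 500/100/50/10 as an assumption).

```python
# Greedy coin change: repeatedly take the largest coin that still fits.
def greedy_change(coins, amount):
    """Return the coins chosen greedily (largest first), or None if stuck."""
    chosen = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            chosen.append(c)
    return chosen if amount == 0 else None

# Korean-style denominations: greedy happens to be optimal.
print(greedy_change([500, 100, 50, 10], 660))   # [500, 100, 50, 10]

# "Strange country" coins {8, 6, 2}, amount 12:
# greedy picks 8+2+2 (3 coins), but 6+6 (2 coins) is optimal.
print(greedy_change([8, 6, 2], 12))             # [8, 2, 2]
```

Note that greedy never reconsiders a choice, which is exactly why it fails on the second system.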
Minimum Spanning Trees (MST)
[Figure: a connected weighted undirected graph G = (V, E), one of its spanning trees, and an MST]
• Connected weighted undirected graph G = (V, E) (vs. a disconnected graph, which has a spanning forest instead)
• Tree: connected acyclic graph
  • n vertices, (n-1) edges, connected
  • Unique path between any 2 vertices
• Rooted tree, spanning tree
• Weight of a spanning tree T = (V, F): the sum of the weights of the edges in F
• M.S.T.: a spanning tree with minimum such weight
• Input: adjacency matrix W of a connected, weighted, undirected graph G = (V, E)
  Output: an MST T = (V, F), F ⊆ E, |F| = |V|-1
• Uniqueness? An MST need not be unique.
• Greedy Approach:
  F = Ø                                            {initialization}
  while not solved yet do {
      select an edge (locally optimal choice);     {selection}
      if it does not create a cycle then add it;   {feasibility}
      if T = (V, F) is a spanning tree then exit;  {solution check}
  }
• Prim's Alg. and Kruskal's Alg. make different locally optimal choices.
Prim's Algorithm
[Figure: step-by-step trace of Prim's algorithm on the input graph, growing Y from {v1}]

F = Ø; Y = {v1};                                   // initialization
while not solved do {
    select v ∈ V-Y nearest to Y;                   // selection + feasibility
    add v to Y; add the corresponding edge to F;
    if Y = V then exit;                            // solution check
}

Trace on the input graph:
  F = Ø, Y = {v1}
  F = {(v1,v2)}, Y = {v1,v2}
  F = {(v1,v2),(v2,v3)}, Y = {v1,v2,v3}
  (v3,v4) → F, v4 → Y
  (v3,v5) → F, v5 → Y
// assumes no negative-weight edges
Data structures:
  n×n adjacency matrix W
  nearest[i] = index of the vertex in Y nearest to vi
  distance[i] = weight of the edge (vi, nearest[i])

procedure prim(n, W; var F: set_of_edges);
var i, near: index; min: number; e: edge;
    nearest: array[2..n] of index;
    distance: array[2..n] of number;
{
  F = Ø;
  for i = 2 to n do { nearest[i] = 1; distance[i] = W[1,i]; }
  repeat n-1 times {
    min = ∞;
    for i = 2 to n do
      if 0 ≤ distance[i] < min then { min = distance[i]; near = i; }
    e = (near, nearest[near]); add e to F;
    distance[near] = -1;                  // mark near as a member of Y
    for i = 2 to n do
      if W[i,near] < distance[i] then
        { distance[i] = W[i,near]; nearest[i] = near; }
  }
}
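The pseudocode above translates almost line for line into Python. This is a sketch using the slide's `nearest`/`distance` arrays; the example matrix encodes the 5-vertex graph from the figures (weights as best recoverable: v1v2=1, v1v3=3, v2v3=3, v2v4=6, v3v4=4, v3v5=2, v4v5=5), with vertices renumbered 0..4. Ties may be broken differently than in the slide's trace.

```python
import math

def prim(W):
    """W: adjacency matrix with math.inf for missing edges. Returns MST edges."""
    n = len(W)
    nearest = [0] * n                     # vertex 0 plays the role of v1
    distance = [W[0][i] for i in range(n)]
    distance[0] = -1                      # v1 starts in Y
    F = []
    for _ in range(n - 1):
        # selection: the vertex outside Y nearest to Y
        near = min((i for i in range(n) if distance[i] >= 0),
                   key=lambda i: distance[i])
        F.append((nearest[near], near))
        distance[near] = -1               # move `near` into Y
        for i in range(n):                # update distances through `near`
            if distance[i] >= 0 and W[near][i] < distance[i]:
                distance[i] = W[near][i]
                nearest[i] = near
    return F

inf = math.inf
W = [[0,   1,   3,   inf, inf],
     [1,   0,   3,   6,   inf],
     [3,   3,   0,   4,   2  ],
     [inf, 6,   4,   0,   5  ],
     [inf, inf, 2,   5,   0  ]]
print(prim(W))   # [(0, 1), (0, 2), (2, 4), (2, 3)] — total weight 10
```

The every-case Θ(n²) behaviour is visible directly: each of the n-1 iterations scans all n vertices twice.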
(Every-case) time complexity analysis: T(n) = 2(n-1)(n-2) ∈ Θ(n²)
Q1: Does it produce a spanning tree? Yes (easy).
Q2: A minimum spanning tree? A formal proof is required.
Q3: How to implement F?
  var parent: array[2..n] of index; (parameter)
  e = (near, nearest[near]) ⇒ parent[near] = nearest[near]
  Better way: the array nearest (made a parameter instead of a local variable) already holds this information!
• Proof for MST (correctness of Prim's Alg.):
  F ⊆ E is promising if F ⊆ T for some MST T.
  (Eg) {(v1,v2),(v1,v3)}: promising; {(v2,v4)}: not promising
• Lemma: Let F be a promising subset of E, Y the set of vertices connected by F, and e a minimum-weight edge between Y and V-Y. Then F ∪ {e} is promising.
(Proof) F is promising ⇒ ∃ MST (V, F′) s.t. F ⊆ F′.
  If e ∈ F′, then F ∪ {e} ⊆ F′ and we are done.
  Otherwise (e ∉ F′): F′ ∪ {e} contains exactly one cycle.
  The cycle contains some e′ ∈ F′ that connects a vertex in Y to a vertex in V-Y, and weight(e′) ≥ weight(e).
  F′ ∪ {e} − {e′} is an MST, and F ∪ {e} ⊆ F′ ∪ {e} − {e′} ⇒ F ∪ {e} is promising. ∎
• Theorem: Prim's Algorithm produces an MST.
(Proof by induction)
  Basis: Ø is promising.
  I.H.: Assume F is promising.
  I.S.: Let e be the edge selected in the next iteration; F ∪ {e} is promising by the Lemma. ∎
[Figure: edge e between Y and V-Y, and edge e′ of F′ crossing the same cut]
Kruskal's Algorithm
[Figure: trace of Kruskal's algorithm; the disjoint sets {v1}, …, {v5} are merged edge by edge]

F = Ø;
create n disjoint subsets {v1}, {v2}, …, {vn};
sort the edges in E by weight;
while not solved do {
    select the next edge;                          // selection
    if the edge connects 2 disjoint subsets {      // feasibility
        merge the subsets; add the edge to F; }
    if all the subsets are merged then exit;       // solution check
}

Edges of the input graph in sorted order:
  (v1,v2) 1
  (v3,v5) 2
  (v1,v3) 3
  (v2,v3) 3
  (v3,v4) 4
  (v4,v5) 5
  (v2,v4) 6
procedure kruskal(n, m: integer; E: set_of_edges; var F: set_of_edges);
var i, j: index; p, q: set_pointer; e: edge;
{
  sort the m edges of E by weight;
  F = Ø;
  initial(n);                       // initialize n disjoint sets
  repeat {
    e = next edge; (i, j) = e;
    p = find(i); q = find(j);
    if p ≠ q then { merge(p, q); add e to F; }
  } until |F| = n-1;
}
• Worst-case time complexity analysis: Θ(m log m)
  • Sorting: Θ(m log m)
  • Initialization: Θ(n)
  • Total time for find/merge: Θ(m α(m, n))  (α grows extremely slowly)
  • Remaining operations: O(m)
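A sketch of the procedure above, with the `initial`/`find`/`merge` operations filled in by a simple union-find (union by size, path compression); the particular union-find variant is my choice, not the slide's. The edge list is the example graph with vertices renumbered 0..4.

```python
def kruskal(n, edges):
    """edges: list of (weight, u, v). Returns MST edges as (u, v, weight)."""
    parent = list(range(n))           # initial(n): n singleton sets
    size = [1] * n

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    F = []
    for w, u, v in sorted(edges):     # sort the m edges by weight
        p, q = find(u), find(v)
        if p != q:                    # feasibility: edge joins 2 disjoint sets
            if size[p] < size[q]:
                p, q = q, p
            parent[q] = p             # merge(p, q)
            size[p] += size[q]
            F.append((u, v, w))
            if len(F) == n - 1:       # solution check
                break
    return F

edges = [(1, 0, 1), (3, 0, 2), (3, 1, 2), (6, 1, 3),
         (4, 2, 3), (2, 2, 4), (5, 3, 4)]
print(kruskal(5, edges))  # [(0, 1, 1), (2, 4, 2), (0, 2, 3), (2, 3, 4)]
```

Sorting dominates, giving the Θ(m log m) bound; the union-find work is only Θ(m α(m, n)).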
• Lemma: Let F ⊆ E be promising and e a minimum-weight edge in E-F s.t. F ∪ {e} has no cycle. Then F ∪ {e} is promising.
(Proof) F is promising ⇒ ∃ MST (V, F′) with F ⊆ F′.
  If e ∈ F′, then F ∪ {e} ⊆ F′, done.
  Otherwise: F′ ∪ {e} contains exactly one cycle.
  Since F ∪ {e} has no cycle, ∃ e′ ∈ F′ in the cycle with e′ ∉ F, i.e., e′ ∈ E-F.
  F ∪ {e′} ⊆ F′ ⇒ F ∪ {e′} has no cycle.
  Therefore weight(e) ≤ weight(e′).
  F′ ∪ {e} − {e′} is also an MST, and F ∪ {e} ⊆ F′ ∪ {e} − {e′} ⇒ F ∪ {e} is promising. ∎
• Theorem: Kruskal's Alg. produces an MST.
• Prim's A.: Θ(n²); Kruskal's A.: Θ(m log m)  (Note: n-1 ≤ m ≤ n(n-1)/2)
• Prim's Alg.: Θ(m log n) with a binary heap, Θ(m + n log n) with a Fibonacci heap.
Dijkstra's Algorithm for the Single-Source Shortest-Path Problem
• All-pairs shortest-path problem: Θ(n³), Floyd's Alg. (D.P.); no negative-weight cycle allowed.
• Single-source shortest-path problem: Dijkstra's Alg., Θ(n²), Greedy Approach.
• Similar to Prim's Alg.: start with Y = {v1} (the source) and repeatedly choose the vertex nearest to v1.

Y = {v1}; F = Ø;                   // F: shortest-paths tree
while not solved do {
    select v from V-Y nearest to v1, using only vertices in Y as intermediates;
    add v to Y; add the edge touching v to F;
    if Y = V then exit;
}
• Input: adjacency matrix of a weighted directed graph (no negative-weight edges).
[Figure: trace of Dijkstra's algorithm on a 5-vertex directed example; v5, at distance 1, is selected first]
touch[i] = index of v ∈ Y s.t. <v, vi> is the last edge on the shortest path from v1 to vi using only Y as intermediates
length[i] = length of that shortest path
• Every-case time complexity: T(n) = 2(n-1)² ∈ Θ(n²)
• Binary heap: Θ(m log n); Fibonacci heap: Θ(m + n log n)
procedure dijkstra(n: integer; W; var F);
var i, near: index; e: edge;
    touch: array[2..n] of index;
    length: array[2..n] of number;
{
  F = Ø;
  for i = 2 to n do { touch[i] = 1; length[i] = W[1,i]; }
  repeat n-1 times {
    min = ∞;
    for i = 2 to n do
      if 0 ≤ length[i] < min then { min = length[i]; near = i; }
    e = (touch[near], near); add e to F;
    for i = 2 to n do
      if length[near] + W[near,i] < length[i] then
        { length[i] = length[near] + W[near,i]; touch[i] = near; }
    length[near] = -1;            // mark near as a member of Y
  }
}
• Note: touch[2..n] encodes the shortest-paths tree.
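A Python sketch of the procedure, keeping the slide's `touch`/`length` arrays. The exact weights of the slide's figure are not fully recoverable, so the matrix below is an illustrative directed graph of my own (in which v5 is indeed selected first, matching the figure's caption); vertices are renumbered 0..4.

```python
import math

def dijkstra(W, src=0):
    """W: adjacency matrix (math.inf = no edge). Returns (tree edges, dist)."""
    n = len(W)
    touch = [src] * n
    length = [W[src][i] for i in range(n)]
    length[src] = -1                       # the source starts in Y
    dist = [0] * n
    F = []
    for _ in range(n - 1):
        near = min((i for i in range(n) if length[i] >= 0),
                   key=lambda i: length[i])
        dist[near] = length[near]
        F.append((touch[near], near))
        for i in range(n):                 # relax paths through `near`
            if length[i] >= 0 and length[near] + W[near][i] < length[i]:
                length[i] = length[near] + W[near][i]
                touch[i] = near
        length[near] = -1                  # move `near` into Y
    return F, dist

inf = math.inf
W = [[0,   7,   4,   6,   1],
     [inf, 0,   inf, inf, inf],
     [inf, 2,   0,   5,   inf],
     [inf, 3,   inf, 0,   inf],
     [inf, inf, inf, 1,   0]]
print(dijkstra(W))  # ([(0, 4), (4, 3), (0, 2), (3, 1)], [0, 5, 4, 2, 1])
```

The structure is identical to Prim's algorithm; only the selection criterion changes (distance from the source via Y, rather than distance to Y).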
Scheduling
• Minimizing Total Time in the System
(Eg) t1=5, t2=10, t3=4

Schedule   Total time in the system
[1, 2, 3]  5 + (5+10) + (5+10+4) = 39
[1, 3, 2]  5 + (5+4) + (5+4+10) = 33
[2, 1, 3]  10 + (10+5) + (10+5+4) = 44
[2, 3, 1]  10 + (10+4) + (10+4+5) = 43
[3, 1, 2]  4 + (4+5) + (4+5+10) = 32  ← optimal
[3, 2, 1]  4 + (4+10) + (4+10+5) = 37

• Algorithm: sort the jobs in non-decreasing order of service time and schedule them in that order.

• Scheduling with Deadlines (each job takes one unit of time)
Job  Deadline  Profit
1    2         30
2    1         35
3    2         25
4    1         40

Feasible schedules and their profits:
Schedule  Total profit
[1, 3]    30 + 25 = 55
[2, 1]    35 + 30 = 65
[2, 3]    35 + 25 = 60
[3, 1]    25 + 30 = 55
[4, 1]    40 + 30 = 70  ← optimal
[4, 3]    40 + 25 = 65
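The shortest-job-first rule for total time in the system can be checked against the slide's table with a few lines. A minimal sketch; the brute-force check is added for illustration only.

```python
from itertools import permutations

def total_time(schedule, t):
    """Total time in the system: each job's finish time, summed over jobs."""
    finish, total = 0, 0
    for job in schedule:
        finish += t[job]
        total += finish
    return total

t = {1: 5, 2: 10, 3: 4}
greedy = sorted(t, key=t.get)           # shortest service time first
print(greedy, total_time(greedy, t))    # [3, 1, 2] 32

# Brute force over all 3! schedules confirms the greedy order is optimal.
print(min(total_time(p, t) for p in permutations(t)))  # 32
```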
Strategy: sort the jobs in non-increasing order of profit; schedule each job as late as possible (in the latest free slot at or before its deadline).

(Eg)
Job  Deadline  Profit
1    3         40
2    1         35
3    1         30
4    3         25
5    1         20
6    3         15
7    2         10

Greedy: job 1 → slot 3, job 2 → slot 1, job 3 rejected, job 4 → slot 2, jobs 5-7 rejected.
Total profit 40 + 35 + 25 = 100: optimal.
• Time complexity: Θ(n log n)
• Disjoint-set forest operations are needed (to find the latest free slot efficiently)!
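The strategy above, sketched on the 7-job example. For clarity this uses a plain backward scan to find the latest free slot; the disjoint-set forest the slide mentions would replace that inner loop to reach the Θ(n log n) bound.

```python
def schedule_with_deadlines(jobs):
    """jobs: list of (deadline, profit). Returns (total_profit, slots)."""
    max_d = max(d for d, _ in jobs)
    slot = [None] * (max_d + 1)                 # slot[1..max_d]
    total = 0
    for d, p in sorted(jobs, key=lambda j: -j[1]):   # largest profit first
        for s in range(min(d, max_d), 0, -1):        # latest free slot first
            if slot[s] is None:
                slot[s] = (d, p)
                total += p
                break                                # job placed
    return total, slot[1:]

# The slide's example: (deadline, profit) for jobs 1..7.
jobs = [(3, 40), (1, 35), (1, 30), (3, 25), (1, 20), (3, 15), (2, 10)]
print(schedule_with_deadlines(jobs)[0])   # 100
```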
G.A. versus D.P.: The Knapsack Problem
• G.A.: more efficient, simpler (but the correctness proof is harder)
• D.P.: more powerful (easier proof)
(Eg) G.A. on 0/1 knapsack: fails. D.P. on 0/1 knapsack: works.
• G.A. applied to the 0/1 knapsack problem:
  S = {item 1, item 2, …, item n}
  wi = weight of item i, pi = profit of item i
  W = maximum weight the knapsack can hold (wi, pi, W: positive integers)
  Determine A ⊆ S such that Σ_{i∈A} pi is maximized subject to Σ_{i∈A} wi ≤ W.
• Brute-force approach: consider all subsets ⇒ 2^n subsets ⇒ an exponential-time algorithm.
Greedy strategies:
① Largest profit first: incorrect.
  (eg) (w1, w2, w3) = (25, 10, 10); (p1, p2, p3) = (10, 9, 9); W = 30.
  Greedy: 10; optimal: 18.
② Lightest item first: incorrect.
③ Largest profit per unit weight first: still incorrect for 0/1 knapsack.
  (eg) (w1, w2, w3) = (5, 10, 20); (p1, p2, p3) = (50, 60, 140); W = 30.
  Greedy: 190; optimal: 200.
• G.A. on the fractional knapsack problem with strategy ③: an optimal solution is guaranteed.
  (Eg) 50 + 140 + (5/10)·60 = 220 (proof necessary).
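Strategy ③ applied to the fractional problem can be sketched as follows; this is an illustration of the slide's example, where the last item is split to fill the remaining capacity.

```python
def fractional_knapsack(weights, profits, W):
    """Take items in non-increasing profit/weight order, splitting the last."""
    items = sorted(zip(weights, profits),
                   key=lambda x: x[1] / x[0], reverse=True)
    total = 0.0
    for w, p in items:
        if W <= 0:
            break
        take = min(w, W)            # whole item, or only the remaining capacity
        total += p * take / w       # proportional profit for a fraction
        W -= take
    return total

# Slide's example: all of items 1 and 3, then 5/10 of item 2 → 220.
print(fractional_knapsack([5, 10, 20], [50, 60, 140], 30))   # 220.0
```

The same ratio-first order gave only 190 in the 0/1 setting; divisibility is what makes the greedy choice safe here.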
D.P. Approach to the 0/1 Knapsack Problem
• Principle of optimality? Yes.
  A: optimal subset of S.
  ① item n ∉ A: A is optimal for S − {item n} with capacity W.
  ② item n ∈ A: A − {item n} is optimal for S − {item n} with capacity W − wn.
• P[i, w]: optimal profit for {item 1, …, item i} with capacity w remaining:
  P[i, w] = max(P[i-1, w], pi + P[i-1, w-wi])   if wi ≤ w
          = P[i-1, w]                            otherwise
  with P[0, w] = 0 and P[i, 0] = 0.
• Maximum profit: P[n, W]
• P: array [0..n, 0..W] of integer
• Time: Θ(nW), space: Θ(nW) ⇒ a pseudo-polynomial-time algorithm.
  (Note: the 0/1 knapsack problem is NP-complete.)
  ⇒ If W is large, performance is terrible!!!
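The recurrence above as a short table-filling sketch, run on the earlier counterexample to greedy strategy ③:

```python
def knapsack_01(weights, profits, W):
    """P[i][w] = best profit using the first i items with capacity w."""
    n = len(weights)
    P = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        wi, pi = weights[i - 1], profits[i - 1]
        for w in range(W + 1):
            P[i][w] = P[i - 1][w]                        # item i not taken
            if wi <= w:                                  # item i taken
                P[i][w] = max(P[i][w], pi + P[i - 1][w - wi])
    return P[n][W]

# Where ratio-first greedy got 190, the D.P. finds the true optimum.
print(knapsack_01([5, 10, 20], [50, 60, 140], 30))   # 200
```

The two nested loops make the Θ(nW) time and space bounds immediate, and W appearing as a loop bound is exactly why the algorithm is only pseudo-polynomial.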
Refinement
• Θ(nW) time & space may be too much! ⇒ compute only the entries actually needed, at most O(2^n) of them.
Idea: not every entry of row i of P[0..n, 0..W] needs to be computed.
  n-th row: P[n, W] only (1 entry)
  (n-1)-th row: P[n-1, W], P[n-1, W-wn] (2 entries)
  (n-2)-th row: 4 entries
  …; stop when i = 1 or w ≤ 0
  (∵ P[i, w] is computed from P[i-1, w] and P[i-1, w-wi])
Total # of entries ≤ 1 + 2 + 2² + … + 2^(n-1) = 2^n − 1 ⇒ O(2^n) time.
∴ Worst-case time complexity for the 0/1 knapsack problem using D.P.: O(min(2^n, nW)).