
Minimum Spanning Tree


Presentation Transcript


  1. Minimum Spanning Tree 吳光哲、江盈宏、陳彥宏、 鍾至衡、張碧娟、余家興

  2. Outline • History of MST • Background Knowledge: Soft Heap • Algorithm at a Glance • The MST algorithm • Notations • Detailed description • Correctness

  3. History of MST (1/2) • 1926, Borůvka presented the first algorithm to solve MST. • Textbook algorithms run in O(m log n): Kruskal’s greedy algorithm, Prim’s algorithm. • 1975, Yao, O(m log log n). • 1984, Fredman and Tarjan, O(m β(m,n)). • 1987, Gabow, O(m log β(m,n)), using a new data structure: the Fibonacci heap.

  4. β(m,n) • β(m,n) is a very slowly growing function defined as follows: β(m,n) = min { i : log^(i)(n) ≤ m/n }. • The worst case is O(m log* m), which occurs when m = O(n).
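
To make the definition concrete, here is a minimal Python sketch (ours, not from the slides) that evaluates β(m, n) by applying log2 to n until the value drops to m/n; the helper name beta is hypothetical.

    import math

    def beta(m, n):
        # beta(m, n) = min { i : log^(i)(n) <= m / n }, where log^(i) is the
        # logarithm iterated i times (log^(0)(n) = n).
        i = 0
        value = float(n)
        while value > m / n:
            value = math.log2(value)
            i += 1
        return i

    print(beta(10**4, 10**3))   # -> 1, since log2(1000) ~ 9.97 <= 10000/1000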

  5. History of MST (2/2) • 1997, Bernard Chazelle, O(m α(m,n) log α(m,n)). • Problem: does there exist a deterministic linear-time algorithm that solves the MST problem? • Hope: for a given weighted connected graph G with m edges, the algorithm finds the MST of G in at most Km steps, for some constant K.

  6. α(m,n) • α(m,n) is the “inverse” of Ackermann's function (a very slowly growing function): α(m,n) = min { k ≥ 1 : A(k, m/n) > lg n }. • Ackermann’s function is defined by A(1,m) = 2^m, A(n,1) = A(n-1, 2), A(n,m) = A(n-1, A(n,m-1)).
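
A small sketch (helper names ours) of the slide's A and of α(m, n). Because A explodes doubly-exponentially, values are capped at 2^64 here, which is harmless since they are only compared against lg n.

    import math

    CAP = 1 << 64

    def ackermann(i, j):
        # A(1, j) = 2^j ; A(i, 1) = A(i-1, 2) ; A(i, j) = A(i-1, A(i, j-1)).
        # Results are capped at CAP; anything larger behaves like "infinity"
        # for the comparison against lg n below.
        if i == 1:
            return CAP if j >= 64 else 2 ** j
        if j == 1:
            return ackermann(i - 1, 2)
        inner = ackermann(i, j - 1)
        if inner >= 64:                  # then A(i-1, inner) >= 2^inner >= CAP
            return CAP
        return ackermann(i - 1, inner)

    def alpha(m, n):
        # alpha(m, n) = min { k >= 1 : A(k, m/n) > lg n }
        k = 1
        while ackermann(k, max(1, m // n)) <= math.log2(n):
            k += 1
        return k

    print(alpha(10**6, 10**5))   # -> 1, since A(1, 10) = 1024 > lg(10**5)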

  7. TODAY • A Minimum Spanning Tree Algorithm with Inverse-Ackermann Type Complexity -- Bernard Chazelle -- Princeton University, and NEC Research Institute -- Journal of the ACM, November 2000 -- involved a new data structure: Soft Heap -- Running time is O(m α(m,n)) • An Optimal Minimum Spanning Tree Algorithm -- Seth Pettie and Vijaya Ramachandran -- The University of Texas at Austin, Austin, Texas -- Journal of the ACM, January 2002 -- involved a new data structure: Soft Heap -- Runs in time O(T*(m,n)), where T*(m,n) is between Ω(m) and O(m α(m,n))

  8. Soft Heap • A variant of a binomial heap. -create(H): create an empty soft heap. -insert(H,x): add new item x to H. -meld(H,H’): merge H and H’ into one heap. -delete(H,x): remove item x from H. -findmin(H): return the item with the smallest key in H. • The amortized complexity of each operation is constant, except for insert, which takes O(log 1/ε). • ε is the error rate of the soft heap.
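
To fix the interface in code, here is a stand-in class (ours, not Chazelle's data structure): it exposes the same five operations but is backed by an ordinary binary heap, so it answers exactly in O(log n) per operation instead of corrupting up to εn keys in exchange for constant-time findmin/delete and O(log 1/ε) insert. Keys are assumed distinct, as edge costs are on the later slides.

    import heapq

    class SoftHeapStandIn:
        def __init__(self, eps=0.1):          # create(H)
            self.eps = eps                    # error rate (unused in this stand-in)
            self._heap = []
            self._removed = set()

        def insert(self, x):                  # insert(H, x)
            heapq.heappush(self._heap, x)

        def meld(self, other):                # meld(H, H'): pour H' into H
            for x in other._heap:
                if x not in other._removed:
                    heapq.heappush(self._heap, x)

        def delete(self, x):                  # delete(H, x), done lazily
            self._removed.add(x)

        def findmin(self):                    # findmin(H)
            while self._heap and self._heap[0] in self._removed:
                heapq.heappop(self._heap)
            return self._heap[0] if self._heap else None

    H = SoftHeapStandIn()
    H.insert(5); H.insert(2); H.insert(9)
    print(H.findmin())                        # -> 2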

  9. Binomial Heap Worst-Case Costs • Make-Heap Θ(1) • Insert O(lg n) • Meld O(lg n) • Delete Θ(lg n) • Findmin Θ(lg n)

  10. Key Point • The entropy of the data structure is reduced by artificially raising the values of certain keys; such keys are called corrupted. • Items are moved across the data structure not individually, but in groups (“car pooling”). • ε is the error rate of the soft heap. • Deletemin() is implemented via sift().

  11. Algorithm at a Glance (1/4) Input: an undirected graph G. • Each edge e is assigned a cost c(e). • Edge costs are distinct. • Hence MST(G) is unique. • There are m edges. • There are n vertices.

  12. Algorithm at a Glance (2/4) Contractibility (for a subgraph C of G) • C is contractible if C ∩ MST(G) is connected. • Computing MST(C) is easier. • How can we discover C without computing MST(G)? (Figure: one example of a contractible C and one of a non-contractible C, with edge costs 1–5.)
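
The definition can be checked directly, if wastefully, by computing MST(G) with Kruskal and testing whether the MST edges inside C connect all of C. This sketch (helper names ours) only illustrates the definition; the whole point of the algorithm is to find contractible subgraphs without knowing MST(G).

    from collections import defaultdict

    def kruskal_mst(n, edges):
        # edges: (cost, u, v) with distinct costs, vertices 0 .. n-1
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        mst = []
        for cost, u, v in sorted(edges):
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                mst.append((cost, u, v))
        return mst

    def is_contractible(C, n, edges):
        # C (a set of vertices) is contractible iff the MST edges with both
        # endpoints in C form a connected subgraph spanning all of C.
        adj = defaultdict(list)
        for cost, u, v in kruskal_mst(n, edges):
            if u in C and v in C:
                adj[u].append(v)
                adj[v].append(u)
        start = next(iter(C))
        seen, stack = {start}, [start]
        while stack:
            for y in adj[stack.pop()]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        return seen == set(C)

    # A 4-cycle with costs 1..4: {0, 1} is contractible (edge 0-1 is in the MST),
    # {0, 3} is not (the cost-4 edge 3-0 is the one edge Kruskal drops).
    edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 3, 0)]
    print(is_contractible({0, 1}, 4, edges), is_contractible({0, 3}, 4, edges))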

  13. Algorithm at a Glance (3/4) Decompose → Contract → Glue. (Figure: Steps 1–4 of the process.)

  14. Algorithm at a Glance (4/4) Iterating this process forms a hierarchy tree; each vertex v of G is a leaf of the hierarchy tree.

  15. Finally, the MST algorithm. To compute the MST of G, call msf(G, t) with a suitable value of t; we will explain this later.

  16. Any questions? We are going into the details now….

  17. And let’s welcome 光哲兄~

  18. The hierarchy tree T • Computed in O(m + d^3 n). • Balance between height and node degree: Ackermann’s function. (Figure: a tree node z with its subgraph Cz, target size nz, height, and degree dz.)

  19. Ackermann’s function • Choose the target size nz as S(t, dz)^3, an Ackermann-like function of t and the degree dz (see the next slide).

  20. By induction on t, we have the expansion |Cz| = S(t, dz)^3. • The tree T is computed in b^t (m + d^3 n) = O(m + d^3 n) time. • d = (m/n)^{1/3}. • The choice of t and d implies that t = O(α(m,n)), so the MST algorithm runs in O(m α(m,n)).
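
Making the one arithmetic step on the slide explicit (in LaTeX, with the final bound simply restating the slide's conclusion):

    d = \left(\tfrac{m}{n}\right)^{1/3} \;\Longrightarrow\; d^{3} n = m
      \;\Longrightarrow\; O(m + d^{3} n) = O(m),
    \qquad t = O(\alpha(m,n)) \;\Longrightarrow\; O\bigl(m\,\alpha(m,n)\bigr)\ \text{overall}.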

  21. Active path (Figure: the subgraphs along the active path z1, z2, z3, z4 = zk.)

  22. Border edges • An edge is a border edge if exactly one of its vertices lies in the subgraphs of the active path. (Figure: border edges between Cu and the subgraphs Cz1, Cz2, Cz3.)

  23. Corrupted edge • Border edges are stored in soft heaps. • They may become corrupted by soft heap operations: the cost of a corrupted edge is raised. (Figure: an edge of Cz has its cost raised, i.e. becomes corrupted, inside the soft heap.)

  24. Bad edge • An edge is called bad if it is corrupted at the moment Cz is contracted into one vertex. • Once bad, always bad. (Figure: a corrupted edge of Cz becomes bad at contraction time.)

  25. Working cost • For a corrupted (but not bad) edge, the working cost is the current (raised) cost. • For a bad edge, the working cost is the original cost. (Figure: the cost is raised while the edge is corrupted and reverts to the original cost once the edge is bad.)
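
One way to keep this bookkeeping straight is a small per-edge record; the field and property names below are ours, purely to illustrate the corrupted / bad / working-cost distinction drawn on the slides.

    from dataclasses import dataclass

    @dataclass
    class EdgeRecord:
        u: int
        v: int
        original_cost: float
        current_cost: float = None   # may be raised (corrupted) by the soft heap
        bad: bool = False            # set if still corrupted when Cz is contracted

        def __post_init__(self):
            if self.current_cost is None:
                self.current_cost = self.original_cost

        @property
        def corrupted(self):
            return self.current_cost > self.original_cost

        @property
        def working_cost(self):
            # a bad edge reverts to its original cost; a merely corrupted edge
            # keeps its current (raised) cost
            return self.original_cost if self.bad else self.current_cost

    e = EdgeRecord(0, 1, original_cost=3.0)
    e.current_cost = 5.0                      # raised by the soft heap
    print(e.corrupted, e.working_cost)        # -> True 5.0
    e.bad = True                              # Cz contracted while e was corrupted
    print(e.working_cost)                     # -> 3.0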

  26. Stack view (Figure: the active path z1, z2, z3, z4 = zk maintained like a stack.)

  27. Detailed description of each step • There are 5 steps in total. • Many details for Step [3]. • Detailed, but not too detailed (hopefully).

  28. Step[1, 2] • The case t = 1 is special: solve it directly in O(n^2). • What if G is not connected? Apply msf to each connected component. • What if G is not simple? Keep only the cheapest edge between each pair of vertices. • The aim of performing Borůvka phases is to reduce the number of vertices (to at most n/2^c after c phases).
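
A minimal sketch of one Borůvka phase (our own code, over an edge list with distinct costs): every component picks its cheapest incident edge and the chosen edges merge components, so c phases leave at most n/2^c of them.

    def boruvka_phase(n, edges):
        # edges: (cost, u, v) with distinct costs, vertices 0 .. n-1.
        # Returns the edges chosen in this phase and the resulting component labels.
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        cheapest = [None] * n                 # cheapest edge leaving each component
        for cost, u, v in edges:
            ru, rv = find(u), find(v)
            if ru == rv:
                continue
            for r in (ru, rv):
                if cheapest[r] is None or cost < cheapest[r][0]:
                    cheapest[r] = (cost, u, v)

        chosen = []
        for r in range(n):
            if cheapest[r] is not None:
                cost, u, v = cheapest[r]
                if find(u) != find(v):        # the same edge may be picked twice
                    parent[find(u)] = find(v)
                    chosen.append((cost, u, v))
        return chosen, [find(x) for x in range(n)]

    edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 3, 0)]
    print(boruvka_phase(4, edges))   # one phase already connects this 4-cycle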

  29. Step[3] Building the hierarchy • With t > 1 specified, the target sizes nz are specified. • We discuss the general case in which the active path is z1, z2, …, zk. • Keep two invariants: INV1 and INV2 (explained later). • Two possible actions: Retraction and Extension.

  30. INV1 • For all i < k, the chain-link e … for any pair (j1, j2) < i.

  31. Step[3]: INV2 • Each border edge (u, v) with u ∈ Czj is stored in H(j) or in some H(i, j). • No edge appears in more than one heap. • Membership in H(j) implies v is incident to at least one edge in some H(i, j). • Membership in H(i, j) implies v is also incident to Czi but not to any Czl with i < l < j. (Figure: a border edge (u, v) with u ∈ Czj, shown together with Czi-1 and Czi.)

  32. Step[3]: INV2, a practice (Figure: a vertex V joined by edges a–g to the subgraphs Z1–Z5.) A possible assignment of edges to heaps: a ∈ H(4,5), b ∈ H(4), c ∈ H(2,4), d ∈ H(2), e ∈ H(1,2), f ∈ H(1,2), g ∈ H(0,1).

  33. Step[3]: Retraction If the last subgraph Czk has attained its target size: • Contract Czk into Czk-1 (as well as its links). • Czk-1 gains one vertex. • Destroy H(k) and H(k-1, k). • Discard bad edges. • For each cluster, insert the minimum edge implied by INV2. • Meld H(i, k) into H(i, k-1). (Figure: a cluster.)

  34. Step[3]: Extension Do findmin on all heaps and retrieve the border edge (u, v) with minimum cost. • (u, v) is the extension edge (and also a chain-link). • v becomes the first vertex of Czk+1. • Older border edges incident to v are no longer border edges: delete them from the heaps, update the min-links, and insert the new border edges.

  35. Step[3]: Extension • If any min-link (a, b) costs less than (u, v), we have to do a fusion, explained by the figure: the part of the active path between CZi and CZk is fused into CZi. (Figure: the path from CZi to CZk before and after the fusion at edge (a, b).)

  36. Step[3]: Conclusion Keep doing Extensions until the target size is reached or no vertices are left. Perform a Retraction when the target size is reached. Maintain INV1 and INV2 at the same time.

  37. Step[4] Recursing in the subgraphs • For every Cz, do msf(Cz, t-1). • The edge costs are reset to their original values. • The main goal is to modify the target size. • For every fusion edge (a, b), that fusion does not occur anymore in Cz.

  38. Step[5] The final recursion • The candidate edge set now becomes F ∪ B (the forest found so far together with the bad edges). • Adding the edges contracted in Step[2] produces the MST of G.

  39. Correctness: Contractibility • A subgraph C of G is strongly contractible if, for two border edges e and f joined by a path π within C, each edge of π costs less than min(c(e), c(f)). • That’s why we do fusions. • Lemma 3.1: If an edge e is not bad and lies outside F, then e lies outside MST(G). (Figure: edges e and f joined by a path π inside C.)

  40. MST(G) and T • z: a tree node of T. • Cz: the corresponding subgraph (contracted to a vertex). (Figure: node z of T and its subgraph Cz.)

  41. (Figure: a cluster with a border edge (u, v) and the subgraphs Czi-1, Czi, Czj.)

  42. (Figure: a vertex V joined by edges a–g to the subgraphs Z1–Z5, as on slide 32.)
