  1. Algorithms (and Datastructures) Lecture 7 MAS 714 part 2 Hartmut Klauck

  2. Data Structure • Store n vertices and their distance estimate d(v) • Operations: • ExtractMin: Find (and remove) the vertex with minimum d(v) • DecreaseKey(v,x): replace the key of v with a smaller value • Initialize • Insert(v,x): insert v with key x • Test for emptiness • Such a data structure is called a Priority Queue

  3. Heaps • We will implement a priority queue with a heap • Heaps can also be used for sorting! • Heapsort: Insert all elements, then ExtractMin until empty • If all operations take time O(log n) we can sort in time O(n log n)
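
A minimal heapsort sketch using Python's built-in binary heap (the heapq module, used here only for illustration; the lecture builds its own heap in the following slides), just to show the "Insert everything, then ExtractMin until empty" idea:

```python
import heapq

def heapsort(items):
    heap = []
    for x in items:                  # Insert every element: O(log n) each
        heapq.heappush(heap, x)
    # ExtractMin until empty: O(log n) each, O(n log n) in total
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heapsort([5, 1, 4, 2, 3]))     # -> [1, 2, 3, 4, 5]
```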

  4. Heaps • A heap is an array of length n • It can hold at most n keys/vertices/numbers • The keys in the array are not sorted, but their order has the heap-property • Namely they can be viewed as a binary tree, in which parents have smaller keys than their children • ExtractMin is easy! • Unfortunately we need to work to maintain the heap-property after removing the root

  5. Heaps • The keys in a heap are arranged as a full binary tree whose last level is filled up from the left up to some point • Example:

  6. Heaps • Heap property: • The element in cell i is smaller than the elements in cells 2i and 2i+1 • Example: tree and array representation

  7. Heaps • Besides the array storing the keys we also keep a counter SIZE that tells us how many keys are in H • in cells 1…SIZE

  8. Procedures for Heaps • Initialize: • Declare the array H of correct length n • Set SIZE to 0 • Test for Emptiness: • Check SIZE • FindMin: • Minimum is in H[1] • All this in time O(1)

  9. Procedures for Heaps • Parent(i) is ⌊i/2⌋ • LeftChild(i) is 2i • RightChild(i) is 2i+1
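
A small sketch of this index arithmetic, assuming the 1-based array convention of the slides (cell H[0] is simply left unused):

```python
def parent(i):
    return i // 2        # floor(i/2)

def left_child(i):
    return 2 * i

def right_child(i):
    return 2 * i + 1
```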

  10. Procedures for Heaps • Suppose we remove the root, and replace it with the rightmost element of the heap • Now we still have a tree, but we (probably) violate the heap property at some vertex i, i.e., H[i]>H[2i] or H[i]>H[2i+1] • The procedure Heapify(i) will fix this • Heapify(i) assumes that the subtrees below i are correct heaps, but there is a (possible) violation at i • And no other violations in H (i.e., above i)

  11. Procedures for Heaps • Heapify(i) • l=LeftChild(i), r=RightChild(i) • If l ≤ SIZE and H[l]<H[i] then Smallest=l else Smallest=i • If r ≤ SIZE and H[r]<H[Smallest] then Smallest=r • If Smallest ≠ i • Swap H[i] and H[Smallest] • Heapify(Smallest)
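
A sketch of Heapify in Python, assuming the keys live in H[1..size] with H[0] unused, so that the parent/child formulas above apply directly; size plays the role of SIZE:

```python
def heapify(H, size, i):
    l, r = 2 * i, 2 * i + 1                     # LeftChild(i), RightChild(i)
    smallest = i
    if l <= size and H[l] < H[smallest]:
        smallest = l
    if r <= size and H[r] < H[smallest]:
        smallest = r
    if smallest != i:
        H[i], H[smallest] = H[smallest], H[i]   # swap with the smaller child
        heapify(H, size, smallest)              # recurse one level down
```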

  12. Heapify • Running Time: • A heap with SIZE=n has depth at most log n • The running time is dominated by the number of recursive calls • Each call leads to a subheap that is 1 level shallower • Time O(log n)

  13. Procedures for Heaps • ExtractMin(): • Return H[1] • H[1]=H[SIZE] • SIZE=SIZE-1 • Heapify(1) • Time is O(log n)
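
A sketch of ExtractMin in the same convention, reusing the heapify function sketched after slide 11; it returns the new size explicitly, since there is no global SIZE counter here:

```python
def extract_min(H, size):
    if size == 0:
        raise IndexError("heap is empty")
    minimum = H[1]
    H[1] = H[size]           # move the rightmost element to the root
    size -= 1
    heapify(H, size, 1)      # repair the (single) violation at the root
    return minimum, size
```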

  14. Procedures for Heaps • DecreaseKey(i,key) • If H[i]<key return error • H[i]=key \\ Now parent(i) might violate the heap property • While i>1 and H[parent(i)]>H[i] • Swap H[parent(i)] and H[i], i=parent(i) \\ Move the element towards the root • Time is O(log n)
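
A sketch of DecreaseKey; note that i is a heap index here, not a vertex name (slide 16 returns to that distinction):

```python
def decrease_key(H, i, key):
    if H[i] < key:
        raise ValueError("new key is larger than the current key")
    H[i] = key
    # parent(i) might now violate the heap property
    while i > 1 and H[i // 2] > H[i]:
        H[i // 2], H[i] = H[i], H[i // 2]   # move the element towards the root
        i //= 2
```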

  15. Procedures for Heaps • Insert(key): • SIZE=SIZE+1 • H[SIZE]=∞ • DecreaseKey(SIZE,key) • Time is O(log n)
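
A sketch of Insert in terms of DecreaseKey, assuming H was allocated with enough room (length n+1 in the 1-based convention) and reusing decrease_key from the previous sketch:

```python
import math

def insert(H, size, key):
    size += 1
    H[size] = math.inf            # placeholder key (the "∞" on the slide)
    decrease_key(H, size, key)    # sift the new element towards the root
    return size
```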

  16. Note for Dijkstra • DecreaseKey(i,x) works on the vertex that is stored at position i in the heap • But we want to decrease the key of vertex v! • We need to remember the position of every vertex v in the heap H • Keep an array pos[1...n] • Whenever we move a vertex in H we need to update pos
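
A sketch of this bookkeeping (the names swap, key and decrease_key_vertex are illustrative, not from the slides): H stores vertex names ordered by key[.], pos[v] is the current index of v in H, and every swap inside Heapify or DecreaseKey should go through the helper so that pos stays consistent:

```python
def swap(H, pos, i, j):
    H[i], H[j] = H[j], H[i]
    pos[H[i]] = i                 # record the new positions of both vertices
    pos[H[j]] = j

def decrease_key_vertex(H, pos, key, v, x):
    key[v] = x                    # new distance estimate d(v)
    i = pos[v]
    while i > 1 and key[H[i // 2]] > key[H[i]]:
        swap(H, pos, i, i // 2)   # move v towards the root
        i //= 2
```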

  17. The single-source shortest-path problem with negative edge weights • Graph G, weight function W, start vertex s • Output: a bit indicating whether there is a negative cycle reachable from s AND (if not) the shortest paths from s to all v

  18. Bellman-Ford Algorithm • Initialize d(s)=0, d(v)=∞ for all other v • For i=1 to n-1: • Relax all edges (u,v) • For all (u,v): if d(v)>d(u)+W(u,v) then output: "negative cycle!" • Remark: d(v) and π(v) contain the distance from s and the predecessor in a shortest-path tree
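
A sketch of Bellman-Ford as stated on the slide, assuming the graph is given as a list of weighted edges (u, v, w); d and pi correspond to d(v) and π(v):

```python
import math

def bellman_ford(vertices, edges, s):
    d = {v: math.inf for v in vertices}
    pi = {v: None for v in vertices}
    d[s] = 0
    for _ in range(len(vertices) - 1):     # n-1 rounds
        for u, v, w in edges:              # relax every edge (u,v)
            if d[u] + w < d[v]:
                d[v] = d[u] + w
                pi[v] = u
    for u, v, w in edges:                  # final pass: detect negative cycles
        if d[u] + w < d[v]:
            return None, None              # "negative cycle!"
    return d, pi
```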

  19. Running time • Running time is O(nm) • n-1 times relax all m edges

  20. Correctness • Assume that no cycle of negative length is reachable from s • Theorem: After n-1 iterations of the for-loop we have d(v)=δ(s,v) for all v. • Lemma: Let v0,…,vk be a shortest path from v0=s to vk. Relax the edges (v0,v1),…,(vk-1,vk) successively (in this order). Then d(vk)=δ(s,vk). This holds regardless of other relaxations performed.

  21. Correctness • Proof of the theorem: • Let v denote a reachable vertex • Let s,…,v be a shortest path with k edges • k ≤ n-1 • In every iteration all edges are relaxed • By the lemma, d(v) is correct after k ≤ n-1 iterations • For all unreachable vertices we have d(v)=∞ at all times • Still to show: the algorithm decides the existence of negative cycles correctly • If no negative cycle is present, then for all edges (u,v): • d(v)=δ(s,v) ≤ δ(s,u)+W(u,v)=d(u)+W(u,v), so the final test passes

  22. Correctness • If a negative cycle exists: • Let v0,…,vk be a path with negative length and v0=vk • Assume the algorithm does NOT stop with an error message; then • d(vi) ≤ d(vi-1)+W(vi-1,vi) for all i=1,…,k • Hence ∑i=1..k d(vi) ≤ ∑i=1..k d(vi-1) + ∑i=1..k W(vi-1,vi)

  23. Correctness • v0=vk and all vertices of the cycle appear once in each sum, so ∑i=1..k d(vi) = ∑i=1..k d(vi-1) • d(vi)<∞ for all reachable vertices, hence the two sums cancel and 0 ≤ ∑i=1..k W(vi-1,vi), contradicting the assumption that the cycle has negative length

  24. The Lemma Lemma: Let v0,…,vk be a shortest path from v0=s to vk. Relax the edges (v0,v1),…,(vk-1,vk) successively. Then d(vk)=δ(s,vk). This holds regardless of other relaxations performed. Proof: By induction: after relaxing (vi-1,vi), the value d(vi) is correct. Base case: i=0, d(v0)=d(s)=0 is correct. Inductive step: assume d(vi-1) is correct. According to an earlier observation, after relaxing (vi-1,vi) the value d(vi) is also correct. Once d(v) is correct, the value stays correct, since d(v) is always an upper bound and relaxations never increase it.

  25. Application of Bellman-Ford • Distributed networks • We look for the distance of vertices from s • The computation can be performed in a distributed way, without • global control • global knowledge about the network • Dijkstra needs global knowledge • Running time: n-1 phases, in which the vertices compute in parallel

  26. All-pairs shortest path • Given a graph • Variants: • directed/undirected • weighted/unweighted/pos./neg. weights • Output: For all pairs of vertices u,v: • Distance in G (APD: All-pairs distances) • Shortest Paths (APSP: All-pairs shortest-path)

  27. APSP • APD: n² outputs, so the running time is at least n² • Can just use an adjacency matrix • APSP problem: how to represent n² paths? • It is easy to construct a graph such that for Ω(n²) vertex pairs the distance is Ω(n) • Simply writing down all the paths then requires output length Ω(n³)

  28. APSP output convention • Implicit representation of shortest paths as a successor matrix • The successor matrix S is n×n, with S[i,j]=k for the neighbor k of i that comes first on the shortest path from i to j • It is easy to compute the shortest path from i to j using S: • e.g. S[i,j]=k, S[k,j]=l, S[l,j]=a, S[a,j]=j
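
A sketch of reading a path off the successor matrix, assuming S[i][j] is the first vertex after i on a shortest i-to-j path and None when j is unreachable from i:

```python
def path(S, i, j):
    if i != j and S[i][j] is None:
        return None                  # j is not reachable from i
    p = [i]
    while i != j:
        i = S[i][j]                  # follow the successor pointers
        p.append(i)
    return p
```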

  29. APSP: some observations • Edge weights ≥ 0: run Dijkstra n times, running time: O(nm+n² log n) • Unweighted graphs: run BFS n times, for time O(nm+n²) • For dense graphs m=Θ(n²) and we get O(n³) • Can we save work?

  30. Floyd-Warshall Algorithm • Input: G, a directed graph with positive and negative weights, but no negative cycles • An O(n³) algorithm based on Dynamic Programming • Compute shortest paths (from u to v) that use only the vertices 1…k

  31. Floyd-Warshall Algorithm • Definition: • d[u,v,k] = length of the shortest path from u to v that (besides u,v) uses only vertices from {1,…,k} • d[u,v,0]=W(u,v) • which is ∞ if (u,v) is not an edge • Recursion: • d[u,v,k] = minimum of • d[u,v,k-1] (paths using only 1,…,k-1) • d[u,k,k-1] + d[k,v,k-1] (paths also using k)

  32. Floyd-Warshall Algorithm • Initialize d[u,v,0]=W(u,v) for all u,v • For k=1,…,n • compute d[u,v,k] for all u,v • Total running time: O(n³)
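
A sketch of the whole algorithm, assuming W is an n×n matrix with W[u][v] the edge weight, float("inf") for missing edges and 0 on the diagonal; only the layers k-1 and k of d are kept, since the recursion never looks further back:

```python
def floyd_warshall(W):
    n = len(W)
    d = [row[:] for row in W]                # d[u][v] = d[u,v,0] = W(u,v)
    for k in range(n):                       # now also allow k as an intermediate vertex
        d = [[min(d[u][v], d[u][k] + d[k][v]) for v in range(n)]
             for u in range(n)]
    return d                                 # d[u][v] = distance from u to v
```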

  33. Floyd-Warshall Algorithm • Computing the paths: exercise • Note that this algorithm is very simple, with no fancy data structures, so the constant factors are small

  34. Dynamic Programming • The values d[u,v,0] are given immediately • The values d[u,v,n] are the solution to the problem • We can easily compute all d[u,v,k] once we know all d[u,v,k-1] • This process of computing solutions bottom-up is called dynamic programming • Note the difference to computing top-down by recursion!
