Presentation Transcript


  1. Algorithms (and Datastructures) Lecture 5 MAS 714 part 2 Hartmut Klauck

  2. Depth First Search • If we use a stack as the datastructure we get Depth First Search (DFS) • Typically, DFS will maintain some extra information: • The time when v is put on the stack • The time when all neighbors of v have been examined • This information is useful for applications

  3. Datastructure: Stack • A stack is a linked list together with two operations • push(x,S): Insert element x at the front of the list S • pop(S): Remove the front element of the list S • Implementation: • Need to maintain only the pointer to the front of the stack • Useful to also have • peek(S): Find the front element but don’t remove it
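A minimal Python sketch of such a stack as a linked list (illustrative, not part of the slides; the class and method names are my own):

```python
# A stack as a linked list: only a pointer to the front is maintained.

class _Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

class Stack:
    def __init__(self):
        self.front = None              # pointer to the front of the list

    def push(self, x):
        # Insert element x at the front of the list.
        self.front = _Node(x, self.front)

    def pop(self):
        # Remove and return the front element.
        if self.front is None:
            raise IndexError("pop from empty stack")
        x = self.front.value
        self.front = self.front.next
        return x

    def peek(self):
        # Return the front element without removing it.
        if self.front is None:
            raise IndexError("peek into empty stack")
        return self.front.value

    def is_empty(self):
        return self.front is None
```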

  4. Digression: Recursion and Stacks • Our model of Random Access Machines does not directly allow recursion • Neither does any real hardware • Compilers will “roll out” recursive calls • Put all local variables of the calling procedure in a safe place • Execute the call • Return the result and restore the local variables

  5. Recursion • The best datastructure for this is a stack • Push all local variables to the stack • LIFO functionality is exactly the right thing • Example: Recursion tree of Quicksort
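As an illustration of this digression, here is a hypothetical Python sketch that rolls Quicksort’s recursion out onto an explicit stack of (lo, hi) subranges; the pivot choice and all names are my assumptions, not the lecture’s implementation:

```python
# Quicksort with the recursion "rolled out": the explicit stack of pending
# (lo, hi) subranges plays the role of the call stack.

def quicksort_iterative(a):
    stack = [(0, len(a) - 1)]          # pending "calls" with their local variables
    while stack:
        lo, hi = stack.pop()           # LIFO: resume the most recent pending call
        if lo >= hi:
            continue
        pivot = a[hi]                  # simple last-element pivot
        i = lo
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]      # pivot ends up at position i
        stack.append((lo, i - 1))      # the two recursive calls become pushes
        stack.append((i + 1, hi))
    return a

print(quicksort_iterative([3, 1, 2]))  # [1, 2, 3]
```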

  6. DFS • Procedure: • For all v: • π(v)=NIL, d(v)=0, f(v)=0 • Enter s into the stack S, set TIME=1, d(s)=TIME • While S is not empty • v=peek(S) • Find the first neighbor w of v with d(w)=0: • push(w,S), π(w)=v, TIME=TIME+1, d(w)=TIME • If there is no such w: pop(S), TIME=TIME+1, f(v)=TIME • Goto 3.
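A possible Python rendering of this stack-based procedure, assuming the graph is an adjacency-list dict mapping every vertex to its list of neighbors (the names dfs_iterative, pi, d, f are mine):

```python
# Stack-based DFS recording discovery times d and finish times f.

def dfs_iterative(adj, s):
    pi = {v: None for v in adj}        # predecessor, NIL in the slides
    d = {v: 0 for v in adj}            # discovery time, 0 = not yet found
    f = {v: 0 for v in adj}            # finish time
    time = 1
    d[s] = time
    stack = [s]
    while stack:
        v = stack[-1]                  # peek(S)
        # find the first neighbor w of v that has not been discovered yet
        w = next((x for x in adj[v] if d[x] == 0), None)
        if w is not None:
            pi[w] = v
            time += 1
            d[w] = time
            stack.append(w)            # push(w, S)
        else:
            stack.pop()                # pop(S): all neighbors of v examined
            time += 1
            f[v] = time
    return pi, d, f
```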

  7. DFS • The array d(v) holds the time we first visit a vertex. • The array f(v) holds the time when all neighbors of v have been processed • “discovery” and “finish” • In particular, when d(v)=0 then v has not been found yet

  8. Simple Observations • Vertices are given d(v) numbers between 1 and 2n • Each vertex is put on the stack once, and receives the f(v) number once all neighbors are visited • Running time is O(n+m)

  9. A recursive DFS • Stacks are there to roll out recursion • Consider the procedure in a recursive way! • Furthermore we can start DFS from all unvisited vertices to traverse the whole graph, not just the vertices reachable from s • We also want to label edges • The edges (π(v),v) form trees: tree edges • We can label all other edges as • back edges • cross edges • forward edges

  10. Edge classification • Lemma: the edges (π(v),v) form a tree • Definition: • Edges going down along a path in a tree (but not tree edges) are forward edges • Edges going up along a path in a tree are back edges • Edges across paths/trees are cross edges • A vertex v is a descendant of u if there is a path of tree edges from u to v • Observation: descendants are discovered after their “ancestors” but finish before them

  11. Example: edge labeling • Tree edges, Back edges, Forward edges, Cross edges

  12. Recursive DFS • DFS(G): • TIME=1 (global variable) • For all v: π(v)=NIL, d(v)=0, f(v)=0 • For all v: if d(v)=0 then DFS(G,v) • DFS(G,v) • TIME=TIME+1, d(v)=TIME • For all neighbors w of v: • If d(w)=0 then (v,w) is a tree edge, DFS(G,w) • If d(w)≠0 and f(w)≠0 then cross edge or forward edge • If d(w)≠0 and f(w)=0 then back edge • TIME=TIME+1, f(v)=TIME

  13. Recursive DFS • How to decide if forward or cross edge? • Assume edge (v,w) with f(w)≠0 • If d(w)>d(v) then forward edge • If d(v)>d(w) then cross edge
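A Python sketch of the recursive DFS of slides 12–13 with the edge labeling above; the adjacency-list dict and the names dfs_classify and label are my assumptions:

```python
# Recursive DFS that classifies every edge as tree, back, forward, or cross.
# (For very deep graphs Python's recursion limit may have to be raised.)

def dfs_classify(adj):
    pi = {v: None for v in adj}
    d = {v: 0 for v in adj}
    f = {v: 0 for v in adj}
    label = {}                         # (v, w) -> edge type
    time = [0]                         # mutable counter shared by all visits

    def visit(v):
        time[0] += 1
        d[v] = time[0]
        for w in adj[v]:
            if d[w] == 0:              # undiscovered: tree edge, recurse
                label[(v, w)] = "tree"
                pi[w] = v
                visit(w)
            elif f[w] == 0:            # discovered but unfinished: back edge
                label[(v, w)] = "back"
            elif d[w] > d[v]:          # finished, discovered after v: forward
                label[(v, w)] = "forward"
            else:                      # finished, discovered before v: cross
                label[(v, w)] = "cross"
        time[0] += 1
        f[v] = time[0]

    for v in adj:                      # start from every unvisited vertex
        if d[v] == 0:
            visit(v)
    return pi, d, f, label
```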

  14. Application 1: Topological Sorting • A DAG is a directed acyclic graph • A partial order on vertices • A topological sort of a DAG is a numbering of vertices s.t. all edges go from smaller to larger vertices • A total order that is consistent with the partial order

  15. Topological Sorting • Lemma: G is a DAG iff there are no back edges • Proof: • If there is a back edge, then there is a cycle • The other direction: suppose there is a cycle c • Let u be the first vertex discovered on c and let v be u’s predecessor on c • Then v is a descendant of u, i.e., d(v)>d(u) and f(v)<f(u) • When (v,u) is processed, f(u)=0 and d(v)>d(u) • So (v,u) is a back edge

  16. Topological Sorting • Algorithm: • Output the vertices in reverse order of their finish times f(v); this is a topological sorting of G • I.e., add each v to the front of a list when it finishes, so the last vertex to finish comes first in the list
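A short Python sketch of this algorithm (adjacency-list dict assumed, names mine): each vertex is appended when it finishes, and the list is reversed at the end.

```python
# Topological sort by DFS finish times: reverse finish order is a valid order.

def topological_sort(adj):
    visited = set()
    order = []                         # filled in finish order

    def visit(v):
        visited.add(v)
        for w in adj[v]:
            if w not in visited:
                visit(w)
        order.append(v)                # v finishes here

    for v in adj:
        if v not in visited:
            visit(v)
    order.reverse()                    # last finished comes first
    return order

# Example: edges a->b, a->c, b->c give the order ['a', 'b', 'c'].
print(topological_sort({"a": ["b", "c"], "b": ["c"], "c": []}))
```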

  17. Topological Sorting • We need to prove correctness • Certainly we provide a total ordering of vertices • Now assume vertex i is smaller than j in the ordering • I.e., i finished after j • Need to show: there is no path from j to i • Proof: • j finished means all descendants of j are finished • i is not a descendant of j (otherwise it would finish first) • So a path from j to i would be a path from a descendant to an ancestor • Such a path must contain a back edge • Not a DAG

  18. Topological Sorting • Hence we can compute a topological sort in linear time

  19. Application 2: Strongly connected components • Definition: • A strongly connected component of a graph G is a maximal set of vertices V’ such that for each pair of vertices v,w in V’ there is a path from v to w • Note: in undirected graphs these are just the connected components, but now we have one-way roads • The strongly connected components (viewed as vertices) form a DAG inside the graph G

  20. Strongly connected components • Algorithm • DFS(G) to compute finish times f(v) for all v • Compute G^T (the graph G with all edges reversed) • DFS(G^T), but in the DFS procedure go over the vertices in order of decreasing f(v) from the first DFS • The vertices in a tree of DFS(G^T) form a strongly connected component
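A Python sketch of this two-pass procedure (commonly known as Kosaraju’s algorithm); the adjacency-list dict and the function names are my assumptions:

```python
# Two DFS passes: record finish order on G, then run DFS on the reversed
# graph G^T in decreasing finish order; each tree of the second pass is one SCC.

def strongly_connected_components(adj):
    finish_order, visited = [], set()

    def visit(v):                      # first DFS on G
        visited.add(v)
        for w in adj[v]:
            if w not in visited:
                visit(w)
        finish_order.append(v)

    for v in adj:
        if v not in visited:
            visit(v)

    radj = {v: [] for v in adj}        # build G^T by reversing every edge
    for v in adj:
        for w in adj[v]:
            radj[w].append(v)

    components, assigned = [], set()

    def collect(v, comp):              # second DFS on G^T
        assigned.add(v)
        comp.append(v)
        for w in radj[v]:
            if w not in assigned:
                collect(w, comp)

    for v in reversed(finish_order):   # decreasing finish time from first DFS
        if v not in assigned:
            comp = []
            collect(v, comp)
            components.append(comp)
    return components
```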

  21. Strongly connected components • Time: O(n+m) • We skip the correctness proof • Note the usefulness of the f(v) numbers computed in the first DFS

  22. Shortest paths in weighted graphs • We are given a graph G (adjacency list with weights W(u,v)) • No edge means W(u,v)=∞ • We look for shortest paths from a start vertex s to all other vertices • The length of a path is the sum of its edge weights • The distance δ(u,v) is the minimum length over all paths from u to v

  23. Variants • Single-Source Shortest-Path (SSSP):Shortest paths from s to all v • All-Pairs Shortest-Path (APSP):Shortest paths between all u,v • We will now consider SSSP • Solved in unweighted graphs by BFS

  24. Observation • We don’t allow negative weight cycles • Consider a minimal path from u to v • Then: all subpaths of minimal paths are also minimal! • Idea: We should find shortest paths with few edges first • We will use estimates d(v), starting from d(s)=0 and d(v)=∞ for all other v (Pessimism!) • Improve estimates until tight

  25. Storing paths • Again we store predecessors π(v), starting at NIL • In the end they will form a shortest path tree

  26. Relaxing an edge • Basic Operation: • Relax(u,v,W) • if d(v)>d(u)+W(u,v) then d(v):=d(u)+W(u,v); π(v):=u • I.e., if we find a better estimate we go for it
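A direct Python transcription of Relax, assuming d and π are kept in dicts and W maps edges (u,v) to their weights (these representation choices are mine):

```python
import math

def relax(u, v, W, d, pi):
    # If going through u gives a better estimate for v, adopt it.
    if d[v] > d[u] + W[(u, v)]:
        d[v] = d[u] + W[(u, v)]
        pi[v] = u

# Example: with d = {'s': 0, 'v': math.inf}, pi = {'s': None, 'v': None} and
# W = {('s', 'v'): 3}, calling relax('s', 'v', W, d, pi) sets d['v'] = 3 and
# pi['v'] = 's'.
```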

  27. Observations • Every sequence of relax operations satisfies: • d(v) ≥ δ(s,v) at all times • Clearly: vertices with δ(s,v)=∞ always have d(v)=∞ • If s → … → u → v is a shortest path, (u,v) an edge, and d(u)=δ(s,u), then relaxing (u,v) gives d(v)=δ(s,v) • d(v) ≤ d(u)+W(u,v) after relaxing = δ(s,u)+W(u,v) = δ(s,v), by minimality of subpaths of shortest paths

  28. Dijkstra’s Algorithm • Solves SSSP • Needs: W(u,v) ≥ 0 for all edges • Idea: store vertices so that we can choose a vertex with minimal distance estimate • Choose the unprocessed v with minimal d(v), relax all its outgoing edges • Repeat until all v are processed

  29. Data Structure • Store n vertices and their distance estimates d(v) • Operations: • ExtractMin: Get the vertex with minimum d(v) • DecreaseKey(v,x): replace d(v) with a smaller value x • Insert(v,x): Insert a new vertex with d(v)=x • Initialize and Test for empty • Priority Queue

  30. Dijkstra’s Algorithm • Initialize π(v)=NIL for all v and d(s)=0, d(v)=∞ otherwise • S=∅ is the set of vertices processed • Insert all vertices into Q (priority queue) • While Q≠∅ • u:=ExtractMin • S:=S ∪ {u} and Q:=Q-{u} • For all neighbors v of u: relax(u,v,W) • relax uses DecreaseKey
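A Python sketch of the algorithm using the standard library’s heapq as the priority queue; since heapq offers no DecreaseKey, this version pushes an updated entry instead and skips stale ones when popped (a common substitute, not the slide’s exact data structure). Here adj maps each vertex to a list of (neighbor, weight) pairs.

```python
import heapq
import math

def dijkstra(adj, s):
    d = {v: math.inf for v in adj}     # distance estimates, infinity at start
    pi = {v: None for v in adj}        # predecessors (shortest-path tree)
    d[s] = 0
    queue = [(0, s)]                   # priority queue of (estimate, vertex)
    processed = set()                  # the set S of finished vertices
    while queue:
        dist_u, u = heapq.heappop(queue)   # ExtractMin
        if u in processed:
            continue                   # stale entry: d[u] was improved earlier
        processed.add(u)
        for v, w in adj[u]:            # relax all outgoing edges of u
            if d[v] > d[u] + w:
                d[v] = d[u] + w
                pi[v] = u
                heapq.heappush(queue, (d[v], v))   # stands in for DecreaseKey
    return d, pi

# Example: dijkstra({'s': [('a', 2), ('b', 5)], 'a': [('b', 1)], 'b': []}, 's')
# returns d = {'s': 0, 'a': 2, 'b': 3}.
```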
