
Dijkstra’s Algorithm



Presentation Transcript


  1. Dijkstra’s Algorithm

  2. Announcements • Assignment #2 Due Tonight • Exams Graded • Assignment #3 Posted

  3. Announcements

  4. Last Time • Breadth First Search, Depth First Search • Different algorithms for searching all nodes in a graph. • Topological Sort • Produces an ordering of vertices in a graph, given certain constraints (A before B, B before C, etc.) • Dijkstra’s Algorithm • Finds the shortest paths from a source vertex to all other vertices.

  5. Today • Finish up Dijkstra’s • Path reconstruction • MSTs – Minimum Spanning Tree algorithms • Prim’s • Kruskal’s

  6. Dijkstra’s Algorithm • This algorithm finds the shortest path from a source vertex to all other vertices in a weighted directed graph without negative edge weights. [Example on the slide: a weighted directed graph on vertices a, b, c, d, e.]
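A minimal sketch of the algorithm in Java (the adjacency-list representation and the names below are illustrative assumptions, not from the slides): a priority queue repeatedly hands us the unvisited vertex with the smallest tentative distance, and we relax its outgoing edges.

  import java.util.*;

  class DijkstraSketch {
      // graph.get(u) holds int[]{v, w} pairs: a directed edge u -> v with weight w.
      // Returns dist[v] = shortest distance from source to v (Integer.MAX_VALUE if unreachable).
      // Assumes all edge weights are non-negative.
      static int[] shortestDistances(List<List<int[]>> graph, int source) {
          int n = graph.size();
          int[] dist = new int[n];
          Arrays.fill(dist, Integer.MAX_VALUE);
          dist[source] = 0;

          // Queue entries are {distance, vertex}, ordered by distance.
          PriorityQueue<int[]> pq = new PriorityQueue<>(Comparator.comparingInt((int[] a) -> a[0]));
          pq.add(new int[]{0, source});

          while (!pq.isEmpty()) {
              int[] top = pq.poll();
              int d = top[0], u = top[1];
              if (d > dist[u]) continue;              // stale queue entry, skip it
              for (int[] edge : graph.get(u)) {
                  int v = edge[0], w = edge[1];
                  if (dist[u] + w < dist[v]) {        // relax the edge u -> v
                      dist[v] = dist[u] + w;
                      pq.add(new int[]{dist[v], v});
                  }
              }
          }
          return dist;
      }
  }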

  7. Dijkstra’s Path Reconstruction • Example shown on the Board
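One common way to do the reconstruction (a sketch extending the Dijkstra code above; the prev[] array is an added assumption, not something named on the slides): whenever relaxing an edge improves dist[v], remember which vertex the improvement came from, then walk those predecessors backwards from the destination.

  // In the relaxation step above, also record the predecessor:
  //     dist[v] = dist[u] + w;
  //     prev[v] = u;                       // prev[] is filled with -1 before the search
  // Afterwards the shortest path is rebuilt by walking prev[] from target back to source.
  static List<Integer> reconstructPath(int[] prev, int source, int target) {
      LinkedList<Integer> path = new LinkedList<>();
      for (int v = target; v != -1; v = prev[v]) {
          path.addFirst(v);
          if (v == source) return path;      // path listed from source to target
      }
      return Collections.emptyList();        // target is not reachable from source
  }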

  8. Minimum Spanning Trees • In a weighted, undirected graph, it is a tree formed by connecting all of the vertices with minimal total cost. • The MST is a tree because it’s acyclic. • It’s spanning because it covers every vertex. • And it’s minimum because it has minimum cost. [Example on the slide: a weighted graph on vertices v1–v7 and its minimum spanning tree.]

  9. MSTs – proof that greedy works • Let G be a graph with vertices in the set V partitioned into two sets V1 and V2. Then the minimum weight edge, e, that connects a vertex from V1 to V2 is part of a minimum spanning tree of G. • Proof: Consider an MST T of G that does NOT contain the minimum weight edge e. • T MUST have at least one edge between a vertex in V1 and a vertex in V2. (Otherwise, no vertices in those two sets would be connected.) • Let f be such an edge of T connecting V1 to V2. • Now, add edge e to T. • This creates a cycle. In particular, there was already one path from every vertex in V1 to V2, and with the addition of e, there are two. • Thus, we can form a cycle involving both e and f. Now, imagine removing f from this cycle. • This new graph, T', is also a spanning tree, but its total weight is less than or equal to that of T, because we replaced f with e, and e was the minimum weight edge crossing the cut.

  10. MSTs – proof that greedy works • As a spanning tree is built, • if each edge that is added is the minimum cost edge that avoids creating a cycle, • then the cost of the resulting spanning tree cannot be improved, • because any replacement edge would cost at least as much as an edge already in the spanning tree. • This is why greedy works!

  11. Kruskal’s Algorithm Let T = ∅ For i = 1 to n-1, (where there are n vertices in the graph) T = T ∪ {e}, where e is the edge with the minimum edge weight not already in T, and that does NOT form a cycle when added to T. Return T • Basically, you build the MST of the graph by continually adding into the MST the smallest weighted edge that doesn't form a cycle. • When you are done, you'll have an MST. • You HAVE to make sure you never add an edge that forms a cycle and that you always add the minimum of ALL the edges left that don't. (A code sketch appears after slide 13.)

  12. Kruskal’s [Worked example on the slide: given graph G on vertices v1–v7, the edges are listed in sorted order by weight and added one at a time; an edge whose endpoints are already connected is marked CYCLE! and skipped, and the process stops once all vertices are covered.]

  13. Kruskal’s • The reason this works: • each added edge connects two sets of vertices, • and since we select the edges in order by weight, • we are always selecting the minimum weight edge that connects those two sets. • Cycle detection: • Keep track of disjoint sets. • Initially, each vertex is in its own disjoint set. • When you add an edge, you union two sets. • A union cannot happen if the two vertices are already in the same set, because adding that edge would create a cycle.
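Putting slides 11 and 13 together, here is a hedged Java sketch (the edge format, names, and DisjointSet helper are illustrative): sort the edges by weight, and use union-find to skip any edge whose endpoints are already in the same set, since adding that edge would form a cycle.

  import java.util.*;

  class KruskalSketch {
      // Disjoint-set (union-find) structure with path compression, as described on slide 13.
      static class DisjointSet {
          int[] parent;
          DisjointSet(int n) { parent = new int[n]; for (int i = 0; i < n; i++) parent[i] = i; }
          int find(int x) { return parent[x] == x ? x : (parent[x] = find(parent[x])); }
          boolean union(int a, int b) {          // false if a and b are already connected
              int ra = find(a), rb = find(b);
              if (ra == rb) return false;
              parent[ra] = rb;
              return true;
          }
      }

      // Each edge is int[]{u, v, weight}. Returns the edges of an MST of a connected graph.
      static List<int[]> minimumSpanningTree(int numVertices, List<int[]> edges) {
          edges.sort(Comparator.comparingInt((int[] e) -> e[2]));   // consider edges by weight
          DisjointSet ds = new DisjointSet(numVertices);
          List<int[]> mst = new ArrayList<>();
          for (int[] e : edges) {
              if (ds.union(e[0], e[1])) {                           // different sets: no cycle
                  mst.add(e);
                  if (mst.size() == numVertices - 1) break;         // a spanning tree has n-1 edges
              }
          }
          return mst;
      }
  }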

  14. Prim’s Algorithm • This is quite similar to Kruskal's with one big difference: • The tree that you are "growing" ALWAYS stays connected. • Whereas in Kruskal's you could add an edge to your growing tree that wasn't connected to the rest of it, here you can NOT do that. Here is the algorithm: 1) Set S = ∅ and pick any vertex in the graph. 2) Add the minimum weight edge incident to that vertex to S. 3) Continue to add edges into S (n-2 more times) using the following rule: add the minimum weight edge to S that is incident to S but that doesn't form a cycle when added to S.
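A hedged sketch of Prim's in Java (a lazy priority-queue version; the adjacency-list format and names are illustrative): start from any vertex and repeatedly take the cheapest edge that reaches a vertex not yet in the tree, so the tree stays connected at every step.

  import java.util.*;

  class PrimSketch {
      // graph.get(u) holds int[]{v, w} pairs for an undirected edge u -- v of weight w
      // (each edge appears in both endpoints' lists). Returns the total MST weight,
      // assuming the graph is connected.
      static long mstWeight(List<List<int[]>> graph, int start) {
          int n = graph.size();
          boolean[] inTree = new boolean[n];
          // Queue entries are {weight, vertex}: candidate edges leaving the growing tree.
          PriorityQueue<int[]> pq = new PriorityQueue<>(Comparator.comparingInt((int[] a) -> a[0]));
          pq.add(new int[]{0, start});

          long total = 0;
          int added = 0;
          while (!pq.isEmpty() && added < n) {
              int[] top = pq.poll();
              int w = top[0], u = top[1];
              if (inTree[u]) continue;          // u was already reached by a cheaper edge
              inTree[u] = true;
              total += w;
              added++;
              for (int[] edge : graph.get(u)) {
                  if (!inTree[edge[0]]) pq.add(new int[]{edge[1], edge[0]});
              }
          }
          return total;
      }
  }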

  15. Prim’s [Worked example on the slide: given graph G on vertices v1–v7, determine the MST using Prim's starting with vertex v1; at each step the minimum weight edge incident to the growing tree that doesn't form a cycle is added, until all vertices are included.]

  16. Example on the board

  17. Huffman Encoding • Compress the storage of data using variable length codes. • For example, each character in a text file is stored using 8 bits. • Nice and easy because we always read in 8 bits for a single character. • Not the most efficient… • What if ‘e’ is used 10 times more frequently than ‘q’? • It would be more advantageous for us to use a 7-bit code for ‘e’ and a 9-bit code for ‘q’.
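With illustrative (assumed) counts: in a file containing 1,000 occurrences of ‘e’ and 100 occurrences of ‘q’, the fixed 8-bit code spends 8 × 1,100 = 8,800 bits on those two characters, while a 7-bit code for ‘e’ and a 9-bit code for ‘q’ spends 7 × 1,000 + 9 × 100 = 7,900 bits, roughly a 10% saving on just those characters.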

  18. Huffman Coding • Finds the optimal way to take advantage of varying character frequencies in a particular file. • On average, it can shrink standard files anywhere from 10% to 30%, depending on the character distribution. • The idea behind the coding is to give less frequent characters and groups of characters longer codes. • Also, the coding is constructed in such a way that no constructed code is a prefix of another. • This property of the code is crucial with respect to easily deciphering the code.

  19. Building a Huffman Tree • Example on the board
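A hedged sketch of the standard construction in Java (the Node class and the frequency-map input are illustrative assumptions): put every character into a min-heap keyed by frequency, repeatedly merge the two least frequent nodes until one tree remains, then read codes off root-to-leaf paths, left edge = 0 and right edge = 1. Because every symbol sits at a leaf, no code is a prefix of another.

  import java.util.*;

  class HuffmanSketch {
      static class Node {
          char symbol;              // meaningful only for leaf nodes
          int freq;
          Node left, right;
          Node(char s, int f) { symbol = s; freq = f; }
          Node(Node l, Node r) { left = l; right = r; freq = l.freq + r.freq; }
      }

      // Builds the Huffman tree from character frequencies.
      static Node buildTree(Map<Character, Integer> freq) {
          PriorityQueue<Node> pq = new PriorityQueue<>(Comparator.comparingInt(node -> node.freq));
          for (Map.Entry<Character, Integer> e : freq.entrySet()) {
              pq.add(new Node(e.getKey(), e.getValue()));
          }
          while (pq.size() > 1) {               // merge the two least frequent subtrees
              Node a = pq.poll(), b = pq.poll();
              pq.add(new Node(a, b));
          }
          return pq.poll();
      }

      // Walks the tree to print each symbol's code: left = '0', right = '1'.
      static void printCodes(Node node, String code) {
          if (node == null) return;
          if (node.left == null && node.right == null) {
              System.out.println(node.symbol + " -> " + code);
              return;
          }
          printCodes(node.left, code + "0");
          printCodes(node.right, code + "1");
      }
  }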

  20. References • Slides adapted from Arup Guha’s Computer Science II Lecture notes: http://www.cs.ucf.edu/~dmarino/ucf/cop3503/lectures/ • Additional material from the textbook: Data Structures and Algorithm Analysis in Java (Second Edition) by Mark Allen Weiss • Additional images: www.wikipedia.com xkcd.com
