
Chapter 5


Presentation Transcript


  1. Chapter 5 Decrease and Conquer

  2. Homework 7 • hw7 (due 3/17) • page 127 question 5 • page 132 questions 5 and 6 • page 137 questions 5 and 6 • page 168 questions 1 and 4

  3. Decrease and Conquer (also referred to as the inductive or incremental approach) • Reduce a problem instance to a smaller instance of the same problem and extend the solution: • Solve the smaller instance • Extend the solution of the smaller instance to obtain the solution to the original problem • Note: we are not dividing the problem into two smaller problems.

  4. Examples of Decrease and Conquer • Decrease by one: insertion sort; graph search algorithms (DFS, BFS, topological sorting); algorithms for generating permutations and subsets • Decrease by a constant factor: binary search; fake-coin problem; multiplication à la russe; Josephus problem • Variable-size decrease: Euclid’s algorithm; selection by partition

  5. What’s the difference? Consider the problem of exponentiation: compute a^n • Brute force: a*a* ... *a • Divide and conquer: a^(n/2) * a^(n/2) • Decrease by one: a^(n-1) * a • Decrease by constant factor: (a^(n/2))^2

  6. a^n Brute Force: • a*a*a*a*a* ... *a*a*a • Requires n – 1 multiplications • Programmed as a loop • Obviously O(n)
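A minimal Python sketch of this brute-force loop (the name power_brute is mine, just for illustration; n is assumed to be a positive integer):

def power_brute(a, n):
    # a^n = a * a * ... * a, performed as n - 1 multiplications
    result = a
    for _ in range(n - 1):
        result = result * a
    return result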

  7. a^n Divide and conquer: • a^(n/2) * a^(n/2) • Isn’t this clever? • a^8 = a^4 * a^4 = (a^2 * a^2) * (a^2 * a^2) = (a*a * a*a) * (a*a * a*a) • Why is this wasteful? • If you compute a^4, why should you have to compute it again? • Sometimes divide and conquer doesn’t yield an advantage!

  8. a^n Decrease by one: • a^n = a^(n-1) * a • Q: Is this really any different from the brute-force method? • A: No, except that it can be programmed recursively. • We still haven’t done better than O(n)
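The same decrease-by-one recurrence, a^n = a^(n-1) * a, written as a recursive Python sketch (the name power_dec_by_one is assumed, not from the slides):

def power_dec_by_one(a, n):
    # Base case: a^1 = a
    if n == 1:
        return a
    # Decrease by one: a^n = a^(n-1) * a
    return power_dec_by_one(a, n - 1) * a

It still performs n – 1 multiplications, so the running time remains O(n).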

  9. a^n Decrease by constant factor: • a^n = (a^(n/2))^2 • You’re probably thinking: Dr. B. is kidding, right? • Q: Isn’t this exactly the same as the divide-and-conquer approach? • A: No, check this out. • a^8 = (a^4)^2 = ((a^2)^2)^2 = ((a*a)^2)^2 • We actually do only 3 multiplications: • a*a = v1 • v1 * v1 = v2 • v2 * v2 = a^8

  10. a^n Decrease by constant factor: • a^n = (a^(n/2))^2 • a^16 = (((a*a)^2)^2)^2 → 4 multiplications • a^32 = ((((a*a)^2)^2)^2)^2 → 5 multiplications • … • a^1024 → 10 multiplications • Obviously this is an O(log n) algorithm • What about a^47?

  11. a^n Decrease by constant factor: • What about a^47? • a^47 = a * (a^23)^2 → 2 multiplications • a^23 = a * (a^11)^2 → 2 multiplications • a^11 = a * (a^5)^2 → 2 multiplications • a^5 = a * (a^2)^2 → 2 multiplications • a^2 = a*a → 1 multiplication • It might actually take about 2 log n multiplications in the worst case, which is still O(log n). • Isn’t Big-O nice?
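Putting both cases together (square when the exponent is even, peel off one extra factor of a when it is odd, as in the a^47 example above), one possible Python sketch looks like this; the name power_halving is my own:

def power_halving(a, n):
    # Base case: a^1 = a
    if n == 1:
        return a
    half = power_halving(a, n // 2)   # a^(n div 2), computed only once
    if n % 2 == 0:
        return half * half            # even: a^n = (a^(n/2))^2
    return a * half * half            # odd:  a^n = a * (a^((n-1)/2))^2

Each recursive call roughly halves the exponent, so the number of multiplications is at most about 2 log n, i.e., O(log n).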

  12. Graph Traversal • Many problems require processing all graph vertices in a systematic fashion • Graph traversal algorithms: • Depth-first search • Breadth-first search

  13. Depth-first search • Given a graph G = (V, E), explore the graph, always moving away from the last visited vertex • G consists of two sets: • V is a set of vertices, e.g. V = {A, B, C, D, E, F} • E is a set of edges, e.g. E = {(A,B), (A,C), (C,D), (D,E), (E,C), (B,F)} • (The slide shows this example graph drawn on the vertices A–F.)

  14. Depth-first search

DFS(G)
    count := 0
    mark each vertex with 0 (unvisited)
    for each vertex v in V do
        if v is marked with 0
            dfs(v)

dfs(v)
    count := count + 1
    mark v with count
    for each vertex w adjacent to v do
        if w is marked with 0
            dfs(w)
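A runnable Python sketch of the same DFS numbering, assuming the graph is given as an adjacency-list dictionary (the representation and the name dfs_all are my own, not from the slides):

def dfs_all(graph):
    # graph: dict mapping each vertex to a list of adjacent vertices
    count = 0
    mark = {v: 0 for v in graph}      # 0 means unvisited

    def dfs(v):
        nonlocal count
        count += 1
        mark[v] = count               # record the visit order
        for w in graph[v]:
            if mark[w] == 0:
                dfs(w)

    for v in graph:
        if mark[v] == 0:
            dfs(v)
    return mark

For the undirected example graph above it could be called as dfs_all({'A': ['B', 'C'], 'B': ['A', 'F'], 'C': ['A', 'D', 'E'], 'D': ['C', 'E'], 'E': ['C', 'D'], 'F': ['B']}).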

  15. Types of edges • Tree edges: edges comprising the DFS forest • Back edges: edges to ancestor nodes • Forward edges: edges to descendants (digraphs only) • Cross edges: none of the above

  16. Depth-first search: Notes • DFS can be implemented with graphs represented as: • Adjacency matrices: Θ(V²) • Adjacency linked lists: Θ(V+E) • Yields two distinct orderings of the vertices: • preorder: the order in which vertices are first encountered (pushed onto the stack) • postorder: the order in which vertices become dead ends (popped off the stack)

  17. Depth-first search: Notes • Applications: • checking connectivity, finding connected components • checking acyclicity • searching the state space of problems for solutions (AI)

  18. Breadth-first search • Explore the graph by moving across to all the neighbors of the last visited vertex • Similar to level-by-level tree traversal • Instead of a stack, breadth-first search uses a queue • Applications: same as DFS, but BFS can also find paths from a vertex to all other vertices with the smallest number of edges

  19. Breadth-first search algorithm

BFS(G)
    count := 0
    mark each vertex with 0 (unvisited)
    for each vertex v in V do
        if v is marked with 0
            bfs(v)

bfs(v)
    count := count + 1
    mark v with count
    initialize queue with v
    while queue is not empty do
        a := front of queue
        for each vertex w adjacent to a do
            if w is marked with 0
                count := count + 1
                mark w with count
                add w to the end of the queue
        remove a from the front of the queue
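A Python sketch mirroring this pseudocode, again assuming an adjacency-list dictionary (the name bfs_all is mine):

from collections import deque

def bfs_all(graph):
    # graph: dict mapping each vertex to a list of adjacent vertices
    count = 0
    mark = {v: 0 for v in graph}          # 0 means unvisited
    for s in graph:
        if mark[s] != 0:
            continue
        count += 1
        mark[s] = count
        queue = deque([s])                # initialize queue with s
        while queue:
            a = queue[0]                  # front of queue
            for w in graph[a]:
                if mark[w] == 0:
                    count += 1
                    mark[w] = count
                    queue.append(w)       # add w to the end of the queue
            queue.popleft()               # remove a from the front
    return mark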

  20. Breadth-first search: Notes • BFS has the same efficiency as DFS and can be implemented with graphs represented as: • Adjacency matrices: Θ(V²) • Adjacency linked lists: Θ(V+E) • Yields a single ordering of the vertices (the order in which vertices are added to the queue is the same as the order in which they are removed)

  21. Directed acyclic graph (dag) • A directed graph with no cycles • Arises in modeling many problems, e.g.: • prerequisite structures • food chains • Implies a partial ordering on the domain

  22. Topological sorting • Problem: find a total order consistent with a partial order • Example: a food-chain dag with vertices tiger, human, fish, sheep, shrimp, plankton, wheat; order them so that no species has to wait for any of its food (i.e., from lower to higher, consistent with the food chain) • The problem is solvable iff the graph is a dag

  23. Topological sorting algorithms • DFS-based algorithm: • perform a DFS traversal, noting the order in which vertices are popped off the stack • the reverse of that order is a topological sort • back edges encountered? → NOT a dag! • Source-removal algorithm: • repeatedly identify and remove a source vertex, i.e., a vertex that has no incoming edges • Both are Θ(V+E) using adjacency linked lists
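A possible Python sketch of the source-removal algorithm, assuming the dag is given as an adjacency-list dictionary of outgoing edges (the name topo_sort_source_removal is my own):

def topo_sort_source_removal(dag):
    # dag: dict mapping each vertex to a list of vertices it points to
    indegree = {v: 0 for v in dag}
    for v in dag:
        for w in dag[v]:
            indegree[w] += 1

    # sources are vertices with no incoming edges
    sources = [v for v in dag if indegree[v] == 0]
    order = []
    while sources:
        v = sources.pop()
        order.append(v)
        # "remove" v: decrease the in-degree of each vertex it points to
        for w in dag[v]:
            indegree[w] -= 1
            if indegree[w] == 0:
                sources.append(w)

    if len(order) != len(dag):
        raise ValueError("graph has a cycle, so it is not a dag")
    return order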

  24. Variable-size decrease: Binary search trees • Arrange keys in a binary tree with the binary search tree property: for a node holding key k, all keys in its left subtree are less than k and all keys in its right subtree are greater than k • Example 1: insert 5, 10, 3, 1, 7, 12, 9 • Example 2: insert 4, 5, 7, 2, 1, 3, 6 • What about repeated keys?

  25. Searching and insertion in binary search trees • Searching: straightforward • Insertion: search for the key, then insert it at the leaf where the search terminated • All operations: worst-case number of key comparisons = h + 1 • ⌊lg n⌋ ≤ h ≤ n – 1, with an average height of about 1.41 lg n for random files • Thus all operations have: • worst case: Θ(n) • average case: Θ(lg n) • Bonus: an inorder traversal produces a sorted list (treesort)
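A short Python sketch of insertion and search following the description above (the class and function names are my own, for illustration only):

class BSTNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_insert(root, key):
    # Search for key and attach a new leaf where the search terminates
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    # equal keys are simply ignored here (cf. "What about repeated keys?")
    return root

def bst_search(root, key):
    # At most h + 1 key comparisons, where h is the height of the tree
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root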

  26. Homework 7 • hw7 (due 3/17) • page 127 question 5 • page 132 questions 5 and 6 • page 137 questions 5 and 6 • page 168 questions 1 and 4
