
COSC 3100 Transform and Conquer




Presentation Transcript


  1. COSC 3100 Transform and Conquer Instructor: Tanvir

  2. What is Transform and Conquer ? • The 4th algorithm design technique we are going to study • Three major variations • Instance Simplification: Transform to a simpler or more convenient instance of the same problem • Representation Change: Transform to a different representation of the same instance • Problem Reduction: Transform to an instance of a different problem for which you know an efficient algorithm • Two-step process: • Step 1: Modify problem instance to something easier to solve • Step 2: Conquer!

  3. What is Transform and Conquer ? A problem's instance is transformed into a simpler instance of the same problem, another representation of the same instance, or another problem's instance, and that is then solved. Examples we will see: Instance simplification — 1. Presorting. Representation change — 1. Heapsort 2. Horner's Rule 3. Binary Exponentiation. Problem reduction — 1. Computing Least Common Multiple 2. Counting Paths in a Graph 3. Reduction to Optimization Problems 4. Reduction to Graph Problems

  4. Trns. & Conq.: Presorting • Many questions about a list are easier to answer if the list is sorted • We shall see three examples • Example 1: Checking element uniqueness in an array. What is the brute-force idea ? Take, say, the first element and check whether it is in the rest of the array, take the second element and check whether it is in the rest of the array, and so on… T(n) = (n-1) + (n-2) + … + 1 = n(n-1)/2 ∈ Θ(n²)

  5. Trns. & Conq.: Presorting • Let us apply sorting first

    ALGORITHM PresortElementUniqueness(A[0..n-1])
        sort the array A
        for i <- 0 to n-2 do
            if A[i] = A[i+1] return false
        return true

  T(n) = Tsort(n) + Tscan(n) ∈ Θ(nlgn) + Θ(n) = Θ(nlgn)
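The presorting scheme above can be sketched in Python (a minimal sketch; the function name is mine, not from the slides):

```python
def presort_element_uniqueness(a):
    """Return True iff all elements of a are distinct.

    Step 1 (transform): sort, Theta(n lg n).
    Step 2 (conquer): scan adjacent pairs, Theta(n).
    """
    a = sorted(a)                    # sorted copy; original list untouched
    for i in range(len(a) - 1):
        if a[i] == a[i + 1]:         # equal elements are now adjacent
            return False
    return True
```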

  6. Trns. & Conq.: Presorting • Example 2: Computing a mode • A mode is a value that occurs most often in a list • E.g., in 5, 1, 5, 7, 6, 5, 7, the mode is 5. Brute-force: scan the list and compute the frequencies of all distinct values, then find the value with the highest frequency. How to implement brute-force ? Store the values already encountered, along with their frequencies, in a separate list (for the list above — Value: 5, 1, 7, 6; Frequency: 3, 1, 2, 1). What is the worst-case input ? An array with no equal elements… C(n) = 0 + 1 + … + (n-1) = n(n-1)/2 ∈ Θ(n²)
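The brute-force bookkeeping described above (a separate list of seen values with their frequencies) can be sketched in Python; the names are mine:

```python
def brute_force_mode(a):
    """Return a mode of the non-empty list a by tabulating frequencies.

    Worst case (all elements distinct): 0 + 1 + ... + (n-1)
    comparisons, i.e. Theta(n^2).
    """
    values, frequencies = [], []
    for x in a:
        if x in values:                       # linear scan of values seen so far
            frequencies[values.index(x)] += 1
        else:
            values.append(x)
            frequencies.append(1)
    return values[frequencies.index(max(frequencies))]
```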

  7. Trns. & Conq.: Presorting • Example 2: Computing a mode (contd.) • If we sort the array first, all equal values will be adjacent to each other. • To compute the mode, all we need to know is to find the longest run of adjacent equal values in the sorted array

  8. Trns. & Conq.: Presorting Example 2: Computing a mode (contd.)

    ALGORITHM PresortMode(A[0..n-1])
        sort the array A
        i <- 0
        modefrequency <- 0
        while i ≤ n-1 do
            runlength <- 1; runvalue <- A[i]
            while i+runlength ≤ n-1 and A[i+runlength] = runvalue
                runlength <- runlength+1
            if runlength > modefrequency
                modefrequency <- runlength; modevalue <- runvalue
            i <- i+runlength
        return modevalue

  The while loop takes linear time, so the overall runtime is dominated by the time for sorting, Θ(nlgn)
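The PresortMode pseudocode translates to Python roughly as follows (0-based indices instead of the slide's array bounds):

```python
def presort_mode(a):
    """Return a mode of the non-empty list a: sort, then find the longest run."""
    a = sorted(a)                    # equal values become adjacent
    i, mode_frequency, mode_value = 0, 0, None
    while i < len(a):
        run_length, run_value = 1, a[i]
        while i + run_length < len(a) and a[i + run_length] == run_value:
            run_length += 1
        if run_length > mode_frequency:
            mode_frequency, mode_value = run_length, run_value
        i += run_length              # jump past the run just measured
    return mode_value
```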

  9. Trns. & Conq.: Heapsort • Another O(nlgn) sorting algorithm • Uses a clever data structure called “Heap” • It transforms an array into a Heap and then sorting becomes very easy • Heap has other important uses, like in implementing “priority queue” • Recall, priority queue is a multiset of items with an orderable characteristic called “priority”, with the following operations: • Find an item with the highest priority • Deleting an item with the highest priority • Adding a new item to the multiset Let us study Heap first…

  10. Trns. & Conq.: Heap • A "heap" can be described as a binary tree, with one key per node, provided the following two conditions are met: • Shape property: the binary tree is essentially complete — all levels are full except possibly the last level, where only some rightmost leaves may be missing • Parental dominance (or heap property): the key in each node is greater than or equal to the keys in its children (considered automatically satisfied for the leaves) This is actually the "max heap"; there is a corresponding "min heap"…

  11. Trns. & Conq.: Heap Shape and heap properties… [figure: three example binary trees built over keys such as 10, 5, 7, 6, 4, 2, 1 — for each, Is it a Heap ?]

  12. Trns. & Conq.: Heap Note: the sequence of values on any path from the root to a leaf is nonincreasing; there is no left-to-right order in key values, though. [figure: the heap with keys 10, 7, 8, 5, 2, 1, 6 stored in an array — parents occupy the first positions, leaves the last]

    ALGORITHM Parent(i) return ⌊i/2⌋
    ALGORITHM Left(i) return 2i
    ALGORITHM Right(i) return 2i+1

  13. Trns. & Conq.: Heap Important properties of heaps: 1. There exists exactly one essentially complete binary tree with n nodes; its height is ⌊lg n⌋ 2. The root of a heap contains the largest element 3. A node with all its descendants is also a heap 4. A heap can be implemented as an array by recording its elements in the top-down, left-to-right fashion. H[0] is unused or contains a number bigger than all the rest. a. Parental node keys will be in the first ⌊n/2⌋ positions; leaves will be in the last ⌈n/2⌉ positions b. Children of the key at index i (1 ≤ i ≤ ⌊n/2⌋) will be in positions 2i and 2i+1; the parent of the key at index i (2 ≤ i ≤ n) will be in position ⌊i/2⌋. Which gives: H[i] ≥ max{ H[2i], H[2i+1] } for i = 1, …, ⌊n/2⌋
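The array representation above reduces to three one-line index helpers (1-based indexing with H[0] unused, as on the slide; the function names are mine):

```python
# 1-based heap layout: H[0] unused, keys in H[1..n].

def parent(i):
    """Index of the parent of H[i], valid for 2 <= i <= n."""
    return i // 2

def left(i):
    """Index of the left child of H[i]."""
    return 2 * i

def right(i):
    """Index of the right child of H[i]."""
    return 2 * i + 1
```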

  14. Trns. & Conq.: Heap • How can we transform an array into a heap? Two ways. Method 1 ("bottom-up heap construction"): 1. Start at the last parent, at index ⌊n/2⌋ 2. Check if parental dominance holds for this node's key K 3. If it does not, exchange K with the larger of its children's keys 4. Check if parental dominance holds for K in its new position, and so on… Stop after doing this for the root. This process is called "heapifying". [example: the array 2, 9, 7, 6, 5, 8 is heapified into 9, 6, 8, 2, 5, 7]

  15. Trns. & Conq.: Heap

    ALGORITHM HeapBottomUp(H[1..n])
    //Input: An array H[1..n] of orderable items
    //Output: A heap H[1..n]
    for i <- ⌊n/2⌋ downto 1 do
        k <- i; v <- H[k]
        heap <- false
        while not heap and 2*k ≤ n do
            j <- 2*k
            if j < n            // there are two children
                if H[j] < H[j+1] j <- j+1
            if v ≥ H[j]
                heap <- true
            else
                H[k] <- H[j]; k <- j
        H[k] <- v

  [example: heapifying one parent at a time in the array 2, 9, 7, 6, 5, 8]
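A direct Python transcription of HeapBottomUp (same 1-based layout with h[0] unused; variable names follow the pseudocode):

```python
def heap_bottom_up(h, n):
    """Turn h[1..n] (h[0] unused) into a max-heap, bottom-up."""
    for i in range(n // 2, 0, -1):          # last parent down to the root
        k, v = i, h[i]
        heap = False
        while not heap and 2 * k <= n:
            j = 2 * k
            if j < n and h[j] < h[j + 1]:   # two children: pick the larger
                j += 1
            if v >= h[j]:
                heap = True                 # parental dominance holds
            else:
                h[k] = h[j]                 # move the larger child up
                k = j
        h[k] = v                            # place v in its final slot
```

On the slide's example array 2, 9, 7, 6, 5, 8 this produces the heap 9, 6, 8, 2, 5, 7.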

  16. Trns. & Conq.: Heap • Let us analyze HeapBottomUp(H[1..n])'s worst case. Let's take a heap whose binary tree is full. Why ? That is the largest possible # of nodes at each level. So, number of nodes, n = 2^m − 1 (e.g., n = 2²−1, n = 2³−1). So what is the height ? h = ⌊lg n⌋ = ⌈lg(n+1)⌉ − 1 = m − 1. Let's look at a worst case: a key on level i must travel down to the leaf level h; level i has (h−i) lower levels. Going down by 1 level requires 2 comparisons: one to determine the larger child, the other to determine if an exchange is needed. Cworst(n) = Σ_{i=0}^{h−1} Σ_{keys on level i} 2(h−i) = Σ_{i=0}^{h−1} 2(h−i)·2^i = 2(n − lg(n+1)) So, fewer than 2n comparisons are required

  17. Trns. & Conq.: Heap • Building a heap, Method 2 • It's called "top-down heap construction" • Idea: successively insert a new key into a previously constructed heap • Attach a new node with key K after the last leaf of the existing heap • Sift K up until parental dominance is established [example: inserting 10 into the heap 9, 6, 8, 2, 5, 7 — 10 is attached as a child of 8, then sifted up past 8 and 9 to the root] The insertion operation cannot take more than twice the height of the heap's tree, so it is in O(lgn)
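A minimal sketch of the sift-up insertion, assuming the same 1-based array layout (names are mine):

```python
def heap_insert(h, key):
    """Insert key into the max-heap h (1-based, h[0] unused) by sifting up."""
    h.append(key)                 # attach after the last leaf
    i = len(h) - 1
    while i > 1 and h[i // 2] < h[i]:
        h[i // 2], h[i] = h[i], h[i // 2]   # swap with parent
        i //= 2
```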

  18. Trns. & Conq.: Heap • Maximum key deletion from a heap • Exchange the root's key with the last key K in the heap • Decrease the heap's size by 1 • "Heapify" the smaller tree by sifting K down, establishing "parental dominance" as long as it is needed [example: we want to delete the root 9 of the heap 9, 8, 6, 2, 5, 1 — the last key 1 moves to the root and is sifted down, giving 8, 5, 6, 2, 1] Efficiency is O(lgn)
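Maximum-key deletion can be sketched as follows (same 1-based layout; my transcription of the three steps above, not textbook code):

```python
def heap_delete_max(h):
    """Remove and return the largest key of the non-empty max-heap h
    (1-based, h[0] unused)."""
    maximum = h[1]
    last = h.pop()                # the last key K leaves its leaf
    n = len(h) - 1                # heap size decreased by 1
    if n >= 1:                    # sift K down from the root
        k, v = 1, last
        while 2 * k <= n:
            j = 2 * k
            if j < n and h[j] < h[j + 1]:
                j += 1            # larger of the two children
            if v >= h[j]:
                break             # parental dominance holds
            h[k] = h[j]
            k = j
        h[k] = v
    return maximum
```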

  19. Trns. & Conq.: Heapsort • An interesting algorithm discovered by J. W. J. Williams in 1964 • It has 2 stages • Stage 1: Construct a heap from a given array • Stage 2: Apply the root-deletion operation n-1 times to the remaining heap
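The two stages combine into a self-contained heapsort sketch (my Python transcription of the scheme above, not Williams's original code):

```python
def heapsort(a):
    """Return a sorted copy of a via the two-stage heapsort scheme."""
    h = [None] + list(a)          # 1-based heap, h[0] unused
    n = len(a)

    def sift_down(k, size):
        """Re-establish parental dominance for h[k] in h[1..size]."""
        v = h[k]
        while 2 * k <= size:
            j = 2 * k
            if j < size and h[j] < h[j + 1]:
                j += 1
            if v >= h[j]:
                break
            h[k] = h[j]
            k = j
        h[k] = v

    for i in range(n // 2, 0, -1):       # Stage 1: heap construction, O(n)
        sift_down(i, n)
    for last in range(n, 1, -1):         # Stage 2: n-1 root deletions
        h[1], h[last] = h[last], h[1]    # max goes to its final position
        sift_down(1, last - 1)
    return h[1:]
```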

  20. Trns. & Conq.: Heapsort Stage 1: Heap construction [trace on the array 2, 9, 7, 6, 5, 8: 2 9 7 6 5 8 → 2 9 8 6 5 7 → 9 2 8 6 5 7 → 9 6 8 2 5 7]

  21. Trns. & Conq.: Heapsort Stage 2: Maximum deletions

  22. Trns. & Conq.: Heapsort • What is Heapsort's worst-case complexity ? Stage 1: Heap construction is in ? O(n) Stage 2: Maximum deletions is in ? Let's see… Let C(n) be the # of comparisons needed in Stage 2. Recall, the height of the heap's tree is ⌊lg n⌋, and each root-deletion costs at most twice the height of the current tree: C(n) ≤ 2⌊lg(n−1)⌋ + 2⌊lg(n−2)⌋ + … + 2⌊lg 1⌋ ≤ 2 Σ_{i=1}^{n−1} lg i ≤ 2(n−1)lg(n−1) ≤ 2nlgn So, C(n) ∈ O(nlgn) So, for the two stages, O(n) + O(nlgn) = O(nlgn)

  23. Trns. & Conq.: Heapsort • More careful analysis shows that Heapsort’s worst and best cases are in Θ(nlgn) like Mergesort • Additionally Heapsort is in-place • Timing experiments on random files show that Heapsort is slower than Quicksort, but can be competitive with Mergesort • What sorting algorithm is used by Collections.sort(List<T> list) of Java?

  24. Trns. & Conq.: Problem Reduction Problem 1 (to be solved) --reduction--> Problem 2 (solvable by Alg. A) --Alg. A--> Solution to Problem 2 • Compute the least common multiple of two nonnegative integers, lcm(m, n) • Say m = 24 and n = 60 • 24 = 2·2·2·3 and 60 = 2·2·3·5 • Take the product of all common factors of m and n, all factors of m not in n, and all factors of n not in m • lcm(24, 60) = (2·2·3)·2·5 = 120 • Is it a good algorithm? Notice, 24·60 = (2·2·2·3)·(2·2·3·5) = (2·2·3)²·2·5 So, m·n = lcm(m, n) · gcd(m, n), which gives lcm(m, n) = m·n / gcd(m, n)!
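The reduction to gcd is a one-liner in Python, using the standard-library math.gcd (gcd itself is efficiently computable by Euclid's algorithm):

```python
from math import gcd

def lcm(m, n):
    """lcm by reduction to gcd, using m*n = lcm(m, n) * gcd(m, n).

    Assumes m, n > 0 (gcd(0, 0) = 0 would make this divide by zero).
    """
    return m * n // gcd(m, n)
```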

  25. THANK YOU…
