
CSCE 350 - Data Structures and Algorithms


Presentation Transcript


  1. CSCE 350 - Data Structures and Algorithms Lecture 08 – The Master Theorem/Merge Sort

  2. Overview
• Last Time
  • Topological sort (4.2): source removal = Kahn's algorithm; DFS-based = Tarjan's
  • Generating permutations: from the permutations of the first (n − 1) elements, insert n into all possible positions
  • Lexicographic generation
  • Generating subsets
  • Divide and Conquer: binary search, exponentiation by squaring, finding the kth element, Russian peasant multiplication
• Today
  • Smoothness (Appendix B)
  • Divide and Conquer: merge sort (5.1), quicksort (5.2)
  • Last time's slides 30-
  • Test 2 - Monday

  3. Eventually Non-decreasing Functions
• f(n) is eventually non-decreasing if there exists an n0 such that if n0 < n1 < n2 then f(n1) ≤ f(n2)
• Examples
  • f(n) = (n − 10)^3 is eventually increasing, with n0 = 10
  • Is f(n) = 0.5n + sin(n) eventually non-decreasing?
    • f'(x) = 0.5 + cos(x) for real x is sometimes positive (f increasing) and sometimes negative (f decreasing)
  • However, f(n) = n + sin(n) is non-decreasing, since f'(x) = 1 + cos(x) ≥ 0

  4. Smooth Functions
• f(n) is smooth if it is eventually non-decreasing and f(2n) ∈ Θ(f(n))
• Example: f(n) = n log n is smooth, since
  f(2n) = 2n log(2n) = 2n[log n + log 2] = 2n log n + (2 log 2)n ∈ Θ(n log n) = Θ(f(n))
• On the other hand, f(n) = 2^n is not smooth,
  since f(2n) = 2^(2n) = 4^n, and 4^n is not in O(2^n)

  5. Theorem 3
• Theorem 3: Let f be a smooth function. Then for any fixed integer b ≥ 2, f(bn) ∈ Θ(f(n)),
  i.e., there exist positive constants c_b and d_b and a non-negative integer n0 such that
  d_b f(n) ≤ f(bn) ≤ c_b f(n) for n ≥ n0

  6. Theorem 4 (the Smoothness Rule)
Let T(n) be an eventually non-decreasing function and f be a smooth function. If T(n) ∈ Θ(f(n)) for the values of n that are powers of b, where b ≥ 2, then T(n) ∈ Θ(f(n)).

  7. Example: Addition of Inputs of Size n = 2^k
• Algorithm: recursively add the first half and the second half, then add the two results (a sketch follows below)
• Recurrence relation for the number of additions: A(n) = 2A(n/2) + 1
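
A minimal Python sketch of such a divide-and-conquer addition (my illustration, not from the slides; the function name is made up):

    def dc_sum(A, lo, hi):
        # Sum A[lo..hi] by splitting in half; the one addition that combines
        # the two halves gives the recurrence A(n) = 2A(n/2) + 1 for n = 2^k.
        if lo == hi:
            return A[lo]
        mid = (lo + hi) // 2
        return dc_sum(A, lo, mid) + dc_sum(A, mid + 1, hi)

    # Example: dc_sum([3, 1, 4, 1, 5, 9, 2, 6], 0, 7) returns 31.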

  8. Divide and Conquer Figure (Levitin): a problem of size n is split into two subproblems of size n/2; the solutions to subproblems 1 and 2 are combined into the solution to the original problem.

  9. Divide and Conquer More Generally

    procedure T(n : size of problem)
        if n < 1 then exit
        Do work of amount f(n)
        T(n/b)
        T(n/b)
        ... repeat for a total of a times ...
        T(n/b)
    end procedure

• Recurrence relation? http://en.wikipedia.org/wiki/Master_theorem

  10. The Master Theorem
• Master Theorem: If T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d) and d ≥ 0, then
  • If a < b^d, T(n) ∈ Θ(n^d)
  • If a = b^d, T(n) ∈ Θ(n^d log n)
  • If a > b^d, T(n) ∈ Θ(n^(log_b a))
• Note: the same results hold with O instead of Θ.
• Examples (worked below):
  • T(n) = 4T(n/2) + n ⟹ T(n) ∈ ?
  • T(n) = 4T(n/2) + n^2 ⟹ T(n) ∈ ?
  • T(n) = 4T(n/2) + n^3 ⟹ T(n) ∈ ?
A. Levitin, "Introduction to the Design & Analysis of Algorithms," 3rd ed.
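
A worked application of the theorem to the three examples (my own check, not part of the slide):

\[
\begin{aligned}
T(n) &= 4T(n/2) + n   &&: \ a = 4,\ b = 2,\ d = 1,\ a > b^d \Rightarrow T(n) \in \Theta(n^{\log_2 4}) = \Theta(n^2)\\
T(n) &= 4T(n/2) + n^2 &&: \ a = 4,\ b = 2,\ d = 2,\ a = b^d \Rightarrow T(n) \in \Theta(n^2 \log n)\\
T(n) &= 4T(n/2) + n^3 &&: \ a = 4,\ b = 2,\ d = 3,\ a < b^d \Rightarrow T(n) \in \Theta(n^3)
\end{aligned}
\]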

  11. Merge Sort
• Sort an array by splitting it in half, sorting the halves (recursively), then merging the sorted sublists

  12. Mergesort: the Picture
• Fig. 5.2, Levitin

  13. Merge Sort

    ALGORITHM Mergesort(A[0..n − 1])
    //Sorts array A[0..n − 1] by recursive mergesort
    //Input: An array A[0..n − 1] of orderable elements
    //Output: Array A[0..n − 1] sorted in nondecreasing order
    if n > 1
        copy A[0..⌊n/2⌋ − 1] to B[0..⌊n/2⌋ − 1]
        copy A[⌊n/2⌋..n − 1] to C[0..⌈n/2⌉ − 1]
        Mergesort(B[0..⌊n/2⌋ − 1])
        Mergesort(C[0..⌈n/2⌉ − 1])
        Merge(B, C, A) //see below

Levitin, Anany (2011). Introduction to the Design and Analysis of Algorithms, 3rd ed., Addison-Wesley, p. 172.

  14. The Merge Step

    ALGORITHM Merge(B[0..p − 1], C[0..q − 1], A[0..p + q − 1])
    //Merges two sorted arrays into one sorted array
    //Input: Arrays B[0..p − 1] and C[0..q − 1] both sorted
    //Output: Sorted array A[0..p + q − 1] of the elements of B and C
    i ← 0; j ← 0; k ← 0
    while i < p and j < q do
        if B[i] ≤ C[j]
            A[k] ← B[i]; i ← i + 1
        else
            A[k] ← C[j]; j ← j + 1
        k ← k + 1
    if i = p
        copy C[j..q − 1] to A[k..p + q − 1]
    else
        copy B[i..p − 1] to A[k..p + q − 1]

Levitin, Anany (2011). Introduction to the Design and Analysis of Algorithms, 3rd ed., Addison-Wesley, p. 172.
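
A runnable Python transcription of the two routines above (my sketch, not Levitin's code; it follows the pseudocode line for line):

    def merge(B, C, A):
        # Merge sorted lists B and C into A, where len(A) == len(B) + len(C).
        i = j = k = 0
        while i < len(B) and j < len(C):
            if B[i] <= C[j]:
                A[k] = B[i]; i += 1
            else:
                A[k] = C[j]; j += 1
            k += 1
        if i == len(B):
            A[k:] = C[j:]        # B exhausted: copy the rest of C
        else:
            A[k:] = B[i:]        # C exhausted: copy the rest of B

    def mergesort(A):
        # Sorts list A in place in nondecreasing order.
        if len(A) > 1:
            B = A[:len(A) // 2]  # copy A[0..⌊n/2⌋ − 1]
            C = A[len(A) // 2:]  # copy A[⌊n/2⌋..n − 1]
            mergesort(B)
            mergesort(C)
            merge(B, C, A)

    # Example:
    data = [8, 3, 2, 9, 7, 1, 5, 4]
    mergesort(data)              # data is now [1, 2, 3, 4, 5, 7, 8, 9]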

  15. Analysis of Mergesort
• C(n) = 2C(n/2) + Cmerge(n) for n > 1, and C(1) = 0
• Now, what is the worst case for the merge step? Neither array is exhausted until the very last comparison (e.g., when the smallest remaining element alternates between B and C)
• So Cmerge-worst(n) = n − 1, and
• Cworst(n) = 2Cworst(n/2) + (n − 1) for n > 1, and Cworst(1) = 0
• By the Master Theorem: a = 2, b = 2, and since (n − 1) ∈ Θ(n^1), d = 1
• So b^d = 2^1 = 2 = a; case 2 applies, and Cworst(n) ∈ Θ(n log n)

  16. Analysis of Mergesort
• All cases have the same efficiency: Θ(n log n)
• The number of comparisons in the worst case is close to the theoretical minimum for comparison-based sorting: ⌈log2 n!⌉ ≈ n log2 n − 1.44n
• Space requirement: Θ(n) (not in-place)
• Can be implemented without recursion (bottom-up)
A. Levitin, "Introduction to the Design & Analysis of Algorithms," 3rd ed.

  17. For the n = 2^k Case
• Cworst(n) = n log2 n − n + 1
• Homework, Section 5.1:
  2. a. Write pseudocode for a divide-and-conquer algorithm for finding the values of both the largest and smallest elements in an array of n numbers.
     b. Set up and solve (for n = 2^k) a recurrence relation for the number of key comparisons made by your algorithm.
     c. How does this algorithm compare with the brute-force algorithm for this problem?
  3. a. Write pseudocode for a divide-and-conquer algorithm for the exponentiation problem of computing a^n where n is a positive integer.
     b. Set up and solve a recurrence relation for the number of multiplications made by this algorithm.
     c. How does this algorithm compare with the brute-force algorithm for this problem?
Levitin, Anany (2011). Introduction to the Design and Analysis of Algorithms, 3rd ed., Addison-Wesley, p. 174.

  18. Quicksort Partitioning
• Partition the array into two sublists around a pivot A[s]:
  • the sublist of elements A[i] ≤ A[s], and
  • the sublist of elements A[i] ≥ A[s]
• Then sort the sublists (again recursively)

  19. Quicksort

    ALGORITHM Quicksort(A[l..r])
    //Sorts a subarray by quicksort
    //Input: Subarray of array A[0..n − 1], defined by its left and right indices l and r
    //Output: Subarray A[l..r] sorted in nondecreasing order
    if l < r
        s ← Partition(A[l..r]) //s is a split position
        Quicksort(A[l..s − 1])
        Quicksort(A[s + 1..r])

Levitin, Anany (2011). Introduction to the Design and Analysis of Algorithms, 3rd ed., Addison-Wesley, p. 176.

  20. Hoare's Partition Step

    ALGORITHM HoarePartition(A[l..r])
    //Partitions a subarray by Hoare's algorithm, using the first element as a pivot
    //Input: Subarray of array A[0..n − 1], defined by its left and right indices l and r (l < r)
    //Output: Partition of A[l..r], with the split position returned as this function's value
    p ← A[l]
    i ← l; j ← r + 1
    repeat
        repeat i ← i + 1 until A[i] ≥ p
        repeat j ← j − 1 until A[j] ≤ p
        swap(A[i], A[j])
    until i ≥ j
    swap(A[i], A[j]) //undo last swap when i ≥ j
    swap(A[l], A[j])
    return j
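
A runnable Python transcription of quicksort with the partition step above (my sketch, not Levitin's code). The one liberty taken is an explicit bound check on the i-scan so it cannot run past r when the pivot is the largest element:

    def hoare_partition(A, l, r):
        # Partition A[l..r] using A[l] as the pivot; return the split position.
        p = A[l]
        i, j = l, r + 1
        while True:
            i += 1
            while i < r and A[i] < p:      # repeat i ← i + 1 until A[i] ≥ p
                i += 1
            j -= 1
            while A[j] > p:                # repeat j ← j − 1 until A[j] ≤ p
                j -= 1
            if i >= j:                     # scans have crossed: stop swapping
                break
            A[i], A[j] = A[j], A[i]
        A[l], A[j] = A[j], A[l]            # put the pivot into its final position
        return j

    def quicksort(A, l=0, r=None):
        # Sorts A[l..r] in place.
        if r is None:
            r = len(A) - 1
        if l < r:
            s = hoare_partition(A, l, r)   # s is a split position
            quicksort(A, l, s - 1)
            quicksort(A, s + 1, r)

    # Example:
    data = list("EXAMPLE")
    quicksort(data)                        # data is now ['A', 'E', 'E', 'L', 'M', 'P', 'X']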

  21. The partitioning picture • i starts at left; j starts at right; they move toward each other

  22. Analysis of Quicksort
• Best case (split in the middle every time): Θ(n log n)
• Worst case (an already-sorted array!): Θ(n^2)
• Average case (random arrays): Θ(n log n)
• Improvements:
  • better pivot selection: median-of-three partitioning (see the sketch below)
  • switch to insertion sort on small subfiles
  • elimination of recursion
  Together these give a 20-25% improvement
• Considered the method of choice for internal sorting of large files (n ≥ 10000)
A. Levitin, "Introduction to the Design & Analysis of Algorithms," 3rd ed.
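
A minimal sketch of the median-of-three pivot selection mentioned above (my illustration, not from the slides): pick the median of the first, middle, and last elements of the subarray and move it into position l, so a partition routine that pivots on A[l], such as the one sketched earlier, sees a better pivot.

    def median_of_three(A, l, r):
        # Move the median of A[l], A[mid], A[r] into A[l] before partitioning.
        mid = (l + r) // 2
        trio = sorted([(A[l], l), (A[mid], mid), (A[r], r)])
        m = trio[1][1]                     # index holding the median value
        A[l], A[m] = A[m], A[l]

    # Usage: call median_of_three(A, l, r) just before partitioning A[l..r].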

  23. Homework Problems
• 1. Apply quicksort to sort the list E, X, A, M, P, L, E in alphabetical order. Draw the tree of the recursive calls made.
• 5. For the version of quicksort given in this section:
  • a. Are arrays made up of all equal elements the worst-case input, the best-case input, or neither?
  • b. Are strictly decreasing arrays the worst-case input, the best-case input, or neither?
• 6. For quicksort with the median-of-three pivot selection:
  • a. Are strictly increasing arrays the worst-case input, the best-case input, or neither?
  • b. Answer the same question for strictly decreasing arrays.
Levitin, Anany (2011). Introduction to the Design and Analysis of Algorithms, 3rd ed., Addison-Wesley, p. 181.
