CPSC 411 Design and Analysis of Algorithms

Set 3: Divide and Conquer. Prof. Jennifer Welch, Spring 2011.

Presentation Transcript


  1. CPSC 411 Design and Analysis of Algorithms
  Set 3: Divide and Conquer
  Prof. Jennifer Welch
  Spring 2011
  CPSC 411, Spring 2011: Set 3

  2. Divide and Conquer Paradigm
  • An important general technique for designing algorithms:
    • divide the problem into subproblems
    • recursively solve the subproblems
    • combine the solutions to the subproblems to get a solution to the original problem
  • Use recurrences to analyze the running time of such algorithms

  3. Mergesort as D&C Example
  • DIVIDE the input sequence in half
  • RECURSIVELY sort the two halves
    • the basis of the recursion is a sequence with 1 key
  • COMBINE the two sorted subsequences by merging them
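The three steps above can be sketched in Python. This is an illustrative implementation, not code from the slides; the function names `merge_sort` and `merge` are my own:

```python
def merge_sort(seq):
    """Sort seq by divide and conquer: split, recurse, merge."""
    if len(seq) <= 1:                  # basis: a sequence of 0 or 1 keys is sorted
        return list(seq)
    mid = len(seq) // 2                # DIVIDE the input in half
    left = merge_sort(seq[:mid])       # RECURSIVELY sort the two halves
    right = merge_sort(seq[mid:])
    return merge(left, right)          # COMBINE by merging

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])               # append whichever half has leftovers
    out.extend(right[j:])
    return out
```

The merge walks both halves once, which is where the linear-time combine step comes from.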

  4. Mergesort Example
  [diagram: mergesort recursion tree — the input sequence is repeatedly halved down to single keys, then the sorted halves are merged back together level by level]

  5. Mergesort Animation
  • http://ccl.northwestern.edu/netlogo/models/run.cgi?MergeSort.862.378

  6. Recurrence Relation for Mergesort
  • Let T(n) be the worst-case time on a sequence of n keys
  • If n = 1, then T(n) = Θ(1) (constant)
  • If n > 1, then T(n) = 2T(n/2) + Θ(n)
    • two subproblems of size n/2 each that are solved recursively
    • Θ(n) time to do the merge
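The recurrence above can be checked numerically against its closed form. Taking both hidden constants to be 1 (an assumption for illustration), the solution for n a power of two is T(n) = n·log₂(n) + n, i.e. Θ(n log n):

```python
def T(n):
    """Mergesort recurrence with unit constants: T(1) = 1, T(n) = 2*T(n//2) + n."""
    return 1 if n == 1 else 2 * T(n // 2) + n

# For n = 2^k the closed form is T(n) = n*k + n, which is Theta(n log n).
```

For example, T(8) = 2·T(4) + 8 = 2·(2·T(2) + 4) + 8 = 32 = 8·3 + 8, matching the closed form.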

  7. How To Solve Recurrences
  • Ad hoc method:
    • expand several times
    • guess the pattern
    • can verify with a proof by induction
  • Master theorem:
    • a general formula that works if the recurrence has the form T(n) = aT(n/b) + f(n)
    • a is the number of subproblems
    • n/b is the size of each subproblem
    • f(n) is the cost of the non-recursive part

  8. <board work>

  9. Additional D&C Algorithms
  • binary search
    • divide the sequence into two halves by comparing the search key to the midpoint
    • recursively search in one of the two halves
    • the combine step is empty
  • quicksort
    • divide the sequence into two parts by comparing the pivot to each key
    • recursively sort the two parts
    • the combine step is empty
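Binary search makes the "empty combine step" concrete: the answer to the subproblem is the answer to the whole problem. A recursive sketch (illustrative, not from the slides):

```python
def binary_search(seq, key, lo=0, hi=None):
    """Return an index of key in sorted seq, or -1 if absent."""
    if hi is None:
        hi = len(seq) - 1
    if lo > hi:                          # basis: empty range, key absent
        return -1
    mid = (lo + hi) // 2                 # DIVIDE at the midpoint
    if seq[mid] == key:
        return mid
    if key < seq[mid]:                   # RECURSE into the left half...
        return binary_search(seq, key, lo, mid - 1)
    return binary_search(seq, key, mid + 1, hi)   # ...or the right half
```

Only one half is ever searched, so the recurrence is T(n) = T(n/2) + Θ(1), which solves to Θ(log n).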

  10. Additional D&C Applications
  • computational geometry
    • finding the closest pair of points
    • finding the convex hull
  • mathematical calculations
    • converting binary to decimal
    • integer multiplication
    • matrix multiplication
    • matrix inversion
    • Fast Fourier Transform

  11. Matrix Multiplication
  • Consider two n-by-n matrices A and B
  • By definition, A×B is the n-by-n matrix C whose (i,j)-th entry is computed like this:
    • consider row i of A and column j of B
    • multiply together the first entries of the row and column, the second entries, etc.
    • then add up all the products
  • The number of scalar operations (multiplies and adds) in the straightforward algorithm is O(n³).
  • Can we do it faster?
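The straightforward algorithm described above is three nested loops; entry C[i][j] is the dot product of row i of A with column j of B. A sketch (the function name is my own):

```python
def matmul(A, B):
    """Straightforward n x n matrix product: Theta(n^3) scalar multiplies and adds."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):           # dot product of row i of A, column j of B
                C[i][j] += A[i][k] * B[k][j]
    return C
```

Each of the n² entries of C costs n multiplies and n − 1 adds, giving the O(n³) operation count.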

  12. Faster Matrix Multiplication
  • Clever idea due to Strassen
  • Start with a recursive version of the straightforward algorithm:
    • divide A, B, and C into 4 quadrants
    • compute the 4 quadrants of C by doing 8 matrix multiplications on the quadrants of A and B, plus Θ(n²) scalar operations
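The quadrant decomposition can be sketched as follows, assuming n is a power of 2 so the halving is always exact (helper names `quad` and `add` are illustrative):

```python
def rec_matmul(A, B):
    """Recursive blocked product of n x n matrices, n a power of 2.
    Uses 8 half-size multiplications plus Theta(n^2) scalar additions."""
    n = len(A)
    if n == 1:                           # basis: 1 x 1 matrices
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    def quad(M, r, c):                   # extract the h x h quadrant at (r, c)
        return [row[c:c + h] for row in M[r:r + h]]
    def add(X, Y):                       # entrywise sum of two matrices
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    A11, A12, A21, A22 = quad(A, 0, 0), quad(A, 0, h), quad(A, h, 0), quad(A, h, h)
    B11, B12, B21, B22 = quad(B, 0, 0), quad(B, 0, h), quad(B, h, 0), quad(B, h, h)
    C11 = add(rec_matmul(A11, B11), rec_matmul(A12, B21))   # the 8 recursive products
    C12 = add(rec_matmul(A11, B12), rec_matmul(A12, B22))
    C21 = add(rec_matmul(A21, B11), rec_matmul(A22, B21))
    C22 = add(rec_matmul(A21, B12), rec_matmul(A22, B22))
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]             # reassemble the quadrants
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```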

  13. Faster Matrix Multiplication
  • The running time of the recursive version of the straightforward algorithm is
    • T(n) = 8T(n/2) + Θ(n²)
    • T(2) = Θ(1)
    where T(n) is the running time on an n × n matrix
  • The master theorem gives us T(n) = Θ(n³)
  • Can we do fewer recursive calls (fewer multiplications of the n/2 × n/2 submatrices)?

  14. Strassen's Matrix Multiplication
  • There is a way to get all the required information with only 7 matrix multiplications instead of 8
    • unintuitive
  • The recurrence for the new algorithm is
    • T(n) = 7T(n/2) + Θ(n²)
  • The solution is T(n) = Θ(n^(log₂ 7)) = O(n^2.81)
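The seven products and the combinations that recover the four quadrants of C can be written out directly. This sketch uses the standard formulation of Strassen's products (labeled M1 through M7 here) and assumes n is a power of 2; the helper names are my own:

```python
def strassen(A, B):
    """Strassen's method for n x n matrices, n a power of 2:
    7 half-size products instead of 8, so T(n) = 7T(n/2) + Theta(n^2)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    def quad(M, r, c):                   # extract the h x h quadrant at (r, c)
        return [row[c:c + h] for row in M[r:r + h]]
    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    def sub(X, Y):
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    A11, A12, A21, A22 = quad(A, 0, 0), quad(A, 0, h), quad(A, h, 0), quad(A, h, h)
    B11, B12, B21, B22 = quad(B, 0, 0), quad(B, 0, h), quad(B, h, 0), quad(B, h, h)
    M1 = strassen(add(A11, A22), add(B11, B22))   # the 7 products
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    C11 = add(sub(add(M1, M4), M5), M7)           # C11 = M1 + M4 - M5 + M7
    C12 = add(M3, M5)                             # C12 = M3 + M5
    C21 = add(M2, M4)                             # C21 = M2 + M4
    C22 = add(add(sub(M1, M2), M3), M6)           # C22 = M1 - M2 + M3 + M6
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```

The additions and subtractions cost only Θ(n²), so saving one of the eight multiplications is what drops the exponent from 3 to log₂ 7.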

  15. <board work>

  16. Discussion of Strassen's Algorithm
  • Not always practical:
    • the constant factor is larger than for the naïve method
    • specially designed methods are better on sparse matrices
    • issues of numerical (in)stability
    • the recursion uses lots of space
  • Not the fastest known method:
    • the fastest known is O(n^2.376)
    • the best known lower bound is only Ω(n²)
