
Analysis of Algorithms Chapter - 02 Recurrences






  1. Analysis of Algorithms Chapter - 02 Recurrences

  2. Methods • Definition: A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs. • The worst-case running time T(n) of an algorithm can often be described by a recurrence of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is a given function. • Three methods for solving recurrences, i.e., for obtaining asymptotic bounds on the solution, will be discussed: • Substitution method: • We guess a bound and then use mathematical induction to prove that our guess is correct. • Recursion-tree method: • It converts the recurrence into a tree whose nodes represent the costs incurred at the various levels of the recursion. • We then use techniques for bounding summations to solve the recurrence. • Master method: • It provides bounds for recurrences of the above form. • It requires memorizing three cases. • Once you memorize the cases, determining asymptotic bounds for many recurrences is easy.

  3. Induction • Definition of Proof: The process or an instance of establishing the validity of a statement, especially by derivation from other statements in accordance with principles of reasoning. • Proof by Induction: Let Pn be a statement for each positive integer n (n = 1, 2, 3, . . .). If the following two properties hold: • P1 is true. • Pk+1 is true whenever Pk is true, for each positive integer k. Then Pn is true for all n. • Why Induction Works: • Let S be the set of all numbers n for which Pn is false. • Let k be the minimum number in S. • k > 1, since by the first property of the induction definition P1 is true. • By the minimality of k, Pk−1 is true while Pk is false. • This contradicts the second property of the induction definition, so S must be empty.

  4. A Summation Problem • Prove that for any integer n ≥ 1: 1 + 2 + 3 + · · · + n = n(n+1)/2. • Define: − L(n) = 1 + 2 + 3 + · · · + n. − R(n) = n(n+1)/2. • Prove that L(n) = R(n) for n ≥ 1. • Verifying the Claim • First check the claim for small values of n.

  5. A Direct Proof • Idea: Compute the value of 2L(n). • Example: • 2(1 + 2 + 3) = (1 + 2 + 3) + (3 + 2 + 1) • = (1 + 3) + (2 + 2) + (3 + 1) • = 3 · 4 • = 12 • = 2(3·4/2) • In general, • 2L(n) = (1 + · · · + n) + (n + · · · + 1) • = (1 + n) + (2 + (n−1)) + · · · + (n + 1) • = (n + 1) + (n + 1) + · · · + (n + 1) • = n(n + 1) • This implies that L(n) = n(n+1)/2 = R(n).

  6. A Proof by Induction • For n=1, L(1) = 1 and R(1) = (1·2)/2 = 1. So, for n=1, it is true. • Let us assume that it is true for n=k, that is, L(k) = R(k), i.e., 1+2+3+…+k = k(k+1)/2. • Now, if we can prove that it is true for n=k+1 also, then it can be said that it is true for all n. • For n=k+1, L(k+1) = 1 + 2 + · · · + k + (k+1) = L(k) + (k+1) = R(k) + (k+1) = k(k+1)/2 + (k+1) = (k+1)(k/2 + 1) = (k+1)(k+2)/2 = R(k+1) • Hence L(n) = R(n) for all integers n ≥ 1.
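The identity just proved can also be spot-checked numerically; a minimal sketch, where the helper names L and R simply mirror the definitions above:

```python
# Spot-check the identity 1 + 2 + ... + n = n(n+1)/2 for many n.
def L(n):
    return sum(range(1, n + 1))  # left-hand side: the explicit sum

def R(n):
    return n * (n + 1) // 2      # right-hand side: the closed form

assert all(L(n) == R(n) for n in range(1, 201))
```

Such a check is no substitute for the induction proof, but it catches a wrong guess immediately.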

  7. Substitution Method

  8. Recurrence -1 • The substitution method for solving recurrences entails two steps: • Guess the form of the solution. • Use mathematical induction to find the constants and show that the solution works. • This method is powerful, but it can be applied only when it is easy to guess the form of the solution. • It can be used to establish either upper or lower bounds on a recurrence. • Problem: Let us determine an upper bound on the following recurrence: T(1) = 0. T(n) = 2T(n/2) + n, for n > 1. • Solution: Compute the solution for small powers of 2: T(2) = 2T(1) + 2 = 2. T(4) = 2T(2) + 4 = 8. T(8) = 2T(4) + 8 = 24. T(16) = 2T(8) + 16 = 64. T(32) = 2T(16) + 32 = 160.

  9. Recurrence -1 (Contd.) • Guessing the Solution • Guess T(n) = n log2 n for n a power of 2. • A Proof by Induction: For n=1, T(1) = 0 and n log2 n = 1·log2 1 = 0. So, for n=1, it is true. • Let us assume that it is true for n=k/2, that is, T(k/2) = (k/2) log2(k/2). • Now, if we can prove that it is true for n=k as well, then it holds for all powers of 2. • For n=k, • T(k) = 2T(k/2) + k • = 2(k/2) log2(k/2) + k • = k(log2 k − 1) + k • = k log2 k • Hence T(n) = n log2 n for every n that is a power of 2.
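Both the table of small values and the guess T(n) = n log2 n can be checked mechanically; a quick sketch, where T simply evaluates the recurrence:

```python
# T(1) = 0, T(n) = 2T(n/2) + n, evaluated for powers of 2.
def T(n):
    return 0 if n == 1 else 2 * T(n // 2) + n

# Reproduce the table of small values from the text ...
assert [T(n) for n in (2, 4, 8, 16, 32)] == [2, 8, 24, 64, 160]

# ... and verify the guess T(n) = n log2 n on larger powers of 2
# (for n = 2^k, the guess reads T(2^k) = k * 2^k).
assert all(T(2 ** k) == k * 2 ** k for k in range(1, 20))
```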

  10. Recurrence -2 • Problem: Let us determine an upper bound on the following recurrence: • T(1) = a. • T(n) = 2T(n/2) + bn, for n > 1. • Here a, b are constants (independent of n). • Solution: Compute the solution for small powers of 2: • T(2) = 2T(1) + 2b = 2b + 2a. • T(4) = 2T(2) + 4b = 8b + 4a. • T(8) = 2T(4) + 8b = 24b + 8a. • T(16) = 2T(8) + 16b = 64b + 16a. • T(32) = 2T(16) + 32b = 160b + 32a. • Guessing the Solution • Guess T(n) = bn log2 n + an, for n a power of 2.

  11. Verify the guess for small numbers: • b · 1 · log2 1 + a · 1 = a. • b · 2 · log2 2 + a · 2 = 2b + 2a. • b · 4 · log2 4 + a · 4 = 8b + 4a. • b · 8 · log2 8 + a · 8 = 24b + 8a. • b · 16 · log2 16 + a · 16 = 64b + 16a. • b · 32 · log2 32 + a · 32 = 160b + 32a. • A Proof by Induction: For n=1, T(1) = a and b·n log2 n + a·n = a (since log2 1 = 0). So, for n=1, it is true. • Let us assume that it is true for n=k/2, that is, T(k/2) = b·(k/2) log2(k/2) + a·(k/2). • Now, if we can prove that it is true for n=k as well, then it holds for all powers of 2. • For n=k, • T(k) = 2T(k/2) + bk • = 2(b(k/2) log2(k/2) + a(k/2)) + bk • = bk(log2 k − 1) + ak + bk • = bk log2 k + ak • Hence T(n) = b·n log2 n + a·n for every n that is a power of 2.
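The guess T(n) = b·n log2 n + a·n can be verified for sample constants as well; a minimal sketch (a = 3, b = 5 are arbitrary choices, not from the text):

```python
a, b = 3, 5  # arbitrary sample constants

# T(1) = a, T(n) = 2T(n/2) + b*n
def T(n):
    return a if n == 1 else 2 * T(n // 2) + b * n

# Closed-form guess T(n) = b*n*log2(n) + a*n; for n = 2^k this is
# b*k*2^k + a*2^k, so the check stays in exact integer arithmetic.
assert all(T(2 ** k) == b * k * 2 ** k + a * 2 ** k for k in range(0, 20))
```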

  12. Recurrence -3 • Problem: Let us determine an upper bound on the following recurrence: • T(1) = a. • T(n) = T(n/2) + b, for n > 1. • Here a, b are constants (independent of n). • Solution: Compute the solution for small powers of 2: • T(2) = T(1) + b = b + a. • T(4) = T(2) + b = 2b + a. • T(8) = T(4) + b = 3b + a. • T(16) = T(8) + b = 4b + a. • T(32) = T(16) + b = 5b + a. • Guessing the Solution • Guess T(n) = b log2 n + a, for n a power of 2.

  13. Verify the guess for small numbers: • b · log2 1 + a = a. • b · log2 2 + a = b + a. • b · log2 4 + a = 2b + a. • b · log2 8 + a = 3b + a. • b · log2 16 + a = 4b + a. • b · log2 32 + a = 5b + a. • A Proof by Induction: For n=1, T(1) = a and b log2 n + a = a (since log2 1 = 0). So, for n=1, it is true. • Let us assume that it is true for n=k/2, that is, T(k/2) = b log2(k/2) + a. • Now, if we can prove that it is true for n=k as well, then it holds for all powers of 2. • For n=k, • T(k) = T(k/2) + b • = (b log2(k/2) + a) + b • = b(log2 k − 1) + a + b • = b log2 k + a • Hence T(n) = b log2 n + a for every n that is a power of 2.
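This guess, too, can be checked mechanically; a minimal sketch with the same arbitrary sample constants a = 3, b = 5 (again not from the text):

```python
a, b = 3, 5  # arbitrary sample constants

# T(1) = a, T(n) = T(n/2) + b
def T(n):
    return a if n == 1 else T(n // 2) + b

# Closed-form guess T(n) = b*log2(n) + a; for n = 2^k this is b*k + a.
assert all(T(2 ** k) == b * k + a for k in range(0, 20))
```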

  14. Recurrence -4: T(n) = 2T(n/2) + n = O(n lg n) • Thus, we need to show that T(n) ≤ c n lg n with an appropriate choice of c. • Inductive hypothesis: assume T(n/2) ≤ c (n/2) lg (n/2). • Substitute back into the recurrence to show that T(n) ≤ c n lg n follows when c ≥ 1: • T(n) = 2T(n/2) + n ≤ 2(c (n/2) lg (n/2)) + n = cn lg(n/2) + n = cn lg n − cn lg 2 + n = cn lg n − cn + n ≤ cn lg n for c ≥ 1 • Hence T(n) = O(n lg n) for c ≥ 1.

  15. Iteration Method • Iteration method: • Expand the recurrence k times • Work some algebra to express as a summation • Evaluate the summation

  16. Iteration Method – Example T(n) = n + 2T(n/2) • T(n) = n + 2T(n/2) = n + 2(n/2 + 2T(n/4)) = n + n + 4T(n/4) = n + n + 4(n/4 + 2T(n/8)) = n + n + n + 8T(n/8) … = i·n + 2^i T(n/2^i) = kn + 2^k T(1) = n lg n + n·T(1) = Θ(n lg n) • Assume n = 2^k. Taking lg on both sides: lg n = lg(2^k) = k lg 2 = k · 1, so k = lg n.
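The unrolled form kn + 2^k T(1) = n lg n + n·T(1) can be checked by evaluating the recurrence directly; a sketch, where T1 = 1 is an arbitrary base cost (the text leaves T(1) unspecified):

```python
T1 = 1  # arbitrary base cost T(1)

# T(n) = n + 2T(n/2), T(1) = T1
def T(n):
    return T1 if n == 1 else n + 2 * T(n // 2)

# For n = 2^k, the iteration method gives T(n) = n*lg n + n*T(1),
# i.e. k*2^k + T1*2^k, which stays in exact integer arithmetic.
assert all(T(2 ** k) == k * 2 ** k + T1 * 2 ** k for k in range(0, 20))
```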

  17. T(n) = c + T(n-1) = c + c + T(n-2) = 2c + T(n-2) = 2c + c + T(n-3) = 3c + T(n-3) … = kc + T(n-k) • So far, for n ≥ k, we have • T(n) = ck + T(n-k) • To stop the recursion, we should have • n − k = 0, i.e., k = n • T(n) = cn + T(0) = cn • Thus, in general, T(n) = O(n)
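The unrolling above can be sketched and checked directly; c = 7 is an arbitrary constant and T(0) = 0 an assumed base case:

```python
c = 7  # arbitrary per-call cost

# T(n) = c + T(n-1), T(0) = 0
def T(n):
    return 0 if n == 0 else c + T(n - 1)

# Unrolls to T(n) = c*n, i.e. O(n).
assert all(T(n) == c * n for n in range(200))
```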

  18. T(n) = n + T(n-1) = n + n-1 + T(n-2) = n + n-1 + n-2 + T(n-3) = n + n-1 + n-2 + n-3 + T(n-4) = … = n + n-1 + n-2 + … + (n-k+1) + T(n-k), for n ≥ k • To stop the recursion, we should have n − k = 0, i.e., k = n • Then T(n) = n + (n-1) + … + 1 + T(0) = n(n+1)/2 + T(0) = O(n²)
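This unrolling connects back to the summation of slide 4; a minimal check, assuming T(0) = 0:

```python
# T(n) = n + T(n-1), T(0) = 0
def T(n):
    return 0 if n == 0 else n + T(n - 1)

# Unrolls to n + (n-1) + ... + 1 = n(n+1)/2, i.e. Theta(n^2).
assert all(T(n) == n * (n + 1) // 2 for n in range(200))
```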

  19. The Master Method • Based on the Master theorem. • T(n) = aT(n/b) + f(n) • a ≥ 1, b > 1 are constants. • f(n) is asymptotically positive. • Requires memorization of three cases.

  20. The Master Theorem • Theorem: • Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence T(n) = aT(n/b) + f(n), where n/b may be interpreted as either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be bounded asymptotically in three cases: • If f(n) = O(n^(logb a − ε)) for some constant ε > 0, then T(n) = Θ(n^(logb a)). • If f(n) = Θ(n^(logb a)), then T(n) = Θ(n^(logb a) lg n). • If f(n) = Ω(n^(logb a + ε)) for some constant ε > 0, and if, for some constant c < 1 and all sufficiently large n, we have a·f(n/b) ≤ c·f(n), then T(n) = Θ(f(n)).

  21. The Master Theorem • Given: a divide and conquer algorithm • An algorithm that divides the problem of size n into a subproblems, each of size n/b • Let the cost of each stage (i.e., the work to divide the problem + combine solved subproblems) be described by the function f(n)

  22. The Master Theorem • if T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1, • then T(n) is determined by one of the three cases of the theorem, obtained by comparing f(n) with n^(logb a).

  23. Understanding Master Theorem • In each of the three cases, we compare f(n) with n^(logb a); the solution to the recurrence is determined by the larger of the two functions. • In case 1, the function n^(logb a) is the larger, and the solution is T(n) = Θ(n^(logb a)). • In case 3, the function f(n) is the larger, and the solution is T(n) = Θ(f(n)). • In case 2, the two functions are the same size; the solution is T(n) = Θ(n^(logb a) lg n) = Θ(f(n) lg n).

  24. Understanding Master Theorem • In case 1, not only must f(n) be smaller than n^(logb a), it must be polynomially smaller. That is, f(n) must be asymptotically smaller than n^(logb a) by a factor of n^ε for some constant ε > 0. • In case 3, not only must f(n) be larger than n^(logb a), it must be polynomially larger and, in addition, satisfy the "regularity" condition that a·f(n/b) ≤ c·f(n) for some constant c < 1.

  25. Understanding Master Theorem • It is important to realize that the three cases do not cover all the possibilities for f(n). • There is a gap between cases 1 and 2 when f(n) is smaller than n^(logb a) but not polynomially smaller. • There is a gap between cases 2 and 3 when f(n) is larger than n^(logb a) but not polynomially larger. • If f(n) falls into one of these gaps, or if the regularity condition in case 3 fails to hold, the master method cannot be used to solve the recurrence.
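For the special case f(n) = Θ(n^k) the three cases reduce to comparing the exponent k with logb a, and regularity in case 3 holds automatically. A minimal sketch of that restricted classifier (the function name `master` and its output strings are my own, and f with log factors, e.g. n lg n, is deliberately out of scope):

```python
from math import isclose, log

def master(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) by the master theorem."""
    e = log(a, b)  # the critical exponent log_b(a)
    if isclose(k, e):              # case 2: f matches n^(log_b a)
        return f"Theta(n^{k:g} lg n)"
    if k < e:                      # case 1: n^(log_b a) dominates
        return f"Theta(n^{e:g})"
    return f"Theta(n^{k:g})"       # case 3: f dominates; regularity holds
                                   # automatically for polynomial f

assert master(9, 3, 1) == "Theta(n^2)"          # slide 26 (case 1)
assert master(1, 1.5, 0) == "Theta(n^0 lg n)"   # slide 27 (case 2): Theta(lg n)
assert master(2, 2, 2) == "Theta(n^2)"          # e.g. T(n)=2T(n/2)+n^2 (case 3)
```

The `isclose` comparison papers over floating-point error in `log`; a production version would work with exact rationals instead.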

  26. Using The Master Method: Case 1 • T(n) = 9T(n/3) + n • a = 9, b = 3, f(n) = n • n^(logb a) = n^(log3 9) = Θ(n²), since log3 9 = 2 • Since f(n) = O(n^(log3 9 − ε)) = O(n^(2 − 0.5)) = O(n^1.5), with ε = 0.5, • case 1 applies. • Thus the solution is T(n) = Θ(n²).

  27. Using The Master Method: Case 2 • T(n) = T(2n/3) + 1 • a = 1, b = 3/2, f(n) = 1 • n^(logb a) = n^(log3/2 1) = n^0 = 1 • Since f(n) = Θ(n^(logb a)) = Θ(1), • case 2 applies. • Thus the solution is T(n) = Θ(lg n).

  28. Using The Master Method: Case 3 • T(n) = 3T(n/4) + n lg n • a = 3, b = 4, f(n) = n lg n • n^(logb a) = n^(log4 3) ≈ n^0.793 • Since f(n) = Ω(n^(log4 3 + ε)) = Ω(n^(0.793 + 0.207)) = Ω(n), with ε ≈ 0.2, • and, for sufficiently large n, a·f(n/b) = 3(n/4) lg(n/4) ≤ (3/4) n lg n, i.e., the regularity condition holds with c = 3/4, • case 3 applies. • Thus the solution is T(n) = Θ(n lg n).

  29. Recursion-tree Method

  30. The method • Although the substitution method can provide a concise proof that a solution to a recurrence is correct, it is sometimes difficult to come up with a good guess. • Drawing out a recursion tree is a straightforward way to devise a good guess. • In a recursion tree, • each node represents the cost of a single subproblem somewhere in the set of recursive function invocations. • We sum the costs within each level of the tree to obtain a set of per-level costs. • Then we sum all the per-level costs to determine the total cost of all levels of the recursion. • Recursion trees are particularly useful when the recurrence describes the running time of a divide-and-conquer algorithm. • A recursion tree is best used to generate a good guess, which is then verified by the substitution method. • In this section, we will use recursion trees to generate good guesses.

  31. Example-1 • Let us see how a recursion tree would provide a good guess for the recurrence T(n) = 3T(n/4) + cn², with c > 0. • We start by focusing on finding an upper bound for the solution. The following figure shows the derivation of the recursion tree, assuming n is an exact power of 4.

  32. Explanation • Part (a) of the figure shows T(n), which is expanded in • Part (b) into an equivalent tree representing the recurrence. The cn² term at the root represents the cost at the top level of recursion, and the three subtrees of the root represent the costs incurred by the subproblems of size n/4. • Part (c) shows this process carried one step further by expanding each node with cost T(n/4) from part (b). The cost for each of the three children of the root is c(n/4)². • We continue expanding each node in the tree by breaking it into its constituent parts as determined by the recurrence. • Since the subproblem sizes decrease as we get further from the root, we eventually must reach a boundary condition. • How far from the root do we reach one? • The subproblem size for a node at depth i is n/4^i. Thus the subproblem size hits n=1 when n/4^i = 1, that is, at i = log4 n. • Thus the tree has levels 0, 1, 2, …, log4 n.

  33. Explanation (Contd.) • Now, we determine the cost at each level of the tree. • Each level has three times more nodes than the level above, so the number of nodes at depth i is 3^i. • Each node at depth i has a cost of c(n/4^i)², for i = 0, 1, 2, …, log4 n − 1. • Multiplying, we get that the total cost over all nodes at depth i is 3^i · c(n/4^i)² = (3/16)^i cn². • The last level, at depth log4 n, has 3^(log4 n) = n^(log4 3) nodes, each contributing cost T(1), for a total cost of n^(log4 3)·T(1), which is Θ(n^(log4 3)). • Adding up the costs over all levels: T(n) = cn²·Σ_{i=0}^{log4 n − 1} (3/16)^i + Θ(n^(log4 3)) < cn²·Σ_{i=0}^{∞} (3/16)^i + Θ(n^(log4 3)) = (16/13)cn² + Θ(n^(log4 3)) = O(n²).
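The level-by-level accounting can be verified numerically against a direct evaluation of the recurrence; a sketch assuming c = 1 and T(1) = c (both arbitrary choices for illustration):

```python
c = 1  # arbitrary constant from the recurrence

# T(n) = 3T(n/4) + c*n^2, T(1) = c
def T(n):
    return c if n == 1 else 3 * T(n // 4) + c * n * n

# Sum the tree level by level: (3/16)^i * c*n^2 for internal levels
# i = 0 .. log4(n)-1, plus 3^(log4 n) = n^(log4 3) leaves of cost T(1).
def tree_sum(n):
    depth = (n.bit_length() - 1) // 2  # log4(n) for n an exact power of 4
    internal = sum((3 / 16) ** i * c * n * n for i in range(depth))
    return internal + 3 ** depth * c

n = 4 ** 6
assert T(n) == tree_sum(n)  # the tree accounts for every unit of cost
```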

  34. Explanation (Contd.) • Now, we can use the substitution method to verify that our guess was correct, that is, that T(n) = O(n²) is an upper bound for the recurrence T(n) = 3T(n/4) + cn². • We want to show that T(n) ≤ dn² for some constant d > 0. • Using the same constant c > 0 as before, we have T(n) = 3T(n/4) + cn² ≤ 3d(n/4)² + cn² = (3/16)dn² + cn² ≤ dn², where the last step holds as long as d ≥ (16/13)c.

  35. Example-2 • Let us see how a recursion tree would provide a good guess for the recurrence: • T(n) = T(n/3) + T(2n/3) + O(n). • Let c > 0 be the constant factor in the O(n) term.

  36. Explanation (Contd.) • Now, when we add the values across the levels of the recursion tree, we get a value of cn for every level. • The longest path from the root to a leaf is n → (2/3)n → (2/3)²n → … → 1. • Since (2/3)^k n = 1 when k = log3/2 n, • the height of the tree is log3/2 n. • Intuitively, we expect the solution of the recurrence to be O(cn·log3/2 n) = O(n lg n). • We can use the substitution method to verify that this guess is correct, that is, that T(n) = O(n lg n) is an upper bound for the given recurrence. • We want to show that T(n) ≤ dn lg n for some constant d > 0. • Using the same constant c > 0 as before, we have T(n) ≤ d(n/3) lg(n/3) + d(2n/3) lg(2n/3) + cn = dn lg n − dn(lg 3 − 2/3) + cn ≤ dn lg n, as long as d ≥ c/(lg 3 − 2/3).
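A memoized evaluation of the recurrence supports the O(n lg n) guess numerically; a sketch using integer splits, c = 1, and the illustrative bound constant d = 4 (none of these choices come from the text):

```python
from functools import lru_cache
from math import log2

c, d = 1, 4  # c: cost constant; d: bound constant (illustrative choices)

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/3) + T(2n/3) + c*n, with integer splits and T(0)=T(1)=0
    if n <= 1:
        return 0
    return T(n // 3) + T(2 * n // 3) + c * n

# The guess: T(n) <= d*n*lg n for all n >= 2.
assert all(T(n) <= d * n * log2(n) for n in range(2, 3000))
```

The memoization matters here: without it the two-way recursion revisits the same subproblem sizes exponentially often.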
