
Recurrences (in color)


Presentation Transcript


  1. Recurrences (in color) It continues…

  2. Recurrences • When an algorithm calls itself recursively, its running time is described by a recurrence. • A recurrence describes a function in terms of its value on smaller inputs. • There are three methods for solving them: substitution, recursion tree, and the master method.

  3. What it looks like • This is the recurrence of MERGE-SORT: T(n) = Θ(1) if n = 1; T(n) = 2T(n/2) + Θ(n) if n > 1 • What this says is that the time is Θ(1) when n = 1 • Otherwise, the time is twice the time to sort a sub-array of half the size, plus Θ(n) time to merge the sorted sub-arrays
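Not from the slides, but a minimal merge sort sketch in Python makes the recurrence concrete: the two recursive calls on halves give the 2T(n/2) term, and the merge loop gives the Θ(n) term.

```python
def merge_sort(a):
    """Sort a list; two recursive calls on halves (2T(n/2)) plus a linear-time merge (Theta(n))."""
    if len(a) <= 1:              # base case: Theta(1)
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])   # T(n/2)
    right = merge_sort(a[mid:])  # T(n/2)
    merged, i, j = [], 0, 0      # merge the two sorted halves in Theta(n) time
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```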

  4. Substitution • Similar to induction • Guess the form of the solution, then prove by induction that it holds for larger inputs • A powerful method, but it can sometimes be difficult to guess the solution!

  5. Substitution • Example: T(n) = 2T(n/2) + n • Guess that T(n) = O(n lg n) • We must prove that T(n) ≤ cn lg n for an appropriate constant c > 0 • Assume it holds for n/2 as well: T(n/2) ≤ c(n/2) lg(n/2). Then T(n) ≤ 2(c(n/2) lg(n/2)) + n = cn lg(n/2) + n = cn(lg n − lg 2) + n = cn lg n − cn + n ≤ cn lg n, ∀ c ≥ 1. Note: ∀ means "for all".
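A quick numeric sanity check of the guess (a sketch only: the base case T(1) = 1, the restriction to powers of 2, and the constant c = 2 are illustrative choices, not from the slides):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2*T(n/2) + n, with T(1) = 1 (assumed base case)."""
    return 1 if n == 1 else 2 * T(n // 2) + n

c = 2  # illustrative constant; any c >= 1 works once n is past the boundary
for k in range(1, 11):
    n = 2 ** k
    assert T(n) <= c * n * math.log2(n), f"bound fails at n = {n}"
    print(n, T(n), c * n * math.log2(n))
```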

  6. Subtleties • Let T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + 1 • Guess that T(n) = O(n), i.e., T(n) ≤ cn • Then T(⌈n/2⌉) ≤ c⌈n/2⌉ and T(⌊n/2⌋) ≤ c⌊n/2⌋, so T(n) ≤ c⌈n/2⌉ + c⌊n/2⌋ + 1 = cn + 1 (note the extra "+1"!), which does not imply T(n) ≤ cn • Here the guess is correct, but the induction step misses by a constant!

  7. Subtleties • We strengthen our guess by subtracting a lower-order term: T(n) ≤ cn − b • T(n) ≤ (c⌈n/2⌉ − b) + (c⌊n/2⌋ − b) + 1 = cn − 2b + 1 ≤ cn − b, ∀ b ≥ 1
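A sketch that checks the strengthened guess numerically (the base case T(1) = 1 and the constants c = 2, b = 1 are assumptions for illustration; the slides leave them unspecified):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(ceil(n/2)) + T(floor(n/2)) + 1, with T(1) = 1 (assumed base case)."""
    return 1 if n == 1 else T((n + 1) // 2) + T(n // 2) + 1

c, b = 2, 1  # illustrative constants for the strengthened guess T(n) <= c*n - b
for n in range(2, 500):
    assert T(n) <= c * n - b, f"strengthened bound fails at n = {n}"
# with this base case T(n) = 2n - 1 exactly, so the bound holds with equality
```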

  8. One Last Example • Original equation: T(n) = 2T(√n) + lg n • Let m = lg n; then T(2^m) = 2T(2^(m/2)) + m • Let S(m) = T(2^m), so S(m) = 2S(m/2) + m • We know S(m) = O(m lg m), so T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n · lg lg n)
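To see the change of variables numerically, here is a sketch restricted to n = 2^(2^k), where √n and lg n are exact, with an assumed base case T(2) = 1 (not specified on the slide):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T_pow(k):
    """T(n) for n = 2**(2**k), where T(n) = 2*T(sqrt(n)) + lg n and T(2) = 1 (assumed).
    For such n, sqrt(n) = 2**(2**(k-1)) and lg n = 2**k, so no rounding is involved."""
    return 1 if k == 0 else 2 * T_pow(k - 1) + 2 ** k

for k in range(1, 12):
    lg_n, lglg_n = 2 ** k, k              # lg n and lg lg n for n = 2**(2**k)
    print(k, T_pow(k), lg_n * lglg_n)     # T(n) tracks lg n * lg lg n to within a constant
```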

  9. Recursion Tree Method • Build a recursion tree • Sum up the cost at each level • Total cost = the sum of the per-level costs over all levels • Usually used to generate a good guess for the substitution method • Can also be used as a direct proof • Example: T(n) = 3T(n/4) + Θ(n²)

  10. [Figure: recursion tree, step 1 — the root T(n)]

  11. [Figure: step 2 — root cost cn², with three children T(n/4)]

  12. [Figure: step 3 — level 1 expanded into three nodes of cost c(n/4)², each with three children T(n/16)]

  13. [Figure: the fully expanded tree, down to T(1) leaves — level i has 3^i nodes of cost c(n/4^i)²]

  14. [Figure: per-level totals — cn² at level 0, (3/16)cn² at level 1, (3/16)²cn² at level 2, …, and Θ(n^(log₄ 3)) at the leaf level]

  15. Questions • How many levels does this tree have? • The subproblem size at depth i is n/4^i • When does the subproblem hit size 1? When n/4^i = 1, i.e., n = 4^i, i.e., i = log₄ n • Therefore, the tree has log₄ n + 1 levels (0, 1, 2, …, log₄ n) • There are 3^i nodes at depth i • The total cost at level i is 3^i · c(n/4^i)² • The last level has 3^(log₄ n) = n^(log₄ 3) nodes
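A sketch that rebuilds the per-level costs and checks them against the recurrence (the base case T(1) = c, the constant c = 1, and the restriction to powers of 4 are illustrative assumptions):

```python
from functools import lru_cache

c = 1.0  # illustrative constant standing in for the Theta(n^2) term and the leaf cost

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 3*T(n/4) + c*n**2, with T(1) = c (assumed base case)."""
    return c if n == 1 else 3 * T(n // 4) + c * n * n

def level_sum(n):
    """Sum the tree level by level: 3**i nodes of cost c*(n/4**i)**2 at depth i,
    plus 3**(log_4 n) = n**(log_4 3) leaves of cost c each."""
    depth, size = 0, n
    while size > 1:            # number of internal levels = log_4 n
        size //= 4
        depth += 1
    internal = sum(3 ** i * c * (n / 4 ** i) ** 2 for i in range(depth))
    leaves = 3 ** depth * c
    return internal + leaves

for k in range(1, 8):
    n = 4 ** k
    print(n, T(n), level_sum(n))   # the two agree (up to floating-point rounding)
```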

  16. The Master Method • When the recurrence has the form T(n) = aT(n/b) + f(n), with constants a ≥ 1 and b > 1: • If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)) • If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · lg n) • If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n))
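As a rough illustration (not from the slides), here is a small classifier sketch. It is deliberately simplified: it assumes the driving function has the pure polynomial form f(n) = Θ(n^d), so it cannot handle factors such as lg n; the function name `master_case` is a hypothetical helper.

```python
import math

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n**d) by the master theorem.
    Simplified sketch: only driving functions of the form f(n) = Theta(n**d) are handled."""
    crit = math.log(a, b)            # the critical exponent log_b a
    if math.isclose(d, crit):        # case 2: f(n) = Theta(n^(log_b a))
        return f"case 2: T(n) = Theta(n^{crit:g} * lg n)"
    if d < crit:                     # case 1: f(n) polynomially smaller than n^(log_b a)
        return f"case 1: T(n) = Theta(n^{crit:g})"
    # case 3: f(n) polynomially larger; the regularity condition a*f(n/b) <= c*f(n)
    # holds automatically here because a/b**d < 1 whenever d > log_b a
    return f"case 3: T(n) = Theta(n^{d:g})"
```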

  17. Example • T(n) = 9T(n/3) + n • a = 9, b = 3, f(n) = n, thus n^(log_b a) = n^(log_3 9) = n² • f(n) = n = O(n^(log_3 9 − ε)), where ε = 1 • So case 1 applies, thus T(n) = Θ(n²) • T(n) = T(2n/3) + 1 • a = 1, b = 3/2, f(n) = 1, thus n^(log_b a) = n^(log_(3/2) 1) = n⁰ = 1 • Case 2 applies, thus T(n) = Θ(lg n)

  18. Example … • T(n) = 3T(n/4) + n lg n • a = 3, b = 4, f(n) = n lg n • n^(log_b a) = n^(log_4 3) = O(n^0.793) • f(n) = Ω(n^(log_4 3 + ε)), where ε ≈ 0.2 (solve for it) • For large n, a·f(n/b) = 3(n/4) lg(n/4) ≤ (3/4) n lg n = c·f(n) for c = 3/4 • Case 3 applies, so T(n) = Θ(n lg n)

  19. When it doesn’t work… • T(n) = 2T(n/2) + n lg n • a = 2, b = 2, f(n) = n lg n • You would think that case 3 should apply, since f(n) = n lg n grows faster than n^(log_b a) = n • But f(n) is not polynomially larger! • Because f(n)/n^(log_b a) = (n lg n)/n = lg n, which is asymptotically less than n^ε for every constant ε > 0, so no case of the master theorem applies.
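Applying the simplified classifier sketched after slide 16 to the earlier examples (output strings are illustrative):

```python
print(master_case(9, 3, 1))    # slide 17: T(n) = 9T(n/3) + n   -> case 1, Theta(n^2)
print(master_case(1, 1.5, 0))  # slide 17: T(n) = T(2n/3) + 1   -> case 2, Theta(lg n)
print(master_case(8, 2, 4))    # hypothetical extra example     -> case 3, Theta(n^4)
# The slide-18 and slide-19 recurrences both have f(n) = n lg n, which is not of the
# form n**d, so this simplified sketch cannot decide them; slide 18 still falls under
# case 3 of the full theorem, while slide 19 falls outside the theorem entirely.
```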
