
Lecture 5


palani


  1. Lecture 5 • Today: how to solve recurrences • We learned the "guess and prove by induction" method • We also learned the "substitution" method • Today, we learn the "master theorem" • More divide and conquer: • closest pair problem • matrix multiplication

  2. Master Theorem (CLRS, Theorem 4.1) Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by T(n) = aT(n/b) + f(n). Then: • If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)). • If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n). • If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
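For the common special case where the driving function is a plain polynomial, f(n) = Θ(n^d), the three cases reduce to comparing d against the critical exponent log_b a. A minimal Python sketch of that comparison (the function name and output strings are illustrative, not from the slides):

```python
import math

def master_theorem(a, b, d):
    """T(n) = a*T(n/b) + Theta(n^d): compare d against the critical
    exponent log_b a and return the Theta bound as a string."""
    c = math.log(a, b)              # critical exponent log_b a
    if d < c:                       # case 1: leaves dominate
        return f"Theta(n^{c:.3f})"
    elif d == c:                    # case 2 (k = 0): every level contributes equally
        return f"Theta(n^{d} * lg n)"
    else:                           # case 3: root dominates (regularity is
        return f"Theta(n^{d})"      # automatic for polynomial f)

print(master_theorem(2, 2, 1))   # mergesort: Theta(n^1 * lg n)
print(master_theorem(3, 2, 1))   # Karatsuba: Theta(n^1.585)
```

For driving functions with log factors (or recurrences like 4T(n/2) + n²/lg n), no such shortcut exists; the theorem's full statement, or none of its cases, applies.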

  3. Note • The theorem applies only to a particular family of recurrences. • f(n) must be positive for large n. • The key is to compare f(n) with n^(log_b a). • Case 2, more generally: if f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0, then T(n) = Θ(n^(log_b a) lg^(k+1) n). • Sometimes the theorem does not apply. Ex.: T(n) = 4T(n/2) + n²/lg n.

  4. Proof ideas of the Master Theorem • Consider a tree with T(n) at the root, and apply the recursion to each node until we get down to T(1) at the leaves. The first recursion is T(n) = aT(n/b) + f(n), so assign a cost of f(n) to the root. At the next level we have a nodes, each with a cost of T(n/b); applying the recursion again gives a cost of a·f(n/b) for all of these. At the next level we have a² nodes, each with a cost of T(n/b²), for a total cost of a²·f(n/b²). We continue down to T(1) at the leaves. There are a^(log_b n) = n^(log_b a) leaves and each costs Θ(1), which gives Θ(n^(log_b a)). The total cost associated with f is Σ_{i=0}^{log_b n − 1} aⁱ·f(n/bⁱ). • Thus T(n) = Θ(n^(log_b a)) + Σ_{i=0}^{log_b n − 1} aⁱ·f(n/bⁱ). • The three cases now come from deciding which term is dominant. In case 1, the Θ term is dominant. In case 2, the terms are roughly equal (but the second term has an extra lg n factor). In case 3, the f(n) term is dominant. The details are somewhat painful, but can be found in CLRS, pp. 76–84.
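The recursion-tree accounting above can be checked numerically: evaluating T(n) directly from the recurrence agrees with the level-by-level sum plus the leaf term. A small sketch (assumes n is a power of b and T(1) = 1; function names are illustrative):

```python
def T(n, a, b, f):
    """Evaluate T(n) = a*T(n/b) + f(n) directly, with T(1) = 1.
    Assumes n is a power of b."""
    if n <= 1:
        return 1
    return a * T(n // b, a, b, f) + f(n)

def tree_sum(n, a, b, f):
    """Recursion-tree total: sum of a^i * f(n / b^i) over the internal
    levels, plus a^(log_b n) leaves costing 1 each."""
    total, size, width = 0, n, 1
    while size > 1:
        total += width * f(size)   # level i contributes a^i * f(n / b^i)
        size //= b
        width *= a
    return total + width           # width is now a^(log_b n) = number of leaves

# Mergesort-style recurrence T(n) = 2T(n/2) + n:
f = lambda m: m
assert T(1024, 2, 2, f) == tree_sum(1024, 2, 2, f)
```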

  5. Idea of the master theorem [Figure: recursion tree with f(n) at the root; a children each costing f(n/b), a² grandchildren each costing f(n/b²), and so on, down to T(1) at the leaves; height h = log_b n; #leaves = a^h = a^(log_b n) = n^(log_b a), contributing n^(log_b a)·T(1).]

  6. Three common cases Compare f(n) with n^(log_b a): • Case 1: f(n) = O(n^(log_b a − ε)) for some constant ε > 0. • f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor). • Solution: T(n) = Θ(n^(log_b a)).

  7. Idea of the master theorem [Figure: recursion tree for CASE 1, height h = log_b n.] The per-level costs f(n), a·f(n/b), a²·f(n/b²), … increase geometrically from top to bottom, so only the bottom term is needed. The weight increases geometrically from the root to the leaves; the leaves hold a constant fraction of the total weight, giving Θ(n^(log_b a)).

  8. Case 2 Compare f(n) with n^(log_b a): • f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0. • f(n) and n^(log_b a) grow at similar rates. • This is clear for k = 0. For k > 0, the intuition is that the lg^k n factor remains for a constant fraction of the levels, so the levels sum to the bound below. • Solution: T(n) = Θ(n^(log_b a) lg^(k+1) n).

  9. Idea of the master theorem [Figure: recursion tree for CASE 2 (k = 0), height h = log_b n.] All levels carry approximately the same weight, so the log_b n levels give Θ(n^(log_b a) lg n).

  10. Case 3 Compare f(n) with n^(log_b a): • f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0. • f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor), • and f(n) satisfies the regularity condition that a·f(n/b) ≤ c·f(n) for some constant c < 1. Then aᵏ·f(n/bᵏ) decreases geometrically, so the sum is Θ(f(n)). • Solution: T(n) = Θ(f(n)).

  11. Idea of the master theorem [Figure: recursion tree for CASE 3, height h = log_b n, where a·f(n/b) ≤ c·f(n) for some c < 1.] The weight decreases geometrically from the root to the leaves; the root holds a constant fraction of the total weight, giving Θ(f(n)).

  12. Examples for the Master Theorem • The Karatsuba recurrence has a = 3, b = 2, f(n) = cn. Then case 1 applies, and so T(n) = Θ(n^(log_2 3)), as we found. • The mergesort recurrence has a = 2, b = 2, f(n) = n. Then case 2 applies, and so T(n) = Θ(n lg n). • Finally, a recurrence like T(n) = 3T(n/2) + n² gives rise to case 3. In this case f(n) = n², so 3f(n/2) = 3(n/2)² = (3/4)n² ≤ cn² for c = 3/4 < 1, and so T(n) = Θ(n²). • Note that the master theorem does not cover all cases. In particular, it does not cover T(n) = 2T(n/2) + n/lg n: here n^(log_2 2) = n, and f(n) = n/lg n grows slower than n but not polynomially slower, so none of the three cases applies.
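The case-3 example can be sanity-checked numerically: for T(n) = 3T(n/2) + n², the ratio T(n)/n² should approach a constant (in fact 4, the limit of the geometric series Σ(3/4)ⁱ):

```python
def T(n):
    # T(n) = 3*T(n/2) + n^2 with T(1) = 1; n a power of 2
    return 1 if n <= 1 else 3 * T(n // 2) + n * n

for k in (4, 8, 16):
    n = 2 ** k
    print(n, T(n) / n ** 2)   # ratio approaches 4 from below
```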

  13. Closest pair problem • Input: • A set of points P = {p1,…, pn} in two dimensions • Output: • The pair of points pi, pj that minimize the Euclidean distance between them.

  14. Distances • Euclidean distance: d(p, q) = √((px − qx)² + (py − qy)²)

  15. Closest Pair Problem

  16. Closest Pair Problem

  17. Divide and Conquer • An O(n²)-time algorithm is easy • Assumptions: • No two points have the same x-coordinate • No two points have the same y-coordinate • How do we solve this problem in 1 dimension? • Sort the numbers and walk from left to right to find the minimum gap.
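The 1-dimensional sort-and-scan idea in the last bullet, as a short Python sketch (the function name is illustrative):

```python
def min_gap(xs):
    """1-D closest pair: sort, then a single left-to-right scan of
    adjacent gaps. O(n lg n) for the sort, O(n) for the scan."""
    xs = sorted(xs)
    return min(b - a for a, b in zip(xs, xs[1:]))

print(min_gap([7, 1, 9, 4]))   # 2, from the pair (7, 9)
```

After sorting, the closest pair must be adjacent, which is exactly what fails in two dimensions and motivates the divide-and-conquer below.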

  18. Divide and Conquer • Divide and conquer has a chance to do better than O(n²). • We first sort the points by their x-coordinates, and also sort them by their y-coordinates.

  19. Closest Pair Problem

  20. Divide and Conquer for the Closest Pair Problem Divide by x-median

  21. Divide [Figure: point set divided into halves L and R by the x-median.]

  22. Conquer [Figure: halves L and R.] Recursively solve L and R.

  23. Combination I [Figure: closest-pair distances d1 within L and d2 within R.] Take the smaller of d1 and d2: d = min(d1, d2).

  24. Combination II Is there a point in L and a point in R whose distance is smaller than d = min(d1, d2)?

  25. Combination II • If the answer is "no", then we are done! • If the answer is "yes", then the closest such pair forms the closest pair for the entire set. • How do we determine this?

  26. Combination II Is there a point in L and a point in R whose distance is smaller than d?

  27. Combination II Is there a point in L and a point in R whose distance is smaller than d? We need only consider the narrow band of points within distance d of the dividing line, which can be extracted in O(n) time.

  28. Combination II Is there a point in L and a point in R whose distance is smaller than d? Denote the set of points in this band by S, and let Sy be the list of S sorted by y-coordinate.

  29. Combination II • There exist a point in L and a point in R whose distance is less than d if and only if there exist two points in S whose distance is less than d. • If S is the whole point set, did we gain anything? • CLAIM: If s and t in S have the property that ‖s − t‖ < d, then s and t are within 15 positions of each other in the sorted list Sy.

  30. Combination II Is there a point in L and a point in R whose distance is smaller than d? There is at most one point in each d/2 × d/2 box (two points in the same box would lie on the same side of the dividing line at distance less than d, contradicting d = min(d1, d2)). Thus s and t cannot be too far apart in Sy.

  31. Closest-Pair • Preprocessing: • Construct Px and Py, the lists of points sorted by x- and y-coordinate • Closest-Pair(P, Px, Py) • Divide: • Construct L, Lx, Ly and R, Rx, Ry • Conquer: • Let d1 = Closest-Pair(L, Lx, Ly) • Let d2 = Closest-Pair(R, Rx, Ry) • Combination: • Let d = min(d1, d2) • Construct S and Sy • For each point in Sy, check each of the next 15 points down the list • If some distance is less than d, update d to this smaller distance
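A runnable Python sketch of the outline above (helper names like `_rec` are illustrative; assumes at least two distinct points, and follows the slides in checking the next 15 points in Sy):

```python
import math

def closest_pair(points):
    """Divide-and-conquer closest pair over (x, y) tuples.
    Returns the smallest pairwise Euclidean distance; O(n lg n) overall."""
    px = sorted(points)                          # sorted by x
    py = sorted(points, key=lambda p: p[1])      # sorted by y
    return _rec(px, py)

def _dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def _rec(px, py):
    n = len(px)
    if n <= 3:                                   # base case: brute force
        return min(_dist(px[i], px[j])
                   for i in range(n) for j in range(i + 1, n))
    mid = n // 2
    x_median = px[mid][0]
    lx, rx = px[:mid], px[mid:]                  # divide by x-median
    left_set = set(lx)
    ly = [p for p in py if p in left_set]        # keep y-order in each half
    ry = [p for p in py if p not in left_set]
    d = min(_rec(lx, ly), _rec(rx, ry))          # conquer
    # combine: band of points within d of the dividing line, in y-order
    sy = [p for p in py if abs(p[0] - x_median) < d]
    for i, p in enumerate(sy):
        for q in sy[i + 1 : i + 16]:             # next 15 points suffice (CLAIM above)
            d = min(d, _dist(p, q))
    return d

pts = [(0, 0), (5, 4), (3, 1), (9, 6), (2, 7), (3.5, 1.5)]
print(closest_pair(pts))
```

The partition of py into ly and ry preserves the y-sorted order without re-sorting, which is what keeps the combine step linear.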

  32. Complexity Analysis • Preprocessing takes O(n lg n) time • Divide takes O(n) time • Conquer takes 2T(n/2) time • Combination takes O(n) time • So T(n) = 2T(n/2) + cn, and the total is O(n lg n) time.

  33. Matrix Multiplication • Suppose we multiply two N×N matrices together. • The regular method uses N·N·N = N³ multiplications. • O(N³)

  34. Can we Divide and Conquer? Split each of A, B, and C = A·B into four N/2 × N/2 blocks. Then: C11 = A11·B11 + A12·B21, C12 = A11·B12 + A12·B22, C21 = A21·B11 + A22·B21, C22 = A21·B12 + A22·B22. Complexity: T(N) = 8T(N/2) + O(N²) = O(N^(log_2 8)) = O(N³). No improvement.
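The blocked scheme above, as a Python sketch for N a power of 2 (helper names are illustrative); it performs 8 recursive half-size products, hence T(N) = 8T(N/2) + O(N²):

```python
def mat_mul(A, B):
    """Blocked divide-and-conquer multiply: 8 recursive N/2-size
    products. Square lists-of-lists; assumes N is a power of 2."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    C11 = mat_add(mat_mul(A11, B11), mat_mul(A12, B21))
    C12 = mat_add(mat_mul(A11, B12), mat_mul(A12, B22))
    C21 = mat_add(mat_mul(A21, B11), mat_mul(A22, B21))
    C22 = mat_add(mat_mul(A21, B12), mat_mul(A22, B22))
    return ([r1 + r2 for r1, r2 in zip(C11, C12)] +
            [r1 + r2 for r1, r2 in zip(C21, C22)])

def split(M):
    """Split M into its four N/2 x N/2 quadrants."""
    h = len(M) // 2
    return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
            [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])

def mat_add(X, Y):
    return [[x + y for x, y in zip(r, s)] for r, s in zip(X, Y)]

print(mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))   # [[19, 22], [43, 50]]
```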

  35. Strassen's Matrix Multiplication (Volker Strassen) P1 = (A11 + A22)·(B11 + B22), P2 = (A21 + A22)·B11, P3 = A11·(B12 − B22), P4 = A22·(B21 − B11), P5 = (A11 + A12)·B22, P6 = (A21 − A11)·(B11 + B12), P7 = (A12 − A22)·(B21 + B22). Then: C11 = P1 + P4 − P5 + P7, C12 = P3 + P5, C21 = P2 + P4, C22 = P1 + P3 − P2 + P6. And do this recursively as usual.
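Strassen's seven products and four combinations, as a self-contained Python sketch for N a power of 2 (helper names are illustrative):

```python
def strassen(A, B):
    """Strassen multiply: 7 half-size products instead of 8.
    Square lists-of-lists; assumes N is a power of 2."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    A11, A12, A21, A22 = quads(A)
    B11, B12, B21, B22 = quads(B)
    P1 = strassen(add(A11, A22), add(B11, B22))
    P2 = strassen(add(A21, A22), B11)
    P3 = strassen(A11, sub(B12, B22))
    P4 = strassen(A22, sub(B21, B11))
    P5 = strassen(add(A11, A12), B22)
    P6 = strassen(sub(A21, A11), add(B11, B12))
    P7 = strassen(sub(A12, A22), add(B21, B22))
    C11 = add(sub(add(P1, P4), P5), P7)
    C12 = add(P3, P5)
    C21 = add(P2, P4)
    C22 = add(sub(add(P1, P3), P2), P6)
    return ([r1 + r2 for r1, r2 in zip(C11, C12)] +
            [r1 + r2 for r1, r2 in zip(C21, C22)])

def quads(M):
    """Split M into its four N/2 x N/2 quadrants."""
    h = len(M) // 2
    return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
            [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])

add = lambda X, Y: [[x + y for x, y in zip(r, s)] for r, s in zip(X, Y)]
sub = lambda X, Y: [[x - y for x, y in zip(r, s)] for r, s in zip(X, Y)]

print(strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]))   # [[19, 22], [43, 50]]
```

Trading one multiplication for a constant number of extra additions is what drops the exponent from log₂ 8 = 3 to log₂ 7 ≈ 2.81.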

  36. Time analysis • T(n) = 7T(n/2) + O(n²) = Θ(n^(log_2 7)) ≈ Θ(n^2.81) by the Master Theorem (case 1, since log_2 7 > 2). • Best bound: O(n^2.376) by Coppersmith–Winograd. • Best known (trivial) lower bound: Ω(n²). • Open: what is the true complexity of matrix multiplication?
