
Divide and Conquer


Presentation Transcript


  1. Divide and Conquer

  2. Recall • Divide the problem into a number of sub-problems that are smaller instances of the same problem. • Conquer the sub-problems by solving them recursively. If the sub-problem sizes are small enough, however, just solve the sub-problems in a straightforward manner. • Combine the solutions to the sub-problems into the solution for the original problem.
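
A minimal Python sketch of this pattern, using merge sort as the running example (the function and variable names here are illustrative, not taken from the slides):

    def merge_sort(a):
        # Base case: a list of length 0 or 1 is already sorted.
        if len(a) <= 1:
            return a
        # Divide: split the input into two smaller instances.
        mid = len(a) // 2
        # Conquer: sort each half recursively.
        left = merge_sort(a[:mid])
        right = merge_sort(a[mid:])
        # Combine: merge the two sorted halves into one sorted list.
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged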

  3. Recurrences • A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs. • We define the running time of MERGE-SORT via a recurrence. • We will study three methods to solve recurrences, that is, for obtaining asymptotic “Θ” or “O” bounds on the solution: • Substitution method • Recurrence tree • Master Method

  4. The maximum-sub-array problem • Suppose you can invest in a corporation whose stock price fluctuates from day to day. • You are allowed to buy one unit of stock only once and then sell it at a later date, with both the purchase and the sale taking place after the close of trading for the day. • Your goal is to maximize your profit. • Ideally, you would want to “buy low, sell high”: buy at the lowest possible price and later sell at the highest possible price.

  5. The maximum-sub-array problem

  6. The maximum-sub-array problem

  7. The maximum-sub-array problem A brute-force solution: just try every possible pair of buy and sell dates in which the buy date precedes the sell date. A period of n days has C(n, 2) = n(n − 1)/2 such pairs of dates. Since C(n, 2) is Θ(n²) and the best we can hope for is to evaluate each pair of dates in constant time, this approach would take Ω(n²) time.
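
A direct Python sketch of this brute-force approach (the names are illustrative; it assumes prices[i] is the price after day i's close):

    def brute_force_best_profit(prices):
        # Try every buy day i and every later sell day j; Θ(n^2) pairs.
        best = 0
        for i in range(len(prices)):
            for j in range(i + 1, len(prices)):
                best = max(best, prices[j] - prices[i])
        return best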

  8. The maximum-sub-array problem - solution • Let us instead consider the daily change in price, where the change on day i is the difference between the prices after day i − 1 and after day i. • If we treat this sequence of changes as an array A, we now want to find the nonempty, contiguous sub-array of A whose values have the largest sum. • We call this contiguous sub-array the maximum sub-array.
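
For instance, with a small hypothetical price series, the change array is computed as follows; the best profit is exactly the maximum sub-array sum of that array:

    prices = [10, 11, 7, 10, 6]   # hypothetical closing prices
    # A[i] is the change from day i to day i + 1
    A = [prices[i + 1] - prices[i] for i in range(len(prices) - 1)]
    # A == [1, -4, 3, -4]; the maximum sub-array is [3] with sum 3,
    # which corresponds to buying at 7 and selling at 10.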

  9. A solution using divide-and-conquer • Divide A[low…high] at its midpoint mid; any maximum sub-array A[i…j] must then lie: • entirely in the sub-array A[low…mid], so that low ≤ i ≤ j ≤ mid, • entirely in the sub-array A[mid + 1…high], so that mid < i ≤ j ≤ high, or • crossing the midpoint, so that low ≤ i ≤ mid < j ≤ high.

  10. The procedure to find a maximum sub-array crossing the midpoint takes Θ(n) time
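
A Python sketch of that crossing procedure (following the FIND-MAX-CROSSING-SUBARRAY idea; the exact names are mine). It scans left from mid and right from mid + 1, so it does Θ(n) work:

    def max_crossing_subarray(A, low, mid, high):
        # Best sum of A[i..mid], scanning leftward from mid.
        left_sum, total, max_left = float('-inf'), 0, mid
        for i in range(mid, low - 1, -1):
            total += A[i]
            if total > left_sum:
                left_sum, max_left = total, i
        # Best sum of A[mid+1..j], scanning rightward from mid + 1.
        right_sum, total, max_right = float('-inf'), 0, mid + 1
        for j in range(mid + 1, high + 1):
            total += A[j]
            if total > right_sum:
                right_sum, max_right = total, j
        return max_left, max_right, left_sum + right_sum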

  11. Maximum sub-array recursive procedure
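
The slide's pseudocode image is not in the transcript; a sketch of the recursive procedure in the same spirit (it assumes the max_crossing_subarray helper above):

    def max_subarray(A, low, high):
        # Base case: a single element is its own maximum sub-array.
        if low == high:
            return low, high, A[low]
        mid = (low + high) // 2
        # Best sub-array entirely in the left half, entirely in the
        # right half, and the best sub-array crossing the midpoint.
        l_lo, l_hi, l_sum = max_subarray(A, low, mid)
        r_lo, r_hi, r_sum = max_subarray(A, mid + 1, high)
        c_lo, c_hi, c_sum = max_crossing_subarray(A, low, mid, high)
        if l_sum >= r_sum and l_sum >= c_sum:
            return l_lo, l_hi, l_sum
        elif r_sum >= l_sum and r_sum >= c_sum:
            return r_lo, r_hi, r_sum
        else:
            return c_lo, c_hi, c_sum

Called as max_subarray(A, 0, len(A) - 1), it returns the bounds and sum of a maximum sub-array.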

  12. Analyzing the divide-and-conquer algorithm • For starters, line 1 takes constant time. The base case, when n = 1, is easy: line 2 takes constant time, and so T(1) = Θ(1) • For the recursive case (n > 1), we have: • Lines 1 and 3 take constant time • In lines 4 and 5 we solve two sub-problems of size n/2, spending T(n/2) time on each, for 2T(n/2) in total • In line 6 we call FIND-MAX-CROSSING-SUBARRAY, which takes Θ(n) • Lines 7–11 take only Θ(1) time
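
Putting these pieces together gives the usual recurrence for this algorithm (the slide's formula image is not in the transcript; in LaTeX notation):

    T(n) =
    \begin{cases}
      \Theta(1)            & \text{if } n = 1,\\
      2T(n/2) + \Theta(n)  & \text{if } n > 1,
    \end{cases}
    \qquad\text{which solves to } T(n) = \Theta(n \lg n).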

  13. Solving Recurrences We have three methods to solve recurrence equations • Substitution Method • Recurrence Tree Method • Master Method

  14. Substitution Method The substitution method for solving recurrences comprises two steps: • Guess the form of the solution. • Use mathematical induction to find the constants and show that the solution works.

  15. Example Consider the recurrence T(n) = 2T(⌊n/2⌋) + n • We guess that the solution is T(n) = O(n lg n) • The substitution method requires us to prove that T(n) ≤ cn lg n for some constant c > 0 • We start by assuming that this bound holds for all positive m < n, in particular for m = ⌊n/2⌋ • This gives a bound on T(⌊n/2⌋) • Substituting that bound into the recurrence yields the derivation shown below
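
A standard derivation for this substitution step (in LaTeX notation), assuming the inductive hypothesis T(m) ≤ cm lg m for m = ⌊n/2⌋:

    T(\lfloor n/2 \rfloor) \le c \lfloor n/2 \rfloor \lg(\lfloor n/2 \rfloor),
    \text{ so }
    T(n) \le 2\bigl(c \lfloor n/2 \rfloor \lg(\lfloor n/2 \rfloor)\bigr) + n
         \le cn \lg(n/2) + n
         = cn \lg n - cn \lg 2 + n
         = cn \lg n - cn + n
         \le cn \lg n \quad \text{for } c \ge 1.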

  16. Example - continued Now we must show that this solution holds for the boundary conditions • Let us assume, for the sake of argument, that T(1) = 1 • For n = 1, the bound gives T(1) ≤ c · 1 · lg 1 = 0, which conflicts with T(1) = 1 • We only need the bound to hold for all n ≥ n₀, where n₀ is a constant of our choosing • For n > 3, the recurrence does not depend directly on T(1), so we can take T(2) and T(3) as the base cases of the induction • This is the distinction between the base case of the recurrence (n = 1) and the base cases of the inductive proof (n = 2 and n = 3) • We derive from the recurrence that T(2) = 4 and T(3) = 5 • We can complete the inductive proof that T(n) ≤ cn lg n for some constant c ≥ 1 by choosing c large enough so that T(2) ≤ c · 2 lg 2 and T(3) ≤ c · 3 lg 3 • Any c ≥ 2 works: 2 · 2 lg 2 = 4 ≥ T(2) and 2 · 3 lg 3 ≈ 9.5 ≥ T(3)

  17. Another Example • Make sure you prove exactly the same form as the inductive hypothesis when doing a substitution proof. • Consider the recurrence T(n) = 8T(n/2) + Θ(n²) • For an upper bound: T(n) ≤ 8T(n/2) + cn². Guess: T(n) ≤ dn³. Then T(n) ≤ 8d(n/2)³ + cn² = 8d(n³/8) + cn² = dn³ + cn², which is not ≤ dn³, so this guess doesn't work!

  18. Another Example - continued • Remedy: subtract off a lower-order term. Guess: T(n) ≤ dn³ − d′n². Then T(n) ≤ 8(d(n/2)³ − d′(n/2)²) + cn² = 8d(n³/8) − 8d′(n²/4) + cn² = dn³ − 2d′n² + cn² = dn³ − d′n² − d′n² + cn² ≤ dn³ − d′n², provided −d′n² + cn² ≤ 0, i.e., d′ ≥ c

  19. Yet another example • T(n) = cn + 3T(2n/3) • How about F(n) = n lg n, i.e., the guess T(n) ≤ kn lg n? • Substituting into the right-hand side: cn + 3kF(2n/3) = cn + 3k(2n/3) lg(2n/3) = cn + 2kn lg n + 2kn lg(2/3) = cn + 2kn lg n − 2kn lg(3/2) • Because of the 2kn lg n term, there is no way to choose k to make the left side, kn lg n, at least as large • Therefore, n lg n is not the correct guess

  20. Yet another example - continued • Try a higher order of growth, like n² or n³, but which one? Maybe n^x • We can solve for the correct exponent x by plugging in kn^x: cn + 3T(2n/3) becomes cn + 3k(2/3)^x n^x • This will be asymptotically less than kn^x as long as 3(2/3)^x < 1, which requires x > log_{3/2} 3 • Let a = log_{3/2} 3; then our algorithm is O(n^(a+ε)) for any positive ε • Let's try O(n^a) itself: the RHS after substituting kn^a is cn + 3(2/3)^a kn^a = cn + kn^a ≥ kn^a, since 3(2/3)^a = 1 • This tells us that kn^a is an asymptotic lower bound on T(n): T(n) is Ω(n^a). So the complexity is somewhere between Ω(n^a) and O(n^(a+ε)). It is in fact Θ(n^a). • To show the upper bound, we will try F(n) = n^a + bn, where b is a constant to be filled in later.

  21. Yet another example - continued • The idea is to pick b so that the bn term will compensate for the cn term that shows up in the recurrence. • Because bn is O(n^a), showing that T(n) is O(n^a + bn) is the same as showing that it is O(n^a). Substituting kF(n) for T(n) in the RHS of the recurrence, we obtain: cn + 3kF(2n/3) = cn + 3k((2n/3)^a + b(2n/3)) = cn + 3k(2n/3)^a + 3kb(2n/3) = cn + kn^a + 2kbn = kn^a + (2kb + c)n • The substituted LHS of the recurrence is kn^a + kbn, which is at least as large as kn^a + (2kb + c)n as long as kb ≥ 2kb + c, that is, b ≤ −c/k. There is no requirement that b be positive, so choosing k = 1 and b = −c satisfies the recurrence. • Therefore T(n) = O(n^a + bn) = O(n^a), and since T(n) is both O(n^a) and Ω(n^a), it is Θ(n^a).

  22. Substitution method - warning • Be careful when using asymptotic notation. • A false proof for the recurrence T(n) = 4T(n/4) + n that T(n) = O(n): T(n) ≤ 4(c(n/4)) + n = cn + n = O(n)? Wrong! • Because we haven't proven the exact form of the inductive hypothesis (which is that T(n) ≤ cn), this proof is invalid: cn + n is not ≤ cn.

  23. Recursion tree method • Used to generate a good guess, which is then verified by the substitution method. • T(n) = T(n/3) + T(2n/3) + Θ(n) • For an upper bound, rewrite as T(n) ≤ T(n/3) + T(2n/3) + cn; • for a lower bound, as T(n) ≥ T(n/3) + T(2n/3) + cn (with possibly different constants c). • By summing across each level, the recursion tree shows the cost at each level of recursion (excluding the costs of recursive calls, which appear in subtrees):

  24. Recursion tree method

  25. Recursion tree method • There are log_3 n full levels, and after log_{3/2} n levels, the problem size is down to 1. • Each level contributes a cost of at most cn. • Lower bound guess: T(n) ≥ dn log_3 n = Ω(n lg n) for some positive constant d. • Upper bound guess: T(n) ≤ dn log_{3/2} n = O(n lg n) for some positive constant d. • Then prove both by substitution.

  26. Recursion tree method Upper bound: Guess T(n) ≤ dn lg n. Substitution:
  T(n) ≤ T(n/3) + T(2n/3) + cn
       ≤ d(n/3) lg(n/3) + d(2n/3) lg(2n/3) + cn
       = (d(n/3) lg n − d(n/3) lg 3) + (d(2n/3) lg n − d(2n/3) lg(3/2)) + cn
       = dn lg n − d((n/3) lg 3 + (2n/3) lg(3/2)) + cn
       = dn lg n − d((n/3) lg 3 + (2n/3) lg 3 − (2n/3) lg 2) + cn
       = dn lg n − dn(lg 3 − 2/3) + cn
       ≤ dn lg n, provided −dn(lg 3 − 2/3) + cn ≤ 0, i.e., d ≥ c/(lg 3 − 2/3).
  Lower bound?

  27. Another example • T(n) = 3T(n/4) + cn².

  28. Another example

  29. Another example • The sub-problem size for a node at depth i is n/4^i • Thus, the sub-problem size hits n = 1 when n/4^i = 1 or, equivalently, when i = log_4 n • Thus, the tree has log_4 n + 1 levels (at depths 0, 1, 2, …, log_4 n) • The number of nodes at depth i is 3^i • Each node at depth i, for i = 0, 1, 2, …, log_4 n − 1, has a cost of c(n/4^i)² • So the total cost at depth i is (3/16)^i cn² • The bottom level, at depth log_4 n, has 3^(log_4 n) = n^(log_4 3) nodes, each costing T(1), for a total cost of Θ(n^(log_4 3))

  30. Another example • Taking advantage of a decreasing geometric sequence
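
The summation on this slide is an image not captured in the transcript; the standard bound, using the fact that the per-level costs form a decreasing geometric series with ratio 3/16 (in LaTeX notation):

    T(n) = \sum_{i=0}^{\log_4 n - 1} \Bigl(\tfrac{3}{16}\Bigr)^{i} c n^{2}
           + \Theta\bigl(n^{\log_4 3}\bigr)
         < \sum_{i=0}^{\infty} \Bigl(\tfrac{3}{16}\Bigr)^{i} c n^{2}
           + \Theta\bigl(n^{\log_4 3}\bigr)
         = \tfrac{16}{13}\, c n^{2} + \Theta\bigl(n^{\log_4 3}\bigr)
         = O(n^{2}).

This O(n²) guess can then be verified by the substitution method.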

  31. Another example

  32. Master Method • Used for many divide-and-conquer recurrences of the form • T(n) = aT(n/b) + f(n), • where a ≥ 1, b > 1, and f(n) is asymptotically positive. • Based on the master theorem (next slide)

  33. Master Method
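
The theorem shown on this slide (as an image) is the standard statement: for T(n) = aT(n/b) + f(n) with a ≥ 1, b > 1, and f(n) asymptotically positive, compare f(n) with n^(log_b a) (in LaTeX notation):

    T(n) =
    \begin{cases}
      \Theta\bigl(n^{\log_b a}\bigr)        & \text{if } f(n) = O\bigl(n^{\log_b a - \epsilon}\bigr) \text{ for some } \epsilon > 0,\\
      \Theta\bigl(n^{\log_b a} \lg n\bigr)  & \text{if } f(n) = \Theta\bigl(n^{\log_b a}\bigr),\\
      \Theta\bigl(f(n)\bigr)                & \text{if } f(n) = \Omega\bigl(n^{\log_b a + \epsilon}\bigr) \text{ for some } \epsilon > 0
                                              \text{ and } a f(n/b) \le c f(n) \text{ for some } c < 1 \text{ and all large } n.
    \end{cases}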

  34. Examples

  35. Examples

  36. Examples

  37. Examples
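
The example slides are images not captured in the transcript; the following are standard illustrations of the three cases (they may differ from the slides' own examples):

• T(n) = 9T(n/3) + n: here n^(log_3 9) = n², and f(n) = n = O(n^(2−ε)), so case 1 gives T(n) = Θ(n²).
• T(n) = T(2n/3) + 1: here log_{3/2} 1 = 0, so n^0 = 1, and f(n) = Θ(1), so case 2 gives T(n) = Θ(lg n).
• T(n) = 3T(n/4) + n lg n: here n^(log_4 3) = O(n^0.793), f(n) = n lg n = Ω(n^(log_4 3 + ε)), and the regularity condition 3(n/4) lg(n/4) ≤ (3/4) n lg n holds, so case 3 gives T(n) = Θ(n lg n).
• T(n) = 2T(n/2) + n lg n: the master theorem does not apply, because n lg n is larger than n^(log_2 2) = n but not polynomially larger.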
