
Introduction to Algorithms



  1. Introduction to Algorithms Jiafen Liu Sept. 2013

  2. Today’s task • Develop more asymptotic notations • How to solve recurrences

  3. Θ-notation • Math: Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 } • Engineering: • drop low-order terms; • ignore leading constants.
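The definition can be spot-checked numerically. In this sketch the function f(n) = 3n² + 2n and the witnesses c1 = 3, c2 = 5, n0 = 1 are illustrative choices, not from the slides:

```python
# Check the Theta-notation definition on a finite range:
# 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
def f(n):
    return 3 * n**2 + 2 * n   # example function, intuitively Theta(n^2)

def g(n):
    return n**2

c1, c2, n0 = 3, 5, 1          # 3n^2 <= 3n^2 + 2n <= 5n^2 holds for n >= 1
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
```

Dropping the low-order term 2n and the leading constant 3 is exactly the "engineering" reading: f(n) = Θ(n²).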

  4. O-notation • Another asymptotic symbol, used to indicate upper bounds. • Math: f(n) = O(g(n)) means there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0. • Ex: 2n² = O(n³) (c = 1, n0 = 2) • Here “=” means a “one-way” equality.

  5. Set definition of O-notation • Math: O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 } • Looks better? • Ex: 2n² ∈ O(n³) • O-notation corresponds roughly to “less than or equal to”.
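The set-membership reading lends itself to a small checker. The helper `in_big_o` below is an illustrative sketch (it can only spot-check a finite range, not prove membership), using the slide's example 2n² ∈ O(n³) with witnesses c = 1, n0 = 2:

```python
# Spot-check the O-notation inequality 0 <= f(n) <= c*g(n) for n in [n0, n_max).
def in_big_o(f, g, c, n0, n_max=1000):
    """Finite-range check of the witnesses (c, n0) for f(n) = O(g(n))."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max))

assert in_big_o(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2)       # 2n^2 in O(n^3)
assert not in_big_o(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=1)   # fails at n = 1
```

The second assertion shows why the witness n0 = 2 matters: at n = 1 we have 2n² = 2 > 1 = n³.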

  6. Usage of O-notation • Macro substitution: a set in a formula represents an anonymous function in that set, when O-notation occurs only on the right-hand side of the formula. • Ex: f(n) = n³ + O(n²) means f(n) = n³ + h(n) for some h(n) ∈ O(n²); here we can think of h(n) as an error term.

  7. Usage of O-notation • When O-notation occurs on the left-hand side of a formula, such as n² + O(n) = O(n²), it means: for any f(n) ∈ O(n), there exists some h(n) ∈ O(n²) such that n² + f(n) = h(n). • O-notation is an upper-bound notation. • It makes no sense to say f(n) is at least O(n²).

  8. Ω-notation • How can we express a lower bound? • Math: Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 } • Ex: √n = Ω(lg n) (c = 1, n0 = 16)

  9. Θ-notation • O-notation is like ≤, • Ω-notation is like ≥, • and Θ-notation is like =. • Now we can give another definition of Θ-notation: Θ(g(n)) = O(g(n)) ∩ Ω(g(n)). • We also call Θ-notation a “tight bound”.

  10. A Theorem on Asymptotic Notations • Theorem 3.1: For any two functions f(n) and g(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)). • Given such an equality, how can we prove it?

  11. Two More Strict Notations • We have just said that O-notation and Ω-notation are like ≤ and ≥. • o-notation and ω-notation are like < and >. • Math: o(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 } • Difference with O-notation: the inequality must hold for every constant c, not just for some one constant.

  12. What does it mean? “for any constant c” • With o-notation we mean: no matter what constant c we put in front of g(n), f(n) will still be less than c·g(n) for sufficiently large n, no matter how small c is. • Ex: 2n² = o(n³) (n0 = 2/c) • A counterexample: (1/2)n² is not o(n²), since (1/2)n² < c·n² fails for every c ≤ 1/2.
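An equivalent way to read the "for any constant c" condition is that f(n)/g(n) tends to 0. A small numeric sketch (the sample points are illustrative) contrasts the slide's example with its counterexample:

```python
# o-notation is equivalent to f(n)/g(n) -> 0.
# For 2n^2 = o(n^3) the ratio vanishes; for (1/2)n^2 vs n^2 it sticks at 1/2,
# so (1/2)n^2 is not o(n^2): it never drops below c*n^2 for any c < 1/2.
ratio_small_o = [2 * n**2 / n**3 for n in (10, 100, 1000)]
ratio_not_small_o = [(n**2 / 2) / n**2 for n in (10, 100, 1000)]

assert ratio_small_o == [0.2, 0.02, 0.002]    # tends to 0
assert ratio_not_small_o == [0.5, 0.5, 0.5]   # constant, never below c = 1/4, say
```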

  13. Two More Strict Notations • We have just said that O-notation and Ω-notation are like ≤ and ≥. • o-notation and ω-notation are like < and >. • Math: ω(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 } • Difference with Ω-notation: the inequality must hold for every constant c, not just for some one constant.

  14. Solving recurrences • The analysis of merge sort from Lecture 1 required us to solve a recurrence. • Lecture 3: applications of recurrences to divide-and-conquer algorithms. • We often omit inessential details while solving recurrences: • n is assumed to be an integer, because the input size is typically an integer; • we ignore boundary conditions for convenience.

  15. Substitution method • The most general method: the substitution method. • 1. Guess the form of the solution. • 2. Verify by induction. • 3. Solve for constants. • The substitution method can establish either upper or lower bounds on a recurrence.

  16. Example of Substitution • Example: T(n) = 4T(n/2) + n for n ≥ 1, with T(1) = Θ(1). • Can you guess its time complexity? • Guess O(n³). (Prove O and Ω separately.) • We will prove T(n) ≤ c·n³ by induction. • First, we assume that T(k) ≤ c·k³ for all k < n; • in particular, T(n/2) ≤ c·(n/2)³ by this assumption.

  17. Example of Substitution • All we need is: T(n) = 4T(n/2) + n ≤ 4c(n/2)³ + n = (c/2)n³ + n = cn³ − ((c/2)n³ − n) ≤ cn³, • where cn³ is the desired bound and (c/2)n³ − n is the residual; • the last step holds whenever the residual is nonnegative, e.g., for c ≥ 2 and n ≥ 1.

  18. Base Case in Induction • We must also handle the initial conditions, that is, ground the induction with base cases. • Base: T(n) = Θ(1) for all n < n0, where n0 is a suitable constant. • For 1 ≤ n < n0, we have “Θ(1)” ≤ c·n³, if we pick c big enough. • Here we are! But this bound is not tight!
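A numeric companion to the induction: taking the (assumed, illustrative) base case T(1) = 1 and evaluating the recurrence directly confirms that c = 2 works, just as the proof promises:

```python
# Evaluate T(n) = 4*T(n/2) + n with the assumed base case T(1) = 1
# and confirm T(n) <= 2*n^3 on powers of two.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 4 * T(n // 2) + n

assert all(T(2**k) <= 2 * (2**k)**3 for k in range(1, 12))
```

The bound holds easily, which is a hint that O(n³) is far from tight.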

  19. A tighter upper bound? • We shall prove that T(n) = O(n²). • Assume that T(k) ≤ c·k² for k < n. • Then T(n) = 4T(n/2) + n ≤ 4c(n/2)² + n = cn² + n = cn² − (−n), with desired bound cn² and residual −n. • Anybody can tell me what goes wrong? • Now we need −n ≥ 0, but that is impossible for n ≥ 1.

  20. A tighter upper bound! • IDEA: Strengthen the inductive hypothesis by subtracting a low-order term. • Inductive hypothesis: T(k) ≤ c1·k² − c2·k for k < n. • Then T(n) = 4T(n/2) + n ≤ 4(c1(n/2)² − c2(n/2)) + n = c1n² − 2c2n + n = (c1n² − c2n) − (c2 − 1)n, with desired bound c1n² − c2n and residual (c2 − 1)n. • Now we need (c2 − 1)n ≥ 0; it holds if c2 ≥ 1.
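The strengthened hypothesis is in fact exact here. With the (assumed, illustrative) base case T(1) = 1, the recurrence solves to precisely 2n² − n on powers of two, i.e., the hypothesis holds with equality for c1 = 2, c2 = 1:

```python
# With the assumed base case T(1) = 1, the recurrence T(n) = 4*T(n/2) + n
# solves exactly to 2n^2 - n on powers of two: the strengthened hypothesis
# T(k) <= c1*k^2 - c2*k with c1 = 2, c2 = 1, met with equality.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 4 * T(n // 2) + n

for k in range(11):
    n = 2**k
    assert T(n) == 2 * n**2 - n
```

This also explains why the plain T(k) ≤ c·k² hypothesis failed: the true solution sits only a low-order term below 2n².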

  21. For the Base Case • We have now proved the inductive step for any value of c1, provided c2 ≥ 1. • Base: we need T(1) ≤ c1 − c2. • T(1) is assumed to be some constant. • We need to choose c1 sufficiently larger than c2, and c2 has to be at least 1. • To handle the initial conditions, just pick c1 big enough with respect to c2.

  22. About the Substitution method • We have worked out upper bounds; the lower bounds are similar. • Try it yourself. • Shortcomings: • we had to know the answer in order to find it, which is a bit of a pain. • It would be nicer to just figure out the answer by some procedure; that is what the next two techniques do.

  23. Recursion-tree method • A recursion tree models the costs (time) of a recursive execution of an algorithm. • The recursion-tree method can be unreliable, just like any method that uses ellipses (…). • The recursion-tree method promotes intuition, however, • and it is good for generating guesses for the substitution method.

  24.–27. Example of recursion tree • Solve T(n) = T(n/4) + T(n/2) + n² • Expand the tree level by level: the root costs n², its two children cost (n/4)² + (n/2)² = (5/16)n², the next level (25/256)n², and so on; each level sums to 5/16 times the previous one. • How many leaves? Fewer than n, and we just need an upper bound.

  28. Example of recursion tree • Total = n²(1 + 5/16 + (5/16)² + (5/16)³ + …) < 2n² • (Recall 1 + 1/2 + 1/4 + 1/8 + … = 2, and 5/16 < 1/2.) • So T(n) = Θ(n²).
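The tree's guess can be confirmed by evaluating the recurrence directly. The base case T(n) = n² for n < 4 is an assumption for this sketch; any constant-size base gives the same growth:

```python
# Evaluate T(n) = T(n/4) + T(n/2) + n^2 (assumed base: T(n) = n^2 for n < 4)
# and confirm the recursion tree's bound n^2 <= T(n) < 2*n^2.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return n * n if n < 4 else T(n // 4) + T(n // 2) + n * n

for k in range(2, 16):
    n = 2**k
    assert n * n <= T(n) < 2 * n * n   # Theta(n^2), constant factor below 2
```

The observed ratio T(n)/n² settles near 16/11 ≈ 1.45, the geometric-series sum 1/(1 − 5/16), comfortably under the slide's bound of 2.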

  29. The Master Method • It looks like an application of the recursion-tree method, but it is more precise. • The sad part about the master method is that it is pretty restrictive: • it only applies to recurrences of the form T(n) = a·T(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is asymptotically positive. • a·T(n/b) means every subproblem you recurse on must have the same size.

  30. Three Common Cases • Compare f(n) with n^(log_b a).

  31. Three Common Cases • Case 1: f(n) = O(n^(log_b a − ε)) for some constant ε > 0, i.e., f(n) grows polynomially slower than n^(log_b a). • Then T(n) = Θ(n^(log_b a)).

  32. Three Common Cases • Case 2: f(n) = Θ(n^(log_b a) · lg^k n) for some constant k ≥ 0, i.e., f(n) and n^(log_b a) grow at similar rates. • Then T(n) = Θ(n^(log_b a) · lg^(k+1) n).

  33. Three Common Cases • Case 3: f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and f(n) satisfies the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1. • Then T(n) = Θ(f(n)).
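The three cases can be wrapped in a tiny helper for the common special case f(n) = n^d, where Case 2 always has k = 0 and Case 3's regularity condition holds automatically. The function name `master` and the output strings are illustrative, not part of the lecture:

```python
import math

def master(a, b, d):
    """Simplified master theorem for T(n) = a*T(n/b) + n^d.
    Illustrative helper: assumes f(n) = n^d, so Case 2 has k = 0."""
    crit = math.log(a, b)                 # the critical exponent log_b(a)
    if d < crit:
        return f"Theta(n^{crit:g})"       # Case 1: leaves dominate
    if d == crit:
        return f"Theta(n^{d:g} lg n)"     # Case 2: every level costs the same
    return f"Theta(n^{d:g})"              # Case 3: root dominates

assert master(4, 2, 1) == "Theta(n^2)"      # slide 34
assert master(4, 2, 2) == "Theta(n^2 lg n)" # slide 35
assert master(4, 2, 3) == "Theta(n^3)"      # slide 36
```

Merge sort fits too: `master(2, 2, 1)` reports Θ(n¹ lg n).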

  34. Examples • Ex: T(n) = 4T(n/2) + n • a = 4, b = 2, so n^(log_b a) = n² • and f(n) = n = O(n^(2−ε)) with ε = 1 • Case 1: T(n) = Θ(n²).

  35. Examples • Ex: T(n) = 4T(n/2) + n² • a = 4, b = 2, so n^(log_b a) = n² • and f(n) = n² = Θ(n² · lg⁰n) • Case 2 with k = 0: T(n) = Θ(n² · lg n).

  36. Examples • Ex: T(n) = 4T(n/2) + n³ • a = 4, b = 2, so n^(log_b a) = n² • and f(n) = n³ = Ω(n^(2+ε)) with ε = 1; regularity holds since 4·(n/2)³ = n³/2 ≤ c·n³ for c = 1/2 • Case 3: T(n) = Θ(n³).
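The Case 2 example can be checked exactly. With the (assumed, illustrative) base case T(1) = 1, the recurrence T(n) = 4T(n/2) + n² solves to n²(lg n + 1) on powers of two, matching Θ(n² lg n):

```python
# With the assumed base case T(1) = 1, T(n) = 4*T(n/2) + n^2 solves exactly
# to n^2 * (lg n + 1) on powers of two: the Theta(n^2 lg n) of Case 2.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 4 * T(n // 2) + n * n

for k in range(11):
    n = 2**k
    assert T(n) == n * n * (k + 1)   # k = lg n
```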

  37. Homework • Read Chapters 3 and 4 to be prepared for applications of recurrences.

  38. Proof of Master Method • Height = log_b n • #(leaves) = a^(log_b n) = n^(log_b a)

  39. Proof of Master Method • CASE 1: The weight increases geometrically from the root to the leaves. The leaves hold a constant fraction of the total weight. • So T(n) = Θ(n^(log_b a)).

  40. Proof of Master Method • CASE 2 (k = 0): The weight is approximately the same on each of the log_b n levels. • So T(n) = Θ(n^(log_b a) · lg n).

  41. Proof of Master Method • CASE 3: The weight decreases geometrically from the root to the leaves. The root holds a constant fraction of the total weight. • So T(n) = Θ(f(n)).

  42. Have FUN!
