
Complexity Analysis : Asymptotic Analysis



  1. Complexity Analysis : Asymptotic Analysis Nattee Niparnan

  2. Recall • What is the measurement of algorithm? • How to compare two algorithms? • Definition of Asymptotic Notation • Complexity Class

  3. Today's Topic • Finding the asymptotic upper bound of an algorithm

  4. Wait… Did we miss something?

  5. Resource Function • Time function of an algorithm A: maps the size of the input to the time used • Space function of an algorithm A: maps the size of the input to the space used

  6. From Experiment • We count the instructions executed

  7. Why instruction count? • Does instruction count == time? • Probably, but not always • Usually there is a strong relation between instruction count and time if we count the most frequently executed instruction

  8. Why count just some? • Remember findMaxCount()? • Why do we count just the max assignment? • Why don't we count everything? • What if each max assignment took N instructions? • Then counting just that one instruction starts to make sense

  9. Time Function = instruction count

  10. Compute O

  11. Interesting Topics of Upper Bound • Rule of thumb! We neglect: • Lower-order terms in an addition, e.g. n^3 + n^2 = O(n^3) • Constants, e.g. 3n^3 = O(n^3) • Remember that we write = instead of the (more correct) ∈

  12. Why Discard Constants? • From the definition, we can use any constant c • E.g. 3n = O(n) • Because when we let c ≥ 3, the condition is satisfied

  13. Why Discard Lower-Order Terms? • Consider f(n) = n^3 + n^2 and g(n) = n^3 • If f(n) = O(g(n)), then for some c and n0, c·g(n) − f(n) ≥ 0 for all n ≥ n0 • Indeed, any c > 1 works

  14. Why Discard Lower-Order Terms? • Try c = 1.1 • 1.1·g(n) − f(n) = 0.1n^3 − n^2 • Is 0.1n^3 − n^2 > 0? • 0.1n^3 − n^2 > 0 ⇔ 0.1n^3 > n^2 ⇔ 0.1n^3 / n^2 > 1 ⇔ 0.1n > 1 • So it holds whenever n > 10

  15. Lower Order only? • In fact, it is only the dominant term that counts • Which term is dominant? The one that grows faster • Why? Because the ratio of the dominant term g(n) to a non-dominant term f(n) eventually exceeds any constant: lim g(n)/f(n) → ∞

  16. What dominates what? (table comparing pairs of functions; in each pair, the left side dominates)

  17. Putting into Practice • What is the asymptotic class of: • 0.5n^3 + n^4 − 5(n−3)(n−5) + n^3·log8 n + 25 + n^1.5 → O(n^4) • (n−5)(n^2+3) + log(n^20) → O(n^3) • 20n^5 + 58n^4 + 15n^3.2 · 3n^2 → O(n^5.2)
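As a numerical sanity check (a minimal sketch, not from the slides; the helper names f1, f2, f3 are mine, and a ratio test is evidence, not a proof), each expression divided by its claimed class should settle near a constant for large n:

```python
import math

def f1(n):
    # 0.5n^3 + n^4 - 5(n-3)(n-5) + n^3*log8(n) + 25 + n^1.5
    return 0.5 * n**3 + n**4 - 5 * (n - 3) * (n - 5) + n**3 * math.log(n, 8) + 25 + n**1.5

def f2(n):
    # (n-5)(n^2+3) + log(n^20) = (n-5)(n^2+3) + 20*log(n)
    return (n - 5) * (n**2 + 3) + 20 * math.log(n)

def f3(n):
    # 20n^5 + 58n^4 + 15n^3.2 * 3n^2; the product term is 45n^5.2 and dominates
    return 20 * n**5 + 58 * n**4 + 15 * n**3.2 * 3 * n**2

n = 10**6
# each ratio approaches a nonzero constant, consistent with the claimed class
ratios = (f1(n) / n**4, f2(n) / n**3, f3(n) / n**5.2)
```

The third ratio settles near 45 rather than 1 because the dominant term carries a constant factor, which big-O discards.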

  18. Asymptotic Notation from Program Flow • Sequence • Conditions • Loops • Recursive Call

  19. Sequence • Block A takes f(n), then Block B takes g(n) • Total: f(n) + g(n) = O(max(f(n), g(n)))

  20. Example • Block A is O(n), Block B is O(n^2) • Total: O(n^2)

  21. Example • Block A is Θ(n), Block B is O(n^2) • Total: O(n^2)

  22. Example • Block A is Θ(n), Block B is Θ(n^2) • Total: Θ(n^2)

  23. Example • Block A is O(n^2), Block B is Θ(n^2) • Total: Θ(n^2)

  24. Condition • An if-condition executes either Block A, taking f(n), or Block B, taking g(n) • Worst case: O(max(f(n), g(n)))

  25. Loops for (i = 1; i <= n; i++) { P(i) } • Let P(i) take time ti • Total time: t1 + t2 + … + tn

  26. Example for (i = 1; i <= n; i++) { sum += i; } • sum += i is Θ(1), so the loop is Θ(n)

  27. Why don't we use max(ti)? • Because the number of terms is not constant • for (i = 1; i <= n; i++) { sum += i; } is Θ(n) • for (i = 1; i <= 100000; i++) { sum += i; } is Θ(1), with a big constant

  28. Example for (j = 1; j <= n; j++) { for (i = 1; i <= n; i++) { sum += i; } } • sum += i is Θ(1) and executes n·n times → Θ(n^2)

  29. Example for (j = 1; j <= n; j++) { for (i = 1; i <= j; i++) { sum += i; } } • sum += i is Θ(1) and executes 1 + 2 + … + n = n(n+1)/2 times → Θ(n^2)

  30. Example : Another way • Same code: the inner loop runs at most n times, so the total is O(n^2) • For the last n/2 values of j, the inner loop runs at least n/2 times, so the total is also Ω(n^2) → Θ(n^2)
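The counting argument above can be checked directly (a sketch; the helper name nested_count is mine): the number of times the inner statement runs matches the closed form n(n+1)/2 exactly.

```python
def nested_count(n):
    # replicate the double loop and count executions of "sum += i"
    count = 0
    for j in range(1, n + 1):
        for i in range(1, j + 1):
            count += 1
    return count

# 1 + 2 + ... + n = n(n+1)/2, a Theta(n^2) quantity
```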

  31. Example for (j = 2; j <= n-1; j++) { for (i = 3; i <= j; i++) { sum += i; } } • Shifting the loop bounds by a constant does not change the class: still Θ(n^2)

  32. Example : While loops while (n > 0) { n = n - 1; } • Θ(n)

  33. Example : While loops while (n > 0) { n = n - 10; } • Θ(n/10) = Θ(n)

  34. Example : While loops while (n > 0) { n = n / 2; } • Θ(log n)
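The halving loop's iteration count can be verified (a sketch; the helper name halving_steps is mine): with integer division, it takes floor(log2 n) + 1 iterations for n ≥ 1, which is Θ(log n).

```python
def halving_steps(n):
    # count iterations of: while (n > 0) { n = n / 2 }  (integer division)
    steps = 0
    while n > 0:
        n //= 2
        steps += 1
    return steps
```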

  35. Example : Euclid’s GCD function gcd(a, b) { while (b > 0) { tmp = b b = a mod b a = tmp } return a }

  36. Example : Euclid’s GCD • Each iteration computes a mod and swaps • The loop runs until the second argument becomes zero • How many iterations?

  37. Example : Euclid’s GCD (diagram: bars of length a and b)

  38. Example : Euclid’s GCD • Claim: if a > b, then a mod b < a / 2

  39. Case 1: b > a / 2 • Then a mod b = a − b < a / 2

  40. Case 2: b ≤ a / 2 • Then a mod b < b ≤ a / 2 (we can always fit another b into a)

  41. Example : Euclid’s GCD function gcd(a, b) { while (b > 0) { tmp = b b = a mod b a = tmp } return a } • Since a mod b < a/2 whenever a > b, the first argument is at least halved every iteration → O(log n) iterations
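The slide's loop, instrumented to count iterations (a sketch; the name gcd_steps is mine), shows how few iterations are needed even for inputs much larger than the answer:

```python
def gcd_steps(a, b):
    # Euclid's algorithm from the slide, also counting loop iterations
    steps = 0
    while b > 0:
        a, b = b, a % b  # tmp = b; b = a mod b; a = tmp
        steps += 1
    return a, steps
```

For example, gcd_steps(1071, 462) finds gcd 21 in only 3 iterations, consistent with the O(log n) bound.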

  42. Theorem • T(n) = Σi T(ai·n) + O(n) • If Σi ai < 1, then T(n) = O(n) • Example: T(n) = T(0.7n) + T(0.2n) + T(0.01n) + 3n = O(n)

  43. Recursion try(n) { if (n <= 0) return 0; for (j = 1; j <= n; j++) sum += j; try(n * 0.7); try(n * 0.2); }

  44. Recursion try(n) { if (n <= 0) return 0; for (j = 1; j <= n; j++) sum += j; try(n * 0.7); try(n * 0.2); } • Terminating case: Θ(1) • The loop: Θ(n) • Recursive calls: T(0.7n) + T(0.2n) • Overall: T(n) = T(0.7n) + T(0.2n) + Θ(n)
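A sketch of the recurrence's total work (the helper name work is mine): it counts every execution of "sum += j" across all recursive calls. Because 0.7 + 0.2 < 1, induction with c = 10 gives work(n) ≤ n + 7n + 2n = 10n, so the total stays linear.

```python
def work(n):
    # total executions of "sum += j" across all recursive calls of try(n)
    if n <= 0:
        return 0
    return n + work(int(n * 0.7)) + work(int(n * 0.2))
```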

  45. Guessing and proof by induction • T(n) = T(0.7n) + T(0.2n) + O(n) • Guess: T(n) = O(n), i.e. T(n) ≤ cn • Proof: • Basis: obvious • Induction: assume T(i) ≤ ci for all i < n • T(n) ≤ 0.7cn + 0.2cn + dn = 0.9cn + dn ≤ cn whenever c ≥ 10d <<< the dominating rule, so T(n) = O(n)

  46. Using Recursion Tree • T(n) = 2 T(n/2) + n • The tree has lg n levels; level i has 2^i nodes, each doing n/2^i work, so every level sums to n • T(n) = O(n lg n)
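The tree argument can be checked exactly (a sketch; the name t_mergesort and the base case T(1) = 1 are my assumptions): for n a power of two, unrolling gives n·lg(n) for the internal levels plus n for the leaves.

```python
def t_mergesort(n):
    # T(n) = 2 T(n/2) + n with T(1) = 1, for n a power of two
    if n == 1:
        return 1
    return 2 * t_mergesort(n // 2) + n

# closed form for powers of two: T(n) = n*lg(n) + n, i.e. O(n lg n)
```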

  47. Master Method • T(n) = aT(n/b) + f(n), with a ≥ 1, b > 1 • Let c = logb a • f(n) = O(n^(c−ε)) for some ε > 0 → T(n) = Θ(n^c) • f(n) = Θ(n^c) → T(n) = Θ(n^c log n) • f(n) = Ω(n^(c+ε)) for some ε > 0, with the regularity condition a·f(n/b) ≤ d·f(n) for some d < 1 → T(n) = Θ(f(n))

  48. Master Method : Example • T(n) = 9T(n/3) + n • a = 9, b = 3, c = log3 9 = 2, n^c = n^2 • f(n) = n = O(n^(2−ε)), e.g. ε = 0.1 • T(n) = Θ(n^c) = Θ(n^2) — the case f(n) = O(n^(c−ε))

  49. Master Method : Example • T(n) = T(n/3) + 1 • a = 1, b = 3, c = log3 1 = 0, n^c = 1 • f(n) = 1 = Θ(n^c) = Θ(1) • T(n) = Θ(n^c log n) = Θ(log n) — the case f(n) = Θ(n^c)

  50. Master Method : Example • T(n) = 3T(n/4) + n log n • a = 3, b = 4, c = log4 3 ≈ 0.793 • f(n) = n log n = Ω(n^(0.793+ε)) • Regularity: a·f(n/b) = 3·(n/4)·log(n/4) ≤ (3/4)·n log n = d·f(n) with d = 3/4 < 1 • T(n) = Θ(f(n)) = Θ(n log n) — the case f(n) = Ω(n^(c+ε))
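The first example's Θ(n^2) prediction can be confirmed exactly (a sketch; the name t_master and the base case T(1) = 1 are my assumptions): for n a power of 3, the recurrence solves to (3n^2 − n)/2, which is Θ(n^2) as the master method predicts.

```python
def t_master(n):
    # T(n) = 9 T(n/3) + n with assumed base case T(1) = 1, for n a power of 3
    if n == 1:
        return 1
    return 9 * t_master(n // 3) + n

# exact solution for powers of 3: T(n) = (3n^2 - n) / 2, i.e. Theta(n^2)
```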
