
Complexity Analysis


Presentation Transcript


1. Complexity Analysis
• Asymptotic Complexity
• Big-O (asymptotic) Notation
• Big-O Computation Rules
• Proving Big-O Complexity
• How to determine complexity of code structures

2. Asymptotic Complexity
• Finding the exact complexity, f(n) = number of basic operations, of an algorithm is difficult.
• We therefore approximate f(n) by a function g(n) in a way that does not substantially change the magnitude of f(n); that is, g(n) is sufficiently close to f(n) for large values of the input size n.
• This "approximate" measure of efficiency is called asymptotic complexity.
• Thus the asymptotic complexity measure does not give the exact number of operations of an algorithm, but it shows how that number grows with the size of the input.
• This gives us a measure that works across different operating systems, compilers, and CPUs.
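A minimal sketch of this idea (the function f below is an invented example, not from the slides): if f(n) = 2n² + 3n + 5 and g(n) = n², the ratio f(n)/g(n) settles toward the constant 2 as n grows, which is exactly why the lower-order terms can be ignored.

public class AsymptoticRatio {
    public static void main(String[] args) {
        for (long n = 10; n <= 1_000_000; n *= 10) {
            double f = 2.0 * n * n + 3 * n + 5;  // "exact" operation count (invented example)
            double g = (double) n * n;           // approximating function
            System.out.printf("n=%-8d f(n)/g(n)=%.4f%n", n, f / g);
        }
    }
}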

3. Big-O (asymptotic) Notation
• The most commonly used notation for specifying asymptotic complexity is the big-O notation.
• The big-O notation, O(g(n)), is used to give an upper bound (worst case) on a positive runtime function f(n), where n is the input size.
Definition of Big-O:
• Consider a function f(n) that is non-negative for all n ≥ 0. We say that "f(n) is big-O of g(n)", written f(n) = O(g(n)), if there exist n0 ≥ 0 and a constant c > 0 such that f(n) ≤ c·g(n) for all n ≥ n0.

4. Big-O (asymptotic) Notation
Implication of the definition:
• For all sufficiently large n, c·g(n) is an upper bound of f(n).
Note: by the definition of big-O, f(n) = 3n + 4 is O(n); it is also O(n²), O(n³), …, and O(nⁿ).
• However, when big-O notation is used, the function g in the relationship f(n) = O(g(n)) is chosen to be as small as possible.
• We call such a function g a tight asymptotic bound of f(n).
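A minimal sketch of the definition in action, using the slide's f(n) = 3n + 4; the witnesses c = 7 and n0 = 1 are our own choice, and spot-checking a few values is only an illustration, not a proof.

public class BigOCheck {
    static long f(long n) { return 3 * n + 4; }  // the runtime function from the slide
    static long g(long n) { return n; }          // the claimed bound: f(n) = O(n)

    public static void main(String[] args) {
        long c = 7, n0 = 1;  // candidate witnesses (our choice): 3n + 4 <= 7n for n >= 1
        for (long n = n0; n <= 1_000_000; n *= 10) {
            System.out.printf("n=%-8d f(n)=%-8d c*g(n)=%-8d ok=%b%n",
                    n, f(n), c * g(n), f(n) <= c * g(n));
        }
    }
}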

5. Big-O (asymptotic) Notation
Some big-O complexity classes, in order of magnitude from smallest to largest:
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < … < O(2ⁿ) < O(n!) < O(nⁿ)

6. Examples of Algorithms and their big-O complexity
Typical examples: sequential (linear) search is O(n); binary search is O(log n); selection sort and insertion sort are O(n²); merge sort is O(n log n); generating all subsets of n items is O(2ⁿ); generating all permutations of n items is O(n!).

7. Warnings about O-Notation
• Big-O notation cannot compare algorithms in the same complexity class.
• Big-O notation only gives sensible comparisons of algorithms in different complexity classes when n is large.
• Consider two algorithms for the same task:
Linear: f(n) = 1000n
Quadratic: f'(n) = n²/1000
The quadratic one is faster for n < 1,000,000.
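A quick sketch that makes the crossover visible: evaluating both cost functions from the slide at a few sample sizes shows the quadratic algorithm winning below n = 1,000,000 and losing above it.

public class CrossoverDemo {
    public static void main(String[] args) {
        long[] samples = {1_000, 100_000, 1_000_000, 10_000_000};
        for (long n : samples) {
            double linear = 1000.0 * n;                 // f(n)  = 1000n
            double quadratic = (double) n * n / 1000.0; // f'(n) = n^2/1000
            System.out.printf("n=%-10d linear=%.0f  quadratic=%.0f  faster=%s%n",
                    n, linear, quadratic, quadratic < linear ? "quadratic" : "linear");
        }
    }
}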

8. Rules for using big-O
For large values of the input n, the constants and the terms with lower degree of n are ignored.
1. Multiplicative Constants Rule: ignore constant factors.
O(c·f(n)) = O(f(n)), where c is a constant.
Example: O(20n³) = O(n³)
2. Addition Rule: ignore smaller terms.
If O(f(n)) < O(h(n)), then O(f(n) + h(n)) = O(h(n)).
Examples: O(n² log n + n³) = O(n³)
O(2000n³ + 2n! + n⁸⁰⁰ + 10ⁿ + 27n log n + 5) = O(n!)
3. Multiplication Rule: O(f(n) · h(n)) = O(f(n)) · O(h(n)).
Example: O((n³ + 2n² + 3n log n + 7)(8n² + 5n + 2)) = O(n⁵)
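As a combined worked example of rules 1–3 (the polynomial below is our own, not from the slides):

\[
O\bigl((2n^2 + 3n)(5n\log n + n)\bigr)
= O(n^2)\cdot O(n\log n)
= O(n^3\log n),
\]

where the addition rule reduces each factor to its dominant term, the constants rule drops the factors 2 and 5, and the multiplication rule combines the two bounds.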

9. Proving Big-O Complexity
To prove that f(n) is O(g(n)), we find any pair of values n0 and c that satisfy f(n) ≤ c·g(n) for all n ≥ n0.
Note: the pair (n0, c) is not unique. If such a pair exists, then there are infinitely many such pairs.
Example: Prove that f(n) = 3n² + 5 is O(n²).
We try to find values of n0 and c by solving the following inequality:
3n² + 5 ≤ cn², or equivalently 3 + 5/n² ≤ c.
(By putting in different values for n, we get corresponding values for c.)
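To make the non-uniqueness of (n0, c) concrete, here are two valid pairs for f(n) = 3n² + 5; the specific numbers are our own choice.

\[
c = 8,\ n_0 = 1:\quad 3n^2 + 5 \le 8n^2 \iff 5 \le 5n^2,\ \text{true for all } n \ge 1;
\]
\[
c = 4,\ n_0 = 3:\quad 3n^2 + 5 \le 4n^2 \iff 5 \le n^2,\ \text{true for all } n \ge 3.
\]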

10. Proving Big-O Complexity
Example: Prove that f(n) = 3n² + 4n log n + 10 is O(n²) by finding appropriate values for c and n0.
We try to find values of n0 and c by solving the following inequality:
3n² + 4n log n + 10 ≤ cn², or equivalently 3 + 4 log n/n + 10/n² ≤ c.
(We used log base 2, but any other base could be used as well.)
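A small numeric spot-check of the rearranged inequality (the candidate constant c = 16 is our own choice, with n0 = 1; base-2 logs as on the slide). The left-hand side peaks near small n and then shrinks toward 3, so any c at or above the peak works.

public class BoundCheck {
    public static void main(String[] args) {
        double c = 16.0;  // candidate constant (our choice)
        for (long n = 1; n <= 1_000_000; n *= 10) {
            double lg = Math.log(n) / Math.log(2);  // log base 2
            double lhs = 3 + 4 * lg / n + 10.0 / ((double) n * n);
            System.out.printf("n=%-8d 3 + 4*lg(n)/n + 10/n^2 = %.4f <= %.1f : %b%n",
                    n, lhs, c, lhs <= c);
        }
    }
}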

11. How to determine complexity of code structures
Loops (for, while, and do-while): the complexity is the number of iterations of the loop times the complexity of the loop body.
Examples:

for (int i = 0; i < n; i++)
    sum = sum - i;                 // O(n)

for (int i = 0; i < n * n; i++)
    sum = sum + i;                 // O(n²)

i = 1;
while (i < n) {
    sum = sum + i;
    i = i * 2;                     // i doubles, so the loop runs about log₂(n) times
}                                  // O(log n)
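A quick way to sanity-check these classifications (a sketch, counting loop iterations rather than timing):

public class LoopCounts {
    public static void main(String[] args) {
        int n = 1 << 10;  // 1024
        long c1 = 0, c2 = 0, c3 = 0;
        for (int i = 0; i < n; i++) c1++;      // O(n):     1,024 iterations
        for (int i = 0; i < n * n; i++) c2++;  // O(n^2):   1,048,576 iterations
        for (int i = 1; i < n; i *= 2) c3++;   // O(log n): 10 iterations
        System.out.printf("n=%d  linear=%d  quadratic=%d  logarithmic=%d%n",
                n, c1, c2, c3);
    }
}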

12. How to determine complexity of code structures
Nested loops: complexity of the inner loop * complexity of the outer loop.
Examples:

sum = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        sum += i * j;              // O(n²)

i = 1;
while (i <= n) {
    j = 1;
    while (j <= n) {
        // statements of constant complexity
        j = j * 2;
    }
    i = i + 1;
}                                  // O(n log n)
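A counting sketch for the second pattern: the inner while doubles j, so it runs about log₂(n) + 1 times per outer iteration, giving roughly n·log₂(n) constant-time statements overall.

public class NestedLoopCount {
    public static void main(String[] args) {
        int n = 1 << 10;  // 1024, so log2(n) = 10
        long count = 0;
        int i = 1;
        while (i <= n) {
            int j = 1;
            while (j <= n) {  // j doubles: log2(n) + 1 = 11 iterations
                count++;
                j *= 2;
            }
            i++;
        }
        // predicted: n * (log2(n) + 1) = 1024 * 11
        System.out.printf("count=%d  vs  n*(log2(n)+1)=%d%n", count, (long) n * 11);
    }
}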

13. How to determine complexity of code structures
Sequence of statements: use the addition rule.
O(s1; s2; s3; …; sk) = O(s1) + O(s2) + O(s3) + … + O(sk) = O(max(s1, s2, s3, …, sk))
Example: the complexity below is O(n²) + O(n) + O(1) = O(n²).

for (int j = 0; j < n * n; j++)
    sum = sum + j;
for (int k = 0; k < n; k++)
    sum = sum - k;
System.out.print("sum is now " + sum);

14. How to determine complexity of code structures
Switch: take the complexity of the most expensive case.

char key;
int[] X = new int[5];
int[][] Y = new int[10][10];
........
switch (key) {
    case 'a':                                  // O(n)
        for (int i = 0; i < X.length; i++)
            sum += X[i];
        break;
    case 'b':                                  // O(n²)
        for (int i = 0; i < Y.length; i++)
            for (int j = 0; j < Y[0].length; j++)
                sum += Y[i][j];
        break;
} // End of switch block

Overall complexity: O(n²)

15. How to determine complexity of code structures
If statement: take the complexity of the most expensive branch.

char key;
int[][] A = new int[5][5];
int[][] B = new int[5][5];
int[][] C = new int[5][5];
........
if (key == '+') {                              // O(n²)
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            C[i][j] = A[i][j] + B[i][j];
} // End of if block
else if (key == 'x')
    C = matrixMult(A, B);                      // O(n³)
else
    System.out.println("Error! Enter '+' or 'x'!");  // O(1)

Overall complexity: O(n³)

16. How to determine complexity of code structures
• Sometimes if-else statements must be checked carefully:
O(if-else) = O(condition) + max(O(if), O(else))

int[] integers = new int[10];
........
if (hasPrimes(integers) == true)   // condition: O(n)
    integers[0] = 20;              // O(1)
else
    integers[0] = -20;             // O(1)

public boolean hasPrimes(int[] arr) {
    for (int i = 0; i < arr.length; i++) {
        ..........
        ..........
    }
} // End of hasPrimes()

O(if-else) = O(condition) = O(n)
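For reference, here is one hypothetical way the elided hasPrimes body could look; the isPrime helper is introduced for illustration only and is not part of the original slide, which simply treats the per-element work as constant time.

public class PrimeDemo {
    // Hypothetical completion of the elided body: one pass over the array.
    // The slide counts the per-element test as constant work, so the method
    // is O(n) in arr.length.
    public static boolean hasPrimes(int[] arr) {
        for (int i = 0; i < arr.length; i++) {
            if (isPrime(arr[i])) return true;
        }
        return false;
    }

    // Helper introduced for illustration only (trial division).
    private static boolean isPrime(int x) {
        if (x < 2) return false;
        for (int d = 2; (long) d * d <= x; d++) {
            if (x % d == 0) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        int[] integers = {4, 6, 8, 9, 11, 12, 15, 20, 21, 25};
        System.out.println(hasPrimes(integers));  // true: 11 is prime
    }
}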

17. How to determine complexity of code structures
• Note: sometimes a loop may make the if-else rule inapplicable. Consider the following loop:

while (n > 0) {
    if (n % 2 == 0) {
        System.out.println(n);
        n = n / 2;
    } else {
        System.out.println(n);
        System.out.println(n);
        n = n - 1;
    }
}

The else-branch has more basic operations, so one might conclude that the loop is O(n). However, the if-branch dominates: whenever n is odd it becomes even on the next step, so at least every second iteration halves n. For example, if n is 60, the sequence of values of n is 60, 30, 15, 14, 7, 6, 3, 2, 1, 0. Hence the loop is logarithmic and its complexity is O(log n).
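A counting sketch that confirms the logarithmic behavior (iteration counts, not timings); the starting values are arbitrary samples.

public class HalvingLoopCount {
    public static void main(String[] args) {
        for (int start : new int[]{60, 1_000, 1_000_000}) {
            int n = start;
            long iterations = 0;
            while (n > 0) {
                if (n % 2 == 0) n = n / 2;  // even: halve
                else            n = n - 1;  // odd: make even, halved next pass
                iterations++;
            }
            System.out.printf("start=%-9d iterations=%d  (~2*log2(start)=%.1f)%n",
                    start, iterations, 2 * Math.log(start) / Math.log(2));
        }
    }
}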

18. Asymptotic Complexity
• Running time of an algorithm as a function of input size n, for large n.
• Expressed using only the highest-order term in the expression for the exact running time.
• Instead of the exact running time, we say Θ(n²).
• Describes the behavior of the function in the limit.
• Written using asymptotic notation.

19. Asymptotic Notation
• Θ, O, Ω, o, ω
• Defined for functions over the natural numbers.
• Example: f(n) = Θ(n²) describes how f(n) grows in comparison to n².
• Each notation defines a set of functions; in practice it is used to compare two function sizes.
• The notations describe different rate-of-growth relations between the defining function and the defined set of functions.

20. Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that for all n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
Intuitively: the set of all functions that have the same rate of growth as g(n).
g(n) is an asymptotically tight bound for f(n).
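A classic worked instance of this definition, showing that n²/2 − 3n = Θ(n²); the particular constants below are one valid choice.

\[
\tfrac{1}{4}n^2 \;\le\; \tfrac{1}{2}n^2 - 3n \;\le\; \tfrac{1}{2}n^2 \quad\text{for all } n \ge 12,
\]

so the definition holds with c1 = 1/4, c2 = 1/2, n0 = 12: the right inequality holds for all n ≥ 0, and the left one rearranges to n ≥ 12.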

21. O-notation
For a function g(n), we define O(g(n)), big-O of g(n), as the set:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, 0 ≤ f(n) ≤ c·g(n) }
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
g(n) is an asymptotic upper bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)), i.e., Θ(g(n)) ⊆ O(g(n)).

22. Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n ≥ n0, 0 ≤ c·g(n) ≤ f(n) }
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)), i.e., Θ(g(n)) ⊆ Ω(g(n)).

23. Relations Between Θ, O, Ω

24. Relations Between Θ, Ω, O
Theorem: For any two functions g(n) and f(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
• That is, Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
• In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.

25. o-notation
For a given function g(n), little-o is the set:
o(g(n)) = { f(n) : for every constant c > 0, there exists n0 > 0 such that for all n ≥ n0, 0 ≤ f(n) < c·g(n) }
f(n) becomes insignificant relative to g(n) as n approaches infinity:
lim (n→∞) f(n)/g(n) = 0
g(n) is an upper bound for f(n) that is not asymptotically tight.
Observe the difference of this definition from the previous ones: here the inequality must hold for every constant c > 0, not just for some constant c. For example, 2n = o(n²), but 2n² ≠ o(n²).

26. ω-notation
For a given function g(n), little-omega is the set:
ω(g(n)) = { f(n) : for every constant c > 0, there exists n0 > 0 such that for all n ≥ n0, 0 ≤ c·g(n) < f(n) }
f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
lim (n→∞) f(n)/g(n) = ∞
g(n) is a lower bound for f(n) that is not asymptotically tight.

27. Common Functions

28. Logarithms
• x = log_b(a) is the exponent x for which a = bˣ.
• Natural log: ln a = log_e(a)
• Binary log: lg a = log₂(a)
• lg²a = (lg a)²
• lg lg a = lg(lg a)
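One identity worth recalling here, since slide 10 noted that the base of the log does not matter asymptotically: the change-of-base rule (a standard fact).

\[
\log_b a = \frac{\ln a}{\ln b}, \qquad\text{e.g.}\quad \lg 1024 = \frac{\ln 1024}{\ln 2} = 10,
\]

so logarithms to different bases differ only by the constant factor 1/ln b, which big-O ignores.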

29. Review on Summations
• Constant Series: For integers a and b, a ≤ b: Σ (i = a to b) 1 = b − a + 1
• Linear Series (Arithmetic Series): For n ≥ 0: 1 + 2 + … + n = n(n + 1)/2
• Quadratic Series: For n ≥ 0: 1² + 2² + … + n² = n(n + 1)(2n + 1)/6

30. Review on Summations
• Cubic Series: For n ≥ 0: 1³ + 2³ + … + n³ = n²(n + 1)²/4
• Geometric Series: For real x ≠ 1: 1 + x + x² + … + xⁿ = (xⁿ⁺¹ − 1)/(x − 1)
• For |x| < 1: 1 + x + x² + … = 1/(1 − x)
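A quick numeric spot-check of the closed forms above (a sketch: brute-force sums compared against the formulas at one sample size).

public class SummationCheck {
    public static void main(String[] args) {
        long n = 100;
        long linear = 0, quadratic = 0, cubic = 0;
        for (long i = 1; i <= n; i++) {
            linear += i;
            quadratic += i * i;
            cubic += i * i * i;
        }
        System.out.println(linear    == n * (n + 1) / 2);               // linear series
        System.out.println(quadratic == n * (n + 1) * (2 * n + 1) / 6); // quadratic series
        System.out.println(cubic     == n * n * (n + 1) * (n + 1) / 4); // cubic series
    }
}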
