
Order of growth



Presentation Transcript


  1. Order of growth Suppose you have analyzed two algorithms and expressed their run times in terms of the size of the input: • Algorithm A takes 100n + 1 steps to solve a problem of size n; • Algorithm B takes n² + n + 1 steps. The leading term is the term with the highest exponent.
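A quick sketch can make the comparison concrete; the step-count formulas are taken directly from the slide, and the sample sizes are chosen here for illustration:

```python
# Step-count formulas taken directly from the slide.
def steps_a(n):
    return 100 * n + 1

def steps_b(n):
    return n ** 2 + n + 1

# B wins for small n, the two roughly tie near n = 100, and A wins beyond that.
for n in (10, 100, 1000):
    print(n, steps_a(n), steps_b(n))
```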

  2. The following table shows the run times (in steps) of these algorithms for different problem sizes:

     n          100n + 1      n² + n + 1
     10             1,001             111
     100           10,001          10,101
     1,000        100,001       1,001,001
     10,000     1,000,001     100,010,001

  3. Notes • At n = 10, Algorithm A looks bad. • For Algorithm A, the leading term has a large coefficient, 100, which is why B does better than A for small n. • At n = 100 they are about the same, and for larger values of n, A is much better. • Any function that contains an n² term will grow faster than a function whose leading term is n. • Even if the run time of Algorithm A were n + 1000000, it would still be better than Algorithm B for sufficiently large n.

  4. How to compare algorithms? • For large problems, we expect an algorithm with a smaller leading term to be the better algorithm, • but for smaller problems there may be a crossover point below which another algorithm is better. • The location of the crossover point depends on the details of the algorithms, the inputs, and the hardware.

  5. How to compare algorithms? • If two algorithms have the same leading-order term, it is hard to say which is better; the answer will depend on the details. • Algorithms with the same leading-order term are considered equivalent, even if they have different coefficients.

  6. Order of growth An order of growth is a set of functions whose growth is considered equivalent. Examples: • 2n, 100n and n + 1 belong to the same order of growth, which is written O(n) in “Big-Oh notation”. • All functions with the leading term n² belong to O(n²). • What is the order of growth of n³ + n²? • What about 1000000n³ + n²? What about n³ + 1000000n²? • What is the order of growth of (n² + n) · (n + 1)?
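The last question can be checked by multiplying the polynomials out; a small sketch, using coefficient lists (lowest degree first, a representation chosen here) so the leading term is the last entry:

```python
# Sketch: multiply two polynomials by convolving their coefficient lists
# (lowest degree first) to see which leading term survives.
def poly_mul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

# n^2 + n is [0, 1, 1]; n + 1 is [1, 1]
print(poly_mul([0, 1, 1], [1, 1]))  # [0, 1, 2, 1], i.e. n^3 + 2n^2 + n: O(n^3)
```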

  7. The following table shows some of the orders of growth that appear most commonly in algorithmic analysis, in increasing order of badness:

     O(1)          constant
     O(log n)      logarithmic
     O(n)          linear
     O(n log n)    linearithmic
     O(n²)         quadratic
     O(n³)         cubic
     O(cⁿ)         exponential

  8. Asymptotic Analysis of Algorithms (asymptotic ⇒ for large n) Big-Oh expressions greatly simplify the analysis of the running time of algorithms: • all that we get is an upper bound on the running time of the algorithm; • the result does not depend on the values of the constants; • the result does not depend on the characteristics of the computer and compiler actually used to execute the program!

  9. Big-O notation Let f and g be nonnegative functions on the positive integers. We write f(n) = O(g(n)) and say that f(n) is of order at most g(n), or f(n) is big-oh of g(n), or g is an asymptotic upper bound for f, if there exist constants C1 > 0 and N1 such that f(n) ≤ C1·g(n) for all n ≥ N1.
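The definition can be exercised numerically over a finite range; the witness constants C1 = 101, N1 = 1 below are one valid choice supplied here, not part of the slide:

```python
# A finite-range check of the Big-O definition: do the witness constants
# C1 and N1 satisfy f(n) <= C1*g(n) for all tested n >= N1?
def holds_upto(f, g, C1, N1, limit=10_000):
    return all(f(n) <= C1 * g(n) for n in range(N1, limit))

# 100n + 1 = O(n): C1 = 101, N1 = 1 works, since 100n + 1 <= 101n for n >= 1.
print(holds_upto(lambda n: 100 * n + 1, lambda n: n, C1=101, N1=1))  # True
```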

  10. Is 2^(n+1) = O(2^n)? Is 2^(n+1) ≤ c·2^n for some constant c? Yes: c = 2 works for all n. Is 2^(2n) = O(2^n)? Is 2^(2n) ≤ c·2^n? That would require 2^n·2^n ≤ c·2^n, i.e. 2^n ≤ c, and no constant c works. f(n) = 5n³ ⇒ f(n) = O(n³); g(n) = 3n² ⇒ g(n) = O(n³); but f(n) ≠ g(n).
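A numeric sketch of the two questions, using the (hedged) rule of thumb that a bounded ratio f(n)/g(n) signals f(n) = O(g(n)):

```python
# A bounded ratio f(n)/g(n) means f(n) = O(g(n)); an unbounded one means it is not.
for n in range(1, 15):
    print(n, 2 ** (n + 1) / 2 ** n, 2 ** (2 * n) / 2 ** n)

# 2^(n+1)/2^n is always exactly 2, so c = 2 witnesses 2^(n+1) = O(2^n);
# 2^(2n)/2^n = 2^n grows without bound, so no constant c works.
```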

  11. Notation O(…) is a set of functions, but it is common to abuse notation, writing T(n) = O(…) instead of T(n) ∈ O(…), as well as T(n) = f(n) + O(…).

  12. Conventions for Writing Big-Oh Expressions • Ignore the multiplicative constants: instead of writing O(3n²), we simply write O(n²); if the function is constant (e.g. O(1024)), we write O(1). • Ignore the lower-order terms: instead of writing O(n log n + n + n²), we simply write O(n²).

  13. Examples T(n) = 32n² + 17n + 32. T(n) = O(n²) and O(n³), but not O(n). NOTE: O(n²) is a tight bound; O(n³) is not tight. • n, n + 1, n + 80, 40n, and n + log n are all O(n). • n² + 10000000000n is O(n²). • 3n² + 6n + log n + 24.5 is O(n²).

  14. Properties of Big-Oh • If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then f1(n) + f2(n) = O(max(g1(n), g2(n))). • If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then f1(n)·f2(n) = O(g1(n)·g2(n)). • If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).

  15. e.g. 6 (revisited):

Horner(int a[], int n, int x) {
    result = a[n]                  // c1: O(1)
    for (i = n-1; i >= 0; --i)     // c2·(n+1): O(n)
        result = result*x + a[i]   // c3·n: O(n)
    return result                  // c4: O(1)
}

T(n) = c1 + c2·(n+1) + c3·n + c4 = a + b·n for some constants a, b, so T(n) = O(n).
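The pseudocode above can be sketched as runnable Python, assuming the coefficient list a stores a[i] as the coefficient of x^i:

```python
# Horner's rule in Python; cost annotations mirror the slide's analysis.
def horner(a, n, x):
    result = a[n]                   # O(1)
    for i in range(n - 1, -1, -1):  # loop body runs n times: O(n)
        result = result * x + a[i]  # constant work per iteration
    return result                   # O(1)

# 3x^2 + 2x + 1 at x = 2: 3*4 + 2*2 + 1 = 17
print(horner([1, 2, 3], 2, 2))  # 17
```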

  16. e.g. 8 (revisited): asymptotic analysis

fun(int x, int n) {
    sum = 0                        // O(1)
    for (i = 0; i <= n; ++i) {     // O(n)
        p = 1                      // O(n)
        for (j = 0; j < n; ++j)    // O(n²)
            p *= x                 // O(n²)
        sum += p                   // O(n)
    }
}

T(n) = O(n²)
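A runnable sketch of the same loops, with a counter on the innermost statement to make the quadratic cost visible; it assumes the outer loop runs i = 0..n inclusive, as the pseudocode suggests:

```python
# The slide's nested loops with a counter on the innermost statement.
def fun(x, n):
    count = 0                    # counts executions of the innermost statement
    total = 0
    for i in range(n + 1):       # outer loop: n + 1 passes
        p = 1
        for j in range(n):       # inner loop: n passes each time
            p *= x               # runs (n + 1) * n times in all: O(n^2)
            count += 1
        total += p
    return total, count

print(fun(2, 10))  # total = 11 * 2**10 = 11264, count = 11 * 10 = 110
```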

  17. e.g. 10 (revisited): Sequential search The worst-case time is T(n) = an + b; the best-case time is T(n) = constant. Obtain an asymptotic O bound on the solution: the worst case is O(n); the best case is O(1).
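A minimal sequential-search sketch (the function name, list, and keys are chosen here for illustration) that also reports how many comparisons were made, making the best and worst cases concrete:

```python
# Sequential search that counts comparisons: best case 1, worst case n.
def seq_search(a, key):
    steps = 0
    for i, v in enumerate(a):
        steps += 1                  # one comparison per element examined
        if v == key:
            return i, steps
    return -1, steps                # not found: all n elements examined

data = [7, 3, 9, 1, 5]
print(seq_search(data, 7))  # best case: found at index 0 after 1 step
print(seq_search(data, 5))  # worst case: found at the end after 5 steps
```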

  18. e.g. 11: find an O-notation in terms of n for the number of times the statement x = x + 1 is executed in the segment:

for (i = 1 to n)
    for (j = 1 to i)
        x = x + 1

     i      j
     1      1..1
     2      1..2
     3      1..3
     ⋮
     n      1..n

c(n) = 1 + 2 + 3 + … + n = n(n+1)/2, which is O(n²).
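The count can be verified against the closed form with a direct simulation of the loops:

```python
# Count the executions of x = x + 1 in the doubly nested loop and compare
# with the closed form n(n+1)/2 from the slide.
def count_steps(n):
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            count += 1           # stands in for x = x + 1
    return count

for n in (1, 5, 100):
    print(n, count_steps(n), n * (n + 1) // 2)
```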

  19. e.g. 12: find an O-notation in terms of n for the number of times the statement x = x + 1 is executed in the segment:

j = n
while (j >= 1) {
    for (i = 1 to j)
        x = x + 1
    j = j / 2
}

     j         i
     n         1..n
     n/2       1..n/2
     n/4       1..n/4
     ⋮
     n/2^k     1..n/2^k

c(n) = n + n/2 + n/4 + … + n/2^k = n(1 + 1/2 + 1/2² + … + 1/2^k) ≤ n · 1/(1 − 0.5) = 2n, which is O(n).
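The same halving loop can be simulated to confirm that the count stays below the 2n bound from the geometric series:

```python
# Count the executions of x = x + 1 in the halving loop; the geometric
# series n + n/2 + n/4 + ... stays below 2n, hence O(n).
def count_halving(n):
    count = 0
    j = n
    while j >= 1:
        for i in range(j):
            count += 1           # stands in for x = x + 1
        j //= 2                  # j = j / 2 with integer division
    return count

for n in (1, 16, 1000):
    print(n, count_halving(n), 2 * n)
```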

  20. e.g. 7 (revisited): How do we obtain an asymptotic O bound for recursive functions? By solving recurrence relations with repeated substitution: T(n) = a if n = 0; T(n) = T(n−1) + b if n > 0, for some constants a, b.
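Repeated substitution unrolls the recurrence as T(n) = T(n−1) + b = T(n−2) + 2b = … = T(0) + n·b = a + b·n, so T(n) = O(n). A direct recursive evaluation (with sample constants a and b chosen here) confirms the closed form:

```python
# The recurrence from the slide, evaluated directly; the closed form from
# repeated substitution is T(n) = a + b*n.
def T(n, a, b):
    if n == 0:
        return a            # base case: T(0) = a
    return T(n - 1, a, b) + b

print(T(10, a=3, b=5))  # a + b*n = 3 + 5*10 = 53
```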

  21. An Asymptotic Lower Bound: Omega Let f and g be nonnegative functions on the positive integers. We write f(n) = Ω(g(n)) and say that f(n) is of order at least g(n), or f(n) is omega of g(n), or g is an asymptotic lower bound for f, if there exist constants C2 > 0 and N2 such that f(n) ≥ C2·g(n) for all n ≥ N2.

  22. 2n + 13 ∈ O(?) O(n); also O(n²), …: we can always weaken the bound. 2n + 13 ∈ Ω(?) Ω(n); also Ω(log n), Ω(1), … Is 2^n = O(n)? Is 2^n = Ω(n)? It is Ω(n), but not O(n). Is n^(log n) = O(n⁵)? No; thus n^(log n) = Ω(n⁵).

  23. An Asymptotic Tight Bound: Theta Let f and g be nonnegative functions on the positive integers. We write f(n) = Θ(g(n)) and say that f(n) is of order g(n), or f(n) is theta of g(n), or g is an asymptotic tight bound for f, if f(n) = O(g(n)) and f(n) = Ω(g(n)).

  24. e.g. T(n) = 32n² + 17n + 32. We can ignore the multiplicative constants and the lower-order terms. T(n) = O(n²) and O(n³), but not O(n); O(n²) is a tight bound, O(n³) is not. Also T(n) = Ω(n²) and Ω(n), but not Ω(n³). Therefore T(n) = Θ(n²), not Θ(n), not Θ(n³).
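The Θ(n²) claim can be sketched by squeezing T(n) between c1·n² and c2·n²; the witnesses c1 = 32 and c2 = 81 (valid for n ≥ 1) are choices made here, not taken from the slide:

```python
# Squeeze T(n) = 32n^2 + 17n + 32 between 32*n^2 and 81*n^2 for n >= 1:
# the lower bound is immediate, and 17n + 32 <= 49n^2 holds from n = 1 on.
def T(n):
    return 32 * n * n + 17 * n + 32

print(all(32 * n * n <= T(n) <= 81 * n * n for n in range(1, 10_000)))  # True
```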

  25. Properties Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)); f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n)); f(n) = Ω(g(n)) and g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n)). Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)). Transpose Symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
