
Presentation Transcript


  1. CSE 221/ICT221 Analysis and Design of Algorithms Lecture 04: Time complexity analysis in the form of Big-Oh Dr. Surasak Mungsing E-mail: Surasak.mu@spu.ac.th

  2. Big-Oh Notation • Big-Oh was introduced in 1927 for comparing functions' growth rates, based on asymptotic behavior. • Big-Oh notation characterizes functions according to their growth rates: different functions with the same growth rate may be represented using the same O notation. • A description of a function in terms of big-Oh notation usually provides only an upper bound on the growth rate of the function. CSE221/ICT221 Analysis and Design of Algorithms

  3. Upper Bound • In general, a function f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0. e.g. if f(n) = 1000n and g(n) = n^2, then with c = 1 and n0 > 1000 we have f(n) ≤ 1·g(n) for all n ≥ n0, and we say that f(n) = O(g(n)). • The O notation indicates 'bounded above by a constant multiple of.' CSE221/ICT221 Analysis and Design of Algorithms
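A minimal C sketch, not part of the slides, that spot-checks the witness pair from the example above (f(n) = 1000n, g(n) = n^2, c = 1, and an assumed n0 = 1001). A finite scan only illustrates the definition; it cannot prove an asymptotic bound, and the helper name check_upper_bound and the tested range are chosen here purely for illustration.

    #include <stdio.h>

    /* Spot-check f(n) <= c*g(n) for n0 <= n <= limit.
     * Illustrative only: a finite scan cannot prove an asymptotic bound. */
    static int check_upper_bound(double (*f)(double), double (*g)(double),
                                 double c, long n0, long limit) {
        for (long n = n0; n <= limit; n++)
            if (f((double)n) > c * g((double)n))
                return 0;        /* counterexample found in the tested range */
        return 1;                /* bound held on every tested n */
    }

    static double f(double n) { return 1000.0 * n; }   /* f(n) = 1000n */
    static double g(double n) { return n * n; }        /* g(n) = n^2   */

    int main(void) {
        /* Witness from the slide: c = 1, n0 > 1000 (take n0 = 1001). */
        printf("f(n) <= 1*g(n) for 1001 <= n <= 100000? %s\n",
               check_upper_bound(f, g, 1.0, 1001, 100000) ? "yes" : "no");
        return 0;
    }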

  4. CSE221/NMT221/ICT221 Analysis and Design of Algorithms Big-Oh, the Asymptotic Upper Bound • Because big-O notation discards multiplicative constants on the running time and ignores efficiency for small input sizes, it does not always reveal the fastest algorithm in practice or on practically sized data sets, but the approach is still very effective for comparing the scalability of algorithms as input sizes become large. • If an algorithm's time complexity is of the order O(n^2), then its computation time grows no faster than a quadratic function once the input is large enough. • Some upper bounds may be too broad, for example saying that 2n^2 = O(n^3); this holds by the definition with c = 1 and n0 = 2 (since 2n^2 ≤ n^3 for n ≥ 2), but it is tighter to say that 2n^2 = O(n^2).

  5. Example 1 For all n > 6, g(n) > 1·f(n), so f(n) is in big-O of g(n); that is, f(n) is in O(g(n)). CSE221/ICT221 Analysis and Design of Algorithms

  6. Example 2 There exists n0 such that for all n > n0, f(n) < 1·g(n); therefore f(n) is in O(g(n)). CSE221/ICT221 Analysis and Design of Algorithms

  7. Example 3 There exist n0 = 5 and c = 3.5 such that for all n > n0, f(n) < c·h(n); therefore f(n) is in O(h(n)). CSE221/ICT221 Analysis and Design of Algorithms

  8. CSE221/ICT221 Analysis and Design of Algorithms

  9. Exercise on O-notation • Show that f(n) = 3n^2 + 2n + 5 is in O(n^2). 10n^2 = 3n^2 + 2n^2 + 5n^2 ≥ 3n^2 + 2n + 5 = f(n) for n ≥ 1, so f(n) ≤ 10n^2. Taking c = 10, n0 = 1, and g(n) = n^2, we have f(n) ≤ c·g(n) for all n ≥ n0, so f(n) is in O(n^2). CSE221/ICT221 Analysis and Design of Algorithms
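A small companion sketch, not part of the slide, that spot-checks the inequality 3n^2 + 2n + 5 ≤ 10n^2 used in the exercise over an assumed test range (c = 10, n0 = 1); as before, this only illustrates the bound rather than proving it.

    #include <stdio.h>

    int main(void) {
        /* Check 3n^2 + 2n + 5 <= 10n^2 for 1 <= n <= 10000 (c = 10, n0 = 1). */
        int ok = 1;
        for (long n = 1; n <= 10000; n++) {
            long fn = 3*n*n + 2*n + 5;   /* f(n)                  */
            long cg = 10*n*n;            /* c * g(n), g(n) = n^2  */
            if (fn > cg) { ok = 0; break; }
        }
        printf("bound held on the tested range: %s\n", ok ? "yes" : "no");
        return 0;
    }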

  10. Usage of Big-Oh • We should always write Big-Oh in its simplest form, e.g. 3n^2 + 2n + 5 = O(n^2). It is not wrong to write the function in the forms below, but the most appropriate form is the simplest one: • 3n^2 + 2n + 5 = O(3n^2 + 2n + 5) • 3n^2 + 2n + 5 = O(n^2 + n) • 3n^2 + 2n + 5 = O(3n^2) CSE221/ICT221 Analysis and Design of Algorithms

  11. Exercise on O-notation • f1(n) = 10n + 25n^2 → O(n^2) • f2(n) = 20 n log n + 5n → O(n log n) • f3(n) = 12 n log n + 0.05 n^2 → O(n^2) • f4(n) = n^(1/2) + 3 n log n → O(n log n) CSE221/ICT221 Analysis and Design of Algorithms

  12. Classification of Functions: Big-Oh • A function f(n) is said to be of at most logarithmic growth if f(n) = O(log n) • A function f(n) is said to be of at most quadratic growth if f(n) = O(n^2) • A function f(n) is said to be of at most polynomial growth if f(n) = O(n^k), for some natural number k > 1 • A function f(n) is said to be of at most exponential growth if there is a constant c > 1 such that f(n) = O(c^n) • A function f(n) is said to be of at most factorial growth if f(n) = O(n!). CSE221/ICT221 Analysis and Design of Algorithms

  13. Classification of Functions: Big-Oh (cont.) • A function f(n) is said to have constant running time if the size of the input n has no effect on the running time of the algorithm (e.g., assignment of a value to a variable). The equation for this algorithm is f(n) = c. • Other classifications involving logarithms: f(n) = O(n log n), f(n) = O(log log n) CSE221/ICT221 Analysis and Design of Algorithms
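A tiny illustrative C sketch, not from the slides: the two operations below touch a fixed number of memory cells no matter how large n is, so each runs in constant time, f(n) = c.

    #include <stdio.h>

    int main(void) {
        int a[1000] = {0};
        int n = 1000;        /* input size; it does not affect the cost below */
        int x = a[n / 2];    /* array indexing: one read,  O(1) */
        a[0] = x + 1;        /* assignment:     one write, O(1) */
        printf("%d\n", a[0]);
        return 0;
    }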

  14. Big-O Fact A polynomial of degree k is O(n^k). Proof: If f(n) = b_k n^k + b_(k-1) n^(k-1) + … + b_1 n + b_0, let a_i = |b_i|; then f(n) ≤ a_k n^k + a_(k-1) n^(k-1) + … + a_1 n + a_0. CSE221/ICT221 Analysis and Design of Algorithms
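The slide stops before the last step; written out in LaTeX with the notation above (a_i = |b_i|), the bound completes as follows:

    \[
      f(n) \le a_k n^k + a_{k-1} n^{k-1} + \cdots + a_1 n + a_0
           \le \bigl(a_k + a_{k-1} + \cdots + a_1 + a_0\bigr)\, n^k
           \qquad \text{for } n \ge 1,
    \]

so taking c = a_k + … + a_0 and n0 = 1 in the definition of Big-Oh gives f(n) = O(n^k).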

  15. Some Rules Transitivity: f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n)) Addition: f(n) + g(n) = O(max{f(n), g(n)}) Polynomials: a_0 + a_1 n + … + a_d n^d = O(n^d) Hierarchy of functions: n + log n = O(n); 2^n + n^3 = O(2^n) CSE221/ICT221 Analysis and Design of Algorithms

  16. Some Rules Base of logs ignored: log_a n = O(log_b n) Power inside logs ignored: log(n^2) = O(log n) Bases and powers in exponents are not ignored: 3^n is not O(2^n); a^(n^2) is not O(a^n) CSE221/ICT221 Analysis and Design of Algorithms
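Why the base of a logarithm can be ignored: the change-of-base identity (a standard fact, not spelled out on the slide) shows that log_a n is just a constant multiple of log_b n:

    \[
      \log_a n \;=\; \frac{\log_b n}{\log_b a}
               \;=\; \underbrace{\frac{1}{\log_b a}}_{\text{constant}} \cdot \log_b n
      \quad\Longrightarrow\quad \log_a n = O(\log_b n).
    \]

Likewise log(n^2) = 2 log n, a constant multiple of log n, whereas 3^n / 2^n = (3/2)^n grows without bound, which is why bases of exponentials cannot be ignored.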

  17. Big-Oh Complexity • O(1): the cost of applying the algorithm can be bounded independently of the value of n. This is called constant complexity. • O(log n): the cost of applying the algorithm to problems of sufficiently large size n can be bounded by a function of the form k log n, where k is a fixed constant. This is called logarithmic complexity. • O(n): linear complexity • O(n log n): n log n complexity • O(n^2): quadratic complexity CSE221/ICT221 Analysis and Design of Algorithms
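As a concrete instance of logarithmic complexity (not from the slides), a standard binary search halves the remaining range on every comparison, so it needs at most about log2(n) + 1 iterations; the array contents below are assumed for the demo.

    #include <stdio.h>

    /* Return the index of key in the sorted array a[0..n-1], or -1 if absent.
     * Each iteration halves the search interval, so the loop runs O(log n) times. */
    static int binary_search(const int *a, int n, int key) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   /* written this way to avoid overflow */
            if (a[mid] == key)      return mid;
            else if (a[mid] < key)  lo = mid + 1;
            else                    hi = mid - 1;
        }
        return -1;
    }

    int main(void) {
        int a[] = {1, 3, 5, 7, 9, 11, 13};
        printf("%d %d\n", binary_search(a, 7, 9), binary_search(a, 7, 4));  /* 4 -1 */
        return 0;
    }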

  18. Big-Oh Complexity (cont.) • O(n^3): cubic complexity • O(n^4): quartic (polynomial) complexity • O(n^32): polynomial complexity • O(c^n): if the constant c > 1, this is called exponential complexity • O(2^n): exponential complexity • O(e^n): exponential complexity • O(n!): factorial complexity CSE221/ICT221 Analysis and Design of Algorithms

  19. Practical Complexity (chart of running times, plotted for t < 500) CSE221/ICT221 Analysis and Design of Algorithms

  20. Practical Complexity (chart of running times, plotted for t < 5000) CSE221/ICT221 Analysis and Design of Algorithms

  21. Practical Complexity CSE221/ICT221 Analysis and Design of Algorithms

  22. Things to Remember in Analysis • Constants and low-order terms are ignored: if f(n) = 2n^2 then f(n) = O(n^2). • Running time and memory are the important resources for an algorithm, and very large inputs affect algorithm performance the most. • The parameter N normally means the size of the input; N may refer to the degree of a polynomial, the size of an input file for data sorting, or the number of nodes in a graph. CSE221/ICT221 Analysis and Design of Algorithms

  23. Things to Remember in Analysis • Worst-case analysis considers the worst-case execution time, which is of particular concern when it is important to know how much time might be needed in the worst case to guarantee that the algorithm will always finish on time. • Average-case (and also worst-case) performance is the most commonly used in algorithm analysis; it applies probabilistic techniques, especially expected value, to determine the expected running time on typical input data. CSE221/ICT221 Analysis and Design of Algorithms

  24. General Rules for Analysis (1) 1. Consecutive statements (Block #1 taking time t1, followed by Block #2 taking time t2) • Count only the consecutive block of statements that requires the most time • Count only the consecutive loop that requires the most time: t1 + t2 = O(max(t1, t2)) CSE221/ICT221 Analysis and Design of Algorithms
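A hedged C sketch of the consecutive-statements rule (not from the slides; the function names and the value of n are chosen for illustration): an O(n) block followed by an O(n^2) block costs t1 + t2 = O(max(t1, t2)) = O(n^2).

    #include <stdio.h>

    /* Block #1: single loop, time t1 = O(n). */
    static long long block1(int n) {
        long long sum = 0;
        for (int i = 0; i < n; i++) sum += i;
        return sum;
    }

    /* Block #2: doubly nested loop, time t2 = O(n^2). */
    static long long block2(int n) {
        long long sum = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) sum += (long long)i * j;
        return sum;
    }

    int main(void) {
        int n = 1000;
        /* Consecutive blocks: total time t1 + t2 = O(max(t1, t2)) = O(n^2). */
        printf("%lld %lld\n", block1(n), block2(n));
        return 0;
    }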

  25. General Rules for Analysis (2) 2. If/Else: for "if cond then S1 else S2" (Block #1 = S1 taking time t1, Block #2 = S2 taking time t2), the running time is at most the time of the test plus max(t1, t2). CSE221/ICT221 Analysis and Design of Algorithms
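A short illustrative C sketch of the if/else rule (not from the slides; the function and data are assumed): the cost is the test plus the larger branch, here O(1) + max(O(1), O(n)) = O(n) in the worst case.

    #include <stdio.h>

    /* if/else: cost = test + max(t1, t2).  Here S1 is O(1) and S2 is O(n). */
    static long branch_example(const int *a, int n, int take_fast_path) {
        if (take_fast_path) {            /* test: O(1)                */
            return a[0];                 /* S1:   O(1)                */
        } else {
            long sum = 0;                /* S2:   O(n) (scans array)  */
            for (int i = 0; i < n; i++) sum += a[i];
            return sum;
        }
    }                                    /* worst case overall: O(n)  */

    int main(void) {
        int a[5] = {1, 2, 3, 4, 5};
        printf("%ld %ld\n", branch_example(a, 5, 1), branch_example(a, 5, 0));
        return 0;
    }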

  26. General Rules for Analysis (3) 3. For loops • The running time of a for-loop is at most the running time of the statements inside the for-loop times the number of iterations. for (i = sum = 0; i < n; i++) sum += a[i]; • The loop iterates n times and executes 2 assignment statements on each iteration ⇒ asymptotic complexity of O(n). CSE221/ICT221 Analysis and Design of Algorithms
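A self-contained version of the slide's loop, with assumed declarations and array contents added so it compiles; the comment records the cost per iteration.

    #include <stdio.h>

    int main(void) {
        int a[] = {3, 1, 4, 1, 5, 9, 2, 6};
        int n = 8;                       /* assumed input size */
        int i, sum;
        /* The body runs n times and does O(1) work per iteration
           (one addition to sum, one increment of i), so the loop is O(n). */
        for (i = sum = 0; i < n; i++)
            sum += a[i];
        printf("sum = %d\n", sum);
        return 0;
    }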

  27. General Rules for Analysis (4) 4. Nested for-loops Analyze inside out: the total time is the product of the running times of the loops. for (i = 0; i < n; i++) for (j = 0, sum = a[0]; j <= i; j++) sum += a[j]; printf("sum for subarray 0 through %d is %d\n", i, sum); Here the inner loop runs i + 1 times for each i, a total of 1 + 2 + … + n = n(n+1)/2 iterations, so the complexity is O(n^2). CSE221/ICT221 Analysis and Design of Algorithms
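A compilable completion of the slide's fragment (array contents and size are assumed; the sum is started at 0 because the slide's initialization sum = a[0] would count a[0] twice):

    #include <stdio.h>

    int main(void) {
        int a[] = {2, 4, 6, 8, 10};
        int n = 5;                                   /* assumed input size */
        for (int i = 0; i < n; i++) {
            int sum, j;
            /* Inner loop executes i + 1 times (j = 0..i); summed over all i
               this is 1 + 2 + ... + n = n(n+1)/2 iterations, hence O(n^2). */
            for (j = 0, sum = 0; j <= i; j++)
                sum += a[j];
            printf("sum for subarray 0 through %d is %d\n", i, sum);
        }
        return 0;
    }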

  28. General Rules for Analysis CSE221/ICT221 Analysis and Design of Algorithms

  29. General Rules for Analysis • Analysis strategy: • analyze from inside out • analyze function calls first CSE221/ICT221 Analysis and Design of Algorithms

  30. CSE221/ICT221 Analysis and Design of Algorithms
