
Time Complexity



Presentation Transcript


  1. Time Complexity Dr. Jicheng Fu Department of Computer Science University of Central Oklahoma

  2. Objectives (Section 7.6)
  • The concepts of space complexity and time complexity
  • Use the step count to derive a function of the time complexity of a program
  • Asymptotics and orders of magnitude
  • The big-O and related notations
  • Time complexity of recursive algorithms

  3. Motivation
  [Figure not reproduced in the transcript: a sorted array arr of n elements]

  4. Evaluate an Algorithm
  • Two important measures to evaluate an algorithm:
    • Space complexity
    • Time complexity
  • Space complexity:
    • The maximum storage space needed for an algorithm
    • Expressed as a function of the problem size
    • Relatively easy to evaluate

  5. Time Complexity
  • Determining the number of steps (operations) needed as a function of the problem size
  • Our focus

  6. Step Count
  • Count the exact number of steps needed for an algorithm as a function of the problem size
  • Each atomic operation is counted as one step:
    • Arithmetic operations
    • Comparison operations
    • Other operations, such as assignment and return

  7. Algorithm 1

    int count_1(int n)
    {
        int sum = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = i; j <= n; j++) {
                sum++;
            }
        }
        return sum;
    }

  The statement sum++ executes n + (n - 1) + ⋯ + 1 = n(n+1)/2 times, so the total step count grows quadratically in n. Note: 1 + 2 + ⋯ + n = n(n+1)/2.

  8. Algorithm 2

    int count_2(int n)
    {
        int sum = 0;
        for (int i = 1; i <= n; i++) {
            sum += n + 1 - i;
        }
        return sum;
    }

  The loop body executes n times, so the total step count grows linearly in n.

  9. Algorithm 3

    int count_3(int n)
    {
        int sum = n * (n + 1) / 2;
        return sum;
    }

  The running time is 5 time units: 4 steps for the first statement (one addition, one multiplication, one division, one assignment) and 1 step for the return. It is constant, independent of n.
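
  All three algorithms return the same value, n(n+1)/2, at very different costs. A minimal test driver, my own sketch rather than part of the slides, that checks they agree:

    #include <cassert>
    #include <iostream>

    int count_1(int n) {            // quadratic: sum++ runs n(n+1)/2 times
        int sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = i; j <= n; j++)
                sum++;
        return sum;
    }

    int count_2(int n) {            // linear: n additions
        int sum = 0;
        for (int i = 1; i <= n; i++)
            sum += n + 1 - i;
        return sum;
    }

    int count_3(int n) {            // constant: one closed-form evaluation
        return n * (n + 1) / 2;
    }

    int main() {
        for (int n = 0; n <= 100; n++) {
            assert(count_1(n) == count_2(n));
            assert(count_2(n) == count_3(n));
        }
        std::cout << "count_1, count_2, count_3 agree for n = 0..100\n";
    }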

  10. Asymptotics
  • An exact step count is usually unnecessary:
    • It is too dependent on programming languages and the programmer's style
    • Such details make little difference in whether the algorithm is feasible or not
  • A change in fundamental method, however, can make a vital difference:
    • If the number of operations is proportional to n, then doubling n will double the running time
    • If the number of operations is proportional to 2^n, then doubling n will square the number of operations, since 2^(2n) = (2^n)^2

  11. Example
  • Assume that a computation that takes 1 second may involve 10^6 operations
  • Also assume that doubling the problem size squares the number of operations, so that 10^12 operations are required
  • This increases the running time from 1 second to about 11.5 days:
    10^12 operations / 10^6 operations per second = 10^6 seconds ≈ 11.5 days

  12. Instead of an exact step count, we want a notation that:
  • accurately reflects the increase of computation time with the problem size, but
  • ignores details that have little effect on the total
  • Asymptotics: the study of functions of a parameter n, as n becomes larger and larger without bound

  13. Orders of Magnitude
  • The idea:
    • Suppose function f(n) measures the amount of work done by an algorithm on a problem of size n
    • Compare f(n), for large values of n, with some well-known function g(n) whose behavior we already understand
  • To compare f(n) against g(n):
    • take the quotient f(n) / g(n), and
    • take the limit of the quotient as n increases without bound

  14. Definition
  • If lim_{n→∞} f(n)/g(n) = 0, then f(n) has strictly smaller order of magnitude than g(n).
  • If lim_{n→∞} f(n)/g(n) is finite and nonzero, then f(n) has the same order of magnitude as g(n).
  • If lim_{n→∞} f(n)/g(n) = ∞, then f(n) has strictly greater order of magnitude than g(n).

  15. Common choices for g(n):
  • g(n) = 1        Constant function
  • g(n) = log n    Logarithmic function
  • g(n) = n        Linear function
  • g(n) = n^2      Quadratic function
  • g(n) = n^3      Cubic function
  • g(n) = 2^n      Exponential function

  16. Notes
  • The second case, when f(n) and g(n) have the same order of magnitude, includes all values of the limit except 0 and ∞
  • Changing the running time of an algorithm by any nonzero constant factor will not affect its order of magnitude

  17. Polynomials
  • If f(n) is a polynomial in n with degree r, then f(n) has the same order of magnitude as n^r
  • If r < s, then n^r has strictly smaller order of magnitude than n^s
  • Example 1: 3n^2 - 100n - 25 has strictly smaller order than n^3

  18.
  • Example 2: 3n^2 - 100n - 25 has strictly greater order than n
  • Example 3: 3n^2 - 100n - 25 has the same order as n^2
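
  Worked out with the limit definition from slide 14 (the quotients themselves are not shown in the transcript), the three examples are, in LaTeX notation:

    \lim_{n\to\infty}\frac{3n^2-100n-25}{n^3}=0,\qquad
    \lim_{n\to\infty}\frac{3n^2-100n-25}{n}=\infty,\qquad
    \lim_{n\to\infty}\frac{3n^2-100n-25}{n^2}=3.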

  19. Logarithms
  • The order of magnitude of a logarithm does not depend on the base of the logarithm
  • Let log_a n and log_b n be logarithms to two different bases a > 1 and b > 1
  • Since log_a n = (log_a b)(log_b n), the quotient log_a n / log_b n = log_a b is a finite nonzero constant, so the two logarithms have the same order of magnitude
  • Since the base for logarithms makes no difference to the order of magnitude, we generally just write log without a base

  20. Compare the order of magnitude of a logarithm log n with a power of n, say n^r (r > 0)
  • It is difficult to calculate the quotient log n / n^r directly; we need a mathematical tool
  • L'Hôpital's Rule. Suppose that:
    • f(n) and g(n) are differentiable functions for all sufficiently large n, with derivatives f'(n) and g'(n), respectively
    • lim_{n→∞} f(n) = ∞ and lim_{n→∞} g(n) = ∞
    • lim_{n→∞} f'(n)/g'(n) exists
  • Then lim_{n→∞} f(n)/g(n) exists and
    lim_{n→∞} f(n)/g(n) = lim_{n→∞} f'(n)/g'(n)

  21. Use L'Hôpital's Rule
  • With f(n) = log n (natural logarithm; the base does not affect the order of magnitude) and g(n) = n^r:
    lim_{n→∞} (log n)/n^r = lim_{n→∞} (1/n)/(r n^(r-1)) = lim_{n→∞} 1/(r n^r) = 0
  • Conclusion: log n has strictly smaller order of magnitude than any positive power n^r of n, r > 0.

  22. Exponential Functions
  • Compare the order of magnitude of an exponential function a^n with a power of n, say n^r (r > 0)
  • Use L'Hôpital's Rule again, applied repeatedly (pp. 308)
  • Conclusion: any exponential function a^n, for any real number a > 1, has strictly greater order of magnitude than any power n^r of n, for any positive integer r

  23. Compare the order of magnitude of two exponential functions with different bases, a^n and b^n
  • Assume 0 ≤ a < b; then a^n / b^n = (a/b)^n → 0 as n → ∞, since 0 ≤ a/b < 1
  • Conclusion: if 0 ≤ a < b, then a^n has strictly smaller order of magnitude than b^n

  24. Common Orders
  • For most algorithm analyses, only a short list of functions is needed:
    • 1 (constant), log n (logarithmic), n (linear), n^2 (quadratic), n^3 (cubic), 2^n (exponential)
    • They are in strictly increasing order of magnitude
  • One more important function: n log n (see pp. 309)
    • The order of some advanced sorting algorithms
    • n log n has strictly greater order of magnitude than n
    • n log n has strictly smaller order of magnitude than any power n^r for any r > 1

  25. Growth Rate of Common Functions
  [Chart not reproduced in the transcript]
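
  Since the chart itself is lost, a short sketch of my own that tabulates the common functions for small n conveys the same picture:

    #include <cmath>
    #include <cstdio>

    int main() {
        // Print each common function's value as n doubles.
        std::printf("%6s %8s %9s %10s %12s %16s\n",
                    "n", "log n", "n log n", "n^2", "n^3", "2^n");
        for (int n = 1; n <= 32; n *= 2) {
            std::printf("%6d %8.1f %9.1f %10d %12d %16.0f\n",
                        n, std::log2((double)n), n * std::log2((double)n),
                        n * n, n * n * n, std::pow(2.0, n));
        }
    }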

  26. The Big-O and Related Notations
  • f(n) is o(g(n)) if f(n) has strictly smaller order of magnitude than g(n)
  • f(n) is O(g(n)) if f(n) has order of magnitude no greater than that of g(n)
  • f(n) is Θ(g(n)) if f(n) has the same order of magnitude as g(n)
  • f(n) is Ω(g(n)) if f(n) has order of magnitude no smaller than that of g(n)
  • These notations are pronounced "little oh", "Big Oh", "Big Theta", and "Big Omega", respectively.

  27. Examples
  • On a list of length n, sequential search has running time Θ(n)
  • On an ordered list of length n, binary search has running time Θ(log n)
  • Retrieval from a contiguous list of length n has running time O(1)
  • Retrieval from a linked list of length n has running time O(n)
  • Any algorithm that uses comparisons of keys to search a list of length n must make Ω(log n) comparisons of keys

  28.
  • If f(n) is a polynomial in n of degree r, then f(n) is Θ(n^r)
  • If r < s, then n^r is o(n^s)
  • If a > 1 and b > 1, then log_a(n) is Θ(log_b(n))
  • log n is o(n^r) for any r > 0
  • For any real number a > 1 and any positive integer r, n^r is o(a^n)
  • If 0 ≤ a < b, then a^n is o(b^n)

  29. Algorithm 4

    int count_0(int n)
    {
        int sum = 0;                        // O(1)
        for (int i = 1; i <= n; i++) {      // O(n) iterations
            for (int j = 1; j <= n; j++) {  // O(n^2) iterations in total
                if (i <= j)                 // O(n^2) comparisons
                    sum++;                  // at most O(n^2) increments
            }
        }
        return sum;                         // O(1)
    }

  The running time is O(n^2).

  30. Summary of Running Times
  [Table not reproduced in the transcript]

  31. Asymptotic Running Times
  [Table not reproduced in the transcript]

  32. More Examples

  1) int x = 0;
     for (int i = 0; i < 100; i++)
         x += i;

  2) int x = 0;
     for (int i = 0; i < n * n; i++)
         x += i;

  * Assume that the value of n is the size of the problem
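  The slides leave the answers for discussion; a plausible reading: snippet 1 always runs exactly 100 iterations regardless of n, so it is O(1); snippet 2 runs n^2 iterations, so it is O(n^2).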

  33.
  3) int x = 0;
     for (int i = 1; i < n; i *= 2)
         x += i;

  4) int x = 0;
     for (int i = 1; i < n; i++)
         for (int j = 1; j < i; j++)
             x += i + j;
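  Again a plausible reading: in snippet 3, i doubles on every pass, so the loop runs about log_2 n times, giving O(log n); in snippet 4, the inner loop runs 0 + 1 + ⋯ + (n - 2) ≈ n^2/2 times in total, giving O(n^2).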

  34.
  5) int x = 0;
     for (int i = 1; i < n; i++)
         for (int j = i; j < 100; j++)
             x += i + j;

  6) int x = 0;
     for (int i = 1; i < n; i++)
         for (int j = n; j > i; j /= 3)
             x += i + j;

  7) int x = 0;
     for (int i = 1; i < n * n; i++)
         for (int j = 1; j < i; j++)
             x += i + j;
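  A plausible reading of the last three: in snippet 5 the inner loop runs at most 100 times, and not at all once i ≥ 100, so the total is O(n); in snippet 6 the inner loop runs about log_3(n/i) times, so the simple per-iteration bound log_3 n gives O(n log n) overall (a tighter summation over i actually gives O(n)); in snippet 7 the outer bound is n^2 and the inner loop grows with i, giving roughly (n^2)^2 / 2 iterations in total, O(n^4).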

  35. Review: Arithmetic Sequences/Progressions
  • An arithmetic sequence is a sequence of numbers such that the difference of any two successive members of the sequence is a constant
  • If the first term of an arithmetic sequence is a_1 and the common difference of successive members is d, then the nth term a_n of the sequence is:
    a_n = a_1 + (n - 1)d

  36. Analyzing Recursive Algorithms
  • Often a recurrence equation is used as the starting point to analyze a recursive algorithm
  • In the recurrence equation, T(n) denotes the running time of the recursive algorithm for an input of size n
  • We try to convert the recurrence equation into a closed-form equation to better understand the time complexity
    • Closed form: no reference to T(n) on the right side of the equation
    • Conversion to the closed-form solution can be very challenging

  37. Example: Factorial

    int factorial(int n)
    /* Pre:  n is an integer no less than 0
       Post: The factorial of n (n!) is returned
       Uses: The function factorial recursively */
    {
        if (n == 0)
            return 1;
        else
            return n * factorial(n - 1);
    }

  38. The time complexity of factorial(n) is given by the recurrence:
    T(0) = 2 (one comparison, one return)
    T(n) = T(n - 1) + 4 for n > 0
  • The 4 steps per call are presumably one subtraction, one multiplication, and one return, plus the comparison (3 + 1: the comparison is included)
  • T(n) is an arithmetic sequence with common difference 4 of successive members and T(0) = 2, so T(n) = 4n + 2
  • The time complexity of factorial is O(n)

  39. Recurrence Equations Examples
  • Divide and conquer: recursive merge sorting

    template <class Record>
    void Sortable_list<Record>::recursive_merge_sort(int low, int high)
    /* Post: The entries of the sortable list between index low and high
             have been rearranged so that their keys are sorted into
             non-decreasing order.
       Uses: The contiguous List */
    {
        if (high > low) {
            recursive_merge_sort(low, (high + low) / 2);
            recursive_merge_sort((high + low) / 2 + 1, high);
            merge(low, high);
        }
    }
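
  The analysis on the next slide charges O(n) work to merge, which the transcript does not show; a minimal free-function sketch of the same idea, assuming a plain array of int rather than the textbook's Sortable_list, looks like this:

    #include <vector>

    // Merge the sorted halves a[low..mid] and a[mid+1..high] into
    // sorted order via a temporary buffer: O(high - low) work.
    void merge(std::vector<int> &a, int low, int high)
    {
        int mid = (low + high) / 2;
        std::vector<int> tmp;
        tmp.reserve(high - low + 1);
        int i = low, j = mid + 1;
        while (i <= mid && j <= high)
            tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
        while (i <= mid)  tmp.push_back(a[i++]);   // drain left half
        while (j <= high) tmp.push_back(a[j++]);   // drain right half
        for (int k = 0; k < (int)tmp.size(); k++)
            a[low + k] = tmp[k];                   // copy back in place
    }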

  40. The time complexity of recursive_merge_sort is:
    T(n) = 2T(n/2) + cn (two recursive calls on halves, plus an O(n) merge)
  • To obtain a closed-form equation for T(n), we assume n is a power of 2 and expand:
    T(n) = 2T(n/2) + cn = 4T(n/4) + 2cn = ⋯ = 2^i T(n/2^i) + icn
  • When i = log_2 n, we have:
    T(n) = nT(1) + cn log_2 n
  • The time complexity is O(n log n)

  41. Fibonacci Numbers

    int fibonacci(int n)
    /* fibonacci: recursive version */
    {
        if (n <= 0)
            return 0;
        else if (n == 1)
            return 1;
        else
            return fibonacci(n - 1) + fibonacci(n - 2);
    }

  42. The time complexity of fibonacci is:
    T(n) = T(n - 1) + T(n - 2) + c for n ≥ 2
  • Theorem (in Section A.4): If F(n) is defined by a Fibonacci sequence, then F(n) is Θ(g^n), where
    g = (1 + √5)/2 ≈ 1.618 (the golden ratio)
  • The time complexity is exponential: O(g^n)
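
  To see this growth concretely, here is a small sketch of my own, not from the slides, that counts how many calls the recursive version makes; the count itself grows like g^n:

    #include <cstdio>

    static long calls = 0;  // number of invocations of fibonacci()

    int fibonacci(int n)
    {
        calls++;
        if (n <= 0) return 0;
        else if (n == 1) return 1;
        else return fibonacci(n - 1) + fibonacci(n - 2);
    }

    int main()
    {
        // Each +5 step in n multiplies the call count by roughly g^5 ≈ 11.
        for (int n = 5; n <= 30; n += 5) {
            calls = 0;
            int f = fibonacci(n);
            std::printf("fibonacci(%2d) = %8d, calls = %10ld\n", n, f, calls);
        }
    }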
