
Algorithmic Complexity Analysis: Time & Space Metrics

Understand time & space complexity measures for algorithmic analysis. Learn worst-case vs. average-case complexity, with examples of linear and binary search. Delve into complexity orders and problem tractability.


Presentation Transcript


  1. Module #7: Complexity of Algorithms. Rosen 5th ed., §2.3. ~21 slides, ~1 lecture. (c)2001-2002, Michael P. Frank

  2. What is complexity? • Time complexity: a metric to analyze the time efficiency of an algorithm executed on a computer. • Space complexity: a metric to analyze the memory space required to execute an algorithm.

  3. Our focus §2.3: Algorithmic Complexity • The algorithmic complexity of a computation is some measure of how difficult it is to perform the computation. • It measures some aspect of the cost of computation (in a general sense of cost). • Common complexity measures: • “Time” complexity: # of ops or steps required • “Space” complexity: # of memory bits required

  4. Complexity Depends on Input • Most algorithms have different complexities for inputs of different sizes. (E.g. searching a long list takes more time than searching a short one.) • Therefore, complexity is usually expressed as a function of input length. • This function usually gives the complexity for the worst-case input of any given length.

  5. Worst-Case Complexity vs. Average-Case Complexity • Average-case complexity analyzes the average number of operations used to solve the problem over all inputs of a given size. • What’s the average-case performance of the linear search algorithm? • Sol: If x is the i-th term of the list, it takes 2i+1 comparisons. Hence, on average, (3+5+7+…+(2n+1))/n = n+2 comparisons, which is Θ(n).
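The sum above is easy to check numerically. A minimal Python sketch of the slide's cost model (finding x at position i costs 2i+1 comparisons; the function name is ours):

```python
def average_comparisons(n):
    """Average comparisons for linear search over all n positions of x,
    using the slide's model: x at position i costs 2i + 1 comparisons."""
    total = sum(2 * i + 1 for i in range(1, n + 1))  # 3 + 5 + ... + (2n+1)
    return total / n

print(average_comparisons(10))   # 12.0, i.e. n + 2
print(average_comparisons(100))  # 102.0
```

For every n the result is exactly n + 2, confirming the Θ(n) average case.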

  6. Example 1: Max algorithm • Problem: Find the simplest form of the exact order of growth (Θ) of the worst-case time complexity (w.c.t.c.) of the max algorithm, assuming that each line of code takes some constant time every time it is executed (with possibly different times for different lines of code).

  7. Complexity analysis of max
  procedure max(a1, a2, …, an: integers)
    v := a1                        t1
    for i := 2 to n                t2
      if ai > v then v := ai       t3
    return v                       t4
  What’s an expression for the exact total worst-case time? (Not its order of growth.) The ti are the times for each execution of each line.

  8. Complexity analysis, cont.
  procedure max(a1, a2, …, an: integers)
    v := a1                        t1
    for i := 2 to n                t2
      if ai > v then v := ai       t3
    return v                       t4
  w.c.t.c.: t(n) = t1 + (n−1)(t2 + t3) + t4, where the ti are the times for each execution of each line.
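A direct Python translation of the procedure, instrumented to count how many times each line runs; a sketch, with counter names t1–t4 mirroring the slide's labels (the function name and list argument `a` are ours):

```python
def max_with_counts(a):
    """Return the maximum of a non-empty list and per-line execution counts."""
    counts = {"t1": 0, "t2": 0, "t3": 0, "t4": 0}
    v = a[0]; counts["t1"] += 1          # v := a1
    for i in range(1, len(a)):           # for i := 2 to n
        counts["t2"] += 1
        if a[i] > v:                     # comparison, possibly an assignment
            v = a[i]
        counts["t3"] += 1
    counts["t4"] += 1                    # return v
    return v, counts

m, c = max_with_counts([3, 1, 4, 1, 5])
# Lines t2 and t3 each run n - 1 times, matching
# t(n) = t1 + (n-1)(t2 + t3) + t4, which is linear in n.
```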

  9. Complexity analysis, cont. Now, what is the simplest form of the exact (Θ) order of growth of t(n)? Since t(n) = t1 + (n−1)(t2 + t3) + t4 is a linear function of n, t(n) is Θ(n).

  10. Example 2: Linear Search
  procedure linear search (x: integer, a1, a2, …, an: distinct integers)
    i := 1                            t1
    while (i ≤ n ∧ x ≠ ai)            t2
      i := i + 1                      t3
    if i ≤ n then location := i       t4
    else location := 0                t5
    return location                   t6
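The same procedure in Python, as a sketch; it keeps the slide's convention of 1-based locations and 0 meaning "not found", while Python's lists are 0-indexed internally:

```python
def linear_search(x, a):
    """Return the 1-based position of x in list a, or 0 if x is absent."""
    i = 1
    while i <= len(a) and x != a[i - 1]:  # while (i <= n and x != ai)
        i += 1
    return i if i <= len(a) else 0        # location := i, or 0

print(linear_search(7, [4, 7, 9]))  # 2
print(linear_search(8, [4, 7, 9]))  # 0
```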

  11. Linear search analysis • Worst case time complexity order: Θ(n) (x is last, or absent). • Best case: Θ(1) (x is the first element). • Average case, if the item is present: Θ(n), since about n+2 comparisons are needed on average.

  12. Review §2.3: Complexity • Algorithmic complexity = cost of computation. • We focus on time complexity (space and energy are also important). • Characterize complexity as a function of input size: worst-case, best-case, average-case. • Use orders-of-growth notation to concisely summarize the growth properties of complexity functions.

  13. Example 3: Binary Search
  procedure binary search (x: integer, a1, a2, …, an: distinct integers)
    i := 1                                    Θ(1)
    j := n
    while i < j begin                         Key question: how many loop iterations?
      m := ⌊(i+j)/2⌋
      if x > am then i := m+1 else j := m
    end
    if x = ai then location := i              Θ(1)
    else location := 0
    return location                           Θ(1)
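A Python rendering of the procedure with a loop-iteration counter, to answer the key question empirically (a sketch; 1-based locations and the 0 "not found" convention as on the slide):

```python
def binary_search(x, a):
    """Search sorted list a for x.
    Return (1-based location or 0, number of loop iterations)."""
    i, j = 1, len(a)
    iterations = 0
    while i < j:
        iterations += 1
        m = (i + j) // 2            # m := floor((i + j) / 2)
        if x > a[m - 1]:
            i = m + 1
        else:
            j = m
    location = i if a and x == a[i - 1] else 0
    return location, iterations

loc, its = binary_search(5, list(range(1, 17)))  # n = 16 = 2^4
print(loc, its)  # 5 4 -- exactly log2(16) iterations
```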

  14. Binary search analysis • Suppose n = 2^k. • The original range from i=1 to j=n contains n elements. • Each iteration: the size j−i+1 of the range is cut in half. • The loop terminates when the size of the range is 1 = 2^0 (i.e., i=j). • Therefore, the number of iterations is k = log2 n, which is Θ(log2 n) = Θ(log n). • Even for n ≠ 2^k (n not an integral power of 2), the time complexity is still Θ(log2 n) = Θ(log n).

  15. Names for some orders of growth (with c a constant) • Θ(1): Constant • Θ(log_c n): Logarithmic (same order for every base c) • Θ(log^c n): Polylogarithmic • Θ(n): Linear • Θ(n^c): Polynomial • Θ(c^n), c>1: Exponential • Θ(n!): Factorial
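To get a feel for how quickly these orders separate, a small illustrative Python table (the representative functions and sample sizes are our choices, not from the slide):

```python
import math

# One representative function per named order of growth.
growth = [
    ("constant", lambda n: 1),
    ("log n",    lambda n: math.log2(n)),
    ("linear",   lambda n: n),
    ("n^2",      lambda n: n ** 2),
    ("2^n",      lambda n: 2 ** n),
    ("n!",       lambda n: math.factorial(n)),
]

for name, f in growth:
    print(f"{name:>8}: n=10 -> {f(10):>10.0f}   n=20 -> {f(20):>20.0f}")
```

Even at n = 20 the exponential and factorial rows dwarf the polynomial ones, which is the point of the naming scheme.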

  16. Problem Complexity • The complexity of a computational problem or task is (the order of growth of) the complexity of the algorithm with the lowest order of growth of complexity for solving that problem or performing that task. • E.g. the problem of searching an ordered list has at most logarithmic time complexity. (Complexity is O(log n).)

  17. Tractable vs. intractable • A problem or algorithm with at most polynomial time complexity is considered tractable (or feasible). P is the set of all tractable problems. • A problem or algorithm with more than polynomial complexity is considered intractable (or infeasible). • Note that n^1,000,000 is technically tractable, but really impossible, while n^(log log log n) is technically intractable, but easy. Such cases are rare, though.

  18. Unsolvable problems • Turing discovered in the 1930s that there are problems unsolvable by any algorithm. • Or equivalently, there are undecidable yes/no questions and uncomputable functions. • Example: the halting problem. • Given an arbitrary algorithm and its input, will that algorithm eventually halt, or will it continue forever in an “infinite loop?”

  19. P vs. NP • NP is the set of problems for which there exists a tractable algorithm for checking solutions to see if they are correct. • We know P ⊆ NP, but the most famous unproven conjecture in computer science is that this inclusion is proper (i.e., that P ≠ NP rather than P = NP). • Whoever first proves it will be famous!

  20. Computer Time Examples • Assume time = 1 ns (10^−9 second) per op, problem size = n bits, and #ops a function of n as shown. (The slide’s table of example running times is not reproduced in this transcript.)
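The point of the slide's table can be recovered with a quick computation under its stated assumption of 1 ns per operation; the sample size n = 50 and the set of functions shown here are our choices:

```python
import math

NS_PER_OP = 1e-9  # 1 ns per operation, as assumed on the slide

def seconds(ops):
    """Wall-clock time in seconds to perform `ops` operations."""
    return ops * NS_PER_OP

n = 50
print(f"n        : {seconds(n):.2e} s")                 # 5.00e-08 s
print(f"n log2 n : {seconds(n * math.log2(n)):.2e} s")
print(f"n^2      : {seconds(n ** 2):.2e} s")
print(f"2^n      : {seconds(2 ** n):.2e} s")            # ~13 days
```

Already at n = 50 the exponential algorithm needs about 13 days where the polynomial ones finish in microseconds, which is the table's message.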

  21. Things to Know • Definitions of algorithmic complexity, time complexity, worst-case complexity; names of orders of growth of complexity. • How to analyze the worst case, best case, or average case order of growth of time complexity for simple algorithms.
