Efficiency (Chapter 2)


Presentation Transcript


  1. Efficiency (Chapter 2)

  2. Efficiency of Algorithms
  • It is difficult to get a precise measure of the performance of an algorithm or program.
  • Instead, we can characterize a program by how its execution time or memory requirements increase as a function of increasing input size.
  • Big-O notation expresses this growth rate.
  • A simple way to determine the big-O of an algorithm or program is to look at the loops and to see whether the loops are nested.

  3. Counting Executions
  • Generally, a simple statement counts as “1”.
  • Sequential statements add: Statement 1; Statement 2 counts as 1 + 1 = 2.
  • Loops multiply: the count for the loop body times the number of times the loop runs.
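As a sketch (not from the slides), these rules can be applied to a simple fragment by tracking each executed statement with a counter:

```python
def count_statements(n):
    """Count executed statements for: one setup statement,
    then a loop whose body is a single statement run n times."""
    count = 0
    total = 0           # 1 statement (setup)
    count += 1
    for i in range(n):  # the loop multiplies its body's count by n
        total += i      # 1 statement per iteration
        count += 1
    return count

# Sequential statements add, loops multiply: T(n) = 1 + n.
assert count_statements(5) == 6
```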

  4. Counting Example
  • Consider two nested loops whose inner loop body contains three statements.
  • The first time through the outer loop, the inner loop is executed n − 1 times; the next time, n − 2 times; and the last time, once.
  • So we have T(n) = 3(n − 1) + 3(n − 2) + … + 3(1), or
  • T(n) = 3(n − 1 + n − 2 + … + 1)
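The slide's original code is not in the transcript; a plausible Python reconstruction of such a nested loop (an assumption, counting inner-loop iterations rather than the three body statements) is:

```python
def inner_iterations(n):
    """Count how many times the inner loop body runs in total."""
    runs = 0
    for i in range(n - 1):            # outer loop
        for j in range(i + 1, n):     # inner loop
            runs += 1                 # stands in for the 3-statement body
    return runs

# First outer pass: n-1 inner iterations; then n-2; ...; finally 1.
assert inner_iterations(5) == 4 + 3 + 2 + 1
```

Multiplying this iteration count by the three statements per iteration gives the T(n) on the slide.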

  5. Counting Example (continued)
  • We can reduce the expression in parentheses using the identity (n − 1) + (n − 2) + … + 1 = n(n − 1) / 2.
  • So, T(n) = 3 · n(n − 1) / 2 = 1.5n² − 1.5n.
  • This polynomial is zero when n is 1. For values greater than 1, 1.5n² is always greater than 1.5n² − 1.5n.
  • Therefore, we can use 1 for n₀ and 1.5 for c to conclude that T(n) is O(n²).
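As a quick sanity check (a sketch, not part of the slides), the closed form can be verified against a raw statement count of the nested loops:

```python
def raw_count(n):
    """Count the three inner-loop statements actually executed."""
    count = 0
    for i in range(n - 1):            # outer loop
        for j in range(i + 1, n):     # inner loop: n-1, n-2, ..., 1 iterations
            count += 3                # three statements per iteration
    return count

# The raw count matches T(n) = 1.5n^2 - 1.5n for every n tried.
for n in range(1, 50):
    assert raw_count(n) == 1.5 * n * n - 1.5 * n
```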

  6. Big-O Growth

  7. Importance of Big-O
  • Big-O doesn’t matter for small values of N.
  • It shows how running time grows with problem size.
  • Regardless of comparative performance for small N, the algorithm with the smaller big-O will eventually win.
  • Technically, what we usually mean when we say big-O is really big-Theta: a tight bound on growth, not just an upper bound.
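To illustrate "the smaller big-O eventually wins," here is a sketch with made-up cost constants (the factors 1 and 100 are assumptions for illustration only): an O(N log N) routine with a large constant still overtakes an O(N²) routine with a small one.

```python
import math

def quadratic_cost(n):
    """Hypothetical cost of an O(n^2) algorithm with a small constant."""
    return 1 * n * n

def linearithmic_cost(n):
    """Hypothetical cost of an O(n log n) algorithm with a large constant."""
    return 100 * n * math.log2(n)

# For small N the quadratic algorithm is cheaper...
assert quadratic_cost(10) < linearithmic_cost(10)
# ...but the smaller big-O wins once N is large enough.
assert linearithmic_cost(10_000) < quadratic_cost(10_000)
```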

  8. Some Rules
  • If you have to read or write N items of data, your program is at least O(N).
  • Search programs range from O(log N) to O(N).
  • Sort programs range from O(N log N) to O(N²).
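The O(log N) end of the search range can be seen by counting comparisons in a binary search (a sketch, not from the slides):

```python
def binary_search_comparisons(data, target):
    """Count comparisons made by binary search on sorted data: O(log N)."""
    lo, hi, count = 0, len(data) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        count += 1
        if data[mid] == target:
            return count
        elif data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return count

# Doubling N adds only about one comparison, while a sequential
# search would need up to all N comparisons.
assert binary_search_comparisons(list(range(1024)), 1023) == 11
assert binary_search_comparisons(list(range(2048)), 2047) == 12
```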

  9. Cases
  • Different data might take different times to run, even for the same input size.
  • Best case
  • Worst case
  • Average case
  • Example: consider sequential search…
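Continuing the sequential-search prompt, a sketch comparing the three cases by counting comparisons:

```python
def seq_search_comparisons(data, target):
    """Sequential search; return how many comparisons were made."""
    for count, value in enumerate(data, start=1):
        if value == target:
            return count
    return len(data)

data = [4, 9, 1, 7, 3]
# Best case: target is in the first position -> 1 comparison.
assert seq_search_comparisons(data, 4) == 1
# Worst case: target is last (or absent) -> N comparisons.
assert seq_search_comparisons(data, 3) == 5
# Average case over all positions: about N/2 comparisons.
avg = sum(seq_search_comparisons(data, x) for x in data) / len(data)
assert avg == 3.0
```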
