
Runtime Efficiency


jmeissner

Presentation Transcript


  1. Runtime Efficiency

  2. Efficiency (Complexity)
  • The rate at which storage or time grows as a function of the problem size.
  • Two types of efficiency: time efficiency and space efficiency.
  • There are always tradeoffs between these two efficiencies. Example: singly vs. doubly linked list (the extra back-pointers cost space but speed up some operations).

  3. Time Efficiency
  • How do we improve the time efficiency of a program?
  • The 90/10 Rule: 90% of the execution time of a program is spent executing 10% of the code.
  • So, how do we locate the critical 10%?
  • Software metrics/profiling tools
  • Global counters to locate bottlenecks (loop executions, function calls)
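The slide's "global counters" idea can be sketched in a few lines of Python. The function names (`compare`, `find_min`) and the counter keys are illustrative choices, not from the slides; the point is simply that instrumenting a function with counters reveals how often its hot spots run:

```python
# Global counters used to locate bottlenecks, as the slide suggests.
counters = {"loop_iterations": 0, "compare_calls": 0}

def compare(a, b):
    # Count every comparison so we can see how much work the search does.
    counters["compare_calls"] += 1
    return a < b

def find_min(values):
    # A linear scan; each pass through the loop bumps the iteration counter.
    best = values[0]
    for v in values[1:]:
        counters["loop_iterations"] += 1
        if compare(v, best):
            best = v
    return best

smallest = find_min([5, 3, 8, 1, 9, 2])
# counters now records 5 loop iterations and 5 comparisons for 6 inputs
```

In practice a profiler (such as Python's `cProfile`) automates this bookkeeping, but hand-placed counters like these are exactly the low-tech technique the slide names.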

  4. Time Efficiency Improvements
  • Possibilities (some better than others!):
  • Move code that does not belong inside a loop out of the loop (just good programming!)
  • Remove any unnecessary I/O operations (I/O operations are expensive time-wise)
  • Code so that the compiled code is more efficient
  • Replace an inefficient algorithm (the best solution)
  • Moral: choose the most appropriate algorithm(s) BEFORE program implementation
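The first improvement, hoisting loop-invariant work out of a loop, can be illustrated with a small sketch. The example (normalizing a list by a fixed scale factor) is hypothetical, not from the slides, but the transformation is the one the bullet describes:

```python
import math

def norm_slow(values):
    # The scale factor math.sqrt(len(values)) never changes inside the
    # loop, yet it is recomputed on every iteration.
    out = []
    for v in values:
        out.append(v / math.sqrt(len(values)))
    return out

def norm_fast(values):
    # Hoist the loop-invariant computation out of the loop: compute the
    # scale once, then use it n times.
    scale = math.sqrt(len(values))
    return [v / scale for v in values]
```

Both functions return the same result; the fast version simply avoids repeating work that cannot change between iterations.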

  5. Asymptotic Analysis (Runtime Analysis)
  • Independent of any specific hardware or software
  • Expresses the complexity of an algorithm in terms of its relationship to some known function
  • The efficiency can be expressed as "proportional to" or "on the order of" some function
  • A common notation used is called Big-Oh: T(n) = O(f(n))
  • Said: T of n is "on the order of" f(n)

  6. Example Algorithm and Corresponding Big-Oh
  • Algorithm:
  • Prompt the user for a filename and read it (100 "time units")
  • Open the file (50 "time units")
  • Read 15 data items from the file into an array (10 "time units" per read)
  • Close the file (50 "time units")
  • Formula describing algorithm efficiency: 200 + 10n, where n = number of items read (here, 15)
  • Which term of the function really describes the growth pattern as n increases? The 10n term: the constant 200 stays fixed no matter how large n gets.
  • Therefore, the Big-Oh for this algorithm is O(n).
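The slide's cost model can be written out directly; this tiny sketch just encodes the slide's made-up "time units" as constants so the 200 + 10n formula is visible:

```python
def estimated_cost(n):
    # Constants are the "time units" from the slide's example algorithm.
    prompt_and_read_name = 100
    open_file = 50
    read_items = 10 * n        # 10 units per item read
    close_file = 50
    return prompt_and_read_name + open_file + read_items + close_file

cost_15 = estimated_cost(15)   # 200 + 10*15 = 350
```

Doubling n adds 10 units per extra item while the 200-unit overhead never changes, which is why only the 10n term matters asymptotically.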

  7. Common Functions in Big-Oh (Most Efficient to Least Efficient)
  • O(1) or O(c)
  • Constant growth. The runtime does not grow at all as a function of n; it is a constant. Basically, it is any operation that does not depend on the value of n to do its job. Has the slowest growth pattern (none!).
  • Examples:
  • Accessing an element of an array containing n elements
  • Adding the first and last elements of an array containing n elements

  8. Common Functions in Big-Oh (con't)
  • O(lg(n))
  • Logarithmic growth. The runtime growth is proportional to the base-2 logarithm (lg) of n.
  • Example: binary search
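Binary search is the canonical O(lg n) example, and a minimal Python version makes the halving visible (the slides give no code, so this is one standard formulation):

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # halve the search interval each pass
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1           # target can only be in the right half
        else:
            hi = mid - 1           # target can only be in the left half
    return -1
```

Because each iteration discards half of the remaining elements, at most about lg(n) iterations are needed, hence the logarithmic growth.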

  9. Common Functions in Big-Oh (con't)
  • O(n)
  • Linear growth. Runtime grows proportional to the value of n.
  • Examples:
  • Sequential (linear) search
  • Any looping over the elements of a one-dimensional array (e.g., summing, printing)
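Both linear-growth examples from the slide fit in a few lines; these are straightforward sketches of the named operations, each making exactly one pass over the n elements:

```python
def linear_search(items, target):
    # Sequential search: examines elements one by one, O(n) in the worst case.
    for i, v in enumerate(items):
        if v == target:
            return i
    return -1

def total(items):
    # Summing a one-dimensional array: one visit per element, O(n).
    s = 0
    for v in items:
        s += v
    return s
```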

  10. Common Functions in Big-Oh (con't)
  • O(n lg(n))
  • n log n growth. Any sorting algorithm that compares elements needs at least on the order of n lg n comparisons; the best comparison sorts achieve this bound.
  • Examples:
  • Merge sort
  • Quicksort (average case)
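Merge sort shows where the n lg n comes from: lg n levels of halving, with O(n) merge work at each level. A compact recursive version (one standard formulation, since the slides give no code):

```python
def merge_sort(items):
    # Base case: lists of 0 or 1 elements are already sorted.
    if len(items) <= 1:
        return items[:]
    # Split in half: recursion depth is about lg(n).
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves: O(n) work per level.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```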

  11. Common Functions in Big-Oh (con't)
  • O(n^k)
  • Polynomial growth. Runtime grows very rapidly.
  • Examples:
  • Bubble sort (O(n^2))
  • Selection (exchange) sort (O(n^2))
  • Insertion sort (O(n^2))
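Bubble sort's nested loops are where the n^2 comes from; this sketch also returns a comparison count (an instrumentation choice of this example, not from the slides) so the quadratic cost is measurable:

```python
def bubble_sort(items):
    # O(n^2): the outer loop runs n-1 times and the inner loop shrinks
    # by one each pass, giving n(n-1)/2 comparisons in total.
    a = items[:]
    comparisons = 0
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]   # swap out-of-order pair
    return a, comparisons
```

For n = 4 this always performs 3 + 2 + 1 = 6 comparisons, and doubling n roughly quadruples the work.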

  12. Common Functions in Big-Oh (con't)
  • O(2^n)
  • Exponential growth. Runtime grows extremely rapidly as n increases. Such algorithms are essentially useless except for very small values of n.
  • Other functions you may see:
  • O(sqrt(n))
  • O(n!)
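A concrete O(2^n) computation, chosen here as an illustration (the slide names no example): enumerating every subset of a set. Each new element doubles the number of subsets, so the output itself has size 2^n and no algorithm can do better:

```python
def all_subsets(items):
    # Start with the empty subset; each element x doubles the count by
    # adding a copy of every existing subset with x appended.
    subsets = [[]]
    for x in items:
        subsets += [s + [x] for s in subsets]
    return subsets

n3 = len(all_subsets([1, 2, 3]))   # 2**3 = 8 subsets
```

Going from n = 20 to n = 21 doubles the runtime, which is why exponential algorithms are only usable for tiny n.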

  13. So, Which Algorithm Do I Choose?
  • What are my response time needs?
  • Real-time
  • "Go out to lunch" time
  • Overnight time
  • What is the maximum size of n expected now and in the future?
  • If n will always be very small, pick the algorithm that is easiest to understand, debug, maintain, etc., and still meets your response time needs.

  14. So, Which Algorithm Do I Choose? (con't)
  • How close to sorted are my values already?
  • The same algorithm can have different efficiencies depending on the order of the values (best case, average case, worst case).
  • Examples:
  • Linear search: best case O(1) (value is first in list); average case about n/2 comparisons, which is still O(n); worst case O(n) (value is last in list)
  • Quicksort: best and average cases O(n lg n); worst case O(n^2) (with a naive pivot choice, values that are already in order)
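The best/average/worst distinction for linear search can be demonstrated by counting comparisons per call; the function and variable names here are illustrative:

```python
def linear_search_count(items, target):
    # Returns (index, comparisons) so the case-dependent cost is visible.
    comparisons = 0
    for i, v in enumerate(items):
        comparisons += 1
        if v == target:
            return i, comparisons
    return -1, comparisons

data = [10, 20, 30, 40, 50]
# Best case:  target is first  -> 1 comparison
# Worst case: target is last (or absent) -> n comparisons
```

Running it on a 5-element list gives 1 comparison when the target is first and 5 when it is last or missing, matching the slide's best/worst cases.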
