Recursion Algorithm Analysis: Standard Algorithms, Chapter 7
Recursion • Consider things that reference themselves • Cat in the hat in the hat in the hat … • A picture of a picture • Having a dream in your dream!! • Recursion has its base in mathematical induction • Recursion always has • an anchor (or base or trivial) case • an inductive case
Recursion • A recursive function calls or references itself. • Consider: int R(int x) { return 1 + R(x); } • What is wrong with this picture? • Nothing stops the repeated recursion • Like an endless loop, but it will eventually exhaust the call stack and crash the program • The problem is that this function has no anchor.
Recursion A proper recursive function will have • An anchor or base case • the function’s value is defined for one or more values of the parameters • An inductive or recursive step • the function’s value (or action) for the current parameter values is defined in terms of … • previously defined function values (or actions) and/or parameter values.
Recursive Example int Factorial(int n) { if (n == 0) return 1; else return n * Factorial(n - 1); } • Which part is the anchor? • Which is the inductive or recursive part? • How does the anchor keep it from running forever?
A Bad Use of Recursion • Fibonacci numbers: 1, 1, 2, 3, 5, 8, 13, 21, 34, … with f1 = 1, f2 = 1, and fn = fn-2 + fn-1 • A recursive function: double Fib(unsigned n) { if (n <= 2) return 1; else return Fib(n - 1) + Fib(n - 2); } • Why is this inefficient? • Note the recursion tree on pg 327
Uses of Recursion • Easily understood recursive functions are not always the most efficient algorithms • "Tail recursive" functions • The recursive invocation is the last statement the function executes • These are much more efficiently written with a loop • Elegant recursive algorithms • Binary search (see pg 328) • Palindrome checker (pg 330) • Towers of Hanoi solution (pg 336) • Parsing expressions (pg 338)
Comments on Recursion • Many iterative tasks can be written recursively, but the result is often inefficient • However, many problems have good recursive solutions • and their iterative solutions are • not obvious • difficult to develop
Algorithm Efficiency • How do we measure efficiency? • Space utilization: amount of memory required • Time required to accomplish the task • Time efficiency depends on: • size of input • speed of machine • quality of source code • quality of compiler • These vary from one platform to another
Algorithm Efficiency • We can count the number of times instructions are executed • This gives us a measure of efficiency of an algorithm • So we measure computing time as:T(n) = computing time of an algorithm for input of size n = number of times the instructions are executed
Example: Calculating the Mean Task # times executed • Initialize the sum to 0 1 • Initialize index i to 0 1 • While i < n do following n+1 • a) Add x[i] to sum n • b) Increment i by 1 n • Return mean = sum/n 1 Total 3n + 4
Computing Time Order of Magnitude • As the number of inputs increases, T(n) = 3n + 4 grows at a rate proportional to n • Thus T(n) has the "order of magnitude" n • The computing time of an algorithm on input of size n, T(n), is said to have order of magnitude f(n), written T(n) is O(f(n)), if there is some constant C such that T(n) ≤ C·f(n) for all sufficiently large values of n
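As a worked instance of the definition (the choice of constant C = 4 is my own, any larger constant works equally well):

```latex
T(n) = 3n + 4 \le 4n = C \cdot n \quad \text{for all } n \ge 4
```

so with C = 4 the definition is satisfied and T(n) is O(n).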
Big Oh Notation • Another way of saying this: the complexity of the algorithm is O(f(n)) • Example: for the Mean-Calculation Algorithm, T(n) is O(n) • Note that constants and multiplicative factors are ignored.
Big Oh Notation • f(n) is usually simple: n, n^2, n^3, …, 2^n, log2 n, n log2 n, log2 log2 n
Big-O Notation • Cost function • A numeric function that gives performance of an algorithm in terms of one or more variables • Typically the variable(s) capture number of data items • Actual cost functions are hard to develop • Generally we use approximating functions
Function Dominance • Asymptotic dominance: g dominates f if there is a positive constant c such that f(n) ≤ c·g(n) for all sufficiently large values of n • Example: if the actual cost function T(n) is, say, a quadratic with lower-order terms, then both n^2 and c·n^2 (c > 1) will dominate T(n) for sufficiently large values of n
Estimating Functions • Characteristics of a good estimating function: • it asymptotically dominates the actual time function • it is simple to express and understand • it is as close an estimate as possible • (n^2 qualifies here because any constant c > 1 makes c·n^2 larger than the actual time function for sufficiently large n)
Estimating Functions • Note how c·n^2 dominates the actual time function • Thus we use n^2 as an estimate of the time required
Order of a Function • To express time estimates concisely we use the concept of the "order of a function" • Definition: given two nonnegative functions f and g, f is of order g if and only if g asymptotically dominates f • Stated: • "f is of order g" • "f = O(g)" (big-O notation; O stands for "Order")
Order of a Function • Note the possible confusion • The notation does NOT say “the order of g is f” nor does it say “f equals the order of g” • It does say “f is of order g”
Big-O Arithmetic • Given functions f and g, and k a constant: • O(k·f) = O(f) (multiplicative constants are ignored) • O(f + g) = O(max(f, g)) (the faster-growing term dominates a sum) • O(f)·O(g) = O(f·g) (orders of products multiply)
Example: Calculating the Mean Task # times executed • Initialize the sum to 0 1 • Initialize index i to 0 1 • While i < n do following n+1 • a) Add x[i] to sum n • b) Increment i by 1 n • Return mean = sum/n 1 Total 3n + 4 • Based on Big-O arithmetic, this algorithm is O(n)
Worst-Case Analysis • The arrangement of the input items may affect the computing time • How then to measure performance? • Best case: not very informative • Average case: often too difficult to calculate • Worst case: the usual measure • Consider linear search of the list a[0], …, a[n - 1]
Worst-Case Analysis Linear search of a[0] … a[n-1] Algorithm: 1. found = false. 2. loc = 0. 3. While (loc < n && !found) 4. If item == a[loc] then found = true // item found 5. Else increment loc by 1 // keep searching • Worst case (item not in the list): TL(n) is O(n) • Average case (assuming an equal distribution of values): also O(n)
Binary Search Binary search of a[0] … a[n-1] (a must be sorted): 1. found = false. 2. first = 0. 3. last = n - 1. 4. While (first <= last && !found) 5. Calculate loc = (first + last) / 2. 6. If item < a[loc] then 7. last = loc - 1. // search first half 8. Else if item > a[loc] then 9. first = loc + 1. // search last half 10. Else found = true. // item found • Each pass cuts the list in half • Worst case (item not in list): TB(n) = O(log2 n) • Note the loop test must be first <= last, not first < last, or one-element ranges are never examined
Common Computing Time Functions • In increasing order of growth: log2 log2 n, log2 n, n, n log2 n, n^2, n^3, 2^n • Our binary search, at O(log2 n), sits near the slow-growing end of this list
Computing in Real Time • Suppose each instruction can be executed in 1 microsecond • For n = 256 inputs, the time for various f(n): • log2 n: 8 microseconds • n: 0.256 milliseconds • n log2 n: about 2 milliseconds • n^2: about 0.07 seconds • n^3: about 17 seconds • 2^n: about 3.7 × 10^63 years
Conclusion • Algorithms with exponential complexity are practical only when the number of inputs is small • Bubble sort is O(n^2) • OK for n < 100 • Totally impractical for large n
Computing Times of Recursive Functions
// Towers of Hanoi
void Move(int n, char source, char destination, char spare)
{
  if (n <= 1)                // anchor (base) case
    cout << "Move the top disk from " << source
         << " to " << destination << endl;
  else {                     // inductive case
    Move(n-1, source, spare, destination);
    Move(1, source, destination, spare);
    Move(n-1, spare, destination, source);
  }
}
T(n) = O(2^n)