Week 12 - Wednesday
CS322


Last time

  • What did we talk about last time?

  • Asymptotic notation



Logical warmup

  • A businesswoman has two cubes on her desk

  • Every day she arranges both cubes so that the front faces show the current day of the month

  • What numbers do you need on the faces of the cubes to allow this?

  • Note: Both cubes must be used for every day




Proving bounds

  • Prove a  bound for g(x) = (1/4)(x – 1)(x + 1) for x R

  • Prove that x2 is not O(x)

    • Hint: Proof by contradiction
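One way to carry out the hinted contradiction, sketched in outline (C and k are the usual witness constants from the definition of O; this sketch is mine, not from the slides):

```latex
\text{Suppose } x^2 \text{ is } O(x)\text{: then there exist } C > 0 \text{ and } k
\text{ such that } x^2 \le Cx \text{ for all } x > k. \\
\text{Dividing both sides by } x > 0 \text{ gives } x \le C \text{ for all } x > k. \\
\text{But } x = \max(k, C) + 1 \text{ is larger than both } k \text{ and } C,
\text{ a contradiction.} \\
\text{Hence } x^2 \text{ is not } O(x).
```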


Polynomials

  • Let f(x) be a polynomial with degree n

    • f(x) = a_n x^n + a_(n-1) x^(n-1) + a_(n-2) x^(n-2) + … + a_1 x + a_0

  • By extension from the previous results, if a_n is a positive real, then

    • f(x) is O(x^s) for all integers s ≥ n

    • f(x) is Ω(x^r) for all integers r ≤ n

    • f(x) is Θ(x^n)

  • Furthermore, let g(x) be a polynomial with degree m

    • g(x) = b_m x^m + b_(m-1) x^(m-1) + b_(m-2) x^(m-2) + … + b_1 x + b_0

  • If a_n and b_m are positive reals, then

    • f(x)/g(x) is O(x^c) for all real numbers c ≥ n – m

    • f(x)/g(x) is not O(x^c) for all real numbers c < n – m

    • f(x)/g(x) is Θ(x^(n-m))



Extending notation to algorithms

  • We can easily extend our Θ-, O-, and Ω-notations to analyzing the running time of algorithms

  • Imagine that an algorithm A is composed of some number of elementary operations (usually arithmetic, storing variables, etc.)

  • We can imagine that the running time is tied purely to the number of operations

  • This is, of course, a lie

    • Not all operations take the same amount of time

    • Even the same operation takes different amounts of time depending on caching, disk access, etc.


Running time

  • First, assume that the number of operations performed by A on input size n is dependent only on n, not the values of the data

    • If f(n) is (g(n)), we say that Ais(g(n)) or that A is of order g(n)

  • If the number of operations depends not only on n but also on the values of the data

    • Let b(n) be the minimum number of operations; if b(n) is Θ(g(n)), then we say that in the best case, A is Θ(g(n)) or that A has a best case order of g(n)

    • Let w(n) be the maximum number of operations; if w(n) is Θ(g(n)), then we say that in the worst case, A is Θ(g(n)) or that A has a worst case order of g(n)



Computing running time

  • With a single for (or other) loop, we simply count the number of operations that must be performed:

    int p = 0;
    int x = 2;
    for( int i = 2; i <= n; i++ )
        p = (p + i)*x;

  • Counting multiplies and adds, (n – 1) iterations times 2 operations = 2n – 2

  • As a polynomial, 2n – 2 is Θ(n)


Nested loops

  • When loops do not depend on each other, we can simply multiply their iterations (and asymptotic bounds)

    int p = 0;
    for( int i = 2; i <= n; i++ )
        for( int j = 3; j <= n; j++ )
            p++;

  • Clearly (n – 1)(n – 2) is Θ(n^2)
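A quick empirical check of the product rule; the class and method names here are mine, not from the course:

```java
// Count executions of the innermost statement of the slide's nested
// loops and compare against the closed form (n - 1)(n - 2).
public class NestedLoopCount
{
    public static int count( int n )
    {
        int p = 0;
        for( int i = 2; i <= n; i++ )      // n - 1 iterations
            for( int j = 3; j <= n; j++ )  // n - 2 iterations each
                p++;
        return p;
    }

    public static void main( String[] args )
    {
        System.out.println( count( 100 ) );  // 99 * 98 = 9702
    }
}
```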


Trickier nested loops

  • When loops depend on each other, we have to do more analysis

    int s = 0;
    for( int i = 1; i <= n; i++ )
        for( int j = 1; j <= i; j++ )
            s = s + j*(i - j + 1);

  • What's the running time here?

  • Arithmetic sequence saves the day (for the millionth time)
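The inner loop runs i times for each i, so the total is 1 + 2 + … + n = n(n + 1)/2; a small harness (names are illustrative) confirms the arithmetic-sequence count:

```java
// Count iterations of the dependent nested loops from the slide.
public class TriangularCount
{
    public static int count( int n )
    {
        int iterations = 0;
        for( int i = 1; i <= n; i++ )
            for( int j = 1; j <= i; j++ )  // i iterations for each i
                iterations++;
        return iterations;                 // 1 + 2 + ... + n
    }

    public static void main( String[] args )
    {
        System.out.println( count( 100 ) );  // 100 * 101 / 2 = 5050
    }
}
```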


Iterations with floor

  • When loops depend on floor, what happens to the running time?

    int a = 0;
    for( int i = n/2; i <= n; i++ )
        a = n - i;

  • Floor is used implicitly here, because we are using integer division

  • What's the running time? Hint: Consider n as odd or as even separately
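Following the hint, a sketch that counts the iterations directly (the class is mine): i runs from ⌊n/2⌋ to n, giving n – ⌊n/2⌋ + 1 iterations whether n is even or odd, which is about n/2 either way, so the loop is Θ(n).

```java
// Count iterations of the loop that starts at n/2 (integer division).
public class FloorLoopCount
{
    public static int count( int n )
    {
        int iterations = 0;
        for( int i = n / 2; i <= n; i++ )  // n/2 floors when n is odd
            iterations++;
        return iterations;                 // n - floor(n/2) + 1
    }

    public static void main( String[] args )
    {
        System.out.println( count( 10 ) );  // even: 10 - 5 + 1 = 6
        System.out.println( count( 11 ) );  // odd:  11 - 5 + 1 = 7
    }
}
```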


Sequential search

  • Consider a basic sequential search algorithm:

    int search( int[] array, int n, int value )
    {
        for( int i = 0; i < n; i++ )
            if( array[i] == value )
                return i;
        return -1;
    }

  • What's its best case running time?

  • What's its worst case running time?

  • What's its average case running time?
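One way to explore those three questions is to instrument the search to report how many elements it examines; this harness is my own, not part of the slides:

```java
// Sequential search that returns the number of elements examined.
public class SearchCount
{
    public static int comparisons( int[] array, int value )
    {
        int examined = 0;
        for( int i = 0; i < array.length; i++ )
        {
            examined++;
            if( array[i] == value )
                return examined;   // found: stop early
        }
        return examined;           // not found: all n examined
    }

    public static void main( String[] args )
    {
        int[] a = { 4, 8, 15, 16, 23, 42 };
        System.out.println( comparisons( a, 4 ) );   // best case: 1
        System.out.println( comparisons( a, 99 ) );  // worst case: n = 6
    }
}
```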


Insertion sort algorithm

  • Insertion sort is a common introductory sort

  • It is suboptimal, but it is one of the fastest ways to sort a small list (10 elements or fewer)

  • The idea is to sort initial segments of the array, inserting each new element in the right place as it is encountered

  • So, for each new item, keep moving it up until the element above it is too small (or we hit the top)


Insertion sort in code

public static void sort( int[] array, int n )
{
    for( int i = 1; i < n; i++ )
    {
        // insert array[i] into the sorted segment array[0..i-1]
        int next = array[i];
        int j = i - 1;
        while( j >= 0 && array[j] > next )
        {
            array[j+1] = array[j];   // shift larger elements up
            j--;
        }
        array[j+1] = next;
    }
}
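A variant of the sort instrumented to count element comparisons (my own harness, not from the slides) previews the analyses on the next slides: an already sorted array costs n – 1 comparisons, while a reverse-sorted one costs n(n – 1)/2.

```java
// Insertion sort that counts comparisons between array elements.
public class InsertionSortCount
{
    public static long sortAndCount( int[] array )
    {
        long comparisons = 0;
        for( int i = 1; i < array.length; i++ )
        {
            int next = array[i];
            int j = i - 1;
            while( j >= 0 )
            {
                comparisons++;
                if( array[j] <= next )
                    break;                 // next is already in place
                array[j + 1] = array[j];   // shift larger element up
                j--;
            }
            array[j + 1] = next;
        }
        return comparisons;
    }

    public static void main( String[] args )
    {
        int[] sorted   = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
        int[] reversed = { 10, 9, 8, 7, 6, 5, 4, 3, 2, 1 };
        System.out.println( sortAndCount( sorted ) );    // n - 1 = 9
        System.out.println( sortAndCount( reversed ) );  // n(n - 1)/2 = 45
    }
}
```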


Best case analysis of insertion sort

  • What is the best case analysis of insertion sort?

  • Hint: Imagine the array is already sorted


Worst case analysis of insertion sort

  • What is the worst case analysis of insertion sort?

  • Hint: Imagine the array is sorted in reverse order


Average case analysis of insertion sort

  • What is the average case analysis of insertion sort?

  • Much harder than the previous two!

  • Let's look at it recursively

  • Let E_k be the average number of comparisons needed to sort k elements

  • E_k can be computed as the sum of the average number of comparisons needed to sort k – 1 elements plus the average number of comparisons (x) needed to insert the kth element in the right place

    • E_k = E_(k-1) + x


Finding x

  • We can employ the idea of expected value from probability

  • There are k possible locations for the element to go

  • We assume that any of these k locations is equally likely

  • For each turn of the loop, there are 2 comparisons to do

  • There could be 1, 2, 3, … up to k turns of the loop

  • Thus, weighting each possible number of iterations evenly gives us:

    • x = (1/k)(2 + 4 + … + 2k) = (1/k) · 2 · (k(k + 1)/2) = k + 1


Finishing the analysis

  • Having found x, our recurrence relation is:

    • E_k = E_(k-1) + k + 1

  • Sorting one element takes no time, so E_1 = 0

  • Solve this recurrence relation!

  • Well, if you really banged away at it, you might find:

    • E_n = (1/2)(n^2 + 3n – 4)

  • By the polynomial rules, this is Θ(n^2) and so the average case running time is the same as the worst case
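A sanity check (my own, not from the slides) that iterates the recurrence E_k = E_(k-1) + k + 1 with E_1 = 0 and compares it to the closed form:

```java
// Compare the recurrence for the average comparison count
// against the closed form (1/2)(n^2 + 3n - 4).
public class AverageCaseCheck
{
    public static long recurrence( int n )
    {
        long e = 0;                   // E_1 = 0
        for( int k = 2; k <= n; k++ )
            e += k + 1;               // E_k = E_(k-1) + k + 1
        return e;
    }

    public static long closedForm( int n )
    {
        return ( (long) n * n + 3L * n - 4 ) / 2;  // n^2 + 3n - 4 is always even
    }

    public static void main( String[] args )
    {
        System.out.println( recurrence( 100 ) );  // 5148
        System.out.println( closedForm( 100 ) );  // 5148
    }
}
```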




Exponential functions

  • Well, they grow fast

  • Graph 2^x for -3 ≤ x ≤ 3

  • When considering b^x, it's critically important whether b > 1 (in which case b^x grows very fast as x increases) or 0 < b < 1 (in which case b^x grows very fast as x decreases)

  • Graph b^x when b > 1

  • Graph b^x when 0 < b < 1

  • What happens when b = 1?

  • What happens when b ≤ 0?


Logarithmic function

  • The logarithmic function with base b, written log_b, is the inverse of the exponential function

  • Thus,

    • by = x logbx = y for b > 0 and b  1

  • Log is a "de-exponentiating" function

  • Log grows very slowly

  • We're interested in log_b when b > 1, in which case log_b is an increasing function

    • If x_1 < x_2, then log_b(x_1) < log_b(x_2), for b > 1 and positive x_1 and x_2


Applying log

  • How many binary digits are needed to represent a number n?

  • We can write n = 2^k + c_(k-1) 2^(k-1) + … + c_2 2^2 + c_1 2 + c_0, where each c_i is either 0 or 1

  • Thus, we need no more than k + 1 digits to represent n

  • We know that n < 2^(k+1)

  • Since 2^k ≤ n < 2^(k+1), k ≤ log_2 n < k + 1

  • The total number of digits we need is k + 1 ≤ log_2 n + 1
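The bound can be checked mechanically; digitsNeeded is an illustrative helper (mine, not from the slides) that computes ⌊log_2 n⌋ + 1 by repeated halving and compares it to the length of the actual binary representation:

```java
// Number of binary digits needed to represent a positive integer n.
public class BinaryDigits
{
    public static int digitsNeeded( int n )
    {
        int k = 0;                       // k becomes floor(log2 n)
        for( int m = n; m > 1; m /= 2 )
            k++;
        return k + 1;                    // floor(log2 n) + 1 digits
    }

    public static void main( String[] args )
    {
        System.out.println( digitsNeeded( 8 ) );                     // 4
        System.out.println( Integer.toBinaryString( 8 ).length() );  // 4
    }
}
```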


Recurrence relations

  • Consider the following recurrence relation

    • a1 = 0

    • ak = ak/2 + 1 for integers k ≥ 2

  • What do you think its explicit formula is?

  • It turns out that a_n = ⌊log_2 n⌋

  • We can prove this with strong induction
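A direct check of the claimed formula (the class is mine; the subscript k/2 is read with integer, i.e. floor, division):

```java
// Evaluate the recurrence a_1 = 0, a_k = a_(k/2) + 1 and compare
// with floor(log2 k).
public class LogRecurrence
{
    public static int a( int k )
    {
        if( k == 1 )
            return 0;              // a_1 = 0
        return a( k / 2 ) + 1;     // integer division floors k/2
    }

    public static void main( String[] args )
    {
        System.out.println( a( 8 ) );   // floor(log2 8)  = 3
        System.out.println( a( 15 ) );  // floor(log2 15) = 3
        System.out.println( a( 16 ) );  // floor(log2 16) = 4
    }
}
```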


Exponential and logarithmic orders

  • For all real numbers b and r with b > 1 and r > 0

    • log_b x ≤ x^r, for sufficiently large x

    • x^r ≤ b^x, for sufficiently large x

  • These statements are equivalent to saying, for all real numbers b and r with b > 1 and r > 0

    • log_b x is O(x^r)

    • x^r is O(b^x)


Important things

  • We don't have time to show these things fully

  • x^k is O(x^k log_b x)

  • x^k log_b x is O(x^(k+1))

  • The most common case you will see of this is:

    • x is O(x log x)

    • x log x is O(x^2)

    • In other words, x log x is between linear and quadratic

  • log_b x is Θ(log_c x) for all b > 1 and c > 1

    • In other words, logs are equivalent, no matter what base, in asymptotic notation

  • 1/2 + 1/3 + … + 1/n is Θ(log_2 n)
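For the last bullet, the sum can be bracketed numerically (this check is mine, not from the slides): since ln n < 1 + 1/2 + … + 1/n ≤ ln n + 1, the sum without the leading 1 stays within 1 of ln n, hence Θ(log_2 n).

```java
// Sum 1/2 + 1/3 + ... + 1/n and compare it to ln n.
public class HarmonicGrowth
{
    public static double sum( int n )
    {
        double s = 0;
        for( int k = 2; k <= n; k++ )
            s += 1.0 / k;
        return s;
    }

    public static void main( String[] args )
    {
        for( int n = 10; n <= 100000; n *= 10 )
            System.out.println( n + ": sum = " + sum( n )
                                + ", ln n = " + Math.log( n ) );
    }
}
```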




Next time…

  • Review for Exam 3

    • Relations, counting, graphs, and trees


Reminders

  • Study for Exam 3

    • Monday in class

  • Finish Assignment 9

    • Due Friday by midnight

  • Talk on the Shadow programming language

    • Tonight in E281 at 6pm