Foundations of Software Design, Fall 2002, Marti Hearst


Presentation Transcript



Foundations of Software Design, Fall 2002, Marti Hearst

Lecture 11: Analysis of Algorithms, cont.



Function Pecking Order

  • In increasing order

Where does n log n fit in?

Adapted from Goodrich & Tamassia
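The function list itself was a slide graphic, but the standard pecking order (1, log n, n, n log n, n², n³, 2ⁿ) can be checked numerically; this quick sketch, with values of my own choosing, shows n log n sitting between n and n²:

```python
import math

# Sanity check of the pecking order around n log n
# (an illustrative sketch; the slide's own list was an image).
for n in [8, 64, 1024]:
    assert math.log2(n) < n < n * math.log2(n) < n * n < 2 ** n
```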



Plot them!

Both x and y linear scales

Convert y axis to log scale

(that jump for large n happens because the last number is out of range)

Notice how much bigger 2^n is than n^k

This is why exponential growth is BAD BAD BAD!!
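To make the gap concrete, here is a small table (an illustrative sketch, not from the slides) printing n³ against 2ⁿ:

```python
# Polynomial vs. exponential growth, tabulated (illustrative sketch).
for n in [10, 20, 30, 40]:
    print(f"n={n:>2}  n^3={n**3:>8}  2^n={2**n:>16}")
# By n = 40, 2^n exceeds n^3 by a factor of more than ten million.
```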



More Plots



Let’s Count Some Beer

  • A well-known “song”

    • “100 bottles of beer on the wall, 100 bottles of beer; you take one down, pass it around, 99 bottles of beer on the wall.”

    • “99 bottles of beer on the wall, 99 bottles of beer; you take one down, pass it around, 98 bottles of beer on the wall.”

    • “1 bottle of beer on the wall, 1 bottle of beer, you take it down, pass it around, no bottles of beer on the wall.”

    • HALT.

  • Let’s change the song to “N bottles of beer on the wall”. The number of bottles of beer passed around is Order what?
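The count can be simulated directly; `bottles_passed` is a name of my choosing, not from the lecture:

```python
def bottles_passed(n):
    """Simulate the song starting from n bottles; count bottles passed."""
    passed = 0
    bottles = n
    while bottles > 0:   # one verse per bottle on the wall
        bottles -= 1     # take one down
        passed += 1      # pass it around
    return passed

# One bottle is passed per verse, so the total is n: the song is O(n).
```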



Let’s Count Some Ants

  • Another song:

    • The ants go marching 1 by 1

    • The ants go marching 2 by 2

    • The ants go marching 3 by 3

  • How many ants are in the lead, summed across all the waves of ants?

    1 + 2 + 3 + … + n

  • Does this remind you of anything?
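Summing the waves directly and comparing against the closed form (a sketch; the function name is my own):

```python
def lead_ants(n):
    """Total 1 + 2 + ... + n ants across the waves."""
    total = 0
    for wave in range(1, n + 1):
        total += wave
    return total

# The closed form n(n+1)/2 shows the total grows as O(n^2).
assert lead_ants(100) == 100 * 101 // 2   # == 5050
```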



Graph it!

Let’s plot beer(n) versus ants(n)




Definition of Big-Oh

A running time is O(g(n)) if there exist constants n0 > 0 and c > 0 such that for all problem sizes n > n0, the running time for a problem of size n is at most c·g(n).

In other words, c·g(n) is an upper bound on the running time for sufficiently large n.

http://www.cs.dartmouth.edu/~farid/teaching/cs15/cs5/lectures/0519/0519.html



The Crossover Point

One function starts out faster for small values of n.

But for n > n0, the other function is always faster.

Adapted from http://www.cs.sunysb.edu/~algorith/lectures-good/node2.html



More formally

  • Let f(n) and g(n) be functions mapping nonnegative integers to real numbers.

  • f(n) is O(g(n)) if there exist positive constants n0 and c such that for all n >= n0, f(n) <= c·g(n)

  • Other ways to say this:

    f(n) is order g(n)

    f(n) is big-Oh of g(n)

    f(n) is Oh of g(n)

    f(n) ∈ O(g(n)) (set notation)
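The definition can be checked empirically for a concrete f and a pair of witness constants; f, c, and n0 below are my own illustrative choices, not examples from the slides:

```python
# f(n) = 3n + 10 is O(n), witnessed by c = 4 and n0 = 10:
# for every n >= 10, 3n + 10 <= 4n.
def f(n):
    return 3 * n + 10

c, n0 = 4, 10
assert all(f(n) <= c * n for n in range(n0, 10_000))
```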



Comparing Running Times

Adapted from Goodrich & Tamassia



Analysis Example: Phonebook

  • Given:

    • A physical phone book

      • Organized in alphabetical order

    • A name you want to look up

    • An algorithm in which you search through the book sequentially, from first page to last

    • What is the order of:

      • The best case running time?

      • The worst case running time?

      • The average case running time?

    • What is:

      • A better algorithm?

      • The worst case running time for this algorithm?



Analysis Example (Phonebook)

  • This better algorithm is called Binary Search

  • What is its running time?

    • First you look in the middle of n elements

    • Then you look in the middle of n/2 = ½*n elements

    • Then you look in the middle of ½ * ½*n elements

    • Continue until there is only 1 element left

    • Say you did this m times: (1/2)·(1/2)·(1/2)·…·n, i.e. (1/2)^m·n

    • Then the number of repetitions is the smallest integer m such that (1/2)^m·n <= 1
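Counting the halvings directly (a sketch, assuming n is a power of two so each halving is exact; the function name is my own):

```python
def binary_search_steps(n):
    """Count how many halvings reduce a range of size n to one element."""
    steps = 0
    while n > 1:
        n //= 2      # look in the middle, keep one half
        steps += 1
    return steps

# For n a power of two, the count is exactly log2(n):
assert binary_search_steps(1024) == 10
```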



Analyzing Binary Search

  • In the worst case, the number of repetitions is the smallest integer m such that (1/2)^m·n <= 1

  • We can rewrite this as follows:

Multiply both sides by 2^m: n <= 2^m

Take the log of both sides: log2 n <= m

Since m is the worst case time, the algorithm is O(log n)



Analysis Example

“prefix averages”

You want this mapping from an array of numbers to an array of averages of the preceding numbers (who knows why – not my example):

  • 5 10 15 20 25 30

    5/1 15/2 30/3 50/4 75/5 105/6

There are two straightforward algorithms:

One is easy but wasteful.

The other is more efficient, but requires insight into the problem.

Adapted from Goodrich & Tamassia



Analysis Example

Adapted from Goodrich & Tamassia



Analysis Example

  • For each position i in A, you look at the values for all the elements that came before

    • What is the number of positions in the largest part?

    • When i=n, you look at n positions

    • When i=n-1, you look at n-1 positions

    • When i=n-2, you look at n-2 positions

    • When i=2, you look at 2 positions

    • When i=1, you look at 1 position

  • This should look familiar …
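The easy-but-wasteful algorithm the bullets above describe can be sketched like this (the function name is mine); re-scanning the whole prefix at every position is what produces the 1 + 2 + … + n work:

```python
def prefix_averages_quadratic(A):
    """For each i, re-sum A[0..i] from scratch: O(n^2) total work."""
    result = []
    for i in range(len(A)):
        s = 0
        for j in range(i + 1):   # re-scan the whole prefix every time
            s += A[j]
        result.append(s / (i + 1))
    return result

# prefix_averages_quadratic([5, 10, 15, 20, 25, 30])
# gives [5.0, 7.5, 10.0, 12.5, 15.0, 17.5], the 5/1, 15/2, ... example.
```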



Analysis Example

A useful tool: store partial information in a variable!

Uses space to save time. The key: keep the running sum in s, and don't divide s itself when computing each average.

Eliminates one for loop – always a good thing to do.

Adapted from Goodrich & Tamassia
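A sketch of the improved version (the function name is mine): keeping the partial sum in a variable s removes the inner loop, so the running time drops to O(n):

```python
def prefix_averages_linear(A):
    """Keep the running sum in s; divide a copy, never s itself."""
    result = []
    s = 0
    for i, x in enumerate(A):
        s += x                      # partial information stored in s
        result.append(s / (i + 1))  # s itself stays the plain sum
    return result
```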



Summary: Analysis of Algorithms

  • A method for determining, in an abstract way, the asymptotic running time of an algorithm

    • Here asymptotic means as n gets very large

  • Useful for comparing algorithms

  • Useful also for determining tractability

    • Meaning, a way to determine whether a problem is intractable (practically impossible to solve) or not

    • Exponential time algorithms are usually intractable.

  • We’ll revisit these ideas throughout the rest of the course.



Next Time

  • Stacks and Queues

