
# Analysis of Algorithms II



### Analysis of Algorithms II

• Before we attempt to analyze an algorithm, we need to define two things:

• How we measure the size of the input

• How we measure the time (or space) requirements

• Once we have done this, we find an equation that describes the time (or space) requirements in terms of the size of the input

• We simplify the equation by discarding constants and discarding all but the fastest-growing term
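
As a small worked example (the polynomial is invented here for illustration): suppose counting operations gives f(n) = 3n² + 5n + 7. Discarding the constant 7 and the slower-growing term 5n leaves 3n²; discarding the constant factor 3 leaves n², so the time requirement grows as n².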

• Usually it’s quite easy to define the size of the input

• If we are sorting an array, it’s the size of the array

• If we are computing n!, the number n is the “size” of the problem

• Sometimes more than one number is required

• If we are trying to pack objects into boxes, the results might depend on both the number of objects and the number of boxes

• Sometimes it’s very hard to define “size of the input”

• Consider: f(n) = if n is 1, then 1; else if n is even, then f(n/2); else f(3n + 1)

• The obvious measure of size, n, is not actually a very good measure

• To see this, compute f(7) and f(8)
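
To make the comparison concrete, here is a minimal Python sketch (the function name and the step-counting loop are illustrative, not from the slides) that counts how many recursive steps f performs before reaching the base case:

```python
def f_steps(n):
    # Count how many steps f(n) takes before reaching the base case n == 1.
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(f_steps(8))   # 3  -- the chain is 8, 4, 2, 1
print(f_steps(7))   # 16 -- 7, 22, 11, 34, 17, 52, 26, 13, 40, 20, 10, 5, 16, 8, 4, 2, 1
```

Although 8 is larger than 7, f(8) finishes in 3 steps while f(7) needs 16, so the input value n is a poor predictor of running time.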

• If we want to know how much time or space an algorithm takes, we can do empirical tests—run the algorithm over different sizes of input, and measure the results

• This is not analysis

• However, empirical testing is useful as a check on analysis
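
For instance, an empirical test might time one algorithm over a range of input sizes; a minimal sketch (the sizes and the use of Python's built-in sort are arbitrary choices for illustration):

```python
import random
import time

for n in (1_000, 10_000, 100_000):
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    sorted(data)                           # the algorithm being measured
    elapsed = time.perf_counter() - start
    print(f"n = {n:>7,}: {elapsed:.4f} seconds")
```

Such measurements can support or contradict an analysis, but they describe only the machines and inputs actually tested.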

• Analysis means figuring out the time or space requirements

• Measuring space is usually straightforward

• Look at the sizes of the data structures

• Measuring time is usually done by counting characteristic operations

• “Characteristic operation” is a difficult term to define precisely

• In any algorithm, there is some code that is executed the most times

• This is in an innermost loop, or a deepest recursion

• This code requires “constant time” (time bounded by a constant)

• Example: Counting the comparisons needed in an array search
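
As a sketch of that example (the counter is added purely for illustration), the comparison of each array element against the target is the characteristic operation of a linear search:

```python
def linear_search(arr, target):
    # Returns (index, comparisons); index is -1 if target is absent.
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1                  # one characteristic operation
        if value == target:
            return i, comparisons
    return -1, comparisons

print(linear_search([4, 2, 7, 1], 7))   # (2, 3): found after 3 comparisons
print(linear_search([4, 2, 7, 1], 9))   # (-1, 4): worst case, n comparisons
```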

• Informal definitions:

• Given a complexity function f(n),

• Ω(f(n)) is the set of complexity functions that are lower bounds on f(n)

• O(f(n)) is the set of complexity functions that are upper bounds on f(n)

• Θ(f(n)) is the set of complexity functions that, given the correct constants, correctly describe f(n)

• Example: If f(n) = 17n³ + 4n − 12, then

• Ω(f(n)) contains 1, n, n², log n, n log n, etc.

• O(f(n)) contains n⁴, n⁵, 2ⁿ, etc.

• Θ(f(n)) contains n³

• A function f(n) is O(g(n)) if there exist positive constants c and N such that, for all n ≥ N, 0 ≤ f(n) ≤ cg(n)

• That is, if n is big enough (larger than N—we don’t care about small problems), then cg(n) will be bigger than f(n)

• Example: 5n² + 6 is O(n³) because 0 ≤ 5n² + 6 ≤ 2n³ whenever n ≥ 3 (c = 2, N = 3)

• We could just as well use c = 1, N = 6, or c = 50, N = 50

• Of course, 5n² + 6 is also O(n⁴), O(2ⁿ), and even O(n²)

• A function f(n) is (g(n)) if there exist positive constants c and N such that, for all n > N, 0 < cg(n) < f(n)

• That is, if n is big enough (larger than N—we don’t care about small problems), then cg(n) will be smaller than f(n)

• Example: 5x2 + 6 is (n) because0 < 20n < 5n2 + 6 whenever n > 4 (c=20, N=4)

• We could just as well use c = 50, N = 50

• Of course, 5n² + 6 is also Ω(log n), Ω(n), and even Ω(n²)

• A function f(n) is (g(n)) if there exist positive constants c1 andc2and N such that, for all n > N, 0 < c1g(n) < f(n) < c2g(n)

• That is, if n is big enough (larger than N), then c₁g(n) will be smaller than f(n) and c₂g(n) will be larger than f(n)

• In a sense, Θ is the “best” complexity of f(n)

• Example: 5x2 + 6 is (n2) becausen2< 5n2 + 6 < 6n2whenever n > 5 (c1 = 1, c2 = 6)

### Graphs

[Three graphs, not reproduced here: in the first, f(n) is O(g(n)) and cg(n) lies above f(n) for all n ≥ N; in the second, f(n) is Ω(g(n)) and cg(n) lies below f(n) for all n ≥ N; in the third, f(n) is Θ(g(n)) and f(n) lies between c₁g(n) and c₂g(n) for all n ≥ N.]

• Points to notice:

• What happens near the beginning (n < N) is not important

• cg(n) always passes through 0, but f(n) might not (why?)

• In the third diagram, c₁g(n) and c₂g(n) have the same “shape” (why?)

• For any function f(n), and large enough values of n,

• f(n) = O(g(n)) if cg(n) is greater than f(n),

• f(n) = Θ(g(n)) if c₁g(n) is less than f(n) and c₂g(n) is greater than f(n),

• f(n) = Ω(g(n)) if cg(n) is less than f(n),

• ...for suitably chosen values of c, c₁, and c₂

• The formal definitions were taken, with some slight modifications, from Introduction to Algorithms, by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein