## PowerPoint Slideshow about 'Order of growth' - mauve


Order of growth

Suppose you have analyzed two algorithms and expressed their run times in terms of the size of the input:

- Algorithm A takes 100n + 1 steps to solve a problem of size n;
- Algorithm B takes n^2 + n + 1 steps.

The leading term is the term with the highest exponent.

The following table shows the run times of these algorithms for different problem sizes:

| n | 100n + 1 (A) | n^2 + n + 1 (B) |
| --- | --- | --- |
| 10 | 1,001 | 111 |
| 100 | 10,001 | 10,101 |
| 1,000 | 100,001 | 1,001,001 |
| 10,000 | 1,000,001 | 100,010,001 |

Notes for different problem sizes:

- At n = 10, Algorithm A looks bad: its leading term has a large coefficient, 100, which is why B does better than A for small n.
- At n = 100 they are about the same, and for larger values of n, A is much better.
- Any function that contains an n^2 term will eventually grow faster than a function whose leading term is n.
- Even if the run time of Algorithm A were n + 1000000, it would still be better than Algorithm B for sufficiently large n.
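The crossover point for these two step counts can be computed directly; a minimal sketch using the formulas given above:

```python
def steps_a(n):
    # Algorithm A: 100n + 1 steps
    return 100 * n + 1

def steps_b(n):
    # Algorithm B: n^2 + n + 1 steps
    return n * n + n + 1

# Smallest problem size at which A takes fewer steps than B.
crossover = next(n for n in range(1, 10**6) if steps_a(n) < steps_b(n))
print(crossover)  # → 100
```

At n = 99 the two counts are exactly equal (9,901 steps each), so A pulls ahead from n = 100 on.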

How to compare algorithms?

- For large problems, we expect the algorithm with the smaller leading term to be better,
- but for smaller problems there may be a crossover point below which the other algorithm wins.
- The location of the crossover point depends on the details of the algorithms, the inputs, and the hardware.

How to compare algorithms? (continued)

- If two algorithms have the same leading-order term, it is hard to say which is better; the answer depends on the details.
- By convention, such algorithms are considered equivalent, even if they have different coefficients.

Order of growth

An order of growth is a set of functions whose growth is considered equivalent.

Examples:

- 2n, 100n, and n + 1 belong to the same order of growth, which is written O(n) in "Big-Oh notation".
- All functions with leading term n^2 belong to O(n^2).
- What is the order of growth of n^3 + n^2?
- What about 1000000·n^3 + n^2? What about n^3 + 1000000·n^2?
- What is the order of growth of (n^2 + n) · (n + 1)?

The following table shows some of the orders of growth that appear most commonly in algorithmic analysis, in increasing order of badness:

| Order of growth | Name |
| --- | --- |
| O(1) | constant |
| O(log n) | logarithmic |
| O(n) | linear |
| O(n log n) | linearithmic |
| O(n^2) | quadratic |
| O(n^3) | cubic |
| O(c^n) | exponential |
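Evaluating these common orders of growth at a single sample size shows the ordering; a minimal sketch:

```python
import math

n = 1024  # a sample problem size
growth = [1, math.log2(n), n, n * math.log2(n), n**2, n**3, 2**n]

# Each order of growth dominates the previous one at this size,
# so the list is already in ascending order.
print(growth == sorted(growth))  # → True
```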

Asymptotic Analysis of Algorithms ("asymptotic" means: for large n)

Big-Oh expressions greatly simplify the analysis of the running time of algorithms:

- All that we get is an upper bound on the running time of the algorithm.
- The result does not depend on the values of the constants.
- The result does not depend on the characteristics of the computer and compiler actually used to execute the program!

Big O notation

Let f and g be nonnegative functions on the positive integers. We write

f(n) = O(g(n))

and say that

- f(n) is of order at most g(n), or
- f(n) is big oh of g(n), or
- g is an asymptotic upper bound for f,

if there exist constants C1 > 0 and N1 such that

f(n) ≤ C1·g(n), for all n ≥ N1.

Is 2^(n+1) = O(2^n)?

Is there a c with 2^(n+1) ≤ c·2^n? Yes: c = 2 works for all n.

Is 2^(2n) = O(2^n)?

Is there a c with 2^(2n) ≤ c·2^n? That would require 2^n·2^n ≤ c·2^n, i.e. 2^n ≤ c. No constant works, so the answer is no.

Note that the "=" here is one-directional:

f(n) = 5n^3 implies f(n) = O(n^3), and
g(n) = 3n^2 implies g(n) = O(n^3),

but f(n) is not equal to g(n).
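A quick numeric check of the two questions above: the ratio 2^(n+1)/2^n is the constant 2, while 2^(2n)/2^n = 2^n grows without bound, so no single constant c can work in the second case:

```python
# Bounded ratio: 2^(n+1) / 2^n is always exactly 2.
ratios_1 = [2**(n + 1) / 2**n for n in range(1, 20)]

# Unbounded ratio: 2^(2n) / 2^n equals 2^n, which keeps growing.
ratios_2 = [2**(2 * n) / 2**n for n in range(1, 20)]

print(max(ratios_1))  # → 2.0
print(max(ratios_2))  # → 524288.0  (2^19)
```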

Notation

O(g(n)) is a set of functions. But it is common to abuse notation, writing

T(n) = O(…)

instead of T(n) ∈ O(…), as well as T(n) = f(n) + O(…).

Conventions for writing Big-Oh expressions:

- Ignore multiplicative constants: instead of O(3n^2), we simply write O(n^2); if the function is constant (e.g. O(1024)), we write O(1).
- Ignore lower-order terms: instead of O(n log n + n + n^2), we simply write O(n^2).

Examples

T(n) = 32n^2 + 17n + 32

T(n) = O(n^2) and O(n^3), but not O(n).

NOTE: O(n^2) is a tight bound; O(n^3) is not.

- n, n + 1, n + 80, 40n, and n + log n are all O(n).
- n^2 + 10000000000n is O(n^2).
- 3n^2 + 6n + log n + 24.5 is O(n^2).

Properties of Big Oh

- If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then f1(n) + f2(n) = O(max(g1(n), g2(n))).
- If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then f1(n) × f2(n) = O(g1(n) × g2(n)).
- Transitivity: f(n) = O(g(n)) and g(n) = O(h(n)) imply f(n) = O(h(n)).
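The sum rule can be illustrated numerically with hypothetical functions f1(n) = 3n and f2(n) = 5n^2; the witness constant C = 8 below is just one choice that works:

```python
# f1(n) = 3n is O(n), f2(n) = 5n^2 is O(n^2), and the sum rule says
# f1 + f2 = O(max(n, n^2)) = O(n^2). C = 8 witnesses the bound, since
# 3n + 5n^2 <= 8n^2 whenever n >= 1.
C = 8
ok = all(3 * n + 5 * n**2 <= C * max(n, n**2) for n in range(1, 1000))
print(ok)  # → True
```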

e.g.6 (revisited):

Horner(int a[], int n, int x) {
    result = a[n]                     // c1: O(1)
    for (i = n - 1; i >= 0; --i)      // c2(n+1): O(n)
        result = result * x + a[i]    // c3·n: O(n)
    return result                     // c4: O(1)
}

T(n) = c1 + c2(n+1) + c3·n + c4 = a + b·n for some constants a, b

T(n) = O(n)
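The same scheme as runnable Python; a minimal sketch in which, as in the pseudocode above, a[i] is the coefficient of x^i:

```python
def horner(a, x):
    # Evaluate a[0] + a[1]*x + ... + a[n]*x^n with n multiplications: O(n).
    result = a[-1]
    for coef in reversed(a[:-1]):
        result = result * x + coef
    return result

# 1 + 2x + 3x^2 at x = 2: 1 + 4 + 12
print(horner([1, 2, 3], 2))  # → 17
```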

e.g.8 (revisited): asymptotic analysis

fun(int x, int n) {
    sum = 0;                      // O(1)
    for (i = 0; i < n; ++i) {     // O(n)
        p = 1;                    // O(n)
        for (j = 0; j < n; ++j)   // O(n^2)
            p *= x;               // O(n^2)
        sum += p;                 // O(n)
    }
}

T(n) = O(n^2)
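A runnable Python version of the sketch above; note that the inner loop rebuilds p = x^n from scratch on every outer iteration, which is exactly what makes the total work O(n^2):

```python
def fun(x, n):
    total = 0
    for i in range(n):          # outer loop: n iterations
        p = 1
        for j in range(n):      # inner loop: n iterations each time
            p *= x              # after the loop, p == x**n
        total += p
    return total                # n * x**n, computed in O(n^2) steps

print(fun(2, 3))  # → 24  (3 * 2^3)
```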

e.g.10 (revisited): sequential search

- The worst-case time: T(n) = an + b
- The best-case time: T(n) = constant

Obtain an asymptotic O bound on the solution:

- The worst case is O(n).
- The best case is O(1).
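The two cases can be seen directly in a minimal Python sketch; the comparison counter is added purely for illustration:

```python
def sequential_search(a, target):
    # Scan left to right, counting comparisons as we go.
    comparisons = 0
    for i, v in enumerate(a):
        comparisons += 1
        if v == target:
            return i, comparisons
    return -1, comparisons

a = list(range(100))
print(sequential_search(a, 0))    # → (0, 1)     best case: O(1)
print(sequential_search(a, -1))   # → (-1, 100)  worst case: O(n)
```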

e.g.11: find an O-notation in terms of n for the number of times the statement x = x + 1 is executed in the segment:

for (i = 1 to n)
    for (j = 1 to i)
        x = x + 1

| i | j |
| --- | --- |
| 1 | 1..1 |
| 2 | 1..2 |
| 3 | 1..3 |
| : | : |
| n | 1..n |

c_n = 1 + 2 + 3 + … + n = n(n+1)/2

So the count is O(n^2).
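Counting the executions directly confirms the formula; a minimal sketch:

```python
def count_increments(n):
    # Mirror the nested loops: for i in 1..n, for j in 1..i, x = x + 1.
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            count += 1
    return count

n = 100
print(count_increments(n) == n * (n + 1) // 2)  # → True
```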

e.g.12: find an O-notation in terms of n for the number of times the statement x = x + 1 is executed in the segment:

j = n
while (j >= 1) {
    for (i = 1 to j)
        x = x + 1
    j = j / 2
}

| j | i |
| --- | --- |
| n | 1..n |
| n/2 | 1..n/2 |
| n/4 | 1..n/4 |
| : | : |
| n/2^k | 1..n/2^k |

c_n = n + n/2 + n/4 + … + n/2^k
    = n(1 + 1/2 + 1/2^2 + … + 1/2^k)
    ≤ n · (1 / (1 − 0.5)) = 2n

So the count is O(n).
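The same direct count for this segment, using integer halving for j = j/2; a minimal sketch:

```python
def count_halving(n):
    # Mirror the segment: j = n; while j >= 1, the inner loop runs j times.
    count = 0
    j = n
    while j >= 1:
        count += j
        j //= 2
    return count

# The geometric series n + n/2 + n/4 + ... is bounded by 2n, hence O(n).
print(all(count_halving(n) <= 2 * n for n in range(1, 1000)))  # → True
```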

e.g.7 (revisited): obtain an asymptotic O bound for a recursive function by solving the recurrence relation with repeated substitution:

T(n) = a if n = 0
T(n) = T(n−1) + b if n > 0

for some constants a, b.

Repeated substitution: T(n) = T(n−1) + b = T(n−2) + 2b = … = T(0) + n·b = a + b·n, so T(n) = O(n).
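The closed form from repeated substitution can be sanity-checked numerically; the values a = 3 and b = 5 below are hypothetical sample constants:

```python
def T(n, a=3, b=5):
    # Direct evaluation of the recurrence T(0) = a, T(n) = T(n-1) + b.
    return a if n == 0 else T(n - 1, a, b) + b

# Repeated substitution predicts the closed form T(n) = a + b*n, i.e. O(n).
print(all(T(n) == 3 + 5 * n for n in range(50)))  # → True
```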

An Asymptotic Lower Bound: Omega

Let f and g be nonnegative functions on the positive integers. We write

f(n) = Ω(g(n))

and say that

- f(n) is of order at least g(n), or
- f(n) is omega of g(n), or
- g is an asymptotic lower bound for f,

if there exist constants C2 > 0 and N2 such that

f(n) ≥ C2·g(n), for all n ≥ N2.

2n + 13 ∈ O(?)

O(n). Also O(n^2), …: we can always weaken the bound.

2n + 13 ∈ Ω(?)

Ω(n); also Ω(log n), Ω(1), …

2^n ∈ O(n)? Ω(n)?

Ω(n), but not O(n).

n^(log n) ∈ O(n^5)?

No. Thus n^(log n) = Ω(n^5).

Let f and g be nonnegative functions on the positive integers. We write

f(n) = Θ(g(n))

and say that

- f(n) is of order g(n), or
- f(n) is theta of g(n), or
- g is an asymptotic tight bound for f,

if f(n) = O(g(n)) and f(n) = Ω(g(n)).

e.g.

T(n) = 32n^2 + 17n + 32

We can ignore the multiplicative constants and the lower-order terms:

- T(n) = O(n^2) and O(n^3), but not O(n). (NOTE: O(n^2) is a tight bound; O(n^3) is not.)
- Since 32n^2 + 17n + 32 ≥ 32n^2: T(n) = Ω(n^2) and Ω(n), but not Ω(n^3).
- T(n) = Θ(n^2), but not Θ(n) and not Θ(n^3).

Properties

Transitivity:

- f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n))
- f(n) = O(g(n)) and g(n) = O(h(n)) imply f(n) = O(h(n))
- f(n) = Ω(g(n)) and g(n) = Ω(h(n)) imply f(n) = Ω(h(n))

Symmetry:

- f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))

Transpose symmetry:

- f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
