
Orders of Growth

Rosen 5th ed., §2.2

- For functions over numbers, we often need a rough measure of how fast a function grows.
- If f(x) grows faster than g(x), then f(x) eventually becomes larger than g(x) for all large enough values of x.
- This is useful in engineering for showing that one design scales better or worse than another.

First of all, what is an order of growth? It is a coarse measure of the resources required by an algorithm as the input size becomes larger.

Suppose algorithm A for a problem has a cost described by a function f(x), where x is the input size.

Likewise, another algorithm B for the same problem has cost g(x).

- Suppose you are designing a web site to process user data (e.g., financial records).
- Suppose database program A takes f_A(n) = 30n+8 microseconds to process any n records, while program B takes f_B(n) = n²+1 microseconds to process the n records.
- Which program do you choose, knowing you'll want to support millions of users?

These two programs illustrate two kinds of growth: linear and quadratic.

The answer is obvious: everybody who can do the math will choose program A.
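To make the comparison concrete, here is a minimal Python sketch of the two cost models from the slide, evaluated at increasingly large n:

```python
# Hypothetical cost models from the slide, in microseconds per n records.
def f_A(n):
    """Program A: linear growth."""
    return 30 * n + 8

def f_B(n):
    """Program B: quadratic growth."""
    return n ** 2 + 1

# At a million records, the linear program is over 30,000x faster.
for n in (10, 1_000, 1_000_000):
    print(n, f_A(n), f_B(n))
```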

[Plot: value of each function against increasing n. The linear curve f_A(n) = 30n+8 starts above the quadratic curve f_B(n) = n²+1, but as you go to the right, the faster-growing function eventually becomes larger.]

This plot shows how fast each algorithm's complexity increases. The two curves meet just past n = 30 (solving 30n+8 = n²+1 gives n ≈ 30.2); beyond that point f_B(n) is always larger.
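The crossover point can be computed exactly with the quadratic formula; a quick Python check, taking the two slide cost models as given:

```python
import math

# Solve f_A(n) = f_B(n), i.e. 30n + 8 = n^2 + 1, i.e. n^2 - 30n - 7 = 0.
# The positive root of this quadratic is the crossover point.
crossover = (30 + math.sqrt(30 ** 2 + 4 * 7)) / 2
print(crossover)  # just past n = 30, f_B overtakes f_A for good
```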

- We say f_A(n) = 30n+8 is order n, or O(n): it is, at most, roughly proportional to n.
- f_B(n) = n²+1 is order n², or O(n²): it is roughly proportional to n².
- Any function that is exactly order n² grows faster than any O(n) function.
- For large numbers of user records, the order-n² program will always take more time.

Why do we use the notation O(n)?

The exact expression for an algorithm's complexity can be long and perplexing, so we want to capture just the essence of the complexity.

That is why we use O() notation. Let's see how to derive the O() notation from the original complexity expression.

Let g be any function R→R.

- Define "at most order g", written O(g), to be: {f: R→R | ∃c,k: ∀x>k: f(x) ≤ cg(x)}.
- "Beyond some point k, function f is at most a constant c times g (i.e., proportional to g)."

- "f is at most order g", "f is O(g)", and "f = O(g)" all just mean that f ∈ O(g).
- Sometimes the phrase "at most" is omitted.

- Note that f is O(g) so long as any values of c and k exist that satisfy the definition.
- But: the particular c, k values that make the statement true are not unique; any larger value of c and/or k will also work.
- You are not required to find the smallest c and k values that work.

You should prove that the values you choose do work.

- Show that 30n+8 is O(n).
- Show ∃c,k: ∀n>k: 30n+8 ≤ cn.
- Let c=31, k=8. Assume n>k=8. Then cn = 31n = 30n+n > 30n+8, so 30n+8 < cn.

- Show that n²+1 is O(n²).
- Show ∃c,k: ∀n>k: n²+1 ≤ cn².
- Let c=2, k=1. Assume n>1. Then cn² = 2n² = n²+n² > n²+1, so n²+1 < cn².
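These witness-style proofs can be spot-checked numerically over a finite range (not a proof, just a sanity check; the helper `holds` is ours, not from the slides):

```python
# Check f(n) <= c*g(n) for every integer n with k < n <= upto.
def holds(f, g, c, k, upto=10_000):
    return all(f(n) <= c * g(n) for n in range(k + 1, upto + 1))

print(holds(lambda n: 30 * n + 8, lambda n: n,     c=31, k=8))  # True
print(holds(lambda n: n * n + 1,  lambda n: n * n, c=2,  k=1))  # True
print(holds(lambda n: 30 * n + 8, lambda n: n,     c=31, k=0))  # False: fails below n=8
```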

When we simplify a complexity expression with big-O notation, we normally focus on the highest-order term.

Why? Because it determines how fast the function grows.

In the first case, the highest-order term is the linear term 30n; in the second case, it is the quadratic term n².

[Plot: value of function against increasing n, showing the curves n, 30n+8, and cn = 31n; for n > k = 8, the curve 30n+8 lies below 31n, illustrating 30n+8 ∈ O(n).]

- Note that 30n+8 isn't less than n anywhere (for n > 0).
- It isn't even less than 31n everywhere.
- But it is less than 31n everywhere to the right of n = 8.

- aₙxⁿ + … + a₁x + a₀ is O(xⁿ) for any real numbers aₙ, …, a₀ and any nonnegative integer n.

A polynomial function of a variable x of degree n has this general form.
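One standard choice of witnesses (an assumption here, not stated on the slide) is c = |aₙ| + … + |a₀| with k = 1, since for x > 1 every |aᵢ|xⁱ ≤ |aᵢ|xⁿ. A quick Python check with an example cubic:

```python
# Evaluate a_0 + a_1*x + ... + a_n*x^n, coeffs = [a_0, ..., a_n].
def poly(coeffs, x):
    return sum(a * x ** i for i, a in enumerate(coeffs))

coeffs = [8, -4, 0, 5]            # example cubic: 5x^3 - 4x + 8
c = sum(abs(a) for a in coeffs)   # c = 17, paired with k = 1
assert all(abs(poly(coeffs, x)) <= c * x ** 3 for x in range(2, 1000))
print("bound holds with c =", c)
```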

- Big O, as a relation (ch. 7), is transitive: f ∈ O(g) ∧ g ∈ O(h) → f ∈ O(h)
- Sums of functions: if g ∈ O(f) and h ∈ O(f), then g+h ∈ O(f).

- ∀c>0, O(cf) = O(f+c) = O(f−c) = O(f)
- f₁ ∈ O(g₁) ∧ f₂ ∈ O(g₂) →
- f₁f₂ ∈ O(g₁g₂)
- f₁+f₂ ∈ O(g₁+g₂) = O(max(g₁,g₂)) = O(g₁) if g₂ ∈ O(g₁)

- ∀f,g and constants a,b ∈ R, with b ≥ 0:
- af = O(f) (e.g. 3x² = O(x²))
- f + O(f) = O(f) (e.g. f = x² + x = O(x²))

- Also, if f = Ω(1) (at least order 1), then:
- |f|¹⁻ᵇ = O(f) (e.g. x⁻¹ = O(x))
- (log_b |f|)ᵃ = O(f) (e.g. log x = O(x))
- g = O(fg) (e.g. x = O(x log x))
- fg ≠ O(g) (e.g. x log x ∉ O(x))
- a = O(f) (e.g. 3 = O(x))

- If f ∈ O(g) and g ∈ O(f), then we say "g and f are of the same order" or "f is (exactly) order g" and write f ∈ Θ(g).
- Another equivalent definition: Θ(g) ≝ {f: R→R | ∃c₁,c₂,k ∈ R⁺: ∀x>k: |c₁g(x)| ≤ |f(x)| ≤ |c₂g(x)|}
- "Everywhere beyond some point k, f(x) lies in between two multiples of g(x)."
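The sandwich definition can be spot-checked the same way as the big-O witnesses; here 30n+8 ∈ Θ(n), with c₁ = 30, c₂ = 31, k = 8 as one (not unique) choice of constants:

```python
# Sandwich check: c1*g(n) <= f(n) <= c2*g(n) for all k < n <= upto.
def theta_holds(f, g, c1, c2, k, upto=10_000):
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(k + 1, upto + 1))

print(theta_holds(lambda n: 30 * n + 8, lambda n: n, c1=30, c2=31, k=8))  # True
```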

- Mostly like the rules for O(), except:
- ∀f,g > 0 and constants a,b ∈ R, with b > 0:
- af ∈ Θ(f) (same as with O)
- f ∉ Θ(fg) unless g = Θ(1) (unlike O)
- |f|¹⁻ᵇ ∉ Θ(f) (unlike with O)
- (log_b |f|)ᶜ ∉ Θ(f) (unlike with O)
- The functions in the latter two cases we say are strictly of lower order than Θ(f).

- Ω(g) = {f | g ∈ O(f)}: "the functions that are at least order g."
- o(g) = {f | ∀c>0 ∃k ∀x>k: |f(x)| < |cg(x)|}: "the functions that are strictly lower order than g." o(g) ⊂ O(g) − Θ(g).
- ω(g) = {f | ∀c>0 ∃k ∀x>k: |cg(x)| < |f(x)|}: "the functions that are strictly higher order than g." ω(g) ⊂ Ω(g) − Θ(g).

- O: big-Oh (≤): f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
- Ω: big-Omega (≥): f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
- Θ: big-Theta (=): f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
- o: little-oh (<): f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n).
- ω: little-omega (>): f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n).

f(n) is o(g(n)) if

lim_{n→∞} f(n)/g(n) = 0
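For example, log n is o(n) because the ratio tends to 0; a quick Python look at the ratio as n grows:

```python
import math

# log n grows strictly slower than n: the ratio log(n)/n heads toward 0.
for n in (10, 10 ** 3, 10 ** 6, 10 ** 9):
    print(n, math.log(n) / n)
```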

These common functions are listed in increasing order of growth:

log log n, log n, log²n, √n, n, n log n, n², nᵏ (k > 2), 2ⁿ, n!, nⁿ

Hint: L'Hôpital's rule can be used to compare any two of them via the limit of their ratio.

O(nᵏ): tractable

O(2ⁿ), O(n!): intractable
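To see why exponential growth is labeled intractable, compare step counts at a modest input size (the models below are illustrative):

```python
# Steps needed at n = 100 under a polynomial vs. an exponential cost model.
n = 100
polynomial = n ** 3    # 1,000,000 steps — feasible on any machine
exponential = 2 ** n   # ~1.3e30 steps — hopeless at any clock speed
print(polynomial, exponential)
```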

- Let's write f ≺ g to mean f ∈ o(g), and f ~ g to mean f ∈ Θ(g).

- Note that, for k > 1, the following are true:
- 1 ≺ log log n ≺ log n ~ log_k n ≺ logᵏn ≺ n^(1/k) ≺ n ≺ n log n ≺ nᵏ ≺ kⁿ ≺ n! ≺ nⁿ …

- Subset relations between order-of-growth sets.

[Venn diagram over the functions R→R: O(f) contains o(f) and Θ(f); Ω(f) contains ω(f) and Θ(f); Θ(f) = O(f) ∩ Ω(f) and contains f itself; o(f) and ω(f) are disjoint from Θ(f).]

- [Figure: a function that is O(x), but neither o(x) nor Θ(x).]

Definitions of order-of-growth sets, for g: R→R:

- O(g) ≝ {f | ∃c>0 ∃k ∀x>k: |f(x)| ≤ |cg(x)|}
- o(g) ≝ {f | ∀c>0 ∃k ∀x>k: |f(x)| < |cg(x)|}
- Ω(g) ≝ {f | g ∈ O(f)}
- ω(g) ≝ {f | g ∈ o(f)}
- Θ(g) ≝ O(g) ∩ Ω(g)