
Asymptotic Notation (O, Ω, Θ)


Presentation Transcript


  1. Asymptotic Notation (O, Ω, Θ) • Describes the behavior of the time or space complexity for large instance characteristics • Common asymptotic functions • 1 (constant), log n, n (linear) • n log n, n², n³ • 2ⁿ (exponential), n! • where n usually refers to the size of the input (the number of data items)

  2.–4. Common asymptotic functions (growth-rate tables and charts; figures not included in the transcript)

  5. Big-O notation • The big-O notation provides an upper bound for the function f • Definition: • f(n) = O(g(n)) iff positive constants c and n₀ exist such that f(n) ≤ c·g(n) for all n ≥ n₀ • e.g. if f(n) = 10n² + 4n + 2, then f(n) = O(n²), and also O(n³), O(n⁴), …
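The constants in the definition can be exhibited concretely. A minimal sketch for the slide's example; the choice c = 16, n₀ = 1 is one valid witness (since 10n² + 4n + 2 ≤ 10n² + 4n² + 2n² = 16n² for n ≥ 1), not the only one:

```python
# f(n) = 10n^2 + 4n + 2, the example from the slide.
def f(n):
    return 10 * n**2 + 4 * n + 2

# Witnesses for f(n) = O(n^2): c = 16 and n0 = 1 satisfy
# f(n) <= c * n^2 for all n >= n0.
c, n0 = 16, 1
assert all(f(n) <= c * n**2 for n in range(n0, 10_000))
```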

  6. Ω (Omega) notation • The Ω notation provides a lower bound for the function f • Definition: • f(n) = Ω(g(n)) iff positive constants c and n₀ exist such that f(n) ≥ c·g(n) for all n ≥ n₀ • e.g. if f(n) = 10n² + 4n + 2, then f(n) = Ω(n²), and also Ω(n), Ω(1)

  7. Θ notation • The Θ notation is used when the function f can be bounded both from above and below by the same function g • Definition: • f(n) = Θ(g(n)) iff positive constants c₁ and c₂, and an n₀, exist such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ • e.g. if f(n) = 10n² + 4n + 2, then f(n) = Ω(n²) and f(n) = O(n²); therefore f(n) = Θ(n²)
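Both bounds of the Θ definition can be checked together. A sketch, assuming the constants c₁ = 10, c₂ = 16, n₀ = 1 (any valid constants would do):

```python
# f(n) = 10n^2 + 4n + 2, the running example.
def f(n):
    return 10 * n**2 + 4 * n + 2

# c1*g(n) <= f(n) <= c2*g(n) with g(n) = n^2 for all n >= n0 = 1,
# witnessing f(n) = Theta(n^2).
c1, c2, n0 = 10, 16, 1
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
```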

  8. Practical Complexities (assuming 10⁹ instructions/second; timing table not included in the transcript)

  9. Impractical Complexities (assuming 10⁹ instructions/second; timing table not included in the transcript)
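The missing tables can be reproduced approximately. A sketch, assuming one "step" of the algorithm costs one instruction, showing why polynomial growth rates are practical and exponential ones are not, even at 10⁹ instructions/second:

```python
import math

SPEED = 1e9  # instructions per second (the slides' assumption)
n = 40       # a modest input size

# Seconds needed for n steps of each growth rate.
seconds = {
    "n":       n / SPEED,
    "n log n": n * math.log2(n) / SPEED,
    "n^2":     n**2 / SPEED,
    "n^3":     n**3 / SPEED,
    "2^n":     2**n / SPEED,
}
for name, t in seconds.items():
    print(f"{name:8s} {t:.3g} s")
# Even at n = 40, the polynomial rows finish in well under a millisecond,
# while 2^n already takes over a thousand seconds.
```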

  10. Faster Computer vs. Better Algorithm • An algorithmic improvement is often more useful than a hardware improvement, e.g. improving 2ⁿ to n³.
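The slide's claim can be made quantitative. A sketch, assuming a hypothetical machine 1000× faster: for a 2ⁿ algorithm the solvable problem size grows only by the additive term log₂(1000) ≈ 10, while for an n³ algorithm it grows by the multiplicative factor 1000^(1/3) = 10:

```python
import math

speedup = 1000   # hypothetical hardware improvement
n1 = 50          # problem size solvable in some fixed time budget today

# 2^n:  2^n2 = speedup * 2^n1  =>  n2 = n1 + log2(speedup)
n2_exponential = n1 + math.log2(speedup)   # ~60: barely larger

# n^3:  n2^3 = speedup * n1^3  =>  n2 = n1 * speedup**(1/3)
n2_cubic = n1 * speedup ** (1 / 3)         # ~500: ten times larger
```

Replacing the 2ⁿ algorithm with an n³ one helps far more than the faster hardware does.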

  11. Amortized Complexity of a Task Sequence • Suppose that a sequence of n tasks is performed. • The worst-case cost of a task is cwc. • Let ci be the (actual) cost of the ith task in this sequence. • So, ci ≤ cwc, 1 ≤ i ≤ n. • n·cwc is an upper bound on the cost of the sequence. • j·cwc is an upper bound on the cost of the first j tasks.

  12. Task Sequence • Let cavg be the average cost of a task in this sequence. • So, cavg = (Σ ci) / n. • n·cavg is the cost of the sequence. • j·cavg is not an upper bound on the cost of the first j tasks. • Usually, determining cavg is quite hard.

  13. Amortized Complexity • At times, a better upper bound on the sequence cost than j·cwc or n·cwc is obtained using amortized complexity. • The purpose of amortized analysis is to get a tighter upper bound on the actual cost.

  14. Amortized Complexity • The amortized complexity of a task is the amount you charge the task. • The conventional way to bound the cost of doing a task n times is to use one of the expressions • n·(worst-case cost of a task) • Σ(worst-case cost of task i) • The amortized-complexity way to bound the cost of doing a task n times is to use one of the expressions • n·(amortized cost of a task) • Σ(amortized cost of task i)

  15. Amortized Complexity • The amortized costs of the individual tasks in any task sequence must satisfy: Σ(actual cost of task i) ≤ Σ(amortized cost of task i) • So, we can use Σ(amortized cost of task i) as a bound on the actual complexity of the task sequence.

  16. Amortized Complexity • The amortized complexity of a task may bear no direct relationship to the actual complexity of the task.

  17. Amortized Complexity • In conventional complexity analysis, each task is charged an amount that is ≥ its cost: Σ(actual cost of task i) ≤ Σ(worst-case cost of task i) • In amortized analysis, some tasks may be charged an amount that is < their cost: Σ(actual cost of task i) ≤ Σ(amortized cost of task i)

  18. Amortized complexity example • Suppose a sequence of insert/delete operations: i1, i2, d1, i3, i4, i5, i6, i7, i8, d2, i9, where the actual cost of each insert is 1, d1 costs 8, and d2 costs 12. So the total actual cost is 29. • Conventionally, to get an upper bound on the cost of the sequence we take the worst case of each operation type: insert costs 1 and delete costs 12. The upper bound is then 9·1 + 2·12 = 33.

  19. Amortized complexity • In an amortization scheme we charge the actual cost of some operations to others. • If the amortized cost of each insert is 2, then d1 and d2 each become 6; the total is 9·2 + 2·6 = 30, which still bounds the actual cost 29 but is less than the worst-case upper bound of 33.

  20. Amortized Complexity • More examples • the amortized complexity of a sequence of Union and Find operations (will be discussed in Chapter 8) • the amortized complexity of a sequence of inserts and deletes on binomial heaps
