
Analysis of algorithms




  1. Analysis of algorithms
  • Best case: usually the easiest to analyze
  • Worst case
  • Average case: usually the hardest to analyze

  2. Straight insertion sort
  input:  7, 5, 1, 4, 3
  pass 1: 5, 7, 1, 4, 3
  pass 2: 1, 5, 7, 4, 3
  pass 3: 1, 4, 5, 7, 3
  pass 4: 1, 3, 4, 5, 7
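The passes above can be sketched in Python (an illustrative routine, not code from the slides; the function name is my own). Each insertion saves the current element, shifts the larger elements of the sorted prefix one place right, and writes the element back:

```python
def insertion_sort(a):
    """Straight insertion sort; returns a new sorted list."""
    a = list(a)
    for j in range(1, len(a)):
        x = a[j]               # save the element being inserted (1 movement)
        i = j - 1
        while i >= 0 and a[i] > x:
            a[i + 1] = a[i]    # shift each larger element one place right
            i -= 1
        a[i + 1] = x           # write x back into place (1 movement)
    return a

print(insertion_sort([7, 5, 1, 4, 3]))  # [1, 3, 4, 5, 7]
```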

  3. Analysis of the # of movements
  M: # of data movements in straight insertion sort.
  Example: inserting 4 into the sorted prefix 1, 5, 7 (with 3 still waiting):
  4 is first saved in a temporary variable, the d3 = 2 larger elements
  (5 and 7) are shifted right, and 4 is written back, for 2 + d3 movements.

  4. Analysis by the inversion table
  • best case: already sorted
    di = 0 for 1 ≤ i ≤ n
    M = 2(n - 1) = Θ(n)
  • worst case: reversely sorted
    d1 = n - 1, d2 = n - 2, …, di = n - i, …, dn = 0
    M = 2(n - 1) + Σ di = 2(n - 1) + n(n - 1)/2 = Θ(n²)
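The two formulas above can be checked numerically. This is a minimal sketch (the name `movement_count` is my own), assuming each insertion costs 2 movements (save and restore) plus d_i shifts, where d_i is the number of earlier elements greater than the element being inserted:

```python
def movement_count(a):
    """Count data movements M of straight insertion sort on list a:
    2 movements per insertion plus one movement per inversion d_i."""
    M = 0
    for i in range(1, len(a)):
        d_i = sum(1 for j in range(i) if a[j] > a[i])
        M += 2 + d_i
    return M

n = 10
print(movement_count(list(range(1, n + 1))))   # best case: 2(n-1) = 18
print(movement_count(list(range(n, 0, -1))))   # worst case: 2(n-1) + n(n-1)/2 = 63
```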

  5. average case: xj is being inserted into the sorted sequence x1, x2, …, xj-1
  • the probability that xj is the largest: 1/j; it takes 2 data movements
  • the probability that xj is the second largest: 1/j; it takes 3 data movements
    ⋮
  • expected # of movements for inserting xj:
    (1/j)(2 + 3 + … + (j + 1)) = (j + 3)/2
  • summing over j = 2, …, n gives M = Θ(n²) on average

  6. Straight selection sort
  7 5 1 4 3 → 1 5 7 4 3 → 1 3 7 4 5 → 1 3 4 7 5 → 1 3 4 5 7
  • Only consider the # of changes of the flag which is used for selecting the smallest number in each iteration.
  • best case: Θ(n²)
  • worst case: Θ(n²)
  • average case: Θ(n²)
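A sketch of the sort with an explicit flag-change counter (illustrative; the names `selection_sort` and `flag_changes` are my own, not from the slides):

```python
def selection_sort(a):
    """Straight selection sort; returns (sorted list, # of flag changes).
    The 'flag' m records the index of the smallest element seen so far."""
    a = list(a)
    n = len(a)
    flag_changes = 0
    for i in range(n - 1):
        m = i                        # flag: index of current minimum
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j                # a change of the flag
                flag_changes += 1
        a[i], a[m] = a[m], a[i]      # swap the minimum into place
    return a, flag_changes

print(selection_sort([7, 5, 1, 4, 3]))  # ([1, 3, 4, 5, 7], 6)
```

Note that the inner comparison loop alone already costs n(n-1)/2 = Θ(n²) comparisons regardless of the input.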

  7. Quick sort
  • Split the input around a pivot element, then recursively apply the same procedure to the two sublists.

  8. Best case: Θ(n log n)
  • The list is split into two sublists of almost equal size.
  • log n rounds are needed.
  • In each round, n comparisons (ignoring the element used to split) are required.

  9. Worst case: Θ(n²)
  • In each round, the number used to split is either the smallest or the largest, so one sublist is empty and the problem size shrinks by only one element per round.

  10. Average case: Θ(n log n)
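The recursive procedure can be sketched as follows (an illustrative list-based version using the first element as pivot, not the slides' own code; in-place partitioning is the usual practical choice):

```python
def quicksort(a):
    """Quicksort: split around a pivot, then recurse on both sublists."""
    if len(a) <= 1:
        return list(a)
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x <= pivot]   # elements not larger than the pivot
    right = [x for x in rest if x > pivot]   # elements larger than the pivot
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([7, 5, 1, 4, 3]))  # [1, 3, 4, 5, 7]
```

On a sorted input this version exhibits exactly the worst case above: one sublist is always empty.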

  11. Lower bound
  • Def: A lower bound of a problem is the least time complexity required for any algorithm which can be used to solve this problem.
    ☆ worst case lower bound
    ☆ average case lower bound
  • The lower bound for a problem is not unique.
  • e.g. Ω(1), Ω(n), and Ω(n log n) are all lower bounds for sorting.
  • (Ω(1) and Ω(n) are trivial)

  12. At present, if the highest known lower bound of a problem is Ω(n log n) and the time complexity of the best known algorithm is O(n²):
  • We may try to find a higher lower bound.
  • We may try to find a better algorithm.
  • Both the lower bound and the algorithm may be improved.
  • If the present lower bound is Ω(n log n) and there is an algorithm with time complexity O(n log n), then the algorithm is optimal.

  13. The worst case lower bound of sorting
  6 permutations for 3 data elements a1, a2, a3:
  (1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), (3, 2, 1)

  14. Straight insertion sort:
  • input data: (2, 3, 1)
    (1) a1:a2
    (2) a2:a3, a2↔a3
    (3) a1:a2, a1↔a2
  • input data: (2, 1, 3)
    (1) a1:a2, a1↔a2
    (2) a2:a3

  15. Decision tree for straight insertion sort

  16. Lower bound of sorting
  • To find the lower bound, we have to find the smallest depth of a binary decision tree.
  • n! distinct permutations ⇒ n! leaf nodes in the binary decision tree.
  • A balanced tree has the smallest depth: ⌈log(n!)⌉ = Ω(n log n)
  ⇒ lower bound for sorting: Ω(n log n)

  17. Method 1: the largest n/2 factors of n! are each at least n/2, so
  n! = n(n-1)…(2)(1) ≥ (n/2)^(n/2), and hence
  log n! ≥ (n/2) log(n/2) = Ω(n log n).

  18. Method 2:
  • Stirling's approximation: n! ≈ √(2πn) (n/e)^n
  • log n! ≈ (1/2) log(2πn) + n log n - n log e = Θ(n log n)
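The quality of the approximation is easy to check numerically; this sketch (the helper name `stirling_log2` is my own) compares log2(n!) with the logarithm of Stirling's formula:

```python
import math

def stirling_log2(n):
    """log2 of Stirling's approximation sqrt(2*pi*n) * (n/e)^n."""
    return 0.5 * math.log2(2 * math.pi * n) + n * math.log2(n) - n * math.log2(math.e)

for n in (8, 64, 512):
    exact = math.log2(math.factorial(n))
    print(f"n={n:4d}  log2(n!)={exact:10.2f}  Stirling={stirling_log2(n):10.2f}")
```

Already at n = 64 the two values agree to well under one bit, and the n log n term dominates.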

  19. Heapsort: an optimal sorting algorithm
  • A heap: the key of every parent node ≥ the keys of its sons

  20. Heapsort has two phases:
  • Phase 1 (construction): build a heap from the input.
  • Phase 2 (output): repeatedly output the maximum (the root) and restore the heap.

  21. Phase 1: construction
  input data: 4, 37, 26, 15, 48
  • restore the subtree rooted at A(2)
  • restore the tree rooted at A(1)

  22. Phase 2: output

  23. Implementation
  • using a linear array, not a linked binary tree.
  • The sons of A(h) are A(2h) and A(2h+1).
  • time complexity: O(n log n)
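Both phases can be sketched over a 1-based linear array exactly as described, with the sons of A(h) at A(2h) and A(2h+1) (an illustrative version; the names `heapsort` and `restore` are my own):

```python
def heapsort(a):
    """Heapsort over a 1-based array A(1..n); sons of A(h) are A(2h), A(2h+1)."""
    A = [None] + list(a)       # A[0] unused so the index arithmetic works
    n = len(a)

    def restore(i, m):
        """Sift A[i] down so the subtree rooted at i (within A[1..m]) is a heap."""
        while 2 * i <= m:
            j = 2 * i                          # left son
            if j + 1 <= m and A[j + 1] > A[j]:
                j += 1                         # right son is larger
            if A[i] >= A[j]:
                break                          # heap property holds
            A[i], A[j] = A[j], A[i]
            i = j

    # Phase 1: construction, from the last internal node up to the root
    for i in range(n // 2, 0, -1):
        restore(i, n)
    # Phase 2: output -- swap the maximum to the end, restore the rest
    for m in range(n, 1, -1):
        A[1], A[m] = A[m], A[1]
        restore(1, m - 1)
    return A[1:]

print(heapsort([4, 37, 26, 15, 48]))  # [4, 15, 26, 37, 48]
```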

  24. Time complexity of Phase 1 (construction): O(n) overall.

  25. Time complexity of Phase 2 (output): n - 1 outputs, each restore costing O(log n), so O(n log n) overall.

  26. Quicksort & Heapsort
  • Quicksort is optimal in the average case (Θ(n log n) on average).
  • (i) The worst case time complexity of heapsort is O(n log n).
    (ii) The average case lower bound of sorting is Ω(n log n).
  ⇒ the average case time complexity of heapsort is Θ(n log n)
  ⇒ Heapsort is optimal in the average case.
