
6.006- Introduction to Algorithms




Presentation Transcript


  1. 6.006- Introduction to Algorithms Lecture 10 Prof. Constantinos Daskalakis CLRS 8.1-8.4

  2. Menu • Show that Θ(n lg n) is the best possible running time for a sorting algorithm. • Design an algorithm that sorts in Θ(n) time. • Hint: maybe the models are different?

  3. Comparison sort All the sorting algorithms we have seen so far are comparison sorts: they only use comparisons to determine the relative order of elements. So the elements could be numbers, water samples compared on the basis of their chloride concentration, etc. The best running time that we have seen for comparison sorting is O(n log n). Is O(n log n) the best we can do? Decision trees can help us answer this question.
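To make the comparison model concrete, here is a small sketch (not from the lecture): merge sort instrumented so that every element comparison passes through a single counter. The point is that the algorithm's entire behavior is driven by the outcomes of `ai <= aj` tests, which is exactly what the decision tree records.

```python
def merge_sort(a, counter):
    """Sort a list using only pairwise comparisons; counter[0] counts them."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], counter)
    right = merge_sort(a[mid:], counter)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1          # the only place elements are ever inspected
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

comparisons = [0]
print(merge_sort([9, 4, 6, 1, 7], comparisons), comparisons[0])
```

Because the elements are touched only inside the one `<=` test, the same code sorts numbers, strings, or water samples, given any way to compare two of them.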

  4. Decision tree [Figure: binary tree with internal nodes a1:a2, a2:a3, a1:a3 and six leaves, one per permutation of ⟨a1, a2, a3⟩.] A recipe for sorting n things ⟨a1, a2, …, an⟩: - Nodes are suggested comparisons: ai:aj means compare ai to aj. - Branching direction depends on the outcome of the comparison (≤ goes left, ≥ goes right). - Leaves are labeled with the permutation corresponding to the outcome of the sorting.

  5. Decision-tree example Sort ⟨a1, a2, a3⟩ = ⟨9, 4, 6⟩. [Figure: first comparison a1:a2, i.e. 9 vs. 4, takes the ≥ branch.] • Each internal node is labeled ai:aj for i, j ∈ {1, 2, …, n}. • The left subtree shows subsequent comparisons if ai ≤ aj. • The right subtree shows subsequent comparisons if ai ≥ aj. • Each leaf contains a permutation ⟨π(1), π(2), …, π(n)⟩ to indicate that the ordering aπ(1) ≤ aπ(2) ≤ … ≤ aπ(n) was found.

  6. Decision-tree example (continued) Sort ⟨a1, a2, a3⟩ = ⟨9, 4, 6⟩. [Figure: second comparison a1:a3, i.e. 9 vs. 6, takes the ≥ branch; legend as on the previous slide.]

  7. Decision-tree example (continued) Sort ⟨a1, a2, a3⟩ = ⟨9, 4, 6⟩. [Figure: third comparison a2:a3, i.e. 4 vs. 6, takes the ≤ branch; legend as before.]

  8. Decision-tree example (conclusion) Sort ⟨a1, a2, a3⟩ = ⟨9, 4, 6⟩. [Figure: the path ends at the leaf ⟨a2, a3, a1⟩, i.e. the ordering 4 ≤ 6 ≤ 9 was found.]

  9. Decision-tree model A decision tree can model the execution of any comparison sort: • One tree for each input size n. • A path from the root to a leaf represents a trace of comparisons that the algorithm may perform. • The running time of the algorithm on that input = the length of the path taken. • Worst-case running time = height of the tree.

  10. Lower bound for decision-tree sorting Theorem. Any decision tree for n elements must have height Ω(n log n). Proof. (Hint: how many leaves are there?) • The tree must contain ≥ n! leaves, since there are n! possible permutations. • A height-h binary tree has ≤ 2^h leaves. • For the tree to be able to sort, it must be that: 2^h ≥ n! ⇒ h ≥ log(n!) (log is monotonically increasing) ≥ log((n/e)^n) (Stirling's formula) = n log n − n log e = Ω(n log n).
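A quick numeric sanity check of the bound (not part of the lecture): the minimum possible height is ⌈log2(n!)⌉, and it tracks n log2 n from below, as the proof predicts.

```python
import math

# For a few input sizes, compare the decision-tree height lower bound
# ceil(log2(n!)) against n * log2(n).
for n in (4, 16, 64, 256):
    min_height = math.ceil(math.log2(math.factorial(n)))
    print(n, min_height, round(n * math.log2(n)))
```

For every n the first number is at most the second, and the ratio approaches 1 as n grows, which is the Ω(n log n) statement in concrete form.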

  11. Sorting in linear time Counting sort: no comparisons between elements. • Input: A[1 . . n], where A[j] ∈ {1, 2, …, k}. • Output: B[1 . . n], a sorted permutation of A. • Auxiliary storage: C[1 . . k].

  12. Counting-sort example n = 5, k = 4. A: [4, 1, 3, 4, 3] (indices 1..5). C: one index for each possible key stored in A (indices 1..4). B: the output array (indices 1..5).

  13. Loop 1: initialization A: [4, 1, 3, 4, 3], C: [0, 0, 0, 0], B: empty. for i ← 1 to k do C[i] ← 0

  14. Loop 2: count frequencies (j = 1) A: [4, 1, 3, 4, 3], C: [0, 0, 0, 1], B: empty. for j ← 1 to n do C[A[j]] ← C[A[j]] + 1 ⊳ C[i] = |{key = i}|

  15. Loop 2: count frequencies (j = 2) C: [1, 0, 0, 1].

  16. Loop 2: count frequencies (j = 3) C: [1, 0, 1, 1].

  17. Loop 2: count frequencies (j = 4) C: [1, 0, 1, 2].

  18. Loop 2: count frequencies (j = 5) C: [1, 0, 2, 2].

  19. Loop 2: count frequencies (done) C: [1, 0, 2, 2].

  20. [A parenthesis: a quick finish A: [4, 1, 3, 4, 3], C: [1, 0, 2, 2], B: empty. Walk through the frequency array and place the appropriate number of each key in the output array…

  21. A parenthesis: a quick finish B: [1, _, _, _, _].

  22. A parenthesis: a quick finish B: [1, _, _, _, _] (key 2 has frequency 0, so nothing is placed).

  23. A parenthesis: a quick finish B: [1, 3, 3, _, _].

  24. A parenthesis: a quick finish B: [1, 3, 3, 4, 4]. B is sorted! But it is not "stably sorted"…]
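The slides' "quick finish" can be sketched in a few lines (an illustration, not the lecture's code): expand the frequency array C directly into sorted output. The output values are sorted, but the original elements are not moved individually, so any satellite data attached to them would be lost; this is why the variant is not stable.

```python
def quick_finish(A, k):
    """Sort A (values in 1..k) by expanding a frequency array. Not stable."""
    C = [0] * (k + 1)            # C[i] = number of occurrences of key i
    for x in A:
        C[x] += 1
    B = []
    for key in range(1, k + 1):
        B.extend([key] * C[key])  # emit each key C[key] times
    return B

print(quick_finish([4, 1, 3, 4, 3], 4))   # [1, 3, 3, 4, 4]
```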

  25. Loop 3: from frequencies to cumulative frequencies… A: [4, 1, 3, 4, 3], C: [1, 0, 2, 2], B: empty. for i ← 2 to k do C[i] ← C[i] + C[i−1] ⊳ C[i] = |{key ≤ i}|

  26. Loop 3 (i = 2) C: [1, 0, 2, 2] → C′: [1, 1, 2, 2].

  27. Loop 3 (i = 3) C′: [1, 1, 3, 2].

  28. Loop 3 (i = 4) C′: [1, 1, 3, 5].

  29. Loop 4: permute elements of A A: [4, 1, 3, 4, 3], C: [1, 1, 3, 5], B: empty. for j ← n downto 1 do B[C[A[j]]] ← A[j]; C[A[j]] ← C[A[j]] − 1

  30. Loop 4 (j = 5) There are exactly 3 elements ≤ A[5] = 3. So where should I place A[5]? At position C[3] = 3.

  31. Loop 4 B: [_, _, 3, _, _]. Used up one 3; update the counter in C for the next 3 that shows up…

  32.-33. Loop 4 C: [1, 1, 2, 5].

  34. Loop 4 (j = 4) There are exactly 5 elements ≤ A[4] = 4. So where should I place A[4]? At position C[4] = 5.

  35.-37. Loop 4 B: [_, _, 3, _, 4], C: [1, 1, 2, 4].

  38.-40. Loop 4 (j = 3) B: [_, 3, 3, _, 4], C: [1, 1, 1, 4].

  41.-43. Loop 4 (j = 2) B: [1, 3, 3, _, 4], C: [0, 1, 1, 4].

  44. Loop 4 (j = 1) B: [1, 3, 3, 4, 4], C: [0, 1, 1, 3].

  45. Counting sort
for i ← 1 to k
  do C[i] ← 0                         ⊳ Θ(k): initialization
for j ← 1 to n
  do C[A[j]] ← C[A[j]] + 1            ⊳ Θ(n): store in C the frequencies of the different keys in A, i.e. C[i] = |{key = i}|
for i ← 2 to k
  do C[i] ← C[i] + C[i−1]             ⊳ Θ(k): store in C the cumulative frequencies of the different keys in A, i.e. C[i] = |{key ≤ i}|
for j ← n downto 1
  do B[C[A[j]]] ← A[j]
     C[A[j]] ← C[A[j]] − 1            ⊳ Θ(n): build the sorted permutation using the cumulative frequencies
Total: Θ(n + k)
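The four loops above transcribe directly to Python. A minimal sketch (the slide's 1-based arrays are simulated with an unused slot at index 0):

```python
def counting_sort(A, k):
    """Stable counting sort of A (values in 1..k), following the slide's loops."""
    n = len(A)
    A = [None] + list(A)          # A[1..n]
    C = [0] * (k + 1)             # loop 1: C[1..k] initialized to 0
    for j in range(1, n + 1):     # loop 2: C[i] = |{key = i}|
        C[A[j]] += 1
    for i in range(2, k + 1):     # loop 3: C[i] = |{key <= i}|
        C[i] += C[i - 1]
    B = [None] * (n + 1)          # B[1..n]
    for j in range(n, 0, -1):     # loop 4: place A[j] at its final position
        B[C[A[j]]] = A[j]
        C[A[j]] -= 1              # used up one slot for this key
    return B[1:]

print(counting_sort([4, 1, 3, 4, 3], 4))   # [1, 3, 3, 4, 4]
```

Loop 4 scans from n down to 1 so that, among equal keys, the last occurrence lands in the highest remaining slot; this is what makes the sort stable.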

  46. Running time • If k = O(n), then counting sort takes Θ(n) time. • But sorting takes Ω(n lg n) time! • Where's the fallacy? • Answer: • Comparison sorting takes Ω(n lg n) time. • Counting sort is not a comparison sort. • In fact, not a single comparison between elements occurs!

  47. Stable sorting A: [4, 1, 3, 4, 3], B: [1, 3, 3, 4, 4]. Counting sort is a stable sort: it preserves the input order among equal elements. This does not seem useful for this example, but imagine a situation where each element stored in A comes with some "personalized information" (wait 2 slides…).
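Stability can be checked concretely. A sketch (not from the slides) where each key carries a tag as its "personalized information", sorted by a 0-indexed variant of the same counting sort:

```python
def counting_sort_records(records, k, key):
    """Stable counting sort of records by key(record) in 1..k (0-indexed output)."""
    C = [0] * (k + 1)
    for r in records:                 # frequencies
        C[key(r)] += 1
    for i in range(2, k + 1):         # cumulative frequencies
        C[i] += C[i - 1]
    B = [None] * len(records)
    for r in reversed(records):       # reverse scan preserves input order of ties
        C[key(r)] -= 1
        B[C[key(r)]] = r
    return B

records = [(4, 'a'), (1, 'b'), (3, 'c'), (4, 'd'), (3, 'e')]
print(counting_sort_records(records, 4, lambda r: r[0]))
# [(1, 'b'), (3, 'c'), (3, 'e'), (4, 'a'), (4, 'd')]
```

Note that among the two 3-keys, 'c' still precedes 'e', and among the two 4-keys, 'a' still precedes 'd': equal keys keep their input order.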

  48. Radix sort • Origin: Herman Hollerith's card-sorting machine for the 1890 U.S. Census. (See Appendix.) • Digit-by-digit sort. • Hollerith's original (bad) idea: sort on the most-significant digit first. • Good idea: sort on the least-significant digit first with an auxiliary stable sort.

  49. Operation of radix sort

Input | After 1st digit | After 2nd digit | After 3rd digit
329   | 720             | 720             | 329
457   | 355             | 329             | 355
657   | 436             | 436             | 436
839   | 457             | 839             | 457
436   | 657             | 355             | 657
720   | 329             | 457             | 720
355   | 839             | 657             | 839

  50. Correctness of radix sort • Induction on digit position. • Assume that the numbers are sorted by their low-order t − 1 digits; then sort on digit t with a stable sort. [Figure: the pass on the highest digit, taking 720, 329, 436, 839, 355, 457, 657 to 329, 355, 436, 457, 657, 720, 839.]
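The whole scheme fits in a short sketch (assuming base-10 keys with a known number of digits): one stable counting-sort pass per digit, least significant first, reproducing the table on slide 49.

```python
def radix_sort(nums, digits):
    """LSD radix sort: one stable counting-sort pass per base-10 digit."""
    for t in range(digits):                    # t-th digit, least significant first
        C = [0] * 10
        for x in nums:                         # digit frequencies
            C[(x // 10**t) % 10] += 1
        for d in range(1, 10):                 # cumulative frequencies
            C[d] += C[d - 1]
        B = [0] * len(nums)
        for x in reversed(nums):               # reverse pass keeps the sort stable
            d = (x // 10**t) % 10
            C[d] -= 1
            B[C[d]] = x
        nums = B
    return nums

print(radix_sort([329, 457, 657, 839, 436, 720, 355], 3))
# [329, 355, 436, 457, 657, 720, 839]
```

Stability of each pass is exactly what the induction needs: numbers that agree on digit t stay in the order established by the earlier passes on the lower-order digits.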
