
Sorting


dezso


  1. Sorting

  2. Outline and Reading • Bubble Sort (§6.4) • Merge Sort (§11.1) • Quick Sort (§11.2) • Radix Sort and Bucket Sort (§11.3) • Selection (§11.5) • Summary of sorting algorithms

  3. Bubble Sort

  4.–26. Bubble Sort (trace on 5 7 2 6 9 3; the array is shown after each swap)
  Pass 1: 5 7 2 6 9 3 → 5 2 7 6 9 3 → 5 2 6 7 9 3 → 5 2 6 7 3 9
  Pass 2: 5 2 6 7 3 9 → 2 5 6 7 3 9 → 2 5 6 3 7 9
  Pass 3: 2 5 6 3 7 9 → 2 5 3 6 7 9
  Pass 4: 2 5 3 6 7 9 → 2 3 5 6 7 9 (sorted)

  27. Complexity of Bubble Sort
  Two nested loops, each proportional to n, so T(n) = O(n²). Bubble sort can be sped up by remembering the last index swapped, but this does not affect the worst-case complexity.

  Algorithm bubbleSort(S, C)
     Input: sequence S, comparator C
     Output: sequence S sorted according to C
     for (i = 0; i < S.size(); i++)
        for (j = 1; j < S.size() - i; j++)
           if (S.atIndex(j - 1) > S.atIndex(j))
              S.swap(j - 1, j)
     return S
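The pseudocode above can be sketched as runnable Python; the `last_swap` bookkeeping is the "last index swapped" speed-up mentioned on the slide (names are illustrative, not from the textbook):

```python
def bubble_sort(s):
    """In-place bubble sort. Tracks the position of the last swap so each
    pass scans only the prefix that may still be out of order."""
    n = len(s)
    while n > 1:
        last_swap = 0
        for j in range(1, n):
            if s[j - 1] > s[j]:
                s[j - 1], s[j] = s[j], s[j - 1]
                last_swap = j          # elements at index >= j settled this pass
        n = last_swap                  # everything from last_swap on is sorted
    return s
```

On an already-sorted input the loop exits after one pass, but the worst case is still O(n²), as the slide notes.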

  28. Merge Sort
  (merge-sort tree for 7 2 9 4)
  7 2 9 4 → 2 4 7 9
  ├ 7 2 → 2 7
  │  ├ 7 → 7
  │  └ 2 → 2
  └ 9 4 → 4 9
     ├ 9 → 9
     └ 4 → 4

  29. Merge Sort
  Merge sort is based on the divide-and-conquer paradigm. It consists of three steps:
  • Divide: partition the input sequence S into two sequences S1 and S2 of about n/2 elements each
  • Recur: recursively sort S1 and S2
  • Conquer: merge S1 and S2 into a unique sorted sequence

  Algorithm mergeSort(S, C)
     Input: sequence S, comparator C
     Output: sequence S sorted according to C
     if S.size() > 1 {
        (S1, S2) := partition(S, S.size()/2)
        S1 := mergeSort(S1, C)
        S2 := mergeSort(S2, C)
        S := merge(S1, S2)
     }
     return S
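The divide/recur/conquer steps can be sketched in Python as a self-contained function (the merge step is inlined here; a sketch, not the textbook's code):

```python
def merge_sort(s):
    """Return a sorted copy of s using divide-and-conquer merge sort."""
    if len(s) <= 1:               # base case: size 0 or 1 is already sorted
        return list(s)
    mid = len(s) // 2
    s1 = merge_sort(s[:mid])      # recur on the two halves
    s2 = merge_sort(s[mid:])
    # conquer: merge the two sorted halves
    out, i, j = [], 0, 0
    while i < len(s1) and j < len(s2):
        if s1[i] < s2[j]:
            out.append(s1[i]); i += 1
        else:
            out.append(s2[j]); j += 1
    return out + s1[i:] + s2[j:]  # at most one tail is non-empty
```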

  30. D&C algorithm analysis with recurrence equations
  • Divide-and-conquer is a general algorithm design paradigm:
     • Divide: divide the input data S into k (disjoint) subsets S1, S2, …, Sk
     • Recur: solve the sub-problems associated with S1, S2, …, Sk
     • Conquer: combine the solutions for S1, S2, …, Sk into a solution for S
  • The base cases for the recursion are sub-problems of constant size
  • Analysis can be done using recurrence equations (relations)
  • When all sub-problems have the same size (frequently the case), the recurrence equation representing the algorithm is:
     T(n) = D(n) + k T(n/c) + C(n)
    where
     • D(n) is the cost of dividing S into the k sub-problems S1, S2, …, Sk
     • there are k sub-problems, each of size n/c, that are solved recursively
     • C(n) is the cost of combining the sub-problem solutions into the solution for S

  31. Now, back to merge sort…
  The running time of Merge Sort can be expressed by the recurrence equation
     T(n) = 2T(n/2) + M(n)
  We need to determine M(n), the time to merge two sorted sequences, each of size n/2.

  32. Merging Two Sorted Sequences
  • The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B
  • Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list, takes O(n) time, so M(n) = O(n)

  Algorithm merge(A, B)
     Input: sequences A and B with n/2 elements each
     Output: sorted sequence of A ∪ B
     S ← empty sequence
     while ¬A.isEmpty() ∧ ¬B.isEmpty()
        if A.first() < B.first()
           S.insertLast(A.removeFirst())
        else
           S.insertLast(B.removeFirst())
     while ¬A.isEmpty()
        S.insertLast(A.removeFirst())
     while ¬B.isEmpty()
        S.insertLast(B.removeFirst())
     return S
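A Python sketch of the merge step; `collections.deque` stands in for the doubly linked list, so the removeFirst/insertLast operations of the pseudocode are O(1) here too (names adapted for Python):

```python
from collections import deque

def merge(a, b):
    """Merge two sorted deques into one sorted list in O(len(a) + len(b))."""
    s = []
    while a and b:                       # both non-empty: take the smaller front
        s.append(a.popleft() if a[0] < b[0] else b.popleft())
    s.extend(a)                          # at most one of these has leftovers
    s.extend(b)
    return s
```

Each element is moved exactly once, which is where M(n) = O(n) comes from.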

  33. And the complexity of merge sort…
  So, the running time of Merge Sort can be expressed by the recurrence equation:
     T(n) = 2T(n/2) + M(n) = 2T(n/2) + O(n) = O(n log n)
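Unrolling the recurrence makes the O(n log n) bound explicit (taking M(n) ≤ cn and n a power of 2):

```latex
\begin{align*}
T(n) &= 2\,T(n/2) + cn \\
     &= 4\,T(n/4) + 2cn \\
     &\;\;\vdots \\
     &= 2^{k}\,T\!\left(n/2^{k}\right) + k\,cn && \text{after } k \text{ levels} \\
     &= n\,T(1) + cn\log_2 n = O(n \log n) && \text{at } k = \log_2 n .
\end{align*}
```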

  34. Merge Sort Execution Tree (recursive calls)
  • An execution of merge-sort is depicted by a binary tree
     • each node represents a recursive call of merge-sort and stores
        • the unsorted sequence before the execution, and its partition
        • the sorted sequence at the end of the execution
     • the root is the initial call
     • the leaves are calls on subsequences of size 0 or 1
  (tree for 7 2 9 4 → 2 4 7 9, with children 7 2 → 2 7 and 9 4 → 4 9, and leaves 7, 2, 9, 4)

  35.–44. Execution Example (the merge-sort tree for 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9, built up step by step)
  • Partition: 7 2 9 4 3 8 6 1 is split into 7 2 9 4 and 3 8 6 1
  • Recursive call, partition: 7 2 9 4 is split into 7 2 and 9 4
  • Recursive call, partition: 7 2 is split into 7 and 2
  • Recursive call, base case: 7 → 7
  • Recursive call, base case: 2 → 2
  • Merge: 7 and 2 merge to 2 7
  • Recursive call, …, base case, merge: 9 4 → 4 9
  • Merge: 2 7 and 4 9 merge to 2 4 7 9
  • Recursive call, …, merge, merge: 3 8 6 1 → 1 3 6 8
  • Merge: 2 4 7 9 and 1 3 6 8 merge to 1 2 3 4 6 7 8 9

  45. Analysis of Merge-Sort
  • The height h of the merge-sort tree is O(log n)
     • at each recursive call we divide the sequence in half
  • The work done at each level is O(n)
     • at level i, we partition and merge 2^i sequences of size n/2^i
  • Thus, the total running time of merge-sort is O(n log n)

  46. Summary of Sorting Algorithms (so far)

  47. Quick-Sort
  (quick-sort tree for 7 4 9 6 2)
  7 4 9 6 2 → 2 4 6 7 9
  ├ 4 2 → 2 4
  │  └ 2 → 2
  └ 7 9 → 7 9
     └ 9 → 9

  48. Quick-Sort
  Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
  • Divide: pick a random element x (called the pivot) and partition S into
     • L, the elements less than x
     • E, the elements equal to x
     • G, the elements greater than x
  • Recur: sort L and G
  • Conquer: join L, E, and G

  49. Analysis of Quick Sort using Recurrence Relations
  Assumption: a random pivot is expected to give equal-sized sublists.
  The running time of Quick Sort can then be expressed as
     T(n) = 2T(n/2) + P(n)
  where T(n) is the time to run quickSort() on an input of size n, and P(n) is the time to run partition() on an input of size n.

  Algorithm QuickSort(S, l, r)
     Input: sequence S, ranks l and r
     Output: sequence S with the elements of rank between l and r rearranged in increasing order
     if l ≥ r
        return
     i ← a random integer between l and r
     x ← S.elemAtRank(i)
     (h, k) ← Partition(x)
     QuickSort(S, l, h - 1)
     QuickSort(S, k + 1, r)

  50. Partition
  • We partition an input sequence as follows:
     • we remove, in turn, each element y from S, and
     • we insert y into L, E, or G, depending on the result of the comparison with the pivot x
  • Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time
  • Thus, the partition step of quick-sort takes O(n) time

  Algorithm partition(S, p)
     Input: sequence S, position p of the pivot
     Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively
     L, E, G ← empty sequences
     x ← S.remove(p)
     while ¬S.isEmpty()
        y ← S.remove(S.first())
        if y < x
           L.insertLast(y)
        else if y = x
           E.insertLast(y)
        else { y > x }
           G.insertLast(y)
     return L, E, G
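The L/E/G partition scheme can be sketched in Python; the random pivot choice matches the randomized analysis above (a sketch under those assumptions, not the textbook's code):

```python
import random

def quick_sort(s):
    """Randomized quick-sort using a three-way (L, E, G) partition."""
    if len(s) <= 1:
        return list(s)
    x = random.choice(s)                 # divide: pick a random pivot
    l = [y for y in s if y < x]          # L: less than the pivot
    e = [y for y in s if y == x]         # E: equal to the pivot
    g = [y for y in s if y > x]          # G: greater than the pivot
    return quick_sort(l) + e + quick_sort(g)   # recur on L and G, then join
```

Keeping E separate means duplicates of the pivot are never recursed on, which helps on inputs with many equal keys.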
