
CSCE-221 Sorting



Presentation Transcript


  1. CSCE-221 Sorting Prof. Lupoli, Zhipei Yan 10/3/2019 From Dr. Scott Schaefer and notes of Prof. Lupoli http://faculty.cs.tamu.edu/schaefer/teaching/221_Fall2018/Lectures/Sorting.ppt

  2. Outline • Bubble Sort • Insertion Sort • Merge Sort • Quick Sort • Heap Sort, Radix Sort, Bucket Sort • All sort examples assume ascending order, e.g. 2 3 5 6 9

  3. Bubble Sort

  4. Video & Short In-Lab Exercise • https://youtu.be/3CeuJQyFI7A • Please ask any questions during the video!

  5. Complexity of Bubble Sort • Two nested loops, each proportional to n, so T(n) = O(n²) • It is possible to speed up bubble sort by remembering the last index swapped, but this does not change the worst-case complexity

Algorithm bubbleSort(S, C)
  Input: sequence S, comparator C
  Output: sequence S sorted according to C
  for ( i = 0; i < S.size(); i++ )
    for ( j = 1; j < S.size() - i; j++ )
      if ( S.atIndex( j - 1 ) > S.atIndex( j ) )
        S.swap( j - 1, j );
  return S
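As a concrete illustration of the pseudocode above, here is a minimal C++ sketch over a `std::vector<int>` (the function name and the `swapped` flag are ours; the flag is a simpler stand-in for the last-index-swapped trick mentioned on the slide):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Bubble sort: repeatedly swap adjacent out-of-order pairs.
// A pass with no swaps means the sequence is already sorted,
// so we can stop early; the worst case is still O(n^2).
void bubbleSort(std::vector<int>& s) {
    for (std::size_t i = 0; i < s.size(); ++i) {
        bool swapped = false;
        for (std::size_t j = 1; j < s.size() - i; ++j) {
            if (s[j - 1] > s[j]) {
                std::swap(s[j - 1], s[j]);
                swapped = true;
            }
        }
        if (!swapped) break;  // no swaps in this pass: done
    }
}
```

With the early exit, an already-sorted input finishes after a single O(n) pass.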

  6. Insertion Sort • Algorithm • builds the sorted result from left to right • the left-most end of the array is kept sorted • looks at the next element (the key) and places it in its correct spot within the sorted part • elements must be shifted right to make space for the key • remember it as a "deck of cards": each card you draw is immediately put in order • again, n − 1 passes to complete

  7. Video & Short In-Lab Exercise • https://youtu.be/y0_LvwFAuT8

  8. Insertion Sort Source : http://www.cs.sfu.ca/CourseCentral/225/jmanuch/lec/6-2.ppt

  9. [Insertion sort example figure: 1 2 12 16 13 15 35]

  10. Complexity of Insertion Sort • Two nested, dependent loops • The outer loop runs n times • The inner loop moves elements to the right while they are > key • Time complexity: T(n) = O(n²)

Algorithm insertion(S, C)
  Input: sequence S, comparator C
  Output: sequence S sorted according to C
  for ( i = 1; i < S.size(); i++ ) {
    key = S[i];
    for ( j = i - 1; j >= 0 && S[j] > key; j-- ) {
      S[j + 1] = S[j];
    }
    S[j + 1] = key;
  }
  return S;
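The same algorithm as a C++ sketch (vector-based; `insertionSort` is our name, and we use an unsigned shift index `j` instead of the signed countdown in the pseudocode):

```cpp
#include <cstddef>
#include <vector>

// Insertion sort: grow a sorted prefix, shifting larger elements
// right to make room for each key. O(n^2) worst case, O(n) when
// the input is already sorted.
void insertionSort(std::vector<int>& s) {
    for (std::size_t i = 1; i < s.size(); ++i) {
        int key = s[i];
        std::size_t j = i;
        while (j > 0 && s[j - 1] > key) {  // shift elements > key right
            s[j] = s[j - 1];
            --j;
        }
        s[j] = key;  // drop the key into the gap
    }
}
```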

  11. Merge Sort [execution-tree figure: 7 2 9 4 → 2 4 7 9; 7 2 → 2 7; 9 4 → 4 9; 7 → 7; 2 → 2; 9 → 9; 4 → 4]

  12. Merge Sort • Merge sort is based on the divide-and-conquer paradigm. It consists of three steps: • Divide: partition the input sequence S into two sequences S1 and S2 of about n/2 elements each • Recur: recursively sort S1 and S2 • Conquer: merge S1 and S2 into a unique sorted sequence

Algorithm mergeSort(S, C)
  Input: sequence S, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1 {
    (S1, S2) := partition(S, S.size()/2)
    S1 := mergeSort(S1, C)
    S2 := mergeSort(S2, C)
    S := merge(S1, S2)
  }
  return S

  13. D&C algorithm analysis with recurrence equations • Divide-and-conquer is a general algorithm design paradigm: • Divide: divide the input data S into k (disjoint) subsets S1, S2, …, Sk • Recur: solve the sub-problems associated with S1, S2, …, Sk • Conquer: combine the solutions for S1, S2, …, Sk into a solution for S • The base cases for the recursion are sub-problems of constant size • Analysis can be done using recurrence equations (relations) • When the size of all sub-problems is the same (frequently the case), the recurrence equation representing the algorithm is: T(n) = D(n) + k T(n/c) + C(n), where • D(n) is the cost of dividing S into the k sub-problems S1, S2, …, Sk • there are k sub-problems, each of size n/c, that will be solved recursively • C(n) is the cost of combining the sub-problem solutions to get the solution for S

  14. Now, back to merge sort… • The running time of merge sort can be expressed by the recurrence equation: T(n) = 2T(n/2) + M(n) • We need to determine M(n), the time to merge two sorted sequences, each of size n/2.

Algorithm mergeSort(S, C)
  Input: sequence S, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1 {
    (S1, S2) := partition(S, S.size()/2)
    S1 := mergeSort(S1, C)
    S2 := mergeSort(S2, C)
    S := merge(S1, S2)
  }
  return S

  15. Merging Two Sorted Sequences • The conquer step of merge sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B • Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list, takes O(n) time • M(n) = O(n)

Algorithm merge(A, B)
  Input: sequences A and B with n/2 elements each
  Output: sorted sequence of A ∪ B
  S ← empty sequence
  while ¬A.isEmpty() ∧ ¬B.isEmpty()
    if A.first() < B.first()
      S.insertLast(A.removeFirst())
    else
      S.insertLast(B.removeFirst())
  while ¬A.isEmpty()
    S.insertLast(A.removeFirst())
  while ¬B.isEmpty()
    S.insertLast(B.removeFirst())
  return S
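The merge step above can be sketched in C++ with vectors instead of linked lists (the name `mergeSorted` is ours); each element is moved exactly once, so the merge is O(n):

```cpp
#include <cstddef>
#include <vector>

// Merge two sorted vectors: repeatedly take the smaller front
// element, then drain whichever input still has leftovers.
std::vector<int> mergeSorted(const std::vector<int>& a,
                             const std::vector<int>& b) {
    std::vector<int> s;
    s.reserve(a.size() + b.size());
    std::size_t i = 0, j = 0;
    while (i < a.size() && j < b.size())
        s.push_back(a[i] < b[j] ? a[i++] : b[j++]);
    while (i < a.size()) s.push_back(a[i++]);  // drain the rest of a
    while (j < b.size()) s.push_back(b[j++]);  // drain the rest of b
    return s;
}
```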

  16. And the complexity of merge sort… • So, the running time of merge sort can be expressed by the recurrence equation: T(n) = 2T(n/2) + M(n) = 2T(n/2) + O(n) = O(n log n)

Algorithm mergeSort(S, C)
  Input: sequence S, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1 {
    (S1, S2) := partition(S, S.size()/2)
    S1 := mergeSort(S1, C)
    S2 := mergeSort(S2, C)
    S := merge(S1, S2)
  }
  return S
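To see where the O(n log n) comes from, assume n is a power of two and M(n) ≤ cn for some constant c; the recurrence unwinds by repeated substitution:

```latex
\begin{align*}
T(n) &= 2\,T(n/2) + cn \\
     &= 4\,T(n/4) + 2cn \\
     &\;\;\vdots \\
     &= 2^{k}\,T(n/2^{k}) + k\,cn \\
     &= n\,T(1) + cn\log_2 n = O(n \log n)
       \qquad (\text{taking } k = \log_2 n)
\end{align*}
```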

  17. Merge Sort Execution Tree (recursive calls) • An execution of merge sort is depicted by a binary tree • each node represents a recursive call of merge sort and stores • the unsorted sequence before the execution and its partition • the sorted sequence at the end of the execution • the root is the initial call • the leaves are calls on subsequences of size 0 or 1 [tree figure: 7 2 9 4 → 2 4 7 9; 7 2 → 2 7; 9 4 → 4 9; leaves 7, 2, 9, 4]

  18. Execution Example • Partition [merge-sort tree figure: input 7 2 9 4 3 8 6 1, final output 1 2 3 4 6 7 8 9; slides 18–27 step through this tree as it fills in]

  19. Execution Example (cont.) • Recursive call, partition

  20. Execution Example (cont.) • Recursive call, partition

  21. Execution Example (cont.) • Recursive call, base case

  22. Execution Example (cont.) • Recursive call, base case

  23. Execution Example (cont.) • Merge

  24. Execution Example (cont.) • Recursive call, …, base case, merge

  25. Execution Example (cont.) • Merge

  26. Execution Example (cont.) • Recursive call, …, merge, merge

  27. Execution Example (cont.) • Merge [final tree: 7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9]

  28. Analysis of Merge Sort • The height h of the merge-sort tree is O(log n) • at each recursive call we divide the sequence in half • The work done at each level is O(n) • at level i, we partition and merge 2^i sequences of size n/2^i • Thus, the total running time of merge sort is O(n log n)
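Putting divide, recur, and conquer together, a compact C++ sketch of the whole algorithm (here `std::merge` from the standard library performs the O(n) conquer step; copying the halves keeps the sketch simple but is not the most memory-efficient layout):

```cpp
#include <algorithm>
#include <cstddef>
#include <iterator>
#include <vector>

// Merge sort: divide S in half, sort each half recursively,
// then merge the two sorted halves. O(n log n) total.
std::vector<int> mergeSort(const std::vector<int>& s) {
    if (s.size() <= 1) return s;  // base case: size 0 or 1
    std::size_t mid = s.size() / 2;
    std::vector<int> left(s.begin(), s.begin() + mid);
    std::vector<int> right(s.begin() + mid, s.end());
    std::vector<int> s1 = mergeSort(left);
    std::vector<int> s2 = mergeSort(right);
    std::vector<int> out;
    std::merge(s1.begin(), s1.end(), s2.begin(), s2.end(),
               std::back_inserter(out));  // O(n) merge of the halves
    return out;
}
```

Running it on the slides' example input 7 2 9 4 3 8 6 1 reproduces the execution tree traced above.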

  29. Summary of Sorting Algorithms (so far)

  30. Quick-Sort [execution-tree figure: 7 4 9 6 2 → 2 4 6 7 9; 4 2 → 2 4; 7 9 → 7 9; 2 → 2; 9 → 9]

  31. Quick Sort https://upload.wikimedia.org/wikipedia/commons/6/6a/Sorting_quicksort_anim.gif

  32. Quick-Sort • Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm: • Divide: pick a random element x (called the pivot) and partition S into • L: elements less than x • E: elements equal to x • G: elements greater than x • Recur: sort L and G • Conquer: join L, E and G

  33. Analysis of Quick Sort using Recurrence Relations • Assumption: a random pivot is expected to give equal-sized sublists • The running time of quick sort can then be expressed as: T(n) = 2T(n/2) + P(n) • T(n): time to run quickSort() on an input of size n • P(n): time to run partition() on an input of size n

Algorithm quickSort(S, l, r)
  Input: sequence S, ranks l and r
  Output: sequence S with the elements of rank between l and r rearranged in increasing order
  if l ≥ r
    return
  i ← a random integer between l and r
  x ← S.elemAtRank(i)
  (h, k) ← partition(x)
  quickSort(S, l, h - 1)
  quickSort(S, k + 1, r)

  34. Partition • We partition an input sequence as follows: • we remove, in turn, each element y from S and • we insert y into L, E or G, depending on the result of the comparison with the pivot x • Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time • Thus, the partition step of quick-sort takes O(n) time

Algorithm partition(S, p)
  Input: sequence S, position p of pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, resp.
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.insertLast(y)
    else if y = x
      E.insertLast(y)
    else { y > x }
      G.insertLast(y)
  return L, E, G
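The three-way partition above, sketched in C++ (the name `partition3` and the tuple return are ours; one pass over S, O(1) work per element, O(n) total):

```cpp
#include <tuple>
#include <vector>

// Scan S once, distributing each element into L (< pivot),
// E (== pivot), or G (> pivot). push_back is amortized O(1),
// so the whole partition is O(n).
std::tuple<std::vector<int>, std::vector<int>, std::vector<int>>
partition3(const std::vector<int>& s, int pivot) {
    std::vector<int> L, E, G;
    for (int y : s) {
        if (y < pivot)       L.push_back(y);
        else if (y == pivot) E.push_back(y);
        else                 G.push_back(y);
    }
    return {L, E, G};
}
```

After partitioning, quick-sort recurs on L and G only; E is already in its final position.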

  35. So, the expected complexity of Quick Sort • Assumption: a random pivot is expected to give equal-sized sublists • The running time can then be expressed as: T(n) = 2T(n/2) + P(n) = 2T(n/2) + O(n) = O(n log n)

Algorithm quickSort(S, l, r)
  Input: sequence S, ranks l and r
  Output: sequence S with the elements of rank between l and r rearranged in increasing order
  if l ≥ r
    return
  i ← a random integer between l and r
  x ← S.elemAtRank(i)
  (h, k) ← partition(x)
  quickSort(S, l, h - 1)
  quickSort(S, k + 1, r)

  36. Quick-Sort Tree • An execution of quick-sort is depicted by a binary tree • Each node represents a recursive call of quick-sort and stores • the unsorted sequence before the execution and its pivot • the sorted sequence at the end of the execution • The root is the initial call • The leaves are calls on subsequences of size 0 or 1 [tree figure: 7 4 9 6 2 → 2 4 6 7 9; 4 2 → 2 4; 7 9 → 7 9; 2 → 2; 9 → 9]

  37. Worst-case Running Time • The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element • One of L and G then has size n − 1 and the other has size 0 • The running time is proportional to n + (n − 1) + … + 2 + 1 = O(n²) • Alternatively, using recurrence equations: T(n) = T(n − 1) + O(n) = O(n²)

  38. In-Place Quick-Sort • Quick-sort can be implemented to run in place • In the partition step, we use swap operations to rearrange the elements of the input sequence such that • the elements less than the pivot have rank less than l • the elements greater than or equal to the pivot have rank greater than l • The recursive calls consider • elements with rank less than l • elements with rank greater than l

Algorithm inPlaceQuickSort(S, a, b)
  Input: sequence S, ranks a and b
  if a ≥ b
    return
  p ← S[b]
  l ← a
  r ← b - 1
  while l ≤ r
    while l ≤ r ∧ S[l] ≤ p
      l++
    while r ≥ l ∧ p ≤ S[r]
      r--
    if l < r
      swap(S[l], S[r])
  swap(S[l], S[b])
  inPlaceQuickSort(S, a, l - 1)
  inPlaceQuickSort(S, l + 1, b)
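A direct C++ transcription of the pseudocode above (a and b are inclusive ranks; note this sketch uses the last element as pivot rather than a random one, so already-sorted input hits the O(n²) worst case from slide 37):

```cpp
#include <utility>
#include <vector>

// In-place quick-sort over s[a..b]. Two indices scan toward each
// other, swapping misplaced pairs; the pivot is then swapped into
// its final position at index l.
void inPlaceQuickSort(std::vector<int>& s, int a, int b) {
    if (a >= b) return;          // 0 or 1 element: nothing to do
    int p = s[b];                // pivot: last element of the range
    int l = a, r = b - 1;
    while (l <= r) {
        while (l <= r && s[l] <= p) ++l;  // find an element > pivot
        while (r >= l && s[r] >= p) --r;  // find an element < pivot
        if (l < r) std::swap(s[l], s[r]);
    }
    std::swap(s[l], s[b]);       // pivot into its final place
    inPlaceQuickSort(s, a, l - 1);
    inPlaceQuickSort(s, l + 1, b);
}
```

A random or median-of-three pivot choice would restore the expected O(n log n) behavior.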

  39. In-Place Partitioning • Perform the partition using two indices to split S into L and E∪G (a similar method can split E∪G into E and G) • Repeat until l and r cross: • scan l to the right until finding an element > x • scan r to the left until finding an element < x • swap the elements at indices l and r • Example (pivot x = 6): 3 2 5 1 0 7 3 5 9 2 7 9 8 9 7 9 6

  40. Summary of Sorting Algorithms (so far)

  41. Handout Exercise • Complete the survey in eCampus • Questions?

  42. References • http://faculty.cs.tamu.edu/schaefer/teaching/221_Fall2018/Lectures/Sorting.ppt • https://en.wikipedia.org/wiki/Insertion_sort

  43. Bubble Sort example (animation frames, slides 43–50, first pass): 5 7 2 6 9 3 → 5 2 7 6 9 3 → 5 2 6 7 9 3 → …
