
Sorting and "Big Oh"

Sorting and "Big Oh". ASFA AP Computer Science A. Adapted for ASFA from a presentation by: Barb Ericson Georgia Tech ericson@cc.gatech.edu Aug 2007. Learning Goals. Understand several sorting algorithms Selection Sort, Insertion Sort, Merge Sort, Quicksort



1. Sorting and "Big Oh"
ASFA AP Computer Science A
Adapted for ASFA from a presentation by: Barb Ericson, Georgia Tech, ericson@cc.gatech.edu, Aug 2007

2. Learning Goals
• Understand several sorting algorithms
  • Selection Sort, Insertion Sort, Merge Sort, Quicksort
• Understand how to evaluate the efficiency of an algorithm
  • Best case, worst case, average case, and a general idea of which is faster
  • Also "Big Oh"

3. Sorting
• We often want to put data into some order
  • Ascending or descending
• Sorting is often shown with array data
• But you can sort anything that implements the Comparable interface (see the sketch below)
  • Sort Pictures by size, so that the SlideShow class shows the smallest first
  • Sort Sounds by length, so that a PlayList plays the shortest sound first
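For illustration, here is a minimal sketch of sorting via Comparable, using a hypothetical Song class (not part of the slides' code); the Picture and Sound examples above would work the same way, with compareTo comparing size or length:

import java.util.Arrays;

// Hypothetical Song class, used only to illustrate Comparable
public class Song implements Comparable<Song> {
    private String title;
    private int lengthInSeconds;

    public Song(String title, int lengthInSeconds) {
        this.title = title;
        this.lengthInSeconds = lengthInSeconds;
    }

    // compareTo defines the sort order: shortest song first
    public int compareTo(Song other) {
        return this.lengthInSeconds - other.lengthInSeconds;
    }

    public String toString() {
        return title + " (" + lengthInSeconds + "s)";
    }

    public static void main(String[] args) {
        Song[] playList = {
            new Song("Long Song", 320),
            new Song("Short Song", 95),
            new Song("Medium Song", 210)
        };
        Arrays.sort(playList);   // uses compareTo to order the songs
        System.out.println(Arrays.toString(playList));
    }
}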

4. What is Big-O?
Big-O refers to the order of an algorithm's runtime growth in relation to the number of items.
Focus on dominant terms and ignore less significant terms and constant factors as n grows.
For example, selection sort:
• The number of compares is n * (n - 1) / 2
• n * (n - 1) / 2 = n²/2 - n/2
• The n²/2 term "dominates" the n/2 term, because as n gets larger, n² grows much faster than n does, so focus on the n²/2 term
• The constant factor 1/2 is ignored, thus selection sort has order n-squared: O(n²)
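As a rough check of the n * (n - 1) / 2 claim, here is a small counting sketch (not from the slides) that tallies the compares selection sort's nested loops perform:

// Counts selection sort's comparisons empirically and compares them to n*(n-1)/2
public class CompareCounter {
    public static long countSelectionSortCompares(int n) {
        long compares = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = i + 1; j < n; j++) {
                compares++;   // one compare per inner-loop pass
            }
        }
        return compares;
    }

    public static void main(String[] args) {
        for (int n = 10; n <= 10000; n *= 10) {
            System.out.println(n + " elements: " + countSelectionSortCompares(n)
                + " compares, n*(n-1)/2 = " + ((long) n * (n - 1) / 2));
        }
    }
}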

5. What is Big-O?
For example, merge sort:
• Each call to sort does 2 recursive sorts on arrays of half the size, then one merge back into the original array
• Merging two sorted halves of total length n takes at most n - 1 compares, which is O(n)
• One more level of sorts is needed when the array doubles, and two more levels when it quadruples, so the number of levels is m, where n = 2^m, i.e. m = log2(n), which is O(log(n))
• Combined, each of the log2(n) levels does O(n) work in merges, so merge sort is O(n log(n))
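To see why the number of levels is log2(n), here is a small sketch (not from the slides) that counts how many times an array of size n can be halved before reaching length 1:

// Shows that doubling n adds exactly one level of halving, i.e. log2(n) levels
public class MergeLevels {
    public static int levels(int n) {
        int count = 0;
        while (n > 1) {   // each level of merge sort halves the array
            n = n / 2;
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 1024; n *= 2) {
            System.out.println(n + " elements -> " + levels(n) + " levels of sorting/merging");
        }
    }
}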

6. What is Big-O?
Big-O refers to the order of an algorithm's runtime growth in relation to the number of items.
I. O(1) - constant time (push and pop elements on a stack)
II. O(n) - linear time: the algorithm requires a number of steps proportional to the size of the task (finding the minimum of a list)
III. O(n²) - quadratic time: the number of operations is proportional to the size of the task squared (selection and insertion sort)
IV. O(log n) - logarithmic time (binary search on a sorted list)
V. O(n log n) - "n log n" time (merge sort and quicksort)
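Since binary search is listed above as the standard O(log n) example but is not shown elsewhere in the deck, here is a minimal sketch of it (a hypothetical helper class, not part of the slides' ArraySorter code); each compare halves the remaining search range:

import java.util.Arrays;

public class BinarySearchDemo {
    public static int binarySearch(int[] sorted, int target) {
        int low = 0;
        int high = sorted.length - 1;
        while (low <= high) {
            int mid = (low + high) / 2;
            if (sorted[mid] == target) {
                return mid;        // found it
            } else if (sorted[mid] < target) {
                low = mid + 1;     // search the upper half
            } else {
                high = mid - 1;    // search the lower half
            }
        }
        return -1;                 // not found
    }

    public static void main(String[] args) {
        int[] data = {2, 5, 8, 12, 16, 23, 38, 56, 72, 91};
        System.out.println(binarySearch(data, 23));  // prints 5
        System.out.println(binarySearch(data, 7));   // prints -1
    }
}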

7. Selection Sort Algorithm
• Search the entire array for the smallest element and then swap the smallest with the first element (at index 0)
• Continue through the rest of the array doing the same thing (the second pass starts at index 1)
• This uses a nested loop
  • The outer loop runs from i = 0 to i < a.length - 1
  • The inner loop runs from j = i + 1 to j < a.length

8. Selection Sort Code

public void selectionSort() {
    int maxCompare = a.length - 1;
    int smallestIndex = 0;

    // loop from 0 to one before last item
    for (int i = 0; i < maxCompare; i++) {
        // set smallest index to the one at i
        smallestIndex = i;
        // loop from i+1 to end of the array
        for (int j = i + 1; j < a.length; j++) {
            if (a[j] < a[smallestIndex]) {
                smallestIndex = j;
            }
        }
        // swap the one at i with the one at smallest index
        swap(i, smallestIndex);
        this.printArray("after loop body when i = " + i);
    }
}

private void swap(int i, int j) {
    if (i != j) {
        int temp = a[i];
        a[i] = a[j];
        a[j] = temp;
    }
}

private void printArray(String message) {
    System.out.println(message);
    for (int i : a)
        System.out.print(i + " ");
    System.out.println();
}
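A short usage sketch, assuming the ArraySorter class from the later merge sort slide (its constructor takes the int[] to sort and stores it in the field a used above):

public class SelectionSortDriver {
    public static void main(String[] args) {
        int[] data = {7, 3, 9, 1, 5};
        ArraySorter sorter = new ArraySorter(data);  // assumed constructor, as on the merge sort slide
        sorter.selectionSort();   // prints the array after each outer-loop pass
    }
}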

9. How Efficient is Selection Sort?
• We make n-1 comparisons the first time
• Then n-2 comparisons
• Then n-3 comparisons
• And so on, down to 3, 2, 1
• This is a total of (n-1) + (n-2) + (n-3) + … + 3 + 2 + 1
• The equation for the number of comparisons on an array of n elements is (n * (n-1)) / 2
  • For example, with n = 5: 4 + 3 + 2 + 1 = 10 = (5 * 4) / 2

10. Selection Sort Efficiency
• It doesn't matter whether the array was sorted or not when we start
  • Best case == worst case == average case
  • This algorithm will always take this long!
• Big O
  • (n * (n-1)) / 2 is (n² - n) / 2
  • Keep only the term that grows the fastest as n gets really big, so this is O(n²)

11. Insertion Sort Algorithm
• Loop through the array and insert the next element into the correct position in the sorted portion
  • Moving larger values to the right to make room
• Start with the second item in the array (at index 1)
• Use a temporary variable to hold the value at the current index

12. Insertion Sort Code

public void insertionSort() {
    int temp = 0;
    int pos = 0;

    // loop from the second element on
    for (int i = 1; i < a.length; i++) {
        // save the current value at i and set the position to i
        temp = a[i];
        pos = i;
        // shift right any larger elements
        while (0 < pos && temp < a[pos - 1]) {
            a[pos] = a[pos - 1];
            pos--;
        }
        a[pos] = temp;
        this.printArray("after loop body when i = " + i);
    }
}

13. Insertion Sort Efficiency
• Best case
  • The array is in sorted order when you start
  • Only n-1 comparisons, with no shifting
• Worst case
  • The array is sorted in decreasing order
  • Needs (n * (n-1)) / 2 comparisons and lots of shifting
• Average case
  • On average needs (n * (n-1)) / 4 comparisons
• Big O
  • (n * (n-1)) / 4 is (n² - n) / 4
  • Keep only the term that grows the fastest as n gets really big, so this is O(n²)
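To make the best-case and worst-case counts concrete, here is a counting sketch (not from the slides) that runs insertion sort's loop on an already-sorted array and on a reverse-sorted array and tallies the compares:

public class InsertionCompareCount {
    // Same loop structure as the slides' insertionSort, but counting compares
    public static long countCompares(int[] a) {
        long compares = 0;
        for (int i = 1; i < a.length; i++) {
            int temp = a[i];
            int pos = i;
            while (pos > 0) {
                compares++;                 // one compare of temp against a[pos - 1]
                if (temp < a[pos - 1]) {
                    a[pos] = a[pos - 1];    // shift the larger element right
                    pos--;
                } else {
                    break;
                }
            }
            a[pos] = temp;
        }
        return compares;
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] sorted = new int[n];
        int[] reversed = new int[n];
        for (int i = 0; i < n; i++) {
            sorted[i] = i;
            reversed[i] = n - i;
        }
        System.out.println("sorted (best case):    " + countCompares(sorted));    // n - 1
        System.out.println("reversed (worst case): " + countCompares(reversed));  // n * (n - 1) / 2
    }
}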

14. Merge Sort Algorithm
• If the current array length is 1, return
  • Base case of the recursion
• Else
  • Break the array into two arrays
  • Copy the elements from the original into the new arrays
  • Create new ArraySorter objects with the new arrays
  • Do a recursive call to mergeSort on the new ArraySorter objects
  • Merge the sorted arrays back into the original array

15. Merge Sort Code

public void mergeSort() {
    // check if there is only 1 element
    if (a.length == 1)
        return;

    // otherwise create two new arrays
    int[] left = new int[a.length / 2];
    for (int i = 0; i < left.length; i++)
        left[i] = a[i];
    int[] right = new int[a.length - left.length];
    for (int i = left.length, j = 0; i < a.length; i++, j++)
        right[j] = a[i];

    // create new ArraySorter objects
    ArraySorter sorter1 = new ArraySorter(left);
    sorter1.printArray("sorter1");
    ArraySorter sorter2 = new ArraySorter(right);
    sorter2.printArray("sorter2");

    // do the recursive calls
    sorter1.mergeSort();
    sorter2.mergeSort();

    // merge the resulting arrays
    merge(left, right);
    this.printArray("After merge");
}

16. Merge Code

/**
 * Method to merge the sorted left and right arrays back into the current array
 * @param left the sorted left array
 * @param right the sorted right array
 */
private void merge(int[] left, int[] right) {
    int leftIndex = 0;   // current left index
    int rightIndex = 0;  // current right index
    int i = 0;           // current index in a

    // merge the left and right arrays into a
    while (leftIndex < left.length && rightIndex < right.length) {
        if (left[leftIndex] < right[rightIndex]) {
            a[i] = left[leftIndex];
            leftIndex++;
        } else {
            a[i] = right[rightIndex];
            rightIndex++;
        }
        i++;
    }

    // copy any remaining elements in left
    for (int j = leftIndex; j < left.length; j++) {
        a[i] = left[j];
        i++;
    }

    // copy any remaining elements in right
    for (int j = rightIndex; j < right.length; j++) {
        a[i] = right[j];
        i++;
    }
}

17. Merge Sort Efficiency
• Merge sort is usually more efficient than insertion sort and always more efficient than selection sort
• Best case == worst case == average case
• Merge n elements m times, where n = 2^m
• Big O
  • About n * m operations, and m is log2(n), so it is n * log(n)
  • O(n log(n))
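A rough timing sketch along the lines of the chart on the last slide, assuming the ArraySorter class from the earlier slides (and assuming its printArray calls are removed or commented out, since printing would otherwise dominate the timing):

import java.util.Random;

public class SortTiming {
    public static void main(String[] args) {
        int n = 20000;
        Random rand = new Random(42);
        int[] data1 = new int[n];
        int[] data2 = new int[n];
        for (int i = 0; i < n; i++) {
            data1[i] = rand.nextInt(1000000);
            data2[i] = data1[i];   // identical copy for a fair comparison
        }

        long start = System.currentTimeMillis();
        new ArraySorter(data1).mergeSort();         // assumed ArraySorter constructor
        long mergeTime = System.currentTimeMillis() - start;

        start = System.currentTimeMillis();
        new ArraySorter(data2).selectionSort();
        long selectionTime = System.currentTimeMillis() - start;

        System.out.println("merge sort:     " + mergeTime + " ms");
        System.out.println("selection sort: " + selectionTime + " ms");
    }
}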

18. Summary
• See animations of the algorithms at http://vision.bc.edu/~dmartin/teaching/sorting/anim-html/all.html
• Students need to understand how to sort data
  • Selection Sort, Insertion Sort, Merge Sort, Quicksort
• All students should have some idea of the efficiency of each algorithm
  • Should understand best, average, and worst case
  • Need to know "Big Oh"

19. Merge Sort Timing vs. Selection Sort
Figure 2: Merge Sort timing (blue) versus Selection Sort timing (red)
