
Sorting


Presentation Transcript


  1. Sorting Algorithms and Analysis Robert Duncan

  2. Refresher on Big-O Hierarchy of Big-O functions from slowest to fastest • O(2^N) Exponential • O(N^2) Quadratic • O(N log N) Linear/Log • O(N) Linear • O(log N) Log • O(1) Constant

  3. Generic running times

  4. O(N log N) vs. O(N^2)

  5. Sorting Algorithms of O(N^2) Bubble Sort Selection Sort Insertion Sort Sorting Algorithms of O(N log N) Heap Sort Merge Sort Quick Sort Two Common Categories

  6. For small values of N • It is important to note that all of these algorithms appear to run equally fast for small values of N. • For values of N from the thousands to the millions, the differences between O(N^2) and O(N log N) become dramatically apparent.

  7. O(N^2) Sorts • Easy to program • Simple to understand • Very slow, especially for large values of N • Almost never used in professional software

  8. Bubble Sort • The most inefficient of the O(N^2) algorithms • The simplest sorting algorithm available • Works by comparing adjacent items and swapping them if the first is larger than the second. It makes as many passes through the array as are needed to complete the sort.

  9. Bubble Sort – Pass 1: [5 1 4 6 3 2] → [5 1 6 4 3 2] → [5 6 1 4 3 2] → [6 5 1 4 3 2]

  10. Bubble Sort – Pass 2: [6 5 1 4 3 2] → [6 5 4 1 3 2]

  11. Bubble Sort – Pass 3: [6 5 4 1 3 2] → [6 5 4 3 1 2]

  12. Bubble Sort – Pass 4: [6 5 4 3 1 2] → [6 5 4 3 2 1]
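The pass structure shown above can be sketched in Python. Note that the slides animate bubbling the largest value to the *front* of the array; this sketch uses the more common convention of bubbling it to the end, producing ascending order:

```python
def bubble_sort(items):
    """Bubble sort: repeatedly compare adjacent items and swap when out
    of order. Each pass bubbles the largest remaining item to the end."""
    a = list(items)                  # work on a copy
    n = len(a)
    for end in range(n - 1, 0, -1):
        swapped = False
        for i in range(end):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:              # no swaps means already sorted
            break
    return a
```

The early-exit check makes the best case (an already-sorted array) run in O(N) rather than O(N^2).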

  13. Selection Sort • More efficient than Bubble Sort, but not as efficient as Insertion Sort • Works by finding the largest element in the unsorted portion of the list and swapping it with the last unsorted element, reducing the unsorted portion by 1.

  14. Selection Sort – Passes 1–3: [6 8 10 2 4] → [6 8 4 2 10] → [6 2 4 8 10]

  15. Selection Sort – Passes 4–5: [4 2 6 8 10] → [2 4 6 8 10] → [2 4 6 8 10]
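The swap-largest-to-the-end scheme described above can be sketched as a minimal Python version (not from the original slides):

```python
def selection_sort(items):
    """Selection sort: find the largest element in the unsorted prefix
    and swap it into the last unsorted position."""
    a = list(items)
    for end in range(len(a) - 1, 0, -1):
        largest = 0                          # index of largest so far
        for i in range(1, end + 1):
            if a[i] > a[largest]:
                largest = i
        a[largest], a[end] = a[end], a[largest]   # move it to the end
    return a
```

Unlike bubble sort, selection sort does exactly one swap per pass, which is why it tends to be faster in practice despite the same O(N^2) comparison count.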

  16. Insertion Sort • One of the most efficient of the O(N^2) algorithms • Roughly twice as fast as Bubble Sort • Works by taking items from the unsorted portion of the list and inserting them into their proper place in the sorted portion.

  17.–18. Insertion Sort (animation)
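The insertion step described above can be sketched in Python:

```python
def insertion_sort(items):
    """Insertion sort: grow a sorted prefix one item at a time by
    inserting each new item into its proper place."""
    a = list(items)
    for i in range(1, len(a)):
        key = a[i]                   # next item from the unsorted portion
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]          # shift larger items one slot right
            j -= 1
        a[j + 1] = key               # insert into the gap
    return a
```

Because the inner loop stops as soon as it finds a smaller element, insertion sort runs in O(N) on already-sorted input, which is one reason it outperforms the other O(N^2) sorts on nearly-sorted data.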

  19. O(N log N) Sorts • Fast • Efficient • Complicated, not easy to understand • Most make extensive use of recursion and complex data structures

  20. Heap Sort • The slowest of the O(N log N) algorithms, but it has lower memory demands than Merge Sort and Quick Sort.

  21. Heap Sort • Works by transferring items to a heap: a binary tree in which every parent node has a greater value than its child nodes (a max-heap). The root of the tree, which is the largest item, is transferred to a new array, and then the heap is reformed. The process is repeated until the sort is complete.

  22. Forming the heap from an unsorted array: 32 26 19 11 7 14 2

  23. Populating the new array (root 32 removed): 26 19 11 7 14 2

  24. Reforming the heap: 19 26 14 11 7 2

  25. Reforming the heap: 26 19 14 11 7 2

  26. Repeat the process (root 26 removed): 19 14 11 7 2

  27. Repeat the process: 14 19 2 11 7

  28. Repeat the process: 19 14 2 11 7

  29. Repeat the process (root 19 removed): 14 2 11 7
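The remove-the-root-and-reform cycle shown above can be sketched in Python. Note this version reforms the heap in place by sifting the root down, and reuses the tail of the same array instead of transferring items to a new array as the slides describe:

```python
def heap_sort(items):
    """Heap sort: build a max-heap, then repeatedly move the root
    (the largest item) to the end and reform the heap."""
    a = list(items)
    n = len(a)

    def sift_down(root, size):
        # Push a[root] down until the max-heap property holds.
        while True:
            child = 2 * root + 1
            if child >= size:
                return
            if child + 1 < size and a[child + 1] > a[child]:
                child += 1                       # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    for i in range(n // 2 - 1, -1, -1):          # form the heap
        sift_down(i, n)
    for end in range(n - 1, 0, -1):              # repeatedly remove the root
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```

Because the heap lives inside the input array, this variant needs only O(1) extra memory, which is the advantage over Merge Sort mentioned on slide 20.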

  30. Merge Sort • Uses recursion. Slightly faster than Heap Sort, but uses twice as much memory because of the second array. • Sometimes called a “divide and conquer” sort. • Works by recursively splitting an array into two equal halves, sorting each half, then re-merging them back into a new array.
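The split-sort-merge recursion can be sketched as:

```python
def merge_sort(items):
    """Merge sort: recursively split the list in half, sort each half,
    then merge the two sorted halves into a new list."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):   # take the smaller head item
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])                   # one side may have leftovers
    merged.extend(right[j:])
    return merged
```

The merged output arrays are the "twice as much memory" the slide refers to: each level of the recursion allocates O(N) extra space.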

  31. Quick Sort • The most efficient O(N log N) algorithm available • Works by first randomly choosing a pivot point, which will hopefully be close to the median of the array. All elements less than the pivot are transferred to a new array, and all elements greater are transferred to a second array.

  32. Quick Sort • These new arrays are recursively quick sorted until they have been split down to a single element. The sorted elements smaller than the pivot are placed in a new array, then the pivot is placed after, and finally the elements greater than the pivot. These elements are now sorted.

  33. Quick Sort • Note: Although it is the quickest sorting algorithm, a badly chosen pivot may cause Quick Sort to run in O(N^2). • Different versions of Quick Sort address this problem in different ways. • For example, one way (“median-of-three”) is to randomly choose 3 elements and use the median element as the pivot.
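The two-array partitioning scheme described in slides 31–32 can be sketched in Python with a random pivot (an in-place partition is more common in practice, but this version matches the slides' description):

```python
import random

def quick_sort(items):
    """Quick sort: partition around a randomly chosen pivot, then
    recursively sort the smaller and larger partitions."""
    if len(items) <= 1:                # a single element is already sorted
        return list(items)
    pivot = random.choice(items)
    smaller = [x for x in items if x < pivot]    # first new array
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]     # second new array
    return quick_sort(smaller) + equal + quick_sort(larger)
```

Keeping the elements equal to the pivot in their own list avoids the worst-case behavior that would otherwise occur on arrays of many duplicate values.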

  34.–38. Pivot (animation: partitioning the array around the pivot)

  39. What’s the point?

  40. Binary Searching • Binary search achieves O(log N) efficiency on sorted data • Similar to the High-Low guessing game • Each comparison eliminates half of the remaining elements • Although hashing offers a quicker O(1) search, binary search is simpler and uses much less memory.

  41.–42. Binary Searching (animation: searching for 14)
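The halving step can be sketched in Python (a minimal version, assuming the input list is sorted in ascending order):

```python
def binary_search(sorted_items, target):
    """Binary search: repeatedly compare the target against the middle
    element, discarding the half that cannot contain it."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid                 # found: return its index
        if sorted_items[mid] < target:
            lo = mid + 1               # discard the lower half
        else:
            hi = mid - 1               # discard the upper half
    return -1                          # target is not present
```

This is the payoff of sorting: on a million-element list, the loop runs at most about 20 times, versus up to a million comparisons for a linear scan.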
