
Design and Analysis of Algorithms




  1. Design and Analysis of Algorithms • Khawaja Mohiuddin • Assistant Professor, Department of Computer Sciences • Bahria University, Karachi Campus • Contact: khawaja.mohiuddin@bimcs.edu.pk • Lecture # 6 – Sorting Algorithms

  2. More Sorting Algorithms • Quick Sort • The quicksort algorithm works by sub-dividing an array into two pieces and then calling itself recursively to sort the pieces • The following pseudo-code shows the algorithm at a high level:

     Quicksort(Data: values[], Integer: start, Integer: end)
         Pick a dividing item from the array. Call it divider.
         Move items < divider to the front of the array.
         Move items >= divider to the end of the array.
         Let middle be the index between the pieces where divider is put.
         // Recursively sort the two halves of the array
         Quicksort(values, start, middle - 1)
         Quicksort(values, middle + 1, end)
     End Quicksort
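For concreteness, here is the same high-level idea as a minimal Python sketch. This is an illustrative simplification that builds new lists rather than sorting in place; the in-place version from slides 7-8 appears after slide 8:

     def quicksort(values):
         """Minimal, not-in-place quicksort: pick a divider, partition, recurse."""
         if len(values) <= 1:
             return values                     # zero or one item is already sorted
         divider = values[0]                   # pick the first item as the divider
         smaller = [v for v in values[1:] if v < divider]
         larger = [v for v in values[1:] if v >= divider]
         return quicksort(smaller) + [divider] + quicksort(larger)

     print(quicksort([6, 3, 8, 6, 1, 9, 2]))   # [1, 2, 3, 6, 6, 8, 9]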

  3. More Sorting Algorithms • Quick Sort (contd.) • (Figure: an example array before partitioning, after partitioning around the divider, and after the recursive calls; described on the next slide)

  4. More Sorting Algorithms • Quick Sort (contd.) • In the example above, we can pick the first value, 6, as the divider • In the middle image, values less than the divider have been moved to the beginning of the array, and values greater than or equal to the divider have been moved to the end of the array • The divider item is shaded at index 6 • Notice that one other item has the value 6, and it comes after the divider in the array • The algorithm then calls itself recursively to sort the two pieces of the array before and after the divider item • The result is shown at the bottom of the example

  5. More Sorting Algorithms • Quick Sort (contd.) • All the items in the original array are present at each level of the quicksort call tree, so each level of the tree contains N items • If we add up the items that the calls to quicksort must examine at any level of the tree, we get N items

  6. More Sorting Algorithms • Quick Sort (contd.) • That means the calls to quicksort on any level require N steps • The tree is log N levels tall, and each level requires N steps, so the algorithm's total runtime is O(N log N) • Like heapsort, quicksort has O(N log N) expected performance • Quicksort can have O(N²) performance in the worst case, which occurs when the dividing item is the smallest or largest item in the part of the array being divided (the worst case also occurs if all the items in the array have the same value) • Heapsort has O(N log N) performance in all cases, so it is in some sense safer and more elegant • But in practice, quicksort is usually faster than heapsort, so it is the algorithm of choice for most programmers • It is also the algorithm used in most libraries • It is also parallelizable

  7. More Sorting Algorithms • Quick Sort (contd.) • The following pseudo-code shows the entire quicksort algorithm at a low level:

     // Sort the indicated part of the array
     Quicksort(Data: values[], Integer: start, Integer: end)
         If (start >= end) Then Return      // If the list has no more than one element, it's sorted
         Integer: divider = values[start]   // Use the first item as the dividing item
         Integer: lo = start                // Move items < divider to the front of the array and
         Integer: hi = end                  // items >= divider to the end of the array
         While (True)
             // Search the array from back to front starting at "hi" to find the last item where
             // value < divider. Move that item into the hole. The hole is now where that item was.
             While (values[hi] >= divider)
                 hi = hi - 1
                 If (hi <= lo) Then <Break out of the inner While loop.>
             End While

  8. More Sorting Algorithms • Quick Sort (contd.)

             If (hi <= lo) Then
                 // The left and right pieces have met in the middle, so we are done.
                 // Put the divider here, and break out of the outer While loop.
                 values[lo] = divider
                 <Break out of the outer While loop.>
             End If
             values[lo] = values[hi]   // Move the value we found to the lower half
             // Search the array from front to back starting at "lo" to find the first item where
             // value >= divider. Move that item into the hole. The hole is now where that item was.
             lo = lo + 1
             While (values[lo] < divider)
                 lo = lo + 1
                 If (lo >= hi) Then <Break out of the inner While loop.>
             End While
             If (lo >= hi) Then
                 // The left and right pieces have met in the middle, so we are done.
                 lo = hi
                 values[hi] = divider   // Put the divider here, and break out of the outer While loop.
                 <Break out of the outer While loop.>
             End If
             values[hi] = values[lo]   // Move the value we found to the upper half.
         End While
         Quicksort(values, start, lo - 1)   // Recursively sort the two halves.
         Quicksort(values, lo + 1, end)
     End Quicksort
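A runnable Python translation of this in-place, hole-based partitioning; a sketch that follows the pseudo-code above, with variable names matching the slides:

     def quicksort(values, start, end):
         """In-place quicksort using the 'hole' partitioning scheme from the slides."""
         if start >= end:
             return                            # one item or fewer: already sorted
         divider = values[start]               # first item is the divider; its slot is the hole
         lo, hi = start, end
         while True:
             while values[hi] >= divider:      # scan back to front for an item < divider
                 hi -= 1
                 if hi <= lo:
                     break
             if hi <= lo:
                 values[lo] = divider          # pieces met: drop the divider into the hole
                 break
             values[lo] = values[hi]           # move the found item into the hole at lo
             lo += 1
             while values[lo] < divider:       # scan front to back for an item >= divider
                 lo += 1
                 if lo >= hi:
                     break
             if lo >= hi:
                 lo = hi
                 values[hi] = divider          # pieces met: drop the divider into the hole
                 break
             values[hi] = values[lo]           # move the found item into the hole at hi
         quicksort(values, start, lo - 1)      # recursively sort the two pieces
         quicksort(values, lo + 1, end)

     data = [6, 7, 14, 6, 5, 0, 13, 3]
     quicksort(data, 0, len(data) - 1)
     print(data)                               # [0, 3, 5, 6, 6, 7, 13, 14]

Note that because the first item serves as the divider, an already-sorted array triggers the O(N²) worst case mentioned earlier: every partition is maximally lopsided, and the recursion depth grows to N.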

  9. More Sorting Algorithms • Merge Sort • Like quicksort, mergesort uses a divide-and-conquer strategy • Instead of picking a dividing item and splitting the items into two groups holding items that are larger and smaller than the dividing item, mergesort splits the items into two halves of equal size • It then recursively calls itself to sort the two halves • When the recursive calls to mergesort return, the algorithm merges the two sorted halves into a combined sorted list • The following pseudo-code shows the algorithm:

     Mergesort(Data: values[], Data: scratch[], Integer: start, Integer: end)
         If (start == end) Then Return   // If the array contains only one item, it is already sorted.
         Integer: midpoint = (start + end) / 2          // Break the array into left and right halves
         Mergesort(values, scratch, start, midpoint)    // Call mergesort to sort the two halves
         Mergesort(values, scratch, midpoint + 1, end)
         Integer: left_index = start                    // Merge the two sorted halves

  10. More Sorting Algorithms • Merge Sort (contd.)

         Integer: right_index = midpoint + 1
         Integer: scratch_index = left_index
         While (left_index <= midpoint) And (right_index <= end)
             If (values[left_index] <= values[right_index]) Then
                 scratch[scratch_index] = values[left_index]
                 left_index = left_index + 1
             Else
                 scratch[scratch_index] = values[right_index]
                 right_index = right_index + 1
             End If
             scratch_index = scratch_index + 1
         End While

  11. More Sorting Algorithms • Merge Sort (contd.)

         For i = left_index To midpoint   // Finish copying whichever half is not empty
             scratch[scratch_index] = values[i]
             scratch_index = scratch_index + 1
         Next i
         For i = right_index To end
             scratch[scratch_index] = values[i]
             scratch_index = scratch_index + 1
         Next i
         For i = start To end   // Copy the values back into the original values array.
             values[i] = scratch[i]
         Next i
     End Mergesort
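The same algorithm as a runnable Python sketch, keeping the explicit scratch array from the pseudo-code:

     def mergesort(values, scratch, start, end):
         """Mergesort with an explicit scratch array, following the slides' pseudo-code."""
         if start >= end:
             return                                     # zero or one item: already sorted
         midpoint = (start + end) // 2
         mergesort(values, scratch, start, midpoint)    # sort the left half
         mergesort(values, scratch, midpoint + 1, end)  # sort the right half

         # Merge the two sorted halves into the scratch array.
         left_index, right_index, scratch_index = start, midpoint + 1, start
         while left_index <= midpoint and right_index <= end:
             if values[left_index] <= values[right_index]:
                 scratch[scratch_index] = values[left_index]
                 left_index += 1
             else:
                 scratch[scratch_index] = values[right_index]
                 right_index += 1
             scratch_index += 1

         # Finish copying whichever half is not empty.
         for i in range(left_index, midpoint + 1):
             scratch[scratch_index] = values[i]
             scratch_index += 1
         for i in range(right_index, end + 1):
             scratch[scratch_index] = values[i]
             scratch_index += 1

         # Copy the merged values back into the original array.
         for i in range(start, end + 1):
             values[i] = scratch[i]

     data = [61, 29, 7, 63, 19, 85, 9, 91, 10, 97]
     mergesort(data, [0] * len(data), 0, len(data) - 1)
     print(data)                                        # [7, 9, 10, 19, 29, 61, 63, 85, 91, 97]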

  12. More Sorting Algorithms • Merge Sort (contd.) • This algorithm also has O(N log N) run time • Like heapsort's, mergesort's run time does not depend on the initial arrangement of the items, so it always has O(N log N) run time and does not have a disastrous worst case like quicksort does • Like quicksort, mergesort is parallelizable • Mergesort is particularly useful when all the data to be sorted won't fit in memory at once • For example, suppose a program needs to sort 1 billion customer records, each of which occupies 1 MB. Loading all the data into memory at once would require 10^15 bytes of memory, or 1,000 TB, which is much more than most computers have • The mergesort algorithm, however, doesn't need to load that much memory all at once. The algorithm doesn't even need to look at any of the items in the array until after its recursive calls to itself have returned • At that point, the algorithm walks through the two sorted halves in a linear fashion and merges them. Moving through the items linearly reduces the computer's need to page memory to and from disk.

  13. More Sorting Algorithms • Counting Sort • Countingsort works if the values we are sorting are integers that lie in a relatively small range • For example, if we need to sort 1 million integers with values between 0 and 1,000, countingsort can provide amazingly fast performance • The basic idea behind countingsort is to count the number of items in the array that have each value • Then it is relatively easy to copy each value, in order, the required number of times back into the array • The following pseudo-code shows the countingsort algorithm:

     Countingsort(Integer: values[], Integer: max_value)
         Integer: counts[0 To max_value]   // Make an array to hold the counts
         For i = 0 To max_value            // Initialize the array to hold the counts.
             // (This is not necessary in all programming languages.)
             counts[i] = 0
         Next i

  14. More Sorting Algorithms • Counting Sort (contd.)

         // Count the items with each value
         For i = 0 To <Length of values> - 1
             counts[values[i]] = counts[values[i]] + 1   // Add 1 to the count for this value
         Next i
         // Copy the values back into the array
         Integer: index = 0
         For i = 0 To max_value
             // Copy the value i into the array counts[i] times
             For j = 1 To counts[i]
                 values[index] = i
                 index = index + 1
             Next j
         Next i
     End Countingsort
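The whole algorithm is short enough to show as a runnable Python sketch:

     def countingsort(values, max_value):
         """Counting sort for integers in the range 0..max_value, per the slides."""
         counts = [0] * (max_value + 1)    # one counter per possible value
         for v in values:
             counts[v] += 1                # count the items with each value
         index = 0
         for i in range(max_value + 1):    # copy value i back counts[i] times, in order
             for _ in range(counts[i]):
                 values[index] = i
                 index += 1

     data = [4, 1, 3, 4, 3, 0, 2, 1]
     countingsort(data, 4)
     print(data)                           # [0, 1, 1, 2, 3, 3, 4, 4]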

  15. More Sorting Algorithms • Counting Sort (contd.) • Let M be the number of items in the counts array (so M = max_value + 1), and let N be the number of items in the values array • If your programming language doesn't automatically initialize the counts array to contain 0s, the algorithm spends M steps initializing the array. It then takes N steps to count the values in the array • The algorithm finishes by copying the values back into the original array • Each value is copied once, so that part takes N steps. If any of the entries in the counts array is still 0, the program also spends some time skipping over that entry • In the worst case, if all the values are the same, the counts array contains mostly 0s, and it takes M steps to skip over those 0 entries • That makes the total runtime O(2 × N + M) = O(N + M) • If M is relatively small compared to N, this is much smaller than the O(N log N) performance of the algorithms described previously • In one test, quicksort (in its worst case) took 4.29 seconds to sort 1 million items with values between 0 and 1,000, but countingsort took only 0.03 seconds • With similar values, heapsort took roughly 1.02 seconds
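The exact figures depend on the test machine, but the gap is easy to reproduce. A rough timing sketch, reusing the countingsort function above and Python's built-in sorted (Timsort) as a stand-in comparison sort:

     import random
     import timeit

     data = [random.randint(0, 1000) for _ in range(1_000_000)]

     # One run each; data.copy() because countingsort sorts in place.
     print("countingsort:", timeit.timeit(lambda: countingsort(data.copy(), 1000), number=1))
     print("sorted():    ", timeit.timeit(lambda: sorted(data), number=1))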

  16. More Sorting Algorithms • Bucket Sort • The bucketsort algorithm (also called bin sort) works by dividing items into buckets • It sorts the buckets, either by recursively calling bucketsort or by using some other algorithm, and then concatenates the buckets' contents back into the original array • The following pseudo-code shows the algorithm at a high level:

     Bucketsort(Data: values[])
         <Make buckets.>
         <Distribute the items into the buckets.>
         <Sort the buckets.>
         <Gather the bucket values back into the original array.>
     End Bucketsort

  17. More Sorting Algorithms • Bucket Sort (contd.) • For example, suppose we have the array [19, 10, 9, 7, 29, 63, 61, 85, 91, 97] and buckets for the ranges 0-19, 20-39, 40-59, 60-79 and 80-99:

     Distribute:  0-19: 19, 10, 9, 7 | 20-39: 29 | 40-59: (empty) | 60-79: 63, 61 | 80-99: 85, 91, 97
     Sort:        0-19: 7, 9, 10, 19 | 20-39: 29 | 40-59: (empty) | 60-79: 61, 63 | 80-99: 85, 91, 97
     Gather:      7, 9, 10, 19, 29, 61, 63, 85, 91, 97

  18. More Sorting Algorithms • Bucket Sort (contd.) • The buckets can be stacks, linked lists, queues, arrays or any other data structure that you find convenient • If the array contains N fairly evenly distributed items, distributing them into the buckets requires N steps times whatever time it takes to place an item in a bucket • Ignoring the constant time to place an item in a bucket, distributing the items takes O(N) steps • If we use M buckets, sorting each bucket requires an expected F(N/M) steps, where F is the runtime function of the sorting algorithm that we use to sort the buckets • Multiplying by the number of buckets M, the total time to sort all the buckets is O(M × F(N/M)) • After the buckets are sorted, gathering their values back into the array requires O(N) steps • Total runtime: O(N) + O(M × F(N/M)) + O(N) = O(N + M × F(N/M)) • If M is a fixed fraction of N, N/M is a constant, so F(N/M) is also a constant and this simplifies to O(N + M) • In practice, M must be a relatively large fraction of N for the algorithm to perform well • Unlike countingsort's, bucketsort's performance does not depend on the range of the values • Instead, it depends on the number of buckets we use
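A minimal Python sketch of the distribute / sort / gather steps, using the example from slide 17. The range-based bucket mapping is one illustrative choice (it assumes values between 0 and max_value); the buckets here are Python lists sorted with the built-in sort:

     def bucketsort(values, max_value, num_buckets):
         """Bucket sort sketch: distribute, sort each bucket, gather."""
         buckets = [[] for _ in range(num_buckets)]
         for v in values:
             # Map the value's range 0..max_value onto a bucket index.
             buckets[v * num_buckets // (max_value + 1)].append(v)
         index = 0
         for bucket in buckets:
             bucket.sort()             # any sorting algorithm works here
             for v in bucket:          # gather the bucket's values back into the array
                 values[index] = v
                 index += 1

     data = [19, 10, 9, 7, 29, 63, 61, 85, 91, 97]
     bucketsort(data, 99, 5)           # 5 buckets: 0-19, 20-39, 40-59, 60-79, 80-99
     print(data)                       # [7, 9, 10, 19, 29, 61, 63, 85, 91, 97]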

  19. More Sorting Algorithms • Summary – Sorting Algorithms • (Runtimes as given in this lecture)

     Algorithm      Expected      Worst case    Notes
     Quicksort      O(N log N)    O(N²)         Usually fastest in practice; parallelizable
     Heapsort       O(N log N)    O(N log N)    Same performance in all cases
     Mergesort      O(N log N)    O(N log N)    Good when data won't fit in memory; parallelizable
     Countingsort   O(N + M)      O(N + M)      Integers in a small range; M = max_value + 1
     Bucketsort     O(N + M)      depends on F  Depends on the number of buckets, not the value range
