
Algorithms


  1. Algorithms, Ch 2: Getting Started. Ming-Te Chi

  2. 2.1 Insertion sort
• Example: the sorting problem
• Input: a sequence of n numbers ⟨a1, a2, …, an⟩.
• Output: a permutation (reordering) ⟨a'1, a'2, …, a'n⟩ of the input sequence such that a'1 ≤ a'2 ≤ … ≤ a'n.
• The numbers that we wish to sort are known as the keys.

  3. Pseudocode: Insertion sort

Insertion-Sort(A)
1  for j ← 2 to length[A]
2      do key ← A[j]
3         ▷ Insert A[j] into the sorted sequence A[1..j-1]
4         i ← j - 1
5         while i > 0 and A[i] > key
6             do A[i+1] ← A[i]
7                i ← i - 1
8         A[i+1] ← key
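The pseudocode uses 1-based arrays and ← for assignment. A runnable Python version of the same procedure (a sketch, using 0-based indexing) might look like this:

def insertion_sort(a):
    # Sort the list a in place, mirroring Insertion-Sort above.
    for j in range(1, len(a)):            # pseudocode line 1 (j = 2 .. length[A])
        key = a[j]                        # line 2
        # Insert a[j] into the sorted prefix a[0..j-1] (line 3)
        i = j - 1                         # line 4
        while i >= 0 and a[i] > key:      # line 5 (i > 0 becomes i >= 0 with 0-based indexing)
            a[i + 1] = a[i]               # line 6
            i -= 1                        # line 7
        a[i + 1] = key                    # line 8

For example, with a = [5, 2, 4, 6, 1, 3], insertion_sort(a) leaves a as [1, 2, 3, 4, 5, 6].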

  4. The operation of Insertion-Sort (figure)

  5. Sorted in place • The numbers are rearranged within the array A, with at most a constant number of them stored outside the array at any time.

  6. Loop invariant
• Initialization: it is true prior to the first iteration of the loop.
• Maintenance: if it is true before an iteration of the loop, it remains true before the next iteration.
• Termination: when the loop terminates, the invariant gives us a useful property that helps show that the algorithm is correct.
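To make this concrete for insertion sort, here is a hedged sketch (a variant of the Python code above, under a hypothetical name insertion_sort_checked) that asserts the invariant "the prefix a[0..j-1] is in sorted order" at the start of each iteration:

def insertion_sort_checked(a):
    # Insertion sort with a runtime check of the loop invariant, for illustration only.
    for j in range(1, len(a)):
        # Invariant: a[0..j-1] is sorted (trivially true when j == 1: initialization).
        assert all(a[i] <= a[i + 1] for i in range(j - 1)), "loop invariant violated"
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                    # maintenance: a[0..j] is now sorted
    # Termination: j has run past the last index, so the whole list is sorted.
    return a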

  7. 2.2 Analyzing algorithms
• Analyzing an algorithm has come to mean predicting the resources that the algorithm requires.
• Resources: memory, communication bandwidth, logic gates (hardware), and time.
• Assumption (model): the RAM (random-access machine) model, with a single processor.
• (We shall have occasion to investigate models for parallel computers and digital hardware.)

  8. 2.2 Analyzing algorithms (continued)
• The best notion for input size depends on the problem being studied.
• The running time of an algorithm on a particular input is the number of primitive operations or "steps" executed.
• It is convenient to define the notion of a step so that it is as machine-independent as possible.

  9. Analyzing insertion sort (figure)

  10. The running time
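Presumably this slide showed the standard CLRS expression for insertion sort's running time. Assuming each pseudocode line i has a constant cost c_i (with the comment on line 3 costing nothing) and the while-loop test on line 5 executes t_j times for a given value of j, the total running time is

  T(n) = c_1 n + c_2 (n-1) + c_4 (n-1) + c_5 \sum_{j=2}^{n} t_j + c_6 \sum_{j=2}^{n} (t_j - 1) + c_7 \sum_{j=2}^{n} (t_j - 1) + c_8 (n-1).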

  11. The best-case running time
• The best case occurs when the array is already sorted: for j = 2, 3, …, n the while-loop test runs only once (t_j = 1).
• T(n) is then a linear function of n.

  12. The worst-case running time
• The worst case occurs when the array is in reverse sorted order: for j = 2, 3, …, n the key is compared with every element of the sorted prefix (t_j = j).
• T(n) is then a quadratic function of n.
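Continuing the reconstruction above with t_j = j, the sums become

  \sum_{j=2}^{n} j = \frac{n(n+1)}{2} - 1   and   \sum_{j=2}^{n} (j - 1) = \frac{n(n-1)}{2},

so T(n) can be written as an^2 + bn + c for constants a, b, c that depend only on the line costs c_i, i.e. a quadratic function of n.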

  13. Worst-case and average-case analysis
• Usually, we concentrate on finding only the worst-case running time.
• Reasons:
• It is an upper bound on the running time for any input.
• The worst case occurs fairly often for some algorithms.
• The average case is often roughly as bad as the worst case. For example, for insertion sort the average case is again a quadratic function of n. (Why? A sketch of the answer follows below.)
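A sketch of the standard answer: if the input is a random permutation, then on average about half of the elements in the sorted prefix A[1..j-1] are greater than the key, so the while loop shifts roughly half of them (t_j ≈ j/2). The dominant term of T(n) is then on the order of

  \sum_{j=2}^{n} \frac{j}{2} = \Theta(n^2),

so the average case has the same order of growth as the worst case.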

  14. In some particular cases, we shall be interested in the average-case, or expected, running time of an algorithm.

  15. Order of growth • We consider only the leading term of a formula (e.g., the an² term of an² + bn + c). • It is the rate of growth, or order of growth, of the running time that really interests us. (More on this subject in a later lecture.)

  16. 2.3 Designing algorithms
• There are many ways to design algorithms:
• Incremental approach: insertion sort
• Divide-and-conquer: merge sort (recursive)
• Divide the problem into a number of subproblems.
• Conquer the subproblems by solving them recursively.
• Combine the solutions to the subproblems into the solution for the original problem.

  17. Merge sort • Key operation of merge sort: merging two sorted subsequences. • Use an auxiliary procedure Merge(A, p, q, r), where A is an array and p, q, r are indices with p ≤ q < r.

  18. Merge(A, p, q, r)
1  n1 ← q - p + 1
2  n2 ← r - q
3  create arrays L[1..n1+1] and R[1..n2+1]
4  for i ← 1 to n1
5      do L[i] ← A[p + i - 1]
6  for j ← 1 to n2
7      do R[j] ← A[q + j]
8  L[n1 + 1] ← ∞
9  R[n2 + 1] ← ∞

  19. Merge(A, p, q, r), continued
10  i ← 1
11  j ← 1
12  for k ← p to r
13      do if L[i] ≤ R[j]
14            then A[k] ← L[i]
15                 i ← i + 1
16            else A[k] ← R[j]
17                 j ← j + 1
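A runnable Python sketch of the Merge procedure (0-based, with p..r inclusive; float('inf') plays the role of the ∞ sentinels):

def merge(a, p, q, r):
    # Merge the sorted subarrays a[p..q] and a[q+1..r] in place.
    left = a[p:q + 1] + [float('inf')]        # L with sentinel
    right = a[q + 1:r + 1] + [float('inf')]   # R with sentinel
    i = j = 0
    for k in range(p, r + 1):
        if left[i] <= right[j]:
            a[k] = left[i]
            i += 1
        else:
            a[k] = right[j]
            j += 1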

  20. Example of Merge(A, p, q, r) (figure)

  21. Example of Merge(A, p, q, r), continued (figure)

  22. MERGE-SORT(A, p, r)
1  if p < r
2      then q ← ⌊(p + r)/2⌋
3           MERGE-SORT(A, p, q)
4           MERGE-SORT(A, q + 1, r)
5           MERGE(A, p, q, r)
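The corresponding recursive driver in Python, a sketch that assumes the merge helper above (0-based, inclusive indices):

def merge_sort(a, p=0, r=None):
    # Sort a[p..r] in place by divide and conquer.
    if r is None:
        r = len(a) - 1
    if p < r:
        q = (p + r) // 2           # floor of the midpoint
        merge_sort(a, p, q)        # conquer the left half
        merge_sort(a, q + 1, r)    # conquer the right half
        merge(a, p, q, r)          # combine the two sorted halves

For example, with a = [5, 2, 4, 7, 1, 3, 2, 6], merge_sort(a) leaves a as [1, 2, 2, 3, 4, 5, 6, 7].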

  23. Example of merge sort (figure)

  24. Analysis of merge sort • Use a recurrence to analyze divide-and-conquer algorithms.

  25. Analysis of merge sort • The running time satisfies T(n) = c if n = 1, and T(n) = 2T(n/2) + cn if n > 1, where the constant c represents the time required to solve problems of size 1 as well as the time per array element of the divide and combine steps.

  26. Recursion tree (figure)
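Sketching the argument the recursion-tree figure makes, under the usual assumption that n is an exact power of 2: the tree has \lg n + 1 levels and each level contributes a total cost of cn, so

  T(n) = cn(\lg n + 1) = cn \lg n + cn = \Theta(n \lg n).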

  27. Recursion tree, continued (figure)

  28. Recursion tree, continued (figure)

  29. Merge sort • Merge sort's Θ(n lg n) running time outperforms insertion sort's Θ(n²) for large inputs!
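A quick empirical check, a sketch that reuses the insertion_sort and merge_sort functions from the earlier sketches (the actual timings depend on the machine and on n):

import random
import timeit

data = [random.random() for _ in range(5000)]

# Time one run of each sort on a fresh copy of the same input.
t_insertion = timeit.timeit(lambda: insertion_sort(data[:]), number=1)
t_merge = timeit.timeit(lambda: merge_sort(data[:]), number=1)
print(f"insertion sort: {t_insertion:.3f} s, merge sort: {t_merge:.3f} s")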
