
Algorithms



Presentation Transcript


1. Algorithms
• For the next few weeks, we are studying optimization problems.
• Analogous to some problems of differential calculus. E.g. "What number exceeds its square by the greatest amount?"
• Input contains a list of numbers.
• A function is applied to a selection of values in the list
  • individual value, subset, or permutation
• We seek the selection that gives us the max or min value of the function.
• E.g. Which pair of values has the greatest difference? (See the sketch below.)
• Algorithm needs to be correct and efficient.
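As a concrete version of the "greatest difference" question, here is a minimal Python sketch (the function name max_difference and the sample list are ours, not from the slides) that checks every pair by brute force:

```python
from itertools import combinations

def max_difference(values):
    """Return the pair of values whose absolute difference is greatest,
    found by brute-force comparison of every pair."""
    best_pair, best_diff = None, float("-inf")
    for a, b in combinations(values, 2):      # every unordered pair
        if abs(a - b) > best_diff:
            best_pair, best_diff = (a, b), abs(a - b)
    return best_pair, best_diff

print(max_difference([7, 2, 9, 4, 1]))        # ((9, 1), 8)
```

Of course, max(values) - min(values) gives the same answer in linear time, which previews the slide's point that an algorithm should be both correct and efficient.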

2. Algorithms
• Consider: What is the difference between Dijkstra's algorithm and the TSP?
• Common algorithm strategies:
  • Brute force (see the sketch below)
    • All combinations of a set: 2^n
    • All permutations of a set: n!
  • Random sampling
  • Greedy (Ch. 10): often a good approach
  • Divide & conquer (Ch. 11)
  • Dynamic programming (Ch. 12): when greedy doesn't give us the best answer
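To make the 2^n and n! counts concrete, a small sketch using Python's itertools (the choice of Python and of a 3-element set is ours):

```python
from itertools import chain, combinations, permutations

s = [1, 2, 3]

# All 2^n subsets: combinations of every size from 0 to n.
subsets = list(chain.from_iterable(combinations(s, k) for k in range(len(s) + 1)))
print(len(subsets))                 # 8 == 2**3

# All n! orderings of the set.
print(len(list(permutations(s))))   # 6 == 3!
```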

3. Examples
• What is the probability that the product of two 3-digit numbers ends in 0?
  • We can quickly enumerate all instances of the problem.
  • What if we tried 7-digit numbers instead? Brute force is impractical. Why?
  • Better to randomize the input, and do as many trials as desired to achieve an acceptably precise answer. (See the sketch below.)
• Exhaustive comparison of two similar sorting methods:
  • For small array sizes, we can generate all permutations.
  • Otherwise, choose random permutations.
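A sketch of the two approaches contrasted on this slide (the function names exhaustive and sampled are ours). Enumerating every pair of 3-digit factors is only 900 x 900 = 810,000 cases; for 7-digit factors there are about 8.1 x 10^13 pairs, so a random-sampling estimate is the practical choice:

```python
import random

def exhaustive():
    # Every ordered pair of 3-digit numbers: 900 * 900 = 810,000 cases.
    hits = sum((a * b) % 10 == 0
               for a in range(100, 1000) for b in range(100, 1000))
    return hits / (900 * 900)

def sampled(digits=7, trials=1_000_000):
    # Estimate the same probability for larger factors by random trials.
    lo, hi = 10**(digits - 1), 10**digits - 1
    hits = sum((random.randint(lo, hi) * random.randint(lo, hi)) % 10 == 0
               for _ in range(trials))
    return hits / trials

print(exhaustive())   # exact answer for 3-digit factors
print(sampled())      # estimate for 7-digit factors; precision grows with trials
```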

4. Chapter 10
• "Greedy algorithms"
• It's a strategy for solving some problems.
• We need to make a series of choices.
• Each choice is made to maximize the current benefit, ignoring the future.
• Often, some sorting is necessary.
• How do we know a greedy technique works? We try to show that no other algorithm can achieve a better result.
• Several examples follow.

5. (1) Maximizing jobs
• Given a set of jobs, with specific start and finish times, what is the maximum # of jobs we can schedule?
• Assume that you can't do 2 jobs at the same time.
• Assume time is represented as a whole number, and you may start a job as soon as another is finished.
• Try this (a runnable version follows below):
  Sort list of jobs ascending by finish time.
  W = set of jobs we'll do: initialize to empty.
  for j = 1 to n:
    if (job j doesn't conflict with W)
      add j to W
  return W
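The pseudocode above as a Python sketch (the name max_jobs and the tuple representation are ours). Because W is kept in finish-time order, it is enough to compare each candidate against the finish time of the last job taken:

```python
def max_jobs(jobs):
    """Greedy interval scheduling: jobs are (start, finish) pairs.
    Sort by finish time, then take every job that starts no earlier
    than the finish of the last job taken."""
    W = []                            # jobs we'll do
    last_finish = float("-inf")
    for start, finish in sorted(jobs, key=lambda j: j[1]):
        if start >= last_finish:      # no conflict with any job already in W
            W.append((start, finish))
            last_finish = finish
    return W
```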

6. Example
• Suppose we have this list of jobs, with (start, finish) times:
  j1=(1,4)  j2=(3,5)  j3=(0,6)  j4=(4,7)  j5=(3,8)  j6=(5,9)  j7=(6,10)  j8=(8,11)
• Jobs are already sorted by finish time.
• Add j1 to W.
• We see that j2 and j3 conflict. (They don't get reconsidered later.)
• Add j4 to W.
• We see that j5, j6 and j7 conflict.
• Finally, add j8 to W. (See the run below.)
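Running the max_jobs sketch from the previous slide's note on these jobs reproduces the walkthrough:

```python
jobs = [(1, 4), (3, 5), (0, 6), (4, 7), (3, 8), (5, 9), (6, 10), (8, 11)]
print(max_jobs(jobs))   # [(1, 4), (4, 7), (8, 11)]  -> j1, j4, j8
```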

7. Optimal
• This greedy algorithm produces an optimal schedule.
• The goal is to maximize the # of jobs, given fixed start and finish times.
• If the greedy algorithm schedules k jobs, no other algorithm can schedule k+1.
• Proof by contradiction:
  • Let i1, i2, i3, ..., ik be the jobs chosen by the greedy algorithm.
  • Let j1, j2, j3, ..., jm be the jobs in an optimal solution, and suppose m > k. (Among optimal solutions, pick one that matches the greedy schedule for as long as possible.)
  • Define r as the largest integer such that the first r jobs in each list match.
  • Then jobs i(r+1) and j(r+1) differ. Because the greedy algorithm always picks the compatible job with the earliest finish, i(r+1) finishes no later than j(r+1).
  • So, in the optimal solution, replace j(r+1) with i(r+1). The set of jobs is still legal and still optimal, but now the first r+1 jobs match. That contradicts the choice of r.

8. (2) Minimizing rooms
• A conference consists of many events: lectures and workshops of various lengths.
• Like a job, each event has a known start and finish time.
• To simplify the model, again we'll assume time is expressed in whole numbers.
• Produce a schedule that minimizes the number of rooms.
  • Assume that no 2 events may take place at the same time in the same room.
  • Inside a room, an event may begin as soon as another one ends.
• Similar to the previous problem, except we have to schedule all the events. We are no longer asking whether a single room holds the maximum possible number of events.

9. Algorithm
  Sort list of events ascending by start time.
  numRooms = 0
  for i = 1 to n:
    if event #i is compatible with some room r:
      schedule event #i in room r
    else:
      ++numRooms
      schedule event #i in the new room
  return room list schedule
• Example (a runnable version follows below):
  e1(0,7)  e2(0,3)  e3(4,7)  e4(4,10)  e5(8,11)  e6(8,11)  e7(10,15)  e8(12,15)
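A Python sketch of the algorithm (the names are ours). To find a compatible room quickly it keeps the rooms in a min-heap keyed by the time each room frees up; the heap is an implementation choice, not something the slide prescribes:

```python
import heapq

def assign_rooms(events):
    """Greedy room assignment: events are (start, finish) pairs.
    Process events by start time; reuse the room that frees up earliest
    if it is free by this event's start, otherwise open a new room."""
    rooms = []       # min-heap of (finish time of that room's last event, room id)
    schedule = {}    # room id -> list of events placed in that room
    for start, finish in sorted(events):
        if rooms and rooms[0][0] <= start:   # earliest-freeing room is free in time
            _, room_id = heapq.heappop(rooms)
        else:                                # every open room conflicts: new room
            room_id = len(schedule)
            schedule[room_id] = []
        schedule[room_id].append((start, finish))
        heapq.heappush(rooms, (finish, room_id))
    return schedule

events = [(0, 7), (0, 3), (4, 7), (4, 10), (8, 11), (8, 11), (10, 15), (12, 15)]
print(len(assign_rooms(events)))   # 3 rooms suffice for the slide's example
```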

10. Optimal
• We can scan the list of events to see what the minimum number of rooms is: at each time t, count how many events are in progress. (See the sketch below.)
• We want to show the greedy algorithm generates a schedule with this optimal number of rooms.
• Let d = # rooms the greedy algorithm schedules.
• The "critical instant" is when Room #d is created. We only allocate Room #d because some event #i conflicts with an event in each of the other d-1 rooms.
• Those d-1 events all finish after event #i's start time, or else there would have been no conflict.
• Those d-1 events also started no later than event #i, since events are processed in order of start time.
• Therefore, at the time event #i starts, we indeed have d events taking place at the same time. So the optimal schedule must also use at least d rooms.
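The "scan the list" check in the first bullet can be written as a sweep over event endpoints. This helper (max_depth, our own name) reports the largest number of events in progress at once, which is exactly the lower bound d that the argument compares against:

```python
def max_depth(events):
    """Maximum number of events in progress at any one time.
    An event occupying [start, finish) frees its room at `finish`,
    so a start at time t does not conflict with a finish at time t."""
    points = [(s, +1) for s, f in events] + [(f, -1) for s, f in events]
    depth = best = 0
    for _, delta in sorted(points):   # at equal times, -1 sorts before +1
        depth += delta
        best = max(best, depth)
    return best

events = [(0, 7), (0, 3), (4, 7), (4, 10), (8, 11), (8, 11), (10, 15), (12, 15)]
print(max_depth(events))   # 3, matching the greedy room count above
```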

11. (3) Fractional knapsack
• You discover a treasure chest, but you can only carry away some of the treasure. Each type of treasure has some weight and some value.
• Subject to a total weight constraint, we need to maximize the total value of the treasure we haul away.
• Formally:
  • For each item i in some set, we have a value v_i and a weight w_i. We may take an amount x_i of the item, with 0 <= x_i <= w_i.
  • Limit: the total weight taken may not exceed W, i.e. x_1 + x_2 + ... + x_n <= W.
  • We wish to maximize the sum of x_i * (v_i / w_i) over all items. You can think of x_i / w_i as the proportion of item i that you are taking.
• For example, the chest may contain silver and gold; nickels and dimes, etc.

12. Algorithm
• We're given L, a list of items to loot. (A Python version follows below.)
  Sort L descending by v_i/w_i. (This ratio is a rate, such as value per pound.)
  w = 0
  while w < weightLimit:
    x_i = take as much of L[0] as you can
    w += x_i
    if L[0] depleted, remove it from list
• Not hard to see that there is no other way to return with more valuable loot given our weight limit.
• Final note: what is the complexity of the algorithms we've seen?
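The same algorithm in Python (the function name and the sample chest are ours; the slide gives no numbers). Items are sorted by value per unit weight and taken greedily until the limit is reached:

```python
def fractional_knapsack(items, weight_limit):
    """items: list of (value, weight). Returns (total value taken,
    list of (value, weight, amount_taken))."""
    # Best rate (value per unit weight) first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total_value, taken, room = 0.0, [], weight_limit
    for value, weight in items:
        if room <= 0:
            break
        amount = min(weight, room)            # all of it, or whatever still fits
        total_value += value * amount / weight
        taken.append((value, weight, amount))
        room -= amount
    return total_value, taken

# Hypothetical chest: (value, weight) of each pile; carry at most 50 units.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))
# (240.0, [(60, 10, 10), (100, 20, 20), (120, 30, 20)])
```

Sorting dominates, so this sketch runs in O(n log n) for n item types.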

13. 0/1 Knapsack problem
• A thief wants to rob a store. Item i has value v_i and weight w_i. The knapsack has max capacity W.
• We want to take as valuable a load as possible. Which items do we take?
• "0/1" means each item is taken or not. No fractions.
• The greedy algorithm doesn't work. For example (W=5):
  • Item 1 has value 3, weight 1 (ratio 3.0)
  • Item 2 has value 5, weight 2 (ratio 2.5)
  • Item 3 has value 6, weight 3 (ratio 2.0)
  • Taking items 1+2 (no room for #3): total value 3 + 5 = 8.
  • Taking items 1+3 (no room for #2): total value 3 + 6 = 9.
  • Taking items 2+3 (no room for #1): total value 5 + 6 = 11.
• The greedy algorithm, which takes items in ratio order (1, then 2), actually gave the "worst" of these solutions.
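Because the instance is tiny, the brute-force strategy from slide 2 can verify the counterexample directly (variable names are ours):

```python
from itertools import combinations

items = [(3, 1), (5, 2), (6, 3)]   # (value, weight) for items 1, 2, 3
W = 5

# Enumerate all 2^n subsets, keep the ones that fit, take the most valuable.
best = max((subset
            for k in range(len(items) + 1)
            for subset in combinations(items, k)
            if sum(w for _, w in subset) <= W),
           key=lambda subset: sum(v for v, _ in subset))
print(best, sum(v for v, _ in best))   # ((5, 2), (6, 3)) 11, i.e. items 2 and 3
```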
