Optimization problems



Optimization Problems

  • Optimization problem: a problem of finding the best solution from all feasible solutions.

  • Two common techniques:

    • Greedy Algorithms (local)

    • Dynamic Programming (global)



Greedy Algorithms

Greedy algorithms typically consist of

  • A set of candidate solutions

  • Function that checks if the candidates are feasible

  • Selection function indicating at a given time which is the most promising candidate not yet used

  • Objective function giving the value of a solution; this is the function we are trying to optimize



Examples of Greedy Algorithms

  • Graph Algorithms

    • Breadth-First Search (shortest paths for unweighted graphs)

    • Dijkstra’s (shortest path) Algorithm

    • Minimum Spanning Trees

  • Data compression

    • Huffman coding

  • Scheduling

    • Activity Selection

    • Minimizing time in system

    • Deadline scheduling

  • Other Heuristics

    • Coloring a graph

    • Traveling Salesman

    • Set-covering



Elements of Greedy Strategy

  • Greedy-choice property: A global optimal solution can be arrived at by making locally optimal (greedy) choices

  • Optimal substructure: an optimal solution to the problem contains within it optimal solutions to sub-problems

    • Be able to demonstrate that if A is an optimal solution containing s1, then the set A' = A - {s1} is an optimal solution to a smaller problem without s1.



Analysis

  • The selection function is usually based on the objective function; they may be identical. But, often there are several plausible ones.

  • At every step, the procedure chooses the best candidate, without worrying about the future. It never changes its mind: once a candidate is included in the solution, it is there for good; once a candidate is excluded, it’s never considered again.

  • Greedy algorithms do NOT always yield optimal solutions, but for many problems they do.



Breadth-First Traversal

  • A breadth-first traversal

    • visits a vertex and

    • then each of the vertex's neighbors

    • before advancing

Cost ?

O(|V|+|E|)


Breadth-First Traversal

[Diagram: BFS from a source S; vertices are labeled with their distance (1, 2, 3) from S and shaded to show the three states: Finished, Discovered, Undiscovered.]

Implementation?
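One way to implement this, sketched in Python (the dict-of-lists adjacency representation is an assumption, not from the slides):

    from collections import deque

    def bfs(adj, s):
        # Breadth-first traversal from source s.
        # adj: dict mapping each vertex to a list of its neighbors.
        # Returns {vertex: distance in edges from s} for reachable vertices.
        # Runs in O(|V| + |E|).
        dist = {s: 0}                 # discovered vertices and their distances
        q = deque([s])                # FIFO queue of discovered vertices
        while q:
            u = q.popleft()           # u is now finished
            for v in adj[u]:
                if v not in dist:     # v was undiscovered
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist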


Example (BFS)

[Sequence of diagrams: BFS run from source s on a graph with vertices r, s, t, u, v, w, x, y. Each vertex is labeled with its distance from s as it is discovered, and the queue evolves as follows:]

    Q: s          distances: 0
    Q: w r        1 1
    Q: r t x      1 2 2
    Q: t x v      2 2 2
    Q: x v u      2 2 3
    Q: v u y      2 3 3
    Q: u y        3 3
    Q: y          3
    Q: (empty)

The edges along which new vertices are discovered form a breadth-first tree.


BFS: Application

[Diagram: the breadth-first tree from the example, each vertex labeled with its distance from s.]

  • Solves the shortest path problem for unweighted graphs


Dijkstra's Algorithm (4.4)

[Diagram: a weighted graph with source s and vertices u, v, x, y; edge weights 1, 2, 2, 3, 4, 5, 6, 7, 9, 10.]

An adaptation of BFS


Example

[Sequence of diagrams: Dijkstra's algorithm run from s on the graph above. The shortest-path estimates evolve as vertices are settled:]

    initially:          s = 0, all other estimates infinite
    after settling s:   u = 10, x = 5
    after settling x:   u = 8, v = 14, y = 7
    after settling y:   v = 13
    after settling u:   v = 9
    final:              s = 0, x = 5, y = 7, u = 8, v = 9



Dijkstra’s Algorithm

  • Assumes no negative-weight edges.

  • Maintains a set S of vertices whose shortest path from s has been determined.

  • Repeatedly selects u in V – S with minimum shortest path estimate (greedy choice).

  • Store V – S in a priority queue Q.

    Cost? O(V²) with the array implementation (a binary heap gives O((V + E) log V)).
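A sketch of the heap-based variant in Python, using lazy deletion of stale queue entries (the adjacency-list format with (neighbor, weight) pairs is an assumption):

    import heapq

    def dijkstra(adj, s):
        # Single-source shortest paths; assumes no negative-weight edges.
        # adj: dict mapping vertex -> list of (neighbor, weight) pairs.
        # Returns {vertex: shortest distance from s}.
        dist = {s: 0}
        done = set()                  # the set S of settled vertices
        pq = [(0, s)]                 # priority queue over V - S (lazy deletion)
        while pq:
            d, u = heapq.heappop(pq)  # greedy choice: minimum estimate
            if u in done:
                continue              # stale queue entry, skip
            done.add(u)
            for v, w in adj[u]:
                if v not in dist or d + w < dist[v]:
                    dist[v] = d + w   # relax edge (u, v)
                    heapq.heappush(pq, (dist[v], v))
        return dist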



Minimal Spanning Trees

Minimal Spanning Tree (MST) Problem:

Input: An undirected, connected graph G.

Output: The subgraph of G that

  • keeps the vertices connected;

  • has minimum total cost;

    (the sum of the values of the edges in the subset is at the minimum)


MST Example

[Diagrams: a weighted graph and its minimum spanning tree.]



Greedy Algorithms

  • Kruskal's algorithm. Start with T = ∅. Consider edges in ascending order of cost. Insert edge e in T unless doing so would create a cycle.

  • Prim's algorithm. Start with some root node s and greedily grow a tree T from s outward. At each step, add the cheapest edge e to T that has exactly one endpoint in T.

  • Reverse-Delete algorithm. Start with T = E. Consider edges in descending order of cost. Delete edge e from T unless doing so would disconnect T.

  • All three algorithms produce an MST.



Kruskal’s MST Algorithm

  • choose each vertex to be in its own MST;

  • merge two MST’s that have the shortest edge between them;

  • repeat step 2 until no tree to merge.

  • Implementation ?

    • Union-Find
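A minimal union-find-based sketch in Python (the (weight, u, v) edge format and 0-based vertex labels are assumptions):

    def kruskal(n, edges):
        # Kruskal's MST for vertices 0 .. n-1.
        # edges: list of (weight, u, v) triples; returns the list of MST edges.
        # O(E log E) for the sort; union-find merges are near-constant.
        parent = list(range(n))       # union-find forest: one MST per vertex

        def find(x):                  # root of x's tree, with path halving
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        mst = []
        for w, u, v in sorted(edges):         # ascending order of cost
            ru, rv = find(u), find(v)
            if ru != rv:                      # shortest edge between two trees
                parent[ru] = rv               # merge the two MSTs
                mst.append((u, v, w))
        return mst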



Prim’s MST Algorithm

A greedy algorithm.

  • choose any vertex N to be the MST;

  • grow the tree by picking the least cost edge connected to any vertices in the MST;

  • repeat step 2 until the MST includes all the vertices.
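A heap-based sketch in Python (assuming the same adjacency-list format as above):

    import heapq

    def prim(adj, root):
        # Prim's MST grown outward from `root`.
        # adj: dict mapping vertex -> list of (neighbor, weight) pairs.
        # Returns the list of MST edges; O(E log V) with a binary heap.
        in_tree = {root}
        pq = [(w, root, v) for v, w in adj[root]]  # edges leaving the tree
        heapq.heapify(pq)
        mst = []
        while pq and len(in_tree) < len(adj):
            w, u, v = heapq.heappop(pq)   # least-cost edge out of the tree
            if v in in_tree:
                continue
            in_tree.add(v)
            mst.append((u, v, w))
            for x, wx in adj[v]:
                if x not in in_tree:
                    heapq.heappush(pq, (wx, v, x))
        return mst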



Prim’s MST demo

http://www-b2.is.tokushimau.ac.jp/~ikeda/suuri/dijkstra/PrimApp.shtml?demo1


Huffman Coding

Huffman codes: a very effective technique for compressing data, saving 20%-90%.



Coding

Problem:

  • Consider a data file of 100,000 characters

  • You can safely assume that there are many a's, e's, i's, o's, u's, blanks, and newlines, and few q's, x's, and z's

  • Want to store it compactly

    Solution:

  • Fixed-length code, ex. ASCII, 8 bits per character

  • Variable length code, Huffman code

    (Can take advantage of relative freq of letters to save space)


Example

  • Fixed-length code, need ? bits for each char: 3 (since there are 8 characters)

    Char   Frequency   Code   Total Bits
    E         120      000       360
    L          42      001       126
    D          42      010       126
    U          37      011       111
    C          32      100        96
    M          24      101        72
    K           7      110        21
    Z           2      111         6
    Total                         918

Example (cont.)

[Diagram: the complete binary tree for the fixed-length code, with 0/1 edge labels and leaves E:120, L:42, D:42, U:37, C:32, M:24, K:7, Z:2.]


Example (cont.)

  • Variable length code

    (Can take advantage of relative freq of letters to save space)

    - Huffman codes


Huffman Tree Construction (1)

  1. Associate each char with its weight (= frequency) to form a subtree of one node (char, weight)

  2. Group all subtrees to form a forest

  3. Sort subtrees by ascending weight of subroots

  4. Merge the first two subtrees (ones with lowest weights)

  5. Assign weight of subroot with sum of weights of two children

  6. Repeat 3, 4, 5 until only one tree in the forest
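A sketch of this construction in Python, where a binary heap stands in for the explicit sort-and-merge of steps 3-4 (the {char: frequency} input format is an assumption):

    import heapq
    from itertools import count

    def huffman(freqs):
        # Build Huffman codes from {char: frequency}; returns {char: codeword}.
        tiebreak = count()            # makes entries comparable on equal weight
        heap = [(w, next(tiebreak), (ch,)) for ch, w in freqs.items()]
        heapq.heapify(heap)           # step 1-2: forest of one-node subtrees
        while len(heap) > 1:          # step 6: until one tree remains
            w1, _, t1 = heapq.heappop(heap)      # two lightest subtrees
            w2, _, t2 = heapq.heappop(heap)
            heapq.heappush(heap, (w1 + w2, next(tiebreak), (t1, t2)))
        codes = {}
        def walk(tree, prefix):       # label left edges 0, right edges 1
            if len(tree) == 1:        # leaf: (char,)
                codes[tree[0]] = prefix or "0"
            else:                     # internal node: (left, right)
                walk(tree[0], prefix + "0")
                walk(tree[1], prefix + "1")
        walk(heap[0][2], "")
        return codes

    # Frequencies from the example:
    # huffman({'E': 120, 'L': 42, 'D': 42, 'U': 37,
    #          'C': 32, 'M': 24, 'K': 7, 'Z': 2})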


Huffman Tree Construction (2), (3)

[Diagrams: the forest after successive merge steps, ending with the complete Huffman tree.]


Assigning Codes

    Char   Freq   Code     Bits
    C       32    1110      128
    D       42    101       126
    E      120    0         120
    M       24    11111     120
    K        7    111101     42
    L       42    110       126
    U       37    100       111
    Z        2    111100     12
    Total                   785

Compare with 918: ~15% less


Huffman Coding Tree

[Diagram: the Huffman tree with the codeword bits labeling the edges.]


Coding and Decoding

With the fixed-length code:

  • DEED: 010000000010

  • MUCK: 101011100110

With the Huffman code:

  • DEED: 10100101

  • MUCK: 111111001110111101



Prefix codes

A set of codes is said to meet the prefix property if no code in the set is the prefix of another. Such codes are called prefix codes.

Huffman codes are prefix codes.




Coin Changing

  • Goal. Given currency denominations 1, 5, 10, 25, 100, devise a method to pay a given amount to a customer using the fewest number of coins.

  • Ex: 34¢.

  • Cashier's algorithm. At each iteration, add the coin of the largest value that does not take us past the amount to be paid.

  • Ex: $2.89.


Coin-Changing: Greedy Algorithm

  • Cashier's algorithm. At each iteration, add the coin of the largest value that does not take us past the amount to be paid.

    Sort coin denominations by value: c1 < c2 < … < cn.

    S ← ∅                                (S = coins selected)
    while (x > 0) {
        let k be the largest integer such that ck ≤ x
        if (k = 0)
            return "no solution found"
        x ← x − ck
        S ← S ∪ {k}
    }
    return S

  • Q. Is cashier's algorithm optimal?
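A direct Python translation of the cashier's algorithm (returning coin values rather than indices is a convenience choice, not from the slides):

    def cashiers_algorithm(x, denominations):
        # Greedy coin changing: repeatedly take the largest coin <= x.
        # Returns the list of coin values used, or None if x cannot be
        # paid exactly (possible when there is no 1-unit coin).
        coins = []
        for c in sorted(denominations, reverse=True):
            while x >= c:
                x -= c
                coins.append(c)
        return coins if x == 0 else None

    # Example from the slide: 34 cents.
    # cashiers_algorithm(34, [1, 5, 10, 25, 100]) -> [25, 5, 1, 1, 1, 1]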



Coin-Changing: Analysis of Greedy Algorithm

  • Observation. Greedy algorithm is sub-optimal for US postal denominations: 1, 10, 21, 34, 37, 44, 70, 100, 350, 1225, 1500.

  • Counterexample. 140¢.

    • Greedy: 100, 37, 1, 1, 1.

    • Optimal: 70, 70.

Greedy algorithm failed!


Coin-Changing: Analysis of Greedy Algorithm

  • Theorem. Greed is optimal for U.S. coinage: 1, 5, 10, 25, 100.

  • Proof. (by induction on x)

    • Let ck be the kth smallest coin.

    • Consider an optimal way to change ck ≤ x < ck+1: greedy takes coin k.

    • We claim that any optimal solution must also take coin k.

      • if not, it needs enough coins of type c1, …, ck−1 to add up to x

      • the table below indicates no optimal solution can do this

    • The problem reduces to coin-changing x − ck cents, which, by induction, is optimally solved by the greedy algorithm.

    k    ck    All optimal solutions must satisfy    Max value of coins 1, …, k−1 in any OPT
    1      1   P ≤ 4                                 –
    2      5   N ≤ 1                                 4
    3     10   N + D ≤ 2                             4 + 5 = 9
    4     25   Q ≤ 3                                 20 + 4 = 24
    5    100   no limit                              75 + 24 = 99

    (P = pennies, N = nickels, D = dimes, Q = quarters)


Coin-Changing: Analysis of Greedy Algorithm

  • Theorem. Greed is optimal for U.S. coinage: 1, 5, 10, 25, 100.

    • Consider an optimal way to change ck ≤ x < ck+1: greedy takes coin k.

    • We claim that any optimal solution must also take coin k.

    k    ck    All optimal solutions must satisfy    Max value of coins 1, …, k−1 in any OPT
    1      1   P ≤ 4                                 –
    2      5   N ≤ 1                                 4
    3     10   N + D ≤ 2                             4 + 5 = 9
    4     25   Q ≤ 3                                 20 + 4 = 24
    5    100   no limit                              75 + 24 = 99

  • Kevin's problem: redo the analysis for the coinage 1, 10, 25, 100:

    k    ck    All optimal solutions must satisfy    Max value of coins 1, …, k−1 in any OPT
    1      1   P ≤ 9                                 –
    2     10   P + D ≤ 8                             9
    3     25   Q ≤ 3                                 40 + 4 = 44
    4    100   no limit                              75 + 44 = 119


Knapsack Problem

  • 0-1 knapsack: A thief robbing a store finds n items; the ith item is worth vi dollars and weighs wi pounds, where vi and wi are integers. He wants to take as valuable a load as possible, but he can carry at most W pounds. Which items should he take?

     → Dynamic programming

  • Fractional knapsack: Same setup, but the thief can take fractions of items, instead of making a binary (0-1) choice for each item.


Fractional Knapsack Problem

    Item   Value   Weight
    1        1       1
    2        6       2
    3       18       5
    4       22       6
    5       28       7

  Let W = 11.

  Answer for the 0-1 knapsack problem:

    OPT: { 4, 3 }
    value = 22 + 18 = 40

  • Greedy method: repeatedly add the item with max ratio vi/wi.

  • Value: 28 + (4/6)·22 = 42⅔ > 40
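A sketch of the greedy rule for the fractional variant in Python (the (value, weight) pair format is an assumption); the 0-1 variant is left to dynamic programming, as noted above:

    def fractional_knapsack(items, W):
        # Greedy: take items in decreasing order of value/weight ratio,
        # splitting the last item if it does not fit entirely.
        # items: list of (value, weight) pairs; returns total value obtained.
        total = 0.0
        for value, weight in sorted(items, key=lambda it: it[0] / it[1],
                                    reverse=True):
            if W <= 0:
                break
            take = min(weight, W)    # the whole item, or the fraction that fits
            total += value * take / weight
            W -= take
        return total

    # Instance above:
    # fractional_knapsack([(1, 1), (6, 2), (18, 5), (22, 6), (28, 7)], 11)
    # -> 42.67 (= 28 + (4/6) * 22)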




Activity-selection Problem

[Diagram: activities a through h drawn as intervals on a time line from 0 to 11.]

  • Input: Set S of n activities, a1, a2, …, an.

    • si = start time of activity i.

    • fi = finish time of activity i.

  • Output: Subset A of max # of compatible activities.

    • Two activities are compatible if their intervals don't overlap.



Interval Scheduling: Greedy Algorithms

Greedy template. Consider jobs in some order. Take each job provided it's compatible with the ones already taken.

Earliest start time:

Consider jobs in ascending order of start time sj.

Earliest finish time:

Consider jobs in ascending order of finish time fj.

Shortest interval:

Consider jobs in ascending order of interval length fj - sj.

Fewest conflicts:

For each job, count the number of conflicting jobs cj. Schedule in ascending order of conflicts cj.


Interval Scheduling: Greedy Algorithm

  • Greedy algorithm. Consider jobs in increasing order of finish time. Take each job provided it's compatible with the ones already taken.

    Sort jobs by finish times so that f1 ≤ f2 ≤ … ≤ fn.

    A ← ∅                                (A = jobs selected)
    for j = 1 to n {
        if (job j compatible with A)
            A ← A ∪ {j}
    }
    return A

  • Implementation. O(n log n).

    • Remember job j* that was added last to A.

    • Job j is compatible with A if sj ≥ fj*.
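A Python sketch of the earliest-finish-time rule (the (start, finish) pair format is an assumption):

    def interval_scheduling(jobs):
        # Earliest-finish-time-first: sort by finish time, take each job
        # compatible with the last one taken. O(n log n).
        # jobs: list of (start, finish) pairs.
        selected = []
        last_finish = float("-inf")   # f_j* of the job added last
        for s, f in sorted(jobs, key=lambda job: job[1]):
            if s >= last_finish:      # compatible with everything selected
                selected.append((s, f))
                last_finish = f
        return selected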


Interval Scheduling: Analysis

Theorem. The greedy algorithm is optimal.

Proof: (by contradiction)

  • Assume greedy is not optimal, and let's see what happens.

  • Let i1, …, ik denote the set of jobs selected by greedy.

  • Let j1, …, jm denote the set of jobs in the optimal solution with i1 = j1, i2 = j2, …, ir = jr for the largest possible value of r.

[Diagram: the greedy schedule i1, …, ir, ir+1 drawn above the optimal schedule j1, …, jr, jr+1; job ir+1 finishes before jr+1.]

  • So why not replace job jr+1 with job ir+1? The result is still optimal and agrees with greedy on a bigger value than r (ir+1 = jr+1): contradiction!


Weighted Interval Scheduling

[Diagram: jobs a through h drawn as intervals on a time line from 0 to 11.]

  • Weighted interval scheduling problem.

    • Job j starts at sj, finishes at fj, and has weight or value vj.

    • Two jobs are compatible if they don't overlap.

    • Goal: find the maximum weight subset of mutually compatible jobs.

Greedy algorithm?



Set Covering - one of Karp's 21 NP-complete problems

  • Given:

    • a set of elements B

    • a set S of n sets {Si} whose union equals the universe B

  • Output: cover of B

    • A subset of S whose union = B

  • Cost:

    • Number of sets picked

  • Goal:

    • Minimum cost cover



How many Walmart centers should Walmart build in Ohio?

For each town t, let St = {towns that are within 30 miles of t}; a Walmart center at t will cover all towns in St.


Set Covering - Greedy Approach

    while (not all elements covered)
        pick the Si with the largest number of uncovered elements

Claim: if some k sets cover B, greedy uses at most k ln n sets.

Proof: Let nt be the # of elements not covered after t iterations. Since k sets cover the remaining elements, there must be a set containing ≥ nt/k of them, so

    nt+1 ≤ nt − nt/k = nt(1 − 1/k) ≤ n0(1 − 1/k)^(t+1) ≤ n0 (e^(−1/k))^(t+1),

and nt < 1 when t = k ln n.
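A Python sketch of this greedy rule (the list-of-sets input format and index-based return value are assumptions):

    def greedy_set_cover(universe, subsets):
        # Greedy set cover: repeatedly pick the subset with the largest
        # number of still-uncovered elements (~ln n approximation).
        # universe: set of elements B; subsets: list of sets whose union is B.
        uncovered = set(universe)
        chosen = []                   # indices of picked subsets
        while uncovered:
            i = max(range(len(subsets)),
                    key=lambda j: len(subsets[j] & uncovered))
            if not subsets[i] & uncovered:
                raise ValueError("subsets do not cover the universe")
            chosen.append(i)
            uncovered -= subsets[i]
        return chosen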


Dynamic Programming