
Linear Programming: A Mathematical Optimization Technique

Learn about linear programming, a mathematical method for selecting the best solution from available options, and how it can be applied in different scenarios.


Presentation Transcript


  1. CSV881: Low-Power Design. Linear Programming – A Mathematical Optimization Technique. Vishwani D. Agrawal, James J. Danaher Professor, Dept. of Electrical and Computer Engineering, Auburn University, Auburn, AL 36849. vagrawal@eng.auburn.edu, http://www.eng.auburn.edu/~vagrawal. Lectures 8 and 9: Linear Programming

  2. What is Linear Programming • Linear programming (LP) is a mathematical method for selecting the best solution from the available solutions of a problem. • Method: • State the problem and define variables whose values will be determined. • Develop a linear programming model: • Write the problem as an optimization formula (a linear expression to be minimized or maximized) • Write a set of linear constraints • An available LP solver (computer program) gives the values of variables.

  3. Types of LPs • LP – all variables are real. • ILP – all variables are integers. • MILP – some variables are integers, others are real. • A reference: • S. I. Gass, An Illustrated Guide to Linear Programming, New York: Dover, 1990.

  4. A Single-Variable Problem • Consider variable x. • Problem: find the maximum value of x subject to the constraint 0 ≤ x ≤ 15. • Solution: x = 15. [Figure: number line from 0 to 15; the constraint is satisfied over the whole interval and the solution lies at x = 15.]

  5. Single Variable Problem (Cont.) • Consider more complex constraints: • Maximize x, subject to the following constraints: • x ≥ 0 (1) • 5x ≤ 75 (2) • 6x ≤ 30 (3) • x ≤ 10 (4) • All constraints are satisfied for 0 ≤ x ≤ 5, so the solution is x = 5. [Figure: number line from 0 to 15 showing the ranges allowed by constraints (1)-(4).]
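As a quick illustration, here is a minimal sketch of this single-variable problem solved with SciPy's linprog routine (an assumption; any LP solver would do). linprog minimizes, so maximizing x is written as minimizing -x:

    # Maximize x subject to x >= 0, 5x <= 75, 6x <= 30, x <= 10 (slide 5).
    from scipy.optimize import linprog

    res = linprog(
        c=[-1],                  # minimize -x  <=>  maximize x
        A_ub=[[5], [6], [1]],    # 5x <= 75, 6x <= 30, x <= 10
        b_ub=[75, 30, 10],
        bounds=[(0, None)],      # x >= 0
    )
    print(res.x)                 # [5.] -- constraint (3) is the binding one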

  6. A Two-Variable Problem • Manufacture of chairs and tables: • Resources available: • Material: 400 boards of wood • Labor: 450 man-hours • Profit: • Chair: $45 • Table: $80 • Resources needed: • Chair • 5 boards of wood • 10 man-hours • Table • 20 boards of wood • 15 man-hours • Problem: How many chairs and how many tables should be manufactured to maximize the total profit?

  7. Formulating Two-Variable Problem • Manufacture x1 chairs and x2 tables to maximize profit: P = 45x1 + 80x2 dollars • Subject to given resource constraints: • 400 boards of wood, 5x1 + 20x2 ≤ 400 (1) • 450 man-hours of labor, 10x1 + 15x2 ≤ 450 (2) • x1 ≥ 0 (3) • x2 ≥ 0 (4)
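The chair-and-table model can be handed to the same solver. A sketch, again assuming SciPy's linprog (which minimizes, so the profit is negated):

    # Maximize P = 45*x1 + 80*x2 subject to the wood and labor constraints (slide 7).
    from scipy.optimize import linprog

    res = linprog(
        c=[-45, -80],                    # minimize -P  <=>  maximize P
        A_ub=[[5, 20], [10, 15]],        # wood: 5x1 + 20x2 <= 400; labor: 10x1 + 15x2 <= 450
        b_ub=[400, 450],
        bounds=[(0, None), (0, None)],   # x1, x2 >= 0
    )
    print(res.x, -res.fun)               # [24. 14.] 2200.0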

  8. Solution: Two-Variable Problem • Best solution: 24 chairs, 14 tables. • Profit = 45×24 + 80×14 = 2200 dollars. [Figure: graphical solution in the (x1, x2) plane; the material and man-power constraint lines bound the feasible region, and the profit line P = 2200 touches it at the corner point (24, 14).]

  9. Change Profit of Chair to $64/Unit • Manufacture x1 chairs and x2 tables to maximize profit: P = 64x1 + 80x2 dollars • Subject to given resource constraints: • 400 boards of wood, 5x1 + 20x2 ≤ 400 (1) • 450 man-hours of labor, 10x1 + 15x2 ≤ 450 (2) • x1 ≥ 0 (3) • x2 ≥ 0 (4)

  10. Solution: $64 Profit/Chair • Best solution: 45 chairs, 0 tables. • Profit = 64×45 + 80×0 = 2880 dollars. [Figure: with the steeper profit line, the optimum moves to the corner point (45, 0) on the man-power constraint line.]

  11. A Dual Problem • Explore an alternative. • Question: should we make tables and chairs, or auction off the available resources? • To answer this we need to know: what is the minimum price for the resources that would give us the same revenue from their sale as the profit from making tables and chairs? • This is the dual of the original problem.

  12. Formulating the Dual Problem • Revenue received by selling off resources: w1 for each board of wood, w2 for each man-hour. • Minimize 400w1 + 450w2 • Subject to constraints: • 5w1 + 10w2 ≥ 45 • 20w1 + 15w2 ≥ 80 • w1 ≥ 0 • w2 ≥ 0 • (Recall the data: 400 boards and 450 man-hours available; profit $45 per chair, $80 per table; a chair needs 5 boards and 10 man-hours, a table 20 boards and 15 man-hours.)
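A sketch of this dual LP in the same style (assuming SciPy's linprog; the ≥ constraints are negated into ≤ form). It recovers the resource prices w1 = $1, w2 = $4 and the value $2200 quoted on slide 14:

    # Minimize 400*w1 + 450*w2 subject to 5w1 + 10w2 >= 45 and 20w1 + 15w2 >= 80 (slide 12).
    from scipy.optimize import linprog

    res = linprog(
        c=[400, 450],                    # total value of the resources
        A_ub=[[-5, -10], [-20, -15]],    # >= constraints rewritten as <=
        b_ub=[-45, -80],
        bounds=[(0, None), (0, None)],   # w1, w2 >= 0
    )
    print(res.x, res.fun)                # [1. 4.] 2200.0 -- equals the primal optimum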

  13. The Duality Theorem • If the primal has a finite optimum solution, so does the dual, and the optimum values of the objective functions are equal.

  14. Primal-Dual Problems
  Primal problem (fixed resources, maximize profit). Variables: x1 (number of chairs), x2 (number of tables). Maximize profit 45x1 + 80x2 subject to: 5x1 + 20x2 ≤ 400, 10x1 + 15x2 ≤ 450, x1 ≥ 0, x2 ≥ 0. Solution: x1 = 24 chairs, x2 = 14 tables, profit = $2200.
  Dual problem (fixed profit, minimize value). Variables: w1 ($ value/board of wood), w2 ($ value/man-hour). Minimize value 400w1 + 450w2 subject to: 5w1 + 10w2 ≥ 45, 20w1 + 15w2 ≥ 80, w1 ≥ 0, w2 ≥ 0. Solution: w1 = $1, w2 = $4, value = $2200.

  15. LP for n Variables • Objective function: minimize Σj=1..n cj xj • Subject to: Σj=1..n aij xj ≤ bi, i = 1, 2, …, m, and Σj=1..n cij xj = di, i = 1, 2, …, p • Variables: xj • Constants: cj, aij, bi, cij, di
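As a rough guide to how this general form is passed to a solver, the sketch below maps cj, aij, bi, cij and di onto linprog's arguments; the small arrays are arbitrary illustrative data, not taken from the slides:

    from scipy.optimize import linprog

    c = [1, 2, 3]                # objective coefficients c_j
    A = [[1, 1, 0], [0, 1, 1]]   # a_ij of the m inequality constraints
    b = [10, 8]                  # b_i
    C = [[1, 1, 1]]              # c_ij of the p equality constraints
    d = [6]                      # d_i

    res = linprog(c, A_ub=A, b_ub=b, A_eq=C, b_eq=d, bounds=[(0, None)] * 3)
    print(res.x, res.fun)        # [6. 0. 0.] 6.0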

  16. Algorithms for Solving LP • Simplex method: G. B. Dantzig, Linear Programming and Extensions, Princeton, New Jersey: Princeton University Press, 1963. • Ellipsoid method: L. G. Khachiyan, “A Polynomial Algorithm for Linear Programming,” Soviet Math. Dokl., vol. 20, pp. 191-194, 1979. • Interior-point method: N. K. Karmarkar, “A New Polynomial-Time Algorithm for Linear Programming,” Combinatorica, vol. 4, pp. 373-395, 1984. • Course website of Prof. Lieven Vandenberghe (UCLA), http://www.ee.ucla.edu/ee236a/ee236a.html

  17. Basic Ideas of Solution Methods • Simplex method: search among the extreme points of the feasible region defined by the constraints; exponential worst-case complexity in n, the number of variables, but usually fast in practice. • Interior-point methods: successively iterate through the interior of the feasible region using analytic convex boundaries; complexity O(n^3.5 L), where L is the size of the input data in bits. [Figure: sketches of the feasible polytope showing the objective function, the constraints, and the extreme points for each method.]

  18. Integer Linear Programming (ILP) • Variables are integers. • Worst-case complexity is exponential – higher than LP. • LP relaxation: convert all variables to real, preserving their ranges. • The LP solution provides guidance; rounding the LP solution can give a non-optimal (or even infeasible) solution.
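A small sketch of LP relaxation and rounding on a made-up 0/1 problem (assuming SciPy >= 1.9, which provides scipy.optimize.milp for the integer version). The relaxation gives a bound of 22, the true ILP optimum is 21, and rounding the LP solution down reaches only 19:

    import numpy as np
    from scipy.optimize import Bounds, LinearConstraint, linprog, milp

    values  = np.array([8, 11, 6, 4])     # maximize total value
    weights = np.array([[5, 7, 4, 3]])    # subject to total weight <= 14

    # LP relaxation: variables allowed to be real in [0, 1]; gives an upper bound.
    lp = linprog(-values, A_ub=weights, b_ub=[14], bounds=[(0, 1)] * 4)
    print(lp.x, -lp.fun)    # [1. 1. 0.5 0.]  22.0

    # ILP: the same model with integrality enforced.
    ilp = milp(c=-values,
               constraints=LinearConstraint(weights, ub=[14]),
               integrality=np.ones(4),
               bounds=Bounds(0, 1))
    print(ilp.x, -ilp.fun)  # [0. 1. 1. 1.]  21.0 -- rounding the LP down gives only 19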

  19. Traveling Salesperson Problem (TSP) [Figure: five cities, numbered 1-5, joined by edges labeled with inter-city distances in miles.]

  20. Solving TSP: Five Cities • Distances (dij) in miles. • This is a symmetric TSP; the general TSP is asymmetric. [Table: the 5 × 5 matrix of inter-city distances dij.]

  21. Search Space: No. of Tours • Asymmetric TSP tours: • Five-city problem: 4 × 3 × 2 × 1 = 24 tours • Ten-city problem: 362,880 tours • 15-city problem: 87,178,291,200 tours • 50-city problem: 49! ≈ 6.08×10^62 tours • Time for enumerative search, assuming 1 μs per tour evaluation ≈ 1.9×10^49 years
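The tour counts above are easy to reproduce with Python's math.factorial; a quick sketch:

    # An asymmetric n-city TSP has (n - 1)! distinct tours from a fixed start city.
    import math

    for n in (5, 10, 15, 50):
        print(n, math.factorial(n - 1))      # 24, 362880, 87178291200, ~6.08e62

    # Exhaustive search time for 50 cities at 1 microsecond per tour evaluation:
    seconds = math.factorial(49) * 1e-6
    print(seconds / (365.25 * 24 * 3600))    # about 1.9e49 years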

  22. A Greedy Heuristic Solution Tour length = 10 + 5 + 12 + 6 + 27 = 60 miles (non-optimal)

  23. ILP Variables, Constants and Constraints • Integer variables: xij ∈ [0, 1], where xij = 1 means travel from i to j and xij = 0 means do not travel from i to j. • Real constants: dij = distance from i to j. • For city 1: x12 + x13 + x14 + x15 = 1, with four other similar equations, one for each of the remaining cities. [Figure: city 1 with arcs to cities 2-5 labeled x12–x15 and d12 = 18, d13 = 10, d14 = 12, d15 = 27.]

  24. Objective Function and ILP Solution • Minimize Σi=1..5 Σj=1..i−1 xij × dij • Subject to: Σj=1..5, j≠i xij = 1, for all i, i.e., every node i has exactly one outgoing edge.

  25. ILP Solution • With only the outgoing-edge constraints, the ILP picks the arcs 2→1 (d21 = 18), 1→3 (d13 = 10), 3→2 (d32 = 5), 4→5 (d45 = 6) and 5→4 (d54 = 6). • Total length = 45, but this is not a single tour: it splits into the subtours 1–3–2 and 4–5.

  26. Additional Constraints for Single Tour • The following constraints prevent split tours. For any subset S of the cities, the tour must enter and exit that subset: Σi∈S, j∉S xij ≥ 2, for all S with |S| < 5, where the sum counts the arcs crossing the boundary of S in either direction. • At least two arrows must cross the boundary between any subset and the remaining set.

  27. ILP Solution • With the subtour constraints added, the ILP returns the single tour 1 → 3 → 2 → 5 → 4 → 1, using d13 = 10, d32 = 5, d25 = 20, d54 = 6 and d41 = 12. • Total length = 53.
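The complete five-city formulation of slides 23-26 can be written out in a few lines. Below is a sketch using the PuLP package (an assumption; any ILP solver would do). Distances d12, d13, d14, d15, d23, d24, d25 and d45 are the ones visible on the slides; d34 and d35 are not legible in this transcript and are assumed values here, and with them the solver reproduces the length-53 tour of slide 27:

    import itertools
    import pulp

    d = {(1, 2): 18, (1, 3): 10, (1, 4): 12, (1, 5): 27, (2, 3): 5,
         (2, 4): 12, (2, 5): 20, (3, 4): 15, (3, 5): 19, (4, 5): 6}   # (3,4), (3,5) assumed
    cities = range(1, 6)
    dist = {(i, j): d[min(i, j), max(i, j)] for i in cities for j in cities if i != j}

    prob = pulp.LpProblem("five_city_tsp", pulp.LpMinimize)
    x = {(i, j): pulp.LpVariable(f"x_{i}_{j}", cat="Binary") for (i, j) in dist}

    # Objective: total length of the selected arcs (slide 24).
    prob += pulp.lpSum(dist[i, j] * x[i, j] for (i, j) in dist)

    # Every city has exactly one outgoing arc (slides 23-24).
    for i in cities:
        prob += pulp.lpSum(x[i, j] for j in cities if j != i) == 1

    # Subtour elimination (slide 26): at least two arcs cross the boundary of
    # every proper subset S of cities.
    for r in range(1, 5):
        for S in itertools.combinations(cities, r):
            outside = [j for j in cities if j not in S]
            prob += pulp.lpSum(x[i, j] + x[j, i] for i in S for j in outside) >= 2

    prob.solve(pulp.PULP_CBC_CMD(msg=0))
    tour = sorted((i, j) for (i, j) in dist if x[i, j].value() > 0.5)
    print(tour, pulp.value(prob.objective))   # a single tour of total length 53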

  28. Characteristics of ILP • Worst-case complexity is exponential in number of variables. • Linear programming (LP) relaxation, where integer variables are treated as real, gives a lower bound on the objective function. • Recursive rounding of relaxed LP solution to nearest integers gives an approximate solution to the ILP problem. • K. R. Kantipudi and V. D. Agrawal, “A Reduced Complexity Algorithm for Minimizing N-Detect Tests,” Proc. 20th International Conf. VLSI Design, January 2007, pp. 492-497.

  29. Why is the ILP Solution Exponential? • The LP solution is found in polynomial time and bounds the ILP solution. • In the worst case, all 2^n round-off points around the LP optimum must be tried. [Figure: two-variable example showing the constraints, the objective (maximize), the LP optimum, and its neighboring integer points.]

  30. ILP Example: Test Minimization • A combinational circuit has n test vectors that detect m faults. Each test detects a subset of the faults. Find the smallest subset of test vectors that detects all m faults. • ILP model: • Assign an integer variable ti ∈ [0, 1] to the ith test vector Ti such that ti = 1 means we select Ti and ti = 0 means we eliminate Ti. • Define an integer constant fij ∈ [0, 1] such that fij = 1 if the ith vector Ti detects the jth fault Fj, and fij = 0 otherwise. • The values of the constants fij are determined by fault simulation.

  31. Test Data • The data form an n-test × m-fault table in which fij = 1 means vector Ti detects fault Fj. • Select test Ti if ti = 1. [Figure: the n × m detection table.]

  32. Test Minimization by ILP • Objective function: minimize Σi=1..n ti • Subject to: Σi=1..n fij ti ≥ 1, j = 1, 2, …, m
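A sketch of this set-covering ILP with scipy.optimize.milp (assuming SciPy >= 1.9). The 4-test by 5-fault detection matrix below is made up for illustration; in practice the fij come from fault simulation:

    import numpy as np
    from scipy.optimize import Bounds, LinearConstraint, milp

    # f[i, j] = 1 if test vector Ti detects fault Fj (hypothetical data).
    f = np.array([[1, 0, 1, 0, 0],
                  [0, 1, 1, 0, 1],
                  [1, 1, 0, 0, 0],
                  [0, 0, 0, 1, 1]])
    n_tests, n_faults = f.shape

    res = milp(
        c=np.ones(n_tests),                                       # minimize the number of tests
        constraints=LinearConstraint(f.T, lb=np.ones(n_faults)),  # every fault detected at least once
        integrality=np.ones(n_tests),                             # t_i integer
        bounds=Bounds(0, 1),                                      # t_i in [0, 1]
    )
    print(res.x, res.fun)   # a minimum cover of 3 tests (several exist for this table)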

  33. Four-Bit ALU Circuit: 74181 • 14 inputs, 8 outputs. [Figure: schematic of the 74181.]

  34. Finding LP/ILP Solvers • R. Fourer, D. M. Gay and B. W. Kernighan, AMPL: A Modeling Language for Mathematical Programming, South San Francisco, California: Scientific Press, 1993. Several of the programs described in this book are available to Auburn users. • B. R. Hunt, R. L. Lipsman, J. M. Rosenberg, K. R. Coombes, J. E. Osborn and G. J. Stuck, A Guide to MATLAB for Beginners and Experienced Users, Cambridge University Press, 2006. • Search the web; many programs that handle a small number of variables can be downloaded free.

  35. A Circuit Optimization Problem • Given: • Circuit netlist • Cell library with multiple versions for each cell • Select cell versions to optimize a specified characteristic of the circuit. Typical characteristics are: • Area • Power • Delay • Example: Minimize power for given delay.

  36. Gate Library: NAND(X), X = 0 or 1 • X: an integer variable for each gate. • X = 0: choose the gate version with small delay. Delay = d × fo, Power = 3 × p × fo, where fo = number of fanouts of the gate and d, p are parameters of the technology. • X = 1: choose the low-power gate version. Delay = 2 × d × fo, Power = 0.5 × p × fo. • Normalized gate delay = [(1 – X) + 2X] × fo • Normalized power = [3(1 – X) + 0.5X] × fo • Normalization: d = 1, p = 1
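A tiny sketch of this normalized cell model (d = p = 1), useful when building the constraints on the following slides:

    def gate_delay(X, fo):
        # X = 0: fast cell, delay = fo; X = 1: low-power cell, delay = 2*fo.
        return ((1 - X) + 2 * X) * fo

    def gate_power(X, fo):
        # X = 0: fast cell, power = 3*fo; X = 1: low-power cell, power = 0.5*fo.
        return (3 * (1 - X) + 0.5 * X) * fo

    print(gate_delay(0, 3), gate_power(0, 3))   # 3 9.0  -- fast version of a fanout-3 gate
    print(gate_delay(1, 3), gate_power(1, 3))   # 6 1.5  -- low-power version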

  37. Example: One-Bit Full Adder [Figure: gate-level one-bit full adder with inputs A, B, C and outputs SUM and CO; each gate is labeled with its number of fanouts, fo.]

  38. Define Arrival Time Variables, Tk • Tk = latest signal arrival time at the output of gate k. • T1 = T2 = T3 = 0 at the primary inputs A, B, C. [Figure: the full adder annotated with arrival-time variables T4–T12 and the fanout count of each gate.]

  39. Constraint: Gate k in the Circuit • Ti = signal arrival time at the ith input of gate k • Tk = signal arrival time at the output of gate k • Tk ≥ Ti + (1 – Xk) fo(k) + 2 Xk fo(k), for all i • Where fo(k) = fanout number of gate k; Xk = 0 chooses the fast cell for k, Xk = 1 chooses the low-power cell for k.

  40. Arrival Time Constraints on Gate 7 • Gate 7 has fanout 2 and receives its inputs from gates 5 and 6, so: • T7 ≥ T5 + (1 – X7) × 2 + 2 X7 × 2 • T7 ≥ T6 + (1 – X7) × 2 + 2 X7 × 2 • T1 = T2 = T3 = 0 at the primary inputs. [Figure: the full adder with gate 7 and its input signals highlighted.]

  41. Clock Constraints • Ti = 0, for all primary inputs i • To ≤ Tc, the clock period, for all primary outputs o [Figure: combinational logic between two registers driven by a common clock.]

  42. Critical Path Constraints • For the full adder, the primary output constraints are T9 ≤ Tc and T12 ≤ Tc, with T1 = T2 = T3 = 0 at the primary inputs. [Figure: the full adder with the two primary output arrival times marked.]

  43. Optimization Function • Minimize Σ over all gates k of [3(1 – Xk) + 0.5 Xk] × fo(k)
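Putting slides 38-43 together, the sketch below builds the full cell-selection MILP for a small hypothetical three-gate netlist (the full-adder connectivity is not fully legible in this transcript), again using the PuLP package as an assumed solver. Sweeping the clock period Tc reproduces the kind of power-delay trade-off shown on the next slide:

    import pulp

    fanout = {"g1": 2, "g2": 1, "g3": 1}             # hypothetical fanout counts
    fanin  = {"g1": [], "g2": ["g1"], "g3": ["g1"]}  # gate-to-gate connections only
    outputs = ["g2", "g3"]                           # gates driving primary outputs

    def min_power(Tc):
        prob = pulp.LpProblem("cell_selection", pulp.LpMinimize)
        X = {g: pulp.LpVariable(f"X_{g}", cat="Binary") for g in fanout}   # cell choice
        T = {g: pulp.LpVariable(f"T_{g}", lowBound=0) for g in fanout}     # arrival time

        # Objective (slide 43): total normalized power.
        prob += pulp.lpSum((3 * (1 - X[g]) + 0.5 * X[g]) * fanout[g] for g in fanout)

        for g in fanout:
            delay = (1 - X[g]) * fanout[g] + 2 * X[g] * fanout[g]
            prob += T[g] >= delay              # primary inputs arrive at time 0 (slide 41)
            for i in fanin[g]:
                prob += T[g] >= T[i] + delay   # slide 39: T_k >= T_i + delay of gate k

        for g in outputs:
            prob += T[g] <= Tc                 # slide 41: outputs must meet the clock period

        prob.solve(pulp.PULP_CBC_CMD(msg=0))
        return pulp.value(prob.objective)

    for Tc in (3, 4, 6):
        print(Tc, min_power(Tc))   # 12.0, 7.0, 2.0 -- power falls as the delay budget grows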

  44. Typical Result [Figure: normalized power versus normalized delay (Tc); the power drops from 45 at Tc = 11 to 7.5 at Tc = 22.]
