Optimality and Efficiency - A Dual Objective Function for Optimization Algorithms


Presentation Transcript


1. Optimality and Efficiency - A Dual Objective Function for Optimization Algorithms. C. L. Liu, National Tsing Hua University, Hsinchu, Taiwan

2. Optimization Problems

3. Constrained Optimization Problems

4. Solution of Optimization Problems Finding a Best possible solution at the Lowest possible cost

5. Solution of Optimization Problems. Cost: Time, Space, Energy, ...

6. Solution of Optimization Problems Finding a Best possible solution at the Lowest possible cost

7. Solution of Optimization Problems Finding a Best possible solution at a Low cost (Low : Polynomial in problem size)

8. Solution of Optimization Problems Finding a Best possible solution at a High cost (High : Exponential in problem size)

9. Solution of Optimization Problems Finding an Acceptable solution at a Low cost

10. Approximation Algorithms (Performance Guaranteed)

11. Approximation Algorithms (Performance Guaranteed)

12. Next Fit Packing Algorithm
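
The transcript preserves only this slide's title, so the worked example is lost. As an illustrative sketch (in Python, with item sizes invented here purely for demonstration), Next Fit keeps a single open bin and starts a new bin whenever the next item does not fit; it is the textbook performance-guaranteed heuristic for bin packing, using at most twice the optimal number of bins.

```python
def next_fit(items, capacity=1.0):
    """Next Fit bin packing: keep one open bin; when an item does not fit,
    close that bin and start a new one.  Assumes every item fits in an empty bin."""
    bins, current, remaining = [], [], capacity
    for size in items:
        if size > remaining:          # item does not fit: close the current bin
            bins.append(current)
            current, remaining = [], capacity
        current.append(size)
        remaining -= size
    if current:
        bins.append(current)
    return bins

# Illustrative run (item sizes are made up for this sketch).
print(next_fit([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5, 0.1, 0.6]))
```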

13. Approximation Algorithms (Performance Guaranteed)

14. Solution of Optimization Problems Finding an Acceptable solution at a tolerably High cost

15. Simulated Annealing (Downhill - Uphill): Non-zero probability for uphill moves; the acceptance probability depends on the magnitude of the uphill move and on the total search time
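
A minimal generic sketch of that acceptance rule in Python; the callables `neighbor` and `cost`, the geometric cooling schedule, and the parameter names are placeholders of mine, not from the talk.

```python
import math
import random

def simulated_annealing(start, neighbor, cost, t_start=1.0, t_end=1e-3, steps=10000):
    """Accept every downhill move; accept an uphill move of size delta with
    probability exp(-delta / t), where the temperature t shrinks as the
    search proceeds (so the probability also depends on total search time)."""
    x, fx = start, cost(start)
    best, fbest = x, fx
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)   # geometric cooling
        y = neighbor(x)
        fy = cost(y)
        delta = fy - fx                                   # > 0 means uphill
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest
```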

16. Genetic Algorithm: Start with an initial generation of solutions (a finite number, n, of randomly chosen solutions). Crossover operator: combine two current solutions to yield one or more offspring. Mutation operator: mutate a current solution to produce an offspring. Select from the pool the next generation of n solutions according to a fitness criterion. Stop according to some stopping criterion: number of generations, or incremental improvement of the best solution
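
The same slide as a skeleton in Python; `init_pop`, `crossover`, `mutate`, and `fitness` are problem-specific callables the caller supplies, and the numeric defaults are arbitrary.

```python
import random

def genetic_algorithm(init_pop, crossover, mutate, fitness,
                      n=50, generations=200, mutation_rate=0.1):
    """Follow the slide's recipe: random initial generation, crossover and
    mutation to produce offspring, fitness-based selection of the next
    generation, and a fixed number of generations as the stopping criterion."""
    population = init_pop(n)                      # n randomly chosen solutions
    for _ in range(generations):
        offspring = []
        while len(offspring) < n:
            a, b = random.sample(population, 2)
            child = crossover(a, b)               # combine two current solutions
            if random.random() < mutation_rate:
                child = mutate(child)             # mutate to produce an offspring
            offspring.append(child)
        # keep the n fittest solutions from the combined pool
        population = sorted(population + offspring, key=fitness, reverse=True)[:n]
    return max(population, key=fitness)
```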

17. An Example: An optimal solution; an efficient algorithm; a special case of a well-studied problem

18. Linear Programming

19. Linear Programming

20. Linear Programming A consistent system (non-empty solution space)

21. Linear Programming: A redundant constraint a_{i1} x_1 + a_{i2} x_2 + ... + a_{im} x_m ≤ b_i
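
The slide only defines redundancy; one standard way to test it (not necessarily the method used in the talk) is to maximize the constraint's left-hand side subject to the remaining constraints and check that the optimum never exceeds b_i. A sketch using SciPy, with a toy system chosen here only for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def is_redundant(A, b, i):
    """Constraint i of A x <= b is redundant iff max a_i . x over the other
    constraints is still <= b_i (so dropping it changes nothing)."""
    mask = np.arange(len(b)) != i
    res = linprog(c=-A[i],                         # linprog minimizes, so negate
                  A_ub=A[mask], b_ub=b[mask],
                  bounds=[(None, None)] * A.shape[1],
                  method="highs")
    return res.status == 0 and -res.fun <= b[i] + 1e-9

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0, 3.0])       # x + y <= 3 follows from x <= 1 and y <= 1
print(is_redundant(A, b, 2))        # True
```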

22. Linear Programming Remove a largest possible subset of redundant constraints

23. Linear Programming A smallest possible set of constraints

24. Graph Constraints: Consistency; a redundant constraint; removal of a largest possible subset of constraints; a smallest possible subset of graph constraints

25. Graph Constraints and Constraint Graphs: A graph constraint; the constraint graph of a system of graph constraints

26. Consistency Theorem: A system of graph constraints is consistent if and only if its constraint graph has no directed positive cycle
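
The slides state the theorem but not a procedure; a direct way to apply it is Bellman-Ford-style relaxation of longest-path distances, which can keep improving forever exactly when a directed positive cycle exists. A sketch (the vertex numbering and the example edges are mine):

```python
def has_positive_cycle(n, edges):
    """edges is a list of (u, v, w) with vertices 0..n-1.  Relax longest-path
    distances from a virtual source (all zeros); if any distance still improves
    in the n-th round, the graph has a directed positive cycle, i.e. the
    constraint system is inconsistent by the consistency theorem."""
    dist = [0.0] * n
    for _ in range(n):
        changed = False
        for u, v, w in edges:
            if dist[u] + w > dist[v]:
                dist[v] = dist[u] + w
                changed = True
        if not changed:
            return False              # converged: no positive cycle
    return True

edges = [(0, 1, 2.0), (1, 2, 3.0), (2, 0, -4.0)]   # cycle weight +1
print(has_positive_cycle(3, edges))                 # True -> inconsistent
```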

27. Redundancy: An edge (u, v) is redundant if there is a (directed) path P from u to v such that w(u, v) ≤ w(P) and (u, v) is not on P
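
Following that definition, an edge can be tested by removing it and asking whether some remaining u-to-v path is at least as long. A sketch assuming the graph has already passed the consistency test (no positive cycles), so longest paths are well defined:

```python
def longest_path_from(n, edges, src):
    """Single-source longest-path distances by Bellman-Ford relaxation
    (valid only when the graph has no directed positive cycle)."""
    neg_inf = float("-inf")
    dist = [neg_inf] * n
    dist[src] = 0.0
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] != neg_inf and dist[u] + w > dist[v]:
                dist[v] = dist[u] + w
    return dist

def is_redundant_edge(n, edges, e):
    """e = (u, v, w) is redundant if some other directed path from u to v
    has length >= w, i.e. the edge adds no new constraint."""
    u, v, w = e
    rest = [f for f in edges if f != e]            # drop the edge being tested
    return longest_path_from(n, rest, u)[v] >= w
```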

28. Redundancy

29. Redundancy

30. Removal of Redundant Constraints. Theorem: The problem of removing a largest possible subset of redundant constraints is NP-complete. Proof: Reduction from the directed Hamiltonian Cycle problem

31. Optimal Reduction

32. Zero Cycles. Theorem: If G contains no zero cycle, then G - R_G is the unique optimal reduction of G, where R_G is the set of all redundant edges of G

33. Zero Cycles

34. Optimal Reduction: G1 and G2 are constraint graphs with the same vertex set. W*_G1(u, v) is the length of the longest path from u to v in G1, and W*_G2(u, v) is the length of the longest path from u to v in G2. Theorem: G1 and G2 define the same solution space if and only if W*_G1(u, v) = W*_G2(u, v) for all u and v
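
The theorem compares all-pairs longest-path matrices, which can be computed with a Floyd-Warshall-style recurrence; a sketch (again assuming no directed positive cycles):

```python
NEG_INF = float("-inf")

def all_pairs_longest(n, edges):
    """Floyd-Warshall adapted to longest paths: W[u][v] is the length of the
    longest directed path from u to v, or -inf if v is unreachable from u."""
    W = [[NEG_INF] * n for _ in range(n)]
    for i in range(n):
        W[i][i] = 0.0
    for u, v, w in edges:
        W[u][v] = max(W[u][v], w)        # keep the strongest parallel edge
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if W[i][k] != NEG_INF and W[k][j] != NEG_INF and W[i][k] + W[k][j] > W[i][j]:
                    W[i][j] = W[i][k] + W[k][j]
    return W

def same_solution_space(n, edges1, edges2):
    """Per the theorem: G1 and G2 define the same solution space iff their
    longest-path matrices coincide."""
    return all_pairs_longest(n, edges1) == all_pairs_longest(n, edges2)
```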

35. Optimal Reduction: For a given G, find G0 such that (i) W*_G(u, v) = W*_G0(u, v) for all u and v, and (ii) G0 has the minimum number of edges

36. Constraint Graph

37. Dependency Relation: u and v are dependent if (i) u = v, or (ii) there is a closed walk of total weight zero that contains both u and v. Equivalently, u and v are dependent if and only if W*_G(u, v) + W*_G(v, u) = 0
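
A small helper built on the longest-path matrix from the sketch above (the argument W is whatever all_pairs_longest returns; -inf marks "no path"):

```python
def dependent(W, u, v):
    """u and v are dependent iff u == v or they lie on a closed walk of total
    weight zero, which by the slide's criterion means W[u][v] + W[v][u] == 0."""
    if u == v:
        return True
    neg_inf = float("-inf")
    return W[u][v] != neg_inf and W[v][u] != neg_inf and W[u][v] + W[v][u] == 0
```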

38. Dependency Relation

39. Step 1

40. Step 2

41. Step 3

42. Step 4

43. Step 5

44. Why Minimum ?

45. Extensions: a x_i ? b x_j ? c;  x_i ? x_j = x_k ? x_l;  x_i ? x_j ? x_k ? c

46. Final Remark
