
MATH 685/ CSI 700/ OR 682 Lecture Notes



  1. MATH 685/ CSI 700/ OR 682 Lecture Notes. Lecture 9: Optimization problems.

  2. Optimization

  3. Optimization problems

  4. Examples

  5. Global vs. local optimization

  6. Global optimization
  • Finding, or even verifying, a global minimum is difficult in general.
  • Most optimization methods are designed to find a local minimum, which may or may not be the global minimum.
  • If the global minimum is desired, one can try several widely separated starting points and check whether they all produce the same result (see the multistart sketch below).
  • For some problems, such as linear programming, global optimization is more tractable.
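
A minimal multistart sketch of this idea in Python (the Rosenbrock test function and the use of scipy.optimize.minimize are illustrative choices, not part of the lecture):

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Nonconvex test function with global minimum at (1, 1).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

# Run a local method from several widely separated starting points
# and compare the results.
rng = np.random.default_rng(0)
starts = rng.uniform(-5, 5, size=(10, 2))
results = [minimize(rosenbrock, x0) for x0 in starts]
best = min(results, key=lambda r: r.fun)
print(best.x, best.fun)  # agreement across starts suggests, but does not prove, a global minimum
```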

  7. Existence of Minimum

  8. Level sets

  9. Uniqueness of minimum

  10. First-order optimality condition

  11. Second-order optimality condition

  12. Constrained optimality

  13. Constrained optimality

  14. Constrained optimality

  15. Constrained optimality • If inequality constraints are present, the Karush-Kuhn-Tucker (KKT) optimality conditions additionally require nonnegativity of the Lagrange multipliers corresponding to the inequalities, together with a complementarity condition (written out below).
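
For reference, writing the problem as min f(x) subject to g_i(x) ≤ 0 and h_j(x) = 0 (a standard sign convention, assumed here), the KKT conditions at a candidate x* with multipliers μ and λ read:

\begin{aligned}
\nabla f(x^*) + \sum_i \mu_i \nabla g_i(x^*) + \sum_j \lambda_j \nabla h_j(x^*) &= 0, \\
h_j(x^*) = 0, \qquad g_i(x^*) &\le 0, \\
\mu_i &\ge 0, \\
\mu_i \, g_i(x^*) &= 0 \quad \text{(complementarity)}.
\end{aligned}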

  16. Sensitivity and conditioning

  17. Unimodality

  18. Golden section search

  19. Golden section search

  20. Golden section search
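
For reference, a minimal Python sketch of golden section search, assuming f is unimodal on [a, b] (function and tolerance names here are illustrative):

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    # Invariant: a minimum of the unimodal f lies in [a, b].  Each iteration
    # shrinks the bracket by 1/phi ~ 0.618 and reuses one interior function
    # value, so only one new evaluation of f is needed per iteration.
    invphi = (math.sqrt(5) - 1) / 2
    x1 = b - invphi * (b - a)
    x2 = a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                      # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:                            # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

print(golden_section_search(lambda x: (x - 2)**2, 0, 5))  # ~2.0
```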

  21. Example

  22. Example (cont.)

  23. Successive parabolic interpolation
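
A bare-bones sketch of the idea: fit a parabola through three points, jump to its vertex, then replace the oldest point (the safeguards used in practice, e.g. in Brent's method, are omitted here):

```python
def parabolic_min(a, b, c, fa, fb, fc):
    # Vertex of the parabola interpolating (a, fa), (b, fb), (c, fc).
    num = (b - a)**2 * (fb - fc) - (b - c)**2 * (fb - fa)
    den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
    return b - 0.5 * num / den           # breaks down if den == 0

def spi(f, a, b, c, tol=1e-10, maxit=100):
    for _ in range(maxit):
        x = parabolic_min(a, b, c, f(a), f(b), f(c))
        if abs(x - c) < tol:
            return x
        a, b, c = b, c, x                # drop the oldest point
    return c

print(spi(lambda x: (x - 2)**2, 0.0, 0.5, 1.0))  # exact for a quadratic: 2.0
```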

  24. Example

  25. Newton’s method Newton’s method for finding a minimum normally has a quadratic convergence rate, but it must be started close enough to the solution to converge (see the one-dimensional sketch below).
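
A one-dimensional sketch, with the first and second derivatives supplied by the caller (names and tolerances are illustrative):

```python
import math

def newton_1d(df, d2f, x, tol=1e-10, maxit=50):
    # Minimize f by applying Newton's method to f'(x) = 0:
    # x_{k+1} = x_k - f'(x_k) / f''(x_k).
    for _ in range(maxit):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("no convergence; try a starting point closer to the solution")

# Minimize f(x) = x**2 + exp(x); the minimizer satisfies 2x + exp(x) = 0.
print(newton_1d(lambda x: 2 * x + math.exp(x),
                lambda x: 2 + math.exp(x), 0.0))   # ~ -0.3517
```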

  26. Example

  27. Safeguarded methods

  28. Multidimensional optimization. Direct search methods

  29. Steepest descent method

  30. Steepest descent method
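
A minimal sketch with a backtracking (Armijo) line search; the particular line search is an illustrative choice:

```python
import numpy as np

def steepest_descent(f, grad, x, tol=1e-6, maxit=5000):
    # Move along the negative gradient; convergence is only linear in
    # general, and slow when the Hessian is ill-conditioned.
    for _ in range(maxit):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            return x
        t, fx = 1.0, f(x)
        while f(x - t * g) > fx - 1e-4 * t * (g @ g):   # Armijo condition
            t *= 0.5
        x = x - t * g
    return x

A = np.diag([1.0, 10.0])                 # mildly ill-conditioned quadratic
print(steepest_descent(lambda x: 0.5 * x @ A @ x,
                       lambda x: A @ x, np.array([1.0, 1.0])))  # ~ (0, 0)
```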

  31. Example

  32. Example (cont.)

  33. Newton’s method

  34. Newton’s method

  35. Example

  36. Newton’s method

  37. Newton’s method
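
The core of the multidimensional method is to solve the linear system H(x) s = -∇f(x) for the Newton step; a minimal sketch, assuming the Hessian is supplied and positive definite near the solution:

```python
import numpy as np

def newton_nd(grad, hess, x, tol=1e-10, maxit=50):
    # Quadratic convergence near a minimizer with positive definite Hessian;
    # far from the solution the step may need damping or safeguarding.
    for _ in range(maxit):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            return x
        s = np.linalg.solve(hess(x), -g)   # Newton step
        x = x + s
    return x
```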

  38. Trust region methods

  39. Trust region methods
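
For reference, the trust region subproblem at iterate x_k in its standard form (the notation B_k, Δ_k is an assumption here):

\min_{s} \; m_k(s) = f(x_k) + \nabla f(x_k)^T s + \tfrac{1}{2} s^T B_k s
\quad \text{subject to} \quad \|s\| \le \Delta_k,

where B_k approximates the Hessian and the radius Δ_k is enlarged or shrunk according to how well m_k predicted the actual reduction in f.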

  40. Quasi-Newton methods

  41. Secant updating methods

  42. BFGS method

  43. BFGS method

  44. BFGS method

  45. Example For a quadratic objective function, BFGS with an exact line search finds the exact solution in at most n iterations, where n is the dimension of the problem (see the sketch below).
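
A minimal BFGS sketch maintaining an approximation H to the inverse Hessian (the backtracking line search is an illustrative simplification; a Wolfe line search would guarantee the curvature condition y·s > 0):

```python
import numpy as np

def bfgs(f, grad, x, tol=1e-8, maxit=200):
    n = x.size
    H = np.eye(n)                        # inverse Hessian approximation
    g = grad(x)
    for _ in range(maxit):
        if np.linalg.norm(g) < tol:
            return x
        p = -H @ g                       # quasi-Newton direction
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx + 1e-4 * t * (g @ p):   # backtracking
            t *= 0.5
        s = t * p
        g_new = grad(x + s)
        y = g_new - g
        if y @ s > 1e-12:                # skip update if curvature fails
            rho = 1.0 / (y @ s)
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x + s, g_new
    return x
```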

  46. Conjugate gradient method

  47. CG method
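
A minimal nonlinear CG sketch; the Fletcher-Reeves update for beta is one common choice (an assumption here), and a steepest descent restart is used as a safeguard:

```python
import numpy as np

def nonlinear_cg(f, grad, x, tol=1e-8, maxit=500):
    g = grad(x)
    d = -g
    for _ in range(maxit):
        if np.linalg.norm(g) < tol:
            return x
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):   # backtracking
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves
        d = -g_new + beta * d
        if g_new @ d >= 0:                 # not a descent direction: restart
            d = -g_new
        g = g_new
    return x
```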

  48. CG method example

  49. Example (cont.)

  50. Truncated Newton methods
  • Another way to reduce the work in Newton-like methods is to solve the linear system for the Newton step by an iterative method.
  • A small number of iterations may suffice to produce a step as useful as the true Newton step, especially far from the overall solution, where the true Newton step may be unreliable anyway.
  • A good choice of iterative linear solver is the CG method, which gives a step intermediate between the steepest descent direction and the Newton-like step.
  • Since only matrix-vector products are required, explicit formation of the Hessian matrix can be avoided by using a finite difference of the gradient along a given vector (see the sketch below).
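
A sketch of the Hessian-free ingredients: a Hessian-vector product by a finite difference of the gradient, and a few CG iterations for the Newton step (tolerances and iteration counts here are illustrative):

```python
import numpy as np

def hessvec(grad, x, v, eps=1e-6):
    # H(x) @ v ~ (grad(x + eps*v) - grad(x)) / eps -- no explicit Hessian.
    return (grad(x + eps * v) - grad(x)) / eps

def truncated_newton_step(grad, x, cg_iters=10, cg_tol=1e-4):
    # Approximately solve H s = -g with a few CG iterations; the result
    # interpolates between a steepest descent step and the full Newton step.
    g = grad(x)
    s = np.zeros_like(x)
    r = -g                               # residual of H s = -g at s = 0
    d = r.copy()
    for _ in range(cg_iters):
        Hd = hessvec(grad, x, d)
        dHd = d @ Hd
        if dHd <= 0:                     # negative curvature: stop
            break
        alpha = (r @ r) / dHd
        s = s + alpha * d
        r_new = r - alpha * Hd
        if np.linalg.norm(r_new) < cg_tol * np.linalg.norm(g):
            break
        d = r_new + ((r_new @ r_new) / (r @ r)) * d
        r = r_new
    return s if s.any() else -g          # fall back to steepest descent
```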
