
Inexact Methods for PDE-Constrained Optimization





Presentation Transcript


  1. University of North Carolina at Chapel Hill Inexact Methods for PDE-Constrained Optimization Frank Edward Curtis Northwestern University Joint work with Richard Byrd and Jorge Nocedal January 31, 2007

  2. Nonlinear Optimization • “One” problem

  3. Circuit Tuning • Building blocks: • Transistors (switches) and Gates (logic units) • Improve aspects of the circuit – speed, area, power – by choosing transistor widths • [figure: circuit schematic with labels w1, w2, AT1, AT2, AT3, d1, d2] (A. Wächter, C. Visweswariah, and A. R. Conn, 2005)

  4. Circuit Tuning • Building blocks: • Transistors (switches) and Gates (logic units) • Improve aspects of the circuit – speed, area, power – by choosing transistor widths • [figure: circuit schematic with labels w1, w2, AT1, AT2, AT3, d1, d2] • Formulate an optimization problem (A. Wächter, C. Visweswariah, and A. R. Conn, 2005)

  5. Strategic Bidding • Electricity production companies “bid” on how much they will charge for one unit of electricity • Independent operator collects bids and sets production schedule and “spot price” to minimize cost to consumers (Pereira, Granville, Dix, and Barroso, 2004)

  6. Strategic Bidding • Electricity production companies “bid” on how much they will charge for one unit of electricity • Independent operator collects bids and sets production schedule and “spot price” to minimize cost to consumers • Bilevel problem • Equivalent to MPCC • Hard geometry! (Pereira, Granville, Dix, and Barroso, 2004)

  7. Challenges for NLP algorithms • Very large problems • Numerical noise • Availability of derivatives • Degeneracies • Difficult geometries • Expensive function evaluations • Real-time solutions needed • Integer variables • Negative curvature

  8. Outline • Problem Formulation • Equality constrained optimization • Sequential Quadratic Programming • Inexact Framework • Unconstrained optimization and nonlinear equations • Stopping conditions for linear solver • Global Behavior • Merit function and sufficient decrease • Satisfying first order conditions • Numerical Results • Model inverse problem • Accuracy tradeoffs • Final Remarks • Future work • Negative curvature

  9. Outline • Problem Formulation • Equality constrained optimization • Sequential Quadratic Programming • Inexact Framework • Unconstrained optimization and nonlinear equations • Stopping conditions for linear solver • Global Behavior • Merit function and sufficient decrease • Satisfying first order conditions • Numerical Results • Model inverse problem • Accuracy tradeoffs • Final Remarks • Future work • Negative curvature

  10. Equality constrained optimization Goal: solve the problem — e.g., minimize the difference between observed and expected behavior, subject to atmospheric flow equations (Navier-Stokes)

  11. Equality constrained optimization Goal: solve the problem Define: the derivatives Define: the Lagrangian Goal: solve KKT conditions
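The equation images on this slide did not survive the transcript; the standard formulation being referenced (with notation chosen here to match common SQP conventions) is:

```latex
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{subject to} \quad c(x) = 0,
\qquad f : \mathbb{R}^n \to \mathbb{R}, \;\; c : \mathbb{R}^n \to \mathbb{R}^m,
\]
with derivatives and Lagrangian
\[
g(x) = \nabla f(x), \qquad
A(x) = \begin{bmatrix} \nabla c_1(x)^T \\ \vdots \\ \nabla c_m(x)^T \end{bmatrix}, \qquad
\mathcal{L}(x, \lambda) = f(x) + \lambda^T c(x),
\]
and first-order (KKT) conditions
\[
\nabla_x \mathcal{L}(x, \lambda) = g(x) + A(x)^T \lambda = 0,
\qquad c(x) = 0.
```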

  12. Sequential Quadratic Programming (SQP) • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem

  13. Sequential Quadratic Programming (SQP) • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem • KKT matrix • Cannot be formed • Cannot be factored

  14. Sequential Quadratic Programming (SQP) • Two “equivalent” step computation techniques Algorithm: Newton’s method Algorithm: the SQP subproblem • KKT matrix • Cannot be formed • Cannot be factored • Linear system solve • Iterative method • Inexactness
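The two "equivalent" step computations on slides 12–14 were shown as images; in the standard notation introduced above, Newton's method applied to the KKT conditions solves

```latex
\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_k \\ \delta_k \end{bmatrix}
= - \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix},
\qquad W_k \approx \nabla_{xx}^2 \mathcal{L}(x_k, \lambda_k),
\]
which is equivalent (when $W_k$ is positive definite on the null space of $A_k$) to the SQP subproblem
\[
\min_{d} \; g_k^T d + \tfrac{1}{2} d^T W_k d
\quad \text{subject to} \quad c_k + A_k d = 0.
```

For PDE-constrained problems this KKT matrix is too large to form or factor, which motivates solving the system only approximately with an iterative method.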

  15. Outline • Problem Formulation • Equality constrained optimization • Sequential Quadratic Programming • Inexact Framework • Unconstrained optimization and nonlinear equations • Stopping conditions for linear solver • Global Behavior • Merit function and sufficient decrease • Satisfying first order conditions • Numerical Results • Model inverse problem • Accuracy tradeoffs • Final Remarks • Future work • Negative curvature

  16. Unconstrained optimization Goal: minimize a nonlinear objective Algorithm: Newton’s method (CG)

  17. Unconstrained optimization Goal: minimize a nonlinear objective Algorithm: Newton’s method (CG) Note: choosing any intermediate step ensures global convergence to a local solution of NLP (Steihaug, 1983)
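Steihaug's observation on this slide can be sketched in code. The following truncated-CG routine is an illustrative sketch, not the talk's implementation: it returns early on a small relative residual or on detecting negative curvature, and any intermediate iterate it returns is a descent direction for the objective.

```python
import numpy as np

def steihaug_cg(H, g, tol=1e-8, max_iter=100):
    """Truncated CG for the Newton system H d = -g (Steihaug-style).

    Stopping at ANY intermediate CG iterate, or on detecting negative
    curvature, still yields a descent direction -- which is why an
    inexact Newton-CG method with a line search converges globally.
    H and g are assumed to be NumPy arrays.
    """
    d = np.zeros_like(g, dtype=float)
    r = g.astype(float)          # residual of H d + g (d = 0 initially)
    p = -r                       # CG search direction
    for _ in range(max_iter):
        Hp = H @ p
        curv = p @ Hp
        if curv <= 0:
            # negative curvature: fall back to the current iterate,
            # or to steepest descent if we have not moved yet
            return d if d.any() else -g
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) <= tol * np.linalg.norm(g):
            return d
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
    return d
```

On a positive definite system the routine reproduces the exact Newton step; on an indefinite one it falls back to a safe descent direction.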

  18. Nonlinear equations Goal: solve a nonlinear system Algorithm: Newton’s method

  19. Nonlinear equations Goal: solve a nonlinear system Algorithm: Newton’s method any step with and ensures descent (Dembo, Eisenstat, and Steihaug, 1982) (Eisenstat and Walker, 1994)
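The inexact-Newton condition cited here (Dembo, Eisenstat, and Steihaug) requires only that the step d satisfy ||F(x) + J(x) d|| <= eta ||F(x)|| for some eta in [0, 1). The sketch below is illustrative: for simplicity the inner solve is exact (eta = 0), where a real implementation would truncate an iterative solver once the residual condition holds.

```python
import numpy as np

def inexact_newton(F, J, x0, tol=1e-10, max_iter=50):
    """Newton's method for F(x) = 0 with an inexactness hook.

    The step d need only satisfy the residual condition
        ||F(x) + J(x) d|| <= eta * ||F(x)||,  0 <= eta < 1,
    to guarantee local convergence; eta controls how crudely the
    linear system may be solved.  Here the inner solve is exact,
    which satisfies the condition trivially with eta = 0.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = np.linalg.solve(J(x), -Fx)   # exact inner solve (eta = 0)
        x = x + d
    return x
```

Smaller eta gives faster local convergence at the price of more inner-solver work per outer iteration; that trade-off is the central theme of the inexact framework.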

  20. Line Search SQP Framework • Define “exact” penalty function • Implement a line search
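The "exact" penalty function and line-search condition shown as images here have the standard form (with penalty parameter π > 0 and Armijo constant η ∈ (0, 1) assumed):

```latex
\phi(x; \pi) = f(x) + \pi \, \| c(x) \|,
\]
with a backtracking line search that accepts the steplength $\alpha$ once
\[
\phi(x_k + \alpha d_k; \pi) \;\le\; \phi(x_k; \pi) - \eta \, \alpha \, \Delta m(d_k; \pi),
```

where Δm(d; π) is the reduction in a local model of φ, defined on the following slides.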

  21. Exact Case

  22. Exact Case Exact step minimizes the objective on the linearized constraints

  23. Exact Case Exact step minimizes the objective on the linearized constraints … which may lead to an increase in the model objective

  24. Quadratic/linear model of merit function • Create model • Quantify reduction obtained from step

  25. Quadratic/linear model of merit function • Create model • Quantify reduction obtained from step
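The model and its reduction, lost from slides 24–25, have the standard quadratic/linear form (consistent with the notation above):

```latex
m(d; \pi) = f_k + g_k^T d + \tfrac{1}{2} d^T W_k d + \pi \, \| c_k + A_k d \|,
\]
so the reduction obtained from a step $d$ is
\[
\Delta m(d; \pi) = m(0; \pi) - m(d; \pi)
= -\, g_k^T d - \tfrac{1}{2} d^T W_k d + \pi \left( \| c_k \| - \| c_k + A_k d \| \right).
```

For an exact SQP step the linearized constraints are satisfied, c_k + A_k d = 0, so the last term contributes the full π ||c_k||.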

  26. Exact Case Exact step minimizes the objective on the linearized constraints … which may lead to an increase in the model objective

  27. Exact Case Exact step minimizes the objective on the linearized constraints … which may lead to an increase in the model objective … but this is ok since we can account for this conflict by increasing the penalty parameter

  28. Exact Case Exact step minimizes the objective on the linearized constraints … which may lead to an increase in the model objective … but this is ok since we can account for this conflict by increasing the penalty parameter

  29. Algorithm Outline (exact steps) • for k = 0, 1, 2, … • Compute step by… • Set penalty parameter to ensure descent on… • Perform backtracking line search to satisfy… • Update iterate
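The four-step outline above can be sketched on a toy problem. Everything below is illustrative, not the talk's implementation: the problem, constants, and penalty-update rule are simple stand-ins, and the KKT system is solved exactly (the "exact steps" case).

```python
import numpy as np

# Toy problem:  minimize f(x) = x1^2 + x2^2  subject to  x1 + x2 - 1 = 0,
# whose solution is x* = (0.5, 0.5) with multiplier lambda* = -1.
f = lambda x: x @ x
g = lambda x: 2.0 * x                        # gradient of f
c = lambda x: np.array([x[0] + x[1] - 1.0])  # constraint
A = lambda x: np.array([[1.0, 1.0]])         # constraint Jacobian
W = lambda x, lam: 2.0 * np.eye(2)           # Hessian of the Lagrangian

def sqp(x, lam, pi=1.0, iters=20):
    for _ in range(iters):
        Wk, Ak, gk, ck = W(x, lam), A(x), g(x), c(x)
        # 1. Compute the step by solving the KKT system exactly
        K = np.block([[Wk, Ak.T], [Ak, np.zeros((1, 1))]])
        sol = np.linalg.solve(K, -np.concatenate([gk, ck]))
        d, lam = sol[:2], sol[2:]
        # 2. Set the penalty parameter to ensure descent on the
        #    merit function phi(x) = f(x) + pi * |c(x)|
        dq = gk @ d + 0.5 * d @ Wk @ d       # change in the quadratic model
        if abs(ck).sum() > 1e-12:
            pi = max(pi, dq / abs(ck).sum() + 1.0)
        red = -dq + pi * abs(ck).sum()       # model reduction (exact step)
        phi = lambda z: f(z) + pi * abs(c(z)).sum()
        # 3. Backtracking line search to satisfy sufficient decrease
        alpha = 1.0
        while alpha > 1e-10 and phi(x + alpha * d) > phi(x) - 1e-4 * alpha * red:
            alpha *= 0.5
        # 4. Update the iterate
        x = x + alpha * d
    return x, lam

x_star, lam_star = sqp(np.array([2.0, -1.0]), np.array([0.0]))
```

Because the toy objective is quadratic and the constraint linear, the first exact step already lands on the solution; the penalty and line-search machinery only matters on genuinely nonlinear problems.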

  30. First attempt • Proposition: sufficiently small residual • Test: 61 problems from CUTEr test set

  31. First attempt… not robust • Proposition: sufficiently small residual • … not enough for complete robustness • We have multiple goals (feasibility and optimality) • Lagrange multipliers may be completely off • … may not have descent!

  32. Second attempt • Step computation: inexact SQP step • Recall the line search condition • We can show

  33. Second attempt • Step computation: inexact SQP step • Recall the line search condition • We can show ... but how negative should this be?

  34. Algorithm Outline (exact steps) • for k = 0, 1, 2, … • Compute step • Set penalty parameter to ensure descent • Perform backtracking line search • Update iterate

  35. Algorithm Outline (inexact steps) • for k = 0, 1, 2, … • Compute step and set penalty parameter to ensure descent and a stable algorithm • Perform backtracking line search • Update iterate

  36. Inexact Case

  37. Inexact Case

  38. Inexact Case Step is acceptable if for

  39. Inexact Case Step is acceptable if for

  40. Inexact Case Step is acceptable if for
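The acceptance conditions on slides 38–40 were images. In broad strokes (following the Byrd–Curtis–Nocedal inexact SQP framework, with the notation above and generic constants κ, ε ∈ (0, 1) assumed here), an inexact step (d_k, δ_k) leaves the KKT-system residual

```latex
\rho_k = \begin{bmatrix}
W_k d_k + g_k + A_k^T (\lambda_k + \delta_k) \\
A_k d_k + c_k
\end{bmatrix},
```

and is deemed acceptable when this residual is a modest fraction of the current KKT residual, ||ρ_k|| ≤ κ ||(g_k + A_kᵀλ_k, c_k)||, together with one of two conditions: either the model reduction Δm(d_k; π_k) is sufficiently positive for the current penalty parameter, or the linearized infeasibility is cut, ||c_k + A_k d_k|| ≤ ε ||c_k||, and π_k is increased so that the merit function still sees descent.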

  41. Algorithm Outline • for k = 0, 1, 2, … • Iteratively solve • Until […] or […] • Update penalty parameter • Perform backtracking line search • Update iterate

  42. Termination Test • Observe KKT conditions

  43. Outline • Problem Formulation • Equality constrained optimization • Sequential Quadratic Programming • Inexact Framework • Unconstrained optimization and nonlinear equations • Stopping conditions for linear solver • Global Behavior • Merit function and sufficient decrease • Satisfying first order conditions • Numerical Results • Model inverse problem • Accuracy tradeoffs • Final Remarks • Future work • Negative curvature

  44. Assumptions • The sequence of iterates is contained in a convex set and the following conditions hold: • the objective and constraint functions and their first and second derivatives are bounded • the multiplier estimates are bounded • the constraint Jacobians have full row rank and their smallest singular values are bounded below by a positive constant • the Hessian of the Lagrangian is positive definite with smallest eigenvalue bounded below by a positive constant

  45. Sufficient Reduction to Sufficient Decrease • Taylor expansion of merit function yields • Accepted step satisfies
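The argument sketched here has the standard shape (η ∈ (0, 1) and γ > 0 are generic constants assumed for this sketch). A Taylor expansion of the merit function yields

```latex
\phi(x_k + \alpha d_k; \pi) - \phi(x_k; \pi)
\;\le\; -\,\alpha \, \Delta m(d_k; \pi) + \gamma \, \alpha^2 \| d_k \|^2,
```

so backtracking terminates with a steplength α_k bounded away from zero, and each accepted step satisfies φ(x_{k+1}; π) ≤ φ(x_k; π) − η α_k Δm(d_k; π). Summing these decreases and using boundedness of φ forces Δm(d_k; π) → 0, which drives the iterates toward first-order stationarity.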

  46. Intermediate Results • […] is bounded above • […] is bounded above • […] is bounded below by a positive constant

  47. Sufficient Decrease in Merit Function

  48. Step in Dual Space • We converge to an optimal primal solution, and (for sufficiently small […] and […]) the steps in the dual space vanish. Therefore, the multiplier estimates converge as well.

  49. Outline • Problem Formulation • Equality constrained optimization • Sequential Quadratic Programming • Inexact Framework • Unconstrained optimization and nonlinear equations • Stopping conditions for linear solver • Global Behavior • Merit function and sufficient decrease • Satisfying first order conditions • Numerical Results • Model inverse problem • Accuracy tradeoffs • Final Remarks • Future work • Negative curvature

  50. Problem Formulation • Tikhonov-style regularized inverse problem • Want to solve for a reasonably large mesh size • Want to solve for a small regularization parameter • SymQMR for linear system solves • Input parameters: […] • Recall: […] or […] (Curtis and Haber, 2007)
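A typical Tikhonov-regularized, PDE-constrained inverse problem of the kind referenced here takes the following form (the exact functional in Curtis and Haber (2007) may differ; Q, L, and q are generic operators assumed for this sketch):

```latex
\min_{u,\, y} \;\; \tfrac{1}{2} \| Q u - d \|^2 + \tfrac{\beta}{2} \| L y \|^2
\quad \text{subject to} \quad A(y)\, u = q,
```

where u are the PDE state variables, y the inversion parameters, Q an observation operator, L a regularization operator, and A(y)u = q the discretized forward PDE. Refining the mesh grows the problem size, while shrinking the regularization parameter β makes the problem more ill-conditioned: the two accuracy trade-offs examined in the numerical results.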
