
INFORMS Annual Meeting 2006

Inexact SQP Methods for Equality Constrained Optimization

Frank Edward Curtis

Department of IE/MS, Northwestern University

with Richard Byrd and Jorge Nocedal

November 6, 2006

- Introduction
- Problem formulation
- Motivation for inexactness
- Unconstrained optimization and nonlinear equations

- Algorithm Development
- Step computation
- Step acceptance

- Global Analysis
- Merit function and sufficient decrease
- Satisfying first-order conditions

- Conclusions/Final remarks


Goal: solve the problem

    min f(x)   subject to   c(x) = 0

Define: the derivatives

    g(x) = ∇f(x),   A(x) = ∇c(x)

Define: the Lagrangian

    L(x, λ) = f(x) + λᵀc(x)

Goal: solve the KKT conditions

    g(x) + A(x)ᵀλ = 0,   c(x) = 0
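The KKT conditions can be checked numerically. Below is a minimal sketch on a toy problem that is not from the talk (the functions `grad_f`, `c`, `grad_c` and all problem data are assumptions for illustration):

```python
import math

# Toy problem (assumption, not from the talk):
#   minimize f(x) = x1^2 + x2^2   subject to   c(x) = x1 + x2 - 2 = 0
def grad_f(x):      # gradient of the objective, g(x)
    return [2.0 * x[0], 2.0 * x[1]]

def c(x):           # equality constraint value
    return x[0] + x[1] - 2.0

def grad_c(x):      # constraint Jacobian A(x) (a single row here)
    return [1.0, 1.0]

def kkt_residual(x, lam):
    """Norm of the first-order (KKT) residual [g(x) + A(x)^T lam; c(x)]."""
    g, a = grad_f(x), grad_c(x)
    stat = [g[i] + a[i] * lam for i in range(2)]   # stationarity block
    return math.sqrt(stat[0] ** 2 + stat[1] ** 2 + c(x) ** 2)

print(kkt_residual([1.0, 1.0], -2.0))   # → 0.0 at the solution x = (1, 1), lam = -2
```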

- Two “equivalent” step computation techniques

Algorithm: Newton’s method applied to the KKT conditions

    [ W(x, λ)  A(x)ᵀ ] [ d ]      [ g(x) + A(x)ᵀλ ]
    [ A(x)     0     ] [ δ ]  = − [ c(x)          ]

where W(x, λ) = ∇²ₓₓL(x, λ) is the Hessian of the Lagrangian

Algorithm: the SQP subproblem

    min  g(x)ᵀd + ½ dᵀW(x, λ)d   subject to   A(x)d + c(x) = 0

- KKT matrix
- Cannot be formed
- Cannot be factored

- Linear system solve
- Iterative method
- Inexactness
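For concreteness, here is one Newton/SQP step on a toy problem (all data are assumptions; the dense solve below is purely for illustration, since the talk's setting is exactly the one where the KKT matrix cannot be formed or factored and an iterative solver would be used instead):

```python
def solve_linear(M, b):
    """Tiny dense Gaussian elimination with partial pivoting (illustration only)."""
    n = len(b)
    M = [row[:] for row in M]
    b = b[:]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))   # pivot row
        M[i], M[p] = M[p], M[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for j in range(i, n):
                M[r][j] -= f * M[i][j]
            b[r] -= f * b[i]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):   # back substitution
        x[i] = (b[i] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Toy problem (assumption): min x1^2 + x2^2 s.t. x1 + x2 - 2 = 0,
# linearized at x = (0, 0), lambda = 0.  W = Hessian of the Lagrangian = 2I.
g, c = [0.0, 0.0], -2.0
K = [[2.0, 0.0, 1.0],          # KKT matrix [W A^T; A 0]
     [0.0, 2.0, 1.0],
     [1.0, 1.0, 0.0]]
rhs = [-g[0], -g[1], -c]       # right-hand side -[g + A^T*lambda; c]
d1, d2, delta = solve_linear(K, rhs)
print(d1, d2, delta)           # → 1.0 1.0 -2.0
```

The computed primal step d = (1, 1) lands exactly on the solution, and δ = −2 is the multiplier step, which is what the equivalence between the Newton system and the SQP subproblem predicts.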

Goal: minimize a nonlinear objective

    min f(x)

Algorithm: Newton’s method with conjugate gradients (CG)

    ∇²f(x)d = −∇f(x),   solved inexactly by CG

Note: choosing any intermediate CG iterate as the step ensures global convergence to a local solution

(Steihaug, 1983)
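The point about intermediate steps can be sketched with a truncated CG solve of the Newton system on a toy quadratic (the Hessian, gradient, and iteration caps below are assumptions): every CG iterate is a descent direction for the model, which is what licenses stopping early.

```python
def truncated_cg(H, g, tol=1e-10, max_iter=10):
    """Solve H d = -g approximately by CG, stopping after max_iter iterations.
       With H positive definite, every iterate d is a descent direction (g·d < 0)."""
    n = len(g)
    d = [0.0] * n
    r = [-gi for gi in g]                     # residual, starts at -g
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        if rs ** 0.5 <= tol:
            break
        Hp = [sum(H[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Hp[i] for i in range(n))
        d = [d[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Hp[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return d

H = [[2.0, 0.0], [0.0, 4.0]]   # model Hessian (assumed positive definite)
g = [1.0, 1.0]                 # gradient at the current point
for iters in (1, 2):
    d = truncated_cg(H, g, max_iter=iters)
    slope = sum(g[i] * d[i] for i in range(2))
    print(iters, slope < 0)    # every truncated iterate is a descent direction
```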

Goal: solve a nonlinear system

    F(x) = 0

Algorithm: Newton’s method

    J(x)d = −F(x),   solved inexactly

Note: choosing any step d with

    ‖F(x) + J(x)d‖ ≤ η‖F(x)‖,   η < 1

and a sufficient decrease in ‖F‖ along the step

ensures global convergence

(Dembo, Eisenstat, and Steihaug, 1982)

(Eisenstat and Walker, 1994)
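A minimal inexact Newton sketch for a scalar equation (the equation and the deliberate 30% damping, which stands in for an inexactly solved linear system, are assumptions): a step d = 0.7·(−F/F′) leaves a linear residual of exactly 0.3·|F|, so the forcing condition holds with η = 0.5, and the iteration still converges.

```python
def inexact_newton(F, dF, x, eta=0.5, tol=1e-10, max_iter=100):
    """Newton iteration accepting any step d with |F(x) + F'(x) d| <= eta * |F(x)|."""
    for _ in range(max_iter):
        Fx = F(x)
        if abs(Fx) <= tol:
            break
        d = 0.7 * (-Fx / dF(x))      # "inexact" step: 30% of the linear residual remains
        # forcing condition of Dembo, Eisenstat, and Steihaug (eta < 1)
        assert abs(Fx + dF(x) * d) <= eta * abs(Fx) + 1e-15
        x += d
    return x

# Solve x^2 - 2 = 0 starting from x = 1.5
root = inexact_newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.5)
print(abs(root - 2.0 ** 0.5) < 1e-8)   # → True
```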

- Introduction/Motivation
- Unconstrained optimization
- Nonlinear equations
- Constrained optimization

- Algorithm Development
- Step computation
- Step acceptance

- Global Analysis
- Merit function and sufficient decrease
- Satisfying first-order conditions

- Conclusions/Final remarks

- Two “equivalent” step computation techniques

Algorithm: Newton’s method

Algorithm: the SQP subproblem

Question: can we ensure convergence to a local solution by choosing any step into the ball?

- Step computation: inexact SQP step

- Globalization strategy: exact merit function

    φ(x; π) = f(x) + π‖c(x)‖

… with Armijo line search condition
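The globalization can be sketched as backtracking on a penalty merit function φ(x; π) = f(x) + π‖c(x)‖. The toy problem, penalty value, and Armijo constants below are all assumptions for illustration:

```python
def merit(x, pi):
    """phi(x; pi) = f(x) + pi * |c(x)| for a toy problem (assumption, not from
       the talk): f(x) = x1^2 + x2^2, c(x) = x1 + x2 - 2."""
    f = x[0] ** 2 + x[1] ** 2
    c = x[0] + x[1] - 2.0
    return f + pi * abs(c)

def armijo_backtrack(x, d, pi, decrease, sigma=1e-4, beta=0.5):
    """Backtrack until phi(x + alpha*d) <= phi(x) - sigma * alpha * decrease,
       where `decrease` is a positive model-reduction estimate."""
    alpha = 1.0
    phi0 = merit(x, pi)
    while merit([x[0] + alpha * d[0], x[1] + alpha * d[1]], pi) \
            > phi0 - sigma * alpha * decrease:
        alpha *= beta
        if alpha < 1e-12:      # safeguard against a bad step
            break
    return alpha

# SQP step from x = (0, 0) toward the solution (1, 1); model reduction assumed 2.0
alpha = armijo_backtrack([0.0, 0.0], [1.0, 1.0], pi=2.0, decrease=2.0)
print(alpha)   # → 1.0 (the full step already gives sufficient merit decrease)
```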

- Proposition: sufficiently small residual

- Test: 61 problems from CUTEr test set

- Proposition: sufficiently small residual

- … not enough for complete robustness
- We have multiple goals (feasibility and optimality)
- Lagrange multipliers may be completely off

- Step computation: inexact SQP step

- Recall the line search condition

- We can show the directional derivative of the merit function along the inexact step is negative

... but how negative should this be?

- Create model of the merit function

    m(d; π) = f(x) + g(x)ᵀd + ½ dᵀW(x, λ)d + π‖c(x) + A(x)d‖

- Quantify reduction obtained from step

    Δm(d; π) = m(0; π) − m(d; π)


Exact step minimizes the objective on the linearized constraints

… which may lead to an increase in the objective (but that’s ok)
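The model reduction can be sketched numerically (toy data, penalty value, and the single-constraint layout are assumptions): the step may increase the objective yet still reduce the merit model by removing infeasibility.

```python
def model_reduction(g, W, A, c, d, pi):
    """Reduction m(0) - m(d) for the merit model
       m(d) = f + g^T d + 0.5 d^T W d + pi * |c + A d|  (single constraint sketch)."""
    n = len(g)
    gTd = sum(g[i] * d[i] for i in range(n))
    dWd = sum(d[i] * W[i][j] * d[j] for i in range(n) for j in range(n))
    lin = c + sum(A[i] * d[i] for i in range(n))        # linearized constraint value
    return -(gTd + 0.5 * dWd) + pi * (abs(c) - abs(lin))

# Toy data (assumption): at x = (0, 0) for min x1^2 + x2^2 s.t. x1 + x2 = 2,
# the exact SQP step d = (1, 1) increases f by 2 but removes all infeasibility.
dm = model_reduction(g=[0.0, 0.0], W=[[2.0, 0.0], [0.0, 2.0]],
                     A=[1.0, 1.0], c=-2.0, d=[1.0, 1.0], pi=2.0)
print(dm)   # positive: the model predicts merit decrease despite the objective increase
```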

Step is acceptable if, for σ ∈ (0, 1), the model reduction satisfies

    Δm(d; π) ≥ σ π ‖c(x)‖

- for k = 0, 1, 2, …
- Iteratively solve the KKT system
- Until the residual is sufficiently small or the model reduction condition is satisfied
- Update penalty parameter
- Perform backtracking line search
- Update iterate

- Observe KKT conditions to monitor convergence
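The loop above can be sketched end to end on a toy problem. Everything here is an assumption for illustration: the problem data, the fixed penalty parameter, and the closed-form KKT solve (the talk's algorithm would instead truncate an iterative solver and adjust the penalty parameter).

```python
def phi(x, pi):
    """Merit function for the toy problem min x1^2 + x2^2 s.t. x1 + x2 = 2."""
    return x[0] ** 2 + x[1] ** 2 + pi * abs(x[0] + x[1] - 2.0)

def sqp_toy(pi=2.0, tol=1e-10):
    x, lam = [0.0, 0.0], 0.0
    for _ in range(50):
        g = [2.0 * x[0], 2.0 * x[1]]
        c = x[0] + x[1] - 2.0
        # observe KKT conditions: residual [g + A^T lam; c] with A = [1 1]
        kkt = (sum((g[i] + lam) ** 2 for i in range(2)) + c * c) ** 0.5
        if kkt <= tol:
            break
        # KKT system [2I A^T; A 0][d; delta] = -[g + A^T lam; c], closed form here
        delta = c - 0.5 * (g[0] + g[1]) - lam
        d = [0.5 * (-(g[i] + lam) - delta) for i in range(2)]
        # merit model reduction (sufficient decrease target); W = 2I so 0.5 d^T W d = |d|^2
        dm = -(g[0] * d[0] + g[1] * d[1] + d[0] ** 2 + d[1] ** 2) \
             + pi * (abs(c) - abs(c + d[0] + d[1]))
        # backtracking (Armijo) line search on the merit function
        alpha, phi0 = 1.0, phi(x, pi)
        while phi([x[0] + alpha * d[0], x[1] + alpha * d[1]], pi) \
                > phi0 - 1e-4 * alpha * dm:
            alpha *= 0.5
        # update iterate
        x = [x[0] + alpha * d[0], x[1] + alpha * d[1]]
        lam += alpha * delta
    return x, lam

x, lam = sqp_toy()
print(x, lam)   # → [1.0, 1.0] -2.0
```

On this quadratic objective with a linear constraint, a single full step reaches the solution, so the loop terminates at the second KKT check.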

- Introduction/Motivation
- Unconstrained optimization
- Nonlinear equations
- Constrained optimization

- Algorithm Development
- Step computation
- Step acceptance

- Global Analysis
- Merit function and sufficient decrease
- Satisfying first-order conditions

- Conclusions/Final remarks

- The sequence of iterates is contained in a convex set over which the following hold:
- the objective function is bounded below
- the objective and constraint functions and their first and second derivatives are uniformly bounded in norm
- the constraint Jacobian has full row rank and its smallest singular value is bounded below by a positive constant
- the Hessian of the Lagrangian is positive definite with smallest eigenvalue bounded below by a positive constant

- Taylor expansion of the merit function yields a bound on the change in φ in terms of the model reduction

- Accepted step satisfies the sufficient decrease condition

- The penalty parameter is bounded above

- The line search step length is bounded below by a positive constant

- We converge to an optimal primal solution, and the multiplier estimates converge as well (for sufficiently small … and …)

Therefore, the first-order KKT conditions are satisfied in the limit

- Introduction/Motivation
- Unconstrained optimization
- Nonlinear equations
- Constrained optimization

- Algorithm Development
- Step computation
- Step acceptance

- Global Analysis
- Merit function and sufficient decrease
- Satisfying first-order conditions

- Conclusions/Final remarks

Conclusions/Final remarks

- Review
- Defined a globally convergent inexact SQP algorithm
- Requires only inexact solutions of the KKT system
- Requires only matrix-vector products involving objective and constraint function derivatives
- Results also apply when only the reduced Hessian of the Lagrangian is assumed to be positive definite

- Future challenges
- Implementation and appropriate parameter values
- Nearly-singular constraint Jacobian
- Inexact derivative information
- Negative curvature
- etc., etc., etc….