
INFORMS Annual Meeting 2006

Inexact SQP Methods for Equality Constrained Optimization

Frank Edward Curtis

Department of IE/MS, Northwestern University

with Richard Byrd and Jorge Nocedal

November 6, 2006



Outline

  • Introduction

    • Problem formulation

    • Motivation for inexactness

    • Unconstrained optimization and nonlinear equations

  • Algorithm Development

    • Step computation

    • Step acceptance

  • Global Analysis

    • Merit function and sufficient decrease

    • Satisfying first-order conditions

  • Conclusions/Final remarks





Equality constrained optimization

Goal: solve the problem

Define: the derivatives

Define: the Lagrangian

Goal: solve KKT conditions
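
The formulas on this slide were images in the original deck. In standard notation (the symbols f, c, g, A, and \mathcal{L} are assumptions here, not necessarily those used in the talk), the problem and its optimality conditions read:

\[
\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad c(x) = 0,
\qquad g(x) = \nabla f(x), \quad A(x) = \nabla c(x)^T,
\]
\[
\mathcal{L}(x,\lambda) = f(x) + \lambda^T c(x),
\qquad \nabla \mathcal{L}(x,\lambda) = \begin{bmatrix} g(x) + A(x)^T \lambda \\ c(x) \end{bmatrix} = 0 .
\]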



Equality constrained optimization

  • Two “equivalent” step computation techniques

Algorithm: Newton’s method

Algorithm: the SQP subproblem
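
The two step computations referenced here can be written out as follows (a sketch in the assumed notation, with W_k a Hessian approximation of the Lagrangian at the iterate (x_k, \lambda_k)). Newton's method applied to the KKT conditions solves

\[
\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_k \\ \delta_k \end{bmatrix}
= - \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix},
\]

while the SQP subproblem is the quadratic program

\[
\min_{d} \; g_k^T d + \tfrac{1}{2} d^T W_k d \quad \text{subject to} \quad c_k + A_k d = 0,
\]

whose optimality conditions are exactly the linear system above (with the multiplier update \lambda_{k+1} = \lambda_k + \delta_k); this is the sense in which the two techniques are "equivalent".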



Equality constrained optimization

  • Two “equivalent” step computation techniques

Algorithm: Newton’s method

Algorithm: the SQP subproblem

  • KKT matrix

  • Cannot be formed

  • Cannot be factored



Equality constrained optimization

  • Two “equivalent” step computation techniques

Algorithm: Newton’s method

Algorithm: the SQP subproblem

  • KKT matrix

  • Cannot be formed

  • Cannot be factored

  • Linear system solve

  • Iterative method

  • Inexactness



Unconstrained optimization

Goal: minimize a nonlinear objective

Algorithm: Newton’s method (CG)

Note: choosing any intermediate CG iterate as the step ensures global convergence to a local solution of the problem (see the sketch below)

(Steihaug, 1983)
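
A sketch of the Newton-CG step computation being referenced (notation assumed): the Newton system

\[
\nabla^2 f(x_k)\, d = -\nabla f(x_k)
\]

is solved approximately by the conjugate gradient method started from d = 0. Every intermediate CG iterate generated this way is a descent direction for f, so the inner iteration may be truncated early and still be combined with a line search to give global convergence; this is the unconstrained analogue of the inexactness exploited later for constrained problems.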



Nonlinear equations

Goal: solve a nonlinear system

Algorithm: Newton’s method

Note: choosing any step satisfying the residual conditions sketched below ensures global convergence

(Dembo, Eisenstat, and Steihaug, 1982)

(Eisenstat and Walker, 1994)
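
The conditions on this slide were equation images; the standard inexact Newton conditions they refer to are of the form (notation assumed)

\[
\|r(x_k) + J(x_k)\, d_k\| \le \eta_k\, \|r(x_k)\|, \qquad 0 \le \eta_k \le \eta < 1,
\]

i.e. the step need only reduce the residual of the linearized system by a fixed fraction. Combined with a suitable globalization (e.g. a backtracking line search on \|r\|), such steps preserve global convergence, and the forcing sequence \{\eta_k\} governs the local rate.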



Outline

  • Introduction/Motivation

    • Unconstrained optimization

    • Nonlinear equations

    • Constrained optimization

  • Algorithm Development

    • Step computation

    • Step acceptance

  • Global Analysis

    • Merit function and sufficient decrease

    • Satisfying first-order conditions

  • Conclusions/Final remarks



Equality constrained optimization

  • Two “equivalent” step computation techniques

Algorithm: Newton’s method

Algorithm: the SQP subproblem

Question: can we ensure convergence to a local solution by choosing any step inside the ball, i.e., any step whose KKT residual is sufficiently small (sketched below)?
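
The picture behind the "ball" was an image; the condition it depicts can be written as accepting any inexact solution of the Newton/KKT system whose residual is a fraction of the current KKT residual (a sketch in the assumed notation):

\[
\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_k \\ \delta_k \end{bmatrix}
= - \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} + \rho_k,
\qquad
\|\rho_k\| \le \kappa \left\| \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} \right\|, \quad \kappa \in (0,1).
\]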



Globalization strategy

  • Step computation: inexact SQP step

  • Globalization strategy: exact merit function

    … with an Armijo line search condition (sketched below)
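
A sketch of the globalization being described, assuming an exact (nonsmooth) penalty function with parameter \pi > 0 and an Armijo-type backtracking line search:

\[
\phi_\pi(x) = f(x) + \pi \|c(x)\|,
\qquad
\phi_\pi(x_k + \alpha_k d_k) \le \phi_\pi(x_k) - \eta\, \alpha_k\, \Delta q_\pi(d_k), \quad \eta \in (0,1),
\]

where \Delta q_\pi(d_k) is a measure of the reduction promised by the step (made precise on the model slides below).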



First attempt

  • Proposition: sufficiently small residual

  • Test: 61 problems from CUTEr test set



First attempt… not robust

  • Proposition: sufficiently small residual

  • … not enough for complete robustness

    • We have multiple goals (feasibility and optimality)

    • Lagrange multipliers may be completely off



Second attempt

  • Step computation: inexact SQP step

  • Recall the line search condition

  • We can show



Second attempt

  • Step computation: inexact SQP step

  • Recall the line search condition

  • We can show

... but how negative should this be?
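
One standard bound behind the "we can show" line (notation assumed; the talk's exact statement may differ): the directional derivative of the merit function along the step satisfies

\[
D\phi_\pi(x_k; d_k) \le g_k^T d_k - \pi \left( \|c_k\| - \|c_k + A_k d_k\| \right).
\]

For an exact SQP step the last term equals \pi\|c_k\| and the bound can be driven negative by increasing \pi, but for an inexact step the right-hand side need not be very negative, which motivates the model-based measure introduced next.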



Quadratic/linear model of merit function

  • Create model

  • Quantify the reduction obtained from a step (see the sketch below)
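
A sketch of a quadratic/linear model of the merit function and the reduction it assigns to a step (the formulas on this slide were images; notation assumed):

\[
q_\pi(d) = f_k + g_k^T d + \tfrac{1}{2} d^T W_k d + \pi \|c_k + A_k d\|,
\]
\[
\Delta q_\pi(d) = q_\pi(0) - q_\pi(d)
= -g_k^T d - \tfrac{1}{2} d^T W_k d + \pi \left( \|c_k\| - \|c_k + A_k d\| \right).
\]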





Exact case



Exact case

Exact step minimizes the objective on the linearized constraints



Exact case

Exact step minimizes the objective on the linearized constraints

… which may lead to an increase in the objective (but that’s ok)
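
In the exact case the step satisfies the linearized constraints, A_k d_k = -c_k, so the model reduction from the sketch above becomes

\[
\Delta q_\pi(d_k) = -g_k^T d_k - \tfrac{1}{2} d_k^T W_k d_k + \pi \|c_k\|,
\]

and even when the objective model increases (so that the first two terms sum to something negative), the reduction can be made positive, and suitably large, by choosing the penalty parameter \pi large enough relative to \|c_k\|.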



Inexact case



Option #1: current penalty parameter



Option #1: current penalty parameter

Step is acceptable if it yields a sufficient reduction in the model for the current penalty parameter



Option #2: new penalty parameter



Option #2: new penalty parameter

Step is acceptable if it yields a sufficient reduction in the model for a new, suitably increased penalty parameter (a generic form of this test is sketched below)
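
The acceptance tests on the Option slides were equation images. A generic sufficient-model-reduction condition of the kind being described (the talk's precise test differs in its details) is

\[
\Delta q_{\pi}(d_k) \ge \sigma\, \pi\, \|c_k\|, \qquad \sigma \in (0,1),
\]

checked either with the current penalty parameter \pi = \pi_k (Option #1) or, when that fails, with a new, increased value of \pi (Option #2), after which the line search uses the updated merit function.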





Algorithm outline

  • for k = 0, 1, 2, … (sketched in code below)

    • Iteratively solve the primal-dual (KKT) system

    • Until one of two step termination tests is satisfied

    • Update penalty parameter

    • Perform backtracking line search

    • Update iterate
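
A minimal, self-contained sketch of such a loop on a toy problem, assuming NumPy/SciPy (>= 1.12) and using a loosely converged GMRES solve as the inexact KKT step; this illustrates the structure above, not the authors' implementation, and the toy problem, tolerances, and update rules are all assumptions:

import numpy as np
from scipy.sparse.linalg import gmres

# Toy problem (an assumption, for illustration only): minimize x1^2 + x2^2
# subject to x1 + x2 - 1 = 0.
f = lambda x: x[0]**2 + x[1]**2
g = lambda x: np.array([2.0 * x[0], 2.0 * x[1]])        # gradient of f
c = lambda x: np.array([x[0] + x[1] - 1.0])             # constraint value
A = lambda x: np.array([[1.0, 1.0]])                    # constraint Jacobian
W = lambda x, lam: np.array([[2.0, 0.0], [0.0, 2.0]])   # Hessian of the Lagrangian

x, lam, pi = np.array([5.0, -3.0]), np.zeros(1), 1.0
for k in range(100):
    gk, ck, Ak, Wk = g(x), c(x), A(x), W(x, lam)
    kkt_res = np.concatenate([gk + Ak.T @ lam, ck])
    if np.linalg.norm(kkt_res) < 1e-8:                  # outer KKT test
        break
    # Inexactly solve the primal-dual (KKT) system: the loose GMRES tolerance
    # stands in for the talk's truncated iterative solve + termination tests.
    K = np.block([[Wk, Ak.T], [Ak, np.zeros((1, 1))]])
    sol, _ = gmres(K, -kkt_res, rtol=1e-2)
    d, dlam = sol[:2], sol[2:]
    # Model reduction of the merit function phi(z) = f(z) + pi * ||c(z)||;
    # increase pi (when that helps) until the reduction is sufficiently positive.
    ck_lin = np.linalg.norm(ck + Ak @ d)                # linearized infeasibility
    red = lambda p: -gk @ d - 0.5 * d @ Wk @ d + p * (np.linalg.norm(ck) - ck_lin)
    while red(pi) < 0.1 * pi * np.linalg.norm(ck) and np.linalg.norm(ck) > ck_lin and pi < 1e8:
        pi *= 10.0
    phi = lambda z: f(z) + pi * np.linalg.norm(c(z))
    # Armijo backtracking line search on the merit function.
    alpha = 1.0
    while phi(x + alpha * d) > phi(x) - 1e-4 * alpha * red(pi) and alpha > 1e-12:
        alpha *= 0.5
    x, lam = x + alpha * d, lam + alpha * dlam

print("x* ~", x, " lambda* ~", lam)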



Termination test

  • Observe KKT conditions
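
The test itself was an equation image. Consistent with "observe KKT conditions", an inner termination test of the following general form lets the iterative solver stop once the trial step's KKT residual is a fraction of the current one (a sketch in the assumed notation; the algorithm pairs it with the model-reduction test on the merit function):

\[
\left\| \begin{bmatrix} g_k + W_k d + A_k^T(\lambda_k + \delta) \\ c_k + A_k d \end{bmatrix} \right\|
\le \kappa \left\| \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix} \right\|,
\qquad \kappa \in (0,1).
\]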



Outline

  • Introduction/Motivation

    • Unconstrained optimization

    • Nonlinear equations

    • Constrained optimization

  • Algorithm Development

    • Step computation

    • Step acceptance

  • Global Analysis

    • Merit function and sufficient decrease

    • Satisfying first-order conditions

  • Conclusions/Final remarks



Assumptions

  • The sequence of iterates is contained in a convex set over which the following hold:

    • the objective function is bounded below

    • the objective and constraint functions and their first and second derivatives are uniformly bounded in norm

    • the constraint Jacobian has full row rank and its smallest singular value is bounded below by a positive constant

    • the Hessian of the Lagrangian is positive definite with smallest eigenvalue bounded below by a positive constant



Sufficient reduction to sufficient decrease

  • A Taylor expansion of the merit function bounds the actual decrease in terms of the model reduction

  • The accepted step therefore satisfies a sufficient decrease condition (sketched below)
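
A sketch of the argument (assumed notation, with constants \gamma, \eta > 0): a Taylor expansion of the merit function together with the bound on its directional derivative gives

\[
\phi_\pi(x_k + \alpha d_k) - \phi_\pi(x_k) \le -\alpha\, \Delta q_\pi(d_k) + \gamma\, \alpha^2 \|d_k\|^2,
\]

so backtracking terminates with a steplength bounded away from zero and the accepted step satisfies

\[
\phi_\pi(x_{k+1}) \le \phi_\pi(x_k) - \eta\, \alpha_k\, \Delta q_\pi(d_k).
\]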



Intermediate results

is bounded above

is bounded above

is bounded below by a positive constant



Sufficient decrease in merit function



Step in dual space

  • We converge to an optimal primal solution and, for sufficiently small algorithmic tolerances, the dual (multiplier) iterates converge as well; therefore, the first-order conditions are satisfied in the limit



Outline

  • Introduction/Motivation

    • Unconstrained optimization

    • Nonlinear equations

    • Constrained optimization

  • Algorithm Development

    • Step computation

    • Step acceptance

  • Global Analysis

    • Merit function and sufficient decrease

    • Satisfying first-order conditions

  • Conclusions/Final remarks



Conclusion/Final remarks

  • Review

    • Defined a globally convergent inexact SQP algorithm

    • Require only inexact solutions of KKT system

    • Require only matrix-vector products involving objective and constraint function derivatives

    • Results also apply when only reduced Hessian of Lagrangian is assumed to be positive definite

  • Future challenges

    • Implementation and appropriate parameter values

    • Nearly-singular constraint Jacobian

    • Inexact derivative information

    • Negative curvature

    • etc., etc., etc….

