
# NLP - PowerPoint PPT Presentation

NLP: KKT practice and second-order conditions, from Nash and Sofer. Topics: unconstrained problems (first-order necessary, second-order necessary, and second-order sufficient conditions); the easiest constrained problem, linear equality constraints; and the KKT conditions. Note that for equality constraints the multipliers are unconstrained.


## PowerPoint Slideshow about 'NLP' - eagan-flowers


## Presentation Transcript

### NLP

KKT Practice and Second Order Conditions from Nash and Sofer

• First Order Necessary Condition

• Second Order Necessary

• Second Order Sufficient

• Linear equality constraints

Note: for equality constraints the multipliers are unconstrained in sign, and complementarity is not an issue.

• Let x* be a feasible point, Ax*=b.

• Any other feasible point can be written as x=x*+p where Ap=0

• The feasible region

{x = x* + p : p ∈ N(A)}

where N(A) is null space of A

See Section 3.2 of Nash and Sofer for an example
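The null-space description of the feasible region can be checked numerically; a minimal sketch, with a hypothetical A and b (not from the slides), using SciPy's `null_space`:

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical data (an assumption, not from the slides): one equality
# constraint Ax = b in R^3.
A = np.array([[1.0, 2.0, -1.0]])
b = np.array([4.0])

x_star = np.array([4.0, 0.0, 0.0])   # a feasible point: A @ x_star = b
assert np.allclose(A @ x_star, b)

Z = null_space(A)                    # columns of Z span N(A); here Z is 3x2

# Every x = x* + Z v is feasible, for any v:
rng = np.random.default_rng(0)
for _ in range(5):
    v = rng.standard_normal(Z.shape[1])
    x = x_star + Z @ v
    assert np.allclose(A @ x, b)
print("all x = x* + Z v satisfy Ax = b")
```

The loop confirms the slide's claim that the feasible region is exactly the affine set x* + N(A).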

You can convert any linear equality constrained optimization problem to an equivalent unconstrained problem

• Method 1: substitution

• Method 2: null-space representation and a feasible point.

• Solve by substitution

becomes

• x* = [4 0 0]'

• x = x* + Zv

becomes
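The slide's actual objective and constraint equations are not in the transcript; as a hypothetical stand-in, take f(x) = (1/2)||x||^2 subject to x1 + 2x2 - x3 = 4, for which x* = [4 0 0]' is feasible. Method 2 then solves the unconstrained reduced problem min over v of f(x* + Zv):

```python
import numpy as np
from scipy.linalg import null_space
from scipy.optimize import minimize

# Hypothetical problem (objective and constraint are assumptions, not the
# slides' example):  min 0.5*||x||^2  s.t.  x1 + 2*x2 - x3 = 4
A = np.array([[1.0, 2.0, -1.0]])
x_star = np.array([4.0, 0.0, 0.0])        # feasible point: A @ x_star = 4
Z = null_space(A)                          # basis matrix for N(A)

f = lambda x: 0.5 * x @ x
phi = lambda v: f(x_star + Z @ v)          # reduced (null-space) objective

res = minimize(phi, np.zeros(Z.shape[1]))  # unconstrained solve in v
x_opt = x_star + Z @ res.x

# Closed-form minimizer of ||x||^2 over {Ax = b} for comparison:
x_exact = A.T @ np.linalg.solve(A @ A.T, np.array([4.0]))
print(x_opt, x_exact)
```

The reduced problem has no constraints at all, which is the point of the slide: any unconstrained method can now be applied in the v variables.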

• There exists a Null Space Matrix

• The feasible region is:

• Equivalent “Reduced” Problem

• Assume a feasible point and convert to the null-space formulation

• KKT conditions imply the null-space conditions

• Null-space conditions imply the KKT conditions

The gradient is orthogonal to Null(A), so it must lie in Range(A'); that is, ∇f(x*) = A'λ* for some λ*

• If x* is a local min of f over {x | Ax = b}, and Z is a null-space matrix for A, then Z'∇f(x*) = 0 and Z'∇²f(x*)Z is positive semidefinite

• Or, equivalently, use the KKT conditions

• If x* satisfies Ax* = b, Z'∇f(x*) = 0, and Z'∇²f(x*)Z positive definite (where Z is a basis matrix for Null(A)),

then x* is a strict local minimizer

• If (x*,*) satisfies (where Z is a basis matrix for Null(A))

then x* is a strict local minimizer

• λ* is called the Lagrange multiplier

• It represents the sensitivity of the solution to small perturbations of the constraints

• Consider min (x^2 + 4y^2)/2 s.t. x − y = 10

• Find the KKT point; check SOSC

• Find a KKT point

• Verify SONC and SOSC

so SOSC is satisfied, and x* is a strict local minimum

Objective is convex, so KKT conditions are sufficient.
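For the example min (x^2 + 4y^2)/2 s.t. x − y = 10, stationarity gives x = λ and 4y = −λ, so y = −x/4; feasibility then gives x* = 8, y* = −2, λ* = 8. A quick numerical check of the KKT conditions and SOSC:

```python
import numpy as np

# min (x^2 + 4y^2)/2  s.t.  x - y = 10
grad_f = lambda x, y: np.array([x, 4.0 * y])
a = np.array([1.0, -1.0])              # gradient of the constraint x - y

# Stationarity (grad f = lam * a) plus feasibility gives the KKT point:
x, y, lam = 8.0, -2.0, 8.0

assert np.isclose(x - y, 10.0)                  # feasible
assert np.allclose(grad_f(x, y), lam * a)       # stationary
H = np.diag([1.0, 4.0])                         # Hessian of f
Z = np.array([[1.0], [1.0]])                    # basis for Null([1, -1])
assert (Z.T @ H @ Z).item() > 0                 # SOSC: Z'HZ positive definite
print("KKT point (8, -2), multiplier 8, SOSC holds")
```

Since the Hessian diag(1, 4) is positive definite everywhere, this also illustrates the convexity remark above: the KKT point is the global minimum.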

• Linear inequality constraints

The constraints form a polyhedron

Close to Equality Case

Equality FONC:

[Figure: polyhedron Ax ≥ b bounded by constraint lines a1x = b1, …, a4x = b4, with the negative constraint gradients −a1 and −a2, the contour set of the objective, and the unconstrained minimum marked.]

Which λi are 0? What is the sign of the λi?


Inequality Case

Inequality FONC:

[Figure: the same polyhedron Ax ≥ b, with the inequality first-order conditions illustrated at the optimum.]

Nonnegative multipliers imply the gradient points to the less-than side of the constraint.
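The sign condition can be checked on a small example; the following problem is a hypothetical illustration (not from the slides), with the polyhedron written Ax ≥ b as in the figure:

```python
import numpy as np

# Hypothetical example:  min 0.5*(x^2 + y^2)  s.t.  x + y >= 2
A = np.array([[1.0, 1.0]])
b = np.array([2.0])

x_star = np.array([1.0, 1.0])        # candidate optimum; constraint is active
grad_f = x_star                      # gradient of 0.5*||x||^2 is x itself

lam = np.linalg.lstsq(A.T, grad_f, rcond=None)[0]  # solve A' lam = grad f
assert np.allclose(A.T @ lam, grad_f)              # stationarity holds
assert np.all(lam >= 0)                            # nonnegative multiplier
assert np.allclose(lam * (A @ x_star - b), 0)      # complementarity
print("lam =", lam)
```

Here λ* = 1 ≥ 0: the gradient of f at x* is a nonnegative combination of the active constraint gradients, matching the geometric picture above.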

• If x* is a local min of f over {x | Ax ≤ b}, and Z is a null-space matrix for the active constraints, then for some vector λ*

• If (x*,*) satisfies

where Z+ is a basis matrix for Null(A +) and A + corresponds to nondegenerate active constraints)

i.e.

• Find solution and verify SOSC

• Problem

• Solve the problem using the theorems above:

• What are sufficient conditions good for?

• They confirm that a candidate point is a (local) minimum.

• But not every minimum satisfies any given sufficient condition.

• Necessary conditions tell you:

• If necessary conditions don’t hold then you know you don’t have a minimum.

• Under appropriate assumptions, every point that is a min satisfies the necessary cond.

• They make good stopping criteria

• Algorithms look for points that satisfy the necessary conditions
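As a sketch of how a necessary condition becomes a stopping test, an unconstrained solver can stop when ||∇f(x)|| falls below a tolerance, i.e. when FONC approximately holds. A minimal gradient-descent loop on a hypothetical quadratic objective (all data here are assumptions for illustration):

```python
import numpy as np

# Hypothetical objective: f(x) = 0.5*x'Qx - c'x, so grad f = Qx - c.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
c = np.array([1.0, 1.0])
grad = lambda x: Q @ x - c

x = np.zeros(2)
tol = 1e-8                         # stop when FONC ||grad f|| ~ 0 holds
for _ in range(10_000):
    g = grad(x)
    if np.linalg.norm(g) < tol:    # necessary-condition stopping test
        break
    x = x - 0.2 * g                # fixed step; adequate for this mild Q

print(x, np.linalg.norm(grad(x)))  # converges to x = Q^{-1} c
```

The loop never certifies a minimum; it only stops where the necessary condition cannot rule one out, which is exactly the role the slide assigns to necessary conditions.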

• Optimality conditions are expressed using the

Lagrangian function L(x, λ) = f(x) − λ'g(x)

and the Jacobian matrix ∇g(x)',

where each row is the gradient of a constraint

• If (x*,*) satisfies

• If (x*,*) satisfies

where Z+ is a basis matrix for Null(A +) and A + corresponds to Jacobian of nondegenerate active constraints)

i.e.

• Find solution and verify SOSC

• Find solution and verify SOSC

• If x* is a local min of f over {x | g(x) = 0}, Z is a null-space matrix of the Jacobian ∇g(x*)', and x* is a regular point, then

• If x* is a local min of f over {x | g(x) ≥ 0}, Z is a null-space matrix of the Jacobian ∇g(x*)', and x* is a regular point, then

• x* is a regular point with respect to the constraints g(x) if the gradients of the active constraints are linearly independent.

• For equality constraints, all constraints are active, so the Jacobian

should have linearly independent rows.
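A regularity (LICQ) check is just a rank test on the Jacobian of the active constraints. A sketch with a hypothetical pair of constraints (not the slides' example):

```python
import numpy as np

# Hypothetical active constraints at x*:
#   g1(x) = x1^2 + x2^2 - 1 = 0,   g2(x) = x1 - x2 = 0
# both active at x* = (1/sqrt(2), 1/sqrt(2)).
x = np.array([1.0, 1.0]) / np.sqrt(2.0)

J = np.array([
    [2.0 * x[0], 2.0 * x[1]],   # grad g1 = (2*x1, 2*x2)
    [1.0, -1.0],                # grad g2
])

# x* is regular iff the active-constraint gradients (the rows of J)
# are linearly independent, i.e. J has full row rank.
regular = np.linalg.matrix_rank(J) == J.shape[0]
print("regular point:", regular)
```

The same test applied to the exercise below (showing x* = [1, 0]' is regular) amounts to checking the rank of the Jacobian of its active constraints at that point.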

• Show that the optimal solution x* = [1, 0]'

is regular, and find the KKT point

• Regularity is an example of a constraint qualification (CQ).

• The KKT conditions are based on linearizations of the constraints.

• A CQ guarantees that this linearization does not get us into trouble. Without one, the problem is that a

KKT point might not exist.

• There are many other CQs; e.g., for inequalities written g(x) ≤ 0, Slater's condition is that there exists an x with g(x) < 0.

• Note that a CQ is not needed for linear constraints.

How the conditions fit together:

• x* is a local min + CQ ⇒ KKT conditions satisfied

• KKT conditions satisfied + SOSC ⇒ x* is a local min

• KKT conditions satisfied + convex f + convex constraints ⇒ x* is a global min