
BASIC CONCEPTS IN OPTIMIZATION: PART II: Continuous & Unconstrained








  1. BASIC CONCEPTS IN OPTIMIZATION: PART II: Continuous & Unconstrained Important concepts for the optimization of systems with continuous variables and non-linear equations. Since we will limit the topic to unconstrained problems, we will concentrate on the OBJECTIVE FUNCTION. • Optimality Conditions for a Single Variable • Optimality Conditions for Multivariable Systems • Revisit Convexity and Its Importance

  2. BASIC CONCEPTS IN OPTIMIZATION: PART II: Continuous & Unconstrained Wait a minute. No real problem is unconstrained; so, why do we need to know this? • Some problems are effectively unconstrained - the solution lies in the interior of the feasible region, where no constraint is active • The same conditions are used within solution methods for constrained problems

  3. BUILDING EXPERIENCE IN OPTIMIZATION CLASS EXERCISE: This is an isothermal CFSTR with the series reaction A → B → C. The reactor is isothermal and the reaction kinetics are first order. Is this system linear or non-linear? • What must we define before defining an optimum? - The goal is to maximize CB in the effluent at steady state - You can adjust only the feed flow rate, F
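Though the exercise is left for class, a minimal sketch of the underlying model shows why the question matters. Assuming first-order series kinetics with no B or C in the feed, the steady-state balances give CB as a non-linear function of the residence time τ = V/F, so the system is non-linear in the only adjustable variable. The rate constants and volume below are illustrative, not from the slides.

```python
import numpy as np

# Illustrative parameters -- not from the slides
k1, k2 = 2.0, 1.0       # 1/min, first-order rate constants for A -> B -> C
V, CA0 = 1.0, 1.0       # reactor volume and feed concentration of A

def CB_effluent(F):
    """Steady-state effluent CB for an isothermal CFSTR with A -> B -> C,
    first-order kinetics, and no B or C in the feed."""
    tau = V / F                                 # residence time
    CA = CA0 / (1.0 + k1 * tau)                 # steady-state balance on A
    return k1 * tau * CA / (1.0 + k2 * tau)     # steady-state balance on B

# Scan F: CB(F) is clearly non-linear, with an interior maximum
F_grid = np.linspace(0.1, 10.0, 500)
CB = CB_effluent(F_grid)
print(f"best F on grid ~ {F_grid[np.argmax(CB)]:.2f}, CB ~ {CB.max():.3f}")
print(f"analytical optimum: F* = V*sqrt(k1*k2) = {V * np.sqrt(k1 * k2):.2f}")
```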

  4. WHAT DEFINES THE LOCATION OF AN OPTIMUM? For LP, the optimum is at a corner point. For NLP, the optimum is located .......?

  5. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? We will start with a single-variable system and then generalize to multiple variables. We will not yet include constraints. The general definition of a minimum of f(x) is: x* is a minimum if f(x*) ≤ f(x* + Δx) for all small Δx. We want to apply this concept, but we need to determine specific criteria that test for conformance to the statement in the box above.

  6. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? Necessary Condition for a single-variable system: [df(x)/dx]x* = 0. Why isn't this sufficient for a minimum? Let's look at the definition of the derivative of a continuous function: df/dx = lim(Δx→0) [f(x* + Δx) − f(x*)] / Δx. If this limit exists and f(x*) ≤ f(x* + Δx) for small Δx of either sign, then the derivative at x* must be zero, as written out below.
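The step the slide alludes to, written out as a math block: the difference quotient is non-negative for Δx > 0 and non-positive for Δx < 0, and a limit that exists must agree from both sides.

```latex
\left.\frac{df}{dx}\right|_{x^{*}}
  = \lim_{\Delta x \to 0^{+}} \frac{f(x^{*} + \Delta x) - f(x^{*})}{\Delta x} \;\ge\; 0,
\qquad
\left.\frac{df}{dx}\right|_{x^{*}}
  = \lim_{\Delta x \to 0^{-}} \frac{f(x^{*} + \Delta x) - f(x^{*})}{\Delta x} \;\le\; 0
```

The only value satisfying both inequalities is zero, which gives the stationarity condition.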

  7. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? Necessary Condition for a single-variable system: [df(x)/dx]x* = 0 [Figure: five example functions, labeled (a)-(e)] Where is the derivative zero?

  8. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? Sufficient condition: a function with f'(x*) = 0 and f''(x*) = ... = f^(n-1)(x*) = 0 (the first n−1 derivatives equal to zero) has a minimum at x* if, for n even, f^(n)(x*) > 0 (the nth derivative at x* is positive). Approximate the function with a Taylor series: f(x* + Δx) = f(x*) + f'(x*)Δx + ... + [f^(n-1)(x*)/(n−1)!]Δx^(n-1) + [f^(n)(x* + hΔx)/n!]Δx^n, with 0 ≤ h ≤ 1. All terms through order n−1 are zero, leaving only the remainder.

  9. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? Sufficient condition: a function with f'(x*) = 0 and f''(x*) = ... = f^(n-1)(x*) = 0 (2nd through (n−1)th derivatives zero) has a minimum at x* if, for n even, f^(n)(x*) > 0 (the nth derivative at x* is positive). Rearrange the result: f(x* + Δx) − f(x*) = [f^(n)(x* + hΔx)/n!] (Δx)^n. For n even, (Δx)^n > 0; when the nth derivative is positive, the condition for a minimum is satisfied!

  10. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? Necessary & Sufficient Conditions: [df(x)/dx]x* = 0 ; d²f(x*)/dx² > 0 [Figure: the same five functions, (a)-(e)] Which satisfy the necessary & sufficient conditions?

  11. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? Let's look at the following examples. f1 = 3 + 2x + 5x² df1/dx = 2 + 10x = 0 : x = −0.20 d²f1/dx² = 10 > 0 at x = −0.20 Therefore, the function has a local minimum at x = x* = −0.20. f2 = 3 + 2x − 5x² df2/dx = 2 − 10x = 0 : x = 0.20 d²f2/dx² = −10 < 0 at x = 0.20 Therefore, the function has a local maximum at x = x* = 0.20.
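These two examples can be checked numerically; a small sketch using central differences (the helper name classify is mine, not from the slides):

```python
def classify(f, x_star, h=1e-5):
    """Classify a stationary point via central-difference derivatives."""
    d1 = (f(x_star + h) - f(x_star - h)) / (2 * h)                 # ~ f'(x*)
    d2 = (f(x_star + h) - 2 * f(x_star) + f(x_star - h)) / h**2    # ~ f''(x*)
    kind = "minimum" if d2 > 0 else "maximum" if d2 < 0 else "inconclusive"
    return d1, d2, kind

print(classify(lambda x: 3 + 2*x + 5*x**2, -0.20))  # f' ~ 0, f'' ~ 10  -> minimum
print(classify(lambda x: 3 + 2*x - 5*x**2,  0.20))  # f' ~ 0, f'' ~ -10 -> maximum
```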

  12. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? Necessary & Sufficient Conditions: [df(x)/dx]x* = 0 ; d²f(x*)/dx² > 0 • Are these results consistent with the methods you have learned previously? • What do we conclude if n = odd? • What type of extremum occurs for f(x) = x⁴?

  13. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? Necessary & Sufficient Conditions: [df(x)/dx]x* = 0 ; d²f(x*)/dx² > 0 • Are these results consistent with the methods you have learned previously? Hopefully, these are the rules that you learned in first-year calculus! • What do we conclude if n = odd? The sign of the remainder then depends on the sign of Δx, so the point is not a local minimum. It is termed a saddle point.

  14. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Single variable? Necessary & Sufficient Conditions: [df(x)/dx]x* = 0 ; d²f(x*)/dx² > 0 • What type of extremum occurs for f(x) = x⁴? At x* = 0: f' = 4x³ = 0, f'' = 12x² = 0, f''' = 24x = 0, and f'''' = 24 > 0. The first non-zero derivative is the 4th (n = 4, even) and it is positive. Therefore, the extreme point is a minimum!

  15. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? Necessary: Let's extend these results to multivariable systems, with x a vector of dimension n. Necessary condition: ∂f/∂xi = 0 at x* for i = 1, ..., n. The proof is similar to the single-variable case. We call these equations the "stationarity conditions".

  16. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? The necessary condition for unconstrained optimization of a multivariable system is often stated as ∇f(x*) = 0. The gradient equaling zero is the stationarity condition.

  17. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? Sufficient: Let's extend these results to multivariable systems, with x a vector of dimension n. We will restrict sufficient conditions to second derivatives. The first and second differentials are defined as df = Σi (∂f/∂xi) Δxi and d²f = Σi Σj (∂²f/∂xi∂xj) Δxi Δxj.

  18. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? These terms can be used in the expression for a Taylor series to determine the sufficient condition; the expansion is written out below. At a stationary point the gradient term is zero, leaving only the remainder (0 ≤ h ≤ 1). The condition for a minimum is satisfied when the remainder is positive.
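The second-order expansion behind this argument, in standard form consistent with the single-variable version:

```latex
f(x^{*} + \Delta x)
  = f(x^{*}) + \nabla f(x^{*})^{T} \Delta x
  + \tfrac{1}{2}\, \Delta x^{T}\, \mathbf{H}(x^{*} + h\,\Delta x)\, \Delta x,
  \qquad 0 \le h \le 1
```

With ∇f(x*) = 0 at the stationary point, the sign of f(x* + Δx) − f(x*) is the sign of the quadratic remainder ½ ΔxᵀHΔx.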

  19. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? H = the Hessian of second derivatives, with elements Hij = ∂²f/∂xi∂xj. It is symmetric (Hij = Hji) when the second derivatives are continuous.
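As an aside, the Hessian can be approximated numerically when analytical second derivatives are inconvenient; a sketch by central differences (the function and evaluation point are illustrative):

```python
import numpy as np

def hessian(f, x, h=1e-4):
    """Approximate the Hessian of f at x by central differences."""
    n = len(x)
    H = np.zeros((n, n))
    I = np.eye(n)
    for i in range(n):
        for j in range(n):
            H[i, j] = (f(x + h*I[i] + h*I[j]) - f(x + h*I[i] - h*I[j])
                       - f(x - h*I[i] + h*I[j]) + f(x - h*I[i] - h*I[j])) / (4 * h**2)
    return H

f = lambda x: x[0]**2 + 2*x[0]*x[1] + 3*x[1]**2   # illustrative quadratic
print(hessian(f, np.zeros(2)))                     # ~ [[2, 2], [2, 6]], symmetric
```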

  20. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? For a minimum, the right-hand side is positive for any non-zero value of the vector Δx. How can we tell? We would need to evaluate an infinite number of values of Δx! Let's try a little mathematics to improve the situation.

  21. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? [Figure: rotated axes w1, w2 overlaid on the x1, x2 axes] We will consider a two-dimensional system. We start by defining a new vector of variables, w, as a linear combination of the Δx's with coefficients b: w1 = b11Δx1 + b12Δx2 and w2 = b21Δx1 + b22Δx2. Can we define the b's to make the test for optimality easier?

  22. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? The optimality test would be easy if the hessian were diagonal: if the quadratic form could be written as Δxᵀ H Δx = λ1 w1² + λ2 w2², then the sign of each coefficient λ could be checked directly. How can we determine the b's to give this nice, diagonal hessian matrix?

  23. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? The answer is determined from the eigenvalues and eigenvectors of the hessian matrix!!! When we prove that the function f(w) has a minimum at w* from λ1 > 0 and λ2 > 0, we also prove that the function f(x) has a minimum at x*!
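A sketch of this diagonalization with numpy (the matrix and the step are illustrative): the orthonormal eigenvectors of H define the w coordinates, and in them the quadratic form collapses to a sum of eigenvalues times squared variables.

```python
import numpy as np

H = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # illustrative symmetric Hessian
lam, V = np.linalg.eigh(H)          # eigenvalues and orthonormal eigenvectors
print(lam)                          # both positive -> x* is a minimum

dx = np.array([0.7, -0.2])          # an arbitrary step away from x*
w = V.T @ dx                        # rotated coordinates: w = V^T dx
print(dx @ H @ dx)                  # quadratic remainder in x coordinates
print(np.sum(lam * w**2))           # identical: sum of lambda_i * w_i^2
```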

  24. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? A schematic of what we did: [Figure: contours with rotated axes w1 and w2] the coordinates are rotated to express the quadratic as the sum of variables squared times eigenvalues. Clearly, the remainder term can only increase if all λi are positive.

  25. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? Positive Definite: A matrix is positive definite if all of its eigenvalues (λ) are positive. Eigenvalues are the solutions to the following equation, with H evaluated at x*: | H − λI | = 0. What is the form of this equation? How many solutions are there?

  26. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? The following two conditions are necessary & sufficient at x*: • The gradient is zero • The Hessian is positive definite • Some good news - we do not typically perform these calculations to test problems • But these concepts are used in many solution methods for non-linear optimization. A numerical illustration follows.
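A sketch of both checks on an illustrative convex quadratic (the function, the scipy usage, and the numbers are mine, not the lecture's):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative convex quadratic with its minimum at x* = (1, -2)
f = lambda x: (x[0] - 1)**2 + 0.6*(x[0] - 1)*(x[1] + 2) + (x[1] + 2)**2

res = minimize(f, x0=np.zeros(2))    # drives the gradient to (numerically) zero
print(res.x)                         # ~ [1, -2]: the stationary point

H = np.array([[2.0, 0.6],            # analytical Hessian of f (constant here)
              [0.6, 2.0]])
print(np.linalg.eigvalsh(H))         # [1.4, 2.6]: all positive -> positive
                                     # definite -> the stationary point is a minimum
```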

  27. WHAT DEFINES THE LOCATION OF AN OPTIMUM : Multivariable? This is a “nice” objective function, which is convex and symmetric. Local derivative information will direct us toward the minimum. All eigenvalues are positive.

  28. WHAT DEFINES THE LOCATION OF AN OPTIMUM : Multivariable? This is an objective function with a ridge. We will find the valley quickly; then, we will search the ridge with little success. One eigenvalue is near zero.

  29. WHAT DEFINES THE LOCATION OF AN OPTIMUM : Multivariable? This objective function has a saddle point, which has a minimum in one direction and maximum in another direction. Derivative information will not direct us well. One eigenvalue is positive, and another is negative.

  30. WHAT DEFINES THE LOCATION OF AN OPTIMUM: Multivariable? [Figure: contour plot with four stationary points, labeled 1-4] What is the hessian at each of these stationary points? What's going on here?

  31. CONVEXITY: AN IMPORTANT PROPERTY IN OPTIMIZATION Convexity and the objective function. A function of x (a vector) is convex if the following is true for any points x1 and x2 and any 0 ≤ α ≤ 1: f(αx1 + (1−α)x2) ≤ αf(x1) + (1−α)f(x2). [Figure: plot of f(x) vs. x] Is this function convex (over the region in the figure)?
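This definition can be probed numerically; a sketch (the functions and sample points are illustrative, and note that sampling can refute convexity but never prove it):

```python
import numpy as np

def violates_convexity(f, x1, x2, n_alpha=51):
    """True if a sampled alpha breaks f(a*x1 + (1-a)*x2) <= a*f(x1) + (1-a)*f(x2)."""
    for a in np.linspace(0.0, 1.0, n_alpha):
        if f(a * x1 + (1 - a) * x2) > a * f(x1) + (1 - a) * f(x2) + 1e-12:
            return True
    return False

f_convex = lambda x: np.sum(x**2)            # convex everywhere
f_wavy   = lambda x: np.sum(np.sin(3 * x))   # not convex
x1, x2 = np.full(2, -np.pi / 6), np.full(2, np.pi / 2)
print(violates_convexity(f_convex, x1, x2))  # False: no counterexample found
print(violates_convexity(f_wavy, x1, x2))    # True: f rises above the chord
```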

  32. CONVEXITY: AN IMPORTANT PROPERTY IN OPTIMIZATION Convexity and the objective function. A function of x (a vector) is convex over a region if the following is true over the region. Gradient Test: f(x2) ≥ f(x1) + ∇f(x1)ᵀ(x2 − x1) for all x1, x2 in the region. Hessian Test: the function is convex if its Hessian matrix is positive semi-definite over the region (and strictly convex if it is positive definite).

  33. CONVEXITY: AN IMPORTANT PROPERTY IN OPTIMIZATION Any local minimum of a convex function (over an unconstrained region) is a global minimum!

  34. BASIC CONCEPTS IN OPTIMIZATION: PART II: Continuous & Unconstrained Conclusions on OBJECTIVE FUNCTION properties • Opt. Conditions for a Single Variable • Opt. Conditions for Multivariable Systems • Convexity and Its Importance - when is local = global optimum? These conditions are the basis of many optimization algorithms and tests for convergence, and we seek to formulate our models to yield a convex programming problem.

  35. OPTIMIZATION BASICS II - WORKSHOP #1 We covered the conditions for optimality and convexity in this section. They seemed similar. • What is the difference between the sufficient condition for optimality and the condition for convexity? • Why is convexity important?

  36. OPTIMIZATION BASICS II - WORKSHOP #2 Since convexity is important, let's evaluate convexity for a very important function. Is the following function convex or concave? f(x) = Σi ci·xi, with the ci constants.

  37. OPTIMIZATION BASICS II - WORKSHOP #3 The statement below is very important. Prove the statement. Hint: Consider directions of improvement for convex and non-convex functions. Any local minimum of a convex function (over an unconstrained region) is a global minimum!

  38. OPTIMIZATION BASICS II - WORKSHOP #4 All convex functions have a unique minimum, i.e., they are unimodal. Determine whether all unimodal functions are convex. [Figure: plot of f(x) vs. x]

  39. OPTIMIZATION BASICS II - WORKSHOP #5 • We seek a global, rather than a local, optimum. • Define a global optimum in words • Determine a mathematical test for the global optimum. • Discuss how you would find a global optimum.

  40. OPTIMIZATION BASICS II - WORKSHOP #6 The objective function is often the sum of several functions, for example, costs, revenues, taxes, and so forth. Determine whether the sum f(x) = Σi gi(x) is a convex function when each term gi(x) is convex individually.

  41. OPTIMIZATION BASICS II - WORKSHOP #7 A function is convex if its Hessian matrix is positive definite over the range of the variable x. One way to determine whether a matrix (the hessian) is positive definite is to evaluate the determinants of its leading principal minors: the sub-matrices formed from the first k rows and columns, for k = 1 to n. If all these determinants are positive, the matrix is positive definite (Sylvester's criterion). Apply this approach to the following functions; a sketch of the test appears below.
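The workshop's functions appeared as equations in the original slides; here is a sketch of the test itself on an illustrative quadratic:

```python
import numpy as np

def is_positive_definite(H):
    """Sylvester's criterion: every leading principal minor must have a positive determinant."""
    H = np.asarray(H, dtype=float)
    return all(np.linalg.det(H[:k, :k]) > 0 for k in range(1, len(H) + 1))

# Illustrative: f(x) = 3*x1**2 + 2*x1*x2 + 2*x2**2 has the constant Hessian below
H = [[6.0, 2.0],    # d2f/dx1dx1 = 6, d2f/dx1dx2 = 2
     [2.0, 4.0]]    # d2f/dx2dx1 = 2, d2f/dx2dx2 = 4
print(is_positive_definite(H))   # True: minors 6 and 6*4 - 2*2 = 20 -> f is convex
```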

  42. OPTIMIZATION BASICS II - WORKSHOP #7 A function is convex if its Hessian matrix is positive definite over the range of the variable x.

  43. OPTIMIZATION BASICS II - WORKSHOP #7 SOLUTION Therefore, the function is convex

  44. OPTIMIZATION BASICS II - WORKSHOP #7 SOLUTION Therefore, the function is not convex

  45. OPTIMIZATION BASICS II - WORKSHOP #7 SOLUTION Therefore, the function is convex
