ChE 250 Numeric Methods

Presentation Transcript


  1. ChE 250 Numeric Methods Lecture #15, Chapra, Chapter 14: Multidimensional Unconstrained Optimization 20070223

  2. Multidimensional Unconstrained Optimization • For two or more dimensions, simple bracketing methods like the golden-section search are not feasible, and we must resort to open methods that generally start from a single point • If the function and its partial derivatives are readily calculated, then gradient, or ‘descent’, methods can be employed • Otherwise a non-gradient, or direct, method is used

  3. Non-Gradient Methods • Random Search • A brute-force approach: evaluate the function at many randomly chosen points and simply record the point with the highest function value • Only practical when the function is inexpensive to evaluate on a computer • Can be useful for finding initialization points for another, faster direct method • Works for any function, whether non-linear or discontinuous, which is a distinct advantage over the other methods of this chapter
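A minimal sketch of such a random search in Python. The objective function, bounds, sample count, and seed below are illustrative assumptions, not from the lecture:

```python
import random

def random_search(f, xbounds, ybounds, n=10000, seed=0):
    """Brute-force random search: sample n points uniformly and keep the best."""
    rng = random.Random(seed)
    best_x = best_y = best_f = None
    for _ in range(n):
        x = rng.uniform(*xbounds)
        y = rng.uniform(*ybounds)
        fx = f(x, y)
        if best_f is None or fx > best_f:
            best_x, best_y, best_f = x, y, fx
    return best_x, best_y, best_f

# Illustrative 2-D objective with a single maximum at (1, -2)
f = lambda x, y: -(x - 1.0)**2 - (y + 2.0)**2
print(random_search(f, (-5, 5), (-5, 5)))
```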

  4. Pattern Searches • Univariate search • Use 1-D methods and iterate • Find the extremum of one variable while holding the others constant • Then move to the next variable, and the next, and repeat • This method reaches the general neighborhood of the maximum quickly, but can then stall as it zigzags • Pattern searches exploit the pattern directions defined by successive univariate optima (the lines through points 2-4-6 and 1-3-5 in Chapra's figure) to reach the maximum more quickly • Questions?
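A minimal sketch of the univariate (one-variable-at-a-time) search in Python, using a golden-section routine for the 1-D steps; the objective, bounds, and iteration count are illustrative assumptions:

```python
def golden_max(g, a, b, tol=1e-6):
    """Golden-section search for the maximum of a 1-D function on [a, b]."""
    R = (5 ** 0.5 - 1) / 2
    x1, x2 = b - R * (b - a), a + R * (b - a)
    f1, f2 = g(x1), g(x2)
    while b - a > tol:
        if f1 > f2:                      # maximum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - R * (b - a)
            f1 = g(x1)
        else:                            # maximum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + R * (b - a)
            f2 = g(x2)
    return (a + b) / 2

def univariate_search(f, x0, y0, bounds=(-5.0, 5.0), iters=10):
    """Optimize one variable at a time while holding the other fixed."""
    x, y = x0, y0
    for _ in range(iters):
        x = golden_max(lambda s: f(s, y), *bounds)   # 1-D search along x
        y = golden_max(lambda s: f(x, s), *bounds)   # then along y
    return x, y

# Illustrative objective (cross term makes the zigzag behavior visible)
f = lambda x, y: -(x - 1.0)**2 - 2.0 * (y - 2.0)**2 - x * y
print(univariate_search(f, 0.0, 0.0))
```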

  5. Gradient Methods • Gradient methods use the derivatives of the objective function to locate optima • The gradient is the vector of partial derivatives with respect to each variable • It gives the direction of the steepest incline as well as the magnitude of that incline
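For a two-variable objective f(x, y), the gradient collects both partial derivatives:

```latex
\nabla f = \begin{bmatrix} \dfrac{\partial f}{\partial x} \\[6pt] \dfrac{\partial f}{\partial y} \end{bmatrix}
```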

  6. Gradient Methods • The Hessian • For 1-D problems, we know an optimum occurs where the derivative equals zero • For multi-dimensional cases, it is more complicated • A point can appear to be a minimum when viewed in only the x or only the y cross-section and yet be a saddle point

  7. Gradient Methods • The Hessian captures the second-order curvature and is also used in Newton’s method • If |H| < 0 the point is a saddle point; if |H| > 0 it is a minimum (∂²f/∂x² > 0) or a maximum (∂²f/∂x² < 0) • Scilab example • Questions?
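For the two-variable case, the Hessian and the determinant used in this test are:

```latex
H = \begin{bmatrix}
\dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x\,\partial y} \\[8pt]
\dfrac{\partial^2 f}{\partial y\,\partial x} & \dfrac{\partial^2 f}{\partial y^2}
\end{bmatrix},
\qquad
|H| = \frac{\partial^2 f}{\partial x^2}\,\frac{\partial^2 f}{\partial y^2}
    - \left(\frac{\partial^2 f}{\partial x\,\partial y}\right)^{2}
```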

  8. Steepest Ascent Method • Assuming you can calculate the gradient at any point, you can start at (x0, y0), head off in the direction of the gradient, and stop at the maximum along that straight line • This is accomplished with a transformed variable, h, measured along the gradient direction at (x0, y0)
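Concretely, the line through (x0, y0) along the gradient can be parameterized by h, which turns the 2-D problem into a 1-D one:

```latex
x = x_0 + \left.\frac{\partial f}{\partial x}\right|_{x_0,\,y_0} h,
\qquad
y = y_0 + \left.\frac{\partial f}{\partial y}\right|_{x_0,\,y_0} h,
\qquad
g(h) = f\bigl(x(h),\, y(h)\bigr)
```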

  9. Steepest Ascent Method • First derive expressions for the partial derivatives • Then substitute in the initial x, y values • Define h along the gradient direction • Substitute the x(h) and y(h) equations into the function • Use 1-D methods to find the maximum along the line • Then restart the process from that maximum point • Example 14.4 • Questions?
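A minimal sketch of the full iteration in Python, assuming SciPy is available for the 1-D line search. The objective, its analytic gradient, the step bound hmax, and the iteration count are illustrative assumptions (this is not Example 14.4):

```python
from scipy.optimize import minimize_scalar

def steepest_ascent(f, grad, x0, y0, iters=20, hmax=1.0):
    """Repeated 1-D line searches along the local gradient (steepest ascent).
    grad(x, y) must return the tuple (df/dx, df/dy)."""
    x, y = x0, y0
    for _ in range(iters):
        gx, gy = grad(x, y)
        # g(h): the objective restricted to the line through (x, y) along the gradient
        g = lambda h: f(x + gx * h, y + gy * h)
        # maximize g on [0, hmax] by minimizing its negative with a bounded 1-D method
        h = minimize_scalar(lambda s: -g(s), bounds=(0.0, hmax), method='bounded').x
        x, y = x + gx * h, y + gy * h
    return x, y

# Illustrative objective and analytic gradient; maximum is at (1, 2)
f = lambda x, y: -(x - 1.0)**2 - 2.0 * (y - 2.0)**2
grad = lambda x, y: (-2.0 * (x - 1.0), -4.0 * (y - 2.0))
print(steepest_ascent(f, grad, -1.0, 1.0))
```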

  10. Newton’s Method • Newton’s method is implemented using the inverse of the n-by-n Hessian matrix • So all the partial derivatives must be evaluated and the inverse found at every iteration • It is important to start ‘close’ to the solution to reduce the number of iterations, especially if n is large • Equations for the partials are given in Chapra, page 365 • Questions?
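A minimal Newton-step sketch in Python/NumPy, applying the update x_{i+1} = x_i − H⁻¹ ∇f. The quadratic objective, its gradient, and its Hessian below are illustrative assumptions, not from the lecture:

```python
import numpy as np

def newton_opt(grad, hess, x0, iters=20):
    """Newton's method for optimization: x_{i+1} = x_i - H^{-1} grad f(x_i).
    grad(x) returns the gradient vector, hess(x) the Hessian matrix."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # solve H s = grad f rather than forming the inverse explicitly
        step = np.linalg.solve(hess(x), grad(x))
        x = x - step
    return x

# Illustrative quadratic objective f = -(x-1)^2 - 2(y-2)^2 + x*y
grad = lambda v: np.array([-2.0 * (v[0] - 1.0) + v[1], -4.0 * (v[1] - 2.0) + v[0]])
hess = lambda v: np.array([[-2.0, 1.0], [1.0, -4.0]])
print(newton_opt(grad, hess, [0.0, 0.0]))  # a quadratic converges in a single step
```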

  11. Multivariate Approach • An advanced algorithm would use a hybrid method, perhaps starting with a random search or the steepest-ascent method and switching to Newton’s method after a few iterations to find the solution quickly • Questions?

  12. Preparation for 26Feb • Reading • Chapter 15: Constrained Optimization
