
Optimization of functions of one variable (Section 2)

Presentation Transcript


1. Optimization of functions of one variable (Section 2)
• Find minimum of a function of one variable
• Occurs directly
• Part of an iterative algorithm (line search)
• Unimodal function, single optimum -- a step toward the optimum reduces the objective function along the path

2. Two methods
• Golden section search
• Polynomial approximation
• Golden section search: known convergence rate, guaranteed to find an interval bounding the optimum (tolerance interval); provides information about confidence in the solution, but is expensive
• Polynomial approximation: efficient, but not as robust as golden section search

3. Golden section search
• Starts with an interval known to contain the minimum (tolerance interval)
• Proceeds by narrowing the tolerance interval
• Uses four data points at which the objective function is evaluated
• Each iteration requires only one additional function evaluation
• The tolerance interval shrinks to 61.8% of the interval from the previous iteration
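Below is a minimal Python sketch of this narrowing loop, assuming a unimodal f on [xlo, xhi]; the function name, tolerance, and test objective are illustrative and not taken from the slides.

```python
import math

def golden_section_minimize(f, xlo, xhi, tol=1e-6):
    """Sketch of golden section search: narrow [xlo, xhi] around the minimum
    of a unimodal f, shrinking the interval by ~0.618 per iteration."""
    r = (math.sqrt(5.0) - 1.0) / 2.0      # golden ratio fraction, ~0.618
    x1 = xhi - r * (xhi - xlo)            # two interior points
    x2 = xlo + r * (xhi - xlo)
    f1, f2 = f(x1), f(x2)
    while (xhi - xlo) > tol:
        if f1 > f2:                       # minimum lies in [x1, xhi]
            xlo, x1, f1 = x1, x2, f2
            x2 = xlo + r * (xhi - xlo)
            f2 = f(x2)                    # one new evaluation per iteration
        else:                             # minimum lies in [xlo, x2]
            xhi, x2, f2 = x2, x1, f1
            x1 = xhi - r * (xhi - xlo)
            f1 = f(x1)
    return 0.5 * (xlo + xhi)

# Illustrative use: returns approximately 2.0 for this test function
xmin = golden_section_minimize(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
```

Each pass keeps three of the four points and evaluates f only once, and the bracket shrinks by the factor 0.618 per iteration, matching the 61.8% figure on the slide.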

4. Golden section method (figure): the first iteration uses the points xlo, x1, x2, xhi; the second iteration narrows the tolerance interval to xlo', x1', x2', xhi'

5. Bounds on minimum (figure): points xl and xu with values Fl and Fu; the interval is expanded by 1.618(xu - xl); the first iteration gives xl, xu and the second iteration gives xl', x1', x2', xu'

6. Bounding minimum algorithm (flowchart)
• Given xl, Fl, and xmax, guess xu and evaluate Fu
• If Fu > Fl, the minimum lies in [xl, xu]; STOP
• Otherwise expand: x1 = xu, xu = x1 + 1.618(x1 - xl), evaluate Fu
• If Fu > F1, the minimum lies in [xl, xu]; otherwise set xl = x1 and expand again
• Stop if xu > xmax
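A Python sketch of the flowchart logic follows, assuming the expansion step uses the 1.618 factor shown on the previous slide; the function name and arguments are illustrative.

```python
def bound_minimum(f, xl, xu, xmax):
    """Sketch of the minimum-bounding flowchart: expand the trial interval
    with golden-ratio steps until the function starts increasing, so the
    minimum is bracketed in [xl, xu]."""
    Fl, Fu = f(xl), f(xu)
    if Fu > Fl:
        return xl, xu                   # minimum already in [xl, xu]
    while True:
        x1, F1 = xu, Fu                 # previous upper point becomes interior point
        xu = x1 + 1.618 * (x1 - xl)     # golden-ratio expansion (assumed factor)
        if xu > xmax:                   # safeguard from the flowchart footnote
            return xl, xmax
        Fu = f(xu)
        if Fu > F1:
            return xl, xu               # function turned upward: minimum in [xl, xu]
        xl, Fl = x1, F1                 # keep expanding
```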

7. Example of minimizing a function using a second-degree polynomial approximation obtained through regression; the four data points from the minimum-bounding solution are used

8. Example of minimizing a function using a second-degree polynomial approximation obtained through regression; five data points uniformly distributed between 10 and 30 are used
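A sketch of this regression-based approach, assuming NumPy's polyfit for the least-squares second-degree fit; the five sample points follow the slide, but the objective function here is only a placeholder.

```python
import numpy as np

def quadratic_fit_minimum(xs, fs):
    """Fit f(x) ~ a*x**2 + b*x + c to the samples by least-squares regression
    and return the vertex -b/(2a) as the estimated minimizer."""
    a, b, c = np.polyfit(xs, fs, 2)     # second-degree least-squares fit
    if a <= 0:
        raise ValueError("fitted parabola has no interior minimum")
    return -b / (2.0 * a)

# Illustrative use: five points uniformly spaced on [10, 30], as on the slide
xs = np.linspace(10.0, 30.0, 5)
f = lambda x: (x - 22.0) ** 2 + 3.0     # placeholder objective (assumption)
x_est = quadratic_fit_minimum(xs, f(xs))
```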

9. Example of minimizing a function using a second-degree polynomial approximation obtained using three data points (exact fit)
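For the exact fit, three points determine the parabola uniquely, so the coefficients can be solved for directly rather than fitted by regression; the helper name below is an assumption for illustration.

```python
import numpy as np

def three_point_quadratic_minimum(x, f):
    """Exact second-degree fit through three points (x[i], f[i]):
    solve a*x**2 + b*x + c = f for a, b, c, then return the vertex."""
    A = np.vstack([np.array(x) ** 2, np.array(x), np.ones(3)]).T
    a, b, c = np.linalg.solve(A, np.array(f, dtype=float))
    return -b / (2.0 * a)
```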

10. Minimizing constrained functions of one variable
• Direct approach: deal with each function (objective, constraints) individually
• Indirect approach: develop and use a pseudo-objective function that includes both the objective function and the constraint functions
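One common way to build such a pseudo-objective is an exterior penalty on constraint violation; the quadratic penalty form and the weight below are assumptions for illustration, not taken from the slides.

```python
def pseudo_objective(f, g, r=100.0):
    """Indirect-approach sketch: fold a constraint g(x) <= 0 into the objective
    with an exterior quadratic penalty on violations (weight r is an assumed choice)."""
    return lambda x: f(x) + r * max(0.0, g(x)) ** 2

# Illustrative use: minimize (x - 3)^2 subject to x <= 2
phi = pseudo_objective(lambda x: (x - 3.0) ** 2, lambda x: x - 2.0)
```

The resulting pseudo-objective phi can then be minimized with the one-variable methods above, for example the golden section routine.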
