
Goal

We present a hybrid optimization approach for solving global optimization problems, in particular automated parameter estimation models.


Presentation Transcript


  1. Goal • We present a hybrid optimization approach for solving global optimization problems, in particular automated parameter estimation models. The hybrid approach is based on the coupling of the Simultaneous Perturbation Stochastic Approximation (SPSA) and a Newton-Krylov Interior-Point method (NKIP) via a surrogate model.

  2. Problem Formulation • We consider the global optimization problem in the form: $\min_{x \in \mathbb{R}^n} f(x)$  (1) • where the global solution $x^*$ is such that $f(x^*) \le f(x)$ for all $x \in \mathbb{R}^n$. • We are interested in problems of the form (1) that have many local minima.
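As a concrete instance of (1), and purely as an illustration (it is not a function used on the slides), the two-dimensional Rastrigin test function has a large number of local minima and a single global minimum at the origin:

```latex
% Rastrigin function in two dimensions: many local minima, one global minimum
f(x_1, x_2) = 20 + \sum_{i=1}^{2} \left( x_i^2 - 10 \cos(2 \pi x_i) \right),
\qquad x^{*} = (0, 0), \quad f(x^{*}) = 0 .
```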

  3. Types of Local Minima • [Figure: a one-dimensional objective plotted against x, with several local minima and the single global minimum labeled.]

  4. Hybrid Optimization Framework • Global Method: SPSA • Surrogate Model • Local Method: NKIP

  5. Global Method: SPSA • Stochastic steepest-descent direction algorithm (James Spall, 1998) • Advantage • SPSA identifies region(s) where the function value is low, which allows us to conjecture in which region(s) the global solution lies. • Disadvantages • Slow method • Does not take equality/inequality constraints into account

  6. Global Search: SPSA • SPSA performs random simultaneous perturbations of all model parameters to generate a descent direction at each iteration. • This process may be performed by starting from different initial guesses (multistart). • Multistart increases the chances of finding a global solution and yields a broad sampling of the parameter space.

  7. Hybrid Optimization Scheme (1): Global Search via SPSA • [Diagram: multistart SPSA explores the parameter space from the initial guesses (x0)1, (x0)2, ..., (x0)k.]

  8. SPSA: Pseudocode
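The pseudocode itself is not reproduced in the transcript. The following is a minimal sketch of the standard SPSA iteration (Spall, 1998) together with a multistart driver; the gain sequences, the number of iterations, and the Rastrigin test objective are illustrative assumptions, not the settings used in the presentation.

```python
import numpy as np

def spsa_minimize(loss, x0, n_iter=500, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    """Minimal SPSA sketch: a simultaneous random perturbation of all
    parameters yields a stochastic gradient estimate at each iteration."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for k in range(n_iter):
        ak = a / (k + 1) ** alpha                     # step-size gain sequence
        ck = c / (k + 1) ** gamma                     # perturbation gain sequence
        delta = rng.choice([-1.0, 1.0], size=x.size)  # Bernoulli +/-1 perturbation
        g_hat = (loss(x + ck * delta) - loss(x - ck * delta)) / (2.0 * ck * delta)
        x -= ak * g_hat                               # stochastic descent step
    return x, loss(x)

# Multistart: run SPSA from several initial guesses and keep the best result.
if __name__ == "__main__":
    rastrigin = lambda x: 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
    rng = np.random.default_rng(1)
    starts = rng.uniform(-5.12, 5.12, size=(5, 2))
    results = [spsa_minimize(rastrigin, x0, seed=i) for i, x0 in enumerate(starts)]
    best_x, best_f = min(results, key=lambda r: r[1])
    print("best point:", best_x, "value:", best_f)
```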

  9. Observations • SPSA identifies region(s) where the function value is low, which allows us to conjecture in which region(s) the global solution lies. • This gives us the motivation to apply a local method in the region(s) found by SPSA.

  10. Local Method • Advantages • Fast method: Newton-type methods • Interior-point methods allow us to add equality/inequality constraints • Disadvantage • Needs first/second-order information • Solution • Construct a surrogate model using the SPSA function values inside the conjectured region(s)

  11. Surrogate Model • A surrogate model is created by using an interpolation method with the data $\{(x_i, f(x_i))\}$ provided by SPSA. • This can be performed in different ways, e.g., radial basis functions, kriging, regression analysis, or artificial neural networks.

  12. Why Surrogate Models? Most real problems require thousands or millions of objective and constraint function evaluations, and often the associated high cost and time requirements render this infeasible.

  13. Radial Basis Function (RBF) • An RBF is typically parameterized by two sets of parameters: the center c, which defines its position, and the shape parameter r, which determines its width or form. • An RBF interpolation algorithm (Orr, 1996) is used to determine the model parameters from the sampled data.

  14. Surrogate Model • Our goal is to optimize the surrogate function $\tilde{f}(x) = \sum_{j=1}^{N} w_j\, \phi_j(x)$ • where the radial basis functions can be defined as $\phi_j(x) = \sqrt{\|x - c_j\|^2 + r_j^2}$ or $\phi_j(x) = \exp\left(-\|x - c_j\|^2 / r_j^2\right)$, which are the multiquadric and the Gaussian basis functions, respectively.
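As an illustration of this construction, the sketch below fits an RBF surrogate to a handful of sampled points; the sample values, the choice of every sample as a center, the shape parameter, and the plain linear solve for the weights are assumptions made for the example, not the exact procedure from the slides.

```python
import numpy as np

def multiquadric(d, r=1.0):
    # multiquadric basis: sqrt(d^2 + r^2), with d the distance to the center
    return np.sqrt(d**2 + r**2)

def gaussian(d, r=1.0):
    # Gaussian basis: exp(-d^2 / r^2)
    return np.exp(-d**2 / r**2)

def fit_rbf_surrogate(X, f, basis=multiquadric):
    """Use the sampled points X as centers and solve Phi w = f for the weights."""
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    w = np.linalg.solve(basis(dists), f)
    def surrogate(x):
        d = np.linalg.norm(np.atleast_2d(x) - X, axis=-1)
        return basis(d) @ w
    return surrogate

# Hypothetical (x, f(x)) pairs, standing in for points retained from SPSA
X = np.array([[0.0, 0.0], [1.0, 0.5], [-0.5, 1.0], [2.0, -1.0]])
f = np.array([1.0, 0.3, 0.8, 2.5])
s_tilde = fit_rbf_surrogate(X, f)
print(s_tilde([1.0, 0.5]))   # reproduces the sampled value 0.3 at that center
```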

  15. Hybrid Optimization Scheme (2): Global Search via SPSA • [Diagram: multistart SPSA samples the parameter space from the initial guesses (x0)1, (x0)2, ..., (x0)k; the samples are filtered to a target region, where a surrogate model is built from the retained points.]

  16. Surrogate Model • We plot the original model function and the surrogate function: [Figure: the original model function compared with the RBF surrogate.]

  17. Local Search: NKIP NKIP is a globalized, derivative-dependent optimization method based on the global strategy introduced by Miguel Argáez and Richard Tapia in 2002. The method calculates the search directions using the conjugate gradient algorithm, and a line search is implemented to guarantee a sufficient decrease of the objective function.
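NKIP itself is not reproduced here; the sketch below only illustrates the two ingredients mentioned above, a search direction obtained by applying the conjugate gradient method to the Newton system and a backtracking line search that enforces sufficient decrease (Armijo condition). It is a generic Newton-CG step under assumed tolerances and constants, not the interior-point method of Argáez and Tapia.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def newton_cg_step(f, grad, hess_vec, x, c1=1e-4):
    """One illustrative step: CG approximately solves H d = -g, then a
    backtracking line search enforces the Armijo sufficient-decrease condition."""
    g = grad(x)
    H = LinearOperator((x.size, x.size), matvec=lambda v: hess_vec(x, v))
    d, _ = cg(H, -g)                 # conjugate gradient direction
    if g @ d >= 0:                   # fall back to steepest descent if not a descent direction
        d = -g
    t, fx = 1.0, f(x)
    while f(x + t * d) > fx + c1 * t * (g @ d):   # Armijo backtracking
        t *= 0.5
    return x + t * d

# Example on a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
hess_vec = lambda x, v: A @ v
print(newton_cg_step(f, grad, hess_vec, np.zeros(2)))   # one step reaches the minimizer
```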

  18. Local Search: NKIP • We consider the optimization problem in the form: minimize $\tilde{f}(x)$ subject to $a \le x \le b$ • where the bounds a and b are determined by the sampled points given by SPSA.
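A minimal sketch of this local step is shown below, under assumptions: the surrogate is a stand-in quadratic, the box [a, b] is taken as the componentwise extent of a few hypothetical SPSA samples in the target region, and SciPy's 'trust-constr' solver is used as a readily available replacement for NKIP (it treats the bound constraints with an interior-point-style subproblem, but it is not the NKIP method itself).

```python
import numpy as np
from scipy.optimize import minimize, Bounds

# Hypothetical surrogate standing in for the RBF model built from SPSA samples
surrogate = lambda x: (x[0] - 1.0)**2 + 2.0 * (x[1] + 0.5)**2

# Hypothetical SPSA samples retained in the target region
samples = np.array([[0.2, -1.2], [1.6, 0.1], [0.9, -0.8], [1.3, -0.3]])
a, b = samples.min(axis=0), samples.max(axis=0)   # bounds determined by the samples

x0 = samples.mean(axis=0)                         # start inside the box
res = minimize(surrogate, x0, method="trust-constr", bounds=Bounds(a, b))
print(res.x, res.fun)                             # local minimizer of the surrogate in [a, b]
```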

  19. Hybrid Optimization Scheme (3)
