

Formulations for Surrogate-Based Optimization Using Data Fit and Multifidelity Models
Daniel M. Dunlavy and Michael S. Eldred, Sandia National Laboratories, Albuquerque, NM
SIAM Conference on Computational Science and Engineering, February 19-23, 2007. SAND2007-0813C.


Presentation Transcript


  1. Formulations for Surrogate-Based Optimization Using Data Fit and Multifidelity Models. Daniel M. Dunlavy and Michael S. Eldred, Sandia National Laboratories, Albuquerque, NM. SIAM Conference on Computational Science and Engineering, February 19-23, 2007. SAND2007-0813C. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  2. Outline Introduction to Surrogate-Based Optimization • Data fit case • Model hierarchy case Algorithmic Components • Approximate subproblem • Iterate acceptance logic • Merit functions • Constraint relaxation • Corrections and convergence Computational Experiments • Barnes • CUTE • MEMS design

  3. Introduction to Surrogate-Based Optimization (SBO) • Purpose: Reduce the number of expensive, high-fidelity simulations • Use a succession of approximate (surrogate) models • Drawback: Approximations generally have a limited range of validity • Trust regions adaptively manage this range based on the efficacy of the approximations throughout the optimization run • Benefit: SBO algorithms can be provably convergent, e.g., using trust region globalization and local 1st-order consistency • Surrogate models of interest: data fits (e.g., local, multipoint, global), multifidelity (e.g., multigrid optimization), reduced-order models (e.g., spectral decomposition)
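
As a concrete illustration of the trust-region SBO idea above, here is a minimal, self-contained sketch (not the authors' algorithm): the surrogate is a first-order Taylor model built from finite-difference gradients, so local 1st-order consistency holds at the trust-region center, and the trust-region ratio drives acceptance and resizing. The thresholds, the box-shaped trust region, and the test function are assumptions for the example.

```python
import numpy as np
from scipy.optimize import minimize

def fd_gradient(f, x, h=1e-6):
    """Forward-difference gradient of f at x."""
    g = np.zeros_like(x)
    fx = f(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    return g

def sbo_minimize(truth_f, x0, radius=0.5, max_iters=50, min_radius=1e-8):
    """Trust-region SBO with a 1st-order Taylor surrogate rebuilt at each center."""
    x_c = np.asarray(x0, dtype=float)
    f_c = truth_f(x_c)
    for _ in range(max_iters):
        grad_c = fd_gradient(truth_f, x_c)                 # local 1st-order consistency
        surrogate = lambda x, xc=x_c, fc=f_c, g=grad_c: fc + g @ (np.asarray(x) - xc)
        bounds = list(zip(x_c - radius, x_c + radius))     # box-shaped trust region
        x_star = minimize(surrogate, x_c, bounds=bounds).x # cheap subproblem solve
        f_star = truth_f(x_star)                           # one new truth evaluation
        predicted = f_c - surrogate(x_star)
        rho = (f_c - f_star) / predicted if predicted > 1e-14 else 0.0
        if rho >= 0.75:                                    # good prediction: accept, expand
            x_c, f_c, radius = x_star, f_star, 2.0 * radius
        elif rho >= 0.25:                                  # fair prediction: accept
            x_c, f_c = x_star, f_star
        else:                                              # poor prediction: reject, shrink
            radius *= 0.25
        if radius < min_radius:
            break
    return x_c, f_c

# Usage with a cheap stand-in for an expensive "truth" model (Rosenbrock):
rosen = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
print(sbo_minimize(rosen, [-1.2, 1.0]))
```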

  4. Trust Region SBO – Data Fit Case [Figure: sequence of trust regions; surrogate fits (kriging, quadratic polynomial, neural network, splines) of f over (x1, x2)] • Characteristics: smoothing, O(10^1) variables, local consistency • Data fit surrogates: • Global: polynomial response surfaces, artificial neural networks, splines, kriging models, Gaussian processes • Local: 1st/2nd-order Taylor series • Multipoint: Two-Point Exponential Approximation (TPEA), Two-Point Adaptive Nonlinearity Approximation (TANA)
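
To make the global data-fit case concrete, here is a small sketch that fits a quadratic polynomial response surface by least squares and evaluates it in place of the expensive model; the basis, sample sites, and stand-in truth function are assumptions invented for the example.

```python
import numpy as np

def fit_quadratic_rs(X, f_vals):
    """Least-squares fit of f(x1,x2) ~ c0 + c1*x1 + c2*x2 + c3*x1^2 + c4*x1*x2 + c5*x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1 * x2, x2**2])
    coef, *_ = np.linalg.lstsq(A, f_vals, rcond=None)
    return lambda x: coef @ np.array([1.0, x[0], x[1], x[0]**2, x[0] * x[1], x[1]**2])

# Sample a stand-in truth model at a few design sites inside a trust region,
# then evaluate the cheap surrogate in place of the expensive simulation.
rng = np.random.default_rng(0)
truth = lambda x: np.sin(3.0 * x[0]) + x[1]**2           # stand-in truth model
X = rng.uniform(-1.0, 1.0, size=(15, 2))                 # design sites
surrogate = fit_quadratic_rs(X, np.array([truth(x) for x in X]))
print(surrogate([0.3, -0.2]), truth([0.3, -0.2]))
```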

  5. Trust Region SBO – Multifidelity Case [Figure: sequence of trust regions] • Multifidelity surrogates: loosened solver tolerances; coarsened discretizations (e.g., mesh size); omitted physics (e.g., Euler CFD, panel methods) • Multifidelity models in SBO: • Truth evaluation only at center of trust region • Smoothing character is lost → requires a well-behaved low-fidelity model • Correction quality is crucial: • Additive: f_HF(x) = f_LF(x) + α(x) • Multiplicative: f_HF(x) = f_LF(x)·β(x) • Combined (convex combination of additive/multiplicative) • May require a design vector mapping: space mapping, corrected space mapping, POD mapping, hybrid POD space mapping
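
A minimal sketch of the zeroth-order additive and multiplicative corrections named above, anchored at the trust-region center x_c; first- and second-order variants additionally match gradients and Hessians of the correction (see slides 23-24). The high/low-fidelity pair below is a stand-in, not a simulation hierarchy.

```python
import numpy as np

def corrected_low_fidelity(f_hi, f_lo, x_c):
    """Zeroth-order additive and multiplicative corrections anchored at x_c."""
    x_c = np.asarray(x_c, dtype=float)
    alpha = f_hi(x_c) - f_lo(x_c)          # alpha(x_c): additive offset
    beta = f_hi(x_c) / f_lo(x_c)           # beta(x_c): multiplicative factor
    f_add = lambda x: f_lo(x) + alpha      # f_HF(x) ~ f_LF(x) + alpha
    f_mult = lambda x: f_lo(x) * beta      # f_HF(x) ~ f_LF(x) * beta
    return f_add, f_mult

# Stand-in high/low-fidelity pair: sin(x) and its truncated Taylor series.
f_hi = lambda x: np.sin(x[0])
f_lo = lambda x: x[0] - x[0]**3 / 6.0
f_add, f_mult = corrected_low_fidelity(f_hi, f_lo, [0.5])
x = np.array([0.6])
print(f_hi(x), f_add(x), f_mult(x))        # corrected models track f_hi near x_c
```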

  6. Outline Intro to Surrogate-Based Optimization • Data fit case • Model hierarchy case Algorithmic Components • Approximate subproblem • Iterate acceptance logic • Merit functions • Constraint relaxation • Corrections and convergence Computational Experiments • Barnes • CUTE • MEMS design

  7. SBO Algorithm Components: Approximate Subproblem Formulations • TRAL • IPTRSAO • SQP-like • Direct surrogate
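
These formulations differ in which surrogate objective (original function, Lagrangian, augmented Lagrangian) and which constraints (original, linearized, none) appear in the trust-region subproblem, as listed on slide 12. As an illustration of the direct-surrogate variant (surrogate objective with surrogate constraints, per the conclusions on slide 17), here is a hedged sketch in an infinity-norm trust region; the objective, constraint, and the SLSQP solver choice are assumptions for the example.

```python
import numpy as np
from scipy.optimize import minimize

def solve_direct_surrogate_subproblem(f_hat, g_hat, x_c, radius):
    """min f_hat(x)  s.t.  g_hat(x) <= 0  and  ||x - x_c||_inf <= radius."""
    bounds = list(zip(x_c - radius, x_c + radius))            # trust-region box
    cons = [{"type": "ineq", "fun": lambda x: -g_hat(x)}]     # SLSQP expects fun(x) >= 0
    res = minimize(f_hat, x_c, method="SLSQP", bounds=bounds, constraints=cons)
    return res.x

# Made-up surrogate objective and constraint for illustration:
f_hat = lambda x: (x[0] - 2.0)**2 + (x[1] + 1.0)**2
g_hat = lambda x: np.array([x[0] + x[1] - 1.0])               # g_hat(x) <= 0
print(solve_direct_surrogate_subproblem(f_hat, g_hat, np.array([0.0, 0.0]), 0.5))
```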

  8. SBO Algorithm Components: Iterate Acceptance Logic • Trust region ratio: ratio of actual to predicted improvement • Filter method: uses the concept of Pareto optimality applied to objective and constraint violation • Still need logic to adapt TR size for accepted steps
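
A minimal sketch of filter-based acceptance, assuming a plain (h, f) = (constraint violation, objective) filter with no envelope or margin terms: the candidate is accepted when no filter entry dominates it, and entries it dominates are pruned.

```python
def filter_accept(filter_pts, h_new, f_new):
    """filter_pts: list of (h, f) pairs, h = constraint violation, f = objective.
    Accept the new iterate if no existing entry dominates it; then prune entries
    that the new point dominates."""
    if any(h <= h_new and f <= f_new for h, f in filter_pts):
        return False, filter_pts                               # dominated: reject
    kept = [(h, f) for h, f in filter_pts if not (h_new <= h and f_new <= f)]
    return True, kept + [(h_new, f_new)]

# Usage: the candidate (h=0.2, f=2.0) is not dominated by either filter entry.
flt = [(0.0, 3.0), (0.5, 1.0)]
accepted, flt = filter_accept(flt, 0.2, 2.0)
print(accepted, flt)
```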

  9. SBO Algorithm Components: Merit Function Selections • Penalty • Adaptive penalty: adapts r_p using monotonic increases in the iteration offset value in order to accept any iterate that reduces the constraint violation → mimics a (monotonic) filter • Lagrangian, with multiplier estimation • Augmented Lagrangian (derived from elimination of slacks), with multiplier estimation that drives the Lagrangian gradient toward the KKT condition
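
For orientation, hedged point evaluations of two of these merit functions, using the standard quadratic-penalty and slack-eliminated augmented-Lagrangian forms; the adaptive r_p and multiplier update rules referenced on the slide are not reproduced here, and the example data are assumptions.

```python
import numpy as np

def penalty_merit(f_val, g_val, r_p):
    """Quadratic penalty merit for inequality constraints g(x) <= 0."""
    viol = np.maximum(0.0, g_val)
    return f_val + r_p * np.sum(viol**2)

def augmented_lagrangian_merit(f_val, g_val, lam, r_p):
    """Slack-eliminated augmented Lagrangian: psi_i = max(g_i, -lam_i / (2 r_p))."""
    psi = np.maximum(g_val, -lam / (2.0 * r_p))
    return f_val + np.dot(lam, psi) + r_p * np.sum(psi**2)

# Point evaluation for min x0^2 + x1^2 subject to 1 - x0 <= 0 at x = (0.5, 0):
f_val = 0.25
g_val = np.array([0.5])                       # constraint violation of 0.5
print(penalty_merit(f_val, g_val, r_p=10.0),
      augmented_lagrangian_merit(f_val, g_val, lam=np.array([1.0]), r_p=10.0))
```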

  10. SBO Algorithm Components: Constraint Relaxation [Figure: approximate subproblem vs. relaxed subproblem — relax the nonlinear constraints] • Composite step: Byrd-Omojokun, Celis-Dennis-Tapia, MAESTRO • Homotopy: heuristic (IPTRSAO-like), probability one • Estimate τ so that a portion of the trust-region volume remains feasible for the relaxed subproblem
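
A generic sketch of homotopy-style relaxation (not the specific IPTRSAO heuristic or the probability-one variant): the allowed violation is tied to the violation at the trust-region center and shrinks as the relaxation parameter tau goes to 1, recovering the original constraints; the schedule and example are assumptions.

```python
import numpy as np

def relaxed_constraint(g, x_c, tau):
    """Relax g(x) <= 0 to g(x) <= (1 - tau) * max(g(x_c), 0); tau -> 1 restores it."""
    allowance = (1.0 - tau) * np.maximum(g(x_c), 0.0)
    return lambda x: g(x) - allowance            # relaxed form: g(x) - allowance <= 0

g = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0])      # unit-disk constraint
x_c = np.array([1.5, 0.0])                              # infeasible trust-region center
for tau in (0.25, 0.5, 1.0):
    # Relaxed violation at x_c grows back to the true violation as tau -> 1.
    print(tau, relaxed_constraint(g, x_c, tau)(x_c))
```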

  11. Outline Intro to Surrogate-Based Optimization • Data fit case • Model hierarchy case Algorithmic Components • Approximate subproblem • Iterate acceptance logic • Merit functions • Constraint relaxation • Corrections and convergence Computational Experiments • Barnes • CUTE • MEMS design

  12. Data Fit SBO Experiment #1 – Barnes Problem: Algorithm Component Test Matrices • Surrogates: linear polynomial, quadratic polynomial, 1st-order Taylor series, TANA • Approximate subproblem: • Function: f(x), Lagrangian, augmented Lagrangian • Constraints: g(x)/h(x), linearized, none • Iterate acceptance logic: trust region ratio, filter • Merit function: penalty, adaptive penalty, Lagrangian, augmented Lagrangian • Constraint relaxation: heuristic homotopy, none

  13. Data Fit SBO Experiment #2 – Barnes Problem: Effect of Constraint Relaxation [Figure: sample homotopy constraint relaxation; 106 starting points on a uniform grid] • When the optimal and feasible directions differ by < 90°, relaxation is beneficial to achieve balance • When the optimal and feasible directions are opposed, relaxation is harmful since feasibility must ultimately take precedence

  14. Data Fit SBO Experiment #3 – Selected CUTE Problems: Test Matrix Summaries • Surrogates: TANA better than local • Approximate subproblem • Function: f(x) • Constraints: g(x)/h(x) • Iterate acceptance logic: TR ratio varies considerably; filter more consistent/robust • Merit function: no clear preference • Constraint relaxation: no clear preference

  15. Multifidelity SBO Testing: Bi-Stable MEMS Switch [Figure: switch geometry (38.6x); 13 design variables: W_i, L_i, θ_i]

  16. MEMS Design Results: Single-Fidelity and Multifidelity [Figure: convergence histories for NPSOL and DOT] • Current best SBO approaches carried forward: • Approximate subproblem: original function, original constraints • Augmented Lagrangian merit function • Trust region ratio acceptance • Note: both single-fidelity runs fail with forward differences

  17. Concluding Remarks • Data fit SBO performance comparison: • Primary conclusions • Approximate subproblem: direct surrogate (original objective, original constraints) • Iterate acceptance: TR ratio more variable, filter more robust • Secondary conclusions • Merit function: augmented Lagrangian, adaptive penalty • Constraint relaxation: homotopy; adaptive logic needed • Multifidelity SBO results: • Robustness: SBO less sensitive to gradient accuracy • Expense: equivalent high-fidelity work reduced by ~3x • Future work: • Penalty-free, multiplier-free approaches (filter-based merit function/TR resizing) • Constraint relaxation linking the optimal/feasible directions to the relaxation parameter • Investigation of SBO performance for alternate subproblem formulations

  18. DAKOTA • Capabilities available in DAKOTA v4.0+ • Available for download: http://www.cs.sandia.gov/DAKOTA • Paper containing more details, available for download: http://www.cs.sandia.gov/~dmdunla/publications/Eldred_FSB_2006.pdf • Questions/comments: dmdunla@sandia.gov

  19. Extra Slides

  20. Data Fit SBO Experiment #1 – Barnes Problem: Algorithm Component Test Matrices [Table: results for starting points (30,40), (65,1), and (10,20)]

  21. Data Fit SBO Experiment #1 – Barnes Problem: Algorithm Component Test Matrices [Table: results for starting points (30,40), (65,1), and (10,20)]

  22. Data Fit SBO Experiment #3 – Selected CUTE Problems: Test Matrix Summaries [Table: 3 top performers for each problem, listed as SP Obj – SP Con – Surr – Merit – Accept – Con Relax]

  23. SBO Algorithm Components: Corrections and Convergence • Corrections: • Additive, multiplicative, combined • Zeroth-order, first-order, second-order (full, finite-difference, quasi) • Soft convergence assessment: diminishing returns • Hard convergence assessment: • Bound-constrained linear least squares (LLS) update of the multipliers for the standard Lagrangian minimizes the KKT residual • If feasible and the KKT residual < tolerance, then hard convergence is achieved
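
A hedged sketch of the hard-convergence test: the slide's bound-constrained LLS multiplier update is approximated here with SciPy's nonnegative least squares (inequality multipliers only), followed by a feasibility and KKT-residual check; the tolerance and example data are assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def hard_converged(grad_f, jac_g, g_val, tol=1e-6):
    """Hard-convergence test for min f(x) s.t. g(x) <= 0 at the current iterate.
    Multipliers come from a nonnegative least-squares fit that minimizes the
    Lagrangian gradient (KKT residual)."""
    lam, _ = nnls(jac_g.T, -grad_f)                    # min ||grad_f + J^T lam||, lam >= 0
    kkt_residual = np.linalg.norm(grad_f + jac_g.T @ lam)
    feasible = bool(np.all(g_val <= tol))
    return feasible and kkt_residual <= tol, lam, kkt_residual

# Check at the solution x* = (1, 0) of min x0^2 + x1^2 s.t. 1 - x0 <= 0:
grad_f = np.array([2.0, 0.0])                          # gradient of the objective
jac_g = np.array([[-1.0, 0.0]])                        # Jacobian of g(x) = 1 - x0
print(hard_converged(grad_f, jac_g, g_val=np.array([0.0])))
```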

  24. Local Correction Derivations • Exact additive/multiplicative corrections • Approximate A, B with 2nd-order Taylor series centered at x_c
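
A hedged reconstruction of the derivation outline, using the additive/multiplicative relations from slide 5; the exact notation on the original slide may differ.

```latex
% Exact corrections implied by f_HF(x) = f_LF(x) + alpha(x) and f_HF(x) = f_LF(x)*beta(x):
\[
  \alpha(x) = f_{HF}(x) - f_{LF}(x), \qquad
  \beta(x)  = \frac{f_{HF}(x)}{f_{LF}(x)} .
\]
% Approximate alpha, beta by second-order Taylor series A, B centered at x_c:
\[
  A(x) = \alpha(x_c) + \nabla\alpha(x_c)^T (x - x_c)
       + \tfrac{1}{2}\,(x - x_c)^T \nabla^2\alpha(x_c)\,(x - x_c),
\]
\[
  B(x) = \beta(x_c) + \nabla\beta(x_c)^T (x - x_c)
       + \tfrac{1}{2}\,(x - x_c)^T \nabla^2\beta(x_c)\,(x - x_c).
\]
% Corrected low-fidelity models, consistent with f_HF at x_c:
\[
  \hat{f}_{HF}^{\,\mathrm{add}}(x) = f_{LF}(x) + A(x), \qquad
  \hat{f}_{HF}^{\,\mathrm{mult}}(x) = f_{LF}(x)\, B(x).
\]
```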

  25. Multipoint Correction Derivation • Combined additive/multiplicative (notional): assume a weighted sum of the α and β corrections which preserves consistency • Enforce an additional matching condition at a previous point x_p (e.g., the previous center x_c or a rejected iterate x*) • Can be used to preserve global accuracy while satisfying local consistency

  26. DAKOTA Framework [Block diagram of the DAKOTA framework] • Strategy: control of multiple iterators and models; coordination may be nested, layered, cascaded, concurrent, or adaptive/interactive; strategies include Hybrid, SurrBased, OptUnderUnc, UncOfOptima, Pareto/MStart, 2ndOrderProb, Branch&Bound/PICO • Iterators: Optimizer (NPSOL, DOT, CONMIN, NLPQL, OPT++, COLINY, JEGA), LeastSq (NLSSOL, GN, NL2SOL), DoE (DDACE, CCD/BB, QMC/CVT), UQ (LHS/MC, Reliability, SFEM, DSTE), ParamStudy (Vector, List, Center, MultiD) • Model: Parameters (design: continuous/discrete; uncertain: normal/logn, uniform/logu, triangular, beta/gamma, EV I/II/III, histogram, interval; state: continuous/discrete), Interface (application: system, fork, direct, grid; approximation: global polynomial 1/2/3, NN, kriging, MARS, RBF; multipoint: TANA3; local: Taylor series; hierarchical), Responses (objectives, constraints, least sq. terms, generic; gradients: numerical/analytic; Hessians: numerical/analytic/quasi) • Parallelism: asynchronous local, message passing, hybrid; 4 nested levels with master-slave/dynamic or peer/static scheduling
