
Engineering Optimization



Presentation Transcript


  1. Engineering Optimization: Concepts and Applications. Fred van Keulen, Matthijs Langelaar, CLA H21.1, A.vanKeulen@tudelft.nl

  2. Contents • Optimization problem checking and simplification • Model simplification

  3. Model simplification • Basic idea: replace the expensive model in the optimization loop by a cheap approximate model [Figure: optimizer coupled to the expensive model vs. optimizer coupled to the cheap model] • Motivation: • Replacement of an expensive function that is evaluated many times • Interaction between different disciplines • Estimation of derivatives • Noise

  4. Model simplification (2) • Procedure: extract information from the expensive model, then construct an approximation from it • Drawback: loss of accuracy • Different ranges: local, mid-range, global • Synonyms: • Approximation models • Metamodels • Surrogate models • Compact models • Reduced order models

  5. Model simplification (3) • Information extraction: linked to techniques from physical experiments: “plan of experiments” / DoE • Many approaches! Covered here: • Taylor series expansions • Exact fitting • Least squares fitting (response surface techniques) • Kriging • Reduced basis methods • Briefly: neural nets, genetic programming, simplified physical models • Crucial: purpose, range and level of detail

  6. Taylor series expansions • Approximation based on local information around the expansion point x0: f(x) ≈ f(x0) + f′(x0)(x − x0) + ½ f″(x0)(x − x0)² + … • Use of derivative information! • Truncating the series introduces a truncation error! • Valid in a neighbourhood of x0 (see the sketch below)
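To make the truncation error concrete, here is a minimal Python sketch (not from the slides; the test function exp(x) and the evaluation point are invented for illustration) comparing truncated Taylor approximations with the exact value:

    import math

    def taylor_exp(x, order, x0=0.0):
        # Taylor series of exp(x) around x0, truncated after 'order' terms
        return sum(math.exp(x0) * (x - x0)**k / math.factorial(k)
                   for k in range(order + 1))

    x = 1.0
    for order in (1, 2, 4, 8):
        approx = taylor_exp(x, order)
        print(f"order {order}: approx = {approx:.6f}, "
              f"truncation error = {abs(math.exp(x) - approx):.2e}")

The error shrinks rapidly near the expansion point but grows once x moves away from x0, which is exactly the behaviour shown on the next slide.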

  7. Taylor approximation example [Figure: a function and its Taylor approximations around x = 20 of 1st through 5th order and 20th order; higher orders stay accurate over a wider neighbourhood]

  8. Exact fitting (interpolation) • # datapoints = # fitting parameters • Every datapoint reproduced exactly • Example: [Figure: interpolating curve passing exactly through datapoints (x1, f1) and (x2, f2)]

  9. Exact fitting (2) • Often used: polynomials, generalized polynomials • Easy for intrinsically linear functions (linear in the fitting parameters; see the sketch below) • No smoothing / filtering / noise reduction • Danger of oscillations with high-order polynomials
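A minimal sketch of exact polynomial fitting, assuming numpy (the datapoints are invented): with as many coefficients as datapoints, solving the Vandermonde system reproduces every point exactly.

    import numpy as np

    # n datapoints -> polynomial with n coefficients (degree n - 1)
    x = np.array([0.0, 1.0, 2.0, 3.0])
    f = np.array([1.0, 2.0, 0.0, 5.0])

    V = np.vander(x, increasing=True)   # Vandermonde matrix: V[i, k] = x_i**k
    a = np.linalg.solve(V, f)           # exact fit: solve V a = f

    print(np.allclose(V @ a, f))        # True: every datapoint reproduced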

  10. Oscillations [Figure: 5th order and 9th order polynomial interpolants of the same datapoints; the 9th order polynomial oscillates strongly between the points] • Referred to as the “Runge phenomenon” • In practice: use order 6 or less
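The effect is easy to reproduce (an illustrative sketch, not from the slides): interpolating Runge's function 1/(1 + 25x²) on equispaced points, the maximum error grows with polynomial order instead of shrinking.

    import numpy as np

    runge = lambda x: 1.0 / (1.0 + 25.0 * x**2)
    xx = np.linspace(-1, 1, 1001)                   # dense grid to measure the error

    for order in (4, 6, 10, 14):
        xi = np.linspace(-1, 1, order + 1)          # equispaced interpolation points
        coeffs = np.polyfit(xi, runge(xi), order)   # exact fit: degree = #points - 1
        err = np.max(np.abs(np.polyval(coeffs, xx) - runge(xx)))
        print(f"order {order}: max error = {err:.3f}")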

  11. Least squares fitting • Fewer fitting parameters than datapoints • Smoothing / filtering behaviour • “Best fit”? Minimize the sum of deviations? Deviations of opposite sign would cancel, so instead minimize the sum of squared deviations [Figure: scattered (x, f) datapoints with a smooth fitted curve]

  12. Least squares fitting (2) • Choose a fitting function that is linear in the parameters ai: f̃(x) = a1 g1(x) + a2 g2(x) + … + ak gk(x), with gi known basis functions • Short notation: f̃(x) = Σi ai gi(x)

  13. LS fitting (3) • Minimize the sum of squared errors over the parameters: S(a) = Σj (f̃(xj) − fj)² (an optimization problem!) • Since f̃ is linear in the ai, the conditions ∂S/∂ai = 0 form a linear system (the normal equations) for the parameters

  14. Polynomial LS fitting • Polynomial of degree m: f̃(x) = a0 + a1 x + a2 x² + … + am x^m, fitted to more than m + 1 datapoints (see the sketch below)
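A minimal polynomial least-squares sketch in Python (numpy assumed; the noisy test data is invented): with fewer coefficients than datapoints the residual is minimized rather than driven to zero.

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 25)
    f = x**2 + 0.05 * rng.standard_normal(x.size)   # noisy samples of a quadratic

    coeffs = np.polyfit(x, f, deg=2)                # 3 coefficients, 25 datapoints

    residuals = np.polyval(coeffs, x) - f
    print("coefficients:", coeffs)
    print("sum of squared errors:", np.sum(residuals**2))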

  15. Polynomial LS example [Figure: scattered samples on [-1, 1] with a quadratic and a 6th degree least-squares fit]

  16. Multidimensional LS fitting • Polynomial in multiple dimensions: f̃(x) = a0 + Σi ai xi + Σij aij xi xj + … • Number of coefficients for a quadratic polynomial in Rn: 1 + n + n(n + 1)/2 = (n + 1)(n + 2)/2 • Curse of dimensionality!

  17. Response surface • Generate datapoints through sampling: • Generate design points through Design of Experiments, e.g. a 2^n full factorial or a fractional factorial design [Figure: design points on the corners of a cube in (x1, x2, x3) space] • Evaluate responses • Fit analytical model • Check accuracy

  18. Latin Hypercube Sampling (LHS) • Popular method: LHS • Based on idea of Latin square: • Properties: • Space-filling • Any number of design points • Intended for box-like domains • Matlab: lhsdesign
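For illustration, a Latin hypercube design in Python via scipy (an assumed counterpart to the Matlab lhsdesign call named on the slide; the bounds are invented):

    from scipy.stats import qmc

    sampler = qmc.LatinHypercube(d=2, seed=0)   # 2 design variables
    sample = sampler.random(n=8)                # 8 space-filling points in [0, 1]^2

    # Scale from the unit square to the actual box-like design domain
    points = qmc.scale(sample, l_bounds=[0.0, -1.0], u_bounds=[10.0, 1.0])
    print(points)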

  19. (LS) Fit quality indicators • Accuracy? More / fewer terms? • Statistical quality indicators: • R² correlation measure: okay if > 0.6 • F-ratio (signal to noise): okay if >> 1 • Examine the residuals: they should be small and random! [Figure: residuals e plotted against xi, scattered around zero]
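A small sketch of the R² indicator (the formula R² = 1 − SSres/SStot is standard; the function and variable names are assumptions):

    import numpy as np

    def r_squared(f_data, f_fit):
        # Coefficient of determination of a fitted model
        ss_res = np.sum((f_data - f_fit)**2)            # residual sum of squares
        ss_tot = np.sum((f_data - np.mean(f_data))**2)  # total sum of squares
        return 1.0 - ss_res / ss_tot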

  20. Nonlinear LS • Linear LS: intrinsically linear fitting functions (linear in the ai) • Nonlinear LS: more complicated functions of the ai • More difficult to fit! (Nonlinear optimization problem) • Matlab: lsqnonlin
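A nonlinear LS sketch using scipy.optimize.least_squares as an assumed Python counterpart to lsqnonlin (the exponential model and noisy data are invented):

    import numpy as np
    from scipy.optimize import least_squares

    x = np.linspace(0, 4, 30)
    f = 2.5 * np.exp(-1.3 * x) + 0.02 * np.random.default_rng(1).standard_normal(30)

    # Model a0 * exp(a1 * x) is nonlinear in the parameter a1
    def residuals(a):
        return a[0] * np.exp(a[1] * x) - f

    result = least_squares(residuals, x0=[1.0, -1.0])  # needs a starting guess
    print("fitted parameters:", result.x)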

  21. LS pitfalls • Scattered data • Wrong choice of basis functions [Figures: two (x, f) sketches, one with widely scattered data and one where the fitted basis misses the true trend]

  22. Kriging • Named after D.G. Krige, mining engineer, 1951 • Statistical approach: correlation between neighbouring points • Interpolation by weighted sum: f̃(x) = Σi wi(x) fi • Weights depend on distance • A certain spatial correlation function is assumed (usually Gaussian); see the sketch below
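A bare-bones sketch of this weighted-sum interpolation with a Gaussian correlation function (a strongly simplified illustration: the correlation length is fixed and a zero mean is assumed, whereas full Kriging estimates such quantities from the data):

    import numpy as np

    xi = np.array([0.0, 1.0, 2.5, 4.0])      # sample locations
    fi = np.array([1.0, 0.2, -0.5, 0.8])     # sampled responses
    theta = 1.0                               # assumed correlation length parameter

    corr = lambda a, b: np.exp(-theta * (a[:, None] - b[None, :])**2)  # Gaussian

    R = corr(xi, xi)                          # correlation among the samples

    def predict(x):
        r = corr(np.atleast_1d(x), xi)        # correlation of x with the samples
        w = r @ np.linalg.inv(R)              # distance-dependent weights
        return w @ fi                         # weighted sum of sample values

    print(predict(1.0))   # reproduces the datapoint exactly: interpolation
    print(predict(1.7))   # prediction between samples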

  23. Kriging properties • Kriging interpolation is “most likely” in some sense (based on assumptions of the method) • Interpolation: no smoothing / filtering • Many variations exist! • Advantage: no need to assume form of interpolation function • Fitting process more elaborate than LS procedure

  24. Kriging example • Results depend strongly on the statistical assumptions and the method used [Figure: a dataset z(x, y) and the resulting Kriging interpolation]

  25. Reduced order model • Idea: describe the system in a reduced basis • Select a small number of “modes” to build the basis • Example: eigenmodes • Example application: structural dynamics

  26. N1 Nk k1 • Reduced system equations: kN NN Nk kN N1 Reduced order model (2) • Reduced basis:

  27. Reduced order models • Many approaches! • Selection of type and number of basis vectors • Dealing with nonlinearity / multiple disciplines • Active research topic • No interpolation / fitting, but approximate modeling

  28. Example: Aircraft model [Figure: structural model, mass model and aerodynamic model of an aircraft]

  29. Neural nets

  30. Neural nets [Figure: network of neurons mapping input x through transfer functions S(input) to an output f(x)] To determine the internal neuron parameters, neural nets must be trained on data.

  31. Neural net features • Versatile, can capture complex behavior • Filtering, smoothing • Many variations possible • Network • Number of neurons, layers • Transfer functions • Many training steps might be required (nonlinear optimization) • Matlab: see e.g. nndtoc
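As an illustration (using scikit-learn as an assumed Python alternative to the Matlab toolbox demo named above), training a small net to approximate a 1-D response:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    x = np.linspace(-1, 1, 200).reshape(-1, 1)
    f = np.sin(3 * x).ravel()                 # invented test response

    # One hidden layer of 20 neurons with tanh transfer functions
    net = MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                       max_iter=5000, random_state=0)
    net.fit(x, f)                             # training = nonlinear optimization

    print(net.predict([[0.5]]), np.sin(1.5))  # prediction vs. true value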

  32. Genetic programming [Figure: expression tree combining operators (+, /, ^2) and variables x1, x2, x3] • Building mathematical functions using an evolution-like approach • Approach a good fit by crossover and mutation of expressions

  33. Genetic programming • LS fitting with population of analytic expressions • Selection / evolution rules • Features: • Can capture very complex behavior • Danger of artifacts / overfitting • Quite expensive procedure

  34. Simplified physical models • Goal: capture trends from the underlying physics through a simpler model: • Lumped / Analytic / Coarse • Parameters fitted to “high-fidelity” data • Refinement: correction function, parameter functions ... [Figure: f(x) versus x, showing the simplified model plus a correction function]
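A minimal sketch of the correction-function idea (all models and data here are invented): fit a simple correction so that the simplified model plus the correction matches the high-fidelity samples.

    import numpy as np

    x_hifi = np.array([0.0, 0.5, 1.0, 1.5, 2.0])     # a few expensive sample points
    f_hifi = np.array([0.1, 0.9, 1.7, 2.2, 2.4])     # high-fidelity responses

    simple = lambda x: x                              # cheap physics-based trend

    # Fit a low-order correction function to the discrepancy
    corr_coeffs = np.polyfit(x_hifi, f_hifi - simple(x_hifi), deg=2)
    corrected = lambda x: simple(x) + np.polyval(corr_coeffs, x)

    print(corrected(0.75))                            # cheap trend-plus-correction prediction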

  35. Model simplification summary Many different approaches: • Local: Taylor series (needs derivatives) • Interpolation (exact fit): • (Polynomial) fitting • Kriging • Fitting: LS • Approximate modeling: reduced order / simplified models • Other: genetic programming, neural nets, etc

  36. Response surfaces in optimization [Figure: optimizer coupled to the expensive model vs. optimizer coupled to the cheap model] • Popular approach for computationally expensive problems: • DoE, generate samples (expensive) in part of the domain • Build response surface (cheap) • Perform optimization on the response surface (cheap) • Update the domain of interest, and repeat (see the sketch below) • Additional advantage: smoothens noisy responses • Easy to combine with parallel computing
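A condensed sketch of such a loop (illustrative only: the stand-in expensive model, the quadratic response surface and the trust-region update rule are assumptions, not the specific Multi-point Approximation Method of the next slide):

    import numpy as np
    from scipy.optimize import minimize_scalar

    expensive = lambda x: (x - 2.3)**2 + 0.1 * np.sin(8 * x)   # stand-in costly model

    center, radius = 0.0, 1.0
    for it in range(6):
        # DoE: a few expensive samples in the current trust region
        xs = np.linspace(center - radius, center + radius, 5)
        fs = expensive(xs)

        # Cheap response surface: quadratic least-squares fit
        coeffs = np.polyfit(xs, fs, deg=2)

        # Cheap optimization on the surrogate, restricted to the trust region
        res = minimize_scalar(lambda x: np.polyval(coeffs, x),
                              bounds=(center - radius, center + radius),
                              method="bounded")

        # Move the region of interest to the sub-optimal point and shrink it
        center, radius = res.x, 0.7 * radius

    print("approximate optimum:", center)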

  37. Example: Multi-point Approximation Method [Figure: design domain with (expensive) simulation points, a response surface over a trust region, sub-optimal points and the optimum]
