
INCLUDING UNCERTAINTY MODELS FOR SURROGATE BASED GLOBAL DESIGN OPTIMIZATION The EGO algorithm






Presentation Transcript


  1. STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION GROUP INCLUDING UNCERTAINTY MODELS FOR SURROGATE BASED GLOBAL DESIGN OPTIMIZATION The EGO algorithm Thanks to Felipe A. C. Viana

  2. BACKGROUND: SURROGATE MODELING Surrogates replace expensive simulations by simple algebraic expressions fit to data: the surrogate prediction ŷ(x) is an estimate of y(x). Examples: • Kriging (KRG) • Polynomial response surface (PRS) • Support vector regression (SVR) • Radial basis neural networks (RBNN) Differences between surrogates are larger in regions of low point density.

  3. BACKGROUND: UNCERTAINTY Some surrogates also provide an uncertainty estimate: the standard error s(x). Examples: kriging and polynomial response surfaces. This estimate is what EGO uses.
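The kriging standard error can be illustrated with a few lines of linear algebra. The following is a minimal hand-rolled sketch (a zero-mean, unit-variance "simple kriging" with Gaussian correlation and made-up toy data), not the talk's actual implementation:

```python
import numpy as np

def kriging_predict(X, y, Xnew, length_scale=0.5, nugget=1e-10):
    """Simple-kriging prediction and standard error (zero prior mean,
    unit prior variance, Gaussian correlation)."""
    corr = lambda A, B: np.exp(
        -((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
        / (2.0 * length_scale ** 2))
    K = corr(X, X) + nugget * np.eye(len(X))   # correlations among data points
    k = corr(Xnew, X)                          # correlations to new points
    mean = k @ np.linalg.solve(K, y)           # kriging prediction y_hat(x)
    var = 1.0 - np.einsum('ij,ji->i', k, np.linalg.solve(K, k.T))
    return mean, np.sqrt(np.maximum(var, 0.0)) # s(x)

# Toy 1-D data (hypothetical, for illustration only)
X = np.array([[0.0], [0.3], [0.6], [1.0]])
y = np.sin(2 * np.pi * X[:, 0])
mean, s = kriging_predict(X, y, np.array([[0.3], [0.45]]))
```

The property EGO relies on is visible here: s(x) vanishes at sampled points (x = 0.3 above) and grows in regions of low point density (x = 0.45).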

  4. KRIGING FIT AND THE IMPROVEMENT QUESTION • First we sample the function and fit a kriging model. • We note the present best solution (PBS). • At every x there is some chance of improving on the PBS. • Then we ask: assuming an improvement over the PBS, where is it likely to be largest?

  5. WHAT IS EXPECTED IMPROVEMENT? Consider the point x = 0.8 and the random variable Y, which represents the possible values of the function there. Its mean is the kriging prediction, which is slightly above zero.
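Under the kriging model, Y at a point x is normal with mean ŷ(x) and standard deviation s(x), so the expected improvement E[I(x)] = E[max(y_PBS − Y, 0)] has a closed form. A sketch using only the standard library (the variable names are assumptions of this sketch, not notation from the slides):

```python
import math

def expected_improvement(y_best, mean, s):
    """Closed-form EI for minimization, with Y ~ N(mean, s^2):
    EI = (y_best - mean) * Phi(u) + s * phi(u),  u = (y_best - mean) / s."""
    if s <= 0.0:                # no uncertainty: improvement is certain or impossible
        return max(y_best - mean, 0.0)
    u = (y_best - mean) / s
    Phi = 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))         # standard normal CDF
    phi = math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (y_best - mean) * Phi + s * phi
```

The two terms mirror the exploration/exploitation balance: the first grows as the prediction drops below the present best, the second as the uncertainty s grows.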

  6. EXPLORATION AND EXPLOITATION EGO maximizes E[I(x)] to find the next point to be sampled. • The expected improvement balances exploration and exploitation: it can be high either because of high uncertainty or because of a low surrogate prediction. • When can we say that the next point is "exploration"?
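Put together, one EGO cycle is: fit kriging, maximize EI over candidate points, evaluate the true function at the maximizer, and refit. A compact self-contained sketch on a toy 1-D function (everything here is illustrative; it is not the Hartman3 setup from the slides):

```python
import numpy as np
from math import erf, sqrt, pi

def krig(X, y, xs, ls=0.2, nugget=1e-10):
    """1-D simple-kriging mean and standard error on a grid xs."""
    c = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ls ** 2))
    K = c(X, X) + nugget * np.eye(len(X))
    kx = c(xs, X)
    mean = kx @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum('ij,ji->i', kx, np.linalg.solve(K, kx.T))
    return mean, np.sqrt(np.maximum(var, 0.0))

def ei(y_best, m, s):
    """Vectorized closed-form expected improvement (minimization)."""
    u = (y_best - m) / np.maximum(s, 1e-12)
    Phi = np.array([0.5 * (1 + erf(v / sqrt(2))) for v in u])
    phi = np.exp(-0.5 * u ** 2) / sqrt(2 * pi)
    return np.where(s > 1e-12, (y_best - m) * Phi + s * phi, 0.0)

f = lambda x: np.sin(3 * x)            # stand-in for the expensive simulation
X = np.array([0.0, 0.5, 1.0, 2.0])     # initial design of experiments
y = f(X)
xs = np.linspace(0.0, 2.0, 401)        # candidate points
for _ in range(6):                      # EGO: sample where EI is largest, refit
    m, s = krig(X, y, xs)
    x_next = xs[np.argmax(ei(y.min(), m, s))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
```

Each pass through the loop is one infill cycle; in practice the EI maximization is done with a global optimizer rather than a grid search.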

  7. THE BASIC EGO WORKS WITH KRIGING Comparing the root mean square error of (a) kriging and (b) support vector regression: we want to run EGO with the most accurate surrogate, but we have no uncertainty model for SVR.

  8. IMPORTATION AT A GLANCE

  9. HARTMAN3 EXAMPLE Hartman3 function (initially fitted with 20 points). After 20 iterations (i.e., a total of 40 points), we report the improvement (I) over the initial best sample.

  10. TWO OTHER DESIGNS OF EXPERIMENTS First and second designs of experiments.

  11. SUMMARY OF THE HARTMAN3 EXAMPLE Box plot of the difference between the improvements offered by different surrogates (out of 100 DOEs). In 34 of the 100 DOEs, KRG outperforms RBNN (and in those cases the difference between the improvements has a mean of only 0.8%).

  12. EGO WITH MULTIPLE SURROGATES Traditional EGO uses kriging to generate one point at a time. We use multiple surrogates to get multiple points per cycle.
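A hedged sketch of that cycle: each surrogate proposes the maximizer of its own EI, with the non-kriging surrogates importing the kriging standard error (the importation idea of slide 8), and all distinct proposals are evaluated in the same cycle. The surrogate means and s(x) below are toy stand-ins, not the talk's actual KRG/SVR/RBNN fits:

```python
import math

def normal_ei(y_best, mean, s):
    """Closed-form expected improvement for minimization, Y ~ N(mean, s^2)."""
    if s <= 0.0:
        return max(y_best - mean, 0.0)
    u = (y_best - mean) / s
    Phi = 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))
    phi = math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)
    return (y_best - mean) * Phi + s * phi

def multi_surrogate_cycle(means, krg_std, y_best, candidates):
    """One cycle: each surrogate's prediction, paired with the imported
    kriging standard error, proposes its own EI maximizer."""
    proposals = []
    for mean_fn in means:
        x_star = max(candidates,
                     key=lambda x: normal_ei(y_best, mean_fn(x), krg_std(x)))
        if x_star not in proposals:     # drop duplicate proposals
            proposals.append(x_star)
    return proposals                    # evaluate all of these in parallel

# Toy stand-ins for two surrogate predictions and the kriging s(x)
krg_mean = lambda x: (x - 0.7) ** 2
svr_mean = lambda x: (x - 0.6) ** 2 + 0.02
krg_std = lambda x: 0.3 * min(x, 1.0 - x)   # zero at the sampled ends
grid = [i / 200.0 for i in range(201)]
points = multi_surrogate_cycle([krg_mean, svr_mean], krg_std, 0.1, grid)
```

Because the proposals can be evaluated simultaneously, a cycle with two surrogates adds two data points for the wall-clock cost of one, which is the source of the speedup reported on slide 14.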

  13. POTENTIAL OF EGO WITH MULTIPLE SURROGATES Hartman3 function (100 DOEs with 20 points). Overall, the surrogates are comparable in performance.

  14. POTENTIAL OF EGO WITH MULTIPLE SURROGATES "krg" runs EGO for 20 iterations, adding one point at a time. "krg-svr" and "krg-rbnn" run 10 iterations, adding two points per cycle. Multiple surrogates offer good results in half the time!
