
Example of EGO with linear regression


Presentation Transcript


  1. Example of EGO with linear regression
  • EGO is usually applied to a kriging surrogate. However, it is applicable to any surrogate that provides normally distributed uncertainty about its predictions.
  • With kriging we often assume zero noise, so we will never sample a point twice. With noisy data that can happen.
  • We will use as an example the function f(x) = 2 − 5(x − 0.45)², which, for comparison, we will fit with a quadratic polynomial.
  • Data is noisy, with noise distributed as N(0, 0.3²).

  2. DOE and data
  x=lhsdesign(5,1);
  x' = 0.9474 0.2692 0.4620 0.0504 0.7099
  x=sort(x);
  x' = 0.0504 0.2692 0.4620 0.7099 0.9474
  obj=inline('2-5*(x-0.45).^2');
  f=obj(x);
  f' = 1.2015 1.8365 1.9993 1.6623 0.7629
  noise=randn(5,1);
  noise' = -2.1384 -0.8396 1.3546 -1.0722 0.9610
  fnoisy=f+0.3*noise;
  fnoisy' = 0.5600 1.5847 2.4056 1.3406 1.0512
  xg=linspace(0,1,21); fg=obj(xg);
  plot(xg,fg,x,fnoisy,'go','MarkerSize',20,'MarkerFaceColor','g','LineWidth',2)
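For readers without MATLAB, the same data-generation step can be sketched in Python (a hypothetical translation, not part of the original slides; the random LHS draw is replaced by the sorted sample printed above so that the numbers match the slides):

```python
import numpy as np

# Sorted LHS sample taken from the slide printout (instead of a fresh draw)
x = np.array([0.0504, 0.2692, 0.4620, 0.7099, 0.9474])

def obj(x):
    # True function from the slide: f(x) = 2 - 5*(x - 0.45)^2
    return 2 - 5 * (x - 0.45) ** 2

f = obj(x)
# Standard-normal noise values from the slide, scaled to N(0, 0.3^2)
noise = np.array([-2.1384, -0.8396, 1.3546, -1.0722, 0.9610])
fnoisy = f + 0.3 * noise
print(np.round(fnoisy, 4))
```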

  3. Fit
  xx=[ones(5,1),x,x.^2];
  [b,bint,r,rint,stats] = regress(fnoisy,xx);
  b' = 0.3015 6.6325 -6.3559
  bint =
    -1.9351   2.5381
    -4.0883  17.3532
   -16.6833   3.9716
  stats = 0.7819 3.5857 0.2181 0.2045
  fit=b(1)+b(2)*xg'+b(3)*(xg.^2)';
  hold on; plot(xg,fit,'r','LineWidth',2)
  xlabel('x'); legend('f','data','fit','Location','NorthEast')
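The quadratic fit can be reproduced in Python with ordinary least squares (a sketch assuming the x and fnoisy values printed on the slides; np.linalg.lstsq plays the role of MATLAB's regress):

```python
import numpy as np

x = np.array([0.0504, 0.2692, 0.4620, 0.7099, 0.9474])
fnoisy = np.array([0.5600, 1.5847, 2.4056, 1.3406, 1.0512])

# Design matrix [1, x, x^2], as in xx=[ones(5,1),x,x.^2]
X = np.column_stack([np.ones_like(x), x, x**2])
b, residual, rank, _ = np.linalg.lstsq(X, fnoisy, rcond=None)

xg = np.linspace(0, 1, 21)
fit = b[0] + b[1] * xg + b[2] * xg**2   # fitted curve on the plot grid
```

The coefficients agree with the slide's b' = 0.3015 6.6325 -6.3559 up to the rounding of the printed data.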

  4. Prediction variance
  • Recall the prediction-variance equation Var[ŷ(x)] = σ² xₘᵀ(XᵀX)⁻¹xₘ, where xₘ is the vector of basis functions at x and σ² is estimated by the error-variance estimate stats(4) from regress.
  xprimex=xx'*xx;
  xpxinv=inv(xprimex);
  xmg=[ones(21,1),xg',(xg.^2)'];
  variance=diag(xmg*xpxinv*xmg')*stats(4);
  sterr=sqrt(variance);
  plot(xg,fit,'r','LineWidth',2); hold on;
  plot(xg,2*sterr,'m','LineWidth',2);
  legend('fit','2*sterr','Location','NorthEast')
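The prediction-variance computation in Python (a hypothetical translation; the error-variance estimate stats(4) becomes the residual mean square SSE/(n − p)):

```python
import numpy as np

x = np.array([0.0504, 0.2692, 0.4620, 0.7099, 0.9474])
fnoisy = np.array([0.5600, 1.5847, 2.4056, 1.3406, 1.0512])
X = np.column_stack([np.ones_like(x), x, x**2])
b, residual, rank, _ = np.linalg.lstsq(X, fnoisy, rcond=None)

mse = residual[0] / (len(x) - X.shape[1])   # sigma^2 estimate = SSE/(n-p)

xg = np.linspace(0, 1, 21)
Xg = np.column_stack([np.ones_like(xg), xg, xg**2])
# Var[yhat(x)] = sigma^2 * x_m' (X'X)^(-1) x_m at each grid point
XtXinv = np.linalg.inv(X.T @ X)
variance = mse * np.einsum('ij,jk,ik->i', Xg, XtXinv, Xg)
sterr = np.sqrt(variance)
```

As expected for regression, the standard error grows toward the edges of the sampled range.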

  5. Expected improvement
  • Recall E[I(x)] = (y_PBS − ŷ)Φ(Δ) + s φ(Δ), with Δ = (y_PBS − ŷ)/s, where y_PBS is the present best sample, ŷ the prediction, and s the prediction standard error.
  pbs=min(fnoisy);
  delpbs=(pbs-fit)./sterr;
  phi=normpdf(delpbs);
  PHI=normcdf(delpbs);
  ei=(pbs-fit).*PHI+sterr.*phi;
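A Python version of the expected-improvement step (a sketch; the normal CDF is built from math.erf to avoid extra dependencies, and fit and sterr are recomputed from the data printed on the slides):

```python
import math
import numpy as np

def Phi(z):  # standard normal CDF, like MATLAB's normcdf
    return np.array([0.5 * (1 + math.erf(v / math.sqrt(2))) for v in z])

def phi(z):  # standard normal PDF, like MATLAB's normpdf
    return np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)

x = np.array([0.0504, 0.2692, 0.4620, 0.7099, 0.9474])
fnoisy = np.array([0.5600, 1.5847, 2.4056, 1.3406, 1.0512])
X = np.column_stack([np.ones_like(x), x, x**2])
b, residual, rank, _ = np.linalg.lstsq(X, fnoisy, rcond=None)

xg = np.linspace(0, 1, 21)
Xg = np.column_stack([np.ones_like(xg), xg, xg**2])
fit = Xg @ b
mse = residual[0] / (len(x) - X.shape[1])
sterr = np.sqrt(mse * np.einsum('ij,jk,ik->i', Xg, np.linalg.inv(X.T @ X), Xg))

pbs = fnoisy.min()                   # present best sample (minimization)
delpbs = (pbs - fit) / sterr
ei = (pbs - fit) * Phi(delpbs) + sterr * phi(delpbs)
```

With these data the expected improvement peaks at x = 0, which is why the next slide samples xnew = 0.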

  6. One more iteration – new fit
  xnew = 0
  x=[xnew; x]; newnoise=randn(1,1);
  newnoise = 0.1240
  noise=[newnoise; noise];
  fnoisy=obj(x)+0.3*noise;
  fnoisy' = 1.0247 0.5601 1.5847 2.4056 1.3406 1.0513
  % computer crash; data reinstated from Matlab printout
  xx=[ones(6,1),x,x.^2]; % design matrix rebuilt for six points
  [b,bint,r,rint,stats] = regress(fnoisy,xx)
  b' = 0.7132 4.9955 -5.0270
  bint =
    -0.3909   1.8172
    -1.2927  11.2836
   -11.6259   1.5719
  stats = 0.6806 3.1960 0.1805 0.2114
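Refitting after the new point in Python (a hypothetical translation of the step above; note that the design matrix must be rebuilt for six points):

```python
import numpy as np

# Data after adding xnew = 0, taken from the slide printout
x = np.array([0.0, 0.0504, 0.2692, 0.4620, 0.7099, 0.9474])
fnoisy = np.array([1.0247, 0.5601, 1.5847, 2.4056, 1.3406, 1.0513])

X = np.column_stack([np.ones_like(x), x, x**2])  # now 6 rows
b, residual, rank, _ = np.linalg.lstsq(X, fnoisy, rcond=None)
```

The refit coefficients agree with the slide's b' = 0.7132 4.9955 -5.0270 up to the rounding of the printed data.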

  7. Prediction variance and expected improvement
  [Plots of the prediction standard error and expected improvement for the refitted model were shown on this slide.]

  8. Expected feasibility example
  • Constraint-boundary example: consider the constraint f(x) = 1.6.
  • That should lead to sampling in the middle, where the fit crosses the constraint boundary.
  • Recall EF(x) = (ŷ − z̄)[2Φ(u) − Φ(u⁺) − Φ(u⁻)] − s[2φ(u) − φ(u⁺) − φ(u⁻)] + ε[Φ(u⁺) − Φ(u⁻)], with z̄ = 1.6, u = (z̄ − ŷ)/s, u± = (z̄ ± ε − ŷ)/s, and ε = 2s.
  u=(1.6-fit)./sterr; up=u+2; um=u-2;
  PHI=normcdf(u); PHIP=normcdf(up); PHIM=normcdf(um);
  phi=normpdf(u); phip=normpdf(up); phim=normpdf(um);
  first=(fit-1.6).*(2*PHI-PHIP-PHIM)
  second=-sterr.*(2*phi-phip-phim)
  third=2*sterr.*(PHIP-PHIM)
  efeas=first+second+third
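The expected-feasibility computation translated to Python (a sketch using the five-point fit from the earlier slides; an erf-based Φ stands in for normcdf):

```python
import math
import numpy as np

def Phi(z):  # standard normal CDF
    return np.array([0.5 * (1 + math.erf(v / math.sqrt(2))) for v in z])

def phi(z):  # standard normal PDF
    return np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)

x = np.array([0.0504, 0.2692, 0.4620, 0.7099, 0.9474])
fnoisy = np.array([0.5600, 1.5847, 2.4056, 1.3406, 1.0512])
X = np.column_stack([np.ones_like(x), x, x**2])
b, residual, rank, _ = np.linalg.lstsq(X, fnoisy, rcond=None)
xg = np.linspace(0, 1, 21)
Xg = np.column_stack([np.ones_like(xg), xg, xg**2])
fit = Xg @ b
mse = residual[0] / (len(x) - X.shape[1])
sterr = np.sqrt(mse * np.einsum('ij,jk,ik->i', Xg, np.linalg.inv(X.T @ X), Xg))

zbar = 1.6                           # constraint boundary f(x) = 1.6
u = (zbar - fit) / sterr
up, um = u + 2, u - 2                # epsilon = 2*sterr, in units of sterr
efeas = ((fit - zbar) * (2 * Phi(u) - Phi(up) - Phi(um))
         - sterr * (2 * phi(u) - phi(up) - phi(um))
         + 2 * sterr * (Phi(up) - Phi(um)))
```

Expected feasibility is largest near the points where the fit crosses the constraint boundary, i.e. toward the middle of the interval, as the slide anticipates.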

  9. Plots
  [Plots were shown on this slide.]
