This document delves into the application of the Efficient Global Optimization (EGO) technique using linear regression as a surrogate model. EGO is typically used with kriging, but it can be applied to any surrogate that provides normally distributed uncertainty about its predictions. We demonstrate this with noisy data and a quadratic fit, showcasing prediction variance, expected improvement, and an expected feasibility example. The analysis includes detailed statistical components and visualizations to illustrate the optimization process and the effectiveness of various sampling strategies.
Example of EGO with linear regression • EGO is usually applied to a kriging surrogate. However, it is applicable to any surrogate that provides normally distributed uncertainty about its predictions. • With kriging we often assume zero noise, so we will never sample a point twice. With noisy data that can happen. • We will use f(x) = 2 - 5(x - 0.45)^2 as an example, for comparison with a quadratic fit. • Data is noisy, with noise distributed as N(0, 0.3^2).
DOE and data
x = lhsdesign(5,1);
x' = 0.9474 0.2692 0.4620 0.0504 0.7099
x = sort(x);
x' = 0.0504 0.2692 0.4620 0.7099 0.9474
obj = inline('2-5*(x-0.45).^2');
f = obj(x);
f' = 1.2015 1.8365 1.9993 1.6623 0.7629
noise = randn(5,1);
noise' = -2.1384 -0.8396 1.3546 -1.0722 0.9610
fnoisy = f + 0.3*noise;
fnoisy' = 0.5600 1.5847 2.4056 1.3406 1.0512
xg = linspace(0,1,21); fg = obj(xg);
plot(xg,fg,x,fnoisy,'go','MarkerSize',20,'MarkerFaceColor','g','LineWidth',2)
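The DOE step above can be reproduced in Python with NumPy/SciPy; this is a sketch, not the author's code, and the name `true_objective` is an illustrative assumption:

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(0)

def true_objective(x):
    # The example function from the slides: f(x) = 2 - 5*(x - 0.45)^2
    return 2.0 - 5.0 * (x - 0.45) ** 2

# 5-point Latin hypercube design in [0, 1], sorted (mirrors lhsdesign(5,1); sort(x))
sampler = qmc.LatinHypercube(d=1, seed=0)
x = np.sort(sampler.random(n=5).ravel())

# Noisy observations: y = f(x) + 0.3 * N(0, 1), as in fnoisy = f + 0.3*noise
fnoisy = true_objective(x) + 0.3 * rng.standard_normal(5)
```

The actual sample values differ from the slide's because the random seeds differ.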
Fit
xx = [ones(5,1), x, x.^2];
[b,bint,r,rint,stats] = regress(fnoisy,xx);
b' = 0.3015 6.6325 -6.3559
bint =
   -1.9351    2.5381
   -4.0883   17.3532
  -16.6833    3.9716
stats = 0.7819 3.5857 0.2181 0.2045   % R^2, F, p-value, error variance estimate
fit = b(1) + b(2)*xg' + b(3)*(xg.^2)';
hold on; plot(xg,fit,'r','LineWidth',2)
xlabel('x'); legend('f','data','fit','Location','NorthEast')
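The same quadratic fit can be sketched in Python with ordinary least squares standing in for MATLAB's regress (data values taken from the slide):

```python
import numpy as np

x = np.array([0.0504, 0.2692, 0.4620, 0.7099, 0.9474])
fnoisy = np.array([0.5600, 1.5847, 2.4056, 1.3406, 1.0512])

# Design matrix [1, x, x^2], as in xx = [ones(5,1), x, x.^2]
X = np.column_stack([np.ones_like(x), x, x ** 2])
b, res, rank, sv = np.linalg.lstsq(X, fnoisy, rcond=None)

# Residual mean square, i.e. the error variance estimate (cf. stats(4) in regress)
resid = fnoisy - X @ b
sigma2 = resid @ resid / (len(x) - X.shape[1])
```

This recovers the slide's coefficients b ≈ (0.3015, 6.6325, -6.3559) and error variance ≈ 0.2045.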
Prediction variance • Recall the prediction-variance equation for linear regression, Var[yhat(x)] = sigma^2 * xm'(X'X)^-1 xm, with sigma^2 estimated by stats(4).
xprimex = xx'*xx;
xpxinv = inv(xprimex);
xmg = [ones(21,1), xg', (xg.^2)'];
variance = diag(xmg*xpxinv*xmg') * stats(4);
sterr = sqrt(variance);
plot(xg,fit,'r','LineWidth',2); hold on;
plot(xg,2*sterr,'m','LineWidth',2);
legend('fit','2*sterr','Location','NorthEast')
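A Python sketch of the same computation, mirroring variance = diag(xmg*inv(X'X)*xmg')*stats(4) with the slide's data (the error-variance value 0.2045 is taken from the printed stats):

```python
import numpy as np

x = np.array([0.0504, 0.2692, 0.4620, 0.7099, 0.9474])
X = np.column_stack([np.ones_like(x), x, x ** 2])
sigma2 = 0.2045  # residual mean square from the regression

# Prediction grid and its design matrix [1, xg, xg^2]
xg = np.linspace(0.0, 1.0, 21)
Xg = np.column_stack([np.ones_like(xg), xg, xg ** 2])

# Var[yhat(x)] = sigma^2 * x_m' (X'X)^{-1} x_m, evaluated row-by-row on the grid
xpx_inv = np.linalg.inv(X.T @ X)
variance = np.einsum('ij,jk,ik->i', Xg, xpx_inv, Xg) * sigma2
sterr = np.sqrt(variance)
```

As expected, the variance is smallest inside the data and grows toward the edges of [0, 1], which is why EGO tends to sample near the boundary of the design region.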
Expected improvement • Recall E[I(x)] = (pbs - yhat)*PHI(u) + s*phi(u), with u = (pbs - yhat)/s, where pbs is the present best sample, yhat the prediction, and s the standard error.
pbs = min(fnoisy);
delpbs = (pbs - fit)./sterr;
phi = normpdf(delpbs);
PHI = normcdf(delpbs);
ei = (pbs - fit).*PHI + sterr.*phi;
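A minimal Python sketch of the expected-improvement formula for minimization, using SciPy's normal distribution (the fit and sterr arrays here are illustrative placeholders, not the slide's values):

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(fit, sterr, pbs):
    """EI at points where the surrogate predicts N(fit, sterr^2);
    pbs is the present best (minimum) sampled value."""
    u = (pbs - fit) / sterr
    return (pbs - fit) * norm.cdf(u) + sterr * norm.pdf(u)

# Illustrative surrogate predictions and standard errors
fit = np.array([0.9, 1.4, 1.9, 1.6, 0.8])
sterr = np.array([0.4, 0.3, 0.3, 0.3, 0.4])
ei = expected_improvement(fit, sterr, pbs=0.56)
```

EI is always nonnegative: even where the prediction is worse than the present best, the uncertainty term sterr*phi(u) leaves some chance of improvement.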
One more iteration – new fit
xnew = 0
x = [xnew; x]; newnoise = randn(1,1);
newnoise = 0.1240
noise = [newnoise; noise];
fnoisy = obj(x) + 0.3*noise;
fnoisy' = 1.0247 0.5601 1.5847 2.4056 1.3406 1.0513
% computer crash and data reinstated from Matlab printout
xx = [ones(6,1), x, x.^2];   % design matrix rebuilt for the six points
[b,bint,r,rint,stats] = regress(fnoisy,xx)
b' = 0.7132 4.9955 -5.0270
bint =
   -0.3909    1.8172
   -1.2927   11.2836
  -11.6259    1.5719
stats = 0.6806 3.1960 0.1805 0.2114
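The iteration step (append the new sample at x = 0, rebuild the design matrix for six points, refit) can be sketched in Python; data values are from the slide and the new noise draw uses an arbitrary seed:

```python
import numpy as np

rng = np.random.default_rng(1)

def true_objective(x):
    return 2.0 - 5.0 * (x - 0.45) ** 2

x = np.array([0.0504, 0.2692, 0.4620, 0.7099, 0.9474])
fnoisy = np.array([0.5600, 1.5847, 2.4056, 1.3406, 1.0512])

# New point chosen by the acquisition criterion (x = 0 on the slide)
xnew = 0.0
x = np.concatenate([[xnew], x])
fnoisy = np.concatenate(
    [[true_objective(xnew) + 0.3 * rng.standard_normal()], fnoisy])

# The design matrix must be rebuilt for the enlarged data set before refitting
X = np.column_stack([np.ones_like(x), x, x ** 2])
b, *_ = np.linalg.lstsq(X, fnoisy, rcond=None)
```

Note that with noisy data the refit coefficients shift noticeably (on the slide, from (0.30, 6.63, -6.36) to (0.71, 5.00, -5.03)) because the new point carries its own noise.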
Expected feasibility example • Constraint boundary example: consider the constraint f(x) = 1.6. • That should lead to sampling in the middle, where the fit crosses 1.6. • Recall the expected feasibility function with epsilon = 2s: EF(x) = (yhat - zbar)[2*PHI(u) - PHI(u+) - PHI(u-)] - s[2*phi(u) - phi(u+) - phi(u-)] + 2s[PHI(u+) - PHI(u-)], where u = (zbar - yhat)/s, u+ = u + 2, u- = u - 2, and zbar = 1.6.
u = (1.6 - fit)./sterr;
up = u + 2; um = u - 2;
PHI = normcdf(u); PHIP = normcdf(up); PHIM = normcdf(um);
phi = normpdf(u); phip = normpdf(up); phim = normpdf(um);
first = (fit - 1.6).*(2*PHI - PHIP - PHIM);
second = -sterr.*(2*phi - phip - phim);
third = 2*sterr.*(PHIP - PHIM);
efeas = first + second + third;
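The three terms above can be sketched in Python with SciPy (fit and sterr are illustrative placeholder arrays, not the slide's grid values):

```python
import numpy as np
from scipy.stats import norm

def expected_feasibility(fit, sterr, zbar=1.6):
    """Expected feasibility for the constraint boundary f(x) = zbar,
    with the band half-width epsilon = 2*sterr, matching the slide."""
    u = (zbar - fit) / sterr
    up, um = u + 2.0, u - 2.0
    first = (fit - zbar) * (2 * norm.cdf(u) - norm.cdf(up) - norm.cdf(um))
    second = -sterr * (2 * norm.pdf(u) - norm.pdf(up) - norm.pdf(um))
    third = 2 * sterr * (norm.cdf(up) - norm.cdf(um))
    return first + second + third

# Illustrative predictions: the fourth point sits exactly on the boundary
fit = np.array([0.9, 1.4, 1.9, 1.6, 0.8])
sterr = np.array([0.4, 0.3, 0.3, 0.3, 0.4])
efeas = expected_feasibility(fit, sterr)
```

The criterion peaks where the prediction is closest to the constraint boundary, which is why it drives sampling toward the crossing points.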
Plots
[Figure: plots from the slide; images not recoverable]