

  1. Optimization with MATLAB A Hands-On Workshop Farrukh Nagi Universiti Teknologi MARA, Shah Alam Campus 25-26 June, 2014 http://metalab.uniten.edu.my/~farrukh/Utim/Utimoptim.zip

  2. Introduction To Non–linear Optimization PART I

  3. Contents • PART 1 – INTRODUCTION TO OPTIMIZATION • Introduction to Optimization • Fundamentals of Optimization • Mathematical Background • Unconstrained Optimization Methods • Gradient Descent Methods (Steepest Descent) • Least Squares Methods • Simplex Methods • Constrained Optimization

  4. PART 2 - MATLAB OPTIMIZATION • Matlab/Simulink Optimization Methods • Function Optimization • Minimization Algorithms • Unconstrained Optimization – fminunc, fminsearch • Optimization Options Settings • Constrained Optimization • Multi-objective Optimization – lsqnonlin • Optimization Toolbox >>optimtool %GUI • Environmental Science Optimization

  5. PART 3 - SIMULINK OPTIMIZATION • RESPONSE /IEEE_optim/… • Parametric Modeling • Parameter Identification • Parameter Passing with Component Block Input • Simulink Optimization Design (SOD) – GUI • Optimizer Output • Simulink Examples List • REFERENCES

  6. What is Optimization? • Optimization is an iterative process by which a desired solution (max/min) of a problem is found while satisfying all of its constraints or bounded conditions. Figure 2: The optimum solution is found while satisfying the constraints (the derivative must be zero at the optimum). • An optimization problem can be linear or non-linear. • Non-linear optimization is carried out by numerical 'search methods', which are applied iteratively until a solution is reached. • The search procedure is termed the algorithm.

  7. Optimization Methods • One-Dimensional Unconstrained Optimization: Golden-Section Search, Quadratic Interpolation, Newton's Method • Multi-Dimensional Unconstrained Optimization: non-gradient (direct) methods, gradient methods • Linear Programming (Constrained): Graphical Solution, Simplex Method • Genetic Algorithm (GA) – survival-of-the-fittest principle based upon evolutionary theory \IEEE_OPTIM_2012\GA\GA Presentation.ppt • Particle Swarm Optimization (PSO) – concept of the best solution in the neighborhood: \IEEE_OPTIM_2012\PSO\PSO Presentation.ppt • Others …….

  8. Fundamentals of Non-Linear Optimization • The solution of a linear problem lies on the boundaries of the feasible region (Figure 3: solution of a linear problem). • The solution of a non-linear problem lies within and on the boundaries of the feasible region (Figure 4: three-dimensional solution of a non-linear problem).

  9. Fundamentals of Non-Linear Optimization …Contd • Single objective function f(x): maximization or minimization • Design variables xi, i = 1, 2, 3, … • Constraints: inequality and equality • Figure 5: Example of design variables and constraints used in non-linear optimization — Maximize X1 + 1.5 X2 subject to: X1 + X2 ≤ 150, 0.25 X1 + 0.5 X2 ≤ 50, X1 ≥ 50, X2 ≥ 25, X1 ≥ 0, X2 ≥ 0 (solved with linprog in the sketch below). • Optimal points: • Local minimum/maximum point: a solution x* is a local optimum if no other x in its neighborhood gives a better objective value than x*. • Global minimum/maximum point: a solution x** is a global optimum if no other x in the entire search space gives a better objective value than x**.
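As a minimal MATLAB sketch, the linear program of Figure 5 can be handed directly to linprog; the constraint values are as reconstructed above, and since linprog minimizes, the objective is negated in order to maximize:

f  = -[1; 1.5];              % linprog minimizes, so negate to maximize x1 + 1.5*x2
A  = [1 1; 0.25 0.5];        % x1 + x2 <= 150,  0.25*x1 + 0.5*x2 <= 50
b  = [150; 50];
lb = [50; 25];               % x1 >= 50, x2 >= 25 (these also enforce x >= 0)
ub = [];                     % no upper bounds
[x, fval] = linprog(f, A, b, [], [], lb, ub);
fprintf('x1 = %.2f, x2 = %.2f, objective = %.2f\n', x(1), x(2), -fval);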

  10. Fundamentals of Non-Linear Optimization …Contd • Figure 6: Global versus local optimization. • Figure 7: A local optimum is also the global optimum if the function is convex. • A set S is convex if the line segment joining any two points in the set is also in the set (the figure shows examples of convex and non-convex sets).

  11. Fundamentals of Non-Linear Optimization …Contd • A function f is convex if f(Xa) is less than the value at the corresponding point on the line segment joining f(X1) and f(X2). • Convexity condition: the Hessian (2nd-order derivative) matrix of the function f must be positive semi-definite (eigenvalues positive or zero). Figure 8: Convex and non-convex sets. Figure 9: Convex function.

  12. Optimality Conditions • First-order condition (FOC): the gradient of f vanishes at the optimum. • Hessian: matrix of second derivatives of f with respect to the several variables. • Second-order condition (SOC): the eigenvalues of H(X*) are all positive, and the determinants of all lower-order principal minors of H(X*) are positive.
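Written out (a standard statement of these conditions for an unconstrained minimum; the slide's own formulas are images in the original):

\text{FOC:}\quad \nabla f(x^*) = 0
\qquad\qquad
\text{SOC:}\quad H(x^*) = \nabla^2 f(x^*)\ \text{is positive definite, i.e. } \lambda_i\bigl(H(x^*)\bigr) > 0\ \text{for all } i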

  13. Optimization Methods …Constrained • a) Indirect approach – transforming the problem into an unconstrained one: • b) Exterior Penalty Function (EPF) and Augmented Lagrange Multiplier (ALM) methods • c) Direct methods: Sequential Linear Programming (SLP), Sequential Quadratic Programming (SQP) and the Generalized Reduced Gradient method (GRG) • Figure 10: Gradient descent (LMS)

  14. Steepest descent method
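The equations on this slide are images in the original. A minimal MATLAB sketch of the steepest descent iteration x(k+1) = x(k) − α∇f(x(k)), using the same quadratic f(x) = 0.5 x1² + 2.5 x2² that appears in the fminunc demo on slide 17, could look like the following (the step size and tolerance are illustrative choices, not values from the slide):

f     = @(x) 0.5*x(1)^2 + 2.5*x(2)^2;   % objective (same as slide 17)
gradf = @(x) [x(1); 5*x(2)];            % analytic gradient
x     = [5; 1];                         % starting point, as in slide 17
alpha = 0.1;                            % fixed step size (illustrative)
for k = 1:500
    g = gradf(x);
    if norm(g) < 1e-6, break; end       % stop when the gradient is nearly zero
    x = x - alpha*g;                    % step in the steepest-descent direction
end
fprintf('iterations = %d, x = [%.4g %.4g], f(x) = %.3g\n', k, x(1), x(2), f(x));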

  15. Example

  16. Example (cont.)

  17. Matlab steepest descent – fminunc
% startpresentgrad.m  (driver script)
clc
clf
hold off
x0 = [5,1];
options = optimset('OutputFcn',@outfun);                                   % for graphic display
options = optimset(options,'LargeScale','off','Display','iter-detailed'); % iteration display (chained so earlier settings are kept)
options = optimset(options,'GradObj','on');                                % enable the user-supplied gradient
[x,fval] = fminunc(@myfuncon2,x0,options);

% objective function with user-supplied gradient
function [f,g] = myfuncon2(x)
f = 0.5*x(1)^2 + 2.5*x(2)^2;
if nargout > 1
    g(1) = x(1);        % gradient 1 supplied
    g(2) = 5*x(2);      % gradient 2 supplied
    foo = max(abs(g));  % infinity norm of the gradient (not used further)
end
end

% output function: draws the contours and marks the current iterate
function stop = outfun(x, optimValues, state)
stop = false;
ww1 = -6:0.05:6;
ww2 = ww1;
[w1,w2] = meshgrid(ww1,ww2);
J = -1*(0.5*w1.^2 + 2.5*w2.^2);
cs = contour(w1,w2,J,20);      % cs = surf(w1,w2,J); for a surface plot instead
hold on; grid on;
plot(x(1),x(2),'b+'); drawnow  % mark the current iterate
end

  18. Non-Linear least squares

  19. Non-Linear least squares
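The least-squares slides are image-only in the transcript. As a minimal sketch of the lsqnonlin solver named in the Part 2 contents, the following fits an exponential model y = a·exp(b·t) to synthetic data (the data and the model are illustrative, not from the slides):

t     = (0:0.5:5)';                                % sample times
y     = 2.0*exp(-0.8*t) + 0.02*randn(size(t));     % synthetic noisy measurements
resid = @(p) p(1)*exp(p(2)*t) - y;                 % residual vector r(p) = model - data
p0    = [1; -1];                                   % initial guess for [a; b]
opts  = optimset('Display','final');
p     = lsqnonlin(resid, p0, [], [], opts);        % minimizes sum(r(p).^2)
fprintf('fitted a = %.3f, b = %.3f\n', p(1), p(2));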

  20. Simplex Methods Minimize

  21. Derivative-free optimization Downhill simplex method
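In MATLAB the downhill simplex (Nelder–Mead) method is available as fminsearch; a minimal derivative-free sketch on the Rosenbrock test function (a standard example, not taken from the slide) is:

rosen = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;   % Rosenbrock test function
x0    = [-1.2, 1];                                   % classic starting point
opts  = optimset('Display','final','TolX',1e-8,'TolFun',1e-8, ...
                 'MaxIter',1000,'MaxFunEvals',1000);
[x, fval] = fminsearch(rosen, x0, opts);             % no gradients required
fprintf('x = [%g %g], f = %g\n', x(1), x(2), fval);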

  22. Example: constrained optimization

  23. Example (cont.)

  24. Example (cont.)

  25. Environmental Sciences Optimization Case Studies • 1. Fish Harvesting…/Utim/fishharvester/ • 2. River Pollution…../Utim/WWTP_RivPol/ • 3. Noise Pollution…./Utim/machine_noise/ • Data Fitting • 1. Hydrology .../Utim/hydrology/ • 2. Anthropometric.../Utim/anthropometry/

  26. MATLAB Optimization PART II

  27. MATLAB/SIMULINK OPTIMIZATION METHODS • M-files: custom code, @functions, scripts • MATLAB >> Command Window • MATLAB >> optimtool (GUI) • Simulink model (.mdl) with ports/block update and Model block • Simulink Design Optimization

  28. Function Optimization • Optimization concerns the minimization or maximization of functions. • Standard optimization problem: minimize the objective function subject to equality constraints, inequality constraints and side constraints (written out below). • Where f(x) is the objective function, which measures and evaluates the performance of a system. In a standard problem the function is minimized; maximization is equivalent to minimizing the negative of the objective function. x is a column vector of design variables, which can affect the performance of the system.
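The standard problem referred to above is conventionally written as (the slide's own formulas are images; this is the usual textbook form):

\min_{x}\ f(x)
\quad \text{subject to} \quad
h_j(x) = 0,\ \ j = 1,\dots,p
\qquad
g_i(x) \le 0,\ \ i = 1,\dots,m
\qquad
x_L \le x \le x_U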

  29. Function Optimization (Cont.) • Constraints – limitations on the design space; can be linear or nonlinear, explicit or implicit functions. • Equality constraints • Inequality constraints (most algorithms require the "less than or equal to" form) • Side constraints

  30. Optimization Toolbox • A collection of functions that extend the capability of MATLAB. The toolbox includes routines for: • Unconstrained optimization • Constrained nonlinear optimization, including goal attainment problems, minimax problems, and semi-infinite minimization problems • Quadratic and linear programming • Nonlinear least squares and curve fitting • Solving nonlinear systems of equations • Constrained linear least squares • Specialized algorithms for large-scale problems

  31. Unconstrained Minimization • Consider the problem of finding a set of values [x1 x2]T that minimizes f(x) = exp(x1)(4x1² + 2x2² + 4x1x2 + 2x2 + 1) • Steps: • Create an M-file that returns the function value (objective function); call it objfun.m • Then invoke the unconstrained minimization routine; use fminunc

  32. Step 1 – Obj. Function
function f = objfun(x)
f = exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);

  33. Step 2 – Invoke Routine
Starting with a guess:
x0 = [-1,1];
options = optimset('LargeScale','off');
[xmin,feval,exitflag,output] = fminunc('objfun',x0,options);
(Input arguments: the objective function, the initial guess and the optimization parameter settings; output arguments: the minimum point, the function value, the exit flag and an output structure.)
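The same call also works with a function handle, which is the form used elsewhere in these slides:

x0 = [-1, 1];
options = optimset('LargeScale','off');
[xmin, feval, exitflag, output] = fminunc(@objfun, x0, options);  % @objfun instead of 'objfun'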

  34. Results
xmin = 0.5000 -1.0000   (minimum point of the design variables)
feval = 1.3028e-010     (objective function value)
exitflag = 1            (tells whether the algorithm converged; if exitflag > 0, a local minimum was found)
output = iterations: 7, funcCount: 40, stepsize: 1, firstorderopt: 8.1998e-004, algorithm: 'medium-scale: Quasi-Newton line search'   (additional information)

  35. More on fminunc – Input [xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,…) • fun : the objective function (a function handle or name). • x0 : the initial guess; a vector whose size equals the number of design variables. • options : sets some of the optimization parameters. (More after a few slides.) • P1,P2,… : additional parameters passed to the objective function.

  36. More on fminunc – Output [xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,…) • xmin : vector of the minimum (optimal) point; its size is the number of design variables. • feval : the objective function value at the optimal point. • exitflag : a value showing whether the optimization routine terminated successfully (converged if > 0). • output : a structure giving more details about the optimization. • grad : the gradient value at the optimal point. • hessian : the Hessian value at the optimal point.

  37. Options Setting – optimset options = optimset('param1',value1,'param2',value2,…) • The routines in the Optimization Toolbox have a set of default optimization parameters. • However, the toolbox allows you to alter some of those parameters, for example: the tolerances, the step size, the gradient or Hessian values, the maximum number of iterations, etc. • There is also a list of features available, for example: displaying the values at each iteration, comparing the user-supplied gradient or Hessian, etc. • You can also choose the algorithm you wish to use.

  38. Options Setting (Cont.) options = optimset('param1',value1,'param2',value2,…) • Type help optimset in the command window and a list of available option settings will be displayed. • How to read it? For example: LargeScale - Use large-scale algorithm if possible [ {on} | off ] — the parameter (param1) is LargeScale, the values (value1) are on/off, and the default is the one shown in { }.

  39. Options Setting (Cont.) options = optimset('param1',value1,'param2',value2,…) LargeScale - Use large-scale algorithm if possible [ {on} | off ] Since the default is on, to turn it off we just type: options = optimset('LargeScale','off') and pass it to the input of fminunc.

  40. Useful Option Settings Highly recommended to use!!! • Display - Level of display [ off | iter | notify | final ] • MaxIter - Maximum number of iterations allowed [ positive integer ] • TolCon - Termination tolerance on the constraint violation [ positive scalar ] • TolFun - Termination tolerance on the function value [ positive scalar ] • TolX - Termination tolerance on X [ positive scalar ]
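For example, the recommended settings can be combined in one optimset call (the numeric tolerances below are illustrative values, not from the slide):

options = optimset('Display','iter', ...   % show progress at every iteration
                   'MaxIter', 400,  ...    % cap the number of iterations
                   'TolFun', 1e-8,  ...    % stop when the function value barely changes
                   'TolX',   1e-8);        % stop when the design variables barely change
options = optimset(options,'TolCon',1e-6); % TolCon applies to constrained solvers such as fmincon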

  42. fminunc and fminsearch • fminunc uses algorithms that exploit gradient and Hessian information. • Two modes: • Large-scale: interior-reflective Newton • Medium-scale: quasi-Newton (BFGS) • Not preferred for solving highly discontinuous functions. • This function may only give local solutions. • fminsearch is generally less efficient than fminunc for problems of order greater than two; however, when the problem is highly discontinuous, fminsearch may be more robust. • It is a direct search method that does not use numerical or analytic gradients, as fminunc does. • This function may only give local solutions.

  43. Constrained Minimization
[xmin,feval,exitflag,output,lambda,grad,hessian] = fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)
(lambda is the vector of Lagrange multipliers at the optimal point.)

  44. Example
function f = myfun(x)
f = -x(1)*x(2)*x(3);
Subject to (as set up in the fmincon call two slides below): 0 ≤ x1 + 2x2 + 2x3 ≤ 72, the nonlinear constraint 2x1² + x2 ≤ 0, and bounds 0 ≤ xi ≤ 30.

  45. Example (Cont.)
For the nonlinear constraint, create a function called nonlcon which returns the two constraint vectors [C,Ceq]:
function [C,Ceq] = nonlcon(x)
C = 2*x(1)^2 + x(2);
Ceq = [];
Remember to return an empty matrix if the constraint type does not apply.

  46. Example (Cont.)
Initial guess (3 design variables):
x0 = [10;10;10];
A = [-1 -2 -2; 1 2 2];
B = [0 72]';
LB = [0 0 0]';
UB = [30 30 30]';
[x,feval] = fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)
CAREFUL!!! The argument sequence is fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)

  47. Example (Cont.)
Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search).
> In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213
In D:\usr\CHINTANG\OptToolbox\min_con.m at line 6
Optimization terminated successfully: magnitude of directional derivative in search direction less than 2*options.TolFun and maximum constraint violation is less than options.TolCon
Active Constraints: 2 9
x = 0.00 0.00 16.231
feval = -4.657237250542452e-025
(The active-constraint numbers follow the sequence A, B, Aeq, Beq, LB, UB, C, Ceq, i.e. constraints 1 through 9.)

  47. >> optimtool - % GUI

  48. Constrained Optimization An optimization algorithm is large scale when it uses linear algebra that does not need to store, nor operate on, full matrices. In contrast, medium-scale methods internally create full matrices and use dense linear algebra. The definition of a constrained optimum is based on the Karush-Kuhn-Tucker (KKT) conditions. The KKT conditions are analogous to the condition that the gradient must be zero at a minimum, modified to take constraints into account; the difference is that the KKT conditions hold for constrained problems. The KKT conditions use the auxiliary Lagrangian function:
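Its standard form, with multipliers λ attached to the constraints, and the resulting KKT conditions are (the slide's own formula is an image in the original):

L(x,\lambda) = f(x) + \sum_i \lambda_{g,i}\, g_i(x) + \sum_j \lambda_{h,j}\, h_j(x)

\nabla_x L(x^*,\lambda^*) = 0, \qquad
\lambda_{g,i}^*\, g_i(x^*) = 0, \qquad
\lambda_{g,i}^* \ge 0, \qquad
g_i(x^*) \le 0, \qquad h_j(x^*) = 0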

  49. fmincon Algorithms fmincon has four algorithm options: a) interior-point, b) active-set, c) SQP, d) trust-region-reflective. a) An interior-point method is a linear or nonlinear programming method (Forsgren et al. 2002) that achieves optimization by going through the middle of the solid defined by the problem rather than around its surface. b) Active-set approach: equality constraints always remain in the active set Sk. The search direction dk is calculated so that it minimizes the objective function while remaining on the active constraint boundaries.

  50. Sequential Quadratic Programming and Trust Region c) Sequential quadratic programming (SQP) is an iterative method for nonlinear optimization. SQP methods are used on problems for which the objective function and the constraints are twice continuously differentiable. If the problem is unconstrained, the method reduces to Newton's method for finding a point where the gradient of the objective vanishes. If the problem has only equality constraints, the method is equivalent to applying Newton's method to the first-order optimality (Karush–Kuhn–Tucker) conditions of the problem. d) Trust-region-reflective: the basic idea is to approximate f with a simpler function q which reasonably reflects the behavior of f in a neighborhood N around the point x. This neighborhood is the trust region. A trial step s is computed by minimizing (or approximately minimizing) q over N; this is the trust-region subproblem. The current point is updated to x + s if f(x + s) < f(x); otherwise the current point remains unchanged and N, the region of trust, is shrunk and the trial-step computation is repeated.
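A minimal sketch of selecting one of these algorithms through the options structure, reusing myfun and nonlcon from the earlier example (in newer MATLAB releases optimoptions('fmincon','Algorithm','sqp') plays the same role):

options = optimset('Algorithm','sqp','Display','iter');   % or 'interior-point', 'active-set', 'trust-region-reflective'
[x, feval] = fmincon(@myfun, x0, A, B, [], [], LB, UB, @nonlcon, options);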
