
Interlude (Optimization and other Numerical Methods)



Presentation Transcript


  1. Interlude (Optimization and other Numerical Methods) Fish 458, Lecture 8

  2. Numerical Methods • Most fisheries assessment problems are mathematically too complicated to solve analytically, so we often have to resort to numerical methods. The two most frequent tasks requiring numerical solutions are: • Find the values for a set of parameters so that they satisfy a set of (non-linear) equations. • Find the values for a set of parameters that minimize some function. • Note: Numerical methods can (and do) fail – you need to know enough about them to be able to check for failure.

  3. Minimizing a Function • The problem: find the vector of parameters θ so that the function f(θ) is minimized (note: maximizing f(θ) is the same as minimizing -f(θ)). • We may place bounds on the values for the elements of θ (e.g. some must be positive). • By definition, for a minimum: df/dθi = 0 for every element θi.

  4. Analytic Approaches • Sometimes it is possible to solve the equations df/dθ = 0 directly. For example, for a linear model fitted by least squares, setting the derivatives of the sum of squared residuals to zero gives closed-form estimates of the parameters.

  5. Linear Models – The General Case • Generalizing to a linear model Y = Xb + e (X the matrix of independent variables, Y the vector of observations): • The solution to this case is b = (X'X)^-1 X'Y. • Exercise: check it for the simple case of a single straight-line regression.
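A minimal sketch of the normal-equations solution b = (X'X)^-1 X'Y for a straight-line fit, checked against the familiar slope and intercept formulas (the data are invented purely to illustrate the calculation):

```python
import numpy as np

# Normal-equations solution for a linear model Y = Xb + e:  b_hat = (X'X)^-1 X'Y.
# The data below are invented purely to illustrate the calculation.
rng = np.random.default_rng(1)
n = 50
x = rng.uniform(0.0, 10.0, n)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, n)   # "true" intercept 2, slope 0.5

X = np.column_stack([np.ones(n), x])          # design matrix: column of 1s, then x
b_hat = np.linalg.solve(X.T @ X, X.T @ y)     # solves (X'X) b = X'Y without forming the inverse
print("intercept and slope:", b_hat)

# Check against the simple straight-line formulas:
slope = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
intercept = y.mean() - slope * x.mean()
print("simple-case check:  ", intercept, slope)
```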

  6. Analytical Approaches-II • Use analytical approaches whenever possible. Finding analytical solutions for some of the parameters of a complicated model can substantially speed up the process of minimizing the function. • For example: q for the Dynamic Schaefer model:
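One common closed form for q (which may or may not be the exact one on the slide) assumes lognormal observation error, I_t = q B_t e^ε, under which the conditional estimate is q̂ = exp(mean(ln(I_t/B_t))). A minimal sketch with invented numbers:

```python
import numpy as np

# Conditional MLE for q under lognormal observation error, I_t = q * B_t * exp(eps_t):
#   ln(q_hat) = mean over t of [ ln(I_t) - ln(B_t) ]
# B would be the model-predicted biomass series; the numbers here are invented.
B = np.array([1000.0, 950.0, 880.0, 820.0, 790.0])   # predicted biomass
I = np.array([0.52, 0.47, 0.45, 0.40, 0.41])          # observed survey/CPUE index

q_hat = np.exp(np.mean(np.log(I / B)))
print("analytic q:", q_hat)

# Concentrating q out of the likelihood this way means the numerical minimizer
# only has to search over the remaining parameters (e.g. r and K).
```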

  7. Numerical Methods (Newton’s Method) Single variable version-I • We wish to find the value of x such that f(x) is at a minimum. • Guess a value for x. • Determine whether increasing or decreasing x will lead to a lower value for f(x) (based on the derivative). • Assess the slope and its change (first and second derivatives of f) to determine how far to move from the current value of x. • Change x based on step 3. • Repeat steps 2-4 until no further progress is made.

  8. Numerical Methods (Newton’s Method) Single variable version-II • Formally: x_new = x_old - f'(x_old)/f''(x_old), iterated until the change in x is negligible. • Note: Newton’s method may diverge rather than converge!

  9. Minimize: 2+(x-2)^4-x^2 • This is actually quite a nasty function – differentiate it and see! • Minimum: -6.175 at x = 3.165 • Convergence took 17 steps in this case.
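A minimal sketch of Newton's update applied to this function (the starting value and stopping tolerance are my choices, so the step count will not necessarily match the 17 quoted above):

```python
# Newton's update for minimising f(x) = 2 + (x - 2)^4 - x^2:
#   x_new = x_old - f'(x_old) / f''(x_old)
def f(x):
    return 2 + (x - 2) ** 4 - x ** 2

def f1(x):                            # first derivative
    return 4 * (x - 2) ** 3 - 2 * x

def f2(x):                            # second derivative
    return 12 * (x - 2) ** 2 - 2

x = 5.0                               # starting guess (my choice, not the slide's)
for step in range(1, 101):
    delta = f1(x) / f2(x)             # Newton step
    x -= delta
    if abs(delta) < 1e-8:             # stop when the step is negligible
        break

print(step, x, f(x))                  # converges to roughly x = 3.165, f = -6.175
```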

  10. Numerical Methods (Newton’s Method) • If the function is quadratic (i.e. the third derivative is zero), Newton’s method will get to the solution in one step (but then you could solve the equation by hand!). • If the function is non-quadratic, iteration will occur. • Multi-parameter extensions to Newton’s method exist, but most people prefer gradient-free methods (e.g. SIMPLEX) for problems like this.
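As an illustration of a gradient-free search, a minimal sketch using scipy's Nelder-Mead simplex on a standard two-parameter test function (not one of the course's fisheries models):

```python
import numpy as np
from scipy.optimize import minimize

# A made-up two-parameter objective (the Rosenbrock "banana" test function),
# used only to illustrate calling a gradient-free (Nelder-Mead simplex) minimiser.
def banana(theta):
    a, b = theta
    return (1.0 - a) ** 2 + 100.0 * (b - a ** 2) ** 2

result = minimize(banana, x0=np.array([-1.0, 2.0]), method="Nelder-Mead")
print(result.x, result.fun)   # should end up near (1, 1) with f close to 0
```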

  11. Calculating Derivatives Numerically-I • Many numerical methods (e.g. Newton’s method) require derivatives. You should generally calculate the derivatives analytically, but this sometimes gets very tedious. For example, for the Dynamic Schaefer model: • I think you get the picture…

  12. Calculating Derivatives Numerically-II • The accuracy of the approximation depends on the number of terms used and on the step size Δx (smaller, but not too small, is better).
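A minimal sketch of forward and central finite differences, the usual approximations of this kind (the test function and the step size h are my choices):

```python
import math

# Finite-difference approximations to f'(x):
#   forward: (f(x + h) - f(x)) / h            error of order h
#   central: (f(x + h) - f(x - h)) / (2 h)    error of order h^2
def forward_diff(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

f = math.sin
x = 1.0
print(forward_diff(f, x), central_diff(f, x), math.cos(x))  # exact derivative is cos(1)

# Too large an h gives truncation error; too small an h loses precision to
# floating-point round-off, hence "smaller, but not too small".
```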

  13. Calculating Derivatives Numerically-III (back to Cape Hake)

  14. Optimization – some problems-I • Local minima [figure: an objective function with one global minimum and a nearby local minimum]

  15. Optimization – some problems-II • Problems calculating numerical derivatives. • Integer parameters. • Bounds [either on the function itself (extinction of Cape Hake hasn’t happened - yet) or on the values for the parameters].

  16. Optimization – Tricks of the Trade-I • To keep a parameter x constrained between a and b, estimate an unconstrained parameter y instead and transform it: x = a + (0.5 + arctan(y)/π)(b-a). • For example, x = 1 + (0.5 + arctan(y)/π) keeps x in [1, 2].
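A minimal sketch of this transform and its inverse (the inverse is handy for converting a starting value of x into a starting value of y):

```python
import math

# Keep an estimated parameter x inside (a, b) by optimising over an
# unconstrained y and transforming: x = a + (0.5 + atan(y)/pi) * (b - a).
def to_bounded(y, a, b):
    return a + (0.5 + math.atan(y) / math.pi) * (b - a)

def to_unbounded(x, a, b):
    # inverse transform: choose a starting y from a starting x
    return math.tan(((x - a) / (b - a) - 0.5) * math.pi)

y0 = to_unbounded(1.5, 1.0, 2.0)      # start the search at x = 1.5
print(to_bounded(y0, 1.0, 2.0))       # recovers 1.5
print(to_bounded(-1e6, 1.0, 2.0), to_bounded(1e6, 1.0, 2.0))  # stays inside (1, 2)
```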

  17. Optimization – Tricks of the Trade-II • Test the code for the model by fitting to data where the answer is known! • Look at the fit graphically (have you maximized rather than minimized the function?). • Minimize the function manually; restart the optimization algorithm from the final value; restart the optimization algorithm from different values. • When fitting n parameters that must add to 1, fit n-1 parameters and set the last to 1-sum(1:n-1) (see the sketch below). • In SOLVER, use automatic scaling and set the convergence criterion smaller.
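A minimal sketch of the sum-to-one trick from the list above (the function name is mine):

```python
import numpy as np

# Fit n proportions that must sum to 1 by estimating only the first n - 1
# and defining the last one by subtraction.
def full_proportions(p_free):
    p_free = np.asarray(p_free, dtype=float)
    return np.append(p_free, 1.0 - p_free.sum())

print(full_proportions([0.2, 0.3, 0.4]))   # -> [0.2, 0.3, 0.4, 0.1]

# In practice you would also bound the free proportions (e.g. with the arctan
# transform above) so that the implied last proportion stays non-negative.
```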

  18. Solving an Equation (the Bisection Method) • Often we have to solve the equation f(x)=0 (e.g. the Lotka equation). This can be treated as an optimization problem (i.e. minimize f(x)^2). • However, it is usually more efficient to use a numerical method specifically developed for this purpose. • We illustrate the Bisection Method here, but there are many others.

  19. Solving an Equation (the Bisection Method)

  20. Solving an Equation (the Bisection Method) Find x such that 10+20(x-2)^3+100(x-2)^2=0
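A minimal bisection sketch for this equation (the bracketing interval [-4, 0] is my choice; any interval over which the function changes sign would do):

```python
# Bisection: find x with g(x) = 0 by repeatedly halving an interval [lo, hi]
# over which g changes sign.
def g(x):
    return 10 + 20 * (x - 2) ** 3 + 100 * (x - 2) ** 2

lo, hi = -4.0, 0.0                 # g(lo) < 0 < g(hi): the sign change brackets a root
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if g(lo) * g(mid) <= 0:        # root lies in the lower half
        hi = mid
    else:                          # root lies in the upper half
        lo = mid

root = 0.5 * (lo + hi)
print(root, g(root))               # root near x = -3.02
```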

  21. Note • Use of Numerical Methods (particularly optimization) is an art. The only way to get it right is to practice (probably more than anything else in this course).

  22. Readings • Hilborn and Mangel (1997), Chapter 11 • Press et al. (1988), Chapter 10
