
Math 175: Numerical Analysis II




  1. Math 175: Numerical Analysis II
  Lecturer: Jomar Fajardo Rabajante, IMSP, UPLB
  2nd Sem AY 2012-2013

  2. SOLVING AND SOLVING AGAIN…
  • Solutions of Linear Equations [high school/Math 17]
  • Solutions of Nonlinear Equations (finding roots of nonlinear equations) [Math 17/Math 175]
  • Solutions of Linear Systems (Systems of Linear Equations) [high school/Math 17/Math 120/then later in Math 175]
  • Solutions of Nonlinear Systems (Systems of Nonlinear Equations) [Math 17/Math 175]

  3. SOLVING AND SOLVING AGAIN…
  • We will DELAY the discussion of Numerical Solutions to LINEAR SYSTEMS.
  • Your LABORATORY instructors will discuss Numerical Solutions to NONLINEAR SYSTEMS, specifically applying Multivariate Newton's Method (similar to our Newton-Raphson, but here we use the concept of the Jacobian; see the sketch below).
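A minimal runnable sketch of that multivariate Newton iteration, assuming a small hypothetical 2-by-2 system (the names newton_system, F, and J are my own illustration, not from the course materials): each step solves J(x_k)·s = -F(x_k) and sets x_{k+1} = x_k + s.

    import numpy as np

    def newton_system(F, J, x0, tol=1e-10, max_iter=50):
        """Multivariate Newton's method for F(x) = 0 using the Jacobian J."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            # Solve the linear system J(x) * s = -F(x) for the Newton step s
            s = np.linalg.solve(J(x), -F(x))
            x = x + s
            if np.linalg.norm(s) < tol:
                break
        return x

    # Hypothetical example system (not from the slides):
    #   x^2 + y^2 - 4 = 0
    #   x*y - 1 = 0
    F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0]*v[1] - 1.0])
    J = lambda v: np.array([[2*v[0], 2*v[1]], [v[1], v[0]]])
    print(newton_system(F, J, x0=[2.0, 0.5]))   # converges to a root near (1.93, 0.52)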

  4. NOW,
  • We will discuss an optional topic, which is NUMERICAL OPTIMIZATION (unconstrained).
  • We will consider two methods:
    • Golden-section Search (univariate) (lecture class)
    • Newton's Method (laboratory class; a short univariate sketch follows this slide)
      • Univariate (Math 36)
      • Multivariate (Math 38)
  Of course, Exhaustive Search is still applicable but not encouraged. Another famous method is Steepest Descent.
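As a preview of the laboratory topic, here is a minimal sketch of Newton's method for univariate minimization: it applies the Newton root-finding step to f'(x) = 0, i.e. x_{k+1} = x_k - f'(x_k)/f''(x_k) (the multivariate version replaces f'' with the Hessian). The example function and the name newton_minimize are my own, not from the slides.

    def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=50):
        """Newton's method for univariate minimization: Newton on f'(x) = 0."""
        x = x0
        for _ in range(max_iter):
            step = df(x) / d2f(x)   # assumes f''(x) != 0 near the minimizer
            x -= step
            if abs(step) < tol:
                break
        return x

    # Hypothetical example (not from the slides): f(x) = x^4 - 3x^2 + x,
    # with f'(x) = 4x^3 - 6x + 1 and f''(x) = 12x^2 - 6
    xmin = newton_minimize(lambda x: 4*x**3 - 6*x + 1, lambda x: 12*x**2 - 6, x0=-1.5)
    print(xmin)   # local minimizer near x = -1.30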

  5. GOLDEN SECTION SEARCH
  • We will only consider MINIMIZATION. Why? (Maximizing f is the same as minimizing -f, so nothing is lost.)
  • Similar to the bisection method: it uses a bracket, and it is also linearly but globally convergent.
  • The function should be UNIMODAL on the interval under consideration (i.e., there is only ONE dip).

  6. GOLDEN SECTION SEARCH
  It uses the golden ratio or the divine proportion φ = (1+sqrt(5))/2 ≈ 1.618; the algorithm works with its reciprocal g = 1/φ = (sqrt(5)-1)/2 ≈ 0.618.
  Theorem: After k steps of Golden Section Search with starting interval [a,b], the midpoint of the final interval is within 0.5*(g^k)*(b-a) of the minimum, where g = (sqrt(5)-1)/2 ≈ 0.618.
  The theorem says that the bracket length (b-a) shrinks by the factor g ≈ 61.8% at each step (about 38.2% of the bracket is discarded each time); after k steps we take the midpoint of what remains.
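A quick worked check of this bound (my own example, not from the slide): on [a,b] = [0,1] with a target accuracy of 0.005, we need 0.5*(0.618)^k <= 0.005, i.e. (0.618)^k <= 0.01, so k >= ln(0.01)/ln(0.618) ≈ 9.6; hence k = 10 steps suffice.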

  7. GOLDEN SECTION SEARCH
  Algorithm: Given f unimodal with minimum in [a,b], and g = (sqrt(5)-1)/2:

    while 0.5*(b-a) > tol        (half the current bracket width vs. the tolerance)
        if f(a+(1-g)*(b-a)) < f(a+g*(b-a))
            b = a+g*(b-a)
        else
            a = a+(1-g)*(b-a)
        end
    end
    approxmin = (a+b)/2
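For reference, a runnable Python version of the same algorithm (my transcription, not the course's official code; a more careful implementation would reuse one of the two function evaluations per pass instead of recomputing both):

    import math

    def golden_section_search(f, a, b, tol=1e-8):
        """Golden Section Search for the minimum of a unimodal f on [a, b]."""
        g = (math.sqrt(5) - 1) / 2              # ≈ 0.618
        while 0.5 * (b - a) > tol:              # half the current bracket width
            x1 = a + (1 - g) * (b - a)          # inner point closer to a
            x2 = a + g * (b - a)                # inner point closer to b
            if f(x1) < f(x2):
                b = x2                          # minimum lies in [a, x2]
            else:
                a = x1                          # minimum lies in [x1, b]
        return (a + b) / 2                      # midpoint of the final bracket

    # Hypothetical test (not from the slides): f(x) = (x - 2)^2 + 1 on [0, 5]
    print(golden_section_search(lambda x: (x - 2)**2 + 1, 0, 5))   # ≈ 2.0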
