
Methods for ill-posed & nonlinear systems


Presentation Transcript


  1. Methods for ill-posed & nonlinear systems • Ill-posed linear systems • Equations form: a_i1 x_1 + a_i2 x_2 + ... + a_in x_n = b_i, i = 1, ..., m • Matrix form: Ax = b, with A an m x n matrix and m > n • With more equations than unknowns, there is in general no solution in the classical sense!

  2. Least-squares problem • Def: Given A in R^{m x n} and b in R^m, the least-squares problem is to find an n-vector x minimizing ||Ax - b||_2 (see the sketch below) • Two conditions: • m >= n, i.e. an overdetermined system • A has full column rank, i.e. rank(A) = n • Thm: The least-squares problem always has a solution. The solution is unique iff A has full column rank. • Proof: See details in class (or as an exercise)
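A minimal NumPy sketch (the library choice is an assumption, not from the slides) setting up an overdetermined system and solving the least-squares problem with a built-in solver:

    import numpy as np

    # Overdetermined system: m = 5 equations, n = 2 unknowns (illustrative sizes).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 2))   # random A has full column rank almost surely
    b = rng.standard_normal(5)

    # Minimize ||Ax - b||_2; np.linalg.lstsq uses an SVD-based solver internally.
    x, res, rank, sigma = np.linalg.lstsq(A, b, rcond=None)
    print(x, rank)   # rank(A) = n = 2, so the minimizer is unique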

  3. Least-squares problem • Thm: (Normal equations) Let x̂ be a solution of the least-squares problem. Then the residual vector r = b - A x̂ satisfies the normal equations A^T r = 0, i.e. A^T A x̂ = A^T b • Proof: See details in class (or as an exercise) • Numerical methods for the least-squares problem: • Normal equations method -- when n is small • QR method • SVD method -- most popular!!!

  4. Normal equations method • Idea: Solve the normal equations A^T A x = A^T b (see the sketch below) • Methods (m >> n): • Cholesky factorization • CG method & PCG method, ... • Drawbacks: • The condition number is squared: kappa(A^T A) = kappa(A)^2 !! • Sensitive to round-off errors
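A sketch of the normal-equations route via Cholesky (SciPy assumed; the helper name is hypothetical), illustrating both the idea and the drawback noted above:

    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    def lstsq_normal_eq(A, b):
        # Form the normal equations A^T A x = A^T b.
        # Caution: this squares the condition number, kappa(A^T A) = kappa(A)^2.
        AtA = A.T @ A
        Atb = A.T @ b
        # A^T A is symmetric positive definite when rank(A) = n,
        # so a Cholesky factorization applies.
        c, low = cho_factor(AtA)
        return cho_solve((c, low), Atb)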

  5. QR method • Decompose A = QR, where Q is m x n with orthonormal columns (Q^T Q = I) and R is n x n upper triangular • Denote c = Q^T b • Solution: solve the triangular system Rx = c by back substitution (see the sketch below) • Proof: See details in class (or as an exercise)
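A corresponding QR sketch (helper name hypothetical); the thin factorization reduces the problem to one triangular solve:

    import numpy as np
    from scipy.linalg import solve_triangular

    def lstsq_qr(A, b):
        # Thin QR: A = QR with Q (m x n, orthonormal columns), R (n x n, upper triangular).
        Q, R = np.linalg.qr(A)   # 'reduced' mode is the default
        # Back substitution on R x = Q^T b.
        return solve_triangular(R, Q.T @ b, lower=False)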

  6. SVD method • Since A has full column rank, it has the (thin) SVD A = U Σ V^T, where U is m x n with orthonormal columns, Σ = diag(σ_1, ..., σ_n) with all σ_i > 0, and V is n x n orthogonal • The solution of the LS problem is x = V Σ^{-1} U^T b = sum over i of (u_i^T b / σ_i) v_i (see the sketch below)
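And the SVD sketch (helper name hypothetical), which implements the sum formula directly:

    import numpy as np

    def lstsq_svd(A, b):
        # Thin SVD: A = U diag(s) V^T; full column rank means every s[i] > 0.
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        # x = V Sigma^{-1} U^T b = sum_i (u_i^T b / s_i) v_i
        return Vt.T @ ((U.T @ b) / s)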

  7. Nonlinear systems • Equations form: f_i(x_1, ..., x_n) = 0, i = 1, ..., n • Vector form: F(x) = 0 with F: R^n -> R^n • An example (an illustrative choice): x_1^2 + x_2^2 = 1, x_2 = x_1^2

  8. Nonlinear systems • Solutions • Existence & uniqueness • Minimization problem • Local minimizer vs global minimizer • Numerical methods: • Picard iteration, or fixed-point method • Newton's & quasi-Newton methods -- most popular • The secant method • Other methods -- self-study!! • The Fibonacci search method -- based on Fibonacci numbers • The golden-section search method -- based on the golden ratio ≈ 0.618 (see the sketch after this list)!!
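A minimal sketch of golden-section search (the test function and bracket are illustrative placeholders); each step shrinks the bracketing interval by the factor 1/phi ≈ 0.618:

    def golden_section(g, a, b, tol=1e-8):
        # 1/phi = (sqrt(5) - 1) / 2 ~= 0.618, the golden ratio reciprocal.
        inv_phi = (5 ** 0.5 - 1) / 2
        c = b - inv_phi * (b - a)
        d = a + inv_phi * (b - a)
        while b - a > tol:
            if g(c) < g(d):        # minimizer of unimodal g lies in [a, d]
                b, d = d, c
                c = b - inv_phi * (b - a)
            else:                  # minimizer lies in [c, b]
                a, c = c, d
                d = a + inv_phi * (b - a)
        return (a + b) / 2

    print(golden_section(lambda x: (x - 1.0) ** 2, 0.0, 3.0))   # ~ 1.0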

  9. Convergence rate • Def: A sequence of vectors x_k converges locally to x* with order r if ||x_{k+1} - x*|| <= C ||x_k - x*||^r for all k sufficiently large, for some constant C > 0 • Comments: • Any norm can be used • The constant C may depend on the norm used • The order of convergence r does not depend on the norm!!! • Convergence rates: • Linear convergence: r = 1 & 0 < C < 1 • Superlinear convergence: r > 1 • Quadratic convergence: r = 2

  10. Newton's method for 1D • In 1D, i.e. n = 1: • Nonlinear equation f(x) = 0 • Minimization problem: minimize g(x) • Assume f(x) is smooth, roots exist & each root is simple (f' nonzero at the root)!! • Newton's method (see the sketch below): x_{k+1} = x_k - f(x_k) / f'(x_k) • If f(x) is a linear function, this solves exactly in one step!! • If f(x) is a nonlinear function, approximate it by a linear function (its tangent line) and solve that instead
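A minimal sketch of the 1D iteration; the test function f(x) = x^2 - 2 is an illustrative choice, not from the slides. The printed errors roughly square at each step, previewing the rate stated on the next slide:

    import math

    def newton_1d(f, fprime, x0, tol=1e-14, max_iter=50):
        x = x0
        for _ in range(max_iter):
            step = f(x) / fprime(x)   # tangent-line (linearized) correction
            x -= step
            if abs(step) < tol:
                break
        return x

    # Illustrative run: f(x) = x^2 - 2, root x* = sqrt(2).
    x = 1.0
    for k in range(5):
        x -= (x * x - 2) / (2 * x)
        print(k, abs(x - math.sqrt(2)))   # error roughly squares each step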

  11. Local convergence rate • Thm: Suppose f is twice continuously differentiable near a root x* with f'(x*) nonzero. Then, for starting points close enough to x*, Newton's method converges locally quadratically, i.e. |x_{k+1} - x*| <= C |x_k - x*|^2 • Proof: See details in class (or as an exercise) • An example: the sketch above (f(x) = x^2 - 2) shows the error roughly squaring at each step

  12. Minimization view of Newton's method • To minimize g, apply Newton's method to g'(x) = 0: x_{k+1} = x_k - g'(x_k) / g''(x_k) • If g(x) is a quadratic polynomial, one step finds the exact minimizer!!! • If g(x) is a general nonlinear function, approximate g(x) locally by a quadratic polynomial (its second-order Taylor expansion) and minimize that instead.

  13. Newton's method for n dimensions • The problem: minimize g(x) over x in R^n • Define the gradient ∇g(x) and the Hessian H(x) = ∇²g(x) • Taylor expansion about the current iterate x_k gives a quadratic minimization problem: minimize over p the model g(x_k) + ∇g(x_k)^T p + (1/2) p^T H(x_k) p • Existence & uniqueness of its minimizer iff the Hessian matrix H(x_k) is positive definite (PD) (vs only positive semi-definite (PSD))

  14. Newton's method for n dimensions • Solution of the quadratic minimization problem: solve H(x_k) p_k = -∇g(x_k) and set x_{k+1} = x_k + p_k • Local convergence rate -- locally quadratic • For a nonlinear system F(x) = 0: solve J_F(x_k) p_k = -F(x_k) with the Jacobian J_F (see the sketch below) • Computational cost: computing the derivatives is very expensive when n >> 1 !!
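A sketch of the n-dimensional iteration for F(x) = 0 (the caller supplies the Jacobian; the 2x2 test system is an illustrative choice):

    import numpy as np

    def newton_nd(F, J, x0, tol=1e-12, max_iter=50):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            # Solve the linearized system J(x) p = -F(x) -- the expensive
            # step when n >> 1, as noted above.
            p = np.linalg.solve(J(x), -F(x))
            x = x + p
            if np.linalg.norm(p) < tol:
                break
        return x

    # Illustrative 2x2 system: x^2 + y^2 = 1 and y = x^2.
    F = lambda v: np.array([v[0]**2 + v[1]**2 - 1, v[1] - v[0]**2])
    J = lambda v: np.array([[2*v[0], 2*v[1]], [-2*v[0], 1.0]])
    print(newton_nd(F, J, [0.8, 0.6]))   # ~ [0.786, 0.618]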

  15. The secant (quasi-Newton) method • Newton's method in 1D: x_{k+1} = x_k - f(x_k) / f'(x_k) • In many cases the derivative is not well-defined or is expensive to calculate. Approximate it by the secant slope (f(x_k) - f(x_{k-1})) / (x_k - x_{k-1}) • The secant (or quasi-Newton) method (see the sketch below): x_{k+1} = x_k - f(x_k) (x_k - x_{k-1}) / (f(x_k) - f(x_{k-1})) • Convergence rate: superlinear, of order (1 + sqrt(5))/2 ≈ 1.618 -- Exercise • Extension to higher dimensions: Exercise
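A sketch of the secant iteration (helper name and test function are illustrative):

    def secant(f, x0, x1, tol=1e-12, max_iter=100):
        # Two starting points replace the derivative with a difference quotient.
        f0, f1 = f(x0), f(x1)
        for _ in range(max_iter):
            # Secant step: (f1 - f0) / (x1 - x0) stands in for f'(x1).
            x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
            if abs(x2 - x1) < tol:
                return x2
            x0, f0, x1, f1 = x1, f1, x2, f(x2)
        return x1

    print(secant(lambda x: x**2 - 2, 1.0, 2.0))   # ~ 1.41421356...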

  16. Comments on Newton's method • Some drawbacks of Newton's method: • It needs the derivative of the function f, OR the first and second derivatives of the function g!! This can be remedied by the secant (or quasi-Newton) method. • The functions f & g must be smooth!! In many cases, they are not smooth!! • If they are only semi-smooth, use a regularizing technique • When n >> 1, a linear system must be solved at every step!! This is very time-consuming!! It can be remedied by search directions & line searches.

  17. Search directions & line searches • When G is a quadratic form -- coming from solving linear equations: • Steepest descent method • Conjugate gradient (CG) method • When G is a general form -- coming from solving nonlinear equations: • Nonlinear steepest descent method • Nonlinear CG method • The problem: minimize G(x) • Find a sequence x_k by the iteration x_{k+1} = x_k + α_k p_k • Search directions p_k • Line searches: step lengths α_k

  18. Search directions & line searches • In Newton's method, we choose p_k = -H(x_k)^{-1} ∇G(x_k) and α_k = 1 • We have flexibility in choosing the search directions & line searches. In general, they need to satisfy: • Descent direction: p_k^T ∇G(x_k) < 0 • Line search: the one-dimensional optimization problem α_k = argmin over α > 0 of G(x_k + α p_k)

  19. Search directions & line searches • Best local search direction: choose the unit direction p that makes the directional derivative p^T ∇G(x_k) most negative • Steepest descent direction: p_k = -∇G(x_k)

  20. General steepest descent method • The algorithm (see the sketch below): p_k = -∇G(x_k); choose α_k by a line search; set x_{k+1} = x_k + α_k p_k • Comments: • If G is a quadratic form, it collapses to the classical steepest descent method • If H is positive definite (PD) or positive semi-definite (PSD), convergence can be established • For general cases, more research is needed!!!
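A sketch of nonlinear steepest descent with a backtracking (Armijo) line search, one common way to solve the one-dimensional subproblem approximately (the Armijo constants are conventional defaults, not from the slides):

    import numpy as np

    def steepest_descent(G, gradG, x0, tol=1e-8, max_iter=10000):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            p = -gradG(x)                  # steepest descent direction
            if np.linalg.norm(p) < tol:
                break
            # Backtracking (Armijo) line search: shrink alpha until
            # G decreases sufficiently; no exact 1D minimization needed.
            alpha, c, rho = 1.0, 1e-4, 0.5
            while G(x + alpha * p) > G(x) - c * alpha * (p @ p):
                alpha *= rho
            x = x + alpha * p
        return x

    print(steepest_descent(lambda v: (v ** 2).sum(), lambda v: 2 * v, [1.0, -2.0]))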

  21. Nonlinear CG method • Choose the search directions p_k as conjugate vectors (there are different ways to do this, e.g. the Fletcher-Reeves or Polak-Ribiere formulas for β_k!!!) • Algorithm (see the sketch below): • p_0 = -∇G(x_0) • x_{k+1} = x_k + α_k p_k with α_k from a line search • p_{k+1} = -∇G(x_{k+1}) + β_k p_k
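A sketch of nonlinear CG with the Fletcher-Reeves choice of β (one of the "different ways"); the backtracking search reuses the Armijo idea above:

    import numpy as np

    def nonlinear_cg(G, gradG, x0, tol=1e-8, max_iter=1000):
        x = np.asarray(x0, dtype=float)
        g = gradG(x)
        p = -g                                # first direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            if g @ p >= 0:                    # safeguard: restart if p is not a descent direction
                p = -g
            alpha, c, rho = 1.0, 1e-4, 0.5    # inexact (backtracking) line search
            while G(x + alpha * p) > G(x) + c * alpha * (g @ p):
                alpha *= rho
            x = x + alpha * p
            g_new = gradG(x)
            beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves beta
            p = -g_new + beta * p             # conjugate direction update
            g = g_new
        return x

    print(nonlinear_cg(lambda v: (v ** 2).sum(), lambda v: 2 * v, [1.0, -2.0]))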

  22. Nonlinear CG method • Some comments: • If G is a quadratic form, it collapses to the classical conjugate gradient (CG) method • In the computation, we only need the gradient of G; there is no need to form the Hessian matrix or compute its inverse!!! • In computation, there is no need to solve the one-dimensional minimization problem very accurately at each step!!! • If H is positive definite (PD) or positive semi-definite (PSD), convergence can be established • For general cases, more research is needed!!
