Methods For Nonlinear Least-Square Problems
Jinxiang Chai

Applications
  • Inverse kinematics
  • Physically-based animation
  • Data-driven motion synthesis
  • Many other problems in graphics, vision, machine learning, robotics, etc.
Problem Definition

Minimize a sum of squared function values:

min_x F(x) = ½ Σ_{i=1..m} f_i(x)²

where f_i(x), i=1,…,m are given functions, and m >= n.

Many optimization problems can be formulated as nonlinear least-squares problems.
Inverse Kinematics

Find the joint angles θ = (θ1, θ2) that minimize the distance between the character position and the user-specified position C = (c1, c2).

[Figure: a planar two-link arm with its base at the origin (0,0), link lengths l1 and l2, joint angles θ1 and θ2, reaching toward the target point C = (c1, c2).]
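As a concrete illustration, the residual for this two-link arm can be written in a few lines of Python. This is a minimal sketch, not from the slides: the function names, unit link lengths, and test target are all illustrative choices.

```python
import numpy as np

def forward_kinematics(theta, l1=1.0, l2=1.0):
    """End-effector position of a planar two-link arm with base at the origin.

    theta = (theta1, theta2) are the joint angles; l1, l2 are link lengths.
    """
    t1, t2 = theta
    x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
    y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
    return np.array([x, y])

def ik_residual(theta, target):
    """Residual f(theta) whose squared norm, F = 0.5*||f||^2, IK minimizes."""
    return forward_kinematics(theta) - target
```

With the arm fully extended along the x-axis (both angles zero), the end effector sits at (2, 0), so the residual against the target C = (2, 0) vanishes.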

Global Minimum vs. Local Minimum
  • Finding the global minimum for nonlinear functions is very hard
  • Finding the local minimum is much easier
Assumptions
  • The cost function F is differentiable and smooth enough that the following Taylor expansion is valid: F(x + h) = F(x) + hᵀ∇F(x) + ½ hᵀ∇²F(x) h + O(‖h‖³)
Gradient Descent

Objective function: min_x F(x)

Which direction is optimal? The steepest-descent direction, i.e. the negative gradient −∇F(x).

Gradient descent is a first-order optimization algorithm: to find a local minimum of a function, one takes steps proportional to the negative of the gradient of the function at the current point:

x_{k+1} = x_k − α ∇F(x_k),  with step size α > 0
Gradient Descent
  • Initialize k=0, choose x0
  • While k<kmax: x_{k+1} = x_k − α ∇F(x_k), k = k+1
Newton’s Method
  • Quadratic approximation (one dimension): F(x + h) ≈ F(x) + F′(x) h + ½ F″(x) h²
  • The minimum of the quadratic approximation is at h = −F′(x) / F″(x)
Newton’s Method
  • High-dimensional case: F(x + h) ≈ F(x) + hᵀ∇F(x) + ½ hᵀ H h, with Hessian H = ∇²F(x)
  • The optimal direction solves H h = −∇F(x), i.e. h = −H⁻¹ ∇F(x)
Newton’s Method
  • Initialize k=0, choose x0
  • While k<kmax: x_{k+1} = x_k − H(x_k)⁻¹ ∇F(x_k), k = k+1
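A minimal sketch of this iteration in Python, assuming the user supplies gradient and Hessian callbacks (the names and the diagonal test problem are illustrative). It solves the linear system H h = −∇F rather than forming H⁻¹ explicitly.

```python
import numpy as np

def newton(grad, hess, x0, kmax=20):
    """Newton's method: each step h solves H(x_k) h = -grad F(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(kmax):
        h = np.linalg.solve(hess(x), -grad(x))
        x = x + h
    return x

# On a quadratic F(x) = x1^2 + 10*x2^2 a single Newton step lands on the minimum.
grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
hess = lambda x: np.diag([2.0, 20.0])
x_min = newton(grad, hess, [3.0, -1.0])
```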
Newton’s Method
  • Finding the inverse of the Hessian matrix is often expensive
  • Approximation methods are often used

    - Conjugate gradient method

    - Quasi-Newton method

Comparison
  • Newton’s method vs. gradient descent: Newton’s method converges in far fewer iterations (quadratically near a minimum) but requires the Hessian at every step; gradient descent needs only first derivatives but converges more slowly.
Gauss-Newton Methods
  • Often used to solve nonlinear least-squares problems.

Define f(x) = (f_1(x), …, f_m(x))ᵀ with Jacobian J(x), so that F(x) = ½ f(x)ᵀ f(x)

We have ∇F(x) = J(x)ᵀ f(x)
Gauss-Newton Method
  • In general, we want to minimize a sum of squared function values: min_x ½ Σ_i f_i(x)²
  • Unlike Newton’s method, second derivatives are not required.
  • Linearize the residuals around the current point: f(x + h) ≈ f(x) + J(x) h
  • This yields a quadratic function of the step h: L(h) = ½ ‖f(x) + J(x) h‖², whose minimizer solves the normal equations (J(x)ᵀ J(x)) h = −J(x)ᵀ f(x)
Gauss-Newton Method
  • Initialize k=0, choose x0
  • While k<kmax: solve (JᵀJ) h_k = −Jᵀ f(x_k), set x_{k+1} = x_k + h_k, k = k+1
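The iteration can be sketched in Python as follows. This is an illustrative implementation only; the exponential-fit example, its data, and all names are hypothetical choices, not from the slides.

```python
import numpy as np

def gauss_newton(f, jac, x0, kmax=50):
    """Gauss-Newton: at each step solve (J^T J) h = -J^T f for the update h."""
    x = np.asarray(x0, dtype=float)
    for _ in range(kmax):
        J, r = jac(x), f(x)
        h = np.linalg.solve(J.T @ J, -J.T @ r)
        x = x + h
    return x

# Toy problem: fit y = a * exp(b * t) to noise-free data generated with a=2, b=-1.
t = np.linspace(0.0, 2.0, 10)
y = 2.0 * np.exp(-t)
f = lambda x: x[0] * np.exp(x[1] * t) - y           # residuals f_i(x)
jac = lambda x: np.column_stack([np.exp(x[1] * t),   # df/da
                                 x[0] * t * np.exp(x[1] * t)])  # df/db
params = gauss_newton(f, jac, [1.0, 0.0])
```

Because the data are noise-free, the residual vanishes at (a, b) = (2, −1) and the iteration converges to it.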
Gauss-Newton Method
  • In general, we want to minimize a sum of squared function values

Any problem? The minimizer of the quadratic function might not be unique: JᵀJ can be singular or ill-conditioned, so the normal equations may have no unique solution.

Fix: add a regularization term!

Levenberg-Marquardt Method
  • In general, we want to minimize a sum of squared function values
  • Add a regularization (damping) term to the Gauss-Newton normal equations: solve (JᵀJ + μI) h = −Jᵀ f, with damping parameter μ > 0
  • A large μ gives short steps close to the steepest-descent direction; a small μ gives Gauss-Newton steps.
Levenberg-Marquardt Method
  • Initialize k=0, choose x0 and μ > 0
  • While k<kmax: solve (JᵀJ + μI) h_k = −Jᵀ f(x_k), set x_{k+1} = x_k + h_k, update μ, k = k+1
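A minimal Levenberg-Marquardt sketch with a simple damping update (halve μ on an accepted step, double it on a rejected one). The acceptance rule, the initial μ, and the toy exponential-fit example are illustrative assumptions; production codes such as levmar use more elaborate gain-ratio updates.

```python
import numpy as np

def levenberg_marquardt(f, jac, x0, mu=1.0, kmax=100, eps=1e-12):
    """LM: solve (J^T J + mu*I) h = -J^T f; adapt the damping mu each step."""
    x = np.asarray(x0, dtype=float)
    cost = 0.5 * np.dot(f(x), f(x))
    for _ in range(kmax):
        J, r = jac(x), f(x)
        h = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -J.T @ r)
        new_cost = 0.5 * np.dot(f(x + h), f(x + h))
        if new_cost < cost:          # accepted: behave more like Gauss-Newton
            x, cost, mu = x + h, new_cost, 0.5 * mu
        else:                        # rejected: behave more like gradient descent
            mu = 2.0 * mu
        if cost < eps:               # stopping criterion on the function value
            break
    return x

# Toy problem: fit y = a * exp(b * t) to noise-free data with a=2, b=-1.
t = np.linspace(0.0, 2.0, 10)
y = 2.0 * np.exp(-t)
f = lambda x: x[0] * np.exp(x[1] * t) - y
jac = lambda x: np.column_stack([np.exp(x[1] * t), x[0] * t * np.exp(x[1] * t)])
x_lm = levenberg_marquardt(f, jac, [1.0, 0.0])
```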
Stopping Criteria
  • Criterion 1: reach the number of iterations specified by the user: k > kmax
  • Criterion 2: the current function value is smaller than a user-specified threshold: F(x_k) < σ_user
  • Criterion 3: the change in function value is smaller than a user-specified threshold: ||F(x_k) − F(x_{k−1})|| < ε_user
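The three criteria combine naturally into one test inside the optimization loop. A minimal sketch, with hypothetical names and default thresholds:

```python
def should_stop(k, kmax, F_k, F_prev, sigma_user=1e-10, eps_user=1e-12):
    """Return True if any of the three stopping criteria fires.

    k, kmax:  current and maximum iteration counts
    F_k:      current function value F(x_k)
    F_prev:   previous function value F(x_{k-1})
    """
    return (k > kmax                          # criterion 1: iteration budget
            or F_k < sigma_user               # criterion 2: small cost
            or abs(F_k - F_prev) < eps_user)  # criterion 3: small cost change
```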

Levmar Library
  • Implementation of the Levenberg-Marquardt algorithm
  • http://www.ics.forth.gr/~lourakis/levmar/
Constrained Nonlinear Optimization
  • Finding the minimum value while satisfying some constraints