
# Methods for Nonlinear Least-Squares Problems


Jinxiang Chai

### Applications
• Inverse kinematics
• Physically-based animation
• Data-driven motion synthesis
• Many other problems in graphics, vision, machine learning, robotics, etc.

### Problem Definition

Most optimization problems can be formulated as a nonlinear least-squares problem:

F(x) = (1/2) Σ_{i=1}^{m} f_i(x)^2

where f_i(x), i = 1, …, m, are given functions and m >= n.
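The definition above can be sketched in a few lines of Python (the residual functions here are hypothetical examples, not from the slides):

```python
import numpy as np

def F(x, residuals):
    """Nonlinear least-squares objective: F(x) = 1/2 * sum_i f_i(x)^2."""
    f = np.array([f_i(x) for f_i in residuals])
    return 0.5 * f @ f

# m = 2 residual functions, n = 1 unknown (hypothetical):
residuals = [lambda x: x[0] - 1.0, lambda x: 2.0 * (x[0] - 1.0)]
print(F(np.array([3.0]), residuals))  # 0.5 * (2^2 + 4^2) = 10.0
```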

### Inverse Kinematics

Find the joint angles θ that minimize the distance between the character's end-effector position and a user-specified target position C = (c1, c2).

[Figure: a planar two-link arm with link lengths l1, l2 and joint angles θ1, θ2, rooted at the base (0,0) and reaching toward the target C = (c1, c2)]

### Global Minimum vs. Local Minimum
• Finding the global minimum for nonlinear functions is very hard
• Finding a local minimum is much easier
### Assumptions
• The cost function F is differentiable and so smooth that the following Taylor expansion is valid:

F(x + h) = F(x) + h^T g + (1/2) h^T H h + O(||h||^3)

where g = ∇F(x) is the gradient and H = ∇²F(x) is the Hessian.

### Gradient Descent
Given the objective function, which direction is optimal? The steepest-descent direction −∇F(x). Gradient descent is a first-order optimization algorithm.

To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient of the function at the current point:
• Initialize k = 0, choose x_0
• While k < kmax: x_{k+1} = x_k − α ∇F(x_k), k ← k + 1
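The loop above can be sketched directly, with a fixed step size α and iteration count (the quadratic test function is a hypothetical example):

```python
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, kmax=200):
    """Take steps proportional to the negative gradient: x <- x - alpha * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(kmax):
        x = x - alpha * grad(x)
    return x

# Minimize F(x) = 1/2 ||x - c||^2, whose gradient is x - c; the iterates
# approach the minimizer c:
c = np.array([1.0, -2.0])
x_min = gradient_descent(lambda x: x - c, np.zeros(2))
```

In practice the step size α is usually chosen by a line search rather than held fixed.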
### Newton’s Method
• What is the minimum of the quadratic approximation F(x) + h^T g + (1/2) h^T H h? Setting its derivative with respect to h to zero gives H h = −g.
• High-dimensional case: the optimal direction is the Newton direction h = −H⁻¹ g
• Initialize k = 0, choose x_0
• While k < kmax: solve H(x_k) h = −∇F(x_k), set x_{k+1} = x_k + h, k ← k + 1
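The Newton iteration above can be sketched as follows, solving H h = −g rather than forming H⁻¹ explicitly (the quadratic example is hypothetical; for a quadratic, one Newton step lands on the minimizer):

```python
import numpy as np

def newton(grad, hess, x0, kmax=20):
    """Newton iteration: solve H(x_k) h = -grad(x_k), then x <- x + h."""
    x = np.asarray(x0, dtype=float)
    for _ in range(kmax):
        h = np.linalg.solve(hess(x), -grad(x))
        x = x + h
    return x

# For the quadratic F(x) = 1/2 x^T A x - b^T x, grad = A x - b and H = A,
# so a single Newton step solves A x = b exactly:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = newton(lambda x: A @ x - b, lambda x: A, np.zeros(2), kmax=1)
print(np.allclose(A @ x_min, b))  # True
```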
• Finding the inverse of the Hessian matrix is often expensive
• Approximation methods are often used:
  - conjugate gradient method
  - quasi-Newton methods
### Comparison
• Newton’s method vs. gradient descent: Newton’s method uses second derivatives and converges in fewer iterations near a minimum, but each iteration is more expensive; gradient descent needs only first derivatives but can converge slowly.
### Gauss-Newton Method
• Often used to solve nonlinear least-squares problems

Define f(x) = (f_1(x), …, f_m(x))^T and let J(x) be its m×n Jacobian, J_ij = ∂f_i/∂x_j.

We have F(x) = (1/2) f(x)^T f(x), so the gradient is ∇F(x) = J(x)^T f(x).
• In general, we want to minimize a sum of squared function values
• Linearize the residuals around the current point: f(x + h) ≈ f(x) + J(x) h
• Minimizing (1/2) ||f(x) + J(x) h||^2 over h gives the normal equations (J^T J) h = −J^T f
• Unlike Newton’s method, second derivatives are not required
• Initialize k = 0, choose x_0
• While k < kmax: solve (J^T J) h = −J^T f at x_k, set x_{k+1} = x_k + h, k ← k + 1
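A minimal Gauss-Newton sketch following the derivation above; the exponential-fit residuals are a hypothetical example, with data generated from a = 0.5:

```python
import numpy as np

def gauss_newton(f, J, x0, kmax=50):
    """Gauss-Newton: solve (J^T J) h = -J^T f at each iterate; no Hessians."""
    x = np.asarray(x0, dtype=float)
    for _ in range(kmax):
        Jx, fx = J(x), f(x)
        h = np.linalg.solve(Jx.T @ Jx, -Jx.T @ fx)
        x = x + h
    return x

# Fit y = exp(a * t) to samples generated with a = 0.5 (m = 3, n = 1):
t = np.array([0.0, 1.0, 2.0])
y = np.exp(0.5 * t)
f = lambda x: np.exp(x[0] * t) - y                   # residual vector f(x)
J = lambda x: (t * np.exp(x[0] * t)).reshape(3, 1)   # 3x1 Jacobian
a = gauss_newton(f, J, np.array([0.0]))[0]
print(round(a, 6))  # 0.5
```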
Any problem? The normal equations (J^T J) h = −J^T f may be singular or ill-conditioned when J is rank-deficient, so the solution might not be unique and the step can become arbitrarily large.
### Levenberg-Marquardt Method
• We still want to minimize a sum of squared function values
• Fix the problem by damping the normal equations: (J^T J + μI) h = −J^T f, with damping parameter μ > 0
• A large μ gives short, gradient-descent-like steps; a small μ gives Gauss-Newton-like steps
• Initialize k = 0, choose x_0 and μ_0
• While k < kmax: solve (J^T J + μ_k I) h = −J^T f at x_k; if the step decreases F, set x_{k+1} = x_k + h and decrease μ, otherwise increase μ; k ← k + 1
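A sketch of the damped iteration with a simple μ update (halve on an accepted step, double on a rejected one, which is one common heuristic; the exponential-fit test problem is hypothetical, with data generated from a = 0.5):

```python
import numpy as np

def levenberg_marquardt(f, J, x0, mu=1.0, kmax=100):
    """Solve (J^T J + mu I) h = -J^T f; adapt mu based on whether F decreases."""
    x = np.asarray(x0, dtype=float)
    F = lambda z: 0.5 * f(z) @ f(z)
    for _ in range(kmax):
        Jx, fx = J(x), f(x)
        h = np.linalg.solve(Jx.T @ Jx + mu * np.eye(x.size), -Jx.T @ fx)
        if F(x + h) < F(x):
            x, mu = x + h, mu * 0.5   # good step: accept, trust the model more
        else:
            mu *= 2.0                 # bad step: reject, increase damping
    return x

# Hypothetical exponential fit, y = exp(a * t) with true a = 0.5:
t = np.array([0.0, 1.0, 2.0])
f = lambda x: np.exp(x[0] * t) - np.exp(0.5 * t)
J = lambda x: (t * np.exp(x[0] * t)).reshape(3, 1)
a = levenberg_marquardt(f, J, np.array([0.0]))[0]
print(round(a, 6))  # 0.5
```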
### Stopping Criteria
• Criterion 1: the number of iterations exceeds a user-specified maximum: k > kmax
• Criterion 2: the current function value is smaller than a user-specified threshold: F(x_k) < σ_user
• Criterion 3: the change in function value is smaller than a user-specified threshold: ||F(x_k) − F(x_{k−1})|| < ε_user
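The three criteria combine into a single test; a minimal sketch, with hypothetical default thresholds:

```python
def should_stop(k, kmax, F_curr, F_prev, sigma_user=1e-12, eps_user=1e-12):
    """True if any of the three stopping criteria is met."""
    return (k > kmax                               # criterion 1: iteration limit
            or F_curr < sigma_user                 # criterion 2: small F(x_k)
            or abs(F_curr - F_prev) < eps_user)    # criterion 3: small change in F

print(should_stop(k=5, kmax=100, F_curr=1e-15, F_prev=1.0))  # True
print(should_stop(k=5, kmax=100, F_curr=0.5, F_prev=1.0))    # False
```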
### Levmar Library
• Implementation of the Levenberg-Marquardt algorithm
• http://www.ics.forth.gr/~lourakis/levmar/
### Constrained Nonlinear Optimization
• Finding the minimum value while satisfying some constraints