Methods for Nonlinear Least-Squares Problems (Jinxiang Chai)
Posted on 20-Dec-2015
Applications
• Inverse kinematics
• Physically-based animation
• Data-driven motion synthesis
• Many other problems in graphics, vision, machine learning, robotics, etc.
Problem Definition
Many optimization problems can be formulated as a nonlinear least-squares problem:

$$x^* = \arg\min_x \frac{1}{2} f(x)^T f(x) = \arg\min_x \frac{1}{2} \sum_{i=1}^{m} \left( f_i(x) \right)^2$$

where $f_i : \mathbb{R}^n \rightarrow \mathbb{R}$, $i = 1, \ldots, m$ are given functions, and $m \ge n$.
Inverse Kinematics
Find the joint angles θ₁, θ₂ that minimize the distance between the character's end-effector position and a user-specified position:

[Figure: a planar two-link arm with base at (0,0), link lengths l₁ and l₂, joint angles θ₁ and θ₂, and target point C = (c₁, c₂).]

$$\arg\min_{\theta_1, \theta_2} \left( l_1 \cos\theta_1 + l_2 \cos(\theta_1 + \theta_2) - c_1 \right)^2 + \left( l_1 \sin\theta_1 + l_2 \sin(\theta_1 + \theta_2) - c_2 \right)^2$$
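This objective translates directly into code; a minimal Python sketch (the link lengths and target below are placeholder values, not from the slides):

```python
import numpy as np

# Placeholder link lengths and target position; any values work.
l1, l2 = 1.0, 0.8
c1, c2 = 1.2, 0.9

def ik_objective(theta):
    """Squared distance between the two-link end effector and target (c1, c2)."""
    t1, t2 = theta
    ex = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
    ey = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
    return (ex - c1) ** 2 + (ey - c2) ** 2
```

Minimizing this function over (θ₁, θ₂) is exactly the nonlinear least-squares problem defined above, with two residuals (the x and y position errors).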
Global Minimum vs. Local Minimum
• Finding the global minimum for nonlinear functions is very hard
• Finding the local minimum is much easier
Assumptions
• The cost function F is differentiable and smooth enough that the following Taylor expansion is valid:

$$F(x + \Delta x) = F(x) + \nabla F(x)^T \Delta x + \frac{1}{2} \Delta x^T \nabla^2 F(x)\, \Delta x + O(\|\Delta x\|^3)$$
Gradient Descent
A first-order optimization algorithm.
To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient of the function at the current point.
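A minimal sketch of this idea in Python (the step size, iteration count, and the quadratic example are illustrative choices, not from the slides):

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Take steps proportional to the negative of the gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Example: minimize F(x) = x^2, whose gradient is 2x; minimum at x = 0.
x_min = gradient_descent(lambda x: 2 * x, np.array([5.0]))
```

In practice the step size is often chosen by a line search rather than fixed in advance.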
Newton’s Method
• Quadratic approximation:

$$f(x + \Delta x) \approx f(x) + f'(x)\,\Delta x + \frac{1}{2} f''(x)\,\Delta x^2$$

• What is the minimum of the quadratic approximation?

$$\Delta x = -\frac{f'(x)}{f''(x)}$$
Newton’s Method
• High-dimensional case:

$$F(x + \Delta x) \approx F(x) + \nabla F(x)^T \Delta x + \frac{1}{2} \Delta x^T H(x)\,\Delta x$$

• What is the optimal direction?

$$\Delta x = -H(x)^{-1} \nabla F(x)$$
Newton’s Method
• Computing the inverse of the Hessian matrix is often expensive
• Approximation methods are often used instead:
- conjugate gradient method
- quasi-Newton methods
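A minimal Newton iteration in Python; in line with the cost note above, it solves the linear system H Δx = −∇F rather than forming the inverse explicitly (the quadratic test function is illustrative):

```python
import numpy as np

def newton(grad, hess, x0, iters=20):
    """Newton's method: solve H(x) dx = -grad F(x) instead of inverting H."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        dx = np.linalg.solve(hess(x), -grad(x))
        x = x + dx
    return x

# Example: F(x, y) = (x - 1)^2 + 3(y + 2)^2, minimum at (1, -2).
grad = lambda v: np.array([2 * (v[0] - 1), 6 * (v[1] + 2)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 6.0]])
x_star = newton(grad, hess, [0.0, 0.0])
```

For a quadratic F the method converges in a single step, since the quadratic approximation is exact.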
Gauss-Newton Methods
• Often used to solve nonlinear least-squares problems.
• Example:

$$F(x) = \frac{1}{2}(x - 1)^2 + \frac{1}{2}(0.2x^2 - 1)^2$$

Define

$$f(x) = \begin{bmatrix} x - 1 \\ 0.2x^2 - 1 \end{bmatrix}$$

We have

$$F(x) = \frac{1}{2} f(x)^T f(x)$$
Gauss-Newton Method
• In general, we want to minimize a sum of squared function values:

$$x^* = \arg\min_x \frac{1}{2} f(x)^T f(x)$$

• Unlike Newton's method, second derivatives are not required.
• Linearize the residuals around the current point:

$$f(x + \Delta x) \approx f(x) + \nabla f(x)\,\Delta x$$

so that

$$\frac{1}{2} f(x + \Delta x)^T f(x + \Delta x) \approx \frac{1}{2} \left( f(x) + \nabla f(x)\,\Delta x \right)^T \left( f(x) + \nabla f(x)\,\Delta x \right)$$

• This is a quadratic function of Δx. Setting its derivative with respect to Δx to zero,

$$\nabla f(x)^T \left( f(x) + \nabla f(x)\,\Delta x \right) = 0$$

$$\nabla f(x)^T \nabla f(x)\,\Delta x = -\nabla f(x)^T f(x)$$

$$\Delta x = -\left( \nabla f(x)^T \nabla f(x) \right)^{-1} \nabla f(x)^T f(x)$$
Gauss-Newton Method
• Initialize k = 0, choose x₀
• While k < k_max:

$$x_{k+1} = x_k - \left( \nabla f(x_k)^T \nabla f(x_k) \right)^{-1} \nabla f(x_k)^T f(x_k)$$
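The iteration can be sketched in Python as follows; as is standard practice, each step is computed with a linear solve rather than an explicit matrix inverse (the residual example is illustrative):

```python
import numpy as np

def gauss_newton(f, J, x0, iters=20):
    """Gauss-Newton: solve (J^T J) dx = -J^T f at each step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        Jx, fx = J(x), f(x)
        dx = np.linalg.solve(Jx.T @ Jx, -Jx.T @ fx)
        x = x + dx
    return x

# Example: residuals f(x) = [x1 - 1, 2(x2 - 3)], with minimum at (1, 3).
f = lambda v: np.array([v[0] - 1, 2 * (v[1] - 3)])
J = lambda v: np.array([[1.0, 0.0], [0.0, 2.0]])
x_star = gauss_newton(f, J, [0.0, 0.0])
```

Note that ∇f(x) here is the m×n Jacobian of the residual vector, so ∇f(x)ᵀ∇f(x) is an n×n matrix.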
Gauss-Newton Method
• Any problem? The matrix ∇f(x)ᵀ∇f(x) may be singular, so the solution

$$\Delta x = -\left( \nabla f(x)^T \nabla f(x) \right)^{-1} \nabla f(x)^T f(x)$$

might not be unique!
• Fix: add a regularization term!
Levenberg-Marquardt Method
• In general, we want to minimize a sum of squared function values:

$$x^* = \arg\min_x \frac{1}{2} f(x)^T f(x)$$

• Linearize as before, but add a regularization term to the quadratic model:

$$\frac{1}{2} f(x + \Delta x)^T f(x + \Delta x) \approx \frac{1}{2} \left( f(x) + \nabla f(x)\,\Delta x \right)^T \left( f(x) + \nabla f(x)\,\Delta x \right) + \frac{\lambda}{2} \|\Delta x\|^2$$

• This is a quadratic function of Δx. Setting its derivative with respect to Δx to zero,

$$\nabla f(x)^T \left( f(x) + \nabla f(x)\,\Delta x \right) + \lambda\,\Delta x = 0$$

$$\Delta x = -\left( \nabla f(x)^T \nabla f(x) + \lambda I \right)^{-1} \nabla f(x)^T f(x)$$

The regularization makes ∇f(x)ᵀ∇f(x) + λI invertible, so the step is always well defined.
Levenberg-Marquardt Method
• Initialize k = 0, choose x₀
• While k < k_max:

$$x_{k+1} = x_k - \left( \nabla f(x_k)^T \nabla f(x_k) + \lambda I \right)^{-1} \nabla f(x_k)^T f(x_k)$$
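A minimal Python sketch of the iteration. Practical Levenberg-Marquardt implementations adapt λ between iterations (increasing it when a step fails to reduce the cost); this sketch keeps λ fixed for simplicity:

```python
import numpy as np

def levenberg_marquardt(f, J, x0, lam=1e-3, iters=50):
    """Damped Gauss-Newton: solve (J^T J + lam*I) dx = -J^T f; lam is fixed here."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        Jx, fx = J(x), f(x)
        dx = np.linalg.solve(Jx.T @ Jx + lam * np.eye(n), -Jx.T @ fx)
        x = x + dx
    return x

# Example: residuals f(x) = [x1 - 2, x1 + x2], with minimum at (2, -2).
f = lambda v: np.array([v[0] - 2, v[0] + v[1]])
J = lambda v: np.array([[1.0, 0.0], [1.0, 1.0]])
x_star = levenberg_marquardt(f, J, [0.0, 0.0])
```

With large λ the step approaches a small gradient-descent step; with λ → 0 it recovers the Gauss-Newton step.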
Stopping Criteria
• Criterion 1: the number of iterations reaches a user-specified maximum: k > k_max
• Criterion 2: the current function value is smaller than a user-specified threshold: F(x_k) < σ_user
• Criterion 3: the change in function value is smaller than a user-specified threshold: ||F(x_k) − F(x_{k−1})|| < ε_user
Levmar Library
• A C/C++ implementation of the Levenberg-Marquardt algorithm
• http://www.ics.forth.gr/~lourakis/levmar/
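For quick experiments in Python, SciPy's `scipy.optimize.least_squares` also provides a Levenberg-Marquardt solver (this is an alternative to levmar, not part of it; shown here on the two-link IK objective from earlier, with placeholder values):

```python
import numpy as np
from scipy.optimize import least_squares

# Placeholder link lengths and target; the target is chosen to be reachable.
l1, l2, c = 1.0, 0.8, np.array([1.2, 0.9])

def residuals(theta):
    """x and y position errors of the two-link end effector."""
    end = np.array([l1 * np.cos(theta[0]) + l2 * np.cos(theta[0] + theta[1]),
                    l1 * np.sin(theta[0]) + l2 * np.sin(theta[0] + theta[1])])
    return end - c

result = least_squares(residuals, x0=np.array([0.1, 0.1]), method="lm")
```

`method="lm"` uses a MINPACK-based Levenberg-Marquardt implementation and requires at least as many residuals as unknowns, matching the m ≥ n assumption from the problem definition.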