Tutorial 7 SVD Total Least Squares

Singular Value Decomposition

We already know that the eigenvectors of a matrix A form a convenient basis for working with A.

However, for a rectangular matrix A (m × n), dim(Ax) ≠ dim(x) and the concept of eigenvectors does not exist.

Yet AᵀA (n × n) is a symmetric real matrix (A is real), and therefore there is an orthonormal basis of its eigenvectors {uk}: AᵀA uk = λk uk.

Consider the vectors vk = A uk / ‖A uk‖ (defined whenever A uk ≠ 0).

They are also orthonormal, since

(A uj)ᵀ(A uk) = ujᵀ(AᵀA uk) = λk ujᵀ uk = λk δjk,

which vanishes for j ≠ k.



Singular Value Decomposition

Since AᵀA is positive semidefinite, its eigenvalues satisfy λk ≥ 0.

Define the singular values of A as σk = √λk,

and order them in non-increasing order: σ1 ≥ σ2 ≥ … ≥ σn ≥ 0.

Motivation: one can see that if A is itself square and symmetric, then the uk are its own eigenvectors and σk = |λk| are the absolute values of its own eigenvalues (for a positive semidefinite A, they are the eigenvalues themselves).

For a general matrix A, assume σ1 ≥ σ2 ≥ … ≥ σr > 0 = σr+1 = σr+2 = … = σn, so that r = rank(A).



Singular Value Decomposition

Now we can write A = V Σ Uᵀ, where U = [u1 … un] and V = [v1 … vm] are orthogonal matrices (V completed to an orthonormal basis when r < m) and Σ (m × n) is diagonal with entries Σkk = σk, since A uk = σk vk for every k.
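As a quick numerical check, here is a minimal Matlab sketch (not from the original slides; the matrix is an arbitrary example) verifying that the singular values equal the square roots of the eigenvalues of AᵀA. Note that Matlab's svd uses the convention A = U*S*V', so Matlab's V corresponds to this tutorial's U:

% Minimal sketch with an arbitrary example matrix.
A = [1 2; 3 4; 5 6];                 % any m x n matrix, here 3 x 2
lambda = sort(eig(A'*A), 'descend'); % eigenvalues of A'*A
sigma  = svd(A);                     % singular values of A
disp([sqrt(lambda), sigma])          % the two columns match
[U,S,V] = svd(A);                    % Matlab convention: A = U*S*V'
disp(norm(A - U*S*V'))               % ~0 up to round-off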



SVD: Example

Let us find the SVD of a 2 × 2 matrix A for which AᵀA = [5 3; 3 5].

In order to find U, we calculate the eigenvectors of AᵀA. The characteristic equation is

det(AᵀA − λI) = (5 − λ)² − 9 = 0,

so λ1 = 8 and λ2 = 2.



SVD: Example

The corresponding eigenvectors are found by solving (AᵀA − λk I) uk = 0:

for λ1 = 8, u1 = (1, 1)ᵀ/√2; for λ2 = 2, u2 = (1, −1)ᵀ/√2.



SVD: Example

Now we obtain Σ and V: the singular values are σ1 = √8 = 2√2 and σ2 = √2, so Σ = diag(2√2, √2), and the columns of V are vk = A uk / σk.

Together these give the factorization A = V Σ Uᵀ.
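The computation can be checked numerically. Since the example matrix itself is not reproduced here, the code below uses A = [2 2; 1 −1], a hypothetical choice that satisfies AᵀA = [5 3; 3 5]:

% Hypothetical example matrix consistent with A'*A = [5 3; 3 5].
A = [2 2; 1 -1];
disp(A'*A)          % [5 3; 3 5]
disp(svd(A))        % 2*sqrt(2) = 2.8284 and sqrt(2) = 1.4142
[U,S,V] = svd(A);   % Matlab: A = U*S*V' (Matlab's V = tutorial's U)
disp(V)             % columns are (1,1)/sqrt(2) and (1,-1)/sqrt(2), up to sign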



Total Least Squares

Consider again (see Tutorial 4) the set of data points {(ti, bi)}, i = 1, …, n, and the problem of linear approximation of this set by a line b = x1 + x2 t.

In the Least Squares (LS) approach, we defined the set of equations A x = b, where row i of A is [1, ti], x = (x1, x2)ᵀ and b = (b1, …, bn)ᵀ.

If n > 2, the system is overdetermined, and the LS solution minimizes the sum of squared errors Σi (x1 + x2 ti − bi)².
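For reference (a standard derivation, stated here to connect with the Matlab code below), setting the gradient of the squared error to zero yields the normal equations:

\[
E(x) = \|Ax - b\|_2^2, \qquad \nabla E = 2A^{T}(Ax - b) = 0 \;\Rightarrow\; x_{LS} = (A^{T}A)^{-1}A^{T}b.
\]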



Total Least Squares

This approach assumes that in the set of points the values bi are measured with errors, while the values ti are exact, as demonstrated in the figure.



Total Least Squares

Assume that we rewrite the line equation in the form t = y1 + y2 b. Then the corresponding LS system becomes C y = t, where row i of C is [1, bi],

corresponding to the minimization of Σi (y1 + y2 bi − ti)².

This assumes noise in ti instead of bi, and it generally leads to a different solution.



Illustration

Consider the following Matlab code:

% Create the data: a line y = 0.5*x + 4, with noise added to BOTH coordinates
x = (0:0.01:2)'; y = 0.5*x + 4;
xn = x + randn(201,1)*0.3;
yn = y + randn(201,1)*0.3;
figure(1); clf; plot(x,y,'r'); hold on;
grid on; plot(xn,yn,'+');

% LS version 1: the horizontal coordinate is assumed exact (noise in y only)
A = [ones(201,1), xn]; b = yn;
param = inv(A'*A)*A'*b;   % normal equations (in practice, prefer param = A\b)
plot(xn, A*param, 'g');

% LS version 2: the vertical coordinate is assumed exact (noise in x only)
C = [ones(201,1), yn]; t = xn;
param = inv(C'*C)*C'*t;   % normal equations for the swapped formulation
plot(C*param, yn, 'b');
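Since both xn and yn are noisy here, the green and blue fits generally differ both from each other and from the true red line; this mismatch is what motivates the TLS formulation below.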



TLS

To solve the problem with noise along both ti and bi, we rewrite the line equation in the symmetric form a1(ti − t̄) + a2(bi − b̄) = 0,

where t̄ and b̄ are the means of the ti and of the bi, respectively.

Now we can write the homogeneous system A a = 0, where row i of A is [ti − t̄, bi − b̄] and a = (a1, a2)ᵀ.

The exact solution of this system is possible only if the points (ti, bi) lie on the same line; in this case rank(A) = 1. This formulation is symmetric with respect to t and b.
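For noisy data the system A a = 0 has no exact nonzero solution. A standard way to state the resulting TLS problem (added here to make the connection with the SVD explicit) is

\[
\min_{\|a\|_2 = 1} \|A a\|_2^2 = \sigma_2^2,
\]

attained at a = u2, the right singular vector of A corresponding to the smallest singular value.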



TLS

The rank of A is 2, since the points are noisy and do not lie on the same line. SVD factorization and zeroing of the second singular value allow us to construct the matrix A1 that is closest to A (in the Frobenius norm) among all matrices with rank(A1) = 1.
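This is an instance of the Eckart–Young theorem; in the notation of this tutorial,

\[
A_1 = V \,\mathrm{diag}(\sigma_1, 0)\, U^{T} = \sigma_1 v_1 u_1^{T}, \qquad \|A - A_1\|_F = \sigma_2 .
\]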



TLS

The geometric interpretation of the TLS method is finding line parameters a and a set of points {(t̂i, b̂i)} lying exactly on the line, such that these points lie closest in the L2 sense to the data set {(ti, bi)}, i.e., Σi ((ti − t̂i)² + (bi − b̂i)²) is minimized. In other words, TLS minimizes the sum of squared perpendicular distances from the data points to the line.



Total Least Squares

% TLS fit: center the data, then zero the smallest singular value
xnM = mean(xn);
ynM = mean(yn);
A = [xn - xnM, yn - ynM];   % centered data, one point per row
[U,D,V] = svd(A);
D(2,2) = 0;                 % keep only the dominant direction
Anew = U*D*V';              % closest rank-1 matrix to A (Frobenius norm)
plot(Anew(:,1) + xnM, Anew(:,2) + ynM, 'r');
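If the slope and intercept of the TLS line are needed explicitly, they can be read off the first right singular vector; the snippet below is a sketch added here (it assumes the line is not vertical, i.e. V(1,1) ≠ 0):

% The TLS line passes through the mean point (xnM, ynM) in the
% direction of the first right singular vector V(:,1).
slope     = V(2,1) / V(1,1);     % assumes a non-vertical line
intercept = ynM - slope * xnM;
fprintf('TLS line: y = %.3f*x + %.3f\n', slope, intercept);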

