
# Lecture 13 - Eigen-analysis






### Lecture 13 - Eigen-analysis

CVEN 302

July 1, 2002

### Lecture's Goals

• Shift Method

• Inverse Power Method

• Accelerated Power Method

• QR Factorization

• Householder

• Hessenberg Method

It is possible to obtain another eigenvalue from the set of equations by using a technique known as shifting the matrix.

Subtract s[I] from each side of the eigenvalue equation, thereby changing the maximum eigenvalue:

[B] = [A] - s[I]

The shift s is the maximum eigenvalue of the matrix [A], obtained from the Power method. Each eigenvalue of the rewritten matrix [B] is the corresponding eigenvalue of [A] minus s.

Use the Power method to obtain the largest eigenvalue of [B].

Assume an arbitrary vector x0 = {1 1 1}T.

Multiply the matrix [B] by the vector {x}.

Normalize the result of the product.

Continue the iteration; the final value is λ = -5. However, to get the true eigenvalue of [A] you need to shift back: λtrue = λ + s.
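The shifting idea can be sketched as follows. The course code is MATLAB; this is a Python/NumPy translation, and the 2×2 test matrix is an assumed example (the slide's matrix is not reproduced in this transcript):

```python
import numpy as np

def power_method(B, x0, iters=200, tol=1e-10):
    """Power iteration: dominant eigenvalue of B and its eigenvector,
    normalizing by the largest-magnitude component each step."""
    x = np.asarray(x0, dtype=float)
    lam = 0.0
    for _ in range(iters):
        y = B @ x
        lam_new = y[np.argmax(np.abs(y))]
        x = y / lam_new
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam_new, x

# Shift method: s is the dominant eigenvalue of [A]; power iteration on
# [B] = [A] - s[I] converges to (lambda - s) for another eigenvalue, and
# adding s back recovers the true eigenvalue of [A].
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # assumed example (eigenvalues 3 and 1)
s, _ = power_method(A, np.ones(2))  # dominant eigenvalue s = 3
B = A - s * np.eye(2)
mu, _ = power_method(B, np.array([1.0, 0.0]))
lam_true = mu + s                   # shift back to an eigenvalue of [A]
```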

The inverse power method is similar to the power method, except that it finds the smallest eigenvalue, using the following technique: apply the power iteration to [A]-1, since the eigenvalues of [A]-1 are the reciprocals of the eigenvalues of [A].

The algorithm is the same as the Power method, but the converged value is the largest eigenvalue of [A]-1; to obtain the smallest eigenvalue of [A] from the power method, take its reciprocal. The converged vector is the eigenvector of that smallest eigenvalue.

The inverse power algorithm avoids calculating the inverse matrix explicitly; instead it uses an LU decomposition of [A] to solve for the {x} vector at each iteration.

The matrix is defined as:

• There is a set of programs, Power and InversePower.

• InversePower(A, x0, iter, tol) performs the inverse power method.
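A minimal sketch of the inverse power method follows. The course's InversePower is a MATLAB routine; this Python/NumPy version is my own translation, the 2×2 matrix is an assumed example, and np.linalg.solve stands in for the stored LU factorization a production version would reuse:

```python
import numpy as np

def inverse_power(A, x0, iters=200, tol=1e-10):
    """Inverse power method: power iteration on A^-1 without forming the
    inverse.  Each step solves A*y = x; a production version would factor
    A once (LU) and reuse the factors for every solve."""
    x = np.asarray(x0, dtype=float)
    mu = 0.0
    for _ in range(iters):
        y = np.linalg.solve(A, x)         # stands in for the LU solve
        mu_new = y[np.argmax(np.abs(y))]  # dominant eigenvalue of A^-1
        x = y / mu_new
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return 1.0 / mu_new, x                # smallest eigenvalue of A

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                # assumed example (eigenvalues 3 and 1)
lam_min, v = inverse_power(A, np.array([1.0, 0.0]))
```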

The Power method can be accelerated by using the Rayleigh Quotient instead of the largest wk value.

The Rayleigh Quotient is defined as:

λ = ({x}T[A]{x}) / ({x}T{x})

The next {z} term is computed from {z} = [A]{x}, normalized as before.

The Power method is adapted to use the new value.

Assume an arbitrary vector x0 = {1 1 1}T.

Multiply the matrix [A] by the vector {x} and compute the Rayleigh Quotient.

Normalize, then multiply the matrix [A] by the new {x}.

And so on ...
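The accelerated iteration above can be sketched as follows (Python/NumPy translation of the MATLAB approach; the symmetric test matrix is an assumed example, not from the slides):

```python
import numpy as np

def power_rayleigh(A, x0, iters=300, tol=1e-12):
    """Power method accelerated by the Rayleigh Quotient
    lambda = (x^T A x)/(x^T x); for symmetric A the quotient converges
    roughly twice as fast (in digits) as the largest-component estimate."""
    x = np.asarray(x0, dtype=float)
    lam = 0.0
    for _ in range(iters):
        z = A @ x
        lam_new = (x @ z) / (x @ x)   # Rayleigh Quotient
        x = z / np.linalg.norm(z)     # normalized next vector
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam_new, x

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])       # assumed symmetric example
lam, v = power_rayleigh(A, np.ones(3))
```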

The technique can be used to find the eigenvalues by successive iteration, using Householder transformations to find a matrix equivalent to [A] with the eigenvalues on the diagonal.

Another form of factorization

A = Q*R

Produces an orthogonal matrix (“Q”) and a right upper triangular matrix (“R”)

Orthogonal matrix - inverse is transpose

Why do we care?

We can use Q and R to find eigenvalues

1. Get Q and R (A = Q*R)

2. Let A = R*Q

3. Diagonal elements of A are eigenvalue approximations

4. Iterate until converged

Note: the QR eigenvalue method gives all eigenvalues simultaneously, not just the dominant one.
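Steps 1-4 above can be sketched in a few lines. The lecture's QR_eig is a MATLAB routine; qr_eig below is a hypothetical Python/NumPy equivalent, applied to the 3×3 matrix used later in the lecture:

```python
import numpy as np

def qr_eig(A, iters=100):
    """QR eigenvalue iteration: factor A = Q*R, then set A = R*Q.
    Since R*Q = Q^T * A * Q, each step is a similarity transform that
    preserves the eigenvalues; the diagonal converges to them."""
    A = np.asarray(A, dtype=float)
    for _ in range(iters):
        Q, R = np.linalg.qr(A)
        A = R @ Q
    return np.diag(A)

A = np.array([[1.0,  2.0, -1.0],
              [2.0,  2.0, -1.0],
              [2.0, -1.0,  2.0]])   # the matrix from the MATLAB session
e = qr_eig(A)                       # 3, 1+sqrt(2), 1-sqrt(2), in some order
```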

In practice, QR factorization on any given matrix requires a number of steps

First transform A into Hessenberg form

Hessenberg matrix - upper triangular plus first sub-diagonal

Special properties of Hessenberg matrix make it easier to find Q, R, and eigenvalues

• Construction of QR Factorization

• Use Householder reflections and Givens rotations to reduce certain elements of a vector to zero.

• Use similarity transformations built from the QR factorization, which preserve the eigenvalues.

• The eigenvalues of the transformed matrix are much easier to obtain.

• Any square matrix is orthogonally similar to a triangular matrix with the eigenvalues on the diagonal

• A transformation of the matrix A of the form H-1AH is known as a similarity transformation.

• A real matrix Q is orthogonal if QTQ = I.

• If Q is orthogonal, then A and Q-1AQ are said to be orthogonally similar.

• The eigenvalues are preserved under the similarity transformation.

• At convergence of the iteration, the diagonal elements Rii of the upper triangular matrix R are the eigenvalues.
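The preservation property can be checked numerically; a small Python/NumPy sketch (the lecture's code is MATLAB, and the random matrices here are purely illustrative):

```python
import numpy as np

# Check that eigenvalues survive an orthogonal similarity transform:
# for orthogonal Q (Q^T Q = I), A and Q^T A Q are orthogonally similar
# and therefore share the same eigenvalues.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))                   # arbitrary test matrix
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal Q
B = Q.T @ A @ Q
eA = np.linalg.eigvals(A)
eB = np.linalg.eigvals(B)
```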

• A Householder reflector is a matrix of the form Q = I - 2wwT, where w is a unit vector

• It is straightforward to verify that Q is symmetric and orthogonal (so Q-1 = QT = Q)

• The Householder matrix reduces the components zk+1, ..., zn of a vector z to zero

• To achieve the above operation, v must be a linear combination of x and ek

• Corollary (kth Householder matrix): Let A be an nxn matrix and x any vector. If k is an integer with 1 ≤ k ≤ n-1, we can construct a vector w(k) and a matrix H(k) = I - 2w(k)w(k)T so that H(k)x has zeros in entries k+1, ..., n.

• Define the value α, with |α| = g where g is the norm of the entries xk, ..., xn, so that H(k)x = (x1, ..., xk-1, α, 0, ..., 0)T

• The vector w is found by normalizing v, where vi = 0 for i < k, vk = xk - α, and vi = xi for i > k

• Choose α = -sign(xk)g to reduce round-off error
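The construction above can be sketched as follows (Python/NumPy with 0-based indices, so k here is one less than the slides' k; the test vector is an assumed example):

```python
import numpy as np

def householder(x, k):
    """Householder matrix Q = I - 2*w*w^T (symmetric, orthogonal) that
    zeroes the entries of x below index k (0-based), using the sign
    choice alpha = -sign(x[k])*g to avoid cancellation."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    g = np.linalg.norm(x[k:])         # norm of entries k..n-1
    alpha = -g if x[k] >= 0 else g    # alpha = -sign(x_k)*g
    v = np.zeros(n)
    v[k] = x[k] - alpha               # no cancellation with this sign
    v[k+1:] = x[k+1:]
    w = v / np.linalg.norm(v)
    return np.eye(n) - 2.0 * np.outer(w, w)

x = np.array([3.0, 1.0, 5.0, 1.0])    # assumed test vector
Q = householder(x, 1)
y = Q @ x                             # entries 2 and 3 become zero
```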

• [A] = [Q] [R]

• [Q] is orthogonal, QTQ = I

• [R] is upper triangular

• QR factorization using Householder matrices

• Q = H(1)H(2)...H(n-1)

• Similarity transformation B = QTAQ preserves the eigenvalues

QR = A

Finding Eigenvalues Using QR Factorization

• Generate a sequence of matrices A(m) that are orthogonally similar to A

• Use Householder transformations H-1AH

• The iterates converge to an upper triangular matrix with the eigenvalues on the diagonal

Find all eigenvalues simultaneously!

• QR factorization: A = QR

• Similarity transformation: A(new) = RQ

```
A =
    2.4634    1.8104   -1.3865
   -0.0310    3.0527    1.7694
    0.0616   -0.1047   -0.5161

A =
    2.4056    1.8691    1.3930
    0.0056    2.9892   -1.9203
    0.0099   -0.0191   -0.3948

A =
    2.4157    1.8579   -1.3937
   -0.0010    3.0021    1.8930
    0.0017   -0.0038   -0.4178

A =
    2.4140    1.8600    1.3933
    0.0002    2.9996   -1.8982
    0.0003   -0.0007   -0.4136

A =
    2.4143    1.8596   -1.3934
    0.0000    3.0001    1.8972
    0.0001   -0.0001   -0.4143

e =
    2.4143
    3.0001
   -0.4143
```

```
» A=[1 2 -1; 2 2 -1; 2 -1 2]
A =
     1     2    -1
     2     2    -1
     2    -1     2

» [Q,R]=QR_factor(A)
Q =
   -0.3333   -0.5788   -0.7442
   -0.6667   -0.4134    0.6202
   -0.6667    0.7029   -0.2481
R =
   -3.0000   -1.3333   -0.3333
    0.0000   -2.6874    2.3980
    0.0000    0.0000   -0.3721

» e=QR_eig(A,6);
A =
    2.1111    2.0535    1.4884
    0.1929    2.7966   -2.2615
    0.2481   -0.2615    0.0923
```


• Use similarity transformations to form an upper Hessenberg matrix (an upper triangular matrix plus one nonzero band below the diagonal).

• It is more efficient to form the Hessenberg matrix without explicitly forming the Householder matrices (not given in the textbook).

```matlab
function A = Hessenberg(A)
% Reduce A to upper Hessenberg form by successive Householder
% similarity transformations (H is symmetric and orthogonal, so
% H*A*H is a similarity transform that preserves the eigenvalues).
[n,nn] = size(A);
for k = 1:n-2
    H = Householder(A(:,k),k+1);   % zero entries k+2..n of column k
    A = H*A*H;
end
```
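A sketch of the same reduction in Python/NumPy. This is a hypothetical translation: the Householder helper is not given in the textbook, so it is reconstructed here from the standard construction:

```python
import numpy as np

def householder_col(x, k):
    """Householder matrix zeroing entries of x below index k (0-based);
    reconstructs the Householder(A(:,k), k+1) helper from the slides."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    g = np.linalg.norm(x[k:])
    alpha = -g if x[k] >= 0 else g
    v = np.zeros(n)
    v[k] = x[k] - alpha
    v[k+1:] = x[k+1:]
    nv = np.linalg.norm(v)
    if nv == 0.0:                  # column is already in the right form
        return np.eye(n)
    w = v / nv
    return np.eye(n) - 2.0 * np.outer(w, w)

def hessenberg(A):
    """Reduce A to upper Hessenberg form; each H is symmetric and
    orthogonal, so H*A*H is a similarity transform preserving eigenvalues."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    for k in range(n - 2):
        H = householder_col(A[:, k], k + 1)
        A = H @ A @ H
    return A

A = np.array([[1.0,  2.0, -1.0],
              [2.0,  2.0, -1.0],
              [2.0, -1.0,  2.0]])
Ah = hessenberg(A)                 # Ah[2,0] is (numerically) zero
```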

```
A =
    2.4056   -2.1327    0.9410
   -0.0114   -0.4056   -1.9012
    0.0000    0.0000    3.0000

A =
    2.4157    2.1194   -0.9500
   -0.0020   -0.4157   -1.8967
    0.0000    0.0000    3.0000

A =
    2.4140   -2.1217    0.9485
   -0.0003   -0.4140   -1.8975
    0.0000    0.0000    3.0000

A =
    2.4143    2.1213   -0.9487
   -0.0001   -0.4143   -1.8973
    0.0000    0.0000    3.0000

e =
    2.4143
   -0.4143
    3.0000

» eig(A)
ans =
    2.4142
   -0.4142
    3.0000
```

```
» A=[1 2 -1; 2 2 -1; 2 -1 2]
A =
     1     2    -1
     2     2    -1
     2    -1     2

» [Q,R]=QR_factor_g(A)
Q =
    0.4472    0.5963   -0.6667
    0.8944   -0.2981    0.3333
         0   -0.7454   -0.6667
R =
    2.2361    2.6833   -1.3416
   -1.4907    1.3416   -1.7889
   -1.3333         0   -1.0000

» e=QR_eig_g(A,6);
A =
    2.1111   -2.4356    0.7071
   -0.3143   -0.1111   -2.0000
         0    0.0000    3.0000

A =
    2.4634    2.0523   -0.9939
   -0.0690   -0.4634   -1.8741
    0.0000    0.0000    3.0000
```


• Single value eigen-analysis

• Power Method

• Shifting technique

• Inverse Power Method

• QR Factorization

• Householder matrix

• Hessenberg matrix

• Check the Homework webpage