
Linear Least Squares QR Factorization



Presentation Transcript


  1. Linear Least Squares QR Factorization

  2. Systems of linear equations
  • Problem to solve: M x = b
  • Given M x = b:
    • Is there a solution?
    • Is the solution unique?

  3. Systems of linear equations
  Find a set of weights x so that the weighted sum of the columns of the matrix M is equal to the right-hand side b.
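  A tiny concrete instance (the numbers here are chosen purely for illustration; they are not from the slides):

  M = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}, \quad
  b = \begin{pmatrix} 2 \\ 3 \end{pmatrix}: \qquad
  x_1 = 2,\; x_2 = 1 \text{ works, since }
  2 \begin{pmatrix} 1 \\ 1 \end{pmatrix} + 1 \begin{pmatrix} 0 \\ 1 \end{pmatrix}
  = \begin{pmatrix} 2 \\ 3 \end{pmatrix}.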

  4. Systems of linear equations - Existence
  A solution exists when b is in the span of the columns of M.
  A solution exists if there exist weights x1, …, xN such that:
  x1·m1 + x2·m2 + … + xN·mN = b,  where m1, …, mN are the columns of M.

  5. Systems of linear equations - Uniqueness
  Suppose there exist weights y1, …, yN, not all zero, such that M·y = 0. Then:
  M·x = b  ⇒  M·x + M·y = b  ⇒  M·(x + y) = b,
  so x + y is another solution. A solution is unique only if the columns of M are linearly independent.

  6. QR factorization 1
  • A matrix Q is said to be orthogonal if its columns are orthonormal, i.e. Qᵀ·Q = I.
  • Orthogonal transformations preserve the Euclidean norm, since ‖Q·x‖² = (Q·x)ᵀ(Q·x) = xᵀQᵀQ·x = xᵀx = ‖x‖².
  • Orthogonal matrices can transform vectors in various ways, such as rotations or reflections, but they do not change the Euclidean length of the vector. Hence, they preserve the solution to a linear least squares problem.
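  A quick numerical check of these two properties (a sketch, not part of the original deck; it uses MATLAB's built-in qr and rand):

  A = rand(5);          % random 5-by-5 matrix
  [Q, ~] = qr(A);       % its Q factor has orthonormal columns
  x = rand(5, 1);       % random test vector
  norm(Q'*Q - eye(5))   % ~ 0: Q'*Q = I
  norm(Q*x) - norm(x)   % ~ 0: multiplication by Q preserves the Euclidean norm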

  7. QR factorization 2
  Any matrix A (m × n) can be represented as A = Q·R, where Q (m × n) has orthonormal columns and R (n × n) is upper triangular.
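  Schematically (the slide's picture is not in the transcript; this is the standard shape of the reduced factorization, with * marking possibly nonzero entries):

  \underbrace{A}_{m \times n} \;=\; \underbrace{Q}_{m \times n}\, \underbrace{R}_{n \times n},
  \qquad
  R = \begin{pmatrix}
        * & * & \cdots & * \\
        0 & * & \cdots & * \\
        \vdots &   & \ddots & \vdots \\
        0 & 0 & \cdots & *
      \end{pmatrix}.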

  8. QR factorization 2
  • Given A, let its QR decomposition be A = Q·R, where Q is an m × n matrix with orthonormal columns and R is upper triangular.
  • QR factorization transforms the linear least squares problem into a triangular least squares problem:
    Q·R·x = b  ⇒  R·x = Qᵀ·b  ⇒  x = R⁻¹·Qᵀ·b
  Matlab code:
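  (The snippet on the slide is not included in the transcript; this is a minimal sketch of the computation it describes, using MATLAB's economy-size qr and a triangular solve instead of an explicit inverse.)

  [Q, R] = qr(A, 0);    % economy-size QR: Q is m-by-n, R is n-by-n upper triangular
  x = R \ (Q' * b);     % solve R*x = Q'*b by back substitution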

  9. Normal Equations
  Consider the system A·x = b. It can be the result of physical measurements, which usually incorporate some errors. Since we cannot solve it exactly, we would like to minimize the error:
  r = b − A·x
  ‖r‖² = rᵀr = (b − A·x)ᵀ(b − A·x) = bᵀb − 2xᵀAᵀb + xᵀAᵀA·x
  ∂‖r‖²/∂x = 0  (a zero derivative is a necessary minimum condition)
  −2Aᵀb + 2AᵀA·x = 0  ⇒  AᵀA·x = Aᵀb  – the Normal Equations
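  A sketch of solving the normal equations in MATLAB (not from the slides). Forming AᵀA squares the condition number of the problem, which is one reason the QR route above is usually preferred:

  xn = (A' * A) \ (A' * b);   % solve the normal equations A'*A*x = A'*b
  xls = A \ b;                % for comparison: backslash on the rectangular system
                              % solves the same least squares problem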

  10. Normal Equations 2
  AᵀA·x = Aᵀb  – the Normal Equations

  11. Least squares via A = QR decomposition
  A(m×n) = Q(m×n)·R(n×n); Q is orthogonal, therefore QᵀQ = I.
  Q·R·x = b  ⇒  R(n×n)·x = Qᵀ(n×m)·b(m×1)  – a well-defined linear system  ⇒  x = R⁻¹·Qᵀ·b
  Q is found by Gram–Schmidt orthogonalization of the columns of A.
  How to find R? Q·R = A  ⇒  QᵀQ·R = Qᵀ·A, but Q is orthogonal, therefore QᵀQ = I and R = Qᵀ·A.
  R is upper triangular, since in the orthogonalization procedure only a1, …, ak (without ak+1, …) are used to produce qk.
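  Written out explicitly (standard Gram–Schmidt notation; the slide's own formulas are not in the transcript):

  q_k = \frac{a_k - \sum_{j=1}^{k-1} (q_j^T a_k)\, q_j}
             {\bigl\| a_k - \sum_{j=1}^{k-1} (q_j^T a_k)\, q_j \bigr\|},
  \qquad
  r_{jk} = q_j^T a_k \ \ (j \le k), \qquad r_{jk} = 0 \ \ (j > k).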

  12. Least squares via A = QR decomposition 2
  Let us check the correctness:
  Q·R·x = b  ⇒  R·x = Qᵀ·b  ⇒  x = R⁻¹·Qᵀ·b
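  A numerical version of this check (a sketch, not from the deck), comparing the QR-based solution with MATLAB's backslash on a random overdetermined system:

  A = rand(10, 3);  b = rand(10, 1);   % random overdetermined system
  [Q, R] = qr(A, 0);                   % economy-size QR
  x_qr = R \ (Q' * b);                 % QR-based least squares solution
  x_bs = A \ b;                        % reference least squares solution
  norm(x_qr - x_bs)                    % ~ 0 up to rounding error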

  13. Last lecture reminder: QR Factorization – by picture

  14. QR Factorization – Minimization View
  Minimization algorithm:
  For i = 1 to N                  "For each target column"
      For j = 1 to i-1            "For each source column left of the target"
          (orthogonalize the search direction against source column j)
      end
      (normalize)
  end
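  A runnable MATLAB sketch of this loop structure (modified Gram–Schmidt; the function name mgs_qr and the exact update formulas are reconstructions, since the slide's equations are not in the transcript):

  function [Q, R] = mgs_qr(A)
  % QR factorization by Gram-Schmidt in the loop order of the slide:
  % for each target column, orthogonalize against the columns to its left,
  % then normalize.
      [m, n] = size(A);
      Q = zeros(m, n);
      R = zeros(n, n);
      for i = 1:n                        % for each target column
          v = A(:, i);                   % current "search direction"
          for j = 1:i-1                  % for each source column left of the target
              R(j, i) = Q(:, j)' * v;    % projection coefficient r_ji
              v = v - R(j, i) * Q(:, j); % orthogonalize against q_j
          end
          R(i, i) = norm(v);             % normalize
          Q(:, i) = v / R(i, i);
      end
  end

  With this, Q*R reproduces A and R equals Q'*A, as on slide 11.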
