
Math 415: Linear Algebra. Chapter 5: Orthogonality and Least Squares

Math Dept, Faculty of Applied Science, HCM University of Technology. Instructor: Dr. Dang Van Vinh (6/2006).


Presentation Transcript


  1. Math Dept, Faculty of Applied Science, HCM University of Technology. Math 415: Linear Algebra. Chapter 5: Orthogonality and Least Squares. Instructor: Dr. Dang Van Vinh (6/2006)

  2. CONTENTS
3.1 – The Scalar Product in Rn
3.2 – Orthogonal Subspaces
3.3 – Orthonormal Sets
3.4 – The Gram-Schmidt Orthogonalization Process
3.5 – Inner Product Spaces
3.6 – The Least Squares Problem

  3. Definition of the Inner Product in Rn. Let u and v be vectors in Rn. The inner product of u and v is (u,v) = u1v1 + u2v2 + ... + unvn. [3.1 The Scalar Product in Rn]

  4. Example. Let u and v be given vectors in Rn. Compute (u,v) and (v,u). Solution. [3.1 The Scalar Product in Rn]

  5. Theorem. Let u, v, and w be vectors in Rn, and let c be a scalar. Then
a. (u,v) = (v,u)
b. (u+v,w) = (u,w) + (v,w)
c. (cu,v) = c(u,v) = (u,cv)
d. (u,u) ≥ 0, and (u,u) = 0 if and only if u = 0.
The Length of a Vector. The length (or norm) of a vector u is the nonnegative scalar ||u|| defined by ||u|| = √(u,u) = √(u1² + u2² + ... + un²). [3.1 The Scalar Product in Rn]

  6. A vector whose length is 1 is called a unit vector. If we divide a nonzero vector u by its length, we obtain a unit vector u/||u||. The process of creating a unit vector is called normalizing. The Distance Between Two Vectors. For u and v in Rn, the distance between u and v, written dist(u,v), is the length of the vector u - v; that is, dist(u,v) = ||u - v||. Example. 1) Let v = (1,-2,2,0). Find a unit vector u in the same direction as v. 2) Compute the distance between the vectors u = (7,1) and v = (3,2). [3.1 The Scalar Product in Rn]
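The two computations on this slide can be checked numerically. A minimal sketch in Python (the helper names `norm`, `normalize`, and `dist` are ours, not from the slides):

```python
import math

def norm(v):
    # ||v|| = sqrt((v, v)) = sqrt(v1^2 + ... + vn^2)
    return math.sqrt(sum(x * x for x in v))

def normalize(v):
    # unit vector in the same direction as a nonzero v
    n = norm(v)
    return [x / n for x in v]

def dist(u, v):
    # dist(u, v) = ||u - v||
    return norm([a - b for a, b in zip(u, v)])

v = (1, -2, 2, 0)
print(normalize(v))          # [1/3, -2/3, 2/3, 0], since ||v|| = 3
print(dist((7, 1), (3, 2)))  # sqrt(17)
```

Note that normalizing preserves direction: the unit vector is just v scaled by the positive number 1/||v||.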

  7. Definition of Orthogonality. Two vectors u and v in Rn are orthogonal (to each other) if (u,v) = 0. The Pythagorean Theorem. Two vectors u and v in Rn are orthogonal if and only if ||u + v||² = ||u||² + ||v||². Orthogonal Complements. If a vector z is orthogonal to every vector in a subspace W of Rn, then z is said to be orthogonal to W. The set of all vectors z that are orthogonal to W is called the orthogonal complement of W and is denoted by W⊥. [3.2 Orthogonal Subspaces]

  8. Theorem. 1. A vector x is in W⊥ if and only if x is orthogonal to every vector in a set that spans W. 2. W⊥ is a subspace of Rn. Theorem. Let A be an mxn matrix. Then the orthogonal complement of the row space of A is the nullspace of A, and the orthogonal complement of the column space of A is the nullspace of A^T: (Row A)⊥ = Nul A; (Col A)⊥ = Nul A^T. Proof. [3.2 Orthogonal Subspaces]
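The row-space theorem can be sanity-checked numerically: any vector x with Ax = 0 lies in Nul A, so it must be orthogonal to every row of A. A small sketch (the matrix below is our illustrative choice, not from the slides):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Illustrative 2x3 matrix A and a vector x in its nullspace:
# row1 . x = 1 - 2 + 1 = 0 and row2 . x = 2 - 2 + 0 = 0, so Ax = 0.
A = [[1, 1, 1],
     [2, 1, 0]]
x = (1, -2, 1)

# x in Nul A means x is orthogonal to each row, i.e. x is in (Row A)⊥
assert all(dot(row, x) == 0 for row in A)
```

Since every vector in Row A is a linear combination of the rows, orthogonality to the rows is enough to put x in (Row A)⊥.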

  9. Example. Let W be a given subspace of R3. Find a basis and the dimension of W⊥. Solution. dim W⊥ = 1; basis: {(1,-2,1)}. [3.2 Orthogonal Subspaces]

  10. Example. Let F be a given subspace of R3. Find a basis and the dimension of F⊥. Solution. Step 1. Find a spanning set of F. A spanning set of F is {(2,-3,1)}. Step 2. Proceed as in the previous example. [3.2 Orthogonal Subspaces]

  11. Definition of an Orthogonal Set. A set of vectors {u1, u2, ..., up} is said to be an orthogonal set if each pair of distinct vectors from the set is orthogonal. Theorem. If S = {u1, u2, ..., up} is an orthogonal set of nonzero vectors in Rn, then S is linearly independent and hence is a basis for the subspace spanned by S. Proof. Example. Show that {u1, u2, u3} is an orthogonal set, where u1, u2, u3 are given vectors. [3.3 Orthonormal Sets]

  12. Theorem. Let E = {u1, u2, ..., up} be an orthogonal basis for a subspace W of Rn. Then each y in W has a unique representation as a linear combination of E. In fact, if y = c1u1 + c2u2 + ... + cpup, then cj = (y,uj)/(uj,uj) for j = 1, ..., p. Proof. Example. The set E = {u1, u2, u3} is an orthogonal basis for R3, where u1, u2, u3 are given vectors. Express the vector y = (6,1,-8) as a linear combination of the vectors in E. [3.3 Orthonormal Sets]

  13. Solution. Compute the weights cj = (y,uj)/(uj,uj). Remark. Note how easy it is to compute the weights needed to build y from an orthogonal basis. If the basis were not orthogonal, it would be necessary to solve a system of linear equations to find the weights. [3.3 Orthonormal Sets]
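The weight formula cj = (y,uj)/(uj,uj) is directly computable. The slide's basis vectors did not survive extraction, so the sketch below assumes the common textbook choice u1 = (3,1,1), u2 = (-1,2,1), u3 = (-1/2,-2,7/2), which is an orthogonal basis for R3:

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Assumed orthogonal basis (the slide's actual vectors are not recoverable)
u1 = [Fraction(3), Fraction(1), Fraction(1)]
u2 = [Fraction(-1), Fraction(2), Fraction(1)]
u3 = [Fraction(-1, 2), Fraction(-2), Fraction(7, 2)]
y = [Fraction(6), Fraction(1), Fraction(-8)]

# c_j = (y, u_j) / (u_j, u_j) -- no linear system needed
weights = [dot(y, u) / dot(u, u) for u in (u1, u2, u3)]
print(weights)  # [1, -2, -2], i.e. y = u1 - 2*u2 - 2*u3
```

Exact rationals (`Fraction`) keep the weights clean; with a non-orthogonal basis one would instead have to solve the 3x3 system [u1 u2 u3] c = y.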

  14. Definition of an Orthonormal Set. A set E = {u1, u2, ..., up} is an orthonormal set if it is an orthogonal set of unit vectors. If W is the subspace spanned by such a set, then E is an orthonormal basis for W, since the set is automatically linearly independent. Example. Show that {v1, v2, v3} is an orthonormal set, where v1, v2, v3 are given vectors. Theorem. An mxn matrix U has orthonormal columns if and only if U^T U = I. Proof. [3.3 Orthonormal Sets]

  15. Theorem. Let U be an mxn matrix with orthonormal columns, and let x and y be in Rn. Then
a. ||Ux|| = ||x||
b. (Ux,Uy) = (x,y)
c. (Ux,Uy) = 0 if and only if (x,y) = 0.
Proof. Definition of an Orthogonal Matrix. An orthogonal matrix is a square invertible matrix U such that U^-1 = U^T. Theorem. A square matrix with orthonormal columns is an orthogonal matrix. [3.3 Orthonormal Sets]

  16. The Gram-Schmidt process is a simple algorithm for producing an orthogonal or orthonormal basis for any subspace of Rn. Example. Let W = Span{x1, x2}, where x1 = (3,6,0) and x2 = (1,2,2). Construct an orthogonal basis {v1, v2} for W. Solution. Let p be the projection of x2 onto x1: p = ((x2,x1)/(x1,x1)) x1 = (1,2,0). The component of x2 orthogonal to x1 is x2 - p, which is in W because it is formed from x2 and a multiple of x1. Let v1 = x1 and v2 = x2 - ((x2,v1)/(v1,v1)) v1 = (0,0,2). Then {v1, v2} is an orthogonal set of nonzero vectors in W. [3.4 The Gram-Schmidt Orthogonalization Process]

  17. Example. Let W = Span{x1 = (1,1,1,1), x2 = (0,1,1,1), x3 = (0,0,1,1)}. Construct an orthogonal basis {v1, v2, v3} for W. Solution. Step 1. Let v1 = x1 and W1 = Span{x1} = Span{v1}. Step 2. Compute x2 - ((x2,v1)/(v1,v1)) v1 = (-3/4, 1/4, 1/4, 1/4); to simplify later computations, scale and choose v2 = (-3,1,1,1). Step 3. v3 = x3 - ((x3,v1)/(v1,v1)) v1 - ((x3,v2)/(v2,v2)) v2 = (0, -2/3, 1/3, 1/3). [3.4 The Gram-Schmidt Orthogonalization Process]

  18. The Gram-Schmidt Process. Given a basis {x1, x2, ..., xp} for a subspace W of Rn, define
v1 = x1
v2 = x2 - ((x2,v1)/(v1,v1)) v1
...
vp = xp - ((xp,v1)/(v1,v1)) v1 - ... - ((xp,vp-1)/(vp-1,vp-1)) vp-1
Then {v1, v2, ..., vp} is an orthogonal basis for W. In addition, Span{v1, ..., vk} = Span{x1, ..., xk} for 1 ≤ k ≤ p. [3.4 The Gram-Schmidt Orthogonalization Process]
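The recursion above translates directly into code. A sketch using exact rational arithmetic, checked against the vectors x1 = (1,1,1,1), x2 = (0,1,1,1), x3 = (0,0,1,1) from the earlier example (the function name `gram_schmidt` is ours):

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(xs):
    """v_k = x_k - sum over j < k of ((x_k, v_j) / (v_j, v_j)) v_j."""
    vs = []
    for x in xs:
        v = list(x)
        for w in vs:
            coef = Fraction(dot(x, w)) / Fraction(dot(w, w))
            v = [vi - coef * wi for vi, wi in zip(v, w)]
        vs.append(v)
    return vs

xs = [[1, 1, 1, 1], [0, 1, 1, 1], [0, 0, 1, 1]]
v1, v2, v3 = gram_schmidt(xs)
print(v2)  # [-3/4, 1/4, 1/4, 1/4], a multiple of (-3, 1, 1, 1)
print(v3)  # [0, -2/3, 1/3, 1/3]
```

The slide scales v2 to (-3,1,1,1) by hand; scaling an orthogonal vector by a nonzero constant does not affect orthogonality, so either choice works.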

  19. Definition of an Inner Product. An inner product on a vector space V is a function that, to each pair of vectors u and v in V, associates a real number (u,v) and satisfies the following axioms, for all u, v, w in V and all scalars c:
a. (u,v) = (v,u)
b. (u+v,w) = (u,w) + (v,w)
c. (cu,v) = c(u,v)
d. (u,u) ≥ 0, and (u,u) = 0 if and only if u = 0.
A vector space with an inner product is called an inner product space. [3.5 Inner Product Spaces]

  20. Example. For vectors x = (x1,x2) and y = (y1,y2) in R2, define (x,y) by a given formula. Show that (x,y) is an inner product. Example. For vectors x = (x1,x2) and y = (y1,y2) in R2, define (x,y) by another given formula. Show that (x,y) is an inner product. Example. For vectors x = (x1,x2,x3) and y = (y1,y2,y3) in R3, define (x,y) by a given formula. Show that (x,y) is an inner product. [3.5 Inner Product Spaces]
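The slide's formulas did not survive extraction, but a typical exercise of this kind uses a weighted product. As an illustration (our assumed formula, not the slide's), take (x,y) = 2*x1*y1 + 3*x2*y2 on R2 and spot-check the four axioms on random vectors:

```python
import random

# Assumed weighted inner product on R^2 (illustrative, not the slide's formula)
def ip(x, y):
    return 2 * x[0] * y[0] + 3 * x[1] * y[1]

random.seed(0)
for _ in range(100):
    u = [random.uniform(-5, 5) for _ in range(2)]
    v = [random.uniform(-5, 5) for _ in range(2)]
    w = [random.uniform(-5, 5) for _ in range(2)]
    c = random.uniform(-5, 5)
    assert abs(ip(u, v) - ip(v, u)) < 1e-9                # a. symmetry
    uv = [a + b for a, b in zip(u, v)]
    assert abs(ip(uv, w) - (ip(u, w) + ip(v, w))) < 1e-9  # b. additivity
    cu = [c * a for a in u]
    assert abs(ip(cu, v) - c * ip(u, v)) < 1e-9           # c. homogeneity
    assert ip(u, u) >= 0                                  # d. positivity
```

Random spot-checks are not a proof, of course; the actual exercise asks for an algebraic verification, where positivity follows because the weights 2 and 3 are positive.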

  21. Example. For vectors p(x) and q(x) in P2, define (p,q) by a given formula.
a. Show that (p,q) is an inner product.
b. Compute (p,q) for given p and q.
c. Compute the length of a given vector.
d. Compute the distance between given p(x) and q(x).
e. Compute the angle between the two vectors in d). [3.5 Inner Product Spaces]

  22. Example. For vectors p(x) and q(x) in P2, define (p,q) by a given formula. Given a subspace F:
a. Find a spanning set of the subspace F.
b. Determine m such that a given vector belongs to F.
c. Find a basis and the dimension of F⊥.
d. Produce an orthogonal basis for P2 by applying the Gram-Schmidt process to the polynomials 1, x, x². [3.5 Inner Product Spaces]

  23. 3.6 The Least Squares Problem
