
Lecture 12 Inner Product Space & Linear Transformation



  1. Lecture 12 Inner Product Space & Linear Transformation Last Time - Length and Dot Product in Rn - Inner Product Spaces - Orthonormal Bases: Gram-Schmidt Process Elementary Linear Algebra, R. Larsen et al. (5th Edition) TKUEE 翁慶昌 - NTUEE SCC_12_2007

  2. Lecture 12: Inner Product Spaces & L.T. Today • Mathematical Models and Least Squares Analysis • Inner Product Space Applications • Introduction to Linear Transformations Reading Assignment: Secs 5.4, 5.5, 6.1, 6.2 Next Time • The Kernel and Range of a Linear Transformation • Matrices for Linear Transformations • Transition Matrix and Similarity Reading Assignment: Secs 6.2-6.4

  3. What Have You Actually Learned about Inner Product Spaces So Far?

  4. Today • Orthonormal Bases: Gram-Schmidt Process (Cont.) • Mathematical Models and Least Squares Analysis • Inner Product Space Applications • Introduction to Linear Transformations

  5. Gram-Schmidt orthonormalization process: If B = {v1, v2, …, vn} is a basis for an inner product space V, then B' = {w1, w2, …, wn}, where w1 = v1 and wi = vi − Σ_{j<i} (<vi, wj>/<wj, wj>) wj for i = 2, …, n, is an orthogonal basis, and B'' = {u1, …, un} with ui = wi/||wi|| is an orthonormal basis.
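A minimal Python sketch of the process above, using the Euclidean dot product on Rn; the example basis B is made up for illustration:

```python
import numpy as np

def gram_schmidt(basis):
    """Return an orthonormal basis from a linearly independent list.

    Orthogonal step:  w_i = v_i - sum_{j<i} (<v_i, w_j>/<w_j, w_j>) w_j
    Normalizing step: u_i = w_i / ||w_i||
    """
    orthogonal = []
    for v in basis:
        v = np.asarray(v, dtype=float)
        w = v.copy()
        for wj in orthogonal:
            w -= (v @ wj) / (wj @ wj) * wj   # subtract projection onto w_j
        orthogonal.append(w)
    return [w / np.linalg.norm(w) for w in orthogonal]

# Made-up basis of R^3 for illustration
U = gram_schmidt([(1, 1, 0), (1, 2, 0), (0, 1, 2)])
print(np.round(np.array(U) @ np.array(U).T, 10))  # identity => orthonormal
```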

  6. Ex 7: (Applying the Gram-Schmidt orthonormalization process) Apply the Gram-Schmidt process to the following basis. Sol:

  7. The process first yields an orthogonal basis; normalizing each vector then yields an orthonormal basis.

  8. Ex 10: (Alternative form of Gram-Schmidt orthonormalization process) Find an orthonormal basis for the solution space of the homogeneous system of linear equations. Sol:

  9. Thus one basis for the solution space is obtained; applying the Gram-Schmidt process to it gives an orthogonal basis, and normalizing gives an orthonormal basis.
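For comparison, SciPy's null_space computes an orthonormal basis of a solution space directly (via the SVD rather than Gram-Schmidt); the matrix A below is a made-up system, not the one from Ex 10:

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical homogeneous system Ax = 0
A = np.array([[1.0, 2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0, 1.0]])
N = null_space(A)              # columns = orthonormal basis of the solution space
print(np.round(A @ N, 10))     # zero matrix: each column solves Ax = 0
print(np.round(N.T @ N, 10))   # identity: the basis is orthonormal
```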

  10. Keywords in Section 5.3: • orthogonal set: 正交集合 • orthonormal set: 單範正交集合 • orthogonal basis: 正交基底 • orthonormal basis: 單範正交基底 • linearly independent: 線性獨立 • Gram-Schmidt Process: Gram-Schmidt過程

  11. Today • Orthonormal Bases: Gram-Schmidt Process (Cont.) • Mathematical Models and Least Squares Analysis • Inner Product Space Applications • Introduction to Linear Transformations

  12. 5.4 Mathematical Models and Least Squares Analysis • Orthogonal complement of W: Let W be a subspace of an inner product space V. (a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W. (b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, denoted W⊥ (read “W perp”).

  13. Notes: • Ex:

  14. Thm 5.13: (Properties of orthogonal subspaces) Let W be a subspace of Rn. Then the following properties are true. (1) dim(W) + dim(W⊥) = n (2) Rn = W ⊕ W⊥ (3) (W⊥)⊥ = W • Direct sum: Let W1 and W2 be two subspaces of Rn. If each vector x in Rn can be uniquely written as a sum x = w1 + w2 of a vector w1 from W1 and a vector w2 from W2, then Rn is the direct sum of W1 and W2, and you can write Rn = W1 ⊕ W2.

  15. Thm 5.14: (Projection onto a subspace) If {u1, u2, …, ut} is an orthonormal basis for the subspace S of V, and v is in V, then proj_S v = <v, u1>u1 + <v, u2>u2 + … + <v, ut>ut.
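A short Python sketch of Thm 5.14, using a hypothetical subspace S (the xy-plane of R^3) with orthonormal basis {e1, e2} and a made-up v:

```python
import numpy as np

def project_onto_subspace(v, orthonormal_basis):
    """proj_S(v) = sum_i <v, u_i> u_i for an orthonormal basis {u_i} of S."""
    v = np.asarray(v, dtype=float)
    return sum((v @ u) * u for u in orthonormal_basis)

u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
v  = np.array([3.0, -2.0, 5.0])
p = project_onto_subspace(v, [u1, u2])
print(p)                            # [ 3. -2.  0.]
print((v - p) @ u1, (v - p) @ u2)   # 0.0 0.0 => residual is orthogonal to S
```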

  16. Ex 5: (Projection onto a subspace) Find the projection of the vector v onto the subspace W. Sol: an orthogonal basis for W an orthonormal basis for W

  17. Find proj_W v by another method:

  18. Thm 5.15: (Orthogonal projection and distance) Let W be a subspace of an inner product space V, and let v be in V. Then for all w in W with w ≠ proj_W v, ||v − proj_W v|| < ||v − w|| (proj_W v is the best approximation to v from W).

  19. Pf: Write v − w = (v − proj_W v) + (proj_W v − w). The first term is orthogonal to W while the second lies in W, so by the Pythagorean theorem ||v − w||^2 = ||v − proj_W v||^2 + ||proj_W v − w||^2. For w ≠ proj_W v the second summand is positive, hence ||v − w|| > ||v − proj_W v||.

  20. Notes: (1) Among all the scalar multiples of a vector u, the orthogonal projection of v onto u is the one that is closest to v. (2) Among all the vectors in the subspace W, the vector proj_W v is the closest vector to v.
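A quick numerical illustration of note (2), with a made-up v and a hypothetical subspace W spanned by e1 and e2: the projection beats every randomly sampled vector in W:

```python
import numpy as np

rng = np.random.default_rng(0)
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
v  = np.array([3.0, -2.0, 5.0])
proj = (v @ u1) * u1 + (v @ u2) * u2     # Thm 5.14 projection formula
best = np.linalg.norm(v - proj)
for _ in range(1000):
    a, b = rng.normal(size=2)
    w = a * u1 + b * u2                  # a random vector in W
    assert np.linalg.norm(v - w) >= best - 1e-12
print("proj_W(v) beat 1000 random candidates; distance =", best)
```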

  21. Thm 5.16: (Fundamental subspaces of a matrix) If A is an m×n matrix, then (1) (R(A))⊥ = N(A^T) (2) (N(A^T))⊥ = R(A) (3) (R(A^T))⊥ = N(A) (4) (N(A))⊥ = R(A^T), where R(A) is the column space of A, R(A^T) is the row space of A, and N(A), N(A^T) are the nullspaces of A and A^T.
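A numerical check of these orthogonality relations on a made-up matrix; SciPy's orth and null_space return orthonormal bases of the column/row spaces and nullspaces:

```python
import numpy as np
from scipy.linalg import null_space, orth

A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 0.0]])      # rank 1, so all four subspaces are nontrivial
rowspace = orth(A.T)                 # orthonormal basis of R(A^T)
nullA    = null_space(A)             # orthonormal basis of N(A)
colspace = orth(A)                   # orthonormal basis of R(A)
nullAT   = null_space(A.T)           # orthonormal basis of N(A^T)

print(np.round(rowspace.T @ nullA, 10))   # zeros: N(A) is orthogonal to R(A^T)
print(np.round(colspace.T @ nullAT, 10))  # zeros: N(A^T) is orthogonal to R(A)
```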

  22. Ex 6: (Fundamental subspaces) Find the four fundamental subspaces of the matrix. (reduced row-echelon form) Sol:

  23. Check:

  24. Ex 3: Let W be a subspace of R4. (a) Find a basis for W. (b) Find a basis for the orthogonal complement of W. Sol: (reduced row-echelon form)

  25. Notes: is a basis for W

  26. Least squares problem: Ax = b (a system of linear equations) (1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x. (2) When the system is inconsistent, we look for the “best possible” solution of the system, that is, the value of x for which the difference between Ax and b is as small as possible.

  27. Least squares solution: Given a system Ax = b of m linear equations in n unknowns, the least squares problem is to find a vector x in Rn that minimizes ||b − Ax|| with respect to the Euclidean inner product on Rm. Such a vector is called a least squares solution of Ax = b.

  28. A^T Ax = A^T b (the normal system associated with Ax = b)

  29. Note: The problem of finding the least squares solution of Ax = b is equivalent to the problem of finding an exact solution of the associated normal system A^T Ax = A^T b. • Thm: For any linear system Ax = b, the associated normal system A^T Ax = A^T b is consistent, and all solutions of the normal system are least squares solutions of Ax = b. Moreover, if W is the column space of A, and x is any least squares solution of Ax = b, then the orthogonal projection of b on W is proj_W b = Ax.

  30. Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least squares solution. This solution is given by x = (A^T A)^(−1) A^T b. Moreover, if W is the column space of A, then the orthogonal projection of b on W is proj_W b = Ax = A(A^T A)^(−1) A^T b.
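A Python sketch of this theorem on a small made-up system (not the data of Ex 7): solve the normal equations, then verify that the residual is orthogonal to the column space:

```python
import numpy as np

# Overdetermined system with linearly independent columns,
# so A^T A is invertible and the least squares solution is unique.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([6.0, 0.0, 0.0])

x = np.linalg.solve(A.T @ A, A.T @ b)   # x = (A^T A)^(-1) A^T b
proj = A @ x                            # orthogonal projection of b on col(A)

print(x)                                # least squares solution
print(np.round(A.T @ (b - proj), 10))   # zeros: residual is orthogonal to col(A)
# np.linalg.lstsq solves the same problem with a more stable algorithm:
print(np.linalg.lstsq(A, b, rcond=None)[0])
```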

  31. Ex 7: (Solving the normal equations) Find the least squares solution of the following system and find the orthogonal projection of b on the column space of A.

  32. Sol: the associated normal system

  33. the least squares solution of Ax = b the orthogonal projection of b on the column space of A

  34. Keywords in Section 5.4: • orthogonal to W: 正交於W • orthogonal complement: 正交補集 • direct sum: 直和 • projection onto a subspace: 在子空間的投影 • fundamental subspaces: 基本子空間 • least squares problem: 最小平方問題 • normal equations: 一般方程式

  35. 6.1 Introduction to Linear Transformations • Function T that maps a vector space V into a vector space W (written T: V → W): V: the domain of T; W: the codomain of T

  36. Image of v under T: If v is in V and w is in W such that T(v) = w, then w is called the image of v under T. • The range of T: the set of all images of vectors in V. • The preimage of w: the set of all v in V such that T(v) = w.

  37. Ex 1: (A function from R2 into R2) T(v1, v2) = (v1 − v2, v1 + 2v2) (a) Find the image of v = (−1, 2). (b) Find the preimage of w = (−1, 11). Sol: (a) T(−1, 2) = (−1 − 2, −1 + 2·2) = (−3, 3). (b) T(v1, v2) = (−1, 11) means v1 − v2 = −1 and v1 + 2v2 = 11, whose unique solution is v1 = 3, v2 = 4. Thus {(3, 4)} is the preimage of w = (−1, 11).

  38. Linear Transformation (L.T.): Let V and W be vector spaces. The function T: V → W is called a linear transformation of V into W if the following two properties hold for all u, v in V and every scalar c: (1) T(u + v) = T(u) + T(v) (2) T(cu) = cT(u)

  39. Notes: (1) In property (1), the addition on the left is addition in V and the addition on the right is addition in W; in property (2), the scalar multiplication on the left is in V and on the right is in W. A linear transformation is therefore said to be operation preserving. (2) A linear transformation from a vector space into itself is called a linear operator.

  40. Ex 2: (Verifying a linear transformation T from R2 into R2) Pf:

  41. Therefore, T is a linear transformation.
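Since the slide's worked verification is symbolic, here is a numerical spot-check of the two defining properties for a hypothetical map T(v1, v2) = (v1 − v2, v1 + 2v2); passing random trials does not prove linearity, but a single failing trial would disprove it (cf. Ex 3 below):

```python
import numpy as np

def T(v):
    return np.array([v[0] - v[1], v[0] + 2.0 * v[1]])

rng = np.random.default_rng(1)
for _ in range(100):
    u, v, c = rng.normal(size=2), rng.normal(size=2), rng.normal()
    assert np.allclose(T(u + v), T(u) + T(v))   # preserves vector addition
    assert np.allclose(T(c * u), c * T(u))      # preserves scalar multiplication
print("100 random trials passed")
```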

  42. Ex 3: (Functions that are not linear transformations)

  43. Notes: Two uses of the term “linear”. (1) f(x) = x + 1 is called a linear function because its graph is a line. (2) f(x) = x + 1 is not a linear transformation from the vector space R into R because it preserves neither vector addition nor scalar multiplication: f(x + y) = x + y + 1, while f(x) + f(y) = x + y + 2.

  44. Zero transformation: T: V → W with T(v) = 0 for every v in V. • Identity transformation: T: V → V with T(v) = v for every v in V. • Thm 6.1: (Properties of linear transformations) If T is a linear transformation, then (1) T(0) = 0 (2) T(−v) = −T(v) (3) T(u − v) = T(u) − T(v) (4) If v = c1v1 + c2v2 + … + cnvn, then T(v) = c1T(v1) + c2T(v2) + … + cnT(vn).

  45. Ex 4: (Linear transformations and bases) Let T: R3 → R3 be a linear transformation such that T(1, 0, 0), T(0, 1, 0), and T(0, 0, 1) are given. Find T(2, 3, −2). Sol: Since (2, 3, −2) = 2(1, 0, 0) + 3(0, 1, 0) − 2(0, 0, 1) and T is a L.T., T(2, 3, −2) = 2T(1, 0, 0) + 3T(0, 1, 0) − 2T(0, 0, 1).

  46. Ex 5: (A linear transformation defined by a matrix) The function T is defined by T(v) = Av for a matrix A. Sol: T preserves vector addition, since A(u + v) = Au + Av, and scalar multiplication, since A(cv) = c(Av); hence T is a linear transformation.

  47. Thm 6.2: (The linear transformation given by a matrix) Let A be an m×n matrix. The function T defined by T(v) = Av is a linear transformation from Rn into Rm. • Note: vectors in Rn are written as n×1 column matrices, so Av is an m×1 column matrix in Rm.
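A small illustration of Thm 6.2 with a made-up 3×2 matrix, so T: R2 → R3; note that the image of the j-th standard basis vector is the j-th column of A:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
T = lambda v: A @ v                  # the linear transformation T(v) = Av

print(T(np.array([1.0, 0.0])))       # [1. 0. 1.] = first column of A
print(T(np.array([0.0, 1.0])))       # [0. 2. 1.] = second column of A
print(T(np.array([3.0, -1.0])))      # 3*(col 1) - 1*(col 2), by linearity
```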

  48. Ex 7: (Rotation in the plane) Show that the L.T. T: R2 → R2 given by the matrix A = [cos θ  −sin θ; sin θ  cos θ] has the property that it rotates every vector in R2 counterclockwise about the origin through the angle θ. Sol: Write v = (x, y) = (r cos α, r sin α) (polar coordinates), where r is the length of v and α is the angle from the positive x-axis counterclockwise to the vector v.

  49. Then T(v) = Av = (r cos(α + θ), r sin(α + θ)), so r is the length of T(v) and α + θ is the angle from the positive x-axis counterclockwise to the vector T(v). Thus, T(v) is the vector that results from rotating the vector v counterclockwise through the angle θ.
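A concrete check of Ex 7 in Python for θ = π/2: rotating (1, 0) a quarter turn counterclockwise gives (0, 1), and R^T R = I confirms that rotations preserve lengths:

```python
import numpy as np

def rotation(theta):
    """The 2x2 rotation matrix of Ex 7."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

print(np.round(rotation(np.pi / 2) @ np.array([1.0, 0.0]), 10))  # [0. 1.]

# Rotations preserve length: R^T R = I, so ||Rv|| = ||v|| for every v.
R = rotation(0.7)
print(np.round(R.T @ R, 10))   # 2x2 identity
```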

  50. Ex 8: (A projection in R3) The linear transformation T: R3 → R3 given by T(x, y, z) = (x, y, 0) is called a projection in R3: it maps each vector in R3 onto the xy-plane.
