
Lecture 13 Inner Product Space & Linear Transformation



  1. Lecture 13 Inner Product Space & Linear Transformation Last Time - Orthonormal Bases: Gram-Schmidt Process - Mathematical Models and Least Squares Analysis - Inner Product Space Applications Elementary Linear Algebra R. Larson et al. (5th Edition) TKUEE 翁慶昌 - NTUEE SCC_12_2007

  2. Lecture 13: Inner Product Spaces & L.T. Today • Mathematical Models and Least Squares Analysis • Inner Product Space Applications • Introduction to Linear Transformations Reading Assignment: Secs 5.4, 5.5, 6.1, 6.2 Next Time • The Kernel and Range of a Linear Transformation • Matrices for Linear Transformations • Transition Matrix and Similarity Reading Assignment: Secs 6.2-6.4

  3. What Have You Actually Learned about Projection So Far?

  4. 5.4 Mathematical Models and Least Squares Analysis • Orthogonal complement of W: Let W be a subspace of an inner product space V. (a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W. (b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, denoted W⊥ (read "W perp").

  5. Thm 5.13: (Properties of orthogonal subspaces) Let W be a subspace of Rn. Then the following properties are true. (1) dim(W) + dim(W⊥) = n (2) Rn = W ⊕ W⊥ (3) (W⊥)⊥ = W • Direct sum: Let W1 and W2 be two subspaces of Rn. If each vector x in Rn can be uniquely written as a sum of a vector w1 from W1 and a vector w2 from W2, x = w1 + w2, then Rn is the direct sum of W1 and W2, and we write Rn = W1 ⊕ W2.

  6. Find the projection onto the subspace by the other method:

  7. Thm 5.16: (Fundamental subspaces of a matrix) If A is an m×n matrix, then (1) (R(A))⊥ = N(A^T) (2) (R(A^T))⊥ = N(A) (3) R(A) ⊕ N(A^T) = Rm (4) R(A^T) ⊕ N(A) = Rn, where R(A) is the column space of A, R(A^T) is the row space of A, and N(·) denotes the null space.
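As a numerical companion to Thm 5.16 (my own sketch, not from the slides), the code below uses SciPy to compute orthonormal bases for the four fundamental subspaces of a small made-up matrix and checks that the row space and the null space are orthogonal.

```python
# Sketch: the four fundamental subspaces of a hypothetical 2x3 matrix A.
import numpy as np
from scipy.linalg import orth, null_space

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])     # made-up example matrix

col_A   = orth(A)          # orthonormal basis for the column space R(A) in Rm
row_A   = orth(A.T)        # orthonormal basis for the row space R(A^T) in Rn
null_A  = null_space(A)    # orthonormal basis for the null space N(A)
null_At = null_space(A.T)  # orthonormal basis for N(A^T)

# Per Thm 5.16, N(A) is the orthogonal complement of R(A^T) in Rn:
print(np.allclose(row_A.T @ null_A, 0))   # True
# And the dimensions split Rn: rank + nullity = n.
print(row_A.shape[1] + null_A.shape[1])   # 3
```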

  8. Ex 6: (Fundamental subspaces) Find the four fundamental subspaces of the matrix A (given in reduced row-echelon form). Sol:

  9. Check:

  10. Ex 3: Let W be a subspace of R4 spanned by the given set of vectors. (a) Find a basis for W. (b) Find a basis for the orthogonal complement of W. Sol: (reduced row-echelon form)

  11. Notes: the set obtained above is a basis for W

  12. Least Squares Problem • Least squares problem: Ax = b (a system of linear equations) (1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x. (2) When the system is inconsistent, how do we find the "best possible" solution, that is, the value of x for which the difference between Ax and b is as small as possible?

  13. Least squares solution: Given a system Ax = b of m linear equations in n unknowns, the least squares problem is to find a vector x in Rn that minimizes ||Ax − b|| with respect to the Euclidean inner product on Rm. Such a vector is called a least squares solution of Ax = b.

  14. A^T A x = A^T b (the normal system associated with Ax = b)

  15. Note: The problem of finding the least squares solution of Ax = b is equal to the problem of finding an exact solution of the associated normal system A^T A x = A^T b. • Thm: For any linear system Ax = b, the associated normal system A^T A x = A^T b is consistent, and all solutions of the normal system are least squares solutions of Ax = b. Moreover, if W is the column space of A, and x is any least squares solution of Ax = b, then the orthogonal projection of b on W is proj_W b = Ax.

  16. Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least squares solution. This solution is given by x = (A^T A)^(−1) A^T b. Moreover, if W is the column space of A, then the orthogonal projection of b on W is proj_W b = Ax = A (A^T A)^(−1) A^T b.
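The theorem translates directly into a few lines of NumPy. The sketch below (with a made-up A and b) solves the normal system A^T A x = A^T b, forms the projection of b onto the column space, and cross-checks against NumPy's built-in least squares solver.

```python
# Sketch: least squares via the normal equations (A and b are made up).
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])    # independent columns, so A^T A is invertible
b = np.array([0.0, 1.0, 3.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # unique least squares solution
proj_b = A @ x_hat             # orthogonal projection of b onto the column space

# Cross-check against the library solver.
x_np, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_np))             # True
print(x_hat, proj_b)
```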

  17. Ex 7: (Solving the normal equations) Find the least squares solution of the following system and find the orthogonal projection of b on the column space of A.

  18. Sol: the associated normal system

  19. the least squares solution of Ax = b the orthogonal projection of b on the column space of A

  20. Keywords in Section 5.4: • orthogonal to W: 正交於W • orthogonal complement: 正交補集 • direct sum: 直和 • projection onto a subspace: 在子空間的投影 • fundamental subspaces: 基本子空間 • least squares problem: 最小平方問題 • normal equations: 一般方程式

  21. Application: Cross Product • Cross product (vector product) of two vectors A and B: the result A × B is a vector with magnitude |A||B| sin θ. • Direction: use the right-hand rule. • The cross product is not commutative: A × B = −B × A. • The cross product is distributive: A × (B + C) = A × B + A × C.

  22. Application: Cross Product • Parallelogram representation of the vector product: |A × B| = |A||B| sin θ equals the area of the parallelogram with sides A and B (base |A|, height |B| sin θ). [Figure: parallelogram spanned by A and B in the xy-plane, with height B sin θ]
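The anti-commutativity, distributivity, and area interpretation above are easy to check numerically; this is an illustrative sketch with arbitrarily chosen vectors.

```python
# Sketch: cross-product identities and the parallelogram area.
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([1.0, 2.0, 0.0])
c = np.array([0.0, 1.0, 3.0])

print(np.allclose(np.cross(a, b), -np.cross(b, a)))   # A x B = -(B x A)
print(np.allclose(np.cross(a, b + c),
                  np.cross(a, b) + np.cross(a, c)))   # distributivity

# |a x b| is the area of the parallelogram with sides a and b.
print(np.linalg.norm(np.cross(a, b)))                 # 2.0 for these vectors
```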

  23. Triple Scalar Product of Vectors • The triple scalar product A · (B × C) is a scalar. • The dot and the cross may be interchanged: A · (B × C) = (A × B) · C.

  24. Triple Scalar Product of Vectors • Parallelepiped representation of the triple scalar product: |A · (B × C)| is the volume of the parallelepiped defined by A, B, and C. [Figure: parallelepiped spanned by A, B, and C on the x, y, z axes]
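Likewise, the interchange identity and the volume interpretation can be confirmed in a few lines; the vectors here are again arbitrary examples.

```python
# Sketch: triple scalar product and the parallelepiped volume.
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 2.0, 0.0])
c = np.array([0.0, 1.0, 3.0])

lhs = np.dot(a, np.cross(b, c))   # a . (b x c)
rhs = np.dot(np.cross(a, b), c)   # (a x b) . c
print(np.isclose(lhs, rhs))       # True: dot and cross interchange

# |a . (b x c)| = |det[a; b; c]| = volume of the parallelepiped.
vol = abs(np.linalg.det(np.vstack([a, b, c])))
print(np.isclose(vol, abs(lhs)))  # True (volume = 6 here)
```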

  25. Fourier Approximation

  26. Fourier Approximation • The Fourier series transforms a given periodic function into a superposition of sine and cosine waves. • The following equations are used: f(x) ≈ a0/2 + Σ (ak cos kx + bk sin kx) for k = 1, …, n, with ak = (1/π) ∫ f(x) cos kx dx and bk = (1/π) ∫ f(x) sin kx dx, the integrals taken over one period [0, 2π].
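As a sketch of how those coefficient formulas are used in practice (my own construction, with the integrals approximated by the trapezoid rule), the code below builds an n-term Fourier approximation of a 2π-periodic function.

```python
# Sketch: numerical Fourier coefficients a_k, b_k on [0, 2*pi].
import numpy as np

def fourier_coeffs(f, n_terms, n_samples=2048):
    """Approximate a0, a_k, b_k of f via the trapezoid rule."""
    x = np.linspace(0.0, 2.0 * np.pi, n_samples)
    fx = f(x)
    a0 = np.trapz(fx, x) / np.pi
    a = np.array([np.trapz(fx * np.cos(k * x), x) / np.pi
                  for k in range(1, n_terms + 1)])
    b = np.array([np.trapz(fx * np.sin(k * x), x) / np.pi
                  for k in range(1, n_terms + 1)])
    return a0, a, b

def fourier_eval(x, a0, a, b):
    """Evaluate a0/2 + sum_k (a_k cos kx + b_k sin kx)."""
    s = a0 / 2.0 + np.zeros_like(x)
    for k in range(1, len(a) + 1):
        s += a[k - 1] * np.cos(k * x) + b[k - 1] * np.sin(k * x)
    return s

# Example: approximate f(x) = x with 5 terms.
a0, a, b = fourier_coeffs(lambda x: x, n_terms=5)
x = np.array([1.0, 2.0, 3.0])
print(fourier_eval(x, a0, a, b))   # close to x away from the jump at 0 / 2*pi
```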

  27. Today • Mathematical Models and Least Squares Analysis (Cont.) • Inner Product Space Applications • Introduction to Linear Transformations • The Kernel and Range of a Linear Transformation

  28. 6.1 Introduction to Linear Transformations • Function T that maps a vector space V into a vector space W: T: V → W. V: the domain of T. W: the codomain of T.

  29. Image of v under T: If v is in V and w is in W such that T(v) = w, then w is called the image of v under T. • the range of T: The set of all images of vectors in V. • the preimage of w: The set of all v in V such that T(v) = w.

  30. Ex 1: (A function from R2 into R2) T(v1, v2) = (v1 − v2, v1 + 2v2) (a) Find the image of v = (−1, 2). (b) Find the preimage of w = (−1, 11). Sol: (a) T(−1, 2) = (−1 − 2, −1 + 2(2)) = (−3, 3). (b) T(v1, v2) = (−1, 11) gives v1 − v2 = −1 and v1 + 2v2 = 11, so v1 = 3 and v2 = 4. Thus {(3, 4)} is the preimage of w = (−1, 11).

  31. Linear Transformation (L.T.): Let V and W be vector spaces. The function T: V → W is called a linear transformation of V into W if the following two properties are true for all u and v in V and any scalar c: (1) T(u + v) = T(u) + T(v) (2) T(cu) = cT(u)

  32. Property (1) relates addition in V to addition in W; property (2) relates scalar multiplication in V to scalar multiplication in W. • Notes: (1) A linear transformation is said to be operation preserving. (2) A linear transformation from a vector space into itself is called a linear operator.

  33. Ex 2: (Verifying a linear transformation T from R2 into R2) T(v1, v2) = (v1 − v2, v1 + 2v2) Pf: For u = (u1, u2), v = (v1, v2), and any scalar c: T(u + v) = ((u1 + v1) − (u2 + v2), (u1 + v1) + 2(u2 + v2)) = (u1 − u2, u1 + 2u2) + (v1 − v2, v1 + 2v2) = T(u) + T(v), and T(cu) = (cu1 − cu2, cu1 + 2cu2) = c(u1 − u2, u1 + 2u2) = cT(u).

  34. Therefore, T is a linear transformation.
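A quick numerical spot check of the two defining properties for this T (my own sketch, with random test vectors):

```python
# Sketch: check operation preservation for T(v1, v2) = (v1 - v2, v1 + 2 v2).
import numpy as np

def T(v):
    return np.array([v[0] - v[1], v[0] + 2.0 * v[1]])

rng = np.random.default_rng(0)
u, v = rng.standard_normal(2), rng.standard_normal(2)
c = 3.0

print(np.allclose(T(u + v), T(u) + T(v)))   # additivity
print(np.allclose(T(c * u), c * T(u)))      # homogeneity
```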

  35. Ex 3: (Functions that are not linear transformations) (a) f(x) = sin x, since sin(x1 + x2) ≠ sin x1 + sin x2 in general. (b) f(x) = x^2, since (x1 + x2)^2 ≠ x1^2 + x2^2 in general. (c) f(x) = x + 1, since f(x1 + x2) = x1 + x2 + 1 ≠ f(x1) + f(x2) = x1 + x2 + 2.

  36. Notes: Two uses of the term "linear". (1) f(x) = x + 1 is called a linear function because its graph is a line. (2) f(x) = x + 1 is not a linear transformation from the vector space R into R because it preserves neither vector addition nor scalar multiplication.

  37. Zero transformation: T: V → W, T(v) = 0 for every v in V. • Identity transformation: T: V → V, T(v) = v for every v in V. • Thm 6.1: (Properties of linear transformations) If T is a linear transformation, then (1) T(0) = 0 (2) T(−v) = −T(v) (3) T(u − v) = T(u) − T(v) (4) If v = c1v1 + c2v2 + … + cnvn, then T(v) = c1T(v1) + c2T(v2) + … + cnT(vn).

  38. Ex 4: (Linear transformations and bases) Let T: R3 → R3 be a linear transformation whose values T(1, 0, 0), T(0, 1, 0), T(0, 0, 1) on the standard basis are known. Find T(2, 3, −2). Sol: Since (2, 3, −2) = 2(1, 0, 0) + 3(0, 1, 0) − 2(0, 0, 1), (T is a L.T.) T(2, 3, −2) = 2T(1, 0, 0) + 3T(0, 1, 0) − 2T(0, 0, 1).
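The same computation in code. The three basis images below are assumed values chosen for illustration, not the slide's actual data:

```python
# Sketch: a L.T. is determined by its values on a basis. The rows of T_e
# are assumed images T(e1), T(e2), T(e3), not values from the slides.
import numpy as np

T_e = np.array([[2.0, -1.0,  4.0],    # assumed T(1, 0, 0)
                [1.0,  5.0, -2.0],    # assumed T(0, 1, 0)
                [0.0,  3.0,  1.0]])   # assumed T(0, 0, 1)

v = np.array([2.0, 3.0, -2.0])        # coordinates in the standard basis
print(v @ T_e)                        # T(v) = 2 T(e1) + 3 T(e2) - 2 T(e3)
```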

  39. Ex 5: (A linear transformation defined by a matrix) The function T: R2 → R3 is defined as T(v) = Av for a given 3×2 matrix A. Show that T is a linear transformation. Sol: T(u + v) = A(u + v) = Au + Av = T(u) + T(v) (vector addition) T(cv) = A(cv) = c(Av) = cT(v) (scalar multiplication)

  40. Thm 6.2: (The linear transformation given by a matrix) Let A be an m×n matrix. The function T defined by T(v) = Av is a linear transformation from Rn into Rm. • Note: an m×n matrix maps n-space into m-space; v is written as an n×1 column vector, and T(v) = Av is an m×1 column vector.

  41. Ex 7: (Rotation in the plane) Show that the L.T. T: R2 → R2 given by the matrix A = [cos θ  −sin θ; sin θ  cos θ] has the property that it rotates every vector in R2 counterclockwise about the origin through the angle θ. Sol: Write v = (x, y) = (r cos α, r sin α) (polar coordinates), where r is the length of v and α is the angle from the positive x-axis counterclockwise to the vector v.

  42. Then T(v) = Av = (r cos α cos θ − r sin α sin θ, r cos α sin θ + r sin α cos θ) = (r cos(α + θ), r sin(α + θ)), so r is the length of T(v) and α + θ is the angle from the positive x-axis counterclockwise to the vector T(v). Thus, T(v) is the vector that results from rotating the vector v counterclockwise through the angle θ.
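A short sketch of Ex 7 in code: build A(θ), rotate a vector, and confirm that length is preserved.

```python
# Sketch: the rotation matrix A(theta) acting on R2.
import numpy as np

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

v = np.array([1.0, 0.0])
w = rotation(np.pi / 2) @ v
print(np.round(w, 6))   # [0. 1.]: e1 rotated 90 degrees counterclockwise

# Rotation preserves length.
print(np.isclose(np.linalg.norm(w), np.linalg.norm(v)))   # True
```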

  43. Ex 8: (A projection in R3) The linear transformation T: R3 → R3 given by T(x, y, z) = (x, y, 0) is called a projection in R3: it maps every vector in R3 onto the xy-plane.
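In matrix form this projection is multiplication by diag(1, 1, 0); the sketch below also checks that projecting twice changes nothing (idempotence).

```python
# Sketch: the projection T(x, y, z) = (x, y, 0) as a matrix transformation.
import numpy as np

P = np.diag([1.0, 1.0, 0.0])
v = np.array([2.0, -1.0, 5.0])

print(P @ v)                            # [ 2. -1.  0.]: dropped onto xy-plane
print(np.allclose(P @ (P @ v), P @ v))  # True: projections are idempotent
```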

  44. Ex 9: (A linear transformation from Mm,n into Mn,m) Let T: Mm,n → Mn,m be given by T(A) = A^T. Show that T is a linear transformation. Sol: T(A + B) = (A + B)^T = A^T + B^T = T(A) + T(B) and T(cA) = (cA)^T = cA^T = cT(A). Therefore, T is a linear transformation from Mm,n into Mn,m.

  45. Keywords in Section 6.1: • function: 函數 • domain: 論域 • codomain: 對應論域 • image of v under T: 在T映射下v的像 • range of T: T的值域 • preimage of w: w的反像 • linear transformation: 線性轉換 • linear operator: 線性運算子 • zero transformation: 零轉換 • identity transformation: 相等轉換

  46. Today • Mathematical Models and Least Squares Analysis (Cont.) • Inner Product Space Applications • Introduction to Linear Transformations • The Kernel and Range of a Linear Transformation

  47. 6.2 The Kernel and Range of a Linear Transformation • Kernel of a linear transformation T: Let T: V → W be a linear transformation. Then the set of all vectors v in V that satisfy T(v) = 0 is called the kernel of T and is denoted by ker(T). • Ex 1: (Finding the kernel of a linear transformation) Sol:
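For a matrix transformation T(v) = Av, ker(T) is just the null space of A; a sketch with a made-up matrix, using SciPy's null_space:

```python
# Sketch: the kernel of T(v) = Av is the null space of A (A is made up).
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, -2.0, 0.0],
              [0.0,  0.0, 1.0]])   # hypothetical matrix defining T

K = null_space(A)                  # orthonormal basis for ker(T)
print(K.shape)                     # (3, 1): ker(T) is one-dimensional
print(np.allclose(A @ K, 0))       # True: every kernel vector maps to 0
```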

  48. Ex 2: (The kernel of the zero and identity transformations) (a) T(v) = 0 (the zero transformation T: V → W): ker(T) = V. (b) T(v) = v (the identity transformation T: V → V): ker(T) = {0}. • Ex 3: (Finding the kernel of a linear transformation) Sol:
