
Chapter 3 Linear Algebra Review and Elementary Differential Equations






Presentation Transcript


1. Chapter 3 Linear Algebra Review and Elementary Differential Equations • To review some important concepts of linear algebra and differential equations that will be used subsequently.

2. Outline • Vector spaces and subspaces • Independence, basis and dimension • Norm and inner product • Gram-Schmidt procedure and QR-factorization • Linear algebraic equations • Change of basis and similarity transformation • Diagonal form and Jordan form • Function of square matrix • Lyapunov equation • A useful result

3. Quadratic form and positive definiteness • Singular value decomposition • Norm of matrices • Elementary differential equations

4. Linear Algebra is a course dealing with the concept of a vector space and its properties, and with linear transformations defined on vector spaces. Question: • What are vector spaces and linear transformations? • How many important results and applications do you know regarding vector spaces and linear transformations?

5. Vector Spaces A real vector space is a set V with two binary operations, vector addition + : V × V → V and scalar multiplication · : R × V → V, which satisfy • u + v ∈ V and αv ∈ V for all u, v ∈ V and α ∈ R (closed) • u + v = v + u (commutative law) • (u + v) + w = u + (v + w) (associative law) • there is a vector 0 ∈ V such that v + 0 = v for all v (zero element) • for each v ∈ V there is a vector −v such that v + (−v) = 0 (inverse) • α(u + v) = αu + αv and (α + β)v = αv + βv (distributive law)

6. Question: Is a line in R³ a vector space? Is a plane in R³ a vector space?

7. Examples • Rⁿ with the standard vector addition and scalar multiplication is a vector space. • The set of all m × n real matrices with the usual matrix addition and scalar multiplication is a vector space, denoted by R^{m×n}. The zero vector is the zero matrix. • Let F be a set of real-valued functions on a set S. Define + and scalar multiplication pointwise by (f + g)(t) = f(t) + g(t) and (αf)(t) = αf(t) for all t ∈ S and α ∈ R. Then F is a vector space.

8. Examples • C([0, 1]) is a vector space, where C([0, 1]) is the set of continuous functions on [0, 1]. • A vector in C([0, 1]) is a continuous (possibly vector-valued) function on [0, 1]. • The zero vector is the zero function. • The set of solutions of the linear system ẋ = Ax is a vector space. A vector now is a trajectory of the linear system. • The set of sequences x : Z₊ → R which converge to zero is a vector space, where Z₊ is the set of nonnegative integers. A vector in this space is a sequence which converges to zero.

9. Subspaces • A subspace of a vector space V is a subset of V which is itself a vector space under the same operations. • A subset W ⊆ V is a subspace of V if and only if αu + βv ∈ W for all u, v ∈ W and all scalars α, β. Examples: • Subspaces of R³ are {0}, lines and planes that pass through the origin, and R³ itself. • If W₁ and W₂ are subspaces of V, then so are W₁ ∩ W₂ and W₁ + W₂.

10. Linear Independence Question: How can a vector space be characterized with as few vectors as possible? • A set of vectors {v₁, ..., vₖ} is called linearly independent if α₁v₁ + ... + αₖvₖ = 0 implies α₁ = ... = αₖ = 0. • {v₁, ..., vₖ} being linearly independent implies that no vector can be expressed as a linear combination of the other vectors (i.e., no redundancy). • If α₁v₁ + ... + αₖvₖ = β₁v₁ + ... + βₖvₖ, then αᵢ = βᵢ for all i. That is, the coefficients are uniquely determined.

11. A set S is linearly independent if every finite subset of S is linearly independent. • The set {v₁, ..., vₖ} is linearly dependent if and only if there are scalars α₁, ..., αₖ, not all zero, such that α₁v₁ + ... + αₖvₖ = 0. • Hence if {v₁, ..., vₖ} is linearly dependent, then at least one of the vectors can be expressed as a linear combination of the other vectors. • The set of all linear combinations of v₁, ..., vₖ is called the span of {v₁, ..., vₖ}. That is, span{v₁, ..., vₖ} = {α₁v₁ + ... + αₖvₖ : αᵢ ∈ R}. • span{v₁, ..., vₖ} is a subspace of V.
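As a computational companion (an illustrative NumPy sketch; the vectors are made up), linear independence of finitely many vectors in Rⁿ can be checked by stacking them as columns and comparing the matrix rank with the number of vectors:

```python
import numpy as np

# Candidate vectors; the third is the sum of the first two, so the set is dependent.
vectors = [np.array([1.0, 0.0, 1.0]),
           np.array([0.0, 1.0, 1.0]),
           np.array([1.0, 1.0, 2.0])]

A = np.column_stack(vectors)
rank = np.linalg.matrix_rank(A)

# Independent iff the rank equals the number of vectors.
print("rank =", rank, "independent:", rank == len(vectors))  # rank = 2, independent: False
```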

12. Basis • The set of vectors {v₁, ..., vₙ} is called a basis of V if and only if (i) {v₁, ..., vₙ} is linearly independent and (ii) {v₁, ..., vₙ} spans V, i.e., V = span{v₁, ..., vₙ}. • If {v₁, ..., vₙ} is a basis of V, then every vector v in V can be uniquely expressed as v = α₁v₁ + ... + αₙvₙ. • With the basis {v₁, ..., vₙ}, there is a 1-1 correspondence between V and Rⁿ, namely v ↔ (α₁, ..., αₙ).

13. Dimension • If V contains an infinite set of linearly independent vectors, we say V is infinite dimensional. Otherwise, V is finite dimensional. • The number of vectors in any basis of a (finite dimensional) vector space is the same. • The number of vectors in a basis of V is called the dimension of V, denoted by dim V.

14. dim Rⁿ = n. • dim R^{m×n} = mn. • dim C([0, 1]) = ∞: consider the vectors 1, t, t², ..., i.e., the functions fₖ(t) = tᵏ. Clearly these vectors are linearly independent, so no finite set spans C([0, 1]).

15. Norms Question: A vector may denote a velocity, an acceleration, or a time function from a numerical simulation. How do we characterize its magnitude and give it an interpretation? • Let V be a vector space. A function ||·|| : V → R is called a norm on V if • ||v|| ≥ 0 for all v, and ||v|| = 0 if and only if v = 0 • ||αv|| = |α| ||v|| for all scalars α and all v • ||u + v|| ≤ ||u|| + ||v|| (triangle inequality) • A norm is a measure of the length of vectors.

16. Examples • Let x = (x₁, ..., xₙ) ∈ Rⁿ. • 1-norm: ||x||₁ = |x₁| + ... + |xₙ| • 2-norm: ||x||₂ = (x₁² + ... + xₙ²)^{1/2} • ∞-norm: ||x||_∞ = maxᵢ |xᵢ| • Triangle inequality: ||x + y|| ≤ ||x|| + ||y|| for each of these norms.
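A small NumPy sketch (vectors chosen arbitrarily) evaluating the three norms and checking the triangle inequality for the 2-norm:

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])
y = np.array([1.0, 2.0, -2.0])

one_norm = np.linalg.norm(x, 1)       # |3| + |-4| + |0| = 7
two_norm = np.linalg.norm(x, 2)       # sqrt(9 + 16) = 5
inf_norm = np.linalg.norm(x, np.inf)  # max(|3|, |4|, |0|) = 4
print(one_norm, two_norm, inf_norm)

# Triangle inequality for the 2-norm.
assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)
```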

17. The unit ball {x : ||x|| ≤ 1} under different norms: in R² it is a diamond for the 1-norm, a disk for the 2-norm, and a square for the ∞-norm.

18. In V = C([0, 1]): ||f||_∞ = sup_{0≤t≤1} |f(t)|. • In C([0, 1], Rⁿ): ||f||_∞ = sup_{0≤t≤1} ||f(t)||. Here, the norm ||f(t)|| can be any norm in Rⁿ.

19. Inner Products Question: How can we characterize and measure the angle between two vectors? What advantages does an orthonormal basis have? • Let V be a real vector space. An inner product on V is a function ⟨·,·⟩ : V × V → R which satisfies • ⟨u, v⟩ = ⟨v, u⟩ (symmetry) • ⟨αu + βv, w⟩ = α⟨u, w⟩ + β⟨v, w⟩ (linearity) • ⟨v, v⟩ ≥ 0, and ⟨v, v⟩ = 0 if and only if v = 0 (positivity)

20. A vector space with an inner product defined on it is called an inner product space. Example: • In Rⁿ, ⟨x, y⟩ = xᵀy = x₁y₁ + ... + xₙyₙ (standard inner product)

21. Norm Induced by Inner Product • Given an inner product, define ||u|| = ⟨u, u⟩^{1/2}. • ||u|| is indeed a norm, called the induced norm. • For Rⁿ with the standard inner product, the induced norm is the 2-norm ||u||₂. • For C([0, 1]) with ⟨f, g⟩ = ∫₀¹ f(t)g(t) dt, the induced norm is ||f|| = (∫₀¹ f(t)² dt)^{1/2}.

22. Cauchy-Schwarz inequality: |⟨u, v⟩| ≤ ||u|| ||v||. It follows that • −1 ≤ ⟨u, v⟩ / (||u|| ||v||) ≤ 1 for all nonzero u, v • Geometrical interpretation of the inner product: • Define the angle between two vectors to be θ = cos⁻¹( ⟨u, v⟩ / (||u|| ||v||) ). • For ⟨u, v⟩ = 0, the vectors are orthogonal (θ = 90°).
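A short NumPy illustration (with made-up vectors) of the Cauchy-Schwarz inequality and the induced angle:

```python
import numpy as np

u = np.array([1.0, 0.0, 1.0])
v = np.array([1.0, 1.0, 0.0])

inner = np.dot(u, v)
# Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||
assert abs(inner) <= np.linalg.norm(u) * np.linalg.norm(v)

# Angle: cos(theta) = <u, v> / (||u|| ||v||)
cos_theta = inner / (np.linalg.norm(u) * np.linalg.norm(v))
print(np.degrees(np.arccos(cos_theta)))  # 60 degrees for these two vectors
```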

23. Orthonormal vectors in Rⁿ • A set of vectors {u₁, ..., uₖ} is said to be orthonormal if ⟨uᵢ, uⱼ⟩ = 0 for i ≠ j and ||uᵢ|| = 1 for all i. • Orthonormal vectors are linearly independent. • Define the matrix U = [u₁ ... uₖ]; then UᵀU = Iₖ. • Let z = Ux and w = Uy; then ⟨z, w⟩ = ⟨x, y⟩. This means that the transformation x ↦ Ux preserves inner product, norm and angle. • If k = n, then the U given above is called an orthogonal matrix, and U⁻¹ = Uᵀ. • If the basis {u₁, ..., uₙ} is orthonormal and x = α₁u₁ + ... + αₙuₙ, then αᵢ = ⟨uᵢ, x⟩. So computing the representation of x is easy if the basis is orthonormal.
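The property UᵀU = I and the easy computation of coordinates in an orthonormal basis can be checked numerically; this sketch uses two orthonormal columns in R³ chosen purely for illustration:

```python
import numpy as np

# Columns of U are orthonormal vectors in R^3 (k = 2 < n = 3).
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Orthonormality of the columns means U^T U = I_k.
print(np.allclose(U.T @ U, np.eye(2)))  # True

# If x = U c lies in the span of the columns, its representation is simply c = U^T x.
c = np.array([2.0, -1.0])
x = U @ c
print(U.T @ x)  # recovers [ 2. -1.]
```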

24. Gram-Schmidt Procedure Question: Given a set of independent vectors a₁, ..., aₖ, how do we find orthonormal vectors q₁, ..., qₖ such that span{q₁, ..., qᵣ} = span{a₁, ..., aᵣ} for each r? • One method: the qᵢ's are given recursively as • q₁ = a₁ / ||a₁|| (normalize) • q̃₂ = a₂ − (q₁ᵀa₂)q₁ (remove the q₁ component from a₂) • q₂ = q̃₂ / ||q̃₂|| (normalize) • q̃₃ = a₃ − (q₁ᵀa₃)q₁ − (q₂ᵀa₃)q₂ (remove the q₁, q₂ components from a₃) • q₃ = q̃₃ / ||q̃₃|| (normalize) • etc.
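The recursion translates directly into code; the following is a minimal NumPy sketch of classical Gram-Schmidt (the input vectors are assumed linearly independent and are made up for illustration):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    qs = []
    for a in vectors:
        # Remove the components of a along the previously found q's ...
        v = a - sum(np.dot(q, a) * q for q in qs)
        # ... and normalize.
        qs.append(v / np.linalg.norm(v))
    return qs

a1 = np.array([1.0, 1.0, 0.0])
a2 = np.array([1.0, 0.0, 1.0])
q1, q2 = gram_schmidt([a1, a2])
print(np.dot(q1, q2))                          # ~0: orthogonal
print(np.linalg.norm(q1), np.linalg.norm(q2))  # 1.0 1.0
```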

25. Thus aᵣ ∈ span{q₁, ..., qᵣ} for each r. • In general, for r = 1, ..., k we have q̃ᵣ = aᵣ − Σ_{i&lt;r} (qᵢᵀaᵣ)qᵢ ≠ 0 (if not, aᵣ could be written as a linear combination of a₁, ..., a_{r−1}, contradicting independence). • Hence the rth column of A = [a₁ ... aₖ] can be expressed as a linear combination of the orthonormal vectors q₁, ..., qᵣ. • Clearly, span{a₁, ..., aᵣ} = span{q₁, ..., qᵣ}.

26. The relation can be written in matrix form as A = QR, where Q = [q₁ ... qₖ] • and R is upper triangular and invertible. • This is called the QR factorization of A. • Columns of Q form an orthonormal basis of R(A).
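In practice the factorization is obtained with a library call; a NumPy sketch (matrix made up) confirming A = QR with orthonormal columns of Q and upper triangular R:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

Q, R = np.linalg.qr(A)  # reduced ("economy-size") QR by default
print(np.allclose(Q @ R, A))            # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns of Q are orthonormal
print(R)                                # 2 x 2 upper triangular
```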

27. Householder Transformation • Let v ≠ 0 and H = I − 2vvᵀ / (vᵀv). Then • H is said to be a Householder transformation • H is symmetric • H is orthogonal • Hx is the reflection of x with respect to the hyperplane orthogonal to v • What are the eigenvalues and associated eigenvectors of H? • What are det(H) and trace(H)? • How can Householder transformations be employed to find Q such that QA = R, an upper triangular matrix?
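Several of the questions above (symmetry, orthogonality, the determinant, the reflection property) can be verified numerically; a sketch assuming the definition H = I − 2vvᵀ/(vᵀv) with a made-up v:

```python
import numpy as np

def householder(v):
    """H = I - 2 v v^T / (v^T v): reflection about the hyperplane orthogonal to v."""
    v = v.reshape(-1, 1)
    return np.eye(v.shape[0]) - 2.0 * (v @ v.T) / (v.T @ v)

v = np.array([1.0, 2.0, 2.0])
H = householder(v)

print(np.allclose(H, H.T))              # symmetric
print(np.allclose(H @ H.T, np.eye(3)))  # orthogonal
print(np.linalg.det(H))                 # -1: eigenvalue -1 (eigenvector v), +1 with multiplicity n-1
print(H @ v)                            # v is reflected to -v
```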

28. Question: Given a set of k vectors v₁, ..., vₖ spanning a subspace S, how can computer resources be used to select a basis of S from these vectors? QR decomposition is a good strategy.
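One concrete way to carry this out (an illustrative sketch; it assumes SciPy is available) is QR with column pivoting, which orders the columns so that the leading ones are the "most independent" and can be kept as a basis:

```python
import numpy as np
from scipy.linalg import qr

# k = 4 vectors in R^3, only 2 of them independent (stored as columns of A).
A = np.column_stack([[1.0, 0.0, 0.0],
                     [2.0, 0.0, 0.0],   # multiple of the first
                     [0.0, 1.0, 0.0],
                     [1.0, 1.0, 0.0]])  # sum of the first and third

Q, R, piv = qr(A, pivoting=True)
r = int(np.sum(np.abs(np.diag(R)) > 1e-10))  # numerical rank from the diagonal of R
print("rank =", r, "indices of basis vectors:", piv[:r])
```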

29. Linear Algebraic Equation Ax = b Question: Let A ∈ R^{m×n} and b ∈ R^m. • (Existence) When does Ax = b have a solution? • (Uniqueness) How many solutions does Ax = b have if it is solvable? When is the solution unique? • (Solution set characterization) How can all solutions of Ax = b be expressed in terms of A and b if it is solvable?

30. The nullspace of A ∈ R^{m×n} is defined as N(A) = {x ∈ Rⁿ : Ax = 0}. • N(A) is the set of vectors orthogonal to all rows of A. • The range of A is defined as R(A) = {Ax : x ∈ Rⁿ}. • R(A) is the set of vectors y that can be reached by the linear mapping y = Ax. • R(A) is the span of the columns of A. • R(A) is the set of vectors y such that Ax = y has a solution.

31. The rank of A ∈ R^{m×n} is defined as rank(A) = dim R(A). • rank(A) is the maximum number of independent columns of A. • rank(A) is the maximum number of independent rows of A. • We say that A has full rank if rank(A) = min(m, n).

32. Ax = b has a solution iff rank(A) = rank([A | b]) iff b ∈ R(A). • Ax = 0 has a unique solution (namely x = 0) iff N(A) = {0} iff the columns of A are linearly independent iff det(AᵀA) ≠ 0 iff A has a left inverse, i.e., there is a matrix B such that BA = I.

33. Let A ∈ R^{m×n} and let x_p be a solution of Ax = b. Suppose that {n₁, ..., nₖ} is a basis for N(A). Then • The solution set of Ax = b is {x_p + α₁n₁ + ... + αₖnₖ : αᵢ ∈ R}. • There are infinitely many solutions if N(A) is not the trivial vector space {0}. • The solution set is a linear variety (a translation of a vector subspace).
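These facts can be checked numerically; a sketch with a made-up matrix and right-hand side, assuming SciPy is available for the nullspace basis:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
b = np.array([6.0, 12.0])

# Existence: Ax = b is solvable iff rank(A) = rank([A | b]).
solvable = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.column_stack([A, b]))
print("solvable:", solvable)  # True here, since b lies in R(A)

# A particular solution (least squares is exact when b lies in R(A)).
xp, *_ = np.linalg.lstsq(A, b, rcond=None)

# Orthonormal basis of N(A); every solution is xp + N @ alpha for arbitrary alpha.
N = null_space(A)
print(np.allclose(A @ xp, b), N.shape)  # True (3, 2), since rank(A) = 1
```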

34. The following statements are equivalent: (i) The linear equation Ax = b has a solution for any b ∈ R^m. (ii) The mapping y = Ax is onto. (iii) rank(A) = m. (iv) The columns of A span R^m. (v) A has a right inverse, i.e., there is a matrix C such that AC = I. (vi) The rows of A are linearly independent. (vii) det(AAᵀ) ≠ 0.

35. A square matrix A is said to be invertible or nonsingular if det A ≠ 0. • The following statements are equivalent: • A is nonsingular • The columns of A form a basis of Rⁿ • The rows of A form a basis of Rⁿ • Ax = b has a unique solution for every b ∈ Rⁿ • A has an inverse A⁻¹ such that AA⁻¹ = A⁻¹A = I • det A ≠ 0 • From the viewpoint of eigenvalues, singular values and pivots: all eigenvalues of A are nonzero, all singular values are nonzero, and all pivots are nonzero …

36. Change of Basis • Let E = {e₁, ..., eₙ} be a basis of Rⁿ. Then every vector v of Rⁿ can be written in terms of E as v = x₁e₁ + ... + xₙeₙ. Here, x = (x₁, ..., xₙ) is called the coordinate of v with respect to the basis E. Question: Let E and F be two bases of Rⁿ. What is the relation between the two coordinates of a vector with respect to E and F? • Let x and y be the coordinates of v with respect to the bases E and F respectively, or in matrix notation: v = [e₁ ... eₙ]x = [f₁ ... fₙ]y, so that y = [f₁ ... fₙ]⁻¹[e₁ ... eₙ]x.
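A small NumPy sketch of the coordinate change (the bases here are made up; basis vectors are stored as columns):

```python
import numpy as np

# v = E @ x = F @ y, with basis vectors as columns.
E = np.eye(2)                 # standard basis of R^2
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])    # another basis of R^2

x = np.array([3.0, 2.0])      # coordinates of v with respect to E
v = E @ x

# From F @ y = v we get y = F^{-1} E x.
y = np.linalg.solve(F, v)
print(y)                      # [1. 2.] since v = 1*(1, 0) + 2*(1, 1)
```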

37. Matrix Representation of a Linear Mapping • Let V be a real vector space. A mapping L from V to V is said to be linear if L(αu + βv) = αL(u) + βL(v) for all u, v ∈ V and all scalars α, β. • Let E = {e₁, ..., eₙ} be a basis of the vector space V and let L be a linear mapping from V to V. Suppose that L(eⱼ) = a₁ⱼe₁ + ... + aₙⱼeₙ for j = 1, ..., n. Here, A = [aᵢⱼ] is called the matrix representation of L with respect to the basis E.

38. Similarity Transformation Question: What is the relation between the matrix representations of a linear mapping with respect to different bases (change of basis)? Let E and F be two bases of Rⁿ, and let L be a linear mapping from Rⁿ to Rⁿ with representation A with respect to E and B with respect to F. Then for any v, the E- and F-coordinates are related by the change-of-basis matrix T = [e₁ ... eₙ]⁻¹[f₁ ... fₙ]. Similarly, the two representations are related by B = T⁻¹AT, a similarity transformation.

39. Example: Let E be the standard basis of Rⁿ and let F be another basis whose vectors are collected as the columns of [f₁ ... fₙ]. By direct calculation, [f₁ ... fₙ] is nonsingular, and the representation with respect to F follows from the similarity transformation above; a numerical sketch follows.
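A NumPy sketch with made-up matrices: the columns of F hold the new basis vectors, the representation in that basis is B = F⁻¹AF, and the similar matrices A and B share their eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0],     # representation of L in the standard basis
              [0.0, 3.0]])
F = np.array([[1.0, 1.0],     # columns are the new basis vectors
              [1.0, 2.0]])

B = np.linalg.solve(F, A @ F)  # B = F^{-1} A F
print(B)
print(np.allclose(np.sort(np.linalg.eigvals(A)),
                  np.sort(np.linalg.eigvals(B))))  # True: similar matrices share eigenvalues
```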

40. Diagonalization Question: Can a matrix (or a dynamics) be transformed into a form as simple as possible through a similarity transformation? Diagonal matrices are perhaps the simplest ones. • Given A ∈ R^{n×n}, if there exist a scalar λ and a nonzero vector x such that Ax = λx, then λ is called an eigenvalue of A and x is an eigenvector associated with λ; (λ, x) is called an eigenpair of A.

41. Question: How do we find eigenvalues and eigenvectors? How many eigenvalues does a matrix have? Let Δ(λ) = det(λI − A). Then λ is an eigenvalue of A iff Δ(λ) = 0, i.e., iff (λI − A)x = 0 has a nonzero solution x.

42. Δ(λ) = det(λI − A) is a polynomial of degree n, called the characteristic polynomial of A. Δ(λ) has exactly n roots counting multiplicities (by the fundamental theorem of algebra), so A has exactly n eigenvalues counting multiplicities. • Every nonzero vector satisfying (λI − A)x = 0 is an eigenvector associated with the eigenvalue λ. • The vector space N(λI − A) is called the eigenspace of A associated with the eigenvalue λ.

43. Let (λ₁, x₁), ..., (λₙ, xₙ) be eigenpairs of A with Q = [x₁ ... xₙ] nonsingular. Then Q⁻¹AQ = diag(λ₁, ..., λₙ); in this case, A is called diagonalizable. Example: see the numerical sketch below.
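An illustrative sketch (matrix made up) using np.linalg.eig: the columns of the returned matrix are eigenvectors, and when they are independent the similarity transformation diagonalizes A:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, Q = np.linalg.eig(A)  # eigenvalues and a matrix of eigenvectors (as columns)
Lam = np.diag(eigvals)

# If Q is nonsingular, A is diagonalizable: Q^{-1} A Q = diag(eigenvalues).
print(np.allclose(np.linalg.solve(Q, A @ Q), Lam))  # True
print(eigvals)                                      # eigenvalues 5 and 2 (order may vary)
```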

44. Theorem: Let (λ₁, x₁), ..., (λₙ, xₙ) be eigenpairs of A ∈ R^{n×n}. Then A is diagonalizable iff the eigenvectors x₁, ..., xₙ can be chosen to be linearly independent. That is, there exists a complete set of eigenvectors. In this case, Q⁻¹AQ = diag(λ₁, ..., λₙ) with Q = [x₁ ... xₙ]. Theorem: Let (λ₁, x₁), ..., (λₙ, xₙ) be eigenpairs of A. Suppose that λ₁, ..., λₙ are distinct. Then x₁, ..., xₙ are linearly independent. Proof: Suppose that α₁x₁ + ... + αₙxₙ = 0. Multiplying by (A − λ₂I)(A − λ₃I)···(A − λₙI) annihilates every term except the first and yields α₁(λ₁ − λ₂)···(λ₁ − λₙ)x₁ = 0, so α₁ = 0; repeating the argument gives α₂ = ··· = αₙ = 0.

45. Jordan Form Question: Are all matrices diagonalizable? No; a counterexample is A = [λ 1; 0 λ]. In this case we cannot find two linearly independent eigenvectors of A associated with λ. Question: If A is not diagonalizable, how simple can A be made by a similarity transformation?

46. A Jordan block associated with λᵢ is an upper triangular matrix with λᵢ on the diagonal, 1's on the superdiagonal, and 0's elsewhere. • A matrix is said to be of Jordan (canonical) form if it is block diagonal, J = diag(J₁, ..., Jₛ), where J₁, ..., Jₛ are Jordan blocks. • The eigenvalues λ₁, ..., λₛ of the blocks might not be distinct.

47. Theorem: Every matrix can be transformed into a Jordan form through a similarity transformation. (For a proof, see p. 216 of R.A. Horn and C.R. Johnson, Matrix Analysis, Cambridge University Press, 1985.)

48. Question: How do we find the similarity transformation and the associated Jordan form? • (An observation) Let T = [t₁ t₂ ... tₙ] satisfy T⁻¹AT = J, i.e., AT = TJ; reading this equation column by column shows how the columns of T must be chosen.

49. If AT = TJ for a single k × k Jordan block J with eigenvalue λ, the columns of T satisfy At₁ = λt₁ and Atᵢ = λtᵢ + tᵢ₋₁ for i = 2, ..., k. • In general, (A − λI)t₁ = 0 and (A − λI)tᵢ = tᵢ₋₁. Here, t₁ is an eigenvector of A associated with λ, and t₂, ..., tₖ are called generalized eigenvectors of A associated with λ.

50. Some observations: • The number of Jordan blocks, s, equals the number of linearly independent eigenvectors of A. • A matrix is diagonalizable iff s = n, i.e., iff it has n linearly independent eigenvectors. • The number of Jordan blocks corresponding to a given eigenvalue equals the number of linearly independent eigenvectors associated with that eigenvalue. • Index of λ: the maximum dimension of the Jordan blocks of J associated with λ.
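For hand-sized examples the Jordan form and the transformation matrix can be computed symbolically; a sketch assuming SymPy is available, using the 2 × 2 non-diagonalizable example from above with λ = 2:

```python
import sympy as sp

# A single eigenvalue 2 with only one independent eigenvector, so A is not diagonalizable.
A = sp.Matrix([[2, 1],
               [0, 2]])

# P collects the eigenvector and generalized eigenvector; J is the Jordan canonical form.
P, J = A.jordan_form()
sp.pprint(J)                 # one 2x2 Jordan block for eigenvalue 2
print(P.inv() * A * P == J)  # True
```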
