
240-650 Principles of Pattern Recognition


Presentation Transcript


1. 240-650 Principles of Pattern Recognition
Montri Karnjanadecha
montri@coe.psu.ac.th
http://fivedots.coe.psu.ac.th/~montri

2. Appendix A: Mathematical Foundations

3. Linear Algebra
• Notation and Preliminaries
• Inner Product
• Outer Product
• Derivatives of Matrices
• Determinant and Trace
• Matrix Inversion
• Eigenvalues and Eigenvectors

4. Notation and Preliminaries
• A d-dimensional column vector x and its transpose x^t can be written as:
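The formulas on this slide were images in the original; a standard rendering that matches the notation used in the rest of the appendix is:
\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_d \end{pmatrix}, \qquad \mathbf{x}^t = (x_1, x_2, \ldots, x_d)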

5. Inner Product
• The inner product of two vectors having the same dimensionality will be denoted as x^t y and yields a scalar:
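The defining formula (an image in the original) is presumably the standard sum of elementwise products:
\mathbf{x}^t \mathbf{y} = \sum_{i=1}^{d} x_i y_i = \mathbf{y}^t \mathbf{x}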

6. Euclidean Norm (Length of a Vector)
• We call a vector normalized if ||x|| = 1
• The angle between two vectors:
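The missing formulas are most likely the usual definitions of the norm and of the angle θ between x and y:
\|\mathbf{x}\| = \sqrt{\mathbf{x}^t \mathbf{x}} = \Big(\sum_{i=1}^{d} x_i^2\Big)^{1/2}, \qquad \cos\theta = \frac{\mathbf{x}^t \mathbf{y}}{\|\mathbf{x}\|\,\|\mathbf{y}\|}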

7. Cauchy-Schwarz Inequality
• If x^t y = 0 then the vectors are orthogonal
• If |x^t y| = ||x|| ||y|| then the vectors are collinear
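The inequality itself (shown only as an image on the slide) is:
|\mathbf{x}^t \mathbf{y}| \le \|\mathbf{x}\|\,\|\mathbf{y}\|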

8. Linear Independence
• A set of vectors {x1, x2, x3, ..., xn} is linearly independent if no vector in the set can be written as a linear combination of the others.
• A set of d linearly independent vectors spans a d-dimensional vector space, i.e., any vector in that space can be written as a linear combination of the spanning vectors.

9. Outer Product
• The outer product of two vectors yields a matrix:
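A standard rendering of the slide's (image) formula: the outer product of a d-dimensional x and an n-dimensional y is the d × n matrix
M = \mathbf{x}\mathbf{y}^t, \qquad M_{ij} = x_i y_j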

10. Determinant and Trace
• The determinant of a matrix is a scalar
• It reveals properties of the matrix
• If the columns are considered as vectors, and if these vectors are not linearly independent, then the determinant vanishes
• The trace is the sum of the matrix's diagonal elements
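No formulas appear in the transcript for this slide; the standard definitions matching the bullets are, for a d × d matrix A:
|A| = a_{11}a_{22} - a_{12}a_{21} \quad (d = 2), \qquad \mathrm{tr}(A) = \sum_{i=1}^{d} a_{ii}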

11. Eigenvectors and Eigenvalues
• A very important class of linear equations is of the form:
• The solution vector x = e_i and the corresponding scalar λ_i are called the eigenvector and associated eigenvalue, respectively
• Eigenvalues can be obtained by solving the characteristic equation:
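The two equations referred to above (images in the original) are, in standard form:
A\mathbf{x} = \lambda\mathbf{x}, \qquad \det(A - \lambda I) = 0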

12. Example
• Find the eigenvalues and associated eigenvectors of the given matrix
• Characteristic equation:

13. Example (cont'd)
Solution:
• The eigenvalues are:
• Each eigenvector can be found by substituting each eigenvalue into the equation and then solving for x1 in terms of x2 (or vice versa)

14. Example (cont'd)
• The eigenvectors associated with both eigenvalues are:
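The slide's matrix and its numeric eigenvalues and eigenvectors were images and cannot be recovered from this transcript. As a sketch only, the same computation can be checked with NumPy; the matrix below is purely illustrative, not the one from the slide:

import numpy as np

# Hypothetical 2x2 matrix -- NOT the matrix from the slide, just an illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues are the roots of the characteristic equation det(A - lambda*I) = 0.
eigvals, eigvecs = np.linalg.eig(A)
print("eigenvalues:", eigvals)            # for this A: 3 and 1
print("eigenvectors (columns):")
print(eigvecs)

# Sanity checks (see slide 15): trace = sum of eigenvalues, determinant = product.
assert np.isclose(np.trace(A), eigvals.sum())
assert np.isclose(np.linalg.det(A), eigvals.prod())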

15. Trace and Determinant
• Trace = sum of eigenvalues
• Determinant = product of eigenvalues
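In symbols, for a d × d matrix A with eigenvalues λ1, ..., λd:
\mathrm{tr}(A) = \sum_{i=1}^{d} \lambda_i, \qquad \det(A) = \prod_{i=1}^{d} \lambda_i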

16. Probability Theory
• Let x be a discrete random variable that can assume any of a finite number m of different values in the set X = {v1, v2, ..., vm}. We denote by pi the probability that x assumes the value vi: pi = Pr[x = vi], i = 1, ..., m
• pi must satisfy two conditions:
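The two conditions (images in the original) are presumably the usual ones:
p_i \ge 0 \quad\text{and}\quad \sum_{i=1}^{m} p_i = 1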

17. Probability Mass Function
• Sometimes it is more convenient to express the set of probabilities {p1, p2, ..., pm} in terms of the probability mass function P(x), which, for discrete x, must satisfy the following conditions:
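The conditions, in their standard form:
P(x) \ge 0 \quad\text{and}\quad \sum_{x \in X} P(x) = 1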

18. Expected Value
• The expected value, mean, or average of the random variable x is defined by:
• If f(x) is any function of x, the expected value of f is defined by:
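The defining sums (images in the original) are presumably:
\mathcal{E}[x] = \mu = \sum_{x \in X} x\,P(x), \qquad \mathcal{E}[f(x)] = \sum_{x \in X} f(x)\,P(x)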

19. Second Moment and Variance
• Second moment:
• Variance:
• where σ is the standard deviation of x
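The standard definitions matching the bullets are:
\mathcal{E}[x^2] = \sum_{x \in X} x^2\,P(x), \qquad \mathrm{Var}[x] = \sigma^2 = \mathcal{E}[(x - \mu)^2] = \mathcal{E}[x^2] - \mu^2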

20. Variance and Standard Deviation
• The variance can be viewed as the moment of inertia of the probability mass function. The variance is never negative.
• The standard deviation tells us how far values of x are likely to depart from the mean.

21. Pairs of Discrete Random Variables
• Joint probability
• Joint probability mass function
• Marginal distributions
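The formulas on this slide were images; for discrete x in X and y in Y, the standard definitions are presumably:
P(x, y) = \Pr[x = v_i,\; y = w_j], \qquad P_x(x) = \sum_{y \in Y} P(x, y), \qquad P_y(y) = \sum_{x \in X} P(x, y)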

22. Statistical Independence
• Variables x and y are said to be statistically independent if and only if:
• Knowing the value of x gives no knowledge about the possible values of y
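The independence condition (an image in the original) is:
P(x, y) = P_x(x)\,P_y(y)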

23. Expected Values of Functions of Two Variables
• The expected value of a function f(x, y) of two random variables x and y is defined by:
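In its usual form:
\mathcal{E}[f(x, y)] = \sum_{x \in X} \sum_{y \in Y} f(x, y)\,P(x, y)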

24. Means and Variances
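This slide's content was entirely graphical; the means and variances of x and y are presumably defined in the usual way:
\mu_x = \mathcal{E}[x], \quad \mu_y = \mathcal{E}[y], \quad \sigma_x^2 = \mathcal{E}[(x - \mu_x)^2], \quad \sigma_y^2 = \mathcal{E}[(y - \mu_y)^2]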

25. Covariance
• Using vector notation, the mean and covariance become:
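The missing formulas are presumably the scalar covariance and its vector form:
\sigma_{xy} = \mathcal{E}[(x - \mu_x)(y - \mu_y)], \qquad \boldsymbol{\mu} = \mathcal{E}[\mathbf{x}], \qquad \Sigma = \mathcal{E}[(\mathbf{x} - \boldsymbol{\mu})(\mathbf{x} - \boldsymbol{\mu})^t]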

26. Uncorrelated
• The covariance is one measure of the degree of statistical dependence between x and y.
• If x and y are statistically independent, then the relations shown below hold, and the variables x and y are said to be uncorrelated.
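The relations referenced above (images in the original) are presumably:
\sigma_{xy} = 0 \quad\text{and}\quad \mathcal{E}[xy] = \mathcal{E}[x]\,\mathcal{E}[y]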

27. Conditional Probability
• The conditional probability of x given y
• In terms of mass functions:
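In standard notation:
\Pr[x = v_i \mid y = w_j] = \frac{\Pr[x = v_i,\; y = w_j]}{\Pr[y = w_j]}, \qquad P(x \mid y) = \frac{P(x, y)}{P_y(y)}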

28. The Law of Total Probability
• If an event A can occur in m different ways, A1, A2, ..., Am, and if these m subevents are mutually exclusive, then the probability of A occurring is the sum of the probabilities of the subevents Ai.
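In symbols:
\Pr[A] = \sum_{i=1}^{m} \Pr[A_i]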

29. Bayes Rule
• Likelihood: P(y|x)
• Prior probability: P(x)
• Posterior distribution: P(x|y)
• x = cause, y = effect
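The rule itself (an image in the original) is:
P(x \mid y) = \frac{P(y \mid x)\,P(x)}{P(y)}, \qquad P(y) = \sum_{x \in X} P(y \mid x)\,P(x)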

30. Normal Distributions
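The slide's content was graphical only. The densities it most likely shows are the univariate normal and its d-dimensional multivariate counterpart:
p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\Big(-\frac{(x - \mu)^2}{2\sigma^2}\Big), \qquad p(\mathbf{x}) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}} \exp\!\Big(-\tfrac{1}{2}(\mathbf{x} - \boldsymbol{\mu})^t \Sigma^{-1} (\mathbf{x} - \boldsymbol{\mu})\Big)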
