
Linear Algebra and Matrices – Methods for Dummies, FIL, November 2011


Presentation Transcript


  1. Linear Algebra and Matrices Methods for Dummies FIL November 2011 Narges Bazargani and Sarah Jensen

  2. ONLINE SOURCES • Web guides • http://mathworld.wolfram.com/LinearAlgebra.html • http://www.maths.surrey.ac.uk/explore/emmaspages/option1.html • http://www.inf.ed.ac.uk/teaching/courses/fmcs1/ • Online introduction: • http://www.khanacademy.org/video/introduction-to-matrices?playlist=Linear+Algebra

  3. What Is MATLAB? - And why learn about matrices? • MATLAB = MATrix LABoratory • Typical uses include: • Math and computation • Algorithm development • Modelling, simulation, and prototyping • Data analysis, exploration, and visualization • Scientific and engineering graphics • Application development, including Graphical User Interface building

  4. Everything in MATLAB is a matrix! Zero-dimensional: a scalar (a single number) is really a 1 x 1 matrix in MATLAB. One-dimensional: a vector is a 1 x n matrix with a single row, e.g. [1 2 3]. Two-dimensional: a matrix is m x n, with m rows and n columns, e.g. the 2 x 3 matrix [2 7 4; 3 8 9]. Even a picture is a matrix!

  5. Building matrices in MATLAB with [ ]: A = [2 7 4] gives the row vector (2 7 4); A = [2; 7; 4] gives the column vector with entries 2, 7, 4; A = [2 7 4; 3 8 9] gives the 2 x 3 matrix with rows (2 7 4) and (3 8 9). A semicolon (;) starts a new row, and a space (or comma) separates the columns within a row.

  6. Matrix formation in MATLAB: X = [1 2 3; 4 5 6; 7 8 9] builds the 3 x 3 matrix with rows (1 2 3), (4 5 6) and (7 8 9). Submatrices in MATLAB: subscripting – each element of a matrix can be addressed with a pair of numbers, row first, column second. X(2,3) = 6; X(3,:) = (7 8 9), the whole third row; X([2 3], 2) = (5; 8), the entries of column 2 in rows 2 and 3.
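A minimal MATLAB sketch of the building and subscripting described above; the expected results are noted in the comments and follow directly from the definition of X.

X = [1 2 3; 4 5 6; 7 8 9];   % 3 x 3 matrix; semicolons start new rows
X(2,3)        % element in row 2, column 3 -> 6
X(3,:)        % the whole third row        -> [7 8 9]
X([2 3], 2)   % rows 2 and 3 of column 2   -> [5; 8]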

  7. Matrix addition and subtraction. NB: only matrices of the same size can be added or subtracted; both operations work element by element.
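A short sketch of element-by-element addition and subtraction; A and B here are arbitrary 2 x 3 examples, not taken from the slide.

A = [2 7 4; 3 8 9];
B = [1 1 1; 2 2 2];
A + B   % [3 8 5; 5 10 11] - allowed because A and B are both 2 x 3
A - B   % [1 6 3; 1 6 7]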

  8. Matrix multiplication I. There are different kinds of multiplication in MATLAB. Scalar multiplication multiplies every element of a matrix by the same number. MATLAB does all this for you: 3 * A

  9. Matrix multiplication II. Each entry of the product is the sum of products of a row of A with a column of B. Matrix multiplication rule: if A is m x n and B is k x l, then A x B is only viable if n = k, and the result is m x l. MATLAB does all this for you: C = A * B
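A sketch of the size rule with made-up matrices: A is 2 x 3 and B is 3 x 2, so the inner dimensions match and A * B is defined.

A = [1 2 3; 4 5 6];      % 2 x 3  (m = 2, n = 3)
B = [1 0; 0 1; 2 2];     % 3 x 2  (k = 3, l = 2)
C = A * B                % valid because n = k = 3; C is 2 x 2
% note that in general A * B is not the same as B * A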

  10. Element-wise multiplication. Rule: the matrices need exactly the same m and n, i.e. the same size. MATLAB does all this for you: A .* B
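A sketch contrasting element-wise and matrix multiplication, using arbitrary 2 x 2 examples.

A = [1 2; 3 4];
B = [10 20; 30 40];
A .* B   % element-wise: [10 40; 90 160]  (A and B must be the same size)
A * B    % matrix product: [70 100; 150 220] - a different operation entirely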

  11. Transposition – reorganising matrices: each column becomes a row and each row becomes a column. In MATLAB: A^T = A'
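A minimal transpose sketch with an arbitrary 2 x 3 matrix.

A = [2 7 4; 3 8 9];   % 2 x 3
A'                    % 3 x 2: [2 3; 7 8; 4 9] - rows and columns swapped
% for real-valued matrices A' and A.' give the same result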

  12. Identity matrices. The identity matrix is a matrix which plays a similar role as the number 1 in ordinary multiplication: A I_n = A (worked example on the slide for a 3 x 3 matrix). It is a useful tool for solving equations. In MATLAB: eye(n) produces the n x n identity matrix (eye(r, c) gives an r x c matrix with ones on the main diagonal).
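A quick check that the identity matrix behaves like the number 1 under multiplication, using an arbitrary 3 x 3 example.

I = eye(3);                  % 3 x 3 identity matrix
A = [1 2 3; 4 5 6; 7 8 10];  % arbitrary example matrix
A * I                        % returns A unchanged
I * A                        % also returns A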

  13. Inverse matrices. Definition: a matrix A is invertible if there exists a matrix B such that AB = BA = I_n. • The notation for the inverse of a matrix A is A^-1 • If A is invertible, A^-1 is also invertible, and A is the inverse matrix of A^-1. • In MATLAB: A^-1 = inv(A)
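A sketch of computing and verifying an inverse; the matrix here is just an arbitrary invertible example.

A = [4 7; 2 6];
B = inv(A);       % the inverse of A
A * B             % equals eye(2), up to rounding error
B * A             % likewise eye(2)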

  14. Determinants. • The determinant is a function that maps a square matrix to a single number, det(A). • A matrix A has an inverse matrix A^-1 if and only if det(A) ≠ 0. • In MATLAB: det(A)
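A sketch of the determinant test for invertibility, with two 2 x 2 examples; the second is deliberately singular.

A = [2 3; 1 -2];
det(A)          % (2*-2) - (3*1) = -7, non-zero, so A is invertible
B = [1 2; 2 4]; % second row is twice the first
det(B)          % 0, so B has no inverse (inv(B) would warn that B is singular)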

  15. With more than one equation and more than one unknown, we can use the same idea as for a single equation: write the system in matrix form AX = B, where A holds the coefficients, X the unknowns and B the right-hand sides.

  16. Given a right-hand side B, we need to find the determinant of matrix A (because X = A^-1 B). From earlier, det(A) = (2 x -2) - (3 x 1) = -4 - 3 = -7, so the determinant is -7. To find A^-1 of a 2 x 2 matrix: swap the two diagonal entries, change the sign of the two off-diagonal entries, and divide everything by the determinant.
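A sketch of solving the system in MATLAB. The coefficient matrix A = [2 3; 1 -2] is consistent with the determinant calculation above; the right-hand side B is not given in the transcript, so the value used here is only an illustrative placeholder.

A = [2 3; 1 -2];
B = [5; 1];          % placeholder right-hand side (not from the slide)
X = inv(A) * B       % X = A^-1 * B, as on the slide
X = A \ B            % the backslash operator gives the same answer more stably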

  17. Scalars, vectors and matrices in SPM • Scalar: a variable described by a single number, e.g. the intensity of one voxel in an MRI scan. • Vector: in physics, a vector is a variable described by magnitude and direction; here we mean a column of numbers, e.g. the intensity of one voxel at different times, or of different voxels at the same time. • Matrix: a rectangular array of numbers (a set of column vectors) defined by its number of rows and columns, with entries x_11, x_12, ..., x_1n down to x_n1, ..., x_nn.

  18. Vector Space and Matrix Rank. Vector space: a space that contains a set of vectors and all those that can be obtained by multiplying vectors by a real number and then adding them (linear combinations). In other words, because each column of a matrix can be represented by a vector, the ensemble of its n column vectors defines a vector space for that matrix. Rank of a matrix: the number of vectors (rows or columns) that are linearly independent of each other. So, if there is a linear relationship between the rows or columns of a matrix, the matrix will be rank-deficient (and its determinant will be zero). For example, if there is a linear relationship between two columns X1 and X2, the determinant is zero and the vector space they define has only one dimension.
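A sketch of rank deficiency: in the example below the second column is a multiple of the first, so the columns are linearly dependent.

X = [1 2; 2 4];   % column 2 = 2 * column 1
rank(X)           % 1 - rank-deficient (less than the number of columns)
det(X)            % 0 - consistent with the rank deficiency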

  19. Eigenvalues and eigenvectors. Eigenvalues are multipliers: numbers that represent how much linear transformation or stretching has taken place. An eigenvalue of a square matrix is a scalar, usually represented by the Greek letter λ (lambda). Eigenvectors of a square matrix are non-zero vectors that, after being multiplied by the matrix, remain parallel to the original vector. For each eigenvector, the corresponding eigenvalue is the factor by which the eigenvector is scaled when multiplied by the matrix. All eigenvalues and eigenvectors satisfy the equation Ax = λx for a given square matrix A, i.e. the matrix A acts by stretching the vector x without changing its direction, so x is an eigenvector of A. For a symmetric matrix (such as a covariance matrix), the eigenvectors form a set of orthogonal vectors representing different dimensions of the original matrix A (important in Principal Component Analysis, PCA).
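A sketch of eig in MATLAB, checking Ax = λx for one eigenpair of an arbitrary symmetric matrix.

A = [2 1; 1 2];      % arbitrary symmetric example
[V, D] = eig(A);     % columns of V are eigenvectors, diag(D) the eigenvalues
A * V(:,1)           % stretches the eigenvector without changing its direction...
D(1,1) * V(:,1)      % ...so this gives the same vector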

  20. Matrix representations of neural connections. (Diagram: excitatory and inhibitory connections between input neurons and output neurons.) • We can create a mathematical model of the connections in a neural system • Each connection is either excitatory or inhibitory

  21. #2 -1 +1 #1 #3 Matrix Representations of Neural Connections Excitatory = Makes it easier for the post synaptic cell to fire Inhibitory = Makes it harder for the post synaptic cell to fire We can translate this information into a set of vectors (1 row matrices) • Input vector = (1 1) relates to activity (#1 #2) • Weight vector = (1 -1)relates to connection weight (#1 #2) Activity of Neuron 3 Input x weight Cancels out! But it is more complicated than this!

  22. How are matrices relevant to fMRI data? Basics of MR physics • Angular momentum: neutrons, protons and electrons spin about their axes, and this spinning of the nuclear particles produces angular momentum. • Certain nuclei exhibit magnetic properties: because a proton has mass, a positive charge and spin, it produces a small magnetic field. This small magnetic field is referred to as the magnetic moment, a vector quantity with magnitude and direction, oriented in the same direction as the angular momentum. • Under normal circumstances these magnetic moments have no fixed orientation (so there is no overall magnetic field). However, when exposed to an external magnetic field (B0), the nuclei begin to align. To detect the net magnetisation signal, a second magnetic field (B1) is introduced, applied perpendicular to B0 at the resonant frequency.

  23. How are matrices relevant to fMRI data? Y = X . β + ε. Observed = Predictors * Parameters + Error; BOLD = Design Matrix * Betas + Error. • Y is a matrix of BOLD signals • Each column represents a single voxel sampled at successive time points • Each voxel is considered an independent observation • So we analyse individual voxels over time, not groups over space. (Plot: signal intensity over time for one voxel.)

  24. The GLM equation, one row per scan (N scans): data vector Y = design matrix X x parameter vector β + error vector ε (Observed = Predictors x Parameters + Error). • Response variable Y: a single voxel sampled at successive time points; each voxel is considered an independent observation. • Explanatory variables X: assumed to be measured without error; may be continuous, indicating levels of an experimental factor. • Solving the equation for β tells us how much of the BOLD signal is explained by X.

  25. Pseudoinverse. In SPM, design matrices are NOT square (they have more rows than columns, especially for fMRI). So there is no exact unique solution, i.e. more than one solution is possible. SPM uses a mathematical trick called the pseudoinverse, an approximation in which the solution is constrained to be the one that minimises the squared error terms ε.
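A sketch of least-squares estimation with a pseudoinverse, using a made-up toy design matrix X and data vector Y; this is only an illustration of the idea, not SPM's actual code.

X = [ones(4,1), [0; 1; 0; 1]];   % toy design matrix: constant column + one regressor
Y = [10; 13; 9; 14];             % toy BOLD data for a single voxel (4 scans)
beta = pinv(X) * Y               % least-squares parameter estimates
residuals = Y - X * beta;        % the error term epsilon that is being minimised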

  26. How are matrices relevant to fMRI? (SPM analysis pipeline diagram: image time-series → realignment → smoothing (spatial filter) → general linear model with the design matrix → parameter estimates → statistical inference (random field theory, p < 0.05) → statistical parametric map, with normalisation to an anatomical reference.)

  27. In practice • Estimate the MAGNITUDE of signal changes and the MR INTENSITY levels for each voxel at various time points • The relationship between the experiment and the voxel changes is then established • These calculations require linear algebra and matrix manipulations • SPM stores the data as a matrix • Manipulation of matrices enables the unknown values to be calculated.

  28. Thank you! Questions?
