Matrix Factorization Methods
Ahmet Oguz Akyuz
Matrix Factorization Methods
  • Principal component analysis
  • Singular value decomposition
  • Non-negative matrix factorization
  • Independent component analysis
  • Eigen decomposition
  • Random projection
  • Factor analysis
What is PCA?
  • Simply a change of basis

In graphics terms:

Translation followed by a rotation

PCA
  • F = ET N, where N = D - M
  • D: data matrix
  • M: mean matrix
  • ET: transpose of E, the matrix whose columns are the eigenvectors of the covariance matrix of N
Statistical Terms Review
  • Mean: x̄ = (1/n) Σᵢ xᵢ
  • Standard Deviation: s = √(Σᵢ (xᵢ - x̄)² / (n - 1))
  • Variance: s² = Σᵢ (xᵢ - x̄)² / (n - 1)
  • Covariance: cov(X, Y) = Σᵢ (xᵢ - x̄)(yᵢ - ȳ) / (n - 1)
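These quantities can be computed, for example, with NumPy (the sample arrays below are arbitrary; `ddof=1` selects the n - 1 sample convention used above):

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0])
y = np.array([1.0, 3.0, 2.0, 5.0])

mean_x = x.mean()                     # (1/n) * sum(x_i)
var_x = x.var(ddof=1)                 # sample variance, divides by n - 1
std_x = x.std(ddof=1)                 # square root of the variance
cov_xy = np.cov(x, y, ddof=1)[0, 1]   # sample covariance of x and y

print(mean_x, var_x, std_x, cov_xy)
```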
Covariance
  • Covariance measures how two dimensions vary together
  • + sign: they increase together
  • - sign: as one increases, the other decreases
  • zero: the dimensions are uncorrelated (independent variables have zero covariance, but zero covariance does not by itself imply independence)
  • The magnitude gives the strength of the relationship
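A small illustration of the sign behaviour (the sequences are invented for the example):

```python
import numpy as np

t = np.arange(10, dtype=float)
up = 2 * t + 1        # increases with t -> positive covariance
down = -3 * t + 5     # decreases as t grows -> negative covariance

print(np.cov(t, up, ddof=1)[0, 1] > 0)     # True
print(np.cov(t, down, ddof=1)[0, 1] < 0)   # True
```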
Eigenvalues and Eigenvectors

Av = λv, v ≠ 0

  • v is an eigenvector of A
  • λ is the associated eigenvalue
  • Eigenvectors of a symmetric matrix (such as a covariance matrix) are orthogonal
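A quick NumPy check of these facts on a small symmetric matrix (chosen arbitrarily for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # symmetric, like a covariance matrix

vals, vecs = np.linalg.eigh(A)    # eigh is specialized for symmetric matrices

# Each column v of `vecs` satisfies A v = lambda v
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)

# For a symmetric matrix the eigenvectors are orthonormal
assert np.allclose(vecs.T @ vecs, np.eye(2))
print(vals)   # eigenvalues of [[2,1],[1,2]] are 1 and 3
```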
Steps of PCA
  • Compute the covariance matrix C = N NT / (n - 1)
  • Find eigenvalues and eigenvectors of C
  • Form ET (sort the eigenvectors by decreasing eigenvalue; the eigenvector of the largest eigenvalue is the first row, …)
  • F = ET N
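The steps above can be sketched in NumPy (not part of the original slides; the data matrix D is assumed to hold one variable per row and one observation per column, matching C = N NT / (n - 1)):

```python
import numpy as np

rng = np.random.default_rng(0)
D = rng.normal(size=(3, 100))            # 3 variables, 100 observations

# Step 0: subtract the mean of each variable (N = D - M)
N = D - D.mean(axis=1, keepdims=True)

# Step 1: covariance matrix C = N N^T / (n - 1)
C = N @ N.T / (N.shape[1] - 1)

# Step 2: eigenvalues and eigenvectors of C (eigh since C is symmetric)
vals, vecs = np.linalg.eigh(C)

# Step 3: sort eigenvectors by decreasing eigenvalue; ET has them as rows
order = np.argsort(vals)[::-1]
ET = vecs[:, order].T

# Step 4: project the centered data onto the new basis
F = ET @ N

# The rows of F are uncorrelated: the covariance matrix of F is diagonal
CF = F @ F.T / (F.shape[1] - 1)
assert np.allclose(CF, np.diag(np.diag(CF)))
```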
Results
  • Data is represented in a more suitable basis
  • Redundancy (noise, etc.) can be reduced by keeping only a subset of the eigenvectors (dimensionality reduction), which also compresses the data
What is SVD?
  • A more general means of change of basis
  • D = U W VT
  • The columns of U are the eigenvectors of DDT (U is orthogonal)
  • The columns of V are the eigenvectors of DTD (V is orthogonal)
  • W is a diagonal matrix whose entries, sorted in decreasing order, are the square roots of the eigenvalues of DDT (equivalently, of DTD); these are the singular values

Note: the nonzero eigenvalues of DDT and DTD are the same
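This relationship can be checked numerically, for instance with NumPy (a random matrix stands in for D):

```python
import numpy as np

rng = np.random.default_rng(1)
D = rng.normal(size=(4, 3))

U, w, VT = np.linalg.svd(D, full_matrices=False)

# D is recovered from the factors: D = U W V^T
assert np.allclose(D, U @ np.diag(w) @ VT)

# Singular values are the square roots of the eigenvalues of D^T D
eigvals = np.linalg.eigvalsh(D.T @ D)[::-1]   # flip to descending order
assert np.allclose(w, np.sqrt(eigvals))
```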

How can we use SVD?
  • Very useful in solving systems of linear equations
  • Gives the best possible answer (in a least-squares sense) when an exact solution does not exist!
Solution for a System of Linear Equations
  • Ax = y
  • x = A-1 y (but what if A-1 does not exist?)
  • If A = U W VT, then the pseudoinverse is A+ = V W-1 UT (defined even for non-square n x m matrices)
  • W-1 = diag(1/w1, 1/w2, …, 1/wm)
  • If wi is 0 (or numerically tiny) for some i, set 1/wi = 0 instead
  • Discarding small singular values this way is the same kind of dimensionality reduction as in PCA
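A small NumPy illustration of the pseudoinverse solve (the overdetermined 4 x 2 system below is invented for the example):

```python
import numpy as np

# Overdetermined system: 4 equations, 2 unknowns -> no exact solution
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([6.0, 5.0, 7.0, 10.0])

U, w, VT = np.linalg.svd(A, full_matrices=False)

# W^-1 = diag(1/w_i), but set 1/w_i = 0 where w_i is numerically zero
w_inv = np.where(w > 1e-12 * w.max(), 1.0 / w, 0.0)

x = VT.T @ np.diag(w_inv) @ U.T @ y     # x = V W^-1 U^T y

# Same answer as NumPy's least-squares solver
x_ref, *_ = np.linalg.lstsq(A, y, rcond=None)
assert np.allclose(x, x_ref)
print(x)   # best-fit line: intercept and slope
```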
What is NMF?
  • Given a non-negative matrix V find non-negative matrix factors W and H such that:

V ≈ WH

  • V = n x m
  • W = n x r
  • H = r x m

Choose r so that (n + m)r < nm, so the factorization compresses the data

NMF
  • NMF is distinguished from other methods by its non-negativity constraints
  • These permit a parts-based representation, because only additive combinations are allowed: nothing can be subtracted away
Cost Functions
  • We need a way to quantify the quality of the approximation, such as the Euclidean distance ||V - WH|| or the divergence D(V || WH)
Update Rules
  • The Euclidean distance ||V - WH|| is non-increasing under the following multiplicative update rules (Lee & Seung):

Haμ ← Haμ (WTV)aμ / (WTWH)aμ
Wia ← Wia (VHT)ia / (WHHT)ia
Update Rules
  • The divergence D(V || WH) is non-increasing under the following update rules (Lee & Seung):

Haμ ← Haμ (Σi Wia Viμ / (WH)iμ) / (Σk Wka)
Wia ← Wia (Σμ Haμ Viμ / (WH)iμ) / (Σν Haν)
How to perform NMF?
  • W and H can be seeded with non-negative random values
  • NMF then converges to a local minimum of the chosen error function by iteratively applying the update rules, since no update ever increases the error
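A minimal NumPy sketch of this procedure, using the Euclidean multiplicative updates from the previous slides (the matrix sizes, random seed, and iteration count are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, r = 6, 8, 3                        # (n + m) * r = 42 < 48 = n * m
V = rng.random((n, m))                   # non-negative data matrix

# Seed W and H with non-negative random values
W = rng.random((n, r))
H = rng.random((r, m))

err0 = np.linalg.norm(V - W @ H)         # initial approximation error
eps = 1e-12                              # guard against division by zero

for _ in range(500):
    # Multiplicative updates for the Euclidean cost ||V - WH||
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

err = np.linalg.norm(V - W @ H)
assert err < err0                        # the updates never increase the error
assert (W >= 0).all() and (H >= 0).all() # factors stay non-negative
```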
Example

Note the sparseness of the basis matrix

Other examples
  • A BRDF is factored using NMF (SIGGRAPH 2004)
  • Phase function in volume rendering?
  • What else?
References
  • Learning the parts of objects by non-negative matrix factorization, Daniel D. Lee & H. Sebastian Seung, Nature, 1999
  • Algorithms for non-negative matrix factorization, Daniel D. Lee & H. Sebastian Seung
  • Efficient BRDF importance sampling using a factored representation, Jason Lawrence, Szymon Rusinkiewicz & Ravi Ramamoorthi, SIGGRAPH 2004
  • A tutorial on principal component analysis, Jon Shlens
  • Singular value decomposition and principal component analysis, Rasmus Elsborg Madsen, Lars Kai Hansen & Ole Winther
  • Non-negative matrix factorization with sparseness constraints, Patrik O. Hoyer