Recognition, SVD, and PCA



  1. Recognition, SVD, and PCA

  2. Recognition • Suppose you want to find a face in an image • One possibility: look for something that looks sort of like a face (oval, dark band near top, dark band near bottom) • Another possibility: look for pieces of faces (eyes, mouth, etc.) in a specific arrangement

  3. Templates • Model of a “generic” or “average” face • Learn templates from example data • For each location in image, look for template at that location • Optionally also search over scale, orientation

  4. Templates • In the simplest case, based on image intensity • Template is the average of all faces in the training set • Comparison based on, e.g., SSD (sum of squared differences), as in the sketch below • More complex templates • Outputs of feature detectors • Color histograms • Both position and frequency information (wavelets)
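
A minimal sketch of intensity-template matching scored by SSD, assuming the image and template are grayscale float NumPy arrays (ssd_match and the brute-force scan are illustrative, not code from the lecture):

```python
import numpy as np

def ssd_match(image, template):
    """Slide the template over every location in the image and return the
    top-left corner of the window with the smallest sum of squared
    differences (SSD). Assumes float arrays to avoid integer overflow."""
    th, tw = template.shape
    best_ssd, best_pos = np.inf, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            ssd = np.sum((image[y:y + th, x:x + tw] - template) ** 2)
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (y, x)
    return best_pos
```

In practice the same scan would also loop over scale and orientation, as the slide notes.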

  5. Detection of Frontal / Profile Faces • [Figure: sample images and face detection results using a wavelet histogram template]

  6. More Face Detection Results • [Figure: additional detection results, from Schneiderman and Kanade]

  7. Recognition Using Relations Between Templates • Often easier to recognize a small feature • e.g., lips easier to recognize than faces • For articulated objects (e.g. people), template for whole class usually complicated • So, identify small pieces and look for spatial arrangements • Many false positives from identifying pieces

  8. Graph Matching • [Figure: model graph of body parts (head, arms, body, legs) alongside feature detection results]

  9. Graph Matching • [Figure: constraints between parts, e.g. arm next to and right of body, leg next to and below body]

  10. Graph Matching • [Figure: combinatorial search over assignments of detected features to model parts]

  11. Graph Matching • [Figure: combinatorial search; a consistent assignment marked OK]

  12. Graph Matching • [Figure: combinatorial search; an assignment that violates a constraint]

  13. Graph Matching • Large search space • Heuristics for pruning • Missing features • Look for maximal consistent assignment • Noise, spurious features • Incomplete constraints • Verification step at end
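
As a toy sketch of this combinatorial search (the part names, positions, and the single "above" constraint below are invented for illustration), one can enumerate assignments of detected features to model parts and discard any that violate a constraint:

```python
from itertools import permutations

def match(model_parts, constraints, detections):
    """Enumerate assignments of detections to model parts; keep those
    satisfying every pairwise constraint (brute force, no pruning)."""
    valid = []
    for choice in permutations(range(len(detections)), len(model_parts)):
        assignment = {part: detections[i] for part, i in zip(model_parts, choice)}
        if all(ok(assignment[a], assignment[b]) for a, b, ok in constraints):
            valid.append(assignment)
    return valid

# Each detection is (label, (x, y)); "head above body" means smaller y.
detections = [("head", (50, 10)), ("body", (52, 40)), ("leg", (50, 80))]
above = lambda p, q: p[1][1] < q[1][1]
print(match(["head", "body"], [("head", "body", above)], detections))
```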

  14. Recognition • Suppose you want to recognize a particular face • How does this face differ from the average face?

  15. How to Recognize Specific People? • Consider variation from the average face • Not all variations are equally important • Variation in a single pixel is relatively unimportant • If the image is a high-dimensional vector, we want to find the directions in this space with high variation

  16. Principal Components Analysis • Principal Components Analysis (PCA): approximating a high-dimensional data set with a lower-dimensional subspace • [Figure: scatter of data points with the original axes and the first and second principal component directions]

  17. Digression: Singular Value Decomposition (SVD) • Handy mathematical technique that has application to many problems • Given any m×n matrix A, there is an algorithm to find matrices U, V, and W such that A = UWVᵀ • U is m×n and orthonormal • V is n×n and orthonormal • W is n×n and diagonal

  18. SVD • Treat as black box: code widely available (svd(A,0) in Matlab)
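
In Python the same black box is numpy.linalg.svd; a quick sketch of the economy-size call (the analogue of Matlab's svd(A,0)) on a random matrix:

```python
import numpy as np

A = np.random.rand(6, 3)                          # any m x n matrix
U, w, Vt = np.linalg.svd(A, full_matrices=False)  # economy-size SVD
# U is m x n with orthonormal columns, w holds the diagonal of W
# (the singular values), and Vt is V transposed.
assert np.allclose(A, U @ np.diag(w) @ Vt)        # A = U W V^T
```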

  19. SVD • The wᵢ are called the singular values of A • If A is singular, some of the wᵢ will be 0 • In general, rank(A) = number of nonzero wᵢ • SVD is mostly unique (up to permutation of singular values, or if some wᵢ are equal)
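
A sketch of reading off rank from the singular values; the relative tolerance is a choice (floating point rarely produces exact zeros), not a value from the slides:

```python
import numpy as np

A = np.array([[1., 2.], [2., 4.], [3., 6.]])  # columns are parallel: rank 1
w = np.linalg.svd(A, compute_uv=False)        # singular values only
rank = int(np.sum(w > 1e-10 * w[0]))          # count the "nonzero" w_i
print(rank)                                   # 1
```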

  20. SVD and Inverses • Why is SVD so useful? • Application #1: inverses • A⁻¹ = (Vᵀ)⁻¹W⁻¹U⁻¹ = VW⁻¹Uᵀ, since U and V are orthonormal • This fails when some wᵢ are 0 • It’s supposed to fail: singular matrix • Pseudoinverse: if wᵢ = 0, set 1/wᵢ to 0 (!) • “Closest” matrix to inverse • Defined for all matrices (even non-square)
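
A sketch of the pseudoinverse built directly from the SVD; the tolerance for treating wᵢ as zero is an assumed cutoff:

```python
import numpy as np

def pinv_svd(A, tol=1e-10):
    """Pseudoinverse via SVD: invert the nonzero singular values, and
    set 1/w_i to 0 wherever w_i is (near) zero -- the "(!)" step."""
    U, w, Vt = np.linalg.svd(A, full_matrices=False)
    w_inv = np.zeros_like(w)
    nonzero = w > tol * w[0]            # treat tiny w_i as exactly zero
    w_inv[nonzero] = 1.0 / w[nonzero]
    return Vt.T @ np.diag(w_inv) @ U.T  # V W^-1 U^T
```

NumPy ships this same recipe as numpy.linalg.pinv.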

  21. SVD and Least Squares • Solving Ax = b by least squares • x = pseudoinverse(A) · b • Compute the pseudoinverse using SVD • Lets you see if the data is singular • Even if not singular, the ratio of max to min singular values (condition number) tells you how stable the solution will be • Set 1/wᵢ to 0 if wᵢ is small (even if not exactly 0)
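
A least-squares sketch using NumPy's built-in pinv (which implements exactly this SVD recipe; its rcond argument is the small-wᵢ cutoff), with the condition number computed as the stability check; the data here are made up:

```python
import numpy as np

A = np.array([[1., 1.], [1., 2.], [1., 3.]])  # overdetermined Ax = b
b = np.array([1., 2., 2.])
w = np.linalg.svd(A, compute_uv=False)
print("condition number:", w[0] / w[-1])      # max w_i / min w_i
x = np.linalg.pinv(A, rcond=1e-10) @ b        # x = pseudoinverse(A) b
print("least-squares solution:", x)
```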

  22. SVD and Eigenvectors • Let A = UWVᵀ, and let xᵢ be the ith column of V • Consider AᵀAxᵢ: AᵀAxᵢ = (VWUᵀ)(UWVᵀ)xᵢ = VW²Vᵀxᵢ = wᵢ²xᵢ • So the squared elements of W (the wᵢ²) are the eigenvalues of AᵀA, and the columns of V are its eigenvectors
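
A numerical sanity check of this identity on a random matrix (purely illustrative):

```python
import numpy as np

A = np.random.rand(5, 3)
U, w, Vt = np.linalg.svd(A, full_matrices=False)
x0 = Vt.T[:, 0]                                # first column of V
# A^T A x_0 should equal w_0^2 x_0: x_0 is an eigenvector of A^T A.
assert np.allclose(A.T @ A @ x0, w[0] ** 2 * x0)
```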

  23. SVD and Matrix Similarity • One common definition for the norm of a matrix is the Frobenius norm: ‖A‖F² = Σᵢ Σⱼ aᵢⱼ² • The Frobenius norm can be computed from the SVD: ‖A‖F² = Σᵢ wᵢ² • So changes to a matrix can be evaluated by looking at changes to its singular values
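
A quick check that the two formulas agree, on a random matrix:

```python
import numpy as np

A = np.random.rand(4, 3)
w = np.linalg.svd(A, compute_uv=False)
frob_entries = np.sqrt(np.sum(A ** 2))  # sqrt of sum of a_ij^2
frob_svd = np.sqrt(np.sum(w ** 2))      # sqrt of sum of w_i^2
assert np.isclose(frob_entries, frob_svd)
```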

  24. SVD and Matrix Similarity • Suppose you want to find the best rank-k approximation to A • Answer: set all but the largest k singular values to zero • Can form a compact representation by eliminating the columns of U and V corresponding to the zeroed wᵢ
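
A sketch of the truncation (rank_k is an illustrative name):

```python
import numpy as np

def rank_k(A, k):
    """Best rank-k approximation in the Frobenius norm: keep the k
    largest singular values and the matching columns of U and V."""
    U, w, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(w[:k]) @ Vt[:k, :]

A = np.random.rand(8, 5)
A2 = rank_k(A, 2)                 # compact: 8x2 and 2x5 factors, 2 values
print(np.linalg.matrix_rank(A2))  # 2
```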

  25. SVD and Orthogonalization • The matrix UVᵀ (A with all of its singular values set to 1) is the “closest” orthonormal matrix to A • Yet another useful application of the matrix-approximation properties of SVD • Much more stable numerically than Gram-Schmidt orthogonalization • Find the rotation given a general affine matrix
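
A sketch of the construction (replace W by the identity):

```python
import numpy as np

def nearest_orthonormal(A):
    """Closest orthonormal matrix to A: set every singular value to 1,
    i.e. return U V^T."""
    U, _, Vt = np.linalg.svd(A)
    return U @ Vt

R = nearest_orthonormal(np.random.rand(3, 3))  # e.g. clean up an affine matrix
assert np.allclose(R.T @ R, np.eye(3))         # columns are orthonormal
```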

  26. SVD and PCA • Principal Components Analysis (PCA): approximating a high-dimensional data set with a lower-dimensional subspace • [Figure: the same scatter of data points, original axes, and principal component directions as slide 16]

  27. SVD and PCA • Subtract the mean from each data point (centering, sometimes loosely called “whitening”) • Form the data matrix with points as rows and take its SVD • The columns of V are the principal components (keep the first k for a k-dimensional subspace) • The value of wᵢ gives the importance of each component
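
A compact PCA-via-SVD sketch on made-up data (100 points in 5 dimensions; k = 2 is arbitrary):

```python
import numpy as np

X = np.random.rand(100, 5)        # data matrix, points as rows
Xc = X - X.mean(axis=0)           # subtract out the mean
U, w, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:2]               # first k = 2 principal components
projected = Xc @ components.T     # coordinates in the subspace
print(w)                          # the w_i rank the components by importance
```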

  28. PCA on Faces: “Eigenfaces” • [Figure: the average face, the first principal component, and other components; for all except the average, “gray” = 0, “white” > 0, “black” < 0]

  29. Using PCA for Recognition • Store each person as the coefficients of the projection onto the first few principal components • Compute the projection of the target image and compare to the database (“nearest-neighbor classifier”)
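
A sketch of the nearest-neighbor step; mean_face, components, and the database of stored coefficient vectors are hypothetical inputs (e.g. from the PCA sketch above):

```python
import numpy as np

def recognize(target, mean_face, components, database):
    """Project the target face onto the principal components and return
    the identity whose stored coefficients are closest (nearest neighbor)."""
    coeffs = components @ (target - mean_face)  # k projection coefficients
    names = list(database)                      # database: {name: k-vector}
    dists = [np.linalg.norm(coeffs - database[n]) for n in names]
    return names[int(np.argmin(dists))]
```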
