
Recognition with Expression Variations

This presentation applies pattern recognition theory to face recognition under expression variations. It explores Principal Components Analysis (PCA), Linear Discriminant Analysis (LDA), and their combination to improve the discriminative power of recognition systems.


Presentation Transcript


  1. Recognition with Expression Variations. Pattern Recognition Theory, Spring 2003. Prof. Vijayakumar Bhagavatula. Derek Hoiem, Tal Blum

  2. Method Overview: a training image (N variables) passes through dimensionality reduction to a reduced representation (m < N variables); a test image is projected into the same reduced space and labeled by 1-NN Euclidean classification
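To make the pipeline concrete, here is a minimal NumPy sketch of its final step, 1-NN classification with Euclidean distance in the reduced space. The projection matrix `W` and training mean `mu` are assumed to come from the PCA or LDA steps sketched after the next two slides; the function and variable names are illustrative, not from the original slides.

```python
import numpy as np

def classify_1nn(W, mu, train_X, train_y, test_x):
    """Label a test image by its nearest training neighbour in the
    m-dimensional subspace (Euclidean distance).

    W : (d, m) projection matrix, mu : (d, 1) training mean,
    train_X : (d, N) flattened training images, test_x : (d,) test image.
    """
    Z = W.T @ (train_X - mu)                # (m, N) reduced training set
    z = W.T @ (test_x[:, None] - mu)        # (m, 1) reduced test image
    dists = np.linalg.norm(Z - z, axis=0)   # distance to every training sample
    return train_y[np.argmin(dists)]
```

Projecting the N training and T test images and scanning the distances is what gives the O(d · (N + T)) testing cost quoted on slide 12.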

  3. Principal Components Analysis • Minimize representational error in a lower-dimensional subspace of the input • Choose the eigenvectors corresponding to the m largest eigenvalues of the total scatter as the weight matrix: $S_T = \sum_{k=1}^{N} (x_k - \mu)(x_k - \mu)^T$, $W_{opt} = \arg\max_W |W^T S_T W| = [w_1 \; w_2 \; \cdots \; w_m]$
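A minimal sketch of this PCA step in NumPy, assuming training images are flattened into the columns of a d x N matrix. Rather than forming the d x d total scatter explicitly, it uses the SVD of the centered data, whose left singular vectors are exactly the eigenvectors of S_T; names are illustrative.

```python
import numpy as np

def pca_weights(X, m):
    """Top-m eigenvectors of the total scatter S_T = sum_k (x_k - mu)(x_k - mu)^T.

    X : (d, N) array, one flattened training image per column.
    Returns W : (d, m), columns ordered by decreasing eigenvalue, and mu.
    """
    mu = X.mean(axis=1, keepdims=True)
    Xc = X - mu                                   # center the data
    # S_T = Xc @ Xc.T, so the left singular vectors of Xc are its
    # eigenvectors; the SVD avoids building the huge d x d matrix.
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :m], mu
```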

  4. Linear Discriminant Analysis • Maximize the ratio of the between-class scatter to the within-class scatter in a lower-dimensional space than the input • Choose the top m eigenvectors of the generalized eigenvalue solution $S_B w_i = \lambda_i S_W w_i$, where $S_W = \sum_{i=1}^{c} \sum_{x_k \in \omega_i} (x_k - \mu_i)(x_k - \mu_i)^T$, $S_B = \sum_{i=1}^{c} (\mu_i - \mu)(\mu_i - \mu)^T$, and $W_{opt} = \arg\max_W \frac{|W^T S_B W|}{|W^T S_W W|} = [w_1 \; w_2 \; \cdots \; w_m]$
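The corresponding LDA step, as a hedged sketch: it builds S_W and S_B exactly as in the formulas above (S_B unweighted by class size, matching the slide) and solves the generalized eigenproblem via S_W^{-1} S_B. It assumes S_W is invertible, which for raw images it is not; that is the motivation for the next slide.

```python
import numpy as np

def lda_weights(X, y, m):
    """Top-m generalized eigenvectors of S_B w = lambda S_W w.

    X : (d, N) data, y : (N,) integer class labels.
    Assumes S_W is nonsingular (e.g. after a PCA reduction).
    """
    d = X.shape[0]
    mu = X.mean(axis=1, keepdims=True)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[:, y == c]
        mu_c = Xc.mean(axis=1, keepdims=True)
        Sw += (Xc - mu_c) @ (Xc - mu_c).T        # within-class scatter
        Sb += (mu_c - mu) @ (mu_c - mu).T        # between-class scatter
    # Solve S_W^{-1} S_B w = lambda w and keep the top-m eigenvectors.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs[:, order[:m]].real
```

Since S_B has rank at most c - 1, only c - 1 eigenvalues can be nonzero, which bounds the useful subspace dimension.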

  5. LDA: Avoiding Singularity • For N samples and c classes: • Reduce dimensionality to N - c using PCA • Apply LDA to the reduced space • Combine the weight matrices: $W_{opt} = W_{LDA} W_{PCA}$
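Combining the two sketches above gives this Fisherface-style procedure; the optional `m_pca` argument anticipates the next slide's variant. In the column-vector convention used here (an image x projects to W^T(x - mu)), the combined matrix is W_PCA W_LDA, which matches the slide's W_LDA W_PCA when rows act as the projection vectors.

```python
def pca_lda_weights(X, y, c, m_pca=None):
    """PCA to m_pca dimensions (default N - c, so S_W is nonsingular),
    then LDA in the reduced space; returns the combined projection.

    X : (d, N) training images, y : (N,) labels, c : number of classes.
    Uses pca_weights / lda_weights from the sketches above.
    """
    N = X.shape[1]
    if m_pca is None:
        m_pca = N - c                       # this slide's choice
    W_pca, mu = pca_weights(X, m_pca)
    X_red = W_pca.T @ (X - mu)              # project the training data
    W_lda = lda_weights(X_red, y, c - 1)    # at most c - 1 discriminants
    return W_pca @ W_lda, mu                # combined weight matrix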

  6. Discriminant Analysis of Principal Components • For N samples and c classes: • Reduce dimensionality to m < N - c using PCA • Apply LDA to the reduced space • Combine the weight matrices: $W_{opt} = W_{LDA} W_{PCA}$
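The Zhao et al. variant is then just a different choice of `m_pca`. A hypothetical usage, assuming `train_X`, `train_y`, and `test_x` as in the earlier classification sketch; the fraction 0.9 is an illustrative choice, not from the slides:

```python
# Keep fewer than N - c principal components before applying LDA.
N, c = train_X.shape[1], len(np.unique(train_y))
W_opt, mu = pca_lda_weights(train_X, train_y, c, m_pca=int(0.9 * (N - c)))
label = classify_1nn(W_opt, mu, train_X, train_y, test_x)
```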

  7. When PCA+LDA Can Help • Test includes subjects not present in training set • Very few (1-3) examples available per class • Test samples vary significantly from training samples

  8. Why Would PCA+LDA Help? • Allows more freedom of movement for maximizing between-class scatter • Removes potentially noisy low-ranked principal components in determining LDA projection • Goal is improved generalization to non-training samples

  9. PCA Projections: best 2-D projection (training and testing plots)

  10. LDA Projections: best 2-D projection (training and testing plots)

  11. PCA+LDA Projections: best 2-D projection (training and testing plots)

  12. Processing Time • Training time: < 3 seconds (Matlab, 1.8 GHz) • Testing time: O(d · (N + T)), where d is the reduced dimensionality, N the number of training images, and T the number of test images

  13. Results

  14. Sensitivity of PCA+LDA to Number of PCA Vectors Removed

  15. Conclusions • Recognition under varying expressions is an easy problem • LDA and PCA+LDA produce better subspaces for discrimination than PCA alone • Simply removing the lowest-ranked PCA vectors may not be a good strategy for PCA+LDA • Maximizing the minimum between-class distance may be a better strategy than maximizing the Fisher ratio

  16. References • M. Turk and A. Pentland, "Face recognition using eigenfaces," in Proc. IEEE Conf. on Computer Vision and Pattern Recognition, pp. 586-591, 1991 • P. N. Belhumeur, J. P. Hespanha, and D. J. Kriegman, "Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection," in Proc. European Conf. on Computer Vision, April 1996 • W. Zhao, R. Chellappa, and P. J. Phillips, "Discriminant Analysis of Principal Components for Face Recognition," in Proc. International Conference on Automatic Face and Gesture Recognition, pp. 336-341, 1998
