
Adam Yeh


Presentation Transcript


  1. Adam Yeh UCF Computer Vision REU Week 2

  2. Facial Expressions • Problem: given a face image, determine which of 7 basic facial expressions is displayed • Neutral, Anger, Disgust, Fear, Joy, Sadness, Surprise • FACS (Facial Action Coding System) • Identifies 44 basic facial actions • Each expression is a combination of these basic actions • Essentially a basic object recognition problem

  3. Current Research • Wang et al. [1] • Adaboost with Haar wavelet features • Similar to the Viola-Jones approach, except trained for specific expressions, not just faces • >90% precision, real-time • Other similar research exists with slight modifications • Different classifiers • Different learning algorithms
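
A minimal sketch of the AdaBoost-over-Haar-features idea using scikit-image and scikit-learn. The 24x24 patch size, the random stand-in data, and the 50 boosting rounds are illustrative assumptions, not details from [1]:

```python
import numpy as np
from skimage.transform import integral_image
from skimage.feature import haar_like_feature
from sklearn.ensemble import AdaBoostClassifier

def haar_features(patch):
    """All horizontal two-rectangle Haar features of a grayscale patch."""
    ii = integral_image(patch)
    return haar_like_feature(ii, 0, 0, patch.shape[1], patch.shape[0],
                             feature_type='type-2-x')

# Stand-in data: random 24x24 "face patches" with random expression labels (0..6).
rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(40, 24, 24)).astype(np.uint8)
labels = rng.integers(0, 7, size=40)

X = np.array([haar_features(p) for p in patches])

# AdaBoost over decision stumps doubles as feature selection: each boosting
# round picks the single Haar feature whose stump has the lowest weighted error.
clf = AdaBoostClassifier(n_estimators=50)
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```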

  4. Current Research • Using FACS: Kotsia, Pitas [2] • Asks the user to identify certain features in one control frame of the face • Charts the movement of those features and looks up which FACS action(s) are present • Uses machine learning to determine the actions • Achieves >95% performance • Problem: not fully automated because of the user input
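
A rough sketch of the general track-then-classify idea: the user marks control points on one neutral frame, the points are followed through the sequence, and the displacements feed a classifier. The pyramidal Lucas-Kanade tracker used here is a stand-in assumption, not the Candide-grid tracking of [2]:

```python
import numpy as np
import cv2

def track_displacements(frames, control_points):
    """frames: grayscale images; control_points: (N, 1, 2) float32 points
    the user clicked on the neutral first frame."""
    prev = frames[0]
    pts = control_points.copy()
    for frame in frames[1:]:
        pts, _, _ = cv2.calcOpticalFlowPyrLK(prev, frame, pts, None)
        prev = frame
    # Geometric displacement of each control point, neutral frame -> last frame,
    # flattened into one feature vector for a classifier (e.g. a multiclass SVM).
    return (pts - control_points).reshape(-1)
```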

  5. Current Research • Tong et al. [3] • Uses expert knowledge • Trains on specific action units (AUs), not the entire facial expression • Uses Gabor filters as features and Adaboost for feature selection • Uses a Bayesian network to combine AUs into facial expressions • 93% recognition rate
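
A minimal sketch of Gabor-filter feature extraction with OpenCV. The kernel size, scales, and orientations here are illustrative assumptions, not the parameters of [3]:

```python
import numpy as np
import cv2

def gabor_features(patch, ksize=9, scales=(2.0, 4.0), n_orient=4):
    """Filter a grayscale patch with a small Gabor bank; return response magnitudes."""
    feats = []
    for sigma in scales:
        for k in range(n_orient):
            theta = k * np.pi / n_orient
            kern = cv2.getGaborKernel((ksize, ksize), sigma, theta, 10.0, 0.5)
            resp = cv2.filter2D(patch.astype(np.float32), cv2.CV_32F, kern)
            feats.append(np.abs(resp).ravel())
    return np.concatenate(feats)

# Stand-in face region; per-AU classifiers (with AdaBoost for feature selection)
# would be trained on these vectors, and a Bayesian network would combine the AUs.
patch = (np.random.rand(48, 48) * 255).astype(np.uint8)
print(gabor_features(patch).shape)
```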

  6. Machine Learning Techniques • Support Vector Machines • Map data into a higher-dimensional space • Find a hyperplane that separates/classifies the data • K-nearest neighbors • Assign the test sample to the class most common among its k nearest neighbors • Adaboost • Combines many weak classifiers into a strong one by iteratively reweighting the training examples
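
A tiny scikit-learn sketch comparing the three listed classifiers on synthetic stand-in data. The RBF kernel, k=5, and the generated dataset are illustrative choices, not values from the cited papers:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for a face-feature matrix with 7 expression classes.
X, y = make_classification(n_samples=300, n_features=50, n_informative=20,
                           n_classes=7, n_clusters_per_class=1, random_state=0)

classifiers = {
    "SVM (RBF kernel)": SVC(kernel="rbf", C=1.0),
    "k-nearest neighbors (k=5)": KNeighborsClassifier(n_neighbors=5),
    "AdaBoost (decision stumps)": AdaBoostClassifier(n_estimators=100),
}

for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} cross-validated accuracy")
```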

  7. Project Possibility • Expand on Tong et al. • AdaSVM? • Has shown improvements over just Adaboost and SVM in other applications • Different classifiers? • Try different filters • Use knowledge of AUs • Different feature combination techniques
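
A rough sketch of what an AdaSVM-style pipeline could look like: AdaBoost over decision stumps ranks features, and an SVM is trained on the top-ranked subset. The round counts, kernel, and synthetic data are assumptions for illustration, not a confirmed recipe:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

def adasvm_fit(X, y, n_rounds=200, n_selected=100):
    """Rank features with boosted stumps, then train an SVM on the top subset."""
    booster = AdaBoostClassifier(n_estimators=n_rounds).fit(X, y)
    # feature_importances_ reflects how heavily each feature was used by the stumps.
    selected = np.argsort(booster.feature_importances_)[-n_selected:]
    svm = SVC(kernel="rbf").fit(X[:, selected], y)
    return selected, svm

def adasvm_predict(selected, svm, X):
    return svm.predict(X[:, selected])

# Synthetic stand-in for a large Haar/Gabor feature matrix.
X, y = make_classification(n_samples=300, n_features=500, n_informative=30,
                           n_classes=7, n_clusters_per_class=1, random_state=0)
selected, svm = adasvm_fit(X, y)
print(adasvm_predict(selected, svm, X[:5]))
```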

  8. Progress • This week • Background research • Coding the specific parts of Viola-Jones that are needed • Next week • Finish coding and test on current images • Try different features • Hopefully filter and modify training data as needed
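
A minimal NumPy sketch of the Viola-Jones pieces mentioned above, the integral image and one two-rectangle Haar feature evaluated from it. The 24x24 patch and the chosen rectangle are arbitrary for illustration:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row/column prepended for easy indexing."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = img.astype(np.int64).cumsum(axis=0).cumsum(axis=1)
    return ii

def rect_sum(ii, r, c, h, w):
    """Sum of pixels in the h x w rectangle whose top-left corner is (r, c)."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def two_rect_feature_x(ii, r, c, h, w):
    """Horizontal two-rectangle feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, r, c, h, half) - rect_sum(ii, r, c + half, h, half)

img = (np.random.rand(24, 24) * 255).astype(np.uint8)   # stand-in face patch
print(two_rect_feature_x(integral_image(img), 4, 4, 8, 12))
```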

  9. References • [1] Y. Wang, H. Ai, B. Wu, and C. Huang, "Real time facial expression recognition with AdaBoost," Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), vol. 3, Aug. 2004, pp. 926-929. • [2] I. Kotsia and I. Pitas, "Real time facial expression recognition from image sequences using support vector machines," Proceedings of the IEEE International Conference on Image Processing (ICIP 2005), vol. 2, Sept. 2005, pp. II-966-969. • [3] Y. Tong, W. Liao, and Q. Ji, "Facial Action Unit Recognition by Exploiting Their Dynamic and Semantic Relationships," accepted by IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI).
