
EMOTION ANALYSIS IN MAN-MACHINE INTERACTION SYSTEMS



  1. EMOTION ANALYSIS IN MAN-MACHINE INTERACTION SYSTEMS T. Balomenos, A. Raouzaiou, S. Ioannou, A. Drosopoulos, K. Karpouzis and S. Kollias, Image, Video and Multimedia Systems Laboratory, National Technical University of Athens

  2. Outline • Facial Expression Estimation • Face Detection • Facial Feature Extraction • Anatomical Constraints - Anthropometry • FP Localization • FAP calculation • Expression Profiles • Expression Confidence enforcement - Gesture analysis

  3. Face Detection (block diagram): each candidate window passes through two quick-rejection tests (image variance > T, and number of skin-colour pixels > T), then preprocessing (1. subtract mean, 2. divide by standard deviation), subspace projection Y = L^T X, and classification by a reduced SVM that outputs face / no face.
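
The cascade can be summarised in a minimal sketch. The thresholds, the skin-colour test, the projection matrix L and the svm_classify callback are placeholders for values and models the slide does not specify:

```python
import numpy as np

# Hypothetical thresholds; the slide only states "Variance > T" and
# "# Skin Color Pixels > T" without giving concrete values.
VAR_THRESHOLD = 100.0
SKIN_PIXEL_THRESHOLD = 50

def is_skin(ycrcb_pixel):
    """Very rough Cr/Cb skin test (illustrative thresholds only)."""
    _, cr, cb = ycrcb_pixel
    return 133 <= cr <= 173 and 77 <= cb <= 127

def detect_face(window_gray, window_ycrcb, L, svm_classify):
    """Sketch of the slide's pipeline: two quick-rejection tests,
    preprocessing, subspace projection Y = L^T X, reduced SVM."""
    # Quick rejection 1: low-variance windows cannot contain a face
    if window_gray.var() <= VAR_THRESHOLD:
        return False
    # Quick rejection 2: require a minimum number of skin-coloured pixels
    skin_count = sum(is_skin(p) for p in window_ycrcb.reshape(-1, 3))
    if skin_count <= SKIN_PIXEL_THRESHOLD:
        return False
    # Preprocessing: subtract mean, divide by standard deviation
    x = window_gray.astype(float).ravel()
    x = (x - x.mean()) / (x.std() + 1e-8)
    # Subspace projection, then classification by the reduced SVM
    y = L.T @ x
    return svm_classify(y) == 1
```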

  4. Multiple cue Facial Feature boundary extraction: eyes & mouth, eyebrows, nose • Edge-based mask • Intensity-based mask • NN-based mask (Y, Cr, Cb and DCT coefficients of the neighborhood) • Each mask is validated independently
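
The slide does not say how the validated masks are combined, only that each is validated independently; a minimal sketch, assuming a simple majority vote over the masks that pass validation:

```python
import numpy as np

def fuse_feature_masks(edge_mask, intensity_mask, nn_mask, validities):
    """Combine the three boolean feature masks.
    `validities` is a hypothetical list of booleans, one per mask,
    produced by the independent validation step."""
    masks = [m for m, ok in zip((edge_mask, intensity_mask, nn_mask), validities) if ok]
    if not masks:
        return np.zeros_like(edge_mask, dtype=bool)
    votes = np.sum(np.stack([m.astype(int) for m in masks]), axis=0)
    return votes >= (len(masks) + 1) // 2  # majority of the validated masks
```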

  5. Multiple cue feature extraction – an example

  6. Final mask validation through Anthropometry • Facial distances, with male/female separation, measured by the US Army over a 30-year period • The measured distances are normalized by division with Distance 7, i.e. the distance between the inner corners of the left and right eyes, two points a human cannot move.

  7. Anthropometry-based confidence • DA5n, DA10n: distances in the figures normalized by division with distance DA7 (DA5n = DA5/DA7, DA10n = DA10/DA7) • DAewn: normalized eye width, calculated from DA5 and DA7: DAewn = ((DA5 - DA7)/2)/DA7
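
A minimal sketch of the normalization and a confidence check; the `expected_ranges` table is a hypothetical stand-in for the published anthropometric ranges referred to on the previous slide:

```python
def anthropometric_confidence(DA5, DA7, DA10, expected_ranges):
    """Normalize measured facial distances by DA7 (inner eye-corner
    distance) and score how well they fall inside anthropometric ranges.
    `expected_ranges` maps "DA5n", "DA10n", "DAewn" to (low, high) tuples."""
    DA5n = DA5 / DA7                      # normalized distance 5
    DA10n = DA10 / DA7                    # normalized distance 10
    DAewn = ((DA5 - DA7) / 2) / DA7       # normalized eye width
    measured = {"DA5n": DA5n, "DA10n": DA10n, "DAewn": DAewn}
    in_range = [lo <= measured[k] <= hi for k, (lo, hi) in expected_ranges.items()]
    return sum(in_range) / len(in_range)  # fraction of plausible distances
```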

  8. Detected Feature Points (FPs)

  9. FAP-based description (Facial Animation Parameters) • Discrete features offer a neat, symbolic representation of expressions • Not constrained to a specific face model • Suitable for face cloning applications • MPEG-4 compatible: unified treatment of the analysis and synthesis parts in MMI environments

  10. FAPs estimation • Absence of a clear quantitative definition of FAPs • FAPs can be modelled through the movement of FDP feature points, using distances s(x, y), e.g. close_t_r_eyelid (F20) - close_b_r_eyelid (F22): D13 = s(3.2, 3.4), f13 = D13 - D13_NEUTRAL
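
A short sketch of this distance-based FAP estimate; the feature-point coordinates and the neutral-face distance are assumed inputs from the feature extraction stage:

```python
import math

def fp_distance(p, q):
    """Euclidean distance s(x, y) between two feature points (x, y pairs)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def estimate_fap(fp_a, fp_b, neutral_distance):
    """FAP value from feature-point movement: f = D - D_neutral,
    e.g. D13 = s(3.2, 3.4) for the right eyelid points, giving
    f13 = D13 - D13_NEUTRAL."""
    D = fp_distance(fp_a, fp_b)
    return D - neutral_distance
```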

  11. Sample Profiles of Anger
  A1: F4[22,124], F31[-131,-25], F32[-136,-34], F33[-189,-109], F34[-183,-105], F35[-101,-31], F36[-108,-32], F37[29,85], F38[27,89]
  A2: F19[-330,-200], F20[-335,-205], F21[200,330], F22[205,335], F31[-200,-80], F32[-194,-74], F33[-190,-70], F34[-190,-70]
  A3: F19[-330,-200], F20[-335,-205], F21[200,330], F22[205,335], F31[-200,-80], F32[-194,-74], F33[70,190], F34[70,190]
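
A crisp sketch of matching a measured FAP vector against such a profile; the actual system is built on fuzzy rules, so range membership would be graded rather than boolean:

```python
def matches_profile(faps, profile):
    """`faps` maps FAP names to measured values; `profile` maps the same
    names to (low, high) ranges. Returns True if every profiled FAP
    falls inside its range (crisp approximation of the fuzzy rules)."""
    return all(lo <= faps.get(name, 0) <= hi for name, (lo, hi) in profile.items())

# Anger profile A1 from the slide, expressed as FAP ranges
A1 = {"F4": (22, 124), "F31": (-131, -25), "F32": (-136, -34),
      "F33": (-189, -109), "F34": (-183, -105), "F35": (-101, -31),
      "F36": (-108, -32), "F37": (29, 85), "F38": (27, 89)}
```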

  12. Gesture Analysis • Gestures are too ambiguous to indicate emotion on their own • Gestures are used to support the confidence outcome of facial expression analysis • HMM gesture class probabilities are mapped to emotional states through a transformation table • Cr/Cb based hand detection
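
A minimal sketch of turning HMM gesture-class probabilities into per-emotion support through such a transformation table; the table weights and emotion labels are hypothetical:

```python
def emotion_support_from_gestures(gesture_probs, transform):
    """`gesture_probs` maps gesture classes to HMM probabilities;
    `transform[gesture][emotion]` is a weight from the transformation
    table. The returned support values can then reinforce the confidence
    of the facial expression analysis result."""
    support = {}
    for gesture, p in gesture_probs.items():
        for emotion, w in transform.get(gesture, {}).items():
            support[emotion] = support.get(emotion, 0.0) + p * w
    return support
```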

  13. Emotion analysis system overview • G: the value of the corresponding FAP • f: values derived from the calculated distances

  14. System Interface (showing the calculated FP distances, the rules activated and the recognised emotion)

  15. Conclusions • Estimation of a user's emotional state based on a fuzzy rules architecture • MPEG-4: a compact and established means for HCI • Evaluation approach based on anthropometric models and measurements • Work on validating the described developments continues in the framework of the IST ERMIS project and HUMAINE
