Quantification of Facial Asymmetry for Expression-invariant Human Identification (PowerPoint Presentation Transcript)
Quantification of Facial Asymmetry for Expression-invariant Human Identification

Yanxi Liu

[email protected]

The Robotics Institute

School of Computer Science

Carnegie Mellon University

Pittsburgh, PA USA


Acknowledgements

  • Joint work with Drs. Karen Schmidt and Jeff Cohn (Psychology, University of Pittsburgh).

  • Students who worked on the data as research projects: Sinjini Mitra, Nicoleta Serban, and Rhiannon Weaver (Statistics, CMU), Yan Karklin and Dan Bohus (Computer Science, CMU), and Marc Fasnacht (Physics, CMU).

  • Helpful discussions and advice provided by Drs. T. Minka, J. Schneider, B. Eddy, A. Moore and G. Gordon.

  • Partially funded by a DARPA HID grant to CMU entitled “Space Time Biometrics for Human Identification in Video”.


Human Faces Are Asymmetrical

[Figure: left-face and right-face composite images]

Under Balanced Frontal Lighting (from the CMU PIE Database)


What Is Facial Asymmetry?

  • Intrinsic facial asymmetry in individuals is determined by biological growth, injury, age, expression …

  • Extrinsic facial asymmetry is affected by viewing orientation, illuminations, shadows, highlights …


Extrinsic Facial Asymmetry in an Image Is Pose-variant

[Figure: left face | original image | right face]


Facial Asymmetry Analysis

  • Many psychological studies have examined

    • attractiveness vs. facial asymmetry (Thornhill & Buelthoff 1999)

    • expression vs. facial movement asymmetry

  • Identification

    • Humans are extremely sensitive to facial asymmetry

    • Facial attractiveness for men is inversely related to recognition accuracy (O’Toole 1998)

Limitations of these studies: qualitative, subjective, based on still photos


Motivations

  • Facial (a)symmetry is a holistic structural feature that has not been explored quantitatively before

  • It is unknown whether intrinsic facial asymmetry is characteristic of human expressions or of human identities


The Question to Be Answered in This Work

How does intrinsic facial asymmetry affect human face identification?


DATA: Expression Videos (Cohn-Kanade AU-Coded Facial Expression Database)

[Figure: neutral-to-peak frames for joy, anger, and disgust]


Sample Facial Expression Frames

55 subjects in total; each subject has three distinct expression videos with varying numbers of frames, for 3703 frames in all.

[Figure: sample neutral, joy, disgust, and anger frames]


Face Image Normalization

The face midline is determined by three reference points: the two inner canthi and the philtrum. Each image is normalized by an affine deformation based on these 3 reference points.
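The normalization step can be sketched as solving for the 2×3 affine matrix that maps the 3 landmark points (the two inner canthi and the philtrum) onto 3 canonical destination coordinates. This is a minimal sketch, not the talk's actual implementation; the canonical destination points would be fixed by the dataset's conventions.

```python
import numpy as np

def affine_from_3_points(src, dst):
    """Solve for the 2x3 affine matrix A that maps 3 source points
    (e.g. inner canthi and philtrum, an assumption here) onto 3
    canonical destination points. src, dst: (3, 2) arrays."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    # Homogeneous source coordinates: each row is (x, y, 1)
    src_h = np.hstack([src, np.ones((3, 1))])      # (3, 3)
    # Solve src_h @ A.T = dst for the 2x3 affine A
    A_t, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    return A_t.T                                   # (2, 3)

def apply_affine(A, pts):
    """Apply a 2x3 affine matrix to an (n, 2) array of points."""
    pts = np.asarray(pts, float)
    return pts @ A[:, :2].T + A[:, 2]
```

In practice one would then resample the whole image under this transform (e.g. with an image-warping routine); the matrix solve above is the core of the 3-point normalization.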


Quantification of Facial Asymmetry

1. Density Difference: D-face

D(x,y) = I(x,y) − I′(x,y)

where I(x,y) is the normalized face image and I′(x,y) is the bilateral reflection of I(x,y) about the face midline.

2. Edge Orientation Similarity: S-face

S(x,y) = cos θ(x,y)

where Ie, Ie′ are the edge images of I and I′ respectively, and θ(x,y) is the angle between the two gradient vectors at each pair of corresponding points.
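The two measures above can be sketched in a few lines of NumPy. This is a simplified version: it assumes the image is already midline-normalized with the face midline on the vertical center axis, and it uses raw image gradients in place of the talk's edge images.

```python
import numpy as np

def asymmetry_faces(img):
    """Compute simplified D-face and S-face for a midline-normalized
    face image (2-D float array, face midline = vertical center axis)."""
    mirror = img[:, ::-1]                  # bilateral reflection I'
    d_face = img - mirror                  # density difference D(x,y)
    # Gradients of I and I' (stand-ins for the edge images Ie, Ie')
    gy, gx = np.gradient(img)
    gy_m, gx_m = np.gradient(mirror)
    # Cosine of the angle between the two gradient vectors at each pixel
    dot = gx * gx_m + gy * gy_m
    norm = np.hypot(gx, gy) * np.hypot(gx_m, gy_m) + 1e-12
    s_face = dot / norm                    # S(x,y) = cos(theta)
    return d_face, s_face
```

A perfectly symmetric image gives a D-face of all zeros and an S-face near 1 everywhere the gradient is nonzero, which matches the intuition behind both measures.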


Asymmetry Faces

Half of a D-face or S-face contains all the needed information. We call these half faces (Dh, Sh, Dhx, Dhy, Shx, Shy) AsymFaces.

[Figure: original image | D-face | S-face]


Asymmetry Measure Dhy for Two Subjects, Each with 3 Distinct Expressions

[Plots of Dhy from forehead to chin for each subject, across joy | anger | disgust]


[Spatial and temporal plots of the asymmetry measures, from forehead to chin]


Evaluation of Discriminative Power of Each Dimension in AsymFace Dhy

[Plot of the variance ratio along Dhy, from forehead to chin; bridge of nose marked]


Experiment Setup

55 subjects, each with three expression video sequences (joy, anger, disgust); 3703 frames in total. Human identification tests are done as follows:

Experiment #1: train on joy and anger, test on disgust;

Experiment #2: train on joy and disgust, test on anger;

Experiment #3: train on disgust and anger, test on joy;

Experiment #4: train on neutral expression frames, test on peak;

Experiment #5: train on peak expression frames, test on neutral.

The above five experiments are carried out using

(1) AsymFaces, (2) Fisherfaces, and (3) AsymFaces and Fisherfaces together.
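The leave-one-expression-out protocol above can be sketched as follows. The 1-nearest-neighbor classifier here is an assumption for illustration; the talk's actual experiments use AsymFaces and Fisherfaces classifiers.

```python
import numpy as np

def leave_one_expression_out(feats, subj, expr, test_expr):
    """Train on frames from two expressions, test identification on the
    held-out expression, using a 1-NN classifier (classifier choice is
    an assumption, not necessarily the talk's method).

    feats: (n_frames, n_dims), subj: (n_frames,) subject ids,
    expr:  (n_frames,) expression labels, test_expr: held-out label.
    Returns identification accuracy on the held-out expression.
    """
    train = expr != test_expr
    test = ~train
    X_tr, y_tr = feats[train], subj[train]
    X_te, y_te = feats[test], subj[test]
    # Squared Euclidean distance from every test frame to every train frame
    d = ((X_te[:, None, :] - X_tr[None, :, :]) ** 2).sum(axis=-1)
    pred = y_tr[d.argmin(axis=1)]
    return (pred == y_te).mean()
```

Running this once per held-out expression reproduces the structure of Experiments #1-#3; Experiments #4-#5 swap the train/test masks to neutral vs. peak frames instead.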


Sample Results: Combining Fisherfaces (FF) with AsymFaces (AF) (Liu et al. 2002)

The data set is composed of 55 subjects, each with three expression videos.

There are 1218 joy frames, 1414 anger frames and 1071 disgust frames, for 3703 frames in total.



Complementing Conventional Face Classifiers, Evaluated Quantitatively

107 pairs of face images taken from the FERET database.

It is shown that the asymmetry signature's discriminating power

(1) differs from chance with a p-value << 0.001;

(2) is independent of the features used in conventional classifiers, and decreases the error rate of a PCA classifier by 38% (15% → 9.3%).



Summary

  • Quantification of facial asymmetry is computationally feasible.

  • The intrinsic facial asymmetry of specific regions captures individual differences that are robust to variations in facial expression.

  • AsymFaces provide discriminating information that is complementary to conventional face identification methods (Fisherfaces).


Future Work

  • (1) Construct multiple, more robust facial asymmetry measures that can capture intrinsic facial asymmetry under illumination and pose variations, using PIE as well as other publicly available facial data.

  • (2) Develop computational models for studying how recognition rates are affected by facial asymmetry under gender, race, attractiveness, and hyperspectral variations.

  • (3) Study pose estimation using a combination of facial asymmetry and skewed symmetry.

