
Face alignment using Boosted Appearance Model (BAM)



  1. Face alignment using Boosted Appearance Model (BAM). Satya Mahesh Muddamsetty. Supervisor: Tommaso Gritti, Video Processing & Analysis group. Examiner: Mikael Nilsson, Department of Signal Processing, BTH. September 30, 2009

  2. Outline • Introduction • Brief summary of previous methods • Shape model learning in BAM • Appearance model learning in BAM • Alignment using BAM • Experiments & Results • Conclusion

  3. Introduction: Image alignment/fitting Image alignment is the process of moving and deforming a template to minimize the distance between the template and an image. Alignment involves three choices. Step 1: model choice (a template, ASM or AAM). Step 2: distance metric, e.g. MSE (mean square error). Step 3: optimization, e.g. gradient descent methods.
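
The three ingredients above can be made concrete with a small, self-contained sketch (not part of the presentation): the model is a raw pixel template, the distance metric is MSE, and the optimizer is plain gradient descent over a two-parameter translation. The translation-only warp and all names are illustrative assumptions.

import numpy as np

def align_translation(image, template, p0, lr=0.5, iters=100):
    """Translation-only template alignment: gradient descent on the MSE
    between the template and the image patch at offset p = (tx, ty)."""
    h, w = template.shape
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        x, y = int(round(p[0])), int(round(p[1]))
        patch = image[y:y + h, x:x + w]
        residual = patch - template              # distance metric: (scaled) MSE residual
        gy, gx = np.gradient(patch)              # local image gradients
        # chain rule: d(MSE)/d(tx) and d(MSE)/d(ty); the constant factor is folded into lr
        grad = np.array([np.mean(residual * gx), np.mean(residual * gy)])
        p -= lr * grad                           # optimization: gradient descent step
    return p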

  4. Introduction: applications • Face fitting [Baker & Matthews '04 IJCV] • Tracking [Hager & Belhumeur '98 PAMI] • Medical image interpretation [Mitchell et al. '02 TMI] • Industrial inspection

  5. Outline • Introduction • Brief summary of previous methods (next) • Shape model learning in BAM • Appearance model learning in BAM • Alignment using BAM • Experiments & Results • Conclusion

  6. Brief summary of previous methods • Point distribution model (PDM) [Cootes et al. '92 BMVC] • Active shape model (ASM) [Cootes & Taylor '92 BMVC] • Active appearance model (AAM) [Cootes et al. '01 PAMI] • Inverse compositional (IC) and simultaneous inverse compositional (SIC) AAM fitting [Baker & Matthews '04 IJCV]

  7. Brief summary of previous methods • Active shape model (ASM) [Cootes & Taylor '92 BMVC] It uses the shape model (PDM) as a template: s = s̄ + Φb, where s̄ is the mean shape, Φ the matrix of eigenvectors and b the shape parameters. It seeks to minimize the distance between model points and the corresponding points found in the image. Drawback: only the local appearance information around each landmark is learned, which might not be the most effective way of modelling appearance.

  8. Brief summary of previous methods • Inverse compositional (IC) and simultaneous inverse compositional (SIC) AAM fitting [Baker & Matthews '04 IJCV]. AAM template: A(x) = A0(x) + Σ_i λ_i A_i(x), the mean appearance plus a linear combination of the appearance basis vectors.

  9. Brief summary of previous methods • Inverse compositional (IC) and simultaneous inverse compositional (SIC) AAM fitting [Baker & Matthews '04 IJCV]. AAM distance: Σ_x [ A0(x) + Σ_i λ_i A_i(x) − I(W(x; p)) ]², where x is an image coordinate, A0 the mean appearance, A_i the appearance basis, λ_i the appearance parameters, W(x; p) the warping function with shape parameters p, and I the image observation. (* Gross et al. '05 IVC)
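
As a hedged illustration (not from the presentation), the AAM distance above amounts to a few lines of numpy once the image has been sampled at the model pixel locations under the warp W(x; p); all argument names are illustrative.

import numpy as np

def aam_distance(warped_pixels, mean_app, app_basis, lam):
    """Sum of squared differences between the AAM appearance model and the image.
    warped_pixels : (d,)   image samples I(W(x; p)) at the model pixels x
    mean_app      : (d,)   mean appearance A0(x)
    app_basis     : (d, m) appearance basis vectors A_i(x), one per column
    lam           : (m,)   appearance parameters lambda_i
    """
    model = mean_app + app_basis @ lam       # A0(x) + sum_i lambda_i A_i(x)
    residual = model - warped_pixels
    return np.sum(residual ** 2)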

  10. Problems of previous methods • All these AAM methods are known to have a generalization problem: they degrade quickly when trained on large datasets. • Performance is poor on unseen data. • These models are generative models. How to solve this? • A new method known as the Boosted Appearance Model (BAM). • It is a discriminative model. • It has a shape model, an appearance model and a specific alignment method.

  11. Outline • Introduction • Brief summary of previous methods • Shape model learning in BAM (next) • Appearance model learning in BAM • Alignment using BAM • Experiments & Results • Conclusion

  12. Shape model learning in BAM • The same shape model (PDM) as in the previous methods is used. • The shape model is learned by applying principal component analysis (PCA) to the set of shape vectors S, where each sample shape from the training set stacks its n+1 landmark points as s_i = (x_i0, x_i1, x_i2, …, x_in, y_i0, y_i1, …, y_in)^T, for observations i = 1, …, N. • Steps: 1. Compute the mean of the training data: s̄ = (1/N) Σ_i s_i. 2. Compute the covariance of the data: C = (1/N) Σ_i (s_i − s̄)(s_i − s̄)^T, a d × d matrix with d = 2(n+1). 3. Compute the eigenvectors Φ_i and the corresponding eigenvalues λ_i of C.

  13. Shape model learning in BAM • Eigenvectors: f_i = (f_i,1, f_i,2, …, f_i,d), with the same layout as a shape vector (x0, x1, x2, …, xn, y0, y1, …, yn), and eigenvalues λ_1, λ_2, …, λ_d. • The eigenvector with the highest eigenvalue is the most dominant shape variation in the training set, so the eigenvectors are ordered by decreasing eigenvalue. • Matrix of the k dominant eigenvectors: F = [f_1, f_2, …, f_k]. • Finally, the parametric shape model s can be expressed as the mean shape plus a linear combination of the k eigenvectors: s = s̄ + F b = s̄ + Σ_i b_i f_i.
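
A minimal numpy sketch of steps 1-3 from the previous slide together with the truncated model on this slide; the array shapes and function names are assumptions, not the thesis code.

import numpy as np

def learn_shape_model(S, k):
    """PCA shape model (PDM) from N training shapes.
    S : (N, d) array, each row a shape vector (x0..xn, y0..yn).
    Returns the mean shape, the k dominant eigenvectors F and their eigenvalues."""
    mean_shape = S.mean(axis=0)                       # step 1: mean of the training data
    C = np.cov(S - mean_shape, rowvar=False)          # step 2: d x d covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)              # step 3: eigenvectors and eigenvalues
    order = np.argsort(eigvals)[::-1]                 # sort by decreasing eigenvalue
    F = eigvecs[:, order[:k]]                         # (d, k) dominant shape modes
    return mean_shape, F, eigvals[order[:k]]

def synthesize_shape(mean_shape, F, b):
    """s = mean_shape + F b : mean plus a linear combination of the k eigenvectors."""
    return mean_shape + F @ b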

  14. Shape model learning in BAM Learned shape variations, obtained by varying one shape parameter at a time with respect to the mean shape (modes i = 1, 2, 3, 4, 5, …, k). [Figure: shape variation plots for the first modes]
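
A short, assumed usage sketch of the model learned above: mode plots like the ones referred to on this slide can be reproduced by varying one parameter b_i at a time by a few standard deviations (square roots of the eigenvalues).

import numpy as np

def mode_variations(mean_shape, F, eigvals, i, steps=(-3, 0, 3)):
    """Shapes obtained by varying only the i-th shape parameter b_i by a few
    standard deviations (sqrt of its eigenvalue) around the mean shape."""
    shapes = []
    for c in steps:
        b = np.zeros(F.shape[1])
        b[i] = c * np.sqrt(eigvals[i])
        shapes.append(mean_shape + F @ b)
    return shapes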

  15. Outline • Introduction • Brief summary of previous methods • Shape model learning in BAM • Appearance model learning in BAM (next) • Alignment using BAM • Experiments & Results • Conclusion

  16. Appearance model learning in BAM • Similar to AAM, our appearance model is defined on the warped image I(W(x; p)), i.e. the image warped with the shape parameters p. • In BAM, the appearance model is a set of M weak classifiers that learns the decision boundary between correct alignment (positive class) and incorrect alignment (negative class): F(I(W(x; p))) = Σ_{m=1}^{M} f_m(I(W(x; p))), a function of the warped image.

  17. Appearance model learning in BAM Training samples: • Positive samples: for each annotated training shape s, compute the shape parameters p = F^T (s − s̄), where s is the shape vector, s̄ the mean shape and F the matrix of eigenvectors. • Negative samples: perturb each element of the original shape parameters randomly, q perturbed shapes per original shape, as p' = p + v ⊙ λ, where v is a k-dimensional vector with each element uniformly distributed in [-1, 1] and λ is the vector of the k eigenvalues, so each component of the perturbation is scaled by its eigenvalue.
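
A hedged sketch of this sampling scheme: positives project the ground-truth shapes onto the PDM, negatives add q scaled uniform perturbations per positive. The exact scaling (sqrt of the eigenvalues times a perturbation range mu, echoing the ranges {0.8, 1, 1.2} used later in the experiments) is an assumption; all names are illustrative.

import numpy as np

def make_training_params(shapes, mean_shape, F, eigvals, q=10, mu=1.0, rng=None):
    """Shape parameters for BAM appearance-model training.
    Positives: p = F^T (s - mean_shape) for each annotated training shape s.
    Negatives: q perturbed copies of each positive, each component moved by a
    uniform [-1, 1] draw scaled by mu and by that component's spread."""
    rng = np.random.default_rng() if rng is None else rng
    positives, negatives = [], []
    for s in shapes:
        p = F.T @ (s - mean_shape)                  # project ground truth onto the PDM
        positives.append(p)
        for _ in range(q):
            v = rng.uniform(-1.0, 1.0, size=F.shape[1])
            negatives.append(p + mu * v * np.sqrt(eigvals))   # assumed scaling
    return np.array(positives), np.array(negatives)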

  18. Appearance model learning in BAM [Figure: the N original shapes and their warped images form the positive samples; the Nq perturbed shapes and their warped images form the negative samples]

  19. Appearance model learning in BAM Boosting: warped images from correctly aligned shapes are labelled +1, warped images from perturbed shapes are labelled -1. • Rectangular Haar features are computed on the warped images via the integral image (Viola and Jones).
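
A hedged numpy sketch of the Viola-Jones machinery mentioned here: build an integral image of a warped face and evaluate a two-rectangle Haar feature on it. The specific feature geometry is illustrative.

import numpy as np

def integral_image(img):
    """Summed-area table with an extra zero row/column so box sums need no bounds checks."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def box_sum(ii, y, x, h, w):
    """Sum of pixels in the rectangle with top-left (y, x), height h and width w."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two_rect(ii, y, x, h, w):
    """Two-rectangle Haar feature: left half minus right half (illustrative geometry)."""
    return box_sum(ii, y, x, h, w // 2) - box_sum(ii, y, x + w // 2, h, w // 2)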

  20. Appearance model learning in BAM Boosting: [Figure: K Haar features computed for the N positive (original) and Nq negative (perturbed) warped images; gentle boosting selects the most discriminative features]

  21. - 1 1 • Appearance Model learning in BAM Selected feature by gentle boost computed on any warped image m=1 m=2 m=3 m=4 m=5 m=6,.M m = 1,2,…..M Selected Haar feature by gentle boost threshold Weak classifier design:

  22. Appearance model learning in BAM [Figure: the final weak classifiers, m = 1 to 100, visualised through their selected Haar features]

  23. Outline • Introduction • Brief summary of previous methods • Shape model learning in BAM • Appearance model learning in BAM • Alignment using BAM (next) • Experiments & Results • Conclusion

  24. Alignment using BAM How do we align using BAM? Use the classification score from the trained strong classifier as the distance metric. How is this score computed? The strong classifier takes the warped image I(W(x; p)) as input and outputs a score; this score indicates the quality of alignment. [Figure: score distribution for the training samples, 400 positive samples (red) and 4000 negative samples (blue)]

  25. Alignment using BAM • Alignment here means: given initial parameters with a negative score, search for new parameters that maximize the (positive) score. • Finding these new parameters is a non-linear optimization problem, which we solve iteratively with a gradient ascent method.

  26. Alignment using BAM Alignment/fitting via the gradient ascent method: the correct parameters are solved for iteratively, updating p by a step in the direction of the gradient of the score, p ← p + α ∂F/∂p, where α is the step size. [Figure: alignment example with the RMSE to the ground-truth landmarks decreasing over the iterations]

  27. Alignment using BAM Gradient ascent solution: our trained two-class strong classifier is F(p) = Σ_{m=1}^{M} f_m(I(W(x; p))), and its gradient with respect to the shape parameters, ∂F/∂p = Σ_m ∂f_m/∂p, follows by the chain rule from the image gradient, the warp Jacobian ∂W/∂p and the selected Haar features.

  28. Alignment using BAM: summary Inputs: input image I, initial shape parameters p, warp Jacobian ∂W/∂p; BAM: shape model {mean shape s̄, eigenvectors F} and appearance model {weak classifiers f_m; m = 1, 2, …, M}. Step 0: compute the gradient ∇I of the image. Repeat: 1. Warp I with p to compute I(W(x; p)). 2. Compute the selected feature for each weak classifier on the warped input image. 3. Warp the gradient image with W(x; p). 4. Compute the steepest descent image SD = ∇I ∂W/∂p. 5. Compute the integral images for each column of SD and obtain the rectangular features for each weak classifier. 6. Compute the parameter update Δp from the weak classifier responses (the gradient ∂F/∂p). 7. Update p ← p + Δp, until the update is sufficiently small.
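
A hedged, high-level sketch of this fitting loop. For clarity the gradient ∂F/∂p is approximated by finite differences of the score, rather than through the steepest-descent-image computation of steps 3-5, which is the efficient analytic version; score_fn and all names are placeholders, not the thesis implementation.

import numpy as np

def fit_bam(image, p0, score_fn, step=0.1, tol=1e-4, max_iters=100, eps=1e-3):
    """Gradient-ascent fitting loop.
    score_fn(image, p) should return the strong-classifier score F(p) of the
    image warped with shape parameters p (steps 1-2 and 6 of the slide)."""
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iters):
        f0 = score_fn(image, p)
        grad = np.zeros_like(p)
        for i in range(p.size):                       # numerical approximation of dF/dp_i
            dp = np.zeros_like(p)
            dp[i] = eps
            grad[i] = (score_fn(image, p + dp) - f0) / eps
        p_new = p + step * grad                       # step 7: ascend the score
        if np.linalg.norm(p_new - p) < tol:           # stop when the update is tiny
            break
        p = p_new
    return p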

  29. Outline • Introduction • Brief summary of previous methods • Shape model learning in BAM • Appearance model learning in BAM • Alignment using BAM • Experiments & Results (next) • Conclusion

  30. Experiments & Results We used the challenging FERET dataset, which contains frontal images with variation in pose, race, illumination and expression. [Figure: sample images from the dataset]

  31. Experiments & Results Training. Shape model: we trained the shape model (PDM) with 1636 image annotations. Appearance model: TRAINset1 uses 400 images (400 positive samples, 4000 negative samples); TRAINset2 uses 800 images (800 positive samples, 8000 negative samples). Negative samples were created for different perturbation ranges {0.8, 1, 1.2} for both TRAINset1 and TRAINset2.

  32. Experiments & Results [Figure: alignment example, with the RMSE between fitted and annotated landmarks over the iterations]

  33. Experiments & Results Performance test data: TESTset1 contains 300 images drawn from the training data; TESTset2 contains 300 unseen images.

  34. Experiments & Results Performance of the model trained on TRAINset1. [Figure: fitting performance on TESTset1 and on TESTset2 (unseen data)]

  35. Experiments & Results Performance of the model trained on TRAINset2. [Figure: fitting performance on TESTset1 and on TESTset2 (unseen data)]

  36. Experiments & Results • Test on an image database with different illumination: the YALE database. • Collected 30 images. • Generated 5 random initializations per image, giving 150 trials in total.

  37. Outline • Introduction • Brief summary of previous methods • Shape model learning in BAM • Appearance model learning in BAM • Alignment using BAM • Experiments & Results • Conclusion (next)

  38. Conclusions • The idea of a discriminative method in AAM-style fitting seems like a powerful extension to the classical methods. • Computational complexity is still quite high. • The influence of the amount of perturbation in the training set is never mentioned in the literature, but it is very strong. • The integration of Procrustes analysis is not mentioned in the papers, even though it could help in building better shape models. Future work: • Compare results with classical AAM implementations. • Test with a very large training database.
