
Recognition of textures and object classes


Presentation Transcript


  1. Recognition of textures and object classes

  2. Introduction
  • Invariant local descriptors => robust recognition of specific objects or scenes
  • Recognition of textures and object classes => description of intra-class variation, selection of discriminant features
  [Example images: texture recognition, car detection]

  3. Overview
  • Affine-invariant texture recognition (CVPR'03)
  • A two-layer architecture for texture segmentation and recognition (ICCV'03)
  • Feature selection for object class recognition (ICCV'03)

  4. Affine-invariant texture recognition
  • Texture recognition under viewpoint changes and non-rigid transformations
  • Use of affine-invariant regions
    • invariance to viewpoint changes
    • spatial selection => more compact representation, reduction of redundancy in the texton dictionary
  [A sparse texture representation using affine-invariant regions, S. Lazebnik, C. Schmid and J. Ponce, CVPR 2003]

  5. Overview of the approach

  6. Region extraction
  • Harris detector
  • Laplace detector
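To make the two detector families on this slide concrete, here is a rough sketch using off-the-shelf routines (OpenCV's Harris response and scikit-image's Laplacian-of-Gaussian blob detector). It is not the affine-adapted region extraction of the paper; the image path, thresholds and scale range are placeholder assumptions.

```python
# Sketch only: generic Harris and Laplacian (LoG) interest-point detection
# with standard libraries, not the paper's affine-adapted extraction.
import cv2
import numpy as np
from skimage.feature import blob_log

img = cv2.imread("texture.png", cv2.IMREAD_GRAYSCALE)  # placeholder image path
img_f = img.astype(np.float32)

# Harris detector: corner-like points from the second-moment matrix response
harris = cv2.cornerHarris(img_f, blockSize=3, ksize=3, k=0.04)
harris_pts = np.argwhere(harris > 0.01 * harris.max())   # (row, col) locations

# Laplace detector: blob-like points with an associated characteristic scale
blobs = blob_log(img_f / 255.0, min_sigma=2, max_sigma=16, num_sigma=8,
                 threshold=0.05)                          # rows are (row, col, sigma)
```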

  7. Descriptors – Spin images
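A minimal sketch of an intensity-domain spin image, assuming the input patch is already affine-normalised and grayscale in [0, 1]. The bin counts and the use of hard binning are illustrative simplifications (the paper uses smoothed, soft binning).

```python
# Sketch of an intensity-domain spin image: a 2D histogram over
# (distance from the region centre, pixel intensity).
import numpy as np

def spin_image(patch, d_bins=10, i_bins=10):
    """patch: square grayscale patch with values in [0, 1], affine-normalised."""
    h, w = patch.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
    dist /= dist.max()                                   # distances normalised to [0, 1]
    hist, _, _ = np.histogram2d(dist.ravel(), patch.ravel(),
                                bins=(d_bins, i_bins), range=((0, 1), (0, 1)))
    return hist / (hist.sum() + 1e-12)                   # d_bins x i_bins descriptor
```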

  8. Spatial selection
  [Comparison: clustering each pixel vs. clustering selected pixels; see the sketch below]
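The contrast between the two strategies can be sketched with a standard k-means step. The descriptor arrays, the dictionary size k, and the use of k-means here are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch: texton dictionary built from every pixel vs. only from selected regions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
all_descriptors = rng.random((20_000, 40))        # stand-in: one descriptor per pixel
selected_descriptors = all_descriptors[:1_000]    # stand-in: detected regions only

k = 40                                            # illustrative dictionary size
dense_dict = KMeans(n_clusters=k, n_init=10).fit(all_descriptors).cluster_centers_
sparse_dict = KMeans(n_clusters=k, n_init=10).fit(selected_descriptors).cluster_centers_
# The sparse dictionary is cheaper to build and less redundant, which is the
# "more compact representation" motivated on slide 4.
```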

  9. Signature and EMD
  • Hierarchical clustering => signature: S = { (m1, w1), ..., (mk, wk) }
  • Earth mover's distance: D(S, S') = [ Σi,j fij d(mi, m'j) ] / [ Σi,j fij ]  (see the sketch below)
    • robust distance, optimizes the flow fij between the two distributions
    • can match signatures of different sizes
    • not sensitive to the number of clusters
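The EMD on this slide can be computed as a small transportation linear program. The sketch below follows the slide's formula directly using scipy's LP solver; it is an illustration under generic assumptions, not the optimised EMD solver used in the experiments.

```python
# Sketch: Earth Mover's Distance between two signatures via a transportation LP.
import numpy as np
from scipy.optimize import linprog
from scipy.spatial.distance import cdist

def emd(means1, weights1, means2, weights2):
    """Signatures: cluster centres (k x d arrays) with positive weight arrays."""
    d = cdist(means1, means2)                     # ground distances d(m_i, m'_j)
    n, m = d.shape
    A_ub, b_ub = [], []
    for i in range(n):                            # sum_j f_ij <= w_i
        row = np.zeros(n * m); row[i * m:(i + 1) * m] = 1
        A_ub.append(row); b_ub.append(weights1[i])
    for j in range(m):                            # sum_i f_ij <= w'_j
        row = np.zeros(n * m); row[j::m] = 1
        A_ub.append(row); b_ub.append(weights2[j])
    A_eq = [np.ones(n * m)]                       # total flow = min of total weights
    b_eq = [min(weights1.sum(), weights2.sum())]
    res = linprog(d.ravel(), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    flow = res.x
    return (d.ravel() @ flow) / flow.sum()        # D(S, S') from the slide
```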

  10. Database with viewpoint changes
  • 20 samples of 10 different textures

  11. Results
  [Comparison plots: spin images vs. Gabor-like filters]

  12. A two-layer architecture
  • Texture recognition + segmentation
  • Classification of individual regions + spatial layout
  [A generative architecture for semi-supervised texture recognition, S. Lazebnik, C. Schmid, J. Ponce, ICCV 2003]

  13. A two-layer architecture
  Modeling:
  • Distribution of the local descriptors (affine invariants)
    • Gaussian mixture model
    • estimation with EM, allows incorporating unsegmented images
  • Co-occurrence statistics of sub-class labels over affinely adapted neighborhoods
  Segmentation + recognition:
  • Generative model for initial class probabilities (see the sketch below)
  • Co-occurrence statistics + relaxation to improve labels
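A minimal sketch of the first layer under simplifying assumptions: one Gaussian mixture per texture class, fit with EM on that class's descriptors, then used generatively to obtain initial class probabilities for a region. The semi-supervised EM over unsegmented images and the co-occurrence/relaxation layer are not reproduced; the function and variable names are hypothetical.

```python
# Sketch: per-class Gaussian mixtures over local descriptors (EM via sklearn),
# used generatively for initial class probabilities of a region.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_class_models(descriptors_per_class, n_components=5):
    """descriptors_per_class: dict {class_name: (N_c x d) descriptor array}."""
    return {c: GaussianMixture(n_components=n_components, covariance_type="full")
               .fit(X)
            for c, X in descriptors_per_class.items()}

def class_posteriors(models, descriptor, priors=None):
    """Initial P(class | region descriptor) from the generative models."""
    classes = list(models)
    log_lik = np.array([models[c].score_samples(descriptor[None, :])[0] for c in classes])
    log_prior = np.log(priors if priors is not None
                       else np.ones(len(classes)) / len(classes))
    p = np.exp(log_lik + log_prior - (log_lik + log_prior).max())   # stable softmax
    return dict(zip(classes, p / p.sum()))
```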

  14. Texture Dataset – Training Images
  [Classes: T1 (brick), T2 (carpet), T3 (chair), T4 (floor 1), T5 (floor 2), T6 (marble), T7 (wood)]

  15. Effect of relaxation + co-occurrence
  [Original image; top: before relaxation (individual regions), bottom: after relaxation (co-occurrence)]

  16. Recognition + Segmentation Examples

  17. Animal Dataset – Training Images
  • no manual segmentation, weakly supervised
  • 10 training images per animal (with background)
  • no purely negative images

  18. Recognition + Segmentation Examples

  19. Object class detection
  • Description of intra-class variations of object parts
  [Selection of scale-invariant regions for object class recognition, G. Dorko and C. Schmid, ICCV'03]

  20. Object class detection
  • Description of intra-class variations of object parts
  • Selection of discriminant features

  21. Outline of the approach

  22. Clustering of descriptors
  • Descriptors are labeled as positive/negative
  • Hierarchical clustering of the positive/negative set
  • Examples of positive clusters

  23. Clustering of descriptors
  • Descriptors are labeled as positive/negative
  • Hierarchical clustering of the positive/negative set (see the sketch below)
  • Examples of positive clusters
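A short sketch of the clustering step using scipy's agglomerative (hierarchical) clustering; the linkage type and distance threshold are illustrative assumptions, not the paper's settings.

```python
# Sketch: hierarchical (agglomerative) clustering of positive descriptors.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_descriptors(descriptors, max_dist=0.5):
    """descriptors: (N x d) array of local descriptors from positive images."""
    Z = linkage(descriptors, method="average", metric="euclidean")
    labels = fcluster(Z, t=max_dist, criterion="distance")  # cluster id per descriptor
    return {c: descriptors[labels == c] for c in np.unique(labels)}
```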

  24. Classification
  • Learn a separate classifier for each cluster
  • Classifier: Support Vector Machine
  • Select significant classifiers
  • Feature selection with likelihood ratio / mutual information (see the sketch below)
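A sketch of this step under simplifying assumptions: one SVM per positive cluster (its members against the negative descriptors), then a likelihood-ratio and a mutual-information score computed from how often each classifier fires on positive vs. negative training descriptors. The firing-rate formulation, kernel choice and names are illustrative, not the paper's exact definitions.

```python
# Sketch: per-cluster SVMs plus likelihood-ratio / mutual-information ranking.
import numpy as np
from sklearn.svm import SVC

def train_cluster_classifiers(clusters, negatives):
    """clusters: {cluster_id: (N x d) positives}; negatives: (M x d) array."""
    clfs = {}
    for cid, pos in clusters.items():
        X = np.vstack([pos, negatives])
        y = np.r_[np.ones(len(pos)), np.zeros(len(negatives))]
        clfs[cid] = SVC(kernel="rbf", gamma="scale").fit(X, y)
    return clfs

def rank_classifiers(clfs, pos_desc, neg_desc, eps=1e-6):
    """Return {cluster_id: (likelihood ratio, mutual information)}."""
    scores = {}
    p_c = np.array([len(pos_desc), len(neg_desc)], float)
    p_c /= p_c.sum()                                   # P(descriptor is pos/neg)
    for cid, clf in clfs.items():
        fp = clf.predict(pos_desc).mean()              # P(fires | positive descriptors)
        fn = clf.predict(neg_desc).mean()              # P(fires | negative descriptors)
        lr = (fp + eps) / (fn + eps)                   # likelihood ratio
        mi = 0.0                                       # MI between "fires" and class
        for cond in ((fp, fn), (1 - fp, 1 - fn)):      # fires = 1, then fires = 0
            p_f = cond[0] * p_c[0] + cond[1] * p_c[1]
            for k in (0, 1):
                joint = cond[k] * p_c[k]
                if joint > 0 and p_f > 0:
                    mi += joint * np.log(joint / (p_f * p_c[k]))
        scores[cid] = (lr, mi)
    return scores
```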

  25. Likelihood – mutual information
  [Comparison figures: likelihood vs. mutual information selection (5, 10, 25)]

  26. Summary - Approach
  • Automatic construction of object part classifiers
    • scale and rotation invariant
    • no normalization/alignment of the training and test images
  • Selection of discriminant features
    • interest points, clustering
    • feature selection with likelihood or mutual information
  • Comparison of the two feature selection methods
    • likelihood: more discriminant but very specific
    • mutual information: discriminant but not too specific

  27. Material
  • PowerPoint presentation and papers will be available at http://www.inrialpes.fr/movi/people/Schmid/cvpr-tutorial03
