
Unsupervised Learning of Categories from Sets of Partially Matching Image Features



Presentation Transcript


  1. Unsupervised Learning of Categories from Sets of Partially Matching Image Features Kristen Grauman Trevor Darrell MIT

  2. Spectrum of supervision (figure: supervision ranging from less to more)

  3. Costs of supervision • Sacrifice scalability: practical limit on number of classes, number of training examples per class • Biases possible: human labeling could hinder potential performance

  4. Related work • Feature/part selection given category [Weber et al. ECCV 2000, Fergus et al. CVPR 2003, Berg et al. CVPR 2005,…] • Leveraging previously seen categories or images [Fei-Fei et al. ICCV 2003, Murphy et al. NIPS 2003, Holub et al. 2005,…] • Unsupervised category learning with probabilistic Latent Semantic Analysis [Hofmann 1999] [Fergus et al., Quelhas et al., Sivic et al., ICCV 2005]

  5. Goals Automatically recover categories from an unlabeled collection of images, and form predictive classifiers to label new images • Tolerate clutter, occlusion, common transformations • Allow optional, variable amount of supervision • Efficiency

  6. Sets of local features
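
Each image becomes an unordered set of local descriptors. As a stand-in for the talk's feature pipeline (Harris-Affine regions with SIFT descriptors, per slide 32), a minimal sketch using OpenCV's SIFT; the function name and the use of plain SIFT keypoints are illustrative assumptions:

    import cv2

    def image_to_feature_set(path):
        # Stand-in extractor: the talk uses Harris-Affine regions
        # [Mikolajczyk and Schmid] with SIFT descriptors [Lowe]; plain
        # OpenCV SIFT keypoints are assumed here for illustration.
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        keypoints, descriptors = cv2.SIFT_create().detectAndCompute(img, None)
        return descriptors  # array of shape (num_features, 128)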

  7. Partially matching sets • Similarity between two images is measured by the optimal partial matching between their feature sets
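
To make that concrete: in an optimal partial matching, every feature of the smaller set is paired with a distinct feature of the larger set so that the summed inter-feature distances are minimal. A minimal sketch with SciPy's Hungarian solver; the function name and the L1 distance are assumptions:

    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.spatial.distance import cdist

    def optimal_partial_match_cost(X, Y):
        # Optimal partial matching cost between feature sets X (m x d)
        # and Y (n x d): match the smaller set into the larger one to
        # minimize total L1 distance (Hungarian algorithm, cubic time).
        if len(X) > len(Y):
            X, Y = Y, X
        D = cdist(X, Y, metric="cityblock")    # pairwise L1 distances
        rows, cols = linear_sum_assignment(D)  # optimal assignment
        return D[rows, cols].sum()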

  8. Clustering with a partial matching

  9. Clustering with a partial matching

  10. Clustering with a partial matching

  11. Clustering with a partial matching

  12. Computing the partial matching • Optimal matching: cubic time in the set size • Greedy matching • Pyramid match [Grauman and Darrell, ICCV 2005]: linear in the number of features per set
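
Of these, the greedy variant is the simplest to sketch: repeatedly take the closest unused feature pair until the smaller set is exhausted. A hypothetical helper, not the talk's code:

    import numpy as np
    from scipy.spatial.distance import cdist

    def greedy_partial_match_cost(X, Y):
        # Greedy approximation to the optimal partial matching: always
        # pick the closest remaining (x, y) pair, marking both as used.
        D = cdist(X, Y, metric="cityblock")
        total = 0.0
        for _ in range(min(len(X), len(Y))):
            i, j = np.unravel_index(np.argmin(D), D.shape)
            total += D[i, j]
            D[i, :] = np.inf  # feature x_i is used up
            D[:, j] = np.inf  # feature y_j is used up
        return total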

  13. Review: Pyramid match (figure: optimal partial matching)

  14. Review: Pyramid match • Efficient: linear time in the number of features for building pyramids and matching • Orders of magnitude faster than the optimal match in practice • Accurate: produces rankings highly correlated with the optimal match • Useful as a kernel in a discriminative classifier: 50% accuracy on Caltech-101 with 15 training examples per class (58% with 30) • Bounded expected cost error [ICCV 2005, JMLR (to appear)]
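
A minimal uniform-bin pyramid match sketch, assuming features scaled to non-negative integer coordinates. The published kernel also normalizes by set size and weights levels by inverse bin volume, so this is a simplification:

    import numpy as np

    def count_bins(coords):
        # Histogram a set of quantized feature coordinates.
        bins = {}
        for c in coords:
            bins[tuple(c)] = bins.get(tuple(c), 0) + 1
        return bins

    def pyramid_match(X, Y, num_levels=5):
        # Multi-resolution histograms with geometrically growing bins;
        # histogram intersection counts matches implied at each level,
        # and matches formed at level i (bin side 2**i) get weight 1/2**i.
        prev = 0.0
        score = 0.0
        for i in range(num_levels):
            side = 2 ** i
            hx = count_bins(X // side)
            hy = count_bins(Y // side)
            inter = sum(min(hx[b], hy.get(b, 0)) for b in hx)
            score += (inter - prev) / side  # weight only NEW matches
            prev = inter
        return score

A pair that first shares a bin at level i is discounted by 1/2**i, reflecting how loosely the bin size bounds the distance between the implicitly matched features.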

  15. Pyramid match graph Build graph over image collection, with edges weighted by pyramid match similarity values
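
Building the graph is then an all-pairs computation; a sketch reusing the pyramid_match function above (all names are assumptions):

    import numpy as np

    def pyramid_match_graph(feature_sets):
        # Dense affinity matrix over the image collection, edge (i, j)
        # weighted by the pyramid match similarity of the two images.
        n = len(feature_sets)
        W = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                W[i, j] = W[j, i] = pyramid_match(feature_sets[i],
                                                  feature_sets[j])
        return W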

  16. Optional semi-supervision Adjust the pyramid match graph when pair-wise “should group” / “should not group” constraints are available.
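
One plausible way to fold pair-wise constraints into the affinities, in the spirit of constrained spectral clustering (the talk's exact adjustment may differ):

    def apply_pairwise_constraints(W, must_group=(), must_not_group=()):
        # Force maximal affinity for "should group" pairs and zero
        # affinity for "should not group" pairs before partitioning.
        W = W.copy()
        wmax = W.max()
        for i, j in must_group:
            W[i, j] = W[j, i] = wmax
        for i, j in must_not_group:
            W[i, j] = W[j, i] = 0.0
        return W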

  17. Graph partitioning Efficiently identify initial clusters with spectral clustering and normalized cuts criterion of [Shi & Malik]
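
With the affinity matrix in hand, scikit-learn's spectral clustering over a precomputed affinity gives a normalized-cuts-style partition; a sketch:

    from sklearn.cluster import SpectralClustering

    def partition_graph(W, num_categories):
        # Initial category clusters from the pyramid match graph.
        model = SpectralClustering(n_clusters=num_categories,
                                   affinity="precomputed",
                                   assign_labels="discretize")
        return model.fit_predict(W)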

  18. Limitation of the partial match graph partition: multiple object matches and background feature matches

  19. Extracting correspondences Extend pyramid match to return approximate feature correspondences

  20. Extracting correspondences Extend pyramid match to return approximate feature correspondences

  21. Extracting correspondences
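
The extension can be pictured as walking the pyramid from the finest level upward and pairing features the first time they share a bin, consuming each feature once. An illustrative reconstruction under the same quantization assumptions as before, not the paper's exact procedure:

    def extract_correspondences(X, Y, num_levels=5):
        # Approximate correspondences: features that co-occur in a bin
        # at a fine level are paired first; leftovers match more coarsely.
        free_x, free_y = list(range(len(X))), list(range(len(Y)))
        pairs = []
        for i in range(num_levels):  # finest bins first
            side = 2 ** i
            bins = {}
            for yi in free_y:
                bins.setdefault(tuple(Y[yi] // side), []).append(yi)
            for xi in list(free_x):
                key = tuple(X[xi] // side)
                if bins.get(key):
                    yi = bins[key].pop()
                    pairs.append((xi, yi))
                    free_x.remove(xi)
                    free_y.remove(yi)
        return pairs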

  22. Limitation of the partial match graph partition: multiple object matches and background feature matches

  23. Inferring feature masks (plot: contribution to match vs. feature index)
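
Per-feature contributions can be read off the same pyramid walk: a feature first matched at level i contributes that level's weight, 1/2**i. A sketch under the same simplifications:

    def feature_contributions(X, Y, num_levels=5):
        # Contribution of each feature of X to its match against Y:
        # features matched at fine resolution get the largest weight.
        contrib = [0.0] * len(X)
        free_x, free_y = set(range(len(X))), set(range(len(Y)))
        for i in range(num_levels):
            side = 2 ** i
            bins = {}
            for yi in free_y:
                bins.setdefault(tuple(Y[yi] // side), []).append(yi)
            for xi in list(free_x):
                key = tuple(X[xi] // side)
                if bins.get(key):
                    free_y.discard(bins[key].pop())
                    free_x.discard(xi)
                    contrib[xi] = 1.0 / side
        return contrib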

  24. Inferring feature masks (weighted feature mask)
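
Aggregating contributions across a cluster yields the weighted mask; a sketch building on feature_contributions above:

    import numpy as np

    def infer_feature_mask(idx, cluster, feature_sets, num_levels=5):
        # Average each feature's match contribution over all other
        # images in the cluster: consistently matched (foreground)
        # features accumulate high mask weight, background stays low.
        X = feature_sets[idx]
        mask = np.zeros(len(X))
        others = [j for j in cluster if j != idx]
        for j in others:
            mask += feature_contributions(X, feature_sets[j], num_levels)
        return mask / max(len(others), 1)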

  25. Refining intra-cluster matches (weighted feature mask)

  26. Refining intra-cluster matches (weighted feature mask)

  27. Refining intra-cluster matches (weighted feature mask)

  28. Refining intra-cluster matches (weighted feature mask)

  29. Refining intra-cluster matches (weighted feature mask)
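
One way to realize the refinement shown across slides 25–29, assuming a simple mask threshold (the threshold and names are illustrative):

    import numpy as np

    def refine_cluster_affinities(cluster, feature_sets, masks, thresh=0.5):
        # Re-match cluster members using only features whose mask weight
        # clears the threshold, suppressing background features that
        # matched inconsistently across the cluster.
        W = {}
        for a in cluster:
            Xa = feature_sets[a][np.asarray(masks[a]) >= thresh]
            for b in cluster:
                if b > a:
                    Xb = feature_sets[b][np.asarray(masks[b]) >= thresh]
                    W[(a, b)] = pyramid_match(Xa, Xb)
        return W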

  30. Selecting category prototypes (figure: candidate prototypes ranked 1–5)

  31. Selecting category prototypes
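
A natural reading of prototype selection is a medoid-style ranking: score each cluster member by its total affinity to the rest of its cluster and keep the top-ranked images (the talk's exact criterion may differ):

    import numpy as np

    def select_prototypes(labels, W, per_class=1):
        # The best prototypes are the images that match the rest of
        # their cluster most strongly under the (refined) affinities.
        prototypes = {}
        labels = np.asarray(labels)
        for c in set(labels.tolist()):
            members = np.flatnonzero(labels == c)
            totals = W[np.ix_(members, members)].sum(axis=1)
            prototypes[c] = members[np.argsort(-totals)][:per_class].tolist()
        return prototypes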

  32. Inferred feature masks (figure: low vs. high mask weight) • Harris-Affine detector [Mikolajczyk and Schmid] • SIFT descriptors [Lowe]

  33. Unsupervised recovery of category prototypes • Caltech-4 data set • 40 runs with 400 randomly selected images (plot: prototype accuracy per category vs. top percentile of prototypes)

  34. Semi-supervised category labeling • Caltech-4 data set • Recover categories and SVM classifiers from 400 unlabeled images • Classify 2788 unseen examples • 40 runs with random cluster/test set/supervision selections (plot: recognition rate per class vs. number of “must-group” pairs)
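
Because the pyramid match is a Mercer kernel [ICCV 2005], the recovered clusters can train an SVM directly on the precomputed similarity matrix, with cluster indices standing in as labels; a sketch:

    from sklearn.svm import SVC

    def train_category_classifier(W_train, cluster_labels):
        # W_train: pyramid match similarities among the unlabeled
        # training images; cluster indices serve as pseudo-labels.
        clf = SVC(kernel="precomputed")
        clf.fit(W_train, cluster_labels)
        return clf

    # At test time, pass each new image's row of pyramid match scores
    # against the training images:  clf.predict(W_test_vs_train)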

  35. Recent work: Vocabulary-guided pyramid match • With uniform bins, extracting correspondences can be slow and scores become inaccurate in high dimensions • A vocabulary-guided pyramid match tunes the pyramid partitions to the feature distribution • Accurate for d > 100 (figure: uniform bins vs. vocabulary-guided bins) [See our recent CSAIL tech report]
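
The vocabulary-guided variant replaces the uniform grid with data-driven partitions; the tech report uses hierarchical k-means, which a flat per-level k-means can approximate for illustration (all names and parameters here are assumptions):

    from sklearn.cluster import KMeans

    def build_vocabulary_bins(feature_sample, branch=4, levels=3):
        # Level 0 is finest (branch**levels centers); bins coarsen going
        # up the pyramid while following the feature distribution, which
        # keeps the match accurate in high-dimensional feature spaces.
        bins = []
        for i in range(levels):
            k = branch ** (levels - i)
            bins.append(KMeans(n_clusters=k, n_init=4).fit(feature_sample))
        return bins  # quantize a feature at level i via bins[i].predict(...)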

  36. Contributions • Efficient unsupervised / semi-supervised category learning from sets of local features • Automatic recovery of per-image feature masks without class labels • Extension to pyramid match for explicit correspondences

  37. Future work • Enforce geometry and contiguous spatial regions in the matching and feature masks • Explore exemplar-based classifiers • Automatic selection of the number of categories • Iterative cluster refinement / mask inference • Optimizing semi-supervision with a user
