
Similarity-based Classifiers: Problems and Solutions



Presentation Transcript


  1. Similarity-based Classifiers: Problems and Solutions

  2. Classifying based on similarities: Van Gogh or Monet?

  3. The Similarity-based Classification Problem: samples (paintings) and class labels (painter)

  4. The Similarity-based Classification Problem

  5. The Similarity-based Classification Problem

  6. Examples of Similarity Functions

Computational Biology
• Smith-Waterman algorithm (Smith & Waterman, 1981)
• FASTA algorithm (Lipman & Pearson, 1985)
• BLAST algorithm (Altschul et al., 1990)

Computer Vision
• Tangent distance (Duda et al., 2001)
• Earth mover's distance (Rubner et al., 2000)
• Shape matching distance (Belongie et al., 2002)
• Pyramid match kernel (Grauman & Darrell, 2007)

Information Retrieval
• Levenshtein distance (Levenshtein, 1966)
• Cosine similarity between tf-idf vectors (Manning & Schütze, 1999)
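As a concrete illustration of the last entry, here is a minimal sketch of cosine similarity between tf-idf vectors (the function names and the raw-tf / smoothed-idf weighting scheme are illustrative choices, not taken from the talk):

```python
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Build sparse tf-idf vectors for a small corpus (raw tf, idf = log(n/df) + 1)."""
    n = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))                       # document frequency per term
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    return [{t: tf * idf[t] for t, tf in Counter(toks).items()} for toks in tokenized]

def cosine_similarity(u, v):
    """Cosine similarity between two sparse (dict) vectors."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Documents that share weighted terms score above zero; documents with disjoint vocabularies score exactly zero, and every document has similarity 1 with itself.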

  7. Approaches to Similarity-based Classification


  9. Can we treat similarities as kernels?


  12. Example: Amazon similarity between 96 books (a 96×96 similarity matrix)


  14. Example: Amazon similarity between 96 books: eigenvalues of the 96×96 similarity matrix, plotted against rank
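A similarity matrix can be used directly as a kernel matrix only if it is symmetric positive semidefinite (PSD), which is what an eigenvalue-vs-rank plot reveals. A quick numpy sketch of that check, on a small hypothetical matrix standing in for the 96×96 Amazon one (this toy matrix is deliberately indefinite, mirroring the problem the slide illustrates):

```python
import numpy as np

# Hypothetical 3x3 similarity matrix standing in for the 96x96 Amazon one.
S = np.array([[1.0, 0.9, 0.9],
              [0.9, 1.0, 0.1],
              [0.9, 0.1, 1.0]])

S_sym = (S + S.T) / 2                 # symmetrize (real similarities may not be)
eigvals = np.linalg.eigvalsh(S_sym)   # eigenvalues, sorted ascending
is_psd = eigvals[0] >= -1e-10         # PSD iff no (meaningfully) negative eigenvalue
```

For this toy matrix the smallest eigenvalue is negative, so `is_psd` is `False`: treating such similarities as kernel values violates the assumptions of a kernel method.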

  15. Well, let's just make S be a kernel matrix


  18. Well, let's just make S be a kernel matrix: flip, clip, or shift the eigenvalue spectrum? The best bet is clip.
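Flip, clip, and shift are standard spectrum modifications that force an indefinite similarity matrix to be PSD. A minimal numpy sketch (the function name is illustrative; symmetrization is handled inside):

```python
import numpy as np

def make_psd(S, method="clip"):
    """Make a similarity matrix PSD by modifying its eigenvalue spectrum."""
    w, V = np.linalg.eigh((S + S.T) / 2)   # symmetrize, then eigendecompose
    if method == "flip":                   # replace eigenvalues by absolute values
        w = np.abs(w)
    elif method == "clip":                 # zero out the negative eigenvalues
        w = np.maximum(w, 0.0)
    elif method == "shift":                # add |smallest eigenvalue| to the spectrum
        w = w - min(w.min(), 0.0)
    return (V * w) @ V.T                   # reassemble: V diag(w) V^T
```

Clip is the nearest PSD matrix in Frobenius norm, which is one way to motivate the "best bet is clip" conclusion; flip and shift distort the positive part of the spectrum as well.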

  19. Well, let's just make S be a kernel matrix: learn the best kernel matrix for the SVM (Luss & d'Aspremont, NIPS 2007; Chen et al., ICML 2009)

  20. Approaches to Similarity-based Classification

  21. Let the similarities to the training samples be features
• SVM (Graepel et al., 1998; Liao & Noble, 2003)
• Linear programming (LP) machine (Graepel et al., 1999)
• Linear discriminant analysis (LDA) (Pekalska et al., 2001)
• Quadratic discriminant analysis (QDA) (Pekalska & Duin, 2002)
• Potential support vector machine (P-SVM) (Hochreiter & Obermayer, 2006; Knebel et al., 2008)
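None of the cited classifiers are reproduced here; as a rough sketch of the shared idea, the hypothetical numpy snippet below treats each row of the training similarity matrix as an ordinary feature vector and classifies with a simple nearest-centroid rule (a stand-in for the SVM, LP machine, LDA, QDA, and P-SVM variants listed above):

```python
import numpy as np

def fit_centroids(S_train, y_train):
    """S_train[i, j] = similarity between training samples i and j.
    Each row is used as a feature vector; return one centroid per class."""
    classes = np.unique(y_train)
    return classes, np.array([S_train[y_train == c].mean(axis=0) for c in classes])

def predict(S_test, classes, centroids):
    """S_test[i, j] = similarity of test sample i to training sample j.
    Assign each test row to the nearest class centroid (squared Euclidean)."""
    d = ((S_test[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]
```

The point is that once similarities are treated as features, any standard vector-space classifier applies; no PSD requirement on S is needed.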

  22. Approaches to Similarity-based Classification

  23. Weighted Nearest-Neighbors: take a weighted vote of the k nearest neighbors. This is the algorithmic parallel of the exemplar model of human learning.
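The voting rule can be sketched in a few lines (a hypothetical helper, not code from the talk): the k most similar training samples vote with nonnegative weights, and the class with the largest weighted vote wins.

```python
import numpy as np

def weighted_knn_predict(sims, labels, weights_fn, k=3):
    """Classify a test point from its similarities to the training samples.

    sims[i]    : similarity psi(x, x_i) of the test point to training sample i
    labels[i]  : class label of training sample i
    weights_fn : maps the k nearest similarities to nonnegative weights
    """
    nn = np.argsort(sims)[-k:]              # indices of the k most-similar samples
    w = weights_fn(sims[nn])
    votes = {}
    for wi, yi in zip(w, labels[nn]):
        votes[yi] = votes.get(yi, 0.0) + wi
    return max(votes, key=votes.get)        # class with the largest weighted vote
```

How `weights_fn` is chosen is exactly what the design goals on the following slides address.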


  25. Design Goals for the Weights

  26. Design Goals for the Weights. Design Goal 1 (Affinity): w_i should be an increasing function of ψ(x, x_i).


  28. Design Goals for the Weights (Chen et al., JMLR 2009). Design Goal 2 (Diversity): w_i should be a decreasing function of ψ(x_i, x_j) for i ≠ j.

  29. Linear Interpolation Weights: linear interpolation weights will meet these goals.


  31. LIME Weights: linear interpolation weights will meet these goals. Linear interpolation with maximum entropy (LIME) weights (Gupta et al., IEEE PAMI 2006).
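Gupta et al.'s exact formulation and solver are not reproduced on the slides; the sketch below is only a plausible stand-in for the idea: interpolate the test point from its neighbors while an entropy term pushes the weights toward uniform, solved here with exponentiated-gradient updates that keep the weights on the probability simplex. The function name and all parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def lime_weights(neighbors, x, lam=0.01, eta=0.1, iters=2000):
    """Sketch of LIME-style weights: linearly interpolate x from its neighbors
    while favoring high-entropy (near-uniform) weights. Exponentiated-gradient
    updates keep w nonnegative and summing to one."""
    X = np.asarray(neighbors, dtype=float)            # one neighbor per row
    k = X.shape[0]
    w = np.full(k, 1.0 / k)                           # start at maximum entropy
    for _ in range(iters):
        resid = X.T @ w - x                           # interpolation error
        grad = 2 * X @ resid + lam * (1 + np.log(w))  # squared error + entropy penalty
        w = w * np.exp(-eta * grad)                   # multiplicative update
        w /= w.sum()                                  # renormalize onto the simplex
    return w
```

With a small entropy weight `lam`, the result is close to the exact interpolation weights; increasing `lam` smooths the weights toward uniform.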


  35. Kernelize Linear Interpolation (Chen et al., JMLR 2009)

  36. Kernelize Linear Interpolation: the added penalty term regularizes the variance of the weights.

  37. Kernelize Linear Interpolation: the solution needs only inner products, which can be replaced by kernel values or similarities!

  38. KRI Weights Satisfy Design Goals: kernel ridge interpolation (KRI) weights.

  39. KRI Weights Satisfy Design Goals: the KRI weights satisfy affinity.

  40. KRI Weights Satisfy Design Goals: the KRI weights satisfy diversity.


  42. KRI Weights Satisfy Design Goals: removing the constraints on the KRI weights can be shown to be equivalent to local ridge regression, yielding KRR weights.
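The closed form behind that equivalence can be sketched as follows: dropping the simplex constraints from the kernelized interpolation objective while keeping a ridge penalty gives an unconstrained quadratic, min_w ||φ(x) − Σ_i w_i φ(x_i)||² + λ||w||², whose standard minimizer is w = (K + λI)⁻¹ k_x. A small numpy sketch (the function name and λ value are illustrative):

```python
import numpy as np

def krr_weights(K, k_x, lam=1.0):
    """Unconstrained kernel ridge regression (KRR) weights.

    K[i, j] = psi(x_i, x_j)  (similarities among the k neighbors)
    k_x[i]  = psi(x, x_i)    (similarities of the test point to them)
    Returns w = (K + lam * I)^{-1} k_x.
    """
    K = np.asarray(K, dtype=float)
    return np.linalg.solve(K + lam * np.eye(K.shape[0]),
                           np.asarray(k_x, dtype=float))
```

Unlike the KRI weights, these can be negative and need not sum to one, which is the price of the closed form.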

  43. Weighted k-NN: Example 1 (KRI weights vs. KRR weights)

  44. Weighted k-NN: Example 2 (KRI weights vs. KRR weights)

  45. Weighted k-NN: Example 3 (KRI weights vs. KRR weights)
