Learning multiple nonredundant clusterings
Presentation Transcript
Learning multiple nonredundant clusterings

Presenter: Wei-Hao Huang

Authors: Ying Cui, Xiaoli Z. Fern, Jennifer G. Dy

TKDD, 2010


Outlines

  • Motivation

  • Objectives

  • Methodology

  • Experiments

  • Conclusions

  • Comments


Motivation

  • Data often admit multiple groupings that are reasonable and interesting from different perspectives.

  • Traditional clustering is restricted to finding a single clustering.


Objectives

  • To propose a new clustering paradigm for finding all non-redundant clustering solutions of the data.


Methodology

  • Orthogonal clustering

    • Cluster space

  • Clustering in orthogonal subspaces

    • Feature space

  • Automatically finding the number of clusters

  • Stopping criteria


Orthogonal Clustering Framework

[Figure: the framework illustrated on the Face dataset, starting from the data matrix X.]


Orthogonal clustering

Residue space: after clustering $X^{(t)}$, each point is replaced by its component orthogonal to its own cluster centroid,
$x_i^{(t+1)} = \left(I - \frac{\mu^{(t)}\mu^{(t)\top}}{\mu^{(t)\top}\mu^{(t)}}\right) x_i^{(t)}$,
where $\mu^{(t)}$ is the centroid of the cluster containing $x_i$ at iteration $t$.
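As a concrete illustration (not part of the original slides), below is a minimal Python sketch of this Method 1 loop, assuming k-means as the base clusterer; the function name orthogonal_clustering and its arguments are our own.

```python
import numpy as np
from sklearn.cluster import KMeans

def orthogonal_clustering(X, k, n_views=3):
    """Sketch of Method 1: cluster, then move each point into the residue
    space orthogonal to its own cluster centroid, and repeat."""
    views = []
    X_t = np.asarray(X, dtype=float).copy()
    for _ in range(n_views):
        km = KMeans(n_clusters=k, n_init=10).fit(X_t)
        views.append(km.labels_)
        residue = np.empty_like(X_t)
        for j in range(k):
            mu = km.cluster_centers_[j]
            mask = km.labels_ == j
            denom = mu @ mu
            if denom < 1e-12:
                # Degenerate (near-zero) centroid: nothing to project out.
                residue[mask] = X_t[mask]
                continue
            # x <- (I - mu mu^T / (mu^T mu)) x : projection onto the
            # orthogonal complement of the point's own centroid.
            residue[mask] = X_t[mask] - np.outer(X_t[mask] @ mu, mu) / denom
        X_t = residue
    return views
```

Each entry of the returned list is one clustering view; later views are found in the residue of the earlier ones, which is what makes them nonredundant.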


Clustering in orthogonal subspaces

Projection: $Y = A^{\top} X$

  • Feature space

    • linear discriminant analysis (LDA)

    • singular value decomposition (SVD)

    • LDA vs. SVD: LDA chooses $A$ to maximize $\operatorname{tr}\left((A^{\top} S_w A)^{-1} A^{\top} S_b A\right)$, where $S_b$ and $S_w$ are the between-cluster and within-cluster scatter matrices; SVD of the centroid matrix uses only the between-cluster structure (see the sketch below).
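As an illustration of how the projection $Y = A^{\top}X$ can be formed, here is a small SVD-based sketch (the LDA alternative is not shown); centroid_subspace is our own name and the tolerance is a placeholder.

```python
import numpy as np

def centroid_subspace(centroids):
    """Orthonormal basis A for the span of the cluster centroids, taken from
    the SVD of the centroid matrix M (features x clusters)."""
    M = np.asarray(centroids).T                  # d x k, columns are centroids
    U, s, _ = np.linalg.svd(M, full_matrices=False)
    r = int(np.sum(s > 1e-10))                   # drop numerically null directions
    return U[:, :r]                              # A: d x r with A^T A = I

# Projection onto the centroid subspace, in the slide's notation (X is d x n):
#   A = centroid_subspace(kmeans.cluster_centers_)
#   Y = A.T @ X
```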


Clustering in orthogonal subspaces

$A^{(t)}$ = eigenvectors of $M^{(t)} M^{(t)\top}$, where $M^{(t)}$ is the matrix whose columns are the cluster centroids at iteration $t$.

Residue space: $X^{(t+1)} = \left(I - A^{(t)}\left(A^{(t)\top}A^{(t)}\right)^{-1}A^{(t)\top}\right) X^{(t)}$
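The sketch below (ours, not the paper's code) shows one Method-2 iteration under the assumption that $A$ has orthonormal columns, so $(A^{\top}A)^{-1} = I$; data are handled as row vectors to match scikit-learn.

```python
import numpy as np
from sklearn.cluster import KMeans

def next_residue_view(X, k):
    """One Method-2 iteration (sketch): cluster X (n x d), build A from the
    centroid matrix, then project the data onto the subspace orthogonal to A."""
    km = KMeans(n_clusters=k, n_init=10).fit(X)
    M = km.cluster_centers_.T                      # d x k centroid matrix
    # Columns of A are the left singular vectors of M, i.e. eigenvectors of M M^T.
    A, _, _ = np.linalg.svd(M, full_matrices=False)
    # A has orthonormal columns, so the residue projection X (I - A A^T)
    # simplifies as below for row-vector data.
    X_next = X - (X @ A) @ A.T
    return km.labels_, X_next

# labels_1, X2 = next_residue_view(X, k=3)   # first view and its residue data
# labels_2, X3 = next_residue_view(X2, k=3)  # second, nonredundant view
```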


Compare Method 1 and Method 2

$A^{(t)}$ = eigenvectors of $M M^{\top}$

  • Residue space

    • Method 1: projects orthogonally to the centroid matrix $M'$, $P_1 = I - M'\left(M'^{\top} M'\right)^{-1} M'^{\top}$

    • Method 2: projects orthogonally to $A$, $P_2 = I - A\left(A^{\top} A\right)^{-1} A^{\top}$

  • If $M' = M$, then $P_1 = P_2$.

  • Method 1 is a special case of Method 2 (see the numerical check below).
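A quick numerical check of this equivalence (our own illustration, not from the paper): when $A$ keeps the full column span of $M$, the two residue projections coincide.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((10, 3))   # d x k centroid matrix, full column rank

# Method 1 style projection: orthogonal to the columns of M.
P1 = np.eye(10) - M @ np.linalg.inv(M.T @ M) @ M.T

# Method 2 style projection: orthogonal to A, the eigenvectors of M M^T
# (left singular vectors of M), keeping the full span of the centroids.
A, _, _ = np.linalg.svd(M, full_matrices=False)
P2 = np.eye(10) - A @ A.T

print(np.allclose(P1, P2))   # True: same column span, hence the same projection
```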


Experiments

  • PCA is used to reduce the dimensionality of the data (see the pipeline sketch after this list)

  • Clustering

    • K-means clustering

      • keep the run with the smallest SSE

    • Gaussian mixture model clustering (GMM)

      • keep the run with the largest likelihood

  • Dataset

    • Synthetic

    • Real-world

      • Face, WebKB text, Vowel phoneme, Digit
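For concreteness, here is a sketch of this pipeline (PCA, then k-means keeping the smallest-SSE run, or GMM keeping the highest-likelihood run). The function name, the PCA dimension of 30, and the restart count are placeholders, not the paper's settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

def cluster_once(X, k, n_pca=30, n_restarts=10, model="kmeans", seed=0):
    """Sketch of the experimental pipeline: PCA, then the base clusterer."""
    X_red = PCA(n_components=min(n_pca, X.shape[1])).fit_transform(X)
    if model == "kmeans":
        # n_init restarts; scikit-learn keeps the run with the smallest SSE (inertia).
        km = KMeans(n_clusters=k, n_init=n_restarts, random_state=seed).fit(X_red)
        return km.labels_, km.inertia_
    # GMM: n_init restarts; scikit-learn keeps the run with the highest likelihood.
    gm = GaussianMixture(n_components=k, n_init=n_restarts, random_state=seed).fit(X_red)
    return gm.predict(X_red), gm.score(X_red)
```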


Experiments

Evaluation


Experiments

Synthetic


Experiments

Face dataset


Experiments

WebKB dataset

Vowel phoneme dataset


Experiments

Digit dataset


Experiments

  • Finding the number of clusters

    • K-means → Gap statistic (see the sketch below)
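Below is a compact sketch of the gap statistic for choosing k with k-means (our illustration, not the paper's exact procedure): reference data are drawn uniformly from the bounding box of X, and the standard-error tie-breaking rule of Tibshirani et al. is omitted for brevity.

```python
import numpy as np
from sklearn.cluster import KMeans

def gap_statistic(X, k_max=10, n_refs=10, seed=0):
    """Return the k in 1..k_max with the largest gap = E[log W_ref] - log W."""
    rng = np.random.default_rng(seed)
    lo, hi = X.min(axis=0), X.max(axis=0)

    def log_w(data, k):
        # Within-cluster dispersion W is k-means' SSE (inertia).
        return np.log(KMeans(n_clusters=k, n_init=10, random_state=seed).fit(data).inertia_)

    gaps = []
    for k in range(1, k_max + 1):
        ref = np.mean([log_w(rng.uniform(lo, hi, size=X.shape), k)
                       for _ in range(n_refs)])
        gaps.append(ref - log_w(X, k))
    return int(np.argmax(gaps)) + 1
```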


Experiments

  • Finding the number of clusters

  • GMMBIC

  • Stopping Criteria

    • SSE is less than 10% at first iteration

    • Kopt=1

    • Kopt> Kmax Select Kmax

    • Gap statistics

    • BIC Maximize value of BIC


Experiments

Synthetic dataset


Experiments

Face dataset


Experiments

WebKB dataset


Conclusions

  • The proposed framework discovers varied, interesting, and meaningful clustering solutions.

  • Method 2 can be combined with any clustering and dimensionality-reduction algorithm.


Comments

  • Advantages

    • Finds multiple non-redundant clustering solutions

  • Applications

    • Data clustering

