Discriminative Training of Chow-Liu tree Multinet Classifiers

Huang, Kaizhu

Dept. of Computer Science and Engineering,

CUHK


Outline

  • Background

    • Classifiers

      • Discriminative classifiers

      • Generative classifiers

        • Bayesian Multinet Classifiers

  • Motivation

  • Discriminative Bayesian Multinet Classifiers

  • Experiments

  • Conclusion


Discriminative Classifiers

[Figure: SVM decision boundary]

  • Directly maximize a discriminative function (e.g., an SVM maximizes the classification margin)
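A minimal sketch of this idea, using a perceptron as an illustrative stand-in for the SVM pictured on the slide (all names below are my own, not from the presentation): the classifier adjusts a decision function f(x) = w·x + b directly, without modeling how x is distributed.

```python
import numpy as np

def train_perceptron(X, y, epochs=100, lr=0.1):
    """X: (n, d) features; y: labels in {-1, +1}.
    Directly optimizes the decision function, not a density model."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified: nudge the boundary
                w += lr * yi * xi
                b += lr * yi
    return w, b

def predict(w, b, X):
    return np.sign(X @ w + b)
```

On linearly separable data this update rule is guaranteed to converge; an SVM additionally maximizes the margin of the resulting boundary.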


Generative Classifiers

[Figure: class-conditional densities P1(x|C1) and P2(x|C2)]

  • Estimate the distribution for each class, and then use Bayes' rule to perform classification
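As a sketch of this two-step recipe (the diagonal-Gaussian class-conditional form below is an assumption for illustration, not the model from the slides): fit one density per class, then classify with Bayes' rule.

```python
import numpy as np

def fit_class_gaussians(X, y):
    """Estimate a diagonal-Gaussian P_c(x) and a prior P(C=c) per class."""
    models = {}
    for c in np.unique(y):
        Xc = X[y == c]
        models[c] = {
            "prior": len(Xc) / len(X),
            "mean": Xc.mean(axis=0),
            "var": Xc.var(axis=0) + 1e-6,  # smoothing avoids zero variance
        }
    return models

def log_gaussian(x, mean, var):
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def classify(models, x):
    # Bayes' rule in log space: argmax_c  log P(C=c) + log P(x | C=c)
    return max(models, key=lambda c: np.log(models[c]["prior"])
               + log_gaussian(x, models[c]["mean"], models[c]["var"]))
```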


Comparison

Example of Missing Information:

From left to right: Original digit, Cropped and resized digit, 50% missing digit, 75% missing digit, and occluded digit.


Comparison (Continued)

  • Discriminative classifiers cannot easily deal with missing-information problems.

  • Generative classifiers provide a principled way to handle missing-information problems.

  • When some features of x are missing, we can use the marginalized P1 and P2 to perform classification.
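A hedged sketch of why marginalization works: if each class model stores the full joint P_c(x1, x2) as a table (binary variables here, purely for illustration; the tables P1 and P2 below are made up, not from the slides), an unobserved variable is simply summed out before applying Bayes' rule.

```python
import numpy as np

P1 = np.array([[0.5, 0.2],   # illustrative P1[x1, x2] for class 1
               [0.1, 0.2]])
P2 = np.array([[0.1, 0.2],   # illustrative P2[x1, x2] for class 2
               [0.5, 0.2]])

def classify_with_missing(x1=None, x2=None, priors=(0.5, 0.5)):
    """Each argument is 0/1, or None if that feature is missing."""
    scores = []
    for P, prior in zip((P1, P2), priors):
        if x1 is not None and x2 is not None:
            p = P[x1, x2]
        elif x1 is not None:          # x2 missing: sum it out
            p = P[x1, :].sum()
        elif x2 is not None:          # x1 missing: sum it out
            p = P[:, x2].sum()
        else:                         # nothing observed
            p = 1.0
        scores.append(prior * p)
    return int(np.argmax(scores)) + 1  # returns class 1 or 2
```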


Handling the Missing-Information Problem

[Figure: classification under missing information, comparing SVM with TJT, a generative model]


Motivation

  • It seems that a good classifier should combine the strategies of discriminative classifiers and generative classifiers.

  • Our work trains one type of generative classifier, the Bayesian Multinet Classifier, in a discriminative way.


Roadmap of Our Work


How Our Work Relates to Other Work

[Diagram: two labeled arrows between discriminative classifiers and generative classifiers (e.g., HMMs and GMMs), indicating the direction of each training approach]

  • Jaakkola and Haussler, NIPS 98

    Difference: our method performs the reverse process, from generative classifiers to discriminative classifiers.

  • Beaufays et al., ICASSP 99; Hastie et al., JRSS 96

    Difference: our method is designed for Bayesian Multinet Classifiers, a more general classifier.


Problems of Bayesian Multinet Classifiers

[Diagram of the standard training procedure:]

  • Split the pre-classified dataset into sub-dataset D1 for Class 1 and sub-dataset D2 for Class 2.

  • Estimate the distribution P1 to approximate D1 accurately, and the distribution P2 to approximate D2 accurately.

  • Use Bayes' rule to perform classification.

Comments: This framework discards the divergence information between classes.
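In this framework the per-class distributions P1 and P2 are Chow-Liu trees. A minimal sketch of how one such tree is learned from a sub-dataset (the standard algorithm: pairwise mutual information followed by a maximum-weight spanning tree; helper names are my own):

```python
import numpy as np
from itertools import combinations

def mutual_information(xi, xj):
    """Empirical mutual information (in nats) between two discrete columns."""
    mi = 0.0
    for a in np.unique(xi):
        for b in np.unique(xj):
            p_ab = np.mean((xi == a) & (xj == b))
            p_a, p_b = np.mean(xi == a), np.mean(xj == b)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def chow_liu_tree(X):
    """X: (n, d) discrete data; returns the edge list of the max-MI tree."""
    d = X.shape[1]
    edges = sorted(((mutual_information(X[:, i], X[:, j]), i, j)
                    for i, j in combinations(range(d), 2)), reverse=True)
    parent = list(range(d))        # union-find over variables
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    tree = []
    for w, i, j in edges:          # Kruskal: greedily keep heaviest edges
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

Fitting one tree per sub-dataset (D1 and D2) yields exactly the maximum-likelihood generative models this slide describes; the point of the slide is that doing so ignores the other class entirely.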


Our Training Scheme


Mathematical Explanation

  • Bayesian Multinet Classifiers (BMC)

  • Discriminative Training of BMC
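The equations on this slide did not survive the transcript. As a hedged reconstruction: the BMC decision rule is standard Bayes-rule classification over the per-class models, and one natural form of the discriminative objective (an assumption on my part; the slide's exact formula is not preserved) rewards each sub-dataset for being more likely under its own model than under the competing one:

```latex
% Bayesian Multinet Classifier (two-class case): classify by Bayes' rule
\[
  c^{*} \;=\; \operatorname*{arg\,max}_{c \in \{1,2\}} \; P(C = c)\, P_c(\mathbf{x})
\]

% A plausible discriminative training objective (reconstruction, not verbatim):
% jointly fit P_1 and P_2 so each sub-dataset is better explained by its own model
\[
  \max_{P_1,\, P_2} \;
  \sum_{\mathbf{x} \in D_1} \log \frac{P_1(\mathbf{x})}{P_2(\mathbf{x})}
  \;+\;
  \sum_{\mathbf{x} \in D_2} \log \frac{P_2(\mathbf{x})}{P_1(\mathbf{x})}
\]
```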



Finding P1 and P2


Finding P1 and P2 (Continued)


Experimental Setup

  • Datasets

    • Two benchmark datasets from the UCI machine learning repository

      • Tic-tac-toe

      • Vote

  • Experimental Environment

    • Platform: Windows 2000

    • Development tool: Matlab 6.5


Error Rate



Conclusion

  • A discriminative training procedure for generative Bayesian Multinet Classifiers is presented.

  • This approach significantly improves the recognition rate on two benchmark datasets.

  • A theoretical exploration of the convergence behavior of this approach is underway.

