Bayesian Decision Theory


Presentation Transcript



Bayesian Decision Theory

COMPE 467 - Pattern Recognition



  • Covariance matrices for all of the classes are identical, but otherwise arbitrary: Σi = Σ.



In this case the terms (d/2) ln 2π and ½ ln|Σ| are independent of i.

So the discriminant functions reduce to linear functions of x.
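The equations on the original slide are images; the standard result for the identical-covariance case (reconstructed from Duda, Hart & Stork, Ch. 2) is, after also dropping the term -½ xᵀΣ⁻¹x, which is the same for every class:

```latex
\begin{aligned}
g_i(\mathbf{x}) &= -\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{T}\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu}_i) + \ln P(\omega_i)\\
                &= \mathbf{w}_i^{T}\mathbf{x} + w_{i0},
\qquad \mathbf{w}_i = \Sigma^{-1}\boldsymbol{\mu}_i,
\qquad w_{i0} = -\tfrac{1}{2}\boldsymbol{\mu}_i^{T}\Sigma^{-1}\boldsymbol{\mu}_i + \ln P(\omega_i)
\end{aligned}
```

Because the discriminants are linear, the resulting decision boundaries are hyperplanes.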



  • Covariance matrices are different for each category.



Only the term (d/2) ln 2π is independent of i.

So the discriminant functions remain quadratic in x.
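The equation on the original slide is an image; the standard quadratic discriminant for arbitrary Σi (reconstructed from Duda, Hart & Stork, Ch. 2) is:

```latex
g_i(\mathbf{x}) = -\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{T}\Sigma_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i)
-\tfrac{1}{2}\ln|\Sigma_i| + \ln P(\omega_i)
```

Here the decision boundaries are hyperquadrics (lines, ellipses, parabolas, or hyperbolas in two dimensions).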



EXAMPLE:





EXAMPLE (cont.):

Find the discriminant function for the first class.




EXAMPLE (cont.):

Similarly, find the discriminant function for the second class.



EXAMPLE (cont.):

The decision boundary:





EXAMPLE (cont.):

Using MATLAB we can draw the decision boundary:

>> s = 'x^2-10*x-4*x*y+8*y-1+2*log(2)';
>> ezplot(s)
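For readers without MATLAB, the same boundary can be checked numerically; a minimal Python sketch (the boundary expression is copied from the slide):

```python
import math

def g(x, y):
    # Difference of the two discriminant functions; the decision boundary
    # from the slide is g(x, y) = 0:
    #   x^2 - 10x - 4xy + 8y - 1 + 2 ln 2 = 0
    return x**2 - 10*x - 4*x*y + 8*y - 1 + 2*math.log(2)

# The sign of g(x, y) tells us on which side of the boundary a point lies:
print(g(0.0, 0.0) > 0)  # True  (g is about 0.386)
print(g(5.0, 0.0) > 0)  # False (g is about -24.6)
```

Points where g(x, y) > 0 are assigned to the first class and points where g(x, y) < 0 to the second.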






Error Probabilities and Integrals
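The equations on these slides are images in the original; the standard two-class error expression they present (Duda, Hart & Stork, Ch. 2) is:

```latex
P(\text{error}) = P(\mathbf{x}\in\mathcal{R}_2,\,\omega_1) + P(\mathbf{x}\in\mathcal{R}_1,\,\omega_2)
= \int_{\mathcal{R}_2} p(\mathbf{x}\mid\omega_1)P(\omega_1)\,d\mathbf{x}
+ \int_{\mathcal{R}_1} p(\mathbf{x}\mid\omega_2)P(\omega_2)\,d\mathbf{x}
```

The Bayes decision rule minimizes this quantity by assigning each x to the region of the class with the larger posterior.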





Receiver Operating Characteristics

  • The ROC curve provides another measure of the distance between two Gaussian distributions.

  • It has found great use in medicine, radar detection, and other fields.





  • If both the diagnosis and the test are positive, it is called a true positive (TP). The probability of a TP is estimated by counting the true positives in the sample and dividing by the sample size.

  • If the diagnosis is positive and the test is negative, it is called a false negative (FN).

  • False positive (FP) and true negative (TN) are defined similarly.



  • The values described above are used to calculate different measures of the quality of the test.

  • The first one is sensitivity (SE): the probability of a positive test among the patients who have a positive diagnosis.



  • Specificity (SP) is the probability of a negative test among the patients who have a negative diagnosis.
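Sensitivity and specificity can be computed directly from the confusion counts defined above; a minimal sketch (the counts are hypothetical, for illustration only):

```python
# Hypothetical confusion counts for a diagnostic test (illustrative numbers)
TP, FN, FP, TN = 90, 10, 20, 80

# Sensitivity: P(test positive | diagnosis positive)
sensitivity = TP / (TP + FN)

# Specificity: P(test negative | diagnosis negative)
specificity = TN / (TN + FP)

print(sensitivity)  # 0.9
print(specificity)  # 0.8
```

Sweeping the decision threshold and plotting sensitivity against 1 - specificity traces out the ROC curve.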





  • Overlap in distributions:



Bayes Decision Theory – Discrete Features

  • Assume x = (x1, …, xd)ᵀ takes only m discrete values.

  • The components of x are binary or integer valued, so x can take only one of the m discrete values

    v1, v2, …, vm



Bayes Decision Theory – Discrete Features (Decision Strategy)

  • Maximize the class posterior using the Bayes decision rule:

    P(ωj | x) = P(x | ωj) P(ωj) / Σk P(x | ωk) P(ωk)

    Decide ωi if P(ωi | x) > P(ωj | x) for all j ≠ i

    or

  • Minimize the overall risk by selecting the action αi = α*:

    α* = arg mini R(αi | x)
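As an illustration, the maximum-posterior rule can be applied to a toy discrete problem (all probabilities below are hypothetical):

```python
# Two classes; x takes one of m = 3 discrete values v1, v2, v3.
# Class-conditional probabilities P(x = v | w_j), hypothetical numbers:
likelihood = {
    "w1": {"v1": 0.6, "v2": 0.3, "v3": 0.1},
    "w2": {"v1": 0.1, "v2": 0.4, "v3": 0.5},
}
prior = {"w1": 0.5, "w2": 0.5}

def posterior(x):
    # Bayes rule: P(w_j | x) = P(x | w_j) P(w_j) / sum_k P(x | w_k) P(w_k)
    evidence = sum(likelihood[w][x] * prior[w] for w in prior)
    return {w: likelihood[w][x] * prior[w] / evidence for w in prior}

def decide(x):
    # Decide the class with the maximum posterior
    post = posterior(x)
    return max(post, key=post.get)

print(decide("v1"))  # w1
print(decide("v3"))  # w2
```

Because x ranges over finitely many values, the integrals of the continuous case become sums over v1, …, vm.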



Bayes Decision Theory – Discrete Features (Independent binary features in 2-category problem)
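The slides in this section are equation images in the original; the standard result they present (Duda, Hart & Stork, Ch. 2) is reconstructed here. For independent binary features xi with pi = P(xi = 1 | ω1) and qi = P(xi = 1 | ω2), the discriminant is linear in the xi:

```latex
g(\mathbf{x}) = \sum_{i=1}^{d} w_i x_i + w_0,\qquad
w_i = \ln\frac{p_i(1-q_i)}{q_i(1-p_i)},\qquad
w_0 = \sum_{i=1}^{d}\ln\frac{1-p_i}{1-q_i} + \ln\frac{P(\omega_1)}{P(\omega_2)}
```

Decide ω1 if g(x) > 0 and ω2 otherwise; each weight wi measures how strongly feature i favors one class over the other.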




Bayes Decision Theory – Discrete Features (Independent binary features in 2-category problem) EXAMPLE





APPLICATION EXAMPLE

Bit-matrix for machine-printed characters: each character is represented as a grid of pixels, and each pixel may be taken as a binary feature.

For the above problem, pi is the probability that pixel i is on for the letter A, B, …

[Figure: example bit-matrices for the letters A and B]

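This per-pixel model is an independent-binary-feature (naive Bayes) classifier; a minimal sketch with tiny hypothetical 2×2 bit-matrices (all probabilities are made up for illustration):

```python
import math

# Hypothetical per-pixel probabilities P(pixel = 1 | letter) for a 2x2 bit-matrix
p = {
    "A": [0.9, 0.8, 0.7, 0.9],
    "B": [0.2, 0.9, 0.8, 0.1],
}
prior = {"A": 0.5, "B": 0.5}

def log_posterior(x, letter):
    # Independent-pixel (naive Bayes) log-likelihood plus log-prior
    ll = sum(math.log(pi) if xi else math.log(1 - pi)
             for xi, pi in zip(x, p[letter]))
    return ll + math.log(prior[letter])

def classify(x):
    # Decide the letter with the maximum posterior
    return max(p, key=lambda letter: log_posterior(x, letter))

print(classify([1, 1, 1, 1]))  # A
print(classify([0, 1, 1, 0]))  # B
```

Working in log space avoids underflow when the bit-matrix has many pixels.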


Summary

  • To minimize the overall risk, choose the action that minimizes the conditional risk R(αi | x).

  • To minimize the probability of error, choose the class that maximizes the posterior probability P(ωj | x).

  • If there are different penalties for misclassifying patterns from different classes, the posteriors must be weighted according to those penalties before taking action.

  • Do not forget that these decisions are optimal only under the assumption that the “true” values of the probabilities are known.



References

  • R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, New York: John Wiley, 2001.

  • Selim Aksoy, Pattern Recognition Course Materials, 2011.

  • M. Narasimha Murty and V. Susheela Devi, Pattern Recognition: An Algorithmic Approach, Springer, 2011.

  • “Bayes Decision Rule Discrete Features”, http://www.cim.mcgill.ca/~friggi/bayes/decision/binaryfeatures.html, accessed 2011.

