Bayesian Decision Theory

COMPE 467 - Pattern Recognition

The form of the discriminant functions depends on the class covariance matrices:

  • Σi = σ²I: the covariance matrices are identical and independent of i, so the discriminants reduce to distance-based linear functions.
  • Σi = Σ: the covariance matrices for all of the classes are identical but otherwise arbitrary; the discriminants are again linear.
  • Σi arbitrary: the covariance matrices are different for each category, and the decision boundaries become quadratic.
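The slide equations for these cases follow the standard Gaussian discriminant forms (as in Duda, Hart & Stork); a sketch of the two that matter here:

```latex
% General Gaussian discriminant for p(x | \omega_i) = N(\mu_i, \Sigma_i),
% dropping the class-independent term (d/2)\ln 2\pi:
g_i(\mathbf{x}) = -\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{T}\Sigma_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i)
                  - \tfrac{1}{2}\ln|\Sigma_i| + \ln P(\omega_i)

% Case \Sigma_i = \Sigma (identical, arbitrary): the quadratic term
% \mathbf{x}^{T}\Sigma^{-1}\mathbf{x} is common to all classes and cancels,
% leaving a linear discriminant:
g_i(\mathbf{x}) = \mathbf{w}_i^{T}\mathbf{x} + w_{i0}, \qquad
\mathbf{w}_i = \Sigma^{-1}\boldsymbol{\mu}_i, \qquad
w_{i0} = -\tfrac{1}{2}\boldsymbol{\mu}_i^{T}\Sigma^{-1}\boldsymbol{\mu}_i + \ln P(\omega_i)
```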
EXAMPLE (cont.):

Find the discriminant function for the first class.

EXAMPLE (cont.):

Similarly, find the discriminant function for the second class.

EXAMPLE (cont.):

The decision boundary:

EXAMPLE (cont.):

Using MATLAB we can draw the decision boundary:

(to draw the decision boundary in MATLAB)

>> s = 'x^2 - 10*x - 4*x*y + 8*y - 1 + 2*log(2)';

>> ezplot(s)
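The same implicit boundary can be checked without MATLAB. A small Python sketch (using NumPy) solves the boundary equation for y and verifies that the resulting points satisfy g(x, y) = 0:

```python
import numpy as np

# Decision boundary from the slide: g(x, y) = x^2 - 10x - 4xy + 8y - 1 + 2 ln 2 = 0
def g(x, y):
    return x**2 - 10*x - 4*x*y + 8*y - 1 + 2*np.log(2)

# Solving g(x, y) = 0 for y (valid wherever 4x - 8 != 0):
#   y = (x^2 - 10x - 1 + 2 ln 2) / (4x - 8)
def boundary_y(x):
    return (x**2 - 10*x - 1 + 2*np.log(2)) / (4*x - 8)

xs = np.array([0.0, 1.0, 3.0, 5.0])   # avoid the singularity at x = 2
ys = boundary_y(xs)
print(np.allclose(g(xs, ys), 0.0))    # True: every (x, y) pair lies on the boundary
```

For an actual plot, `matplotlib.pyplot.contour` of g over a grid at level 0 plays the role of `ezplot`.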

Receiver Operating Characteristics
  • Provides another measure of the separation between two Gaussian distributions.
  • It has found great use in medicine, radar detection, and other fields.
Receiver Operating Characteristics
  • If both diagnosis and test are positive, it is called a true positive (TP). The probability of a TP is estimated by counting the true positives in the sample and dividing by the sample size.
  • If the diagnosis is positive and the test is negative, it is called a false negative (FN).
  • False positive (FP) and true negative (TN) are defined similarly.
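The counting described above can be sketched in a few lines of Python; the paired (diagnosis, test) outcomes below are invented purely for illustration:

```python
# Each sample is a (diagnosis, test) pair; 1 = positive, 0 = negative.
# Hypothetical data, for illustration only.
samples = [
    (1, 1), (1, 1), (1, 0),   # diagnosis positive
    (0, 1), (0, 0), (0, 0),   # diagnosis negative
]

tp = sum(1 for d, t in samples if d == 1 and t == 1)  # true positives
fn = sum(1 for d, t in samples if d == 1 and t == 0)  # false negatives
fp = sum(1 for d, t in samples if d == 0 and t == 1)  # false positives
tn = sum(1 for d, t in samples if d == 0 and t == 0)  # true negatives

n = len(samples)
print(tp, fn, fp, tn)   # 2 1 1 2
print(tp / n)           # estimated P(TP): counts divided by sample size
```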
Receiver Operating Characteristics
  • The values described are used to calculate different measurements of the quality of the test.
  • The first one is sensitivity, SE = TP / (TP + FN), which is the probability of having a positive test among the patients who have a positive diagnosis.
Receiver Operating Characteristics
  • Specificity, SP = TN / (TN + FP), is the probability of having a negative test among the patients who have a negative diagnosis.
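Both measures follow directly from the four counts; a minimal sketch (the counts are assumed values):

```python
def sensitivity(tp, fn):
    # SE = TP / (TP + FN): P(test positive | diagnosis positive)
    return tp / (tp + fn)

def specificity(tn, fp):
    # SP = TN / (TN + FP): P(test negative | diagnosis negative)
    return tn / (tn + fp)

# Hypothetical counts: 40 TP, 10 FN, 45 TN, 5 FP
print(sensitivity(40, 10))  # 0.8
print(specificity(45, 5))   # 0.9
```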
Receiver Operating Characteristics
  • Overlap in the two class-conditional distributions is what produces FP and FN errors; moving the decision threshold trades sensitivity against specificity, which is exactly what the ROC curve traces out.
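This trade-off can be sketched for two overlapping 1-D Gaussians; the means and common standard deviation below are assumed values chosen only to show the effect of sweeping the threshold:

```python
import math

# Two overlapping 1-D Gaussian class distributions (assumed parameters).
mu_neg, mu_pos, sigma = 0.0, 2.0, 1.0

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def roc_point(threshold):
    # Decide "positive" when x > threshold.
    tpr = 1.0 - norm_cdf((threshold - mu_pos) / sigma)  # sensitivity
    fpr = 1.0 - norm_cdf((threshold - mu_neg) / sigma)  # 1 - specificity
    return fpr, tpr

for t in (-1.0, 1.0, 3.0):
    fpr, tpr = roc_point(t)
    print(f"threshold={t:+.1f}  FPR={fpr:.3f}  TPR={tpr:.3f}")
```

Lowering the threshold raises both TPR and FPR; plotting the (FPR, TPR) pairs for all thresholds gives the ROC curve.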
Bayes Decision Theory – Discrete Features
  • Assume the components of x = (x1, …, xd) are binary or integer valued, so that x can take only one of m discrete values

v1, v2, …, vm

Bayes Decision Theory – Discrete Features (Decision Strategy)
  • Maximize the class posterior using the Bayes decision rule:

P(ωj | x) = P(x | ωj) P(ωj) / Σj P(x | ωj) P(ωj)

Decide ωi if P(ωi | x) > P(ωj | x) for all j ≠ i

or

  • Minimize the overall risk by selecting the action α*:

α* = arg mini R(αi | x)

Bayes Decision Theory – Discrete Features (Independent binary features in 2-category problem)
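The equations for this case follow the standard result (Duda, Hart & Stork). Writing pi = P(xi = 1 | ω1) and qi = P(xi = 1 | ω2) for conditionally independent binary features, the discriminant is linear in the xi:

```latex
g(\mathbf{x}) = \sum_{i=1}^{d} w_i x_i + w_0, \qquad
w_i = \ln \frac{p_i (1 - q_i)}{q_i (1 - p_i)}, \qquad
w_0 = \sum_{i=1}^{d} \ln \frac{1 - p_i}{1 - q_i} + \ln \frac{P(\omega_1)}{P(\omega_2)}

% Decide \omega_1 if g(\mathbf{x}) > 0, otherwise decide \omega_2.
```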
Bayes Decision Theory – Discrete Features (Independent binary features in 2-category problem) EXAMPLE
APPLICATION EXAMPLE

Bit-matrix for machine-printed characters: each character (the slide shows bit-matrices for the letters A and B) is stored as a grid of binary pixels.

Here, each pixel may be taken as a feature xi.

For the above problem, pi is the probability that pixel i is on for letter A, qi the corresponding probability for letter B, …
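Putting the binary-feature discriminant to work on this application can be sketched as follows; the per-pixel probabilities are invented for illustration (a real system would estimate them from training samples of each letter):

```python
import math

# Classifying a 4-pixel bit-matrix as letter A or B using independent
# binary features. All probabilities below are hypothetical.
p = [0.9, 0.8, 0.2, 0.7]   # P(pixel i on | A)
q = [0.2, 0.3, 0.9, 0.6]   # P(pixel i on | B)
prior_A, prior_B = 0.5, 0.5

# Weights of the linear discriminant g(x) = sum_i w[i]*x[i] + w0.
w = [math.log(p[i] * (1 - q[i]) / (q[i] * (1 - p[i]))) for i in range(4)]
w0 = sum(math.log((1 - p[i]) / (1 - q[i])) for i in range(4)) \
     + math.log(prior_A / prior_B)

def classify(x):
    g = sum(w[i] * x[i] for i in range(4)) + w0
    return "A" if g > 0 else "B"

print(classify([1, 1, 0, 1]))  # pixels matching A's profile -> A
print(classify([0, 0, 1, 1]))  # pixels matching B's profile -> B
```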

Summary
  • To minimize the overall risk, choose the action that minimizes the conditional risk R(αi | x).
  • To minimize the probability of error, choose the class that maximizes the posterior probability P(ωj | x).
  • If there are different penalties for misclassifying patterns from different classes, the posteriors must be weighted according to such penalties before taking action.
  • Do not forget that these decisions are the optimal ones under the assumption that the "true" values of the probabilities are known.
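The first and third bullets together can be sketched numerically: with unequal penalties, the minimum-risk action can differ from the maximum-posterior class. The loss matrix and posteriors below are assumed values:

```python
# Minimum-risk decision: R(ai | x) = sum_j loss(ai | wj) * P(wj | x).
# Hypothetical loss matrix: misclassifying w2 as w1 is penalized heavily.
loss = [
    [0.0, 10.0],   # action a1 (decide w1): loss given w1, given w2
    [1.0,  0.0],   # action a2 (decide w2): loss given w1, given w2
]
posterior = [0.7, 0.3]  # assumed P(w1 | x), P(w2 | x)

risk = [sum(loss[i][j] * posterior[j] for j in range(2)) for i in range(2)]
best = min(range(2), key=lambda i: risk[i])

print(risk)  # [3.0, 0.7]: deciding w2 is cheaper despite P(w1|x) being larger
print(best)  # 1, i.e. action a2
```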
References
  • R.O. Duda, P.E. Hart, and D.G. Stork, Pattern Classification, 2nd ed., New York: John Wiley & Sons, 2001.
  • Selim Aksoy, Pattern Recognition Course Materials, 2011.
  • M. Narasimha Murty and V. Susheela Devi, Pattern Recognition: An Algorithmic Approach, Springer, 2011.
  • "Bayes Decision Rule: Discrete Features", http://www.cim.mcgill.ca/~friggi/bayes/decision/binaryfeatures.html, accessed 2011.