CHAPTER 15

Presentation Transcript


  1. CLASSIFICATION CHAPTER 15 Supervised Classification A. Dermanis

  2. Supervised Classification
  The known pixels in each of the predefined classes ω1, ω2, ..., ωK form the corresponding "sample sets" S1, S2, ..., SK with n1, n2, ..., nK pixels, respectively.
  Estimates from each sample set Si (i = 1, 2, ..., K):
  Class mean vectors: $m_i = \frac{1}{n_i} \sum_{x \in S_i} x$
  Class covariance matrices: $C_i = \frac{1}{n_i} \sum_{x \in S_i} (x - m_i)(x - m_i)^T$
  Supervised classification methods: parallelepiped, Euclidean distance (minimization), Mahalanobis distance (minimization), maximum likelihood, Bayesian (maximum a posteriori probability density).
  A. Dermanis
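  To make these estimates concrete, the following minimal NumPy sketch (not part of the original slides; the array names samples and labels are assumptions) computes the class mean vectors and covariance matrices from labeled training pixels:

    import numpy as np

    def class_statistics(samples, labels):
        """Estimate the class mean vectors m_i and covariance matrices C_i.

        samples: (n, B) array of training pixel vectors (B bands)
        labels:  (n,) array of class indices 1..K, one per training pixel
        Returns two dicts: {class: mean (B,)} and {class: covariance (B, B)}.
        """
        means, covariances = {}, {}
        for k in np.unique(labels):
            S_k = samples[labels == k]            # sample set S_k
            m_k = S_k.mean(axis=0)                # m_k = (1/n_k) * sum of x over S_k
            d = S_k - m_k                         # deviations x - m_k
            covariances[k] = d.T @ d / len(S_k)   # (1/n_k) * sum (x - m_k)(x - m_k)^T
            means[k] = m_k
        return means, covariances

  The later sketches below reuse the returned means and covariances dictionaries.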

  3. Classification with Euclidean distance: (a) simple
  Euclidean distance: $d_E(x, x') = \| x - x' \| = \sqrt{(x_1 - x_1')^2 + (x_2 - x_2')^2 + \dots + (x_B - x_B')^2}$
  Classification: $\| x - m_i \| = \min_k \| x - m_k \| \;\Rightarrow\; x \in \omega_i$
  Assign each pixel to the class of the closest center (class mean).
  The boundaries between class regions are the perpendicular bisectors of the segments joining the class centers.
  A. Dermanis
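  A minimal sketch of this nearest-mean rule (function and variable names are illustrative), reusing the means dictionary from the statistics sketch above:

    import numpy as np

    def classify_euclidean(x, means):
        """Assign pixel vector x to the class whose mean is closest in Euclidean distance."""
        distances = {i: np.linalg.norm(x - m_i) for i, m_i in means.items()}
        return min(distances, key=distances.get)   # class index of the closest center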

  4. Classification with Euclidean distance: (b) with threshold T
  Euclidean distance: $d_E(x, x') = \| x - x' \| = \sqrt{(x_1 - x_1')^2 + (x_2 - x_2')^2 + \dots + (x_B - x_B')^2}$
  Classification: $\| x - m_i \| = \min_k \| x - m_k \|$ and $\| x - m_i \| \le T \;\Rightarrow\; x \in \omega_i$
  $\| x - m_i \| > T \;\forall i \;\Rightarrow\; x \in \omega_0$
  Assign each pixel to the class of the closest center (class mean) if that distance is below the threshold.
  Leave the pixel unclassified (class ω0) if all class centers are at distances larger than the threshold.
  A. Dermanis
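  The thresholded variant can be sketched as follows (again illustrative names; class 0 stands in for the unclassified class ω0, which is an encoding choice, not something the slides specify):

    import numpy as np

    def classify_euclidean_threshold(x, means, T):
        """Nearest-mean rule with rejection: return 0 (omega_0) if even the closest center is farther than T."""
        distances = {i: np.linalg.norm(x - m_i) for i, m_i in means.items()}
        i_best = min(distances, key=distances.get)
        return i_best if distances[i_best] <= T else 0   # 0 marks the unclassified class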

  5. Classification with Euclidean distance
  Euclidean distance: $d_E(x, x') = \| x - x' \| = \sqrt{(x_1 - x_1')^2 + (x_2 - x_2')^2 + \dots + (x_B - x_B')^2}$
  [Figure: "WRONG" vs. "RIGHT" class assignments, illustrating the role of statistics (dispersion) in classification.]
  A. Dermanis

  6. Classification with the parallelepiped method
  Standard deviations for each band: $\sigma_{ij} = \sqrt{(C_i)_{jj}}, \quad j = 1, 2, \dots, B$
  Parallelepipeds $P_i$: a pixel $x = [x_1 \dots x_j \dots x_B]^T$ belongs to $P_i$ when $m_{ij} - k\sigma_{ij} \le x_j \le m_{ij} + k\sigma_{ij}$ for every $j = 1, 2, \dots, B$
  Classification: $x \in P_i \Rightarrow x \in \omega_i$; $x \notin P_i \;\forall i \Rightarrow x \in \omega_0$
  A. Dermanis
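  A sketch of this box test (the multiplier k and all names are assumptions for illustration), reusing the per-class means and covariances:

    import numpy as np

    def classify_parallelepiped(x, means, covariances, k=2.0):
        """Assign x to the first class whose parallelepiped contains it, else return 0 (omega_0).

        The parallelepiped of class i spans m_ij +/- k*sigma_ij in every band j,
        with sigma_ij taken from the diagonal of the class covariance matrix C_i.
        """
        for i, m_i in means.items():
            sigma_i = np.sqrt(np.diag(covariances[i]))     # per-band standard deviations
            if np.all(np.abs(x - m_i) <= k * sigma_i):     # inside the box in every band
                return i
        return 0                                           # outside all parallelepipeds

  If two parallelepipeds overlap, this sketch simply returns the first match; the slides do not state a tie-breaking rule.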

  7. Classification with the Mahalanobis distance
  Mahalanobis distance: $d_M(x, x') = \sqrt{(x - x')^T C^{-1} (x - x')}$
  Total covariance matrix: $C = \frac{1}{N} \sum_i \sum_{x \in S_i} (x - m_i)(x - m_i)^T = \frac{1}{N} \sum_i n_i C_i$
  Classification (simple): $d_M(x, m_i) < d_M(x, m_k) \;\forall k \neq i \;\Rightarrow\; x \in \omega_i$
  Classification with threshold: $d_M(x, m_i) < d_M(x, m_k) \;\forall k \neq i$ and $d_M(x, m_i) \le T \;\Rightarrow\; x \in \omega_i$; $d_M(x, m_i) > T \;\forall i \;\Rightarrow\; x \in \omega_0$
  A. Dermanis
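  A sketch of the Mahalanobis rule (illustrative names; the pooled covariance C is assumed to have been built from the class covariances and sample counts as in the formula above):

    import numpy as np

    def classify_mahalanobis(x, means, C, T=None):
        """Assign x to the class whose mean minimizes the Mahalanobis distance d_M(x, m_i).

        means: dict {class: mean vector}; C: total (pooled) covariance matrix.
        If a threshold T is given and even the smallest distance exceeds it, return 0 (omega_0).
        """
        C_inv = np.linalg.inv(C)
        d = {i: float(np.sqrt((x - m_i).T @ C_inv @ (x - m_i))) for i, m_i in means.items()}
        i_best = min(d, key=d.get)
        if T is not None and d[i_best] > T:
            return 0                        # unclassified
        return i_best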

  8. Classification with the maximum likelihood method
  Probability density (likelihood) function of class ωi: $l_i(x) = \frac{1}{(2\pi)^{B/2} |C_i|^{1/2}} \exp\left[ -\tfrac{1}{2} (x - m_i)^T C_i^{-1} (x - m_i) \right]$
  Classification: $l_i(x) > l_k(x) \;\forall k \neq i \;\Rightarrow\; x \in \omega_i$
  Equivalent use of the decision function: $d_i(x) = 2 \ln[l_i(x)] + B \ln(2\pi) = -\ln|C_i| - (x - m_i)^T C_i^{-1} (x - m_i)$
  $d_i(x) > d_k(x) \;\forall k \neq i \;\Rightarrow\; x \in \omega_i$
  A. Dermanis
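  The decision-function form translates directly into code; here is a sketch (illustrative names, per-class covariances assumed invertible):

    import numpy as np

    def classify_max_likelihood(x, means, covariances):
        """Assign x to the class maximizing d_i(x) = -ln|C_i| - (x - m_i)^T C_i^{-1} (x - m_i)."""
        best_class, best_d = None, -np.inf
        for i, m_i in means.items():
            C_i = covariances[i]
            diff = x - m_i
            d_i = -np.log(np.linalg.det(C_i)) - diff.T @ np.linalg.inv(C_i) @ diff
            if d_i > best_d:
                best_class, best_d = i, d_i
        return best_class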

  9.–13. Classification using the Bayesian approach
  N : total number of pixels in the image (i.e. in each band)
  B : number of bands
  ω1, ω2, ..., ωK : the K classes present in the image
  Ni : number of image pixels belonging to the class ωi (i = 1, 2, ..., K)
  nx : number of pixels with value x (= vector of values in all bands)
  nxi : number of pixels with value x which also belong to the class ωi
  Basic identity: $\frac{n_{xi}}{N} = \frac{N_i}{N} \cdot \frac{n_{xi}}{N_i} = \frac{n_x}{N} \cdot \frac{n_{xi}}{n_x}$
  A. Dermanis

  14.–16. Classification using the Bayesian approach
  $p(\omega_i) = \frac{N_i}{N}$ : probability of a pixel to belong to the class ωi
  $p(x) = \frac{n_x}{N}$ : probability of a pixel to have the value x
  $p(x|\omega_i) = \frac{n_{xi}}{N_i}$ : probability of a pixel belonging to the class ωi to have the value x (conditional probability)
  $p(\omega_i|x) = \frac{n_{xi}}{n_x}$ : probability of a pixel having the value x to belong to the class ωi (conditional probability)
  $p(x, \omega_i) = \frac{n_{xi}}{N}$ : probability of a pixel to have the value x and to simultaneously belong to ωi (joint probability)
  Formula of Bayes: $p(\omega_i|x) = \frac{p(x|\omega_i)\, p(\omega_i)}{p(x)}$
  A. Dermanis

  17. The Bayes theorem: with $\Pr(A|B) = \frac{\Pr(A \cap B)}{\Pr(B)}$ it follows that $\Pr(A|B)\Pr(B) = \Pr(A \cap B) = \Pr(B|A)\Pr(A)$, hence $\Pr(B|A) = \frac{\Pr(A|B)\Pr(B)}{\Pr(A)}$
  With event A = occurrence of the value x and event B = occurrence of the class ωi: $p(\omega_i|x) = \frac{p(x|\omega_i)\, p(\omega_i)}{p(x)}$
  Classification: $p(\omega_i|x) > p(\omega_k|x) \;\forall k \neq i \;\Rightarrow\; x \in \omega_i$
  Since $p(x)$ is a common constant factor it is not necessary, so equivalently:
  Classification: $p(x|\omega_i)\, p(\omega_i) > p(x|\omega_k)\, p(\omega_k) \;\forall k \neq i \;\Rightarrow\; x \in \omega_i$
  A. Dermanis

  18. Classification: $p(x|\omega_i)\, p(\omega_i) = \max_k \left[ p(x|\omega_k)\, p(\omega_k) \right] \;\Rightarrow\; x \in \omega_i$
  For a Gaussian distribution: $p(x|\omega_i) = l_i(x) = \frac{1}{(2\pi)^{B/2} |C_i|^{1/2}} \exp\left\{ -\tfrac{1}{2} (x - m_i)^T C_i^{-1} (x - m_i) \right\}$
  Instead of $p(x|\omega_i)\, p(\omega_i) = \max$, use the equivalent $\ln[p(x|\omega_i)\, p(\omega_i)] = \ln p(x|\omega_i) + \ln p(\omega_i) = \max$:
  $-\tfrac{1}{2} (x - m_i)^T C_i^{-1} (x - m_i) - \tfrac{1}{2} \ln|C_i| + \ln p(\omega_i) = \max$
  or finally: $(x - m_i)^T C_i^{-1} (x - m_i) + \ln|C_i| - 2 \ln p(\omega_i) = \min$
  A. Dermanis
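  This decision rule can be sketched as follows (illustrative names; priors is an assumed dict of prior probabilities p(ωi), e.g. estimated as Ni / N):

    import numpy as np

    def classify_bayes_gaussian(x, means, covariances, priors):
        """Assign x to the class minimizing
        (x - m_i)^T C_i^{-1} (x - m_i) + ln|C_i| - 2 ln p(omega_i)."""
        best_class, best_score = None, np.inf
        for i, m_i in means.items():
            C_i = covariances[i]
            diff = x - m_i
            score = (diff.T @ np.linalg.inv(C_i) @ diff
                     + np.log(np.linalg.det(C_i))
                     - 2.0 * np.log(priors[i]))
            if score < best_score:
                best_class, best_score = i, score
        return best_class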

  19. Bayesian classification for Gaussian distribution: $(x - m_i)^T C_i^{-1} (x - m_i) + \ln|C_i| - 2 \ln p(\omega_i) = \min$
  SPECIAL CASES:
  $p(\omega_1) = p(\omega_2) = \dots = p(\omega_K)$ ⇒ $(x - m_i)^T C_i^{-1} (x - m_i) + \ln|C_i| = \min$ ⇒ Maximum Likelihood!
  $p(\omega_1) = \dots = p(\omega_K)$ and $C_1 = C_2 = \dots = C_K = C$ ⇒ $(x - m_i)^T C^{-1} (x - m_i) = \min$ ⇒ Mahalanobis distance!
  $p(\omega_1) = \dots = p(\omega_K)$ and $C_1 = C_2 = \dots = C_K = I$ ⇒ $(x - m_i)^T (x - m_i) = \min$ ⇒ Euclidean distance!
  A. Dermanis
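  A brief derivation sketch (my own summary of the reductions stated above, written in LaTeX) showing why each assumption removes a term from the decision rule:

    % General Bayesian decision rule for Gaussian classes:
    %   g_i(x) = (x - m_i)^T C_i^{-1} (x - m_i) + \ln|C_i| - 2\ln p(\omega_i) = \min_i
    \begin{align*}
    \text{equal priors } p(\omega_i) = \tfrac{1}{K}:\;
      & -2\ln p(\omega_i) \text{ is constant over } i
        \;\Rightarrow\; \min_i \left[ (x - m_i)^T C_i^{-1}(x - m_i) + \ln|C_i| \right]
        && \text{(maximum likelihood)} \\
    \text{equal covariances } C_i = C:\;
      & \ln|C_i| = \ln|C| \text{ is also constant}
        \;\Rightarrow\; \min_i \, (x - m_i)^T C^{-1}(x - m_i)
        && \text{(Mahalanobis distance)} \\
    C = I:\;
      & \Rightarrow\; \min_i \, (x - m_i)^T (x - m_i) = \min_i \| x - m_i \|^2
        && \text{(Euclidean distance)}
    \end{align*}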

  20. Want to learn more?
  A. Dermanis, L. Biagi: Telerilevamento. Casa Editrice Ambrosiana.
