
Feature extraction using fuzzy complete linear discriminant analysis



Presentation Transcript


  1. Feature extraction using fuzzy complete linear discriminant analysis The reporter: Cui Yan, 2012. 4. 26

  2. The report outlines 1. The fuzzy K-nearest neighbor classifier (FKNN) 2. The fuzzy complete linear discriminant analysis 3. Experiments

  3. The Fuzzy K-nearest neighbor classifier (FKNN)

  4. The K-nearest neighbor classifier (KNN) Each sample should be classified similarly to its surrounding samples; therefore, an unknown sample can be predicted from the classes of its nearest neighbor samples.

  5. KNN classifies an unknown sample based on the known class labels of its k nearest neighbors.
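The majority-vote rule described above can be sketched in Python (an illustrative sketch, not the authors' code; all names and data are our own):

```python
# Crisp KNN: classify a sample by majority vote among its k nearest
# training samples (squared Euclidean distance).
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Return the majority class among the k training samples nearest to x."""
    # Squared Euclidean distance from x to every training sample.
    dists = [sum((a - b) ** 2 for a, b in zip(row, x)) for row in train_X]
    # Indices of the k smallest distances.
    nearest = sorted(range(len(dists)), key=dists.__getitem__)[:k]
    # Majority vote over the neighbors' labels.
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Usage: two 2-D classes.
X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
y = [0, 0, 1, 1]
print(knn_predict(X, y, (0.05, 0.1), k=3))  # predicts class 0
```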

  6. FKNN Given a sample set, a fuzzy M-class partition of these vectors specifies the membership degree of each sample with respect to each class. The membership degrees of a training vector to each of the M classes are computed by the following steps:

  7. Step 1: Compute the distance matrix between pairs of feature vectors in the training set. Step 2: Set the diagonal elements of this matrix to infinity (in practice, place large numeric values there).

  8. Step 3: Sort the distance matrix (treating each column separately) in ascending order. Collect the class labels of the patterns located in the closest neighborhood of the pattern under consideration (as we are concerned with k neighbors, this returns a list of k integers).

  9. Step 4: Compute the membership grade to class i for the j-th pattern using the expression proposed in [1]. [1] J.M. Keller, M.R. Gray, J.A. Givens, A fuzzy k-nearest neighbor algorithm, IEEE Trans. Syst. Man Cybernet. 1985, 15(4):580-585.
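The four steps can be sketched as follows (a sketch using the membership rule of Keller et al. [1], where a neighbor count n_ij contributes 0.49·n_ij/k plus a 0.51 offset for the sample's own class; array names are our own):

```python
# FKNN memberships for the training set, following steps 1-4.
import numpy as np

def fknn_memberships(X, y, k, n_classes):
    """Return U (n_classes x n_samples): fuzzy membership of each
    training sample in each class."""
    n = len(X)
    # Step 1: pairwise distance matrix.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    # Step 2: "infinity" on the diagonal, so a sample is not its own neighbor.
    np.fill_diagonal(D, np.inf)
    U = np.zeros((n_classes, n))
    for j in range(n):
        # Step 3: labels of the k nearest neighbors of sample j.
        nearest = np.argsort(D[:, j])[:k]
        labels = y[nearest]
        # Step 4: Keller's rule — 0.51 + 0.49*(n_ij/k) for the sample's own
        # class, 0.49*(n_ij/k) for every other class.
        for i in range(n_classes):
            n_ij = np.sum(labels == i)
            U[i, j] = 0.49 * n_ij / k + (0.51 if y[j] == i else 0.0)
    return U

# Usage: six 2-D samples, two classes, k = 3.
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
y = np.array([0, 0, 0, 1, 1, 1])
U = fknn_memberships(X, y, k=3, n_classes=2)
```

Note that each column of U sums to 1: the 0.49 weight is spread over the k neighbor labels, and the remaining 0.51 goes to the sample's own class.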

  10. An example for FKNN

  11. Set k=3

  12. The fuzzy complete linear discriminant analysis

  13. For the training set, we define the i-th class mean by weighting each sample with its fuzzy membership degree,

      \bar{x}_i = \frac{\sum_{j=1}^{N} u_{ij} x_j}{\sum_{j=1}^{N} u_{ij}},   (1)

  and the total mean as

      \bar{x} = \frac{1}{N} \sum_{j=1}^{N} x_j.   (2)

  14. Incorporating the fuzzy membership degrees, the between-class, within-class, and total class fuzzy scatter matrices of the samples can be defined as

      S_{fb} = \sum_{i=1}^{M} \sum_{j=1}^{N} u_{ij} (\bar{x}_i - \bar{x})(\bar{x}_i - \bar{x})^T,
      S_{fw} = \sum_{i=1}^{M} \sum_{j=1}^{N} u_{ij} (x_j - \bar{x}_i)(x_j - \bar{x}_i)^T,
      S_{ft} = S_{fb} + S_{fw}.   (3)
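Eqs. (1)-(3) can be sketched as follows (a sketch under the standard fuzzy-Fisherface definitions; function and variable names are our own):

```python
# Fuzzy class means, total mean, and fuzzy scatter matrices.
import numpy as np

def fuzzy_scatter(X, U):
    """X: (n, d) samples; U: (M, n) fuzzy memberships from FKNN."""
    M, n = U.shape
    d = X.shape[1]
    # Eq. (1): fuzzy mean of class i, weighted by the memberships u_ij.
    means = (U @ X) / U.sum(axis=1, keepdims=True)     # (M, d)
    # Eq. (2): total mean of all samples.
    mean_all = X.mean(axis=0)                          # (d,)
    Sfb = np.zeros((d, d))   # fuzzy between-class scatter
    Sfw = np.zeros((d, d))   # fuzzy within-class scatter
    for i in range(M):
        diff_b = (means[i] - mean_all)[:, None]
        # sum_j u_ij * (mean_i - mean)(mean_i - mean)^T
        Sfb += U[i].sum() * (diff_b @ diff_b.T)
        # sum_j u_ij * (x_j - mean_i)(x_j - mean_i)^T
        diffs = X - means[i]
        Sfw += (U[i][:, None] * diffs).T @ diffs
    # Eq. (3): total fuzzy scatter.
    Sft = Sfb + Sfw
    return means, Sfb, Sfw, Sft

# With crisp (0/1) memberships the fuzzy means reduce to ordinary class means.
X = np.array([[0.0, 0.0], [0.0, 2.0], [4.0, 0.0], [4.0, 2.0]])
U = np.array([[1.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 1.0]])
means, Sfb, Sfw, Sft = fuzzy_scatter(X, U)
```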

  15. step 1: Calculate the membership degree matrix U by the FKNN algorithm. step 2: According to Eqs. (1)-(3), work out the between-class, within-class, and total class fuzzy scatter matrices. step 3: Work out the orthogonal eigenvectors p1, . . . , pl of the total class fuzzy scatter matrix corresponding to positive eigenvalues. Algorithm of the fuzzy complete linear discriminant analysis

  16. step 4: Let P = (p1, . . . , pl); work out the orthogonal eigenvectors g1, . . . , gr of P^T S_{fw} P corresponding to the zero eigenvalues. step 5: Let P1 = (g1, . . . , gr); work out the orthogonal eigenvectors v1, . . . , vr of P1^T P^T S_{fb} P P1, and calculate the irregular discriminant vectors as the columns of P P1 (v1, . . . , vr).

  17. step 6: Work out the orthogonal eigenvectors q1, …, qs of P^T S_{fw} P corresponding to the non-zero eigenvalues. step 7: Let P2 = (q1, …, qs); work out the optimal discriminant vectors vr+1, . . . , vr+s by Fisher LDA in this subspace, and calculate the regular discriminant vectors as the columns of P P2 (vr+1, . . . , vr+s). step 8 (recognition): Project all samples onto the obtained optimal discriminant vectors and classify.
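Steps 3-7 can be condensed into the following sketch (our own reading of the algorithm, assuming the fuzzy scatter matrices are already computed; the numerical tolerance is our choice, and the recognition step 8 is omitted):

```python
# Complete fuzzy LDA: irregular vectors from the null space of Sfw
# (within the range of Sft), regular vectors from Fisher LDA elsewhere.
import numpy as np

def complete_flda(Sfb, Sfw, Sft, tol=1e-10):
    """Return (irregular, regular) discriminant vectors as column matrices."""
    # Step 3: eigenvectors of Sft with positive eigenvalues.
    w, V = np.linalg.eigh(Sft)
    P = V[:, w > tol]                            # p1, ..., pl
    # Step 4: eigen-decompose Sfw restricted to that range.
    w1, G = np.linalg.eigh(P.T @ Sfw @ P)
    null_part = np.abs(w1) <= tol
    # Step 5: irregular vectors — maximize between-class scatter where the
    # within-class scatter vanishes.
    P1 = G[:, null_part]                         # g1, ..., gr
    w2, Virr = np.linalg.eigh(P1.T @ (P.T @ Sfb @ P) @ P1)
    irregular = P @ P1 @ Virr[:, np.argsort(w2)[::-1]]
    # Steps 6-7: regular vectors — ordinary Fisher LDA on the subspace
    # where Sfw is non-singular (eigenvectors of Sw2^{-1} Sb2).
    P2 = G[:, ~null_part]                        # q1, ..., qs
    Sb2 = P2.T @ (P.T @ Sfb @ P) @ P2
    Sw2 = P2.T @ (P.T @ Sfw @ P) @ P2
    w3, Vreg = np.linalg.eig(np.linalg.solve(Sw2, Sb2))
    regular = P @ P2 @ Vreg[:, np.argsort(w3.real)[::-1]].real
    return irregular, regular

# Toy check: Sfw is singular along the third axis, so one irregular
# direction exists there.
Sfw = np.diag([1.0, 1.0, 0.0])
Sfb = np.diag([0.0, 0.0, 2.0])
irregular, regular = complete_flda(Sfb, Sfw, Sfb + Sfw)
```

The irregular vectors carry discriminant information from directions where the within-class scatter is exactly zero, which ordinary Fisher LDA would discard.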

  18. Experiments

  19. We compare Fuzzy-CLDA with CLDA, UWLDA, FLDA, Fuzzy Fisherface, and FIFDA on three data sets from the UCI repository; the characteristics of the three data sets can be found at http://archive.ics.uci.edu/ml/datasets. Each data set is randomly split into training and test sets with a 1:4 ratio. Experiments are repeated 25 times, and the mean prediction error rate is used as the performance measure; the nearest class center (NCC) classifier with the L2 norm is adopted to classify the test samples.
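The evaluation protocol above can be sketched as follows (a sketch assuming "1:4" means a train:test ratio of 1:4, i.e. 20% of samples for training; the data set and any feature extraction step are placeholders, not the paper's actual pipeline):

```python
# 25 random 1:4 train/test splits, nearest-class-center (NCC)
# classification with the L2 norm, mean error rate reported.
import numpy as np

def ncc_error_rate(train_X, train_y, test_X, test_y):
    """Classify each test sample to the nearest class center (L2 norm)."""
    classes = np.unique(train_y)
    centers = np.array([train_X[train_y == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(test_X[:, None, :] - centers[None, :, :], axis=2)
    pred = classes[np.argmin(d, axis=1)]
    return float(np.mean(pred != test_y))

def mean_error(X, y, n_repeats=25, train_frac=0.2, seed=0):
    """Mean NCC error over repeated random splits (train_frac = 1/(1+4))."""
    rng = np.random.default_rng(seed)
    errs = []
    for _ in range(n_repeats):
        idx = rng.permutation(len(X))
        n_train = int(train_frac * len(X))     # 1:4 train/test ratio
        tr, te = idx[:n_train], idx[n_train:]
        errs.append(ncc_error_rate(X[tr], y[tr], X[te], y[te]))
    return float(np.mean(errs))
```

In the paper's pipeline, the discriminant projection would be fitted on each training split before NCC is applied; here the classifier runs on raw features purely to illustrate the protocol.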

  20. Thanks! 2012. 4. 26
