
Presentation Transcript


  1. Prof NB Venkateswarlu Head, IT, GVPCOE Visakhapatnam venkat_ritch@yahoo.com www.ritchcenter.com/nbv

  2. First, let me say a hearty welcome to you all.

  3. Also, let me congratulate the Chairman, the Secretary/Correspondent,

  4. the Principal, Prof. Ravindra Babu, the Vice-Principal,

  5. and the other organizers for planning such a nice workshop with excellent themes.

  6. My Talk: Feature Extraction / Selection

  7. A Typical Image Processing System contains: Image Acquisition, Image Pre-Processing, Image Enhancement, Image Segmentation, Image Feature Extraction, Image Classification, Image Understanding

  8. Two Aspects of Feature Extraction: extracting useful features from images or any other measurements;

  9. and identifying transformed variables which are functions of the original variables and have some desired characteristics.

  10. Feature Selection: selecting the important variables from the original set is feature selection.

  11. Some Features Used in Image Processing Applications

  12. Shape based • Contour based • Area based • Transform based • Projections • Signature • Problem specific

  13. Perimeter, length, etc. First, the convex hull is extracted.
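
A minimal sketch of this step in Python, assuming SciPy is available (the point set is hypothetical): extract a convex hull and read off its perimeter and area.

    import numpy as np
    from scipy.spatial import ConvexHull

    points = np.array([[0, 0], [4, 0], [4, 3], [0, 3], [2, 1]])  # hypothetical blob pixels
    hull = ConvexHull(points)

    # For 2-D input, ConvexHull.area is the hull's perimeter and .volume its area.
    print("perimeter:", hull.area)   # 14.0 for this 4 x 3 rectangle
    print("area:", hull.volume)      # 12.0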

  14. Skeletons

  15. Averaged Radial density

  16. Radial Basis functions

  17. Rose Plots

  18. Chain Codes
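
A minimal sketch of 8-direction chain coding (the boundary below is hypothetical; directions are numbered 0-7 counter-clockwise starting from East):

    # Map each step between neighbouring boundary pixels to its direction code.
    DIRECTIONS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
                  (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

    def chain_code(boundary):
        return [DIRECTIONS[(x1 - x0, y1 - y0)]
                for (x0, y0), (x1, y1) in zip(boundary, boundary[1:])]

    square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]  # unit square, CCW
    print(chain_code(square))  # [0, 2, 4, 6]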

  19. Crack code - 32330300

  20. Signature
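
A common signature is the distance from the centroid to each boundary point traced around the contour; a minimal sketch with a hypothetical boundary:

    import numpy as np

    def signature(boundary):
        pts = np.asarray(boundary, dtype=float)
        centroid = pts.mean(axis=0)          # shape centre
        return np.linalg.norm(pts - centroid, axis=1)

    square = [(0, 0), (2, 0), (2, 2), (0, 2)]
    print(signature(square))  # all ~1.414: the corners are equidistant from the centre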

  21. Bending Energy

  22. Chord Distribution

  23. Fourier Descriptors
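
A minimal sketch of Fourier descriptors: treat the boundary points as complex numbers, take the DFT, and keep the magnitudes of the low-order coefficients. The normalisation choices below are common conventions, not the speaker's specific recipe.

    import numpy as np

    def fourier_descriptors(boundary, n_keep=8):
        pts = np.asarray(boundary, dtype=float)
        z = pts[:, 0] + 1j * pts[:, 1]   # boundary as complex x + iy
        coeffs = np.fft.fft(z)
        # Dropping coeffs[0] removes translation; dividing by |coeffs[1]|
        # normalises scale; taking magnitudes discards rotation/start point.
        return np.abs(coeffs[1:n_keep + 1]) / np.abs(coeffs[1])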

  24. Structure

  25. Splines

  26. Horizontal and vertical projections
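
Horizontal and vertical projections are just row sums and column sums of the binary image; a minimal sketch with a hypothetical image:

    import numpy as np

    img = np.array([[0, 1, 1, 0],
                    [0, 1, 1, 0],
                    [0, 0, 1, 0]])

    horizontal = img.sum(axis=1)   # one value per row    -> [2, 2, 1]
    vertical   = img.sum(axis=0)   # one value per column -> [0, 2, 3, 0]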

  27. Elongatedness

  28. Convex Hull

  29. Compactness
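
Compactness is usually defined as perimeter^2 / (4 * pi * area), which equals 1 for a circle and grows for elongated or ragged shapes; a minimal sketch:

    import math

    def compactness(perimeter, area):
        return perimeter ** 2 / (4 * math.pi * area)

    print(compactness(2 * math.pi * 5, math.pi * 5 ** 2))  # circle of radius 5 -> 1.0
    print(compactness(16, 16))                             # 4 x 4 square -> ~1.27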

  30. RGB: the R, G, and B bands

  31. Classification/Pattern Recognition • Statistical • Syntactical/Linguistic • Discriminant function • Fuzzy • Neural • Hybrid

  32. Dimensionality Reduction • Feature selection (i.e., attribute subset selection): • Select a minimum set of features such that the probability distribution of the different classes given the values of those features is as close as possible to the original distribution given the values of all features • This reduces the # of attributes in the patterns and makes them easier to understand • Heuristic methods (needed because the # of choices is exponential): • step-wise forward selection • step-wise backward elimination • combining forward selection and backward elimination • decision-tree induction

  33. Example of Decision Tree Induction. Initial attribute set: {A1, A2, A3, A4, A5, A6}. The induced tree tests A4 at the root, then A6 and A1, with leaves labelled Class 1 and Class 2. Reduced attribute set: {A1, A4, A6}.
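
A minimal sketch of the same idea with scikit-learn (the data, labels, and attribute names are hypothetical): fit a shallow tree and keep only the attributes it actually tests.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 6))                       # attributes A1..A6
    y = (X[:, 0] + X[:, 3] - X[:, 5] > 0).astype(int)   # depends on A1, A4, A6 only

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    used = {f"A{i + 1}" for i in tree.tree_.feature if i >= 0}  # internal nodes only
    print("Reduced attribute set:", sorted(used))       # typically ['A1', 'A4', 'A6']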

  34. Heuristic Feature Selection Methods • There are 2^d possible feature subsets of d features • Several heuristic feature selection methods: • Best single features under the feature-independence assumption: choose by significance tests • Best step-wise feature selection: the best single feature is picked first, then the next best feature conditioned on the first, ... (see the sketch below) • Step-wise feature elimination: repeatedly eliminate the worst feature • Best combined feature selection and elimination • Optimal branch and bound: use feature elimination and backtracking
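
A minimal sketch of the step-wise forward variant, assuming scikit-learn is available; the k-NN classifier and 3-fold cross-validation scorer are illustrative choices, not the speaker's specific method.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    def forward_select(X, y, n_features):
        selected, remaining = [], list(range(X.shape[1]))
        while len(selected) < n_features:
            # Add the candidate feature whose inclusion scores best.
            score, best = max(
                (cross_val_score(KNeighborsClassifier(),
                                 X[:, selected + [f]], y, cv=3).mean(), f)
                for f in remaining)
            selected.append(best)
            remaining.remove(best)
        return selected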

  35. Why do we need it? • A classifier's performance depends on • the number of features • how distinguishable the features are • the number of groups • the groups' characteristics in multidimensional space • the required response time • the memory requirements

  36. Feature Extraction Methods We will find transformed variables which are functions of the original variables. A good example: though we may conduct more than one test (K-D), the final grading is done based on total marks (1-D).

  37. Principal Component Analysis • Given N data vectors from k-dimensions, find c <= k orthogonal vectors that can be best used to represent data • The original data set is reduced to one consisting of N data vectors on c principal components (reduced dimensions) • Each data vector is a linear combination of the c principal component vectors • Works for numeric data only • Used when the number of dimensions is large

  38. [Figure: Principal Component Analysis – data plotted in the original axes X1, X2 with the principal axes Y1, Y2 overlaid]

  39. Principal Component Analysis Aimed at finding a new coordinate system which has certain characteristics. Example: mean M = [4.5, 4.25]; covariance matrix = [[2.57, 1.86], [1.86, 6.21]]; eigenvalues = 6.99, 1.79; eigenvectors = [0.387, 0.922] and [-0.922, 0.387].
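
The slide's numbers can be checked with a few lines of NumPy (eigenvectors are determined only up to sign):

    import numpy as np

    cov = np.array([[2.57, 1.86],
                    [1.86, 6.21]])
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending order for symmetric matrices

    print(eigvals[::-1])      # ~[6.99, 1.79]
    print(eigvecs[:, ::-1])   # columns ~[0.387, 0.922] and [-0.922, 0.387]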

  40. However, in some cases PCA does not work well: the directions of maximum variance need not be the directions that best separate the groups.

  41. Canonical Analysis

  42. Unlike PCA, which uses the global mean and covariance, canonical analysis uses the between-group and within-group covariance matrices and then calculates the canonical axes.
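
A minimal sketch of that computation (data and labels are hypothetical): build the within-group and between-group scatter matrices, then solve the generalized eigenproblem.

    import numpy as np

    def canonical_axes(X, y):
        mean_all = X.mean(axis=0)
        d = X.shape[1]
        Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
        for c in np.unique(y):
            Xc = X[y == c]
            mc = Xc.mean(axis=0)
            Sw += (Xc - mc).T @ (Xc - mc)            # within-group scatter
            diff = (mc - mean_all).reshape(-1, 1)
            Sb += len(Xc) * (diff @ diff.T)          # between-group scatter
        eigvals, eigvecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
        order = np.argsort(eigvals.real)[::-1]       # strongest separation first
        return eigvecs[:, order].real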

  43. Standard Deviation – a simple indicator. Correlation Coefficient – an indicator of redundancy between features.
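
A minimal sketch of both indicators on hypothetical data: per-feature standard deviation flags near-constant features, and the correlation matrix flags redundant ones.

    import numpy as np

    X = np.random.default_rng(0).normal(size=(100, 3))  # 100 samples, 3 features

    print(X.std(axis=0))     # low-variance features carry little information
    print(np.corrcoef(X.T))  # highly correlated feature pairs are redundant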

  44. Feature Selection – Group Separability Indices
