
Introduction to Pattern Recognition

For the Korean Information Science Society (정보과학회) Pattern Recognition Winter School. Introduction to Pattern Recognition. February 2011, Jin Hyung Kim, KAIST Department of Computer Science, http://ai.kaist.ac.kr/~jkim. What is Pattern Recognition? A pattern is an object, process or event that can be given a name.


Presentation Transcript


  1. For the Korean Information Science Society (정보과학회) Pattern Recognition Winter School: Introduction to Pattern Recognition. February 2011, Jin Hyung Kim, KAIST Department of Computer Science, http://ai.kaist.ac.kr/~jkim

  2. What is Pattern Recognition? • A pattern is an object, process or event that can be given a name • Pattern Recognition • the assignment of a physical object or event to one of several prespecified categories -- Duda & Hart • A subfield of Artificial Intelligence • human intelligence is based on pattern recognition 2

  3. Examples of Patterns 3

  4. Pattern Recognition • Related fields: machine learning, mathematical statistics, neural networks, signal processing, robotics and vision, cognitive science, nonlinear optimization, exploratory data analysis, fuzzy and genetic algorithms, detection and estimation theory, formal languages, structural modeling, biological cybernetics, computational neuroscience, … • Application areas: image processing/segmentation, computer vision, speech recognition, automated target recognition, optical character recognition, seismic analysis, man-machine dialogue, fingerprint identification, industrial inspection, medical diagnosis, ECG signal analysis, data mining, gene sequence analysis, protein structure analysis, remote sensing, aerial reconnaissance, … 4

  5. Applications of Pattern Recognition • Computer-aided diagnosis • Medical imaging, EEG, ECG, X-ray mammography • Image recognition • Factory automation, robot navigation • Face identification, gesture recognition • Automatic target recognition • Speech recognition • Speaker identification • Speech recognition 5 Google Maps Navigation (Beta): search by voice

  6. Applications of Pattern Recognition: Biometric Recognition • Identifying people by invariant biometric features • Static patterns • Fingerprint, iris, face, palm print, … • DNA • Dynamic patterns • Signature, voiceprint • Typing pattern • Uses • Access control • E-commerce authentication 6

  7. Applications of Pattern Recognition: Gesture Recognition • Text editing on pen computers • Tele-operation • Remote control by gesture input • TV control by hand motion • Sign language interpretation 7

  8. Applications of Pattern Recognition: Extracting Patterns from Data (Data Mining) • Data sources: demographics, point-of-sale records, ATM transactions, financial statistics, credit information, documents, intelligence reports, medical records, physical examination records • Discovered information: 80% of buyers of product A also buy product B (CRM); car purchasing power in the US market declined for six months; sales of product A grew twice as fast as product B; dehydration symptoms indicate risk • Decisions: advertising strategy? product placement? optimal budget allocation? how to expand market share? how to prevent customer churn? which treatment? • Korean example: preventing the use of lost credit cards by learning card-usage patterns 8

  9. Applications of Pattern Recognition: e-Book, Tablet PC, iPad, Smart-phone 9

  10. Smart Phone with Rich Sensors

  11. Applications of Pattern Recognition: Comparison of Online Hangul Recognizers 11

  12. Applications of Pattern Recognition: KAIST Math Expression Recognizer: Demo

  13. Applications of Pattern Recognition: MathTutor-SE Demo

  14. Applications of Pattern Recognition: Historical Document Recognition: Seungjeongwon Ilgi (承政院日記, Diaries of the Royal Secretariat) 14

  15. Applications of Pattern Recognition: Document Recognition, Verification & Correction Interface 15

  16. Applications of Pattern Recognition: Mail Sorter 16

  17. Applications of Pattern Recognition: Scene Text Recognition 17

  18. Applications of Pattern Recognition: Autonomous Land Vehicle (DARPA's Grand Challenge contest) 18 http://www.youtube.com/watch?v=yQ5U8suTUw0

  19. Applications of Pattern Recognition: Protein Structure Analysis 19

  20. Applications of Pattern Recognition: Protein Structure Analysis 20

  21. From Ricardo Gutierrez-Osuna, Texas A&M Univ. Types of PR problems • Classification • Assigning an object to a class • Output: a class label • Ex: classifying a product as ‘good’ or ‘bad’ in quality control • Clustering • Organizing objects into meaningful groups • Output: a (hierarchical) grouping of objects • Ex: taxonomy of species • Regression • Predicting a value from observations • Ex: stock price prediction, forecasting • Description • Representing an object in terms of a series of primitives • Output: a structural or linguistic description • Ex: labeling ECG signals, video indexing, protein structure indexing 21

  22. Pattern Class • A collection of “similar” (not necessarily identical) objects • Inter-class variability • Intra-class variability • Pattern Class Model • descriptions of each class/population (e.g., a probability density like Gaussian) 22

  23. Classification vs Clustering • Classification (known categories, e.g., “A” and “B”): supervised classification, also called recognition • Clustering (creation of new categories): unsupervised classification 23
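The contrast on slide 23 can be made concrete: clustering must discover the groups from unlabeled data. Below is a minimal 1-D k-means sketch; the toy data and the min/max initialization are illustrative choices, not from the slides.

```python
def kmeans_1d(points, k=2, iters=20):
    """Toy 1-D k-means for k=2: centers start at the data's min and max."""
    centers = [min(points), max(points)]
    groups = []
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center.
            i = min(range(k), key=lambda c: abs(p - centers[c]))
            groups[i].append(p)
        # Recompute each center as the mean of its group (toy data: no empty groups).
        centers = [sum(g) / len(g) for g in groups if g]
    return centers, groups

data = [1.0, 1.2, 0.8, 5.0, 5.3, 4.9]   # two obvious clusters
centers, groups = kmeans_1d(data)
```

No labels were used: the algorithm "creates" the two categories from the data alone, which is exactly what distinguishes clustering from classification.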

  24. Pattern Recognition : Key Objectives • Process the sensed data to eliminate noise • Data vs Noise • Hypothesize models that describe each class population • Then we may recover the process that generated the patterns. • Choose the best-fitting model for given sensed data to assign the class label associated with the model. 24

  25. The general classification process: Sensor → sensor signal → Feature Extractor → feature → Classifier → class membership 25

  26. Example : Salmon or Sea Bass • Sort incoming fish on a belt according to two classes: • Salmon or • Sea Bass • Steps: • Preprocessing (segmentation) • Feature extraction (measure features or properties) • Classification (make final decision) 26

  27. Sea bass vs Salmon (by Image) • Length • Lightness • Width • Number and shape of fins • Position of the mouth • … 27

  28. Salmon vs. Sea Bass (by length) 28

  29. Salmon vs. Sea Bass (by lightness): the best decision strategy with lightness 29
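The "best decision strategy with lightness" amounts to choosing a threshold on a single feature. A sketch with hypothetical lightness values (the numbers, and which species is darker, are made up for illustration):

```python
# Hypothetical 1-D training data: (lightness, label) pairs.
train = [(2.1, "salmon"), (2.8, "salmon"), (3.0, "salmon"),
         (5.9, "bass"), (6.4, "bass"), (7.1, "bass")]

def best_threshold(data):
    """Pick the cut on lightness that minimizes training errors,
    deciding 'salmon' below the cut and 'bass' above it."""
    xs = sorted(x for x, _ in data)
    # Candidate cuts: midpoints between consecutive sorted values.
    candidates = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    def errors(t):
        return sum((x < t) != (y == "salmon") for x, y in data)
    return min(candidates, key=errors)

t = best_threshold(train)
classify = lambda x: "salmon" if x < t else "bass"
```

On this toy set the chosen cut separates the two classes perfectly; on real, overlapping histograms the same search simply returns the cut with the fewest training errors.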

  30. Cost of Misclassification • There are two possible classification errors: (1) classifying a sea bass as a salmon; (2) classifying a salmon as a sea bass. • Which error is more important? • Generalized as a loss function λ(decision | truth) • Then look for the decision with minimum risk • Risk = expected loss 30
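The minimum-risk rule can be worked through with numbers. All posteriors and costs below are hypothetical; the point is that an asymmetric loss can overturn the most-probable-class decision.

```python
# Posterior probabilities for a given observation x (hypothetical numbers):
post = {"salmon": 0.6, "bass": 0.4}

# Loss matrix lam[decision][truth]: calling a sea bass "salmon" is
# assumed to cost twice as much as the reverse (hypothetical costs).
lam = {"salmon": {"salmon": 0.0, "bass": 2.0},
       "bass":   {"salmon": 1.0, "bass": 0.0}}

def risk(decision):
    # Conditional risk R(decision | x) = sum over truths of loss * posterior.
    return sum(lam[decision][w] * post[w] for w in post)

best = min(lam, key=risk)   # decision with minimum expected loss
```

Here risk("salmon") = 2.0 × 0.4 = 0.8 and risk("bass") = 1.0 × 0.6 = 0.6, so the minimum-risk decision is "bass" even though "salmon" has the higher posterior.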

  31. Classification with more features (by length and lightness) • It is possibly better. Really?? 31

  32. How Many Features and Which? • The choice of features determines the success or failure of the classification task • For a given feature, we may compute the best decision strategy from the (training) data • This is called training, parameter adaptation, or learning • Machine learning issues 32

  33. Issues with feature extraction: • Correlated features do not improve performance. • It might be difficult to extract certain features. • It might be computationally expensive to extract many features. • “Curse” of dimensionality … 33

  34. Feature and Feature Vector • Length • Lightness • Width • … • Number and shape of fins • Position of the mouth 34

  35. Goodness of Feature • Features and separability 35

  36. Developing a PR system • Sensors and preprocessing • Feature extraction aims to create discriminative features good for classification • A classifier • A teacher provides information about the hidden state -- supervised learning • A learning algorithm sets up the PR system from training examples • Pipeline: Pattern → sensors and preprocessing → feature extraction → classifier → class assignment, with a teacher driving the learning algorithm 36
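The stages of such a system compose naturally as functions. A sketch with stub implementations (every stage below is a hypothetical placeholder, chosen only to make the pipeline runnable):

```python
def preprocess(signal):
    # Toy noise suppression: remove the signal's constant offset.
    m = sum(signal) / len(signal)
    return [s - m for s in signal]

def extract_features(signal):
    # Toy features: signal energy and peak-to-peak range.
    return (sum(s * s for s in signal), max(signal) - min(signal))

def classify(features, threshold=10.0):
    # Toy classifier: threshold on the energy feature.
    return "class1" if features[0] > threshold else "class2"

def recognize(signal):
    # Sensors/preprocessing -> feature extraction -> class assignment.
    return classify(extract_features(preprocess(signal)))
```

In a real system the teacher and learning algorithm would fit `classify` (and possibly the feature extractor) from labeled training examples rather than using a fixed threshold.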

  37. PR Approaches • Template matching • The pattern to be recognized is matched against a stored template • Statistical PR • based on an underlying statistical model of patterns (features) and pattern classes • Structural PR: syntactic pattern recognition • pattern classes represented by formal structures such as grammars, automata, strings, etc. • Not only for classification but also for description • Neural networks • the classifier is represented as a network of cells modeling neurons of the human brain (connectionist approach) • Knowledge is stored in the connectivity and strength of synaptic weights • Statistical-structural analysis • Combining structural and statistical analysis • Using probabilistic frameworks such as Bayesian networks and MRFs • … 37 Modified from Vojtěch Franc

  38. PR Approaches: Template Matching • A stored template is matched against the input scene 38
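A minimal form of template matching slides the template over the scene and scores each position. The sketch below uses a sum-of-squared-differences score on a tiny hypothetical image (real systems typically use normalized correlation and much larger images):

```python
def match_score(patch, template):
    # Sum of squared differences: lower means a better match.
    return sum((p - t) ** 2
               for row_p, row_t in zip(patch, template)
               for p, t in zip(row_p, row_t))

def find_template(image, template):
    """Return the (row, col) of the best-matching window in the image."""
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), None
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            s = match_score(patch, template)
            if s < best:
                best, best_pos = s, (r, c)
    return best_pos

image = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 9, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [7, 9]]
```

Rigid matching like this fails when the pattern deforms, which motivates the deformable templates on the next slide.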

  39. PR Approaches: Deformable Template Matching (Snake) • Example: corpus callosum segmentation • Shape training set → prototype and variation learning → prototype registration to the low-level segmented image → prototype warping 39

  40. From Ricardo Gutierrez-Osuna,Texas A&M Univ. PR Approaches 40

  41. Classifier • The task of a classifier is to partition feature space into class-labeled decision regions • Borders between decision regions are called decision boundaries • Classification amounts to determining the decision region into which a feature vector x falls 41

  42. From Vojtěch Franc Representation of a classifier • A classifier is typically represented as a set of discriminant functions gi(x), i = 1, …, N, one per class • The classifier assigns a feature vector x to the i-th class if gi(x) > gj(x) for all j ≠ i 42

  43. Classification of Classifiers by Form of Discriminant Function 43

  44. Bayesian Decision Making • Statistical approach • the optimal classifier with minimum error • Assumes the complete statistical model is known. • Decision given the posterior probabilities, where x is an observation: if P(ω1 | x) > P(ω2 | x), decide state of nature = ω1; if P(ω1 | x) < P(ω2 | x), decide state of nature = ω2 44

  45. Searching Decision Boundary 45

  46. Bayes' Rule: converting the class-conditional density P(x|ω1) into the posterior P(ω1|x) 46
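Bayes' rule P(ω|x) = p(x|ω)P(ω) / p(x), with p(x) = Σω p(x|ω)P(ω), can be evaluated directly once priors and likelihoods are given. The numbers below are hypothetical:

```python
# Priors P(w) and class-conditional likelihoods p(x|w) at the observed x.
prior = {"w1": 0.7, "w2": 0.3}
likelihood = {"w1": 0.2, "w2": 0.6}

# Evidence p(x) = sum over classes of p(x|w) * P(w).
evidence = sum(likelihood[w] * prior[w] for w in prior)

# Posteriors P(w|x) = p(x|w) * P(w) / p(x).
posterior = {w: likelihood[w] * prior[w] / evidence for w in prior}

# Minimum-error decision: pick the class with the largest posterior.
decision = max(posterior, key=posterior.get)
```

Note how the likelihood can outweigh the prior: w1 is more probable a priori (0.7 vs 0.3), but the observation favors w2 strongly enough that its posterior wins.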

  47. From Vojtěch Franc Limitations of the Bayesian approach • The statistical model p(x,y) is mostly not known • Learning: estimate p(x,y) from training examples {(x1,y1), …, (xn,yn)} • Usually p(x,y) is assumed to have a parametric form • Ex: multivariate normal distribution • Non-parametric estimation of p(x,y) requires a large set of training samples • Non-Bayesian methods may offer equally good results (??) 47

  48. From Vojtěch Franc Polynomial Discriminant Function approaches • Assume that G(x) is a polynomial function • Linear function: Linear Discriminant Analysis (LDA) • Quadratic function • Classifier design reduces to determining the separating hyperplane 48

  49. From Vojtěch Franc LDA Example: separating jockeys (J) from basketball players (H) • Task: separate jockeys (J) from basketball players (H) • The set of hidden states is {J, H} • The feature space is (height, weight) • Training examples … • Linear classifier 49
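The resulting linear classifier on (height, weight) is a single hyperplane test: decide one class if w·x + b > 0, the other otherwise. The weights below are hypothetical stand-ins for what LDA would learn from the training examples:

```python
# Linear classifier on x = (height in cm, weight in kg).
# Decide 'H' (basketball player) if w . x + b > 0, else 'J' (jockey).
# These weights are illustrative, not fitted values.
w = (0.5, 0.5)
b = -120.0

def classify(x):
    score = w[0] * x[0] + w[1] * x[1] + b
    return "H" if score > 0 else "J"
```

A tall, heavy example such as (200, 100) lands on the positive side of the hyperplane, while a short, light one such as (150, 50) lands on the negative side.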

  50. Artificial Neural Network Design • For a given structure, find the weight set that minimizes the sum-of-squared-errors objective J(w) over the training examples {(x1,y1), …, (xn,yn)} 50
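Minimizing J(w) is typically done by gradient descent on the squared error. A sketch for the simplest possible "network", a single linear unit y = w·x + b, trained on a toy dataset generated from t = 2x + 1 (the data, learning rate, and epoch count are illustrative choices):

```python
# Toy training set: targets follow t = 2x + 1 exactly.
train = [(x, 2 * x + 1) for x in [-2, -1, 0, 1, 2]]

w, b = 0.0, 0.0
lr = 0.05

for _ in range(500):          # epochs
    for x, t in train:
        y = w * x + b         # forward pass of the linear unit
        err = y - t
        # Gradient of the per-example squared error (y - t)^2:
        # d/dw = 2*err*x, d/db = 2*err.
        w -= lr * 2 * err * x
        b -= lr * 2 * err
```

Because the data are exactly linear, the weights converge to w ≈ 2, b ≈ 1; the same update rule, applied layer by layer via backpropagation, is what trains multilayer networks.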
