
Project 1: Text Classification by Neural Networks




  1. Project 1: Text Classification by Neural Networks, Ver 1.1

  2. Outline • Classification using ANN • Learn and classify text documents • Estimate several statistics on the dataset (C) 2006, SNU Biointelligence Laboratory

  3. Network Structure • Input layer (document vector) → hidden layer → one output unit per class: Class 1, Class 2, Class 3, …
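The structure above can be made concrete with a minimal forward pass: a document's term-frequency vector feeds a tanh hidden layer, and a softmax output layer yields a probability per class. The layer sizes, weights, and input vector below are hypothetical illustrations, not values from the project.

```python
import math
import random

random.seed(0)

def forward(x, W1, b1, W2, b2):
    """One forward pass: input -> tanh hidden layer -> softmax outputs."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    z = [sum(w * hi for w, hi in zip(row, h)) + b
         for row, b in zip(W2, b2)]
    m = max(z)                      # subtract max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

# Hypothetical sizes: 4 input terms, 5 hidden units, 3 classes.
n_in, n_hid, n_out = 4, 5, 3
W1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hid)]
b1 = [0.0] * n_hid
W2 = [[random.uniform(-0.5, 0.5) for _ in range(n_hid)] for _ in range(n_out)]
b2 = [0.0] * n_out

# A toy term-frequency vector for one document.
probs = forward([1, 0, 2, 0], W1, b1, W2, b2)
```

The class with the largest output probability would be the predicted category.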

  4. CLASSIC3 Dataset

  5. CLASSIC3 • Three categories, 3,891 documents in total: • CISI: 1,460 document abstracts on information retrieval from the Institute of Scientific Information. • CRAN: 1,398 document abstracts on aeronautics from the Cranfield Institute of Technology. • MED: 1,033 biomedical abstracts from MEDLINE. (C) 2006, SNU Biointelligence Laboratory

  6. Dataset Format: Text Representation in Vector Space • Document collection → stemming → stop-words elimination → feature selection → VSM (Bag-of-Words) representation • The term vectors form a term-document matrix: rows are terms (e.g. baseball, specs, graphics, hockey, unix, space), columns are documents d1, d2, d3, …, dn, and each entry is a term frequency
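The pipeline above (remove stopwords, then count each vocabulary term per document) can be sketched as follows. The toy documents, stopword list, and resulting vocabulary are illustrative assumptions, not the CLASSIC3 data.

```python
# Hypothetical stopword list and toy corpus (not the CLASSIC3 dataset).
STOPWORDS = {"the", "a", "of", "on"}

docs = [
    "retrieval of the information",
    "aeronautics of the wing",
    "the blood of the heart",
]

# Vocabulary: all non-stopword terms, sorted for a stable column order.
vocab = sorted({w for d in docs for w in d.split() if w not in STOPWORDS})

# Term-document matrix: one row per document, one count per vocabulary term.
matrix = [[d.split().count(t) for t in vocab] for d in docs]
```

Each row of `matrix` is the Bag-of-Words vector of one document; stemming is omitted here for brevity.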

  7. Dimensionality Reduction • Apply a scoring measure to each individual feature, sort the features by score, and choose the terms with higher values → term (feature) vectors for the documents in vector space → ML algorithm • Term Weighting: TF or TF × IDF • TF: term frequency • IDF: inverse document frequency • N: number of documents • n_j: number of documents that contain the j-th word
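The TF × IDF weighting can be computed directly from a count matrix, assuming the standard form idf_j = log(N / n_j) (the slide defines N and n_j but does not spell out the formula). The small count matrix below is a made-up example.

```python
import math

# Toy term-document counts: rows = documents, columns = terms (hypothetical).
tf = [
    [3, 0, 1],
    [0, 2, 1],
    [1, 0, 2],
]
N = len(tf)

# n_j: number of documents containing term j.
n = [sum(1 for doc in tf if doc[j] > 0) for j in range(len(tf[0]))]

# TF x IDF with idf_j = log(N / n_j); a term in every document gets weight 0.
tfidf = [[doc[j] * math.log(N / n[j]) for j in range(len(doc))] for doc in tf]
```

Note that the third term occurs in all three documents, so its IDF is log(3/3) = 0 and it carries no weight.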

  8. Construction of Document Vectors • Controlled vocabulary • Stopwords are removed • Stemming is used • Words whose document frequency is less than 5 are removed → term size: 3,850 • A document is represented by a 3,850-dimensional vector whose elements are the word frequencies • Words are sorted by their information gain values → the top 100 terms are selected → 3,830 (examples) × 100 (terms) matrix
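The slide ranks terms by information gain. A minimal sketch of that score, measuring how much knowing a term's presence/absence reduces class-label entropy; the toy documents and labels are assumptions, not CLASSIC3.

```python
import math

def entropy(counts):
    """Shannon entropy (bits) of a class-count distribution."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def info_gain(docs, labels, term):
    """Information gain of a term's presence with respect to the labels."""
    classes = sorted(set(labels))
    base = entropy([labels.count(c) for c in classes])
    with_t = [l for d, l in zip(docs, labels) if term in d]
    without = [l for d, l in zip(docs, labels) if term not in d]
    cond = 0.0
    for subset in (with_t, without):
        if subset:
            cond += (len(subset) / len(labels)) * entropy(
                [subset.count(c) for c in classes])
    return base - cond

# Toy documents as term sets, with hypothetical CLASSIC3-style labels.
docs = [{"retrieval"}, {"retrieval"}, {"wing"}, {"blood"}]
labels = ["CISI", "CISI", "CRAN", "MED"]
```

Scoring every vocabulary term this way and keeping the 100 highest-scoring ones reproduces the selection step described above.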

  9. Experimental Results

  10. Data Setting for the Experiments • Basically, training and test sets are given • Training: 2,683 examples • Test: 1,147 examples • N-fold cross-validation (optional) • The dataset is divided into N subsets • The holdout method is repeated N times • Each time, one of the N subsets is used as the test set and the other (N-1) subsets are combined to form the training set • The average performance across all N trials is computed (C) 2006, SNU Biointelligence Laboratory
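The N-fold procedure above can be sketched as an index-splitting helper; the fold count and example count below are arbitrary, and the shuffling seed is an assumption for reproducibility.

```python
import random

def n_fold_splits(n_examples, n_folds, seed=0):
    """Yield (train_idx, test_idx) pairs for N-fold cross-validation."""
    idx = list(range(n_examples))
    random.Random(seed).shuffle(idx)
    # Deal shuffled indices into N roughly equal folds.
    folds = [idx[i::n_folds] for i in range(n_folds)]
    for k in range(n_folds):
        test = folds[k]
        train = [i for j, f in enumerate(folds) if j != k for i in f]
        yield train, test

# 5-fold split over 10 toy examples: each example is tested exactly once.
splits = list(n_fold_splits(10, 5))
```

Averaging the test performance over the N `(train, test)` pairs gives the cross-validated estimate the slide describes.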

  11. Number of Epochs

  12. Number of Hidden Units • Minimum 10 runs for each setting

  13. (figure-only slide)

  14. Other Methods/Parameters • Normalization method for input vectors • Class decision policy • Learning rates • …
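The slide leaves the normalization method open; one common choice for document vectors (an assumption here, not the project's prescribed method) is L2 normalization, sketched below.

```python
import math

def l2_normalize(vec):
    """Scale a vector to unit Euclidean length (zero vectors are unchanged)."""
    norm = math.sqrt(sum(v * v for v in vec))
    return vec if norm == 0 else [v / norm for v in vec]

# A toy document vector; after normalization its length is 1.
v = l2_normalize([3, 4])
```

Normalizing inputs this way keeps long and short documents on a comparable scale before they reach the network.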

  15. ANN Sources • Source codes • Free software: Weka • NN libraries (C, C++, Java, …) • MATLAB toolbox • Web sites • http://www.cs.waikato.ac.nz/~ml/weka/ • http://www.faqs.org/faqs/ai-faq/neural-nets/part5/

  16. Submission • Due date: October 12 (Thu) • Both ‘hardcopy’ and ‘email’ • Used software and running environments • Experimental results with various parameter settings • Analysis and explanation of the results in your own way • FYI, achieving the best performance is not important
