
15-505: Lecture 11 Generative Models for Text Classification and Information Extraction


Presentation Transcript


  1. 15-505: Lecture 11: Generative Models for Text Classification and Information Extraction. Kamal Nigam. Some slides from William Cohen, Andrew McCallum.

  2. Text Classification by Example

  3. Text Classification by Example

  4. Text Classification by Example

  5. Text Classification by Example

  6. Text Classification by Example

  7. How could you build a text classifier?
  • Take some ideas from machine learning
    • Supervised learning setting
    • Examples of each class (a few or thousands)
  • Take some ideas from machine translation
    • Generative models
    • Language models
  • Simplify each and stir thoroughly

  8. Basic Approach of Generative Modeling
  • Pick a representation for the data
  • Write down a probabilistic generative model
  • Estimate model parameters with training data
  • Turn the model around to calculate unknown values for new data

  9. Naïve Bayes: Bag of Words Representation
  [Figure: the example sentence "Corn prices rose today while corn futures dropped in surprising trading activity." mapped to a vector of occurrence counts over all words in the dictionary]
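  As a concrete illustration of the representation above, here is a minimal bag-of-words sketch in Python (an editor's illustration, not from the slides); the tokenizer is deliberately crude.

    from collections import Counter

    def bag_of_words(text):
        # Crude tokenizer: lowercase, split on whitespace, strip punctuation.
        tokens = (tok.strip(".,!?").lower() for tok in text.split())
        return Counter(tok for tok in tokens if tok)

    doc = ("Corn prices rose today while corn futures dropped "
           "in surprising trading activity.")
    print(bag_of_words(doc))  # Counter({'corn': 2, 'prices': 1, ...})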

  10. Naïve Bayes: Mixture of Multinomials Model
  • Pick the class: P(class)
  • For every word, pick from the class urn: P(word|class)
  [Figure: two word urns, SPORTS (e.g. ball, polo, soccer, the, in, again) and COMPUTERS (e.g. java, modem, windows, dropped, web, activity)]
  Word independence assumption!
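  A hedged sketch of that generative story: pick a class from P(class), then draw every word independently from that class's urn. The class names match the slide; the probabilities are made-up toy numbers.

    import random

    priors = {"SPORTS": 0.5, "COMPUTERS": 0.5}        # P(class), toy values
    word_probs = {                                    # P(word|class), toy values
        "SPORTS":    {"ball": 0.3, "soccer": 0.3, "polo": 0.2, "the": 0.2},
        "COMPUTERS": {"java": 0.3, "modem": 0.3, "windows": 0.2, "web": 0.2},
    }

    def generate_document(length=5):
        cls = random.choices(list(priors), weights=list(priors.values()))[0]
        urn = word_probs[cls]
        # Word independence assumption: each word is drawn i.i.d. from the urn.
        words = random.choices(list(urn), weights=list(urn.values()), k=length)
        return cls, words

    print(generate_document())  # e.g. ('SPORTS', ['ball', 'the', 'soccer', ...])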

  11. Naïve Bayes: Estimating Parameters
  • Just like estimating biased coin flip probabilities
  • Estimate MAP word probabilities P(word|class)
  • Estimate MAP class priors P(class)
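  The slide's formulas were images and did not survive the transcript. The standard choice for this model is Laplace (add-one) smoothed estimates; the sketch below assumes that form, with labeled_docs a list of (class, word_list) pairs.

    from collections import Counter, defaultdict

    def estimate_parameters(labeled_docs, vocabulary):
        class_counts = Counter()
        word_counts = defaultdict(Counter)          # class -> word -> count
        for cls, words in labeled_docs:
            class_counts[cls] += 1
            word_counts[cls].update(w for w in words if w in vocabulary)
        classes = list(class_counts)
        # MAP class priors with add-one smoothing:
        # P(c) = (1 + #docs in c) / (|classes| + #docs)
        priors = {c: (1 + class_counts[c]) / (len(classes) + len(labeled_docs))
                  for c in classes}
        # MAP word probabilities with add-one smoothing:
        # P(w|c) = (1 + count(w, c)) / (|V| + total word count in c)
        cond = {}
        for c in classes:
            total = sum(word_counts[c].values())
            cond[c] = {w: (1 + word_counts[c][w]) / (len(vocabulary) + total)
                       for w in vocabulary}
        return priors, cond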

  12. Naïve Bayes: Performing Classification
  • Word independence assumption
  • Take the class with the highest probability
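  Continuing the sketch above (and using slide 14's sum-of-logs trick to avoid underflow), classification takes the class with the highest score under the word independence assumption:

    import math

    def classify(words, priors, cond):
        # argmax over classes of log P(c) + sum over words of log P(w|c);
        # words outside the vocabulary are skipped.
        def log_score(c):
            return math.log(priors[c]) + sum(math.log(cond[c][w])
                                             for w in words if w in cond[c])
        return max(priors, key=log_score)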

  13. Classification Tricks of the Trade
  • Stemming
    • run, runs, running, ran → run
    • table, tables, tabled → table
    • computer, compute, computing → compute
  • Stopwords
    • Very frequent function words, generally uninformative
    • if, in, the, like, …
  • Information gain feature selection
    • Keep just the most indicative words in the vocabulary
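  A small preprocessing sketch of the first two tricks. PorterStemmer ships with NLTK (assumed installed via pip install nltk); the stopword list is a hand-rolled stand-in so nothing needs a corpus download. Note that a real Porter stemmer maps "tables" to the stem "tabl", not the dictionary form shown on the slide.

    from nltk.stem.porter import PorterStemmer

    STOPWORDS = {"if", "in", "the", "like", "a", "an", "of", "and", "to"}
    stemmer = PorterStemmer()

    def preprocess(tokens):
        # Drop stopwords, then collapse inflected forms to a common stem.
        return [stemmer.stem(t) for t in tokens if t.lower() not in STOPWORDS]

    print(preprocess(["running", "in", "the", "tables"]))  # ['run', 'tabl']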

  14. Naïve Bayes Rules of Thumb
  • Need hundreds of labeled examples per class for good performance (~85% accuracy)
  • Stemming and stopwords may or may not help
  • Feature selection may or may not help
  • Predicted probabilities will be very extreme
  • Use sums of logs instead of multiplying probabilities, to prevent underflow
  • Coding this up is trivial, either as a mapreduce or not

  15. Information Extraction with Generative Models

  16. Example: A Problem
  [Figure: search results for the keyword "Baker", mixing many senses: Mt. Baker; the school district; Baker Hostetler, the company; Baker, a job opening; a genomics job]

  17. Example: A Solution

  18. Job Openings: Category = Food Services Keyword = Baker Location = Continental U.S.

  19. Extracting Job Openings from the Web
  [Example extracted record] Title: Ice Cream Guru • Description: If you dream of cold creamy… • Contact: susan@foodscience.com • Category: Travel/Hospitality • Function: Food Services

  20. Potential Enabler of Faceted Search

  21. Lots of Structured Information in Text

  22. IE from Research Papers

  23.–27. What is Information Extraction? (built up one bullet per slide)
  • Recovering structured data from formatted text
  • Identifying fields (e.g. named entity recognition)
  • Understanding relations between fields (e.g. record association)
  • Normalization and deduplication
  • Today, focus on field identification

  28. IE Posed as a Machine Learning Task
  • Training data: documents marked up with ground truth
  • In contrast to text classification, local features are crucial. Features of:
    • Contents
    • Text just before the item
    • Text just after the item
    • Begin/end boundaries
  [Example: "… 00 : pm Place : Wean Hall Rm 5409 Speaker : Sebastian Thrun …", segmented into prefix, contents, suffix]

  29. Good Features for Information Extraction
  Creativity and domain knowledge required!
  Example line features: contains-question-mark, contains-question-word, ends-with-question-mark, first-alpha-is-capitalized, indented, indented-1-to-4, indented-5-to-10, more-than-one-third-space, only-punctuation, prev-is-blank, prev-begins-with-ordinal, shorter-than-30, begins-with-number, begins-with-ordinal, begins-with-punctuation, begins-with-question-word, begins-with-subject, blank, contains-alphanum, contains-bracketed-number, contains-http, contains-non-space, contains-number, contains-pipe
  Example word features:
  • identity of word
  • is in all caps
  • ends in "-ski"
  • is part of a noun phrase
  • is in a list of city names
  • is under node X in WordNet or Cyc
  • is in bold font
  • is in hyperlink anchor
  • features of past & future
  • last person name was female
  • next two words are "and Associates"

  30. Good Features for Information Extraction
  Creativity and domain knowledge required!
  Word features: Is Capitalized; Is Mixed Caps; Is All Caps; Initial Cap; Contains Digit; All Lowercase; Is Initial; Punctuation; Period; Comma; Apostrophe; Dash; Preceded by HTML tag
  Classifier/lexicon features: character n-gram classifier says string is a person name (80% accurate); in stopword list (the, of, their, etc.); in honorific list (Mr, Mrs, Dr, Sen, etc.); in person suffix list (Jr, Sr, PhD, etc.); in name particle list (de, la, van, der, etc.); in Census lastname list, segmented by P(name); in Census firstname list, segmented by P(name); in location lists (states, cities, countries); in company name list ("J. C. Penny"); in list of company suffixes (Inc, & Associates, Foundation)
  Other lexicons: lists of job titles, lists of prefixes, lists of suffixes, 350 informative phrases
  HTML/formatting features: {begin, end, in} × {<b>, <i>, <a>, <hN>} × {lengths 1, 2, 3, 4, or longer}; {begin, end} of line
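  As a concrete sketch of how features like these might be computed, here is an illustrative Python function mapping one token plus its context to a dictionary of boolean features. The feature names and the tiny honorific list are the editor's assumptions, not the slides' exact feature set.

    import re

    HONORIFICS = {"mr", "mrs", "dr", "sen"}   # tiny illustrative list

    def token_features(tokens, i):
        tok = tokens[i]
        prev = tokens[i - 1] if i > 0 else ""
        return {
            "word=" + tok.lower(): True,             # identity of word
            "is_capitalized": tok[:1].isupper(),
            "is_all_caps": tok.isupper(),
            "contains_digit": any(ch.isdigit() for ch in tok),
            "ends_with_ski": tok.lower().endswith("ski"),
            "only_punctuation": bool(re.fullmatch(r"\W+", tok)),
            "prev_is_honorific": prev.lower().rstrip(".") in HONORIFICS,
        }

    print(token_features(["Mr.", "Kowalski", "spoke"], 1))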

  31. Landscape of ML Techniques for IE
  [Figure: families of models, each run over "Abraham Lincoln was born in Kentucky."]
  • Classify Candidates: a classifier asks "which class?" for each candidate phrase
  • Sliding Window: a classifier asks "which class?" while trying alternate window sizes
  • Boundary Models: separate BEGIN and END classifiers mark field boundaries (e.g. BEGIN PersonName … END)
  • Finite State Machines: find the most likely state sequence over the token stream
  • Wrapper Induction: learn and apply a pattern for a website (e.g. <b><i>Abraham Lincoln</i></b> was born in Kentucky.)
  Any of these models can be used to capture words, formatting, or both.

  32. Sliding Windows & Boundary Detection

  33.–37. Information Extraction by Sliding Window (animation: the window advances across the document on each slide)
  GRAND CHALLENGES FOR MACHINE LEARNING Jaime Carbonell School of Computer Science Carnegie Mellon University 3:30 pm 7500 Wean Hall Machine learning has evolved from obscurity in the 1970s into a vibrant and popular discipline in artificial intelligence during the 1980s and 1990s. As a result of its success and growth, machine learning is evolving into a collection of related disciplines: inductive concept acquisition, analytic learning in problem solving (e.g. analogy, explanation-based learning), learning theory (e.g. PAC learning), genetic algorithms, connectionist learning, hybrid systems, and so on. CMU UseNet Seminar Announcement

  38. Information Extraction with Sliding Windows [Freitag 97, 98; Soderland 97; Califf 98]
  [Example: "… 00 : pm Place : Wean Hall Rm 5409 Speaker : Sebastian Thrun …", with tokens w(t−m) … w(t−1) [ w(t) … w(t+n) ] w(t+n+1) … w(t+n+m) forming prefix, contents, suffix]
  • Standard supervised learning setting
  • Positive instances: windows with a real label
  • Negative instances: all other windows
  • Features based on candidate, prefix and suffix
  • Special-purpose rule learning systems work well, e.g.:
    courseNumber(X) :- tokenLength(X, =, 2), every(X, inTitle, false), some(X, A, <previousToken>, inTitle, true), some(X, B, <>, tripleton, true)
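  A minimal sketch of the candidate-generation half of this setup (the classifier itself is omitted): every span up to a maximum width becomes one instance, positive if it matches a labeled field and negative otherwise. The tokens and the labeled span below are illustrative.

    def candidate_windows(tokens, max_width=3):
        # Every contiguous span of 1..max_width tokens is one candidate.
        for start in range(len(tokens)):
            for end in range(start + 1, min(start + max_width, len(tokens)) + 1):
                yield start, end, tokens[start:end]

    tokens = "Speaker : Sebastian Thrun Place : Wean Hall".split()
    labeled_speaker = (2, 4)   # ground-truth span for the speaker field
    for start, end, window in candidate_windows(tokens):
        label = "+" if (start, end) == labeled_speaker else "-"
        print(label, window)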

  39.–43. IE by Boundary Detection (animation: candidate start and end boundaries are marked and paired across the document on each slide)
  GRAND CHALLENGES FOR MACHINE LEARNING Jaime Carbonell School of Computer Science Carnegie Mellon University 3:30 pm 7500 Wean Hall Machine learning has evolved from obscurity in the 1970s into a vibrant and popular discipline in artificial intelligence during the 1980s and 1990s. As a result of its success and growth, machine learning is evolving into a collection of related disciplines: inductive concept acquisition, analytic learning in problem solving (e.g. analogy, explanation-based learning), learning theory (e.g. PAC learning), genetic algorithms, connectionist learning, hybrid systems, and so on. CMU UseNet Seminar Announcement

  44. BWI: Learning to Detect Boundaries [Freitag & Kushmerick, AAAI 2000]
  • Another formulation: learn three probabilistic classifiers:
    • START(i) = Prob(position i starts a field)
    • END(j) = Prob(position j ends a field)
    • LEN(k) = Prob(an extracted field has length k)
  • Then score a possible extraction (i, j) by START(i) × END(j) × LEN(j − i)
  • LEN(k) is estimated from a histogram
  • START(i) and END(j) are learned by boosting over simple boundary patterns and features
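  A sketch of the scoring rule only. The three probability functions here are placeholder stubs; in BWI, START and END come from boosted boundary detectors and LEN from a training-set histogram.

    def best_extraction(n, start_prob, end_prob, len_prob, max_len=10):
        # Score every candidate span (i, j) by START(i) * END(j) * LEN(j - i)
        # and return the highest-scoring one.
        best, best_score = None, 0.0
        for i in range(n):
            for j in range(i + 1, min(i + max_len, n) + 1):
                score = start_prob(i) * end_prob(j) * len_prob(j - i)
                if score > best_score:
                    best, best_score = (i, j), score
        return best, best_score

    # Toy stubs: a boundary pair at positions 3 and 5, length-2 fields likely.
    span, score = best_extraction(
        8,
        start_prob=lambda i: 0.9 if i == 3 else 0.1,
        end_prob=lambda j: 0.8 if j == 5 else 0.1,
        len_prob=lambda k: 0.7 if k == 2 else 0.1)
    print(span, score)   # (3, 5) 0.504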

  45. Problems with Sliding Windows and Boundary Finders
  • Decisions in neighboring parts of the input are made independently of each other:
    • A sliding window may predict a "seminar end time" before the "seminar start time".
    • It is possible for two overlapping windows to both be above threshold.
    • In a boundary-finding system, left boundaries are laid down independently of right boundaries, and their pairing happens as a separate step.

  46. Hidden Markov Models

  47. Citation Parsing
  • Fahlman, Scott & Lebiere, Christian (1989). The cascade-correlation learning architecture. Advances in Neural Information Processing Systems, pp. 524-532.
  • Fahlman, S.E. and Lebiere, C., "The Cascade Correlation Learning Architecture," Neural Information Processing Systems, pp. 524-532, 1990.
  • Fahlman, S. E. (1991) The recurrent cascade-correlation learning architecture. NIPS 3, 190-205.

  48. Can we do this with probabilistic generative models?
  • Could have classes for {author, title, journal, year, pages}
  • Could classify every word or sequence? Which sequences?
  • Something interesting in the sequence of fields that we'd like to capture:
    • Authors come first
    • Title comes before journal
    • Page numbers come near the end

  49. Hidden Markov Models: The Representation
  • A document is a sequence of words
  • Each word is tagged by its class (the color coding of the original slide is lost here; field boundaries are marked with "|" following the citation on slide 47):
    fahlman s e and lebiere c | the cascade correlation learning architecture | neural information processing systems | pp 524 532 | 1990

  50. HMM: Generative Model (1)
  [Figure: a state-transition diagram over the field states Author, Title, Journal, Year, Pages]
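  A toy sketch of this generative story: walk the state graph, emitting one word per state visit. The state names follow the slide's figure; the transition probabilities and emission words are made-up toy values, with emissions drawn uniformly for brevity.

    import random

    transitions = {   # P(next state | state), toy numbers
        "Author":  {"Author": 0.7, "Title": 0.3},
        "Title":   {"Title": 0.8, "Journal": 0.2},
        "Journal": {"Journal": 0.6, "Year": 0.4},
        "Year":    {"Pages": 1.0},
        "Pages":   {"Pages": 1.0},
    }
    emissions = {     # words each state can emit (drawn uniformly here)
        "Author":  ["fahlman", "s", "e", "and", "lebiere", "c"],
        "Title":   ["the", "cascade", "correlation", "learning", "architecture"],
        "Journal": ["neural", "information", "processing", "systems"],
        "Year":    ["1990"],
        "Pages":   ["pp", "524", "532"],
    }

    def sample_citation(length=12, state="Author"):
        tagged = []
        for _ in range(length):
            tagged.append((random.choice(emissions[state]), state))
            nxt = transitions[state]
            state = random.choices(list(nxt), weights=list(nxt.values()))[0]
        return tagged

    print(sample_citation())  # e.g. [('fahlman', 'Author'), ('c', 'Author'), ...]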
