
Natural Language Processing



  1. Natural Language Processing Lecture 1 Sudeshna Sarkar 26 July 2007

  2. Notes adapted from Martin's NLP slides

  3. Textbooks • Daniel Jurafsky and James H. Martin, "Speech and Language Processing", Prentice Hall, 2000. Other References: • James Allen, "Natural Language Understanding", Second edition, Pearson. • Christopher D. Manning and Hinrich Schütze, "Foundations of Statistical Natural Language Processing", The MIT Press, 1999.

  4. Final Project • This will be a research-oriented project. The goal is to have a paper suitable for a conference submission. • These will preferably be done in groups.

  5. Natural Language Processing • What is it? • We’re going to study what goes into getting computers to perform useful and interesting tasks involving human languages. • We will be secondarily concerned with the insights that such computational work gives us into human processing of language.

  6. Why Should You Care? Two trends • An enormous amount of knowledge is now available in machine readable form as natural language text • Conversational agents are becoming an important form of human-computer communication

  7. Major Topics • Words • Syntax • Meaning • Dialog and Discourse • Applications

  8. Applications • First, what makes an application a language processing application (as opposed to any other piece of software)? • An application that requires the use of knowledge about human languages • Example: Is Unix wc (word count) a language processing application?

  9. Applications • Word count? • When it counts words: Yes • To count words you need to know what a word is. That’s knowledge of language. • When it counts lines and bytes: No • Lines and bytes are computer artifacts, not linguistic entities
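
  To make the point concrete, here is a minimal sketch (not from the original slides) contrasting counting bytes and lines with counting words in Python; even a "trivial" word count has to commit to a tokenization rule, which is where knowledge of language sneaks in.

    # Minimal sketch: counting bytes/lines needs no linguistic knowledge,
    # but counting "words" forces a decision about what a word is.
    import re

    text = "Open the pod bay doors, HAL. I'm afraid I can't do that."

    n_bytes = len(text.encode("utf-8"))             # a computer artifact
    n_lines = text.count("\n") + 1                  # a computer artifact
    naive_words = text.split()                      # whitespace tokens: "doors," counts as one word
    better_words = re.findall(r"[A-Za-z']+", text)  # one (still imperfect) notion of a word

    print(n_bytes, n_lines, len(naive_words), len(better_words))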

  10. Big Applications • Question answering • Conversational agents • Summarization • Machine translation

  11. Big Applications • These kinds of applications require a tremendous amount of knowledge of language. • Consider the following interaction with HAL the computer from 2001: A Space Odyssey

  12. HAL • Dave: Open the pod bay doors, Hal. • HAL: I’m sorry Dave, I’m afraid I can’t do that.

  13. What’s needed? • Speech recognition and synthesis • Knowledge of the English words involved • What they mean • How they combine (bay vs. pod bay) • How groups of words clump • What the clumps mean

  14. What’s needed? • Dialog • It is polite to respond, even if you’re planning to kill someone. • It is polite to pretend to want to be cooperative (“I’m afraid I can’t…”)

  15. Real Example • What is the Fed’s current position on interest rates? • What or who is the “Fed”? • What does it mean for it to have a position? • How does “current” modify that?

  16. Caveat • NLP has an AI aspect to it. • We’re often dealing with ill-defined problems • We don’t often come up with perfect solutions/algorithms • We can’t let either of those facts get in our way

  17. Preparation • Basic algorithm and data structure analysis • Ability to program • Some exposure to logic • Exposure to basic concepts in probability • Familiarity with linguistics, psychology, and philosophy • Ability to write well in English

  18. Topics: Linguistics • Word-level processing • Syntactic processing • Lexical and compositional semantics • Discourse and dialog processing

  19. Topics: Techniques • Finite-state methods • Context-free methods • Augmented grammars • Unification • Logic • Probabilistic versions • Supervised machine learning

  20. Topics: Applications • Small: Spelling correction • Medium: Word-sense disambiguation, Named entity recognition, Information retrieval • Large: Question answering, Conversational agents, Machine translation

  21. Commercial World • Lots of exciting stuff going on… • Some samples… • Machine translation • Question answering • Buzz analysis

  22. Google/Arabic

  23. Google/Arabic Translation

  24. Web Q/A

  25. Summarization • Current web-based Q/A is limited to returning simple fact-like (factoid) answers (names, dates, places, etc). • Multi-document summarization can be used to address more complex kinds of questions. Circa 2002: What’s going on with the Hubble?

  26. NewsBlaster Example The U.S. orbiter Columbia has touched down at the Kennedy Space Center after an 11-day mission to upgrade the Hubble observatory. The astronauts on Columbia gave the space telescope new solar wings, a better central power unit and the most advanced optical camera. The astronauts added an experimental refrigeration system that will revive a disabled infrared camera. ''Unbelievable that we got everything we set out to do accomplished,'' shuttle commander Scott Altman said. Hubble is scheduled for one more servicing mission in 2004.

  27. Weblog Analytics • Text mining weblogs, discussion forums, user groups, and other forms of user-generated media. • Product marketing information • Political opinion tracking • Social network analysis • Buzz analysis (what’s hot, what topics are people talking about right now).

  28. Web Analytics

  29. Umbria

  30. Forms of Natural Language • The input/output of an NLP system can be: • written text: newspaper articles, letters, manuals, prose, … • speech: read speech (radio, TV, dictations), conversational speech, commands, … • To process written text, we need: • lexical, • syntactic, • semantic knowledge about the language, • discourse information, • real-world knowledge • To process spoken language, we additionally need: • speech recognition • speech synthesis

  31. Components of NLP • Natural Language Understanding • Mapping the given input in natural language into a useful representation. • Different levels of analysis are required: morphological analysis, syntactic analysis, semantic analysis, discourse analysis, … • Natural Language Generation • Producing output in natural language from some internal representation. • Different levels of synthesis are required: • deep planning (what to say), • syntactic generation • Which is harder?

  32. Natural language understanding • Uncovering the mappings between the linear sequence of words (or phonemes) and the meaning that it encodes. • Representing this meaning in a useful (usually symbolic) representation. • By definition, heavily dependent on the target task: • Words and structures mean different things in different contexts. • The required target representation is different for different tasks. Why is NLU hard? • The mapping between words, their linguistic structure and the meaning that they encode is extremely complex and difficult to model and decompose. • Natural language is very ambiguous. • The goal of understanding is itself task dependent and very complex.

  33. Why is NL Understanding hard? • Natural language is extremely rich in form and structure, and very ambiguous. • How to represent meaning? • Which structures map to which meaning structures? • Ambiguity: one input can mean many different things. • Lexical (word-level) ambiguity -- different meanings of words • Syntactic ambiguity -- different ways to parse the sentence • Interpreting partial information -- how to interpret pronouns • Contextual information -- the context of the sentence may affect its meaning. • Many inputs can mean the same thing. • Interaction among components of the input. • Noisy input (e.g. speech)

  34. Knowledge of Language • Phonology – concerns how words are related to the sounds that realize them. • Morphology – concerns how words are constructed from more basic meaning units called morphemes. A morpheme is the primitive unit of meaning in a language. • Syntax – concerns how words can be put together to form correct sentences, and determines what structural role each word plays in the sentence and what phrases are subparts of other phrases. • Semantics – concerns what words mean and how these meanings combine in sentences to form sentence meanings. The study of context-independent meaning.

  35. Knowledge of Language • Pragmatics – concerns how sentences are used in different situations and how use affects the interpretation of the sentence. • Discourse – concerns how the immediately preceding sentences affect the interpretation of the next sentence. For example, interpreting pronouns and interpreting the temporal aspects of the information. • World Knowledge – includes general knowledge about the world, and what each language user must know about the other’s beliefs and goals.

  36. Ambiguity • “At last, a computer that understands you like your mother.” -- 1985 McDonnell-Douglas Ad • Different interpretations: • The computer understands you as well as your mother understands you. • The computer understands that you like your mother. • The computer understands you as well as it understands your mother. • Speech: “… a computer that understands your lie cured mother …”

  37. Why is NLP difficult? • Because Natural Language is highly ambiguous. • Syntactic ambiguity • “The president spoke to the nation about the problem of drug use in the schools from one coast to the other.” has 720 parses. • Ex: • “to the other” can attach to any of the previous NPs (ex. “the problem”), or the head verb → 6 places • “from one coast” has 5 places to attach • …
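
  A hedged back-of-the-envelope check, not from the slides: the slides give 6 attachment sites for “to the other” and 5 for “from one coast”; if we assume the remaining modifiers contribute 4, 3, and 2 attachment choices, the attachment ambiguities alone multiply out to the quoted 720.

    # Illustrative arithmetic only: the 6 and 5 come from the slide;
    # the 4, 3, 2 are assumed counts for the remaining modifiers.
    import math
    attachment_choices = [6, 5, 4, 3, 2]
    print(math.prod(attachment_choices))   # 720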

  38. Why is NLP difficult? • Word category ambiguity • book --> verb or noun? • Word sense ambiguity • bank --> financial institution? building? or river side? • Words can mean more than the sum of their parts • make up a story • Fictitious worlds • People on Mars can fly. • Defining scope • People like ice-cream. • Does this mean that all (or some?) people like ice cream? • Language is changing and evolving • I’ll email you my answer. • This new S.U.V. has a compartment for your mobile phone. • Googling, …

  39. Resolve Ambiguities • We will introduce models and algorithms to resolve ambiguities at different levels. • part-of-speech tagging -- Deciding whether duck is verb or noun. • word-sense disambiguation -- Deciding whether make is create or cook. • lexical disambiguation -- Resolution of part-of-speech and word-sense ambiguities are two important kinds of lexical disambiguation. • syntactic ambiguity -- her duck is an example of syntactic ambiguity, and can be addressed by probabilistic parsing.
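
  As a taste of what “deciding whether duck is a verb or a noun” looks like computationally, here is a minimal most-frequent-tag baseline in plain Python. It is only a sketch: the counts are invented for illustration, not real corpus statistics, and the taggers covered later in the course use context (e.g. HMMs) rather than a per-word lookup.

    # Minimal sketch of lexical disambiguation: pick each word's most frequent tag.
    # The counts below are made up for illustration; a real tagger estimates them
    # from a tagged corpus and also uses the surrounding context.
    from collections import defaultdict

    counts = {                       # (word, tag) -> count, as if from a tagged corpus
        ("duck", "NOUN"): 30, ("duck", "VERB"): 10,
        ("her", "PRON"): 50, ("her", "DET"): 40,
        ("made", "VERB"): 60,
        ("I", "PRON"): 80,
    }

    best_tag = defaultdict(lambda: "UNK")
    best_count = defaultdict(int)
    for (word, tag), c in counts.items():
        if c > best_count[word]:
            best_count[word], best_tag[word] = c, tag

    sentence = ["I", "made", "her", "duck"]
    print([(w, best_tag[w]) for w in sentence])
    # A unigram baseline like this cannot tell the two readings of "her duck" apart.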

  40. Resolve Ambiguities (cont.) • “I made her duck” has two parse trees: • S → NP VP, VP → V NP NP, with “her” and “duck” as two separate NPs. • S → NP VP, VP → V NP, with NP → DET N spanning “her duck”.
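
  The two trees can be reproduced with a toy grammar. The sketch below (assuming NLTK is installed) uses a small hand-written CFG, not a grammar from the course, purely to show that a chart parser really does return both structures.

    # Toy CFG reproducing the two parses of "I made her duck" (requires nltk).
    import nltk

    grammar = nltk.CFG.fromstring("""
        S   -> NP VP
        VP  -> V NP NP | V NP
        NP  -> Det N | Pro | N
        Pro -> 'I' | 'her'
        Det -> 'her'
        N   -> 'duck'
        V   -> 'made'
    """)

    parser = nltk.ChartParser(grammar)
    for tree in parser.parse(["I", "made", "her", "duck"]):
        print(tree)   # one tree with VP -> V NP NP, one with NP -> Det N ("her duck")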

  41. Dealing with Ambiguity • Three approaches: • Tightly coupled interaction among processing levels; knowledge from other levels can help decide among choices at ambiguous levels. • Pipeline processing that ignores ambiguity as it occurs and hopes that other levels can eliminate incorrect structures. • Syntax proposes/semantics disposes approach • Probabilistic approaches based on making the most likely choices

  42. Models to Represent Linguistic Knowledge • Different formalisms (models) are used to represent the required linguistic knowledge. • State Machines -- FSAs, HMMs, ATNs, RTNs • Formal Rule Systems -- Context Free Grammars, Unification Grammars, Probabilistic CFGs. • Logic-based Formalisms -- first order predicate logic, some higher order logic. • Models of Uncertainty -- Bayesian probability theory.
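 
  As a first glimpse of the state-machine formalism, here is a minimal finite-state acceptor in plain Python. It is a hand-written toy (the lexicon and states are invented for illustration) that recognizes a tiny fragment of English noun inflection such as “duck” and “ducks”.

    # Minimal finite-state acceptor sketch: NOUN-STEM optionally followed by plural "s".
    TRANSITIONS = {
        (0, "cat"): 1, (0, "dog"): 1, (0, "duck"): 1,   # stem arcs from the start state
        (1, "s"): 2,                                    # plural suffix arc
    }
    FINAL_STATES = {1, 2}

    def accepts(morphemes):
        state = 0
        for m in morphemes:
            state = TRANSITIONS.get((state, m))
            if state is None:          # no arc: reject
                return False
        return state in FINAL_STATES

    print(accepts(["duck"]), accepts(["duck", "s"]), accepts(["s", "duck"]))
    # True True False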

  43. Algorithms to Manipulate Linguistic Knowledge • We will use algorithms to manipulate the models of linguistic knowledge to produce the desired behavior. • Most of the algorithms we will study are transducers and parsers. • These algorithms construct some structure based on their input. • Since language is ambiguous at all levels, these algorithms are never simple processes. • Most of the algorithms we will use fall into the following categories: • state space search • dynamic programming
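
  Dynamic programming shows up very early in the course, for example in minimum edit distance for spelling correction. The sketch below is a standard Levenshtein-distance implementation, included only as a preview of the style of algorithm meant by “dynamic programming”.

    # Standard dynamic-programming sketch: minimum edit distance (Levenshtein).
    def edit_distance(source: str, target: str) -> int:
        n, m = len(source), len(target)
        # dist[i][j] = cost of turning source[:i] into target[:j]
        dist = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(n + 1):
            dist[i][0] = i                              # deletions
        for j in range(m + 1):
            dist[0][j] = j                              # insertions
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                sub = 0 if source[i - 1] == target[j - 1] else 1
                dist[i][j] = min(dist[i - 1][j] + 1,        # delete
                                 dist[i][j - 1] + 1,        # insert
                                 dist[i - 1][j - 1] + sub)  # substitute / copy
        return dist[n][m]

    print(edit_distance("recieve", "receive"))
    # 2: the 'ie'/'ei' swap costs two single-character edits under plain Levenshtein.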

  44. Language and Intelligence • The Turing Test: a human judge converses with a computer and a human. • The human judge asks tele-typed questions to the computer and the human. • The computer’s job is to act like a human. • The human’s job is to convince the judge that he is not a machine. • The computer is judged “intelligent” if it can fool the judge. • Judgment of intelligence is linked to appropriate answers to questions from the system.

  45. NLP - an Inter-disciplinary Field • NLP borrows techniques and insights from several disciplines. • Linguistics: How do words form phrases and sentences? What constrains the possible meanings of a sentence? • Computational Linguistics: How is the structure of sentences identified? How can knowledge and reasoning be modeled? • Computer Science: Algorithms for automata and parsers. • Engineering: Stochastic techniques for ambiguity resolution. • Psychology: What linguistic constructions are easy or difficult for people to learn to use? • Philosophy: What is meaning, and how do words and sentences acquire it?

  46. Some Buzz-Words • NLP – Natural Language Processing • CL – Computational Linguistics • SP – Speech Processing • HLT – Human Language Technology • NLE – Natural Language Engineering • SNLP – Statistical Natural Language Processing • Other Areas: • Speech Generation, Text Generation, Speech Understanding, Information Retrieval, • Dialogue Processing, Inference, Spelling Correction, Grammar Correction, • Text Summarization, Text Categorization,

  47. Some NLP Applications • Machine Translation – Translation between two natural languages. • Babel Fish translation system, Systran • Information Retrieval – Web search (uni-lingual or multi-lingual). • Query Answering/Dialogue – Natural language interface with a database system, or a dialogue system. • Report Generation – Generation of reports such as weather reports. • Other Applications – • Grammar Checking, Spell Checking, Spell Correction

  48. The Big Picture • Source language speech signal → Speech recognition → Source text → Analysis • Generation → Target text → Speech synthesis → Target language speech signal

  49. The Reductionist Approach • Source Language Analysis: Text Normalization → Morphological Analysis → POS Tagging → Parsing → Semantic Analysis → Discourse Analysis • Target Language Generation: Discourse Planning → Lexical Choice → Role Ordering → Phrase Generation → Morphological Synthesis → Text Rendering

  50. Natural Language Understanding • Words → Morphological Analysis → Morphologically analyzed words (another step: POS tagging) → Syntactic Analysis → Syntactic structure → Semantic Analysis → Context-independent meaning representation → Discourse Processing → Final meaning representation
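
  A hedged sketch of how the pipeline on this slide might be wired together in code: every stage below is a placeholder stub (the function names and the string representations are invented for illustration), the point being only that each stage consumes the previous stage’s output.

    # Placeholder pipeline sketch mirroring the slide; every stage is a stub.
    def morphological_analysis(words):
        return [w.lower() for w in words]                 # stand-in for real morphology

    def syntactic_analysis(analyzed_words):
        return ("S", analyzed_words)                      # stand-in for a parse tree

    def semantic_analysis(syntactic_structure):
        return {"predicate": "stub", "args": syntactic_structure[1]}   # stand-in meaning

    def discourse_processing(meaning):
        return {**meaning, "resolved_pronouns": {}}       # stand-in discourse step

    def understand(words):
        analyzed = morphological_analysis(words)          # words -> analyzed words
        tree = syntactic_analysis(analyzed)               # -> syntactic structure
        meaning = semantic_analysis(tree)                 # -> context-independent meaning
        return discourse_processing(meaning)              # -> final meaning representation

    print(understand(["I", "made", "her", "duck"]))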
