
Administration


Presentation Transcript


1. Administration
• Introduction/Signup sheet
• Course web site: http://www.cs.princeton.edu/courses/archive/spring09/cos401/
• Course location and time: Thursday, 1:30pm – 4:20pm, Robertson Hall 023
• TA: Juan Carlos Niebles
  • Office: 215 Computer Science Bldg.
  • Phone: (609) 258-8241
  • Email: jniebles [at] princeton
  • Office hour: TBD or by appointment
• Suggested Reading List:
  • (NSW) Readings in Machine Translation, S. Nirenberg, H. Somers and Y. Wilks, MIT Press, 2002
  • (AT) Translation Engines: Techniques for Machine Translation, Arturo Trujillo, Springer, 1999
  • (JM) Speech and Language Processing, Jurafsky and Martin, Prentice Hall
  • (HS) An Introduction to Machine Translation, W. John Hutchins and Harold L. Somers, London: Academic Press, 1992
• Assessment:
  • Class participation and attendance: 15%
  • Homework assignments: 20%
  • Midterm exam: 30%
  • Final exam/Term paper: 35%

2. Machine Translation
Srinivas Bangalore, AT&T Research, Florham Park, NJ 07932

3. The funnier side of translation…
• In a Belgrade hotel elevator: “The lift is being fixed for the next day. During that time we regret that you will be unbearable”
• In a Paris hotel lobby: “Please leave your values at the front desk”
• On the menu of a Swiss restaurant: “Our wines leave you nothing to hope for”
• Outside a Hong Kong tailor shop: “Ladies may have a fit upstairs”
• In an advertisement by a Hong Kong dentist: “Teeth extracted by the latest Methodists”
• In a Norwegian cocktail lounge: “Ladies are requested not to have children in the bar”
• In a pet shop in Malaysia: “For hygienic purposes, do not feed your hand to the dog”
• Machine Translation: “The spirit is willing but the flesh is weak” → Russian → “The vodka is good but the meat is rotten”
Source: the web

4. Outline
• History of Machine Translation
• Machine Translation Paradigms
• Machine Translation Evaluation
• Applications of Machine Translation

5. Early days of Machine Translation
• Success in cryptography (code-breaking) during the war
  • Source Text → Encoded Source Text → Transmit Text
  • Receive Text → Decode Text → Target Text
• Ciphers: algorithms to encode and decode
  • Plain text → cipher text → decoded cipher text
  • cat → dog; fog → bat; ?? → bog
• Warren Weaver (1947): “When I look at an article in Russian, I say: ‘This is really written in English, but it has been coded in some strange symbols. I will now proceed to decode.’”
• Ciphers are created to be hard to break, but are usually unambiguous
• Natural languages are not as simple!
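The slide’s cipher puzzle can be worked out mechanically. Below is a small sketch (the function names and structure are my own, not from the lecture) that learns a letter-substitution table from the cat → dog and fog → bat examples; since a substitution cipher is unambiguous, inverting the table answers the “?? → bog” puzzle:

```python
def learn_cipher(pairs):
    """Learn a letter-substitution table from (plaintext, ciphertext) pairs."""
    mapping = {}
    for plain, cipher in pairs:
        for p, c in zip(plain, cipher):
            if mapping.setdefault(p, c) != c:
                raise ValueError(f"inconsistent mapping for letter {p!r}")
    return mapping

def apply_table(table, text):
    # '?' marks letters the example pairs never covered
    return "".join(table.get(ch, "?") for ch in text)

encode = learn_cipher([("cat", "dog"), ("fog", "bat")])
decode = {c: p for p, c in encode.items()}  # unambiguous, hence invertible

apply_table(encode, "cat")  # -> "dog"
apply_table(decode, "bog")  # -> "fat", the plaintext behind the slide's "??"
```

The contrast with natural language is exactly the slide’s point: here every letter has one fixed image, whereas a word like “bank” has no single decoding.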

6. Complexity of Machine Translation
• Computer program compilation is translation
  • Languages are designed to be unambiguous and formal
  • Source language and target language
• Natural languages are ambiguous
  • Lexical (e.g. bank, lead)
  • Structural (e.g. “John saw a man with a telescope”; “flying planes can be dangerous”)
• For Machine Translation, ambiguity is compounded!
  • Mapping between words of the two languages is not unique
  • Lexical gaps: languages have different mappings from concepts to words
  • Word order differences
    • English: Subject-Verb-Object
    • Japanese, Hindi: Subject-Object-Verb

7. Issues in Machine Translation
• Orthography
  • Writing from left-to-right vs. right-to-left
  • Character sets (alphabetic, logograms, pictograms)
  • Segmentation into word/word-like units
• Morphology
• Lexical: word senses
  • bank → “river bank”, “financial institution”
• Syntactic: word order
  • Subject-verb-object → subject-object-verb
• Semantic: meaning
  • “ate pasta with a spoon”, “ate pasta with marinara”, “ate pasta with John”
• Pragmatic: world knowledge
  • “Can you pass me the salt?”
• Social: conversational norms
  • Pronoun usage depends on the conversational partner
• Cultural: idioms and phrases
  • “out of the ballpark”, “came from leftfield”
• Contextual
• In addition, for Speech Translation:
  • Prosody: JOHN eats bananas; John EATS bananas; John eats BANANAS
  • Pronunciation differences
  • Speech recognition errors
• In a multilingual environment:
  • Code switching: use of the linguistic apparatus of one language to express ideas in another language

8. Machine Translation: Why and what’s it good for?
• Understanding people across linguistic barriers
  • Socio-political
  • Commercial: globalization
  • Limited availability of human expertise
• What is it good for?
  • Tasks with limited vocabulary and syntax (technical manuals)
  • Rough translations for web pages, emails
  • Applications that use translation as one of the components
• What is it not good for?
  • Hard and important domains (literature, legal, medical)
• Machine Translation need not be fully automated!
  • Human-assisted machine translation
  • Machine-assisted human translation
  • Machine Translation as a productivity enhancement tool

9. Machine Translation: Past and Present
• 1947-1954: MT as code breaking; IBM-Georgetown Univ. demonstration
• 1954-1966: Large bilingual dictionaries; linguistic and formal-grammar-motivated syntactic reordering; lots of funding, little progress
• 1966: ALPAC report: “there is no immediate or predictable prospect of useful fully automatic machine translation”
• 1966-1980s: Translation continued in Canada, France and Germany; beyond English-Russian translation; Meteo for translating weather reports; Systran in 1970
• 1980-1990: Emphasis on ‘indirect’ translation: semantic and knowledge-based; advent of microcomputers; translation companies: Systran, Logos, GlobalLink; domain-specific machine-aided translation systems
• 1990-present: Corpus-based methods: IBM’s Candide, Japanese ‘example-based’ translation; speech-to-speech translation: Verbmobil, Janus; ‘pure’ to practical MT for embedded applications: cross-lingual IR

10. MT Approaches: Different levels of meaning transfer
[Diagram: depth of analysis increases from Direct MT (source words mapped straight to target words), through Transfer-based MT (parsing to syntactic structure, syntactic transfer, syntactic generation), to Interlingua (semantic interpretation on the source side, semantic generation on the target side)]

11. Direct Machine Translation
• Words are replaced using a dictionary
• Some amount of morphological processing
• Word reordering is limited
• Quality depends on the size of the dictionary and the closeness of the languages
• Spanish: ajá quiero usar mi tarjeta de crédito
  English: yeah I wanna use my credit card
  Alignment: 1 3 4 5 7 0 6
• English: I need to make a collect call
  Japanese: 私は コレクト コールを かける 必要があります
  Alignment: 1 5 0 3 0 2 4
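A direct system of this kind is easy to sketch. The toy dictionary below is my own construction for the Spanish example above; note how pure word-for-word substitution outputs “tarjeta de crédito” in the wrong order (“card credit”), which is exactly the reordering limitation the slide mentions:

```python
# toy Spanish -> English dictionary (entries are illustrative assumptions)
LEXICON = {
    "ajá": "yeah", "quiero": "I wanna", "usar": "use",
    "mi": "my", "tarjeta": "card", "de": "", "crédito": "credit",
}

def direct_translate(sentence):
    """Word-for-word substitution with no reordering (direct MT)."""
    out = [LEXICON.get(word, word) for word in sentence.split()]
    return " ".join(piece for piece in out if piece)  # drop untranslated "de"

direct_translate("ajá quiero usar mi tarjeta de crédito")
# -> "yeah I wanna use my card credit" (should be "... credit card")
```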

12. Example-based MT
[Diagram: Source → MATCHING (analysis) → ALIGNMENT (transfer) → RECOMBINATION (generation) → Target; an exact match amounts to direct translation]
• Translation-by-analogy requires:
  • A collection of source/target text pairs
  • A matching metric
  • A word- or phrase-level alignment
  • A method for recombination
• ATR EBMT System (E. Sumita, H. Iida, 1991); CMU Pangloss EBMT (R. Brown, 1996)

13. Example run of EBMT (English-Japanese)
• Examples in the corpus:
  • He buys a notebook → Kare wa noto o kau
  • I read a book on international politics → Watashi wa kokusai seiji nitsuite kakareta hon o yomu
• Translation input: He buys a book on international politics
• Translation output: Kare wa kokusai seiji nitsuite kakareta hon o kau
  (the “He buys …” frame from the first example is recombined with the “book on international politics” fragment from the second)
• Challenge: finding a good matching metric
  • He bought a notebook
  • A book was bought
  • I read a book on world politics
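One common choice of matching metric, used here purely as an illustration rather than as the metric of any particular EBMT system, is edit distance computed over words: the corpus example needing the fewest word insertions, deletions, and substitutions to reach the input is the best match.

```python
def word_edit_distance(a, b):
    """Levenshtein distance computed over words instead of characters."""
    a, b = a.split(), b.split()
    dp = list(range(len(b) + 1))           # distances against the empty prefix
    for i, wa in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, wb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(
                dp[j] + 1,                  # delete wa
                dp[j - 1] + 1,              # insert wb
                prev + (wa != wb),          # substitute (free if words match)
            )
    return dp[-1]

corpus = ["He buys a notebook",
          "I read a book on international politics"]
query = "He buys a book on international politics"
best = min(corpus, key=lambda ex: word_edit_distance(query, ex))
# best -> "I read a book on international politics" (distance 2 vs. 4)
```

Note that the closest string overall is not necessarily the best example to recombine from, which is why the slide calls finding a good metric a challenge.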

14. NLP Pipeline: Beads on a String
Tokenization → Sentence Segmentation → Part-of-speech Tagging → Noun/Verb Chunking → Named Entity Detection → Syntactic Parsing → Word Sense Disambiguation → Co-reference Resolution → Semantic Role Labeling
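The “beads on a string” idea is essentially function composition: each stage reads the annotations produced by earlier stages and adds its own. A minimal sketch, where the stage implementations are placeholders of my own and not real components:

```python
def tokenize(doc):
    doc["tokens"] = doc["text"].split()          # whitespace tokenizer (toy)
    return doc

def pos_tag(doc):
    # placeholder tagger: capitalized words become NNP, everything else NN;
    # a real tagger would of course use context
    doc["pos"] = [(t, "NNP" if t[0].isupper() else "NN")
                  for t in doc["tokens"]]
    return doc

PIPELINE = [tokenize, pos_tag]   # later beads: chunking, NER, parsing, ...

def annotate(text):
    doc = {"text": text}
    for stage in PIPELINE:        # each bead consumes and enriches the doc
        doc = stage(doc)
    return doc

annotate("He will travel to Florida")["pos"][0]   # -> ("He", "NNP")
```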

15. NLP Pipeline: Sentence Segmentation
U.S. President lives in Washington D.C. He will travel to Florida this week.
→ [U.S. President lives in Washington D.C.] [He will travel to Florida this week.]
(note that the periods inside “U.S.” and “D.C.” are not sentence boundaries)

16. NLP Pipeline: Part-of-speech Tagging
He will travel to Florida this week .
→ He/PRP will/MD travel/VB to/TO Florida/NNP this/DT week/NN ./.

17. NLP Pipeline: Named Entity Detection
President Bush will travel to Florida on February 20 2007 to meet with the CEO of AT&T
→ entities detected: President Bush ($PERSON), Florida ($PLACE), February 20 2007 ($DATE), CEO ($JOB), AT&T ($ORG)

18. NLP Pipeline: Noun/Verb Chunking
President Bush will travel to Florida on February 20 2007 to meet with the CEO of AT&T
(the sentence is grouped into noun and verb chunks, e.g. [President Bush], [will travel])

19. NLP Pipeline: Syntactic Parsing
$PERSON will travel to $PLACE on $DATE to meet with the $JOB of $ORG
[Parse tree: “will travel” heads the sentence, with $PERSON as its subject and the phrases “to $PLACE”, “on $DATE”, and “to meet with the $JOB of the $ORG” attached below it]

20. NLP Pipeline: Semantic Role Labeling
[The parse of “$PERSON will travel to $PLACE on $DATE …” is annotated with semantic roles: $PERSON is ARG0 (the traveler), “to $PLACE” is ARGM-loc, and “on $DATE” is ARGM-tmp]

21. NLP Pipeline: Word Sense Disambiguation
• The man went to the bank to get some money (“bank” = financial institution)
• The man went to the bank to get some flowers (“bank” = river bank)
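A classic way to pick the sense, in the spirit of the Lesk algorithm, is to score each candidate sense by its word overlap with the sentence. The sense signatures below are toy assumptions of mine, not a real sense inventory:

```python
SENSES = {  # toy sense signatures for "bank" (illustrative assumptions)
    "financial institution": {"money", "loan", "deposit", "account", "cash"},
    "river bank":            {"flowers", "water", "shore", "grass", "fishing"},
}

def disambiguate(senses, sentence):
    """Lesk-style WSD: pick the sense whose signature overlaps the context most."""
    context = set(sentence.lower().split())
    return max(senses, key=lambda s: len(senses[s] & context))

disambiguate(SENSES, "The man went to the bank to get some money")
# -> "financial institution"
```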

22. NLP Pipeline: Co-reference Resolution
The U.S. President lives in Washington D.C. He will return to the capital this week.
→ “He” refers to “the U.S. President”; “the capital” refers to “Washington D.C.”

23. Syntactic Transfer-based Machine Translation
• Direct and example-based approaches are two ends of a spectrum
  • Recombination of fragments gives better coverage
• What if the matching/transfer is done at the syntactic parse level?
• Three steps:
  • Parse: syntactic parse of the source language sentence (a hierarchical representation of the sentence)
  • Transfer: rules to transform the source parse tree into a target parse tree (e.g. Subject-Verb-Object → Subject-Object-Verb)
  • Generation: regenerate the target language sentence from the parse tree, applying the morphology of the target language
• Tree structure provides better matching and longer-distance transformations than is possible in string-based EBMT
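The Subject-Verb-Object → Subject-Object-Verb transfer rule in step 2 can be sketched as a tree rewrite. The tree encoding and node labels below are my own simplification:

```python
# a parse tree is either a leaf word (str) or a (label, child, child, ...) tuple
def svo_to_sov(tree):
    """Recursively rewrite S -> NP V NP clauses into S -> NP NP V order."""
    if isinstance(tree, str):
        return tree
    label, *children = tree
    children = [svo_to_sov(c) for c in children]
    if (label == "S" and len(children) == 3
            and all(isinstance(c, tuple) for c in children)
            and [c[0] for c in children] == ["NP", "V", "NP"]):
        subj, verb, obj = children
        children = [subj, obj, verb]    # the transfer rule itself
    return (label, *children)

svo = ("S", ("NP", "he"), ("V", "reads"), ("NP", "books"))
svo_to_sov(svo)
# -> ('S', ('NP', 'he'), ('NP', 'books'), ('V', 'reads'))
```

Because the rule fires on tree nodes rather than on string positions, the same rewrite moves a verb over an arbitrarily long object phrase, which is the longer-distance transformation advantage the slide mentions.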

24. Examples of SynTran-MT
[Aligned parse trees for Spanish “ajá quiero usar mi tarjeta de crédito” and English “yeah I wanna use my credit card”]
• Mostly parallel parse structures
• Might have to insert words: pronouns, morphological particles

25. Example of SynTran MT (2)
[Aligned parse trees for English “I need to make a collect call” and Japanese “私は (I) コレクト (collect) コールを (call) かける (make) 必要があります (need)”]
• Pros:
  • Allows for structure transfer
  • Re-orderings are typically restricted to parent-child nodes
• Cons:
  • Transfer rules are needed for each language pair (N² sets of rules)
  • Hard to reuse rules when one of the languages is changed

26. Interlingua-based Machine Translation
• Syntactic transfer-based MT couples the syntax of the two languages
• What if we abstract away the syntax? All that remains is meaning
• Meaning is the same across languages
• Simplicity: only N components needed to translate among N languages
  • Each language needs only an analyzer into the interlingual representation and a generator out of it (e.g. English, Spanish and Japanese analyzers and generators around a single interlingual representation)
• Two “small” problems:
  • What is meaning?
  • How do we represent meaning?
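The economics behind the “only N components” claim can be made concrete. For N languages, pairwise transfer needs a rule set for every ordered language pair, while an interlingua needs one analyzer/generator component per language; the counting below is the standard argument, not code from the lecture:

```python
def transfer_rule_sets(n):
    # one directed transfer rule set per ordered language pair: N*(N-1), ~N^2
    return n * (n - 1)

def interlingua_components(n):
    # one analyzer + generator component per language: the slide's "only N"
    return n

for n in (3, 10):
    print(n, transfer_rule_sets(n), interlingua_components(n))
# 3 languages:  6 transfer rule sets vs 3 interlingua components
# 10 languages: 90 transfer rule sets vs 10 interlingua components
```

Adding an eleventh language costs one new component under an interlingua, but twenty new rule sets under pairwise transfer.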

27. Example of Interlingua Machine Translation
[The English parse of “I need to make a collect call” and the Japanese parse of “私は (I) コレクト (collect) コールを (call) かける (make) 必要があります (need)” both map to a single interlingua representation]

28. Probabilistic Direct Machine Translation
• Starting in the early 1990s: full circle back to the code-breaking paradigm of machine translation, with a probabilistic twist
• What is it?
  • To translate from English to Japanese, assume the English text started out as a Japanese text
  • but went through a noisy channel which changed it into English
  • The goal is to recover the best (most probable) Japanese text
• J* = argmax_J P(J|E) = argmax_J P(E|J) · P(J)
  • P(E|J): translation faithfulness; P(J): translation fluency
• The approach became popular due to:
  • Availability of large amounts of bilingual (parallel) data
  • Large-memory, high-speed computers
• Example: 私は コレクト コールを かける 必要があります → noisy channel P(E|J) → I need to make a collect call
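The decision rule J* = argmax_J P(E|J) · P(J) is easy to demonstrate over a toy candidate set. All probabilities below are invented for illustration; note how the language model P(J) breaks a tie that the faithfulness model P(E|J) alone cannot:

```python
def noisy_channel_decode(english, candidates, p_e_given_j, p_j):
    """J* = argmax_J P(E|J) * P(J): faithfulness times fluency."""
    return max(candidates, key=lambda j: p_e_given_j[(english, j)] * p_j[j])

e = "collect call"
candidates = ["コレクト コール", "コール コレクト"]
p_e_given_j = {  # faithfulness: both word orders explain the English equally
    (e, "コレクト コール"): 0.5,
    (e, "コール コレクト"): 0.5,
}
p_j = {  # fluency: the language model prefers the natural Japanese order
    "コレクト コール": 0.08,
    "コール コレクト": 0.01,
}
noisy_channel_decode(e, candidates, p_e_given_j, p_j)  # -> "コレクト コール"
```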

29. Probabilistic Direct Machine Translation
• Learn pattern mappings (words and sequences of words) between pairs of sentences in the two languages
  • Use the result of translation, not the process of translation
  • Infer a process that produces a similar result
• English: I need to make a collect call
  Japanese: 私は コレクト コールを かける 必要があります
  Alignment: 1 5 0 3 0 2 4
• Spanish: ajá quiero usar mi tarjeta de crédito
  English: yeah I wanna use my credit card
  Alignment: 1 3 4 5 7 0 6

  30. Applications of Machine Translation

  31. Applications of Machine Translation

  32. Multilingual Customer Care

  33. Making Travel Arrangements using Multilingual Chat

  34. Large Vocabulary Speech Recognition and Translation

  35. Large Vocabulary Speech Recognition and Translation

  36. Evaluation of Machine Translation

37. Machine Translation Evaluation
• What is a good translation?
  • A meaning-preserving and (socially, culturally, conversationally) context-appropriate rendering of the source language sentence
• Bilingual human annotators
  • Mark the output of a translation system on a 5-point scale
  • Expensive!
  • Too coarse to provide a feedback signal for improving the translation system
• Objective metrics: approximations to the real thing!
  • Lexical Accuracy (LA): bag of words
  • Translation Accuracy (TA): based on string alignment
• Application-driven evaluation
  • “How May I Help You?”: spoken dialog for call routing
  • Classification based on salient phrase detection
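The slide’s Lexical Accuracy is a bag-of-words score. One straightforward reading, which is an assumption since the lecture does not spell out the exact formula, counts how much of the reference translation’s word multiset the system output recovers, ignoring word order entirely:

```python
from collections import Counter

def lexical_accuracy(hypothesis, reference):
    """Bag-of-words overlap: fraction of reference words found in the
    hypothesis, ignoring word order (hence 'bag')."""
    hyp, ref = Counter(hypothesis.split()), Counter(reference.split())
    matched = sum((hyp & ref).values())    # multiset intersection
    return matched / sum(ref.values())

lexical_accuracy("I need make a call collect", "I need to make a collect call")
# -> 6/7: every reference word except "to" is matched, despite the scrambled order
```

The scrambled example scoring 6/7 shows why the slide calls such metrics approximations: a bag-of-words score cannot penalize bad word order, which is what the alignment-based Translation Accuracy is for.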

  38. Machine Translation Evaluation for call routing

39. Summary
• Fully automatic machine translation in its full complexity is a very hard task
• Pragmatic approaches to machine translation have been successful
  • Limited domain/vocabulary
  • Human-assisted machine translation
  • Machine-assisted human translation
• A range of applications for “rough” machine translation
• Machine translation will improve as we better understand how people communicate

40. Spoken Language Translation
[Pipeline diagram: ENGLISH SPEECH → feature extraction → acoustic segment feature values → recognition search → English word lattice (“please / book / the / this / flies / flight / three”) → machine translation → Chinese text 請預訂這班機 → phonetic analysis → pronunciation “qing3 yU4ding4 zhe4 ban1ji1” → audio synthesis → CHINESE SPEECH]
