
Events



Presentation Transcript


1. Events
• From KRR lecture
NLP

2. Ask Jeeves – a Q/A, IR example
Query: "What do you call a successful movie?" (expected answer: Blockbuster)
Retrieved results:
• Tips on Being a Successful Movie Vampire ... I shall call the police.
• Successful Casting Call & Shoot for "Clash of Empires" ... thank everyone for their participation in the making of yesterday's movie.
• Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague...
• VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer.

3. Ask Jeeves – filtering w/ POS tag
Query: "What do you call a successful movie?"
• Tips on Being a Successful Movie Vampire ... I shall call the police.
• Successful Casting Call & Shoot for "Clash of Empires" ... thank everyone for their participation in the making of yesterday's movie.
• Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague...
• VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer.

4. Filtering out "call the police"
Different senses have:
• different syntax
• different participants
call(you, movie, what) ≠ call(you, police)
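The contrast on this slide can be sketched as predicate-argument structures. The representation below is a minimal illustration of the idea, not PropBank's actual format; the role fillers are taken from the slide's two formulas.

```python
# Minimal sketch: the two senses of "call" as predicate-argument
# structures. A proposition is the predicate plus its ordered arguments.

def proposition(predicate, *args):
    return (predicate, args)

# "What do you call a successful movie?" -> call(you, movie, what)
call_label = proposition("call", "you", "movie", "what")

# "I shall call the police." -> call(you, police)
call_summon = proposition("call", "you", "police")

# Same predicate word, but different arity and different participants,
# so the two propositions do not match -- which is what lets a Q/A
# system filter out the "call the police" hits.
assert call_label != call_summon
assert len(call_label[1]) != len(call_summon[1])
```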

5. "Meaning"
• Shallow semantic annotation
• Captures critical dependencies
• Highlights participants
• Reflects clear sense distinctions
• Supports training of supervised automatic taggers
• Works for other languages

6. Cornerstone: an English lexical resource
• that provides sets of possible syntactic frames for verbs
• and provides clear, replicable sense distinctions
Ask Jeeves: Who do you call for a good electronic lexical database for English?

7. WordNet – Princeton (Miller 1985, Fellbaum 1998)
• On-line lexical reference (dictionary)
• Nouns, verbs, adjectives, and adverbs grouped into synonym sets
• Other relations include hypernyms (ISA), antonyms, meronyms

8. WordNet
• http://www.cogsci.princeton.edu/~wn/

9. WordNet – call, 28 senses
1. name, call -- (assign a specified, proper name to; "They named their son David"; …) -> LABEL
2. call, telephone, call up, phone, ring -- (get or try to get into communication (with someone) by telephone; "I tried to call you all night"; …) -> TELECOMMUNICATE
3. call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard"; …) -> LABEL
4. call, send for -- (order, request, or command to come; "She was called into the director's office"; "Call the police!") -> ORDER

10. WordNet – Princeton (Miller 1985, Fellbaum 1998)
• Limitations as a computational lexicon:
  • Contains little syntactic information (Comlex has syntax but no sense distinctions)
  • No explicit lists of participants
  • Sense distinctions very fine-grained; definitions often vague
• Causes problems with creating training data for supervised Machine Learning – SENSEVAL-2:
  • verbs with > 16 senses (including call)
  • inter-annotator agreement (ITA) 73%
  • automatic Word Sense Disambiguation (WSD) 60.2%
(Dang & Palmer, SIGLEX02)

11. WordNet – call, 28 senses
[Diagram: the 28 fine-grained WordNet senses of "call" (WN1–WN28), shown ungrouped]

12. WordNet – call, 28 senses, Senseval-2 groups (engineering!)
[Diagram: the 28 senses clustered into labeled groups: loud cry; bird or animal cry; request; label; call a loan/bond; challenge; visit; phone/radio; bid]

13. Grouping improved scores: ITA 82%, MaxEnt WSD 69%
• Call: 31% of errors due to confusion between senses within the same group 1:
  • name, call -- (assign a specified, proper name to; "They named their son David")
  • call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard")
  • call -- (consider or regard as being; "I would not call her beautiful")
• 75% accuracy with training and testing on grouped senses vs. 43% with training and testing on fine-grained senses
(Palmer, Dang, Fellbaum, submitted, NLE)
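Why grouping raises measured agreement can be shown with a toy calculation. The sense-to-group mapping below is illustrative only, not the actual Senseval-2 grouping of "call":

```python
# Sketch: two annotators disagree on fine-grained WordNet senses but
# agree once senses are mapped to coarser groups. The mapping is a
# made-up miniature, not the real Senseval-2 group assignments.

group_of = {
    "WN1": "label", "WN3": "label",     # name / ascribe-a-quality senses
    "WN2": "phone", "WN13": "phone",    # telephone / radio senses
    "WN4": "request",                   # send-for sense
}

annotator_a = ["WN1", "WN2", "WN4"]
annotator_b = ["WN3", "WN13", "WN4"]

fine_agreement = sum(a == b for a, b in zip(annotator_a, annotator_b)) / 3
group_agreement = sum(group_of[a] == group_of[b]
                      for a, b in zip(annotator_a, annotator_b)) / 3

# Fine-grained: only 1 of 3 instances match; grouped: all 3 match.
assert fine_agreement < group_agreement
```

The same effect drives the ITA jump from 73% to 82%: most disagreements fall within a group, so they stop counting as disagreements.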

14. Proposition Bank: From Sentences to Propositions (Predicates!)
Many surface forms map to one proposition, meet(Somebody1, Somebody2):
• Powell met Zhu Rongji
• Powell and Zhu Rongji met
• Powell met with Zhu Rongji
• Powell and Zhu Rongji had a meeting
→ Proposition: meet(Powell, Zhu Rongji)
(Diagram also shows related verbs: battle, wrestle, join, debate, consult)
When Powell met Zhu Rongji on Thursday they discussed the return of the spy plane.
→ meet(Powell, Zhu)
→ discuss([Powell, Zhu], return(X, plane))

15. Capturing semantic roles*
• Richard broke [ARG1 the laser pointer]. (SUBJ = Richard)
• [ARG1 The windows] were broken by the hurricane. (SUBJ = the windows)
• [ARG1 The vase] broke into pieces when it toppled over. (SUBJ = the vase)
ARG1 stays constant while the grammatical subject varies.
*See also FrameNet, http://www.icsi.berkeley.edu/~framenet/

16. Word Senses in PropBank
• Orders to ignore word sense not feasible for 700+ verbs
• Mary left the room → Frameset leave.01 "move away from": Arg0: entity leaving; Arg1: place left
• Mary left her daughter-in-law her pearls in her will → Frameset leave.02 "give": Arg0: giver; Arg1: thing given; Arg2: beneficiary
• How do these relate to traditional word senses in VerbNet and WordNet?
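The slide's two framesets can be sketched as a small inventory mapping a frameset id to its gloss and numbered-argument roles. The `annotate` helper is a hypothetical convenience, not part of PropBank tooling; the glosses and roles come from the slide:

```python
# Sketch: PropBank framesets as data, using the slide's leave.01 /
# leave.02 example. A frameset fixes a coarse sense plus its roles.

framesets = {
    "leave.01": {"gloss": "move away from",
                 "roles": {"Arg0": "entity leaving", "Arg1": "place left"}},
    "leave.02": {"gloss": "give",
                 "roles": {"Arg0": "giver", "Arg1": "thing given",
                           "Arg2": "beneficiary"}},
}

def annotate(frameset_id, **fillers):
    """Pair a frameset's numbered roles with fillers from one sentence."""
    roles = framesets[frameset_id]["roles"]
    return {role: fillers.get(role) for role in roles}

# "Mary left the room" -> leave.01
assert annotate("leave.01", Arg0="Mary", Arg1="the room") == {
    "Arg0": "Mary", "Arg1": "the room"}

# "Mary left her daughter-in-law her pearls in her will" -> leave.02
result = annotate("leave.02", Arg0="Mary", Arg1="her pearls",
                  Arg2="her daughter-in-law")
assert result["Arg2"] == "her daughter-in-law"
```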

17. WordNet – call, 28 senses, groups
[Diagram: repeat of slide 12 – the 28 senses clustered into groups: loud cry; bird or animal cry; request; label; call a loan/bond; challenge; visit; phone/radio; bid]

18. Overlap with PropBank Framesets
[Diagram: the same sense groups for "call", with PropBank Framesets overlaid on the groups]

19. Overlap between Senseval-2 groups and Framesets – 95%
[Diagram: senses WN1–WN20 of "develop", partitioned into Frameset1 and Frameset2]
(Palmer, Babko-Malaya, Dang, SNLU 2004)

20. Sense Hierarchy
• PropBank Framesets – coarse-grained distinctions; for the 20 Senseval-2 verbs with > 1 Frameset, a MaxEnt WSD system scores 90% accuracy (73.5% baseline)
• Sense groups (Senseval-2) – intermediate level (includes Levin classes): 69%
• WordNet – fine-grained distinctions: 60.2%

21. Maximum Entropy WSD (Hoa Dang) – best performer on verbs
• Maximum entropy framework, p(sense | context)
• Contextual linguistic features:
  • Topical feature for W (+2.5%): keywords (determined automatically)
  • Local syntactic features for W (+1.5 to +5%): presence of subject, complements, passive?; words in subject/complement positions, particles, prepositions, etc.
  • Local semantic features for W (+6%): semantic class info from WordNet (synsets, etc.); Named Entity tag (PERSON, LOCATION, ...) for proper nouns; words within a +/- 2 word window
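The feature types listed above can be sketched as a feature extractor. This is an illustration of the feature families only: the feature names are invented, the syntactic flags are passed in rather than computed, and a real system would feed these dictionaries to a MaxEnt classifier estimating p(sense | context):

```python
# Sketch of contextual features for MaxEnt WSD. Feature names and the
# function signature are illustrative, not Dang's actual system.

def extract_features(tokens, target_index, has_subject, is_passive):
    feats = {}
    # Local semantic-ish features: words within a +/- 2 word window.
    for offset in (-2, -1, 1, 2):
        i = target_index + offset
        if 0 <= i < len(tokens):
            feats[f"win_{offset}={tokens[i]}"] = 1
    # Local syntactic features: presence of subject, passive voice.
    feats["has_subject"] = int(has_subject)
    feats["is_passive"] = int(is_passive)
    feats[f"target={tokens[target_index]}"] = 1
    return feats

feats = extract_features(["i", "shall", "call", "the", "police"],
                         target_index=2, has_subject=True, is_passive=False)
assert feats["win_1=the"] == 1 and feats["win_2=police"] == 1
assert feats["has_subject"] == 1 and feats["is_passive"] == 0
```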

22. Context Sensitivity
• Programming languages are context-free
• Natural languages are context-sensitive?
  • Movement
  • Features
  • respectively: "John, Mary and Bill ate peaches, pears and apples, respectively."

23. The Chomsky Grammar Hierarchy
• Regular grammars (e.g. aabbbb): S → aS | bS | nil
• Context-free grammars (e.g. aaabbb): S → aSb | nil
• Context-sensitive grammars (e.g. aaabbbccc): xSy → xby
• Turing Machines
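The hierarchy levels can be made concrete with string recognizers for the slide's example languages. These are direct checks, not grammar-driven parsers; the point is that a*b* needs no counting, aⁿbⁿ needs one matched count, and aⁿbⁿcⁿ needs the count matched across three blocks:

```python
# Recognizers for the slide's example languages at each hierarchy level.
import re

def is_regular_ab(s):
    """Regular: any run of a's followed by any run of b's (e.g. aabbbb)."""
    return re.fullmatch(r"a*b*", s) is not None

def is_anbn(s):
    """Context-free: equal numbers of a's then b's (S -> aSb | nil)."""
    n = len(s) // 2
    return s == "a" * n + "b" * n

def is_anbncn(s):
    """Context-sensitive: a^n b^n c^n."""
    n = len(s) // 3
    return s == "a" * n + "b" * n + "c" * n

assert is_regular_ab("aabbbb")
assert is_anbn("aaabbb") and not is_anbn("aabbb")
assert is_anbncn("aaabbbccc") and not is_anbncn("aabbbccc")
```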

24. Recursive transition nets for CFGs
• s :- np, vp.
• np :- pronoun; noun; det, adj, noun; np, pp.
[Diagram: RTN with states S, S1–S6 and arcs labeled NP, VP, det, adj, noun, pronoun, pp]
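The slide's grammar can be sketched as a recursive-descent recognizer. Two caveats: the left-recursive rule np :- np, pp is rewritten as "parse a core NP, then attach zero or more PPs", and the vp and pp rules plus the toy lexicon are assumptions (the slide defines only s and np):

```python
# Recursive-descent recognizer for the slide's grammar, with the
# left recursion in np removed. vp/pp rules and lexicon are assumed.

LEX = {
    "pronoun": {"she", "he"},
    "noun": {"dog", "cat", "park"},
    "det": {"the", "a"},
    "adj": {"big", "small"},
    "verb": {"chased", "saw"},
    "prep": {"in", "near"},
}

def word(cat, toks, i):
    """Consume one word of category cat; return next index or None."""
    return i + 1 if i < len(toks) and toks[i] in LEX[cat] else None

def pp(toks, i):                          # pp :- prep, np  (assumed)
    j = word("prep", toks, i)
    return np(toks, j) if j is not None else None

def np(toks, i):
    # np :- pronoun ; noun ; det, adj, noun ; np, pp.
    for seq in (("pronoun",), ("noun",), ("det", "adj", "noun")):
        j, ok = i, True
        for cat in seq:
            j = word(cat, toks, j)
            if j is None:
                ok = False
                break
        if not ok:
            continue
        while True:                       # np :- np, pp, made iterative
            k = pp(toks, j)
            if k is None:
                return j
            j = k
    return None

def vp(toks, i):                          # vp :- verb, np  (assumed)
    j = word("verb", toks, i)
    return np(toks, j) if j is not None else None

def s(toks):                              # s :- np, vp.
    j = np(toks, 0)
    return j is not None and vp(toks, j) == len(toks)

assert s("the big dog chased the small cat".split())
assert s("the big dog in the small park chased she".split())
assert not s("chased the big dog".split())
```

Removing the left recursion is exactly what the RTN's loop arc does: the net re-enters the PP arc as many times as PPs attach, without the procedure calling itself on its own first symbol.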

25. Most parsers are Turing Machines
• To give a more natural and comprehensible treatment of movement
• For a more efficient treatment of features
• Not because of "respectively" – most parsers can't handle it.

26. Nested Dependencies and Crossing Dependencies
• John, Mary and Bill ate peaches, pears and apples, respectively. – crossing dependencies (context-sensitive)
• The dog chased the cat that bit the mouse that ran. – right-branching (context-free)
• The mouse the cat the dog chased bit ran. – nested dependencies (context-free)

27. Movement
• What did John give to Mary?
• *Where did John give to Mary?
• John gave cookies to Mary.
• John gave <what> to Mary.

28. Handling Movement: Hold Registers / Slash Categories
• S :- Wh, S/NP
• S/NP :- VP
• S/NP :- NP, VP/NP
• VP/NP :- Verb
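The slash rules above can be sketched as a tiny recognizer in which S/NP means "an S missing an NP". Several simplifications are assumptions: the lexicon is invented, auxiliary inversion is ignored (so the query is "what john gave to mary" rather than "what did John give to Mary"), only the object-gap rule S/NP :- NP, VP/NP is implemented, and an optional PP is added inside the VP rules so the slide's sentences parse:

```python
# Sketch of slash-category parsing: VP/NP is a VP whose object NP is
# the gap, filled by the fronted wh-word. Lexicon and PP rule assumed.

WH, NP_WORDS = {"what"}, {"john", "mary", "cookies"}
VERBS, PREPS = {"gave"}, {"to"}

def np(toks, i):
    return i + 1 if i < len(toks) and toks[i] in NP_WORDS else None

def pp(toks, i):
    if i < len(toks) and toks[i] in PREPS:
        return np(toks, i + 1)
    return None

def vp(toks, i):            # VP :- Verb, NP, (PP) -- no gap
    if i < len(toks) and toks[i] in VERBS:
        j = np(toks, i + 1)
        if j is not None:
            k = pp(toks, j)
            return k if k is not None else j
    return None

def vp_slash_np(toks, i):   # VP/NP :- Verb, (PP) -- object NP is the gap
    if i < len(toks) and toks[i] in VERBS:
        j = pp(toks, i + 1)
        return j if j is not None else i + 1
    return None

def s(toks):
    if toks and toks[0] in WH:              # S :- Wh, S/NP
        j = np(toks, 1)                     # S/NP :- NP, VP/NP
        if j is not None:
            return vp_slash_np(toks, j) == len(toks)
        return False
    j = np(toks, 0)                         # S :- NP, VP
    return j is not None and vp(toks, j) == len(toks)

assert s("what john gave to mary".split())        # gap filled by "what"
assert s("john gave cookies to mary".split())     # no gap, plain S
assert not s("what john gave cookies to mary".split())  # gap already filled
```

The last assertion shows why the slash category matters: once the wh-word fronts an NP, the VP must be missing exactly one NP, so a sentence with both a fronted "what" and an in-place object is rejected.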
