
Lexical Ambiguity Resolution / Sense Disambiguation


Presentation Transcript


  1. Lexical Ambiguity Resolution / Sense Disambiguation • Supervised methods • Unsupervised methods • Class-based models • Seed models • Vector models • EM iteration • Unsupervised clustering • Sense induction • Anaphora resolution -- CS466 Lecture XVIII --

  2. For sense disambiguation: • Ambiguous verbs (e.g., to fire) depend heavily on words in the local context (in particular, their objects). • Ambiguous nouns (e.g., plant) depend on wider context. For example, seeing [ greenhouse, nursery, cultivation ] within a window of +/- 10 words is very indicative of sense. -- CS466 Lecture XVI --
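The wide-context cue for nouns can be sketched as a simple window check. The cue words and the +/- 10 window come from the slide; the function and variable names are illustrative, not from the lecture.

```python
# Sketch: window-based sense cue for an ambiguous noun such as "plant".
# BOTANY_CUES and the window size follow the slide; names are illustrative.

BOTANY_CUES = {"greenhouse", "nursery", "cultivation"}

def botanical_plant_likely(tokens, target_index, window=10):
    """Return True if any botany cue appears within +/- `window` tokens."""
    lo = max(0, target_index - window)
    hi = min(len(tokens), target_index + window + 1)
    return any(tok.lower() in BOTANY_CUES for tok in tokens[lo:hi])

sent = "the nursery shipped every plant before the frost".split()
print(botanical_plant_likely(sent, sent.index("plant")))  # True
```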

  3. Deficiency of the "Bag-of-Words" Approach • Context is treated as an unordered bag of words, as in the vector model (and also earlier neural-network models, etc.) -- CS466 Lecture XVI --

  4. Observations • Words tend to exhibit only one sense in a given collocation or word association • Two-word collocations (word to the left or word to the right) • Formally, P(sense | collocation) is a low-entropy distribution
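The low-entropy claim can be checked directly: given sense counts for a collocation, compute the Shannon entropy of the conditional distribution. The counts below are invented for illustration only.

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a distribution given raw counts."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values() if c > 0)

# Toy sense counts for "plant" when it occurs in the collocation
# "plant pesticide" (numbers are illustrative, not corpus data).
counts = Counter({"living_plant": 98, "factory": 2})
print(round(entropy(counts), 3))  # 0.141 -- far below the 1.0 bits of a 50/50 split
```

A distribution this peaked is what "one sense per collocation" means in information-theoretic terms.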

  5. Collocation • Means (originally): "in the same location", i.e., co-occurring in some defined relationship • Adjacent (bigram) collocations: fire her • Verb/object collocations: fire the long rifles • Co-occurrence within +/- k words: made of lead, iron, silver, ... • Other interpretation: an idiomatic (non-compositional, high-frequency) association, e.g., soap opera, Hong Kong -- CS466 Lecture XVI --

  6. Order and Sequence Matter: • plant pesticide → living plant; pesticide plant → manufacturing plant • a solid lead → advantage or head start; a solid wall of lead → metal • a hotel in Madison → place; I saw Madison in a hotel bar → person -- CS466 Lecture XVI --

  7. Observation • Distance matters: adjacent words are more salient than those 20 words away • But the bag-of-words model gives all positions the same weight -- CS466 Lecture XVI --

  8. Observations • Words tend to exhibit only one sense in a given discourse or document (per word form) • Very unlikely to have living plants and manufacturing plants referenced in the same document (tendency to use a synonym like factory to minimize ambiguity: communicative efficiency, per Grice) • Unlikely to have Mr. Madison and Madison City in the same document • Unlikely to have Turkey (both country and bird) in the same document -- CS466 Lecture XVI --
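The "one sense per discourse" observation is commonly applied as a post-processing step: after per-occurrence tagging, relabel every occurrence in a document with the document's majority sense. A minimal sketch, with illustrative names:

```python
from collections import Counter

def apply_one_sense_per_discourse(tags):
    """tags: per-occurrence sense labels for one document.
    Relabel all occurrences with the document's majority sense."""
    majority, _ = Counter(tags).most_common(1)[0]
    return [majority] * len(tags)

# Two occurrences tagged "factory" outvote one tagged "living_plant".
print(apply_one_sense_per_discourse(["factory", "factory", "living_plant"]))
# ['factory', 'factory', 'factory']
```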

  9. Vector Models for Word Sense [Figure: vector-space plot of contexts for Sense 1 and Sense 2, each cluster shown with its centroid] -- CS466 Lecture XVI --

  10. Plant: Centroid Computation and Sense Assignment • Build each sense centroid: for each training vector, for each term in vecs[docn], Sum[term] += vec[docn][term] • For a new context vector Xi, compute S1 = Sim(Centroid 1, Xi) and S2 = Sim(Centroid 2, Xi) • If S1 > S2, assign sense 1; else assign sense 2 -- CS466 Lecture XVI --
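The centroid procedure above can be sketched end to end: sum training context vectors per sense, then assign a new context to the sense whose centroid is most similar under cosine similarity. The function names and the toy training contexts are illustrative assumptions, not from the lecture.

```python
import math
from collections import Counter, defaultdict

def cosine(u, v):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def build_centroids(labeled_contexts):
    """Sum the context vectors of each sense: Sum[term] += vec[term]."""
    centroids = defaultdict(Counter)
    for sense, tokens in labeled_contexts:
        centroids[sense].update(tokens)
    return centroids

def assign_sense(tokens, centroids):
    """Pick the sense whose centroid is most similar to the context."""
    vec = Counter(tokens)
    return max(centroids, key=lambda s: cosine(vec, centroids[s]))

# Toy training data (illustrative only).
train = [
    ("living_plant", ["greenhouse", "nursery", "soil", "water"]),
    ("factory", ["manufacturing", "workers", "assembly", "steel"]),
]
centroids = build_centroids(train)
print(assign_sense(["the", "nursery", "watered", "the", "plant"], centroids))
# living_plant
```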

  11. Vector Models for Person / Place [Figure: vector-space plot of PERSON and PLACE contexts, each cluster shown with its centroid] -- CS466 Lecture XVI --

  12. Vector Models for Lexical Ambiguity Resolution / Lexical Classification • Treat labeled contexts as vectors (Class, W-3 .. W3): • PLACE: [long way from] Madison [to Chicago] • COMPANY: [When] Madison [investors issued a] • Convert to a traditional vector, just like a short query (e.g., V328, V329) -- CS466 Lecture XVI --
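Turning a labeled context row (W-3 .. W3 around the target W0) into a short-query-style vector can be sketched as below; the function name and window default are illustrative.

```python
from collections import Counter

def context_vector(tokens, target_index, k=3):
    """Bag-of-words vector over a +/- k word window, excluding the target W0."""
    lo = max(0, target_index - k)
    hi = min(len(tokens), target_index + k + 1)
    window = tokens[lo:target_index] + tokens[target_index + 1:hi]
    return Counter(w.lower() for w in window)

# The PLACE example row from the slide.
tokens = "long way from Madison to Chicago".split()
print(dict(context_vector(tokens, tokens.index("Madison"))))
# {'long': 1, 'way': 1, 'from': 1, 'to': 1, 'chicago': 1}
```

The resulting sparse vector can be fed to the same centroid comparison used for word senses.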

  13. Training Space (Vector Model) [Figure: training space with Person, Place, Company, and Event example vectors clustered around their respective centroids, plus a new example to be classified] -- CS466 Lecture XVI --

  14. Problem with Supervised Methods • Tagged training data is expensive (time, resources) • Solution: class discriminators can serve as effective word-sense discriminators, and are much less costly to train if we can tolerate some noise in the models -- CS466 Lecture XVIII --

  15. Pseudo-Class Discriminators • What if class lists (like Roget's) are not available? • Create small classes optimized for the target ambiguity: • Class (Crane 1) = heron, stork, eagle, condor, ... • Class (Crane 2) = derrick, forklift, bulldozer, ... • Class (Tank 1) = jeep, vehicle, Humvee, Bradley, Abrams, ... • Class (Tank 2) = vessel, container, flask, pool • Include synonyms, hypernyms (parent in tree), hyponyms (child in tree), and topically related words • Smaller and potentially more specific, but less robust -- CS466 Lecture XVIII --

  16. Goal: Iterative Refinement • Start from small sets of hand-tagged data • Sorting state: the output of (poor) class models, plus contexts that match reliable seed words • Seed words: reliable collocations of the target word; they need not be synonyms -- CS466 Lecture XIX --
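The iterative refinement loop above can be sketched schematically: start from seed-labeled contexts, train a model, confidently label more contexts, and repeat. This is a generic bootstrapping skeleton under stated assumptions; `train`, `classify`, and the confidence threshold are placeholders the caller supplies, not lecture-specified components.

```python
def bootstrap(seeds, unlabeled, train, classify, threshold=0.9, rounds=5):
    """Schematic iterative refinement (bootstrapping).

    seeds:     list of (context, sense) pairs from hand tagging / seed words
    unlabeled: list of contexts
    train:     callable building a model from (context, sense) pairs
    classify:  callable returning (sense, confidence) for a context
    """
    labeled, remaining = list(seeds), list(unlabeled)
    for _ in range(rounds):
        model = train(labeled)
        newly, rest = [], []
        for ctx in remaining:
            sense, conf = classify(model, ctx)
            (newly if conf >= threshold else rest).append((ctx, sense))
        if not newly:            # nothing crossed the confidence threshold
            break
        labeled += newly          # grow the training set with confident labels
        remaining = [ctx for ctx, _ in rest]
    return train(labeled)
```

Any of the earlier classifiers (class models, centroid models) can play the `train`/`classify` roles here.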
