
Parts of Speech



1. Parts of Speech
• Generally speaking, the “grammatical type” of a word: Verb, Noun, Adjective, Adverb, Article, …
• We can also include inflection:
• Verbs: tense, number, …
• Nouns: number, proper/common, …
• Adjectives: comparative, superlative, …
• …
• Most commonly used POS tag sets for English have 50-80 different tags

2. BNC Parts of Speech
• Nouns:
NN0 Common noun, neutral for number (e.g. aircraft)
NN1 Singular common noun (e.g. pencil, goose, time)
NN2 Plural common noun (e.g. pencils, geese, times)
NP0 Proper noun (e.g. London, Michael, Mars, IBM)
• Pronouns:
PNI Indefinite pronoun (e.g. none, everything, one)
PNP Personal pronoun (e.g. I, you, them, ours)
PNQ Wh-pronoun (e.g. who, whoever, whom)
PNX Reflexive pronoun (e.g. myself, itself, ourselves)

3. Verbs:
VVB Finite base form of lexical verbs (e.g. forget, send, live, return)
VVD Past tense form of lexical verbs (e.g. forgot, sent, lived)
VVG -ing form of lexical verbs (e.g. forgetting, sending, living)
VVI Infinitive form of lexical verbs (e.g. forget, send, live, return)
VVN Past participle form of lexical verbs (e.g. forgotten, sent, lived)
VVZ -s form of lexical verbs (e.g. forgets, sends, lives, returns)
VBB Present tense of BE, except for is …and so on: VBD VBG VBI VBN VBZ
VDB Finite base form of DO: do …and so on: VDD VDG VDI VDN VDZ
VHB Finite base form of HAVE: have, 've …and so on: VHD VHG VHI VHN VHZ
VM0 Modal auxiliary verb (e.g. will, would, can, could, 'll, 'd)

4. Articles:
AT0 Article (e.g. the, a, an, no)
DPS Possessive determiner (e.g. your, their, his)
DT0 General determiner (e.g. this, that)
DTQ Wh-determiner (e.g. which, what, whose, whichever)
EX0 Existential there, i.e. occurring in “there is…” or “there are…”
• Adjectives:
AJ0 Adjective (general or positive) (e.g. good, old, beautiful)
AJC Comparative adjective (e.g. better, older)
AJS Superlative adjective (e.g. best, oldest)
• Adverbs:
AV0 General adverb (e.g. often, well, longer (adv.), furthest)
AVP Adverb particle (e.g. up, off, out)
AVQ Wh-adverb (e.g. when, where, how, why, wherever)

5. Miscellaneous:
CJC Coordinating conjunction (e.g. and, or, but)
CJS Subordinating conjunction (e.g. although, when)
CJT The subordinating conjunction that
CRD Cardinal number (e.g. one, 3, fifty-five, 3609)
ORD Ordinal numeral (e.g. first, sixth, 77th, last)
ITJ Interjection or other isolate (e.g. oh, yes, mhm, wow)
POS The possessive or genitive marker 's or '
TO0 Infinitive marker to
PUL Punctuation: left bracket, i.e. ( or [
PUN Punctuation: general separating mark, i.e. . , ! : ; - or ?
PUQ Punctuation: quotation mark, i.e. ' or "
PUR Punctuation: right bracket, i.e. ) or ]
XX0 The negative particle not or n't
ZZ0 Alphabetical symbols (e.g. A, a, B, b, c, d)

6. Task: Part-Of-Speech Tagging
• Goal: Assign the correct part of speech to each word (and punctuation mark) in a text.
• Example (using the BNC tags above): the/AT0 cat/NN1 sleeps/VVZ ./PUN
• Learn a local model of POS dependencies, usually from pre-tagged data
• No parsing
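For a concrete feel for the task, here is a minimal sketch using NLTK's off-the-shelf tagger. This is my own illustration, not part of the slides; note it uses the Penn Treebank tagset rather than the BNC tags above, and the download resource names can vary slightly across NLTK versions.

```python
import nltk

# One-time model/resource downloads (names may differ by NLTK version).
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

tokens = nltk.word_tokenize("The cat sleeps on the mat.")
print(nltk.pos_tag(tokens))
# [('The', 'DT'), ('cat', 'NN'), ('sleeps', 'VBZ'), ('on', 'IN'),
#  ('the', 'DT'), ('mat', 'NN'), ('.', '.')]
```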

7. Hidden Markov Models
[Figure: toy HMM state diagram over the tags AT0, AJ0, NN1, NN2, with transition probabilities between tags (0.1-0.9) and word-emission probabilities for “a”, “the”, “cat”, “cats”, “men”, “bet”]
• Assume: the POS (state) sequence is generated by a time-invariant random process, and each POS randomly generates a word (output symbol)

8. Definition of HMM for Tagging
• Set of states – all possible tags
• Output alphabet – all words in the language
• State/tag transition probabilities
• Initial state probabilities: the probability of beginning a sentence with tag t, i.e. P(t0 → t)
• Output probabilities – producing word w at state t
• Output sequence – observed word sequence
• State sequence – underlying tag sequence
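These components map naturally onto a few tables. Below is a minimal sketch in Python, with made-up probabilities in the spirit of the toy diagram on slide 7; the "<s>" start pseudo-tag playing the role of t0 is my own convention, not from the slides.

```python
# States (tags) of the HMM; the output alphabet is the set of words seen in emit.
tags = ["AT0", "AJ0", "NN1", "NN2"]

# Transition probabilities P(t -> t'); "<s>" stands in for t0, so
# P("<s>" -> t) doubles as the initial-state probability of tag t.
trans = {
    ("<s>", "AT0"): 0.6, ("<s>", "NN1"): 0.2, ("<s>", "NN2"): 0.2,
    ("AT0", "NN1"): 0.5, ("AT0", "NN2"): 0.2, ("AT0", "AJ0"): 0.3,
    ("AJ0", "NN1"): 0.5, ("AJ0", "NN2"): 0.5,
}

# Output (emission) probabilities P(w | t); AJ0 emissions omitted for brevity.
emit = {
    ("AT0", "a"): 0.4, ("AT0", "the"): 0.6,
    ("NN1", "cat"): 0.9, ("NN1", "bet"): 0.1,
    ("NN2", "cats"): 0.8, ("NN2", "men"): 0.2,
}
```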

9. HMMs For Tagging
• First-order (bigram) Markov assumptions:
• Limited horizon: a tag depends only on the previous tag
P(ti+1 = tk | t1 = tj1, …, ti = tji) = P(ti+1 = tk | ti = tj)
• Time invariance: no change over time
P(ti+1 = tk | ti = tj) = P(t2 = tk | t1 = tj) = P(tj → tk)
• Output probabilities:
• Probability of getting word wk for tag tj: P(wk | tj)
• Assumption: not dependent on other tags or words!

10. Combining Probabilities
• Probability of a tag sequence:
P(t1 t2 … tn) = P(t1) P(t1 → t2) P(t2 → t3) … P(tn-1 → tn)
Assume t0 – a starting tag:
= P(t0 → t1) P(t1 → t2) P(t2 → t3) … P(tn-1 → tn)
• Probability of a word sequence and tag sequence:
P(W, T) = ∏i P(ti-1 → ti) P(wi | ti)
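As a sketch, this joint probability falls straight out of the two tables from the earlier sketch (unseen pairs get probability 0 here; smoothing, discussed on the next slide, would fix that):

```python
def joint_prob(words, tag_seq, trans, emit, start="<s>"):
    """P(W, T) = prod_i P(t_{i-1} -> t_i) * P(w_i | t_i), with t_0 = start."""
    p, prev = 1.0, start
    for w, t in zip(words, tag_seq):
        p *= trans.get((prev, t), 0.0) * emit.get((t, w), 0.0)
        prev = t
    return p

trans = {("<s>", "AT0"): 0.6, ("AT0", "NN1"): 0.5}
emit = {("AT0", "the"): 0.6, ("NN1", "cat"): 0.9}
print(joint_prob(["the", "cat"], ["AT0", "NN1"], trans, emit))
# 0.6 * 0.6 * 0.5 * 0.9 = 0.162
```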

11. Training from Labeled Corpus
• Labeled training data: each word has a POS tag
• Thus:
PMLE(tj) = C(tj) / N
PMLE(tj → tk) = C(tj, tk) / C(tj)
PMLE(wk | tj) = C(tj : wk) / C(tj)
• Smoothing applies as usual
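A minimal sketch of these maximum-likelihood estimates, assuming the labeled corpus is a list of sentences given as (word, tag) pairs (a format I am assuming; the slides do not fix one):

```python
from collections import Counter

corpus = [
    [("the", "AT0"), ("cat", "NN1"), ("sleeps", "VVZ")],
    [("the", "AT0"), ("cats", "NN2"), ("sleep", "VVB")],
]

tag_count, bigram_count, word_tag_count = Counter(), Counter(), Counter()
for sent in corpus:
    prev = "<s>"  # assumed start pseudo-tag t0
    for word, tag in sent:
        tag_count[tag] += 1
        bigram_count[(prev, tag)] += 1
        word_tag_count[(tag, word)] += 1
        prev = tag

N = sum(tag_count.values())
p_tag = {t: c / N for t, c in tag_count.items()}                  # P_MLE(t)
p_trans = {(a, b): c / tag_count[a]                               # P_MLE(a -> b)
           for (a, b), c in bigram_count.items() if a != "<s>"}
p_init = {b: c / len(corpus)                                      # P(t0 -> b)
          for (a, b), c in bigram_count.items() if a == "<s>"}
p_emit = {(t, w): c / tag_count[t]                                # P_MLE(w | t)
          for (t, w), c in word_tag_count.items()}
```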

12. Viterbi Tagging
• Most probable tag sequence given the text:
T* = arg maxT Pm(T | W)
= arg maxT Pm(W | T) Pm(T) / Pm(W)   (Bayes’ theorem)
= arg maxT Pm(W | T) Pm(T)   (Pm(W) is constant for all T)
= arg maxT ∏i [m(ti-1 → ti) m(wi | ti)]
= arg maxT Σi log[m(ti-1 → ti) m(wi | ti)]
• Exponential number of possible tag sequences – use dynamic programming for efficient computation

13. [Figure: worked Viterbi trellis – words w1 w2 w3 across the top, states t1, t2, t3 in each column, start state t0, with log-probability scores on the transitions and accumulated scores (e.g. -4.7, -6, -7.3, -9.3, -10.3) at the nodes]

14. Viterbi Algorithm
• D(0, START) = 0
• for each tag t ≠ START do: D(0, t) = -∞
• for i ← 1 to N do:
• for each tag tj do:
D(i, tj) ← maxk [D(i-1, tk) + lm(tk → tj)] + lm(wi | tj)
Record best(i, j) = k which yielded the max
• log P(W, T) = maxj D(N, tj)
• Reconstruct path from maxj backwards
• Where: lm(.) = log m(.), and D(i, tj) = max joint log-probability of the state and word sequences up to position i, ending at tj.
• Complexity: O(Nt² · N), where Nt is the number of tags and N the sentence length
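A self-contained sketch of this algorithm in Python. The toy tags and probabilities are my own illustrative numbers, not from the slides, and a small probability floor stands in for proper smoothing of unseen events:

```python
import math

TAGS = ["AT0", "NN1", "VVZ"]
trans = {("<s>", "AT0"): 0.8, ("AT0", "NN1"): 0.9, ("NN1", "VVZ"): 0.8}
emit = {("AT0", "the"): 0.6, ("NN1", "cat"): 0.4, ("VVZ", "sleeps"): 0.3}

def lm(table, key, floor=1e-8):
    """lm(.) = log m(.); the floor crudely smooths unseen transitions/emissions."""
    return math.log(table.get(key, floor))

def viterbi(words):
    # D[i][t] = max joint log-probability over tag sequences for words[:i+1] ending in t
    D = [{} for _ in words]
    best = [{} for _ in words]  # backpointers: best[i][t] = argmax previous tag
    for t in TAGS:
        D[0][t] = lm(trans, ("<s>", t)) + lm(emit, (t, words[0]))
        best[0][t] = None
    for i in range(1, len(words)):
        for tj in TAGS:
            k, score = max(((tk, D[i - 1][tk] + lm(trans, (tk, tj))) for tk in TAGS),
                           key=lambda x: x[1])
            D[i][tj] = score + lm(emit, (tj, words[i]))
            best[i][tj] = k
    # log P(W, T*) = max_j D(N, t_j); follow backpointers to reconstruct T*
    t = max(D[-1], key=D[-1].get)
    path = [t]
    for i in range(len(words) - 1, 0, -1):
        t = best[i][t]
        path.append(t)
    return list(reversed(path))

print(viterbi(["the", "cat", "sleeps"]))  # -> ['AT0', 'NN1', 'VVZ']
```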
