
Computational Lexicology, Morphology and Syntax

Diana Trandabăț

2013-2014

Transcript of a session with ELIZA, a simulation of a Rogerian psychotherapist

(Weizenbaum, 1966)

User: Men are all alike

ELIZA: IN WHAT WAY

User: They’re always bugging us about something or other

ELIZA: CAN YOU THINK OF A SPECIFIC EXAMPLE?

User: Well, my boyfriend made me come here

ELIZA: YOUR BOYFRIEND MADE YOU COME HERE

User: He says I’m depressed much of the time

ELIZA: I AM SORRY TO HEAR THAT YOU ARE DEPRESSED

- .* all .*
→IN WHAT WAY

- .* always .*
→CAN YOU THINK OF A SPECIFIC EXAMPLE

- .* I’m (depressed|sad) .*
→ I AM SORRY TO HEAR YOU ARE \1

- .* I’m (depressed|sad) .*
→WHY DO YOU THINK YOU ARE \1?
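The pattern–response rules above can be sketched as a tiny match-and-substitute loop. A minimal sketch, assuming a rule table and a `respond` helper of our own (not Weizenbaum's actual code):

```python
import re

# Minimal ELIZA-style responder built from the pattern/response pairs
# above. The rule table, ordering, and function name are our own.
RULES = [
    (r".* I'?m (depressed|sad) .*", r"I AM SORRY TO HEAR YOU ARE \1"),
    (r".* all .*", r"IN WHAT WAY"),
    (r".* always .*", r"CAN YOU THINK OF A SPECIFIC EXAMPLE"),
]

def respond(utterance):
    """Return the response of the first matching rule, uppercased."""
    for pattern, response in RULES:
        match = re.match(pattern, utterance, re.IGNORECASE)
        if match:
            return match.expand(response).upper()
    return "PLEASE GO ON"  # default when no rule matches
```

Rules are tried in order, so the more specific "I'm depressed/sad" pattern is listed before the generic keyword rules.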

- What is intelligence?
- What does Eliza tell us about intelligence?

The sentence as a string of words
E.g. I saw the lady with the binoculars
string = a b c d e b f

The relations of parts of a string to each other may be different

I saw the lady with the binoculars

is structurally ambiguous

Who has the binoculars?

[I] saw the lady [with the binoculars] = [a] b c d [e b f]
I saw [the lady with the binoculars] = a b [c d e b f]

How can we represent the difference?

By assigning them different structures.

We can represent structures with 'trees'.

[Tree for: I read the book]

a. I saw the lady with the binoculars
[Tree with nodes S, NP, VP, V, NP, NP, PP: the PP attaches inside the object NP]
I saw [the lady with the binoculars]

b. I saw the lady with the binoculars
[Tree with nodes S, NP, VP, VP, PP: the PP attaches to the VP]
I [saw the lady] with the binoculars

birds fly

     S
    / \
  NP   VP
   |    |
   N    V
   |    |
birds  fly

S → NP VP

NP → N

VP → V

Syntactic rules

     S
    / \
  NP   VP
   |    |
birds  fly

ab = string

    S
   / \
  A   B
  |   |
  a   b

S → A B

A → a

B → b

Rules

Assumption:

natural language grammars are rule-based systems

What kind of grammars describe natural language phenomena?

What are the formal properties of grammatical rules?

Chomsky (1957) Syntactic Structures. The Hague: Mouton

Chomsky, N. and G.A. Miller (1958) Finite-state languages Information and Control 1, 99-112

Chomsky (1959) On certain formal properties of languages. Information and Control 2, 137-167

Rules in Linguistics
1. PHONOLOGY
/s/ → [θ] / V ___ V
Rewrite /s/ as [θ] when /s/ occurs in the context V ___ V
With: V = auxiliary nodes, θ = terminal nodes

Rules in Linguistics
2. SYNTAX
S → NP VP
VP → V
NP → N
Rewrite S as NP VP in any context
With: S, NP, VP = auxiliary nodes; V, N = terminal nodes

SYNTAX (phrase/sentence formation)

sentence:

The boy kissed the girl

Subject + predicate
noun phrase + verb phrase
(art + noun) + (verb + noun phrase)

S → NP VP
VP → V NP
NP → ART N

Chomsky Hierarchy

0.Type 0 (recursively enumerable) languages

Only restriction on rules: the left-hand side cannot be the empty string (*Ø → …)

1. Context-Sensitive languages - Context-Sensitive (CS) rules

2. Context-Free languages - Context-Free (CF) rules

3. Regular languages - Regular rules (right- or left-linear)

0 ⊃ 1 ⊃ 2 ⊃ 3

a ⊃ b means a properly includes b (a is a superset of b),

i.e. b is a proper subset of a, or b is contained in a

Generative power

0.Type 0 (recursively enumerable) languages

- only restriction on rules: the left-hand side cannot be the empty string (*Ø → …)

- is the most powerful system

3. Type 3 (regular languages)

- is the least powerful

Superset/subset relation

[Venn diagram: S1 = {a, b} drawn inside S2 = {a, b, c, d, f, g}]

S1 is a subset of S2 ; S2 is a superset of S1

Rule Type – 3

Name: Regular

Example: Finite State Automata (Markov-process Grammar)

Rule type:

a) right-linear

A → x B or
A → x

with:

A, B = auxiliary nodes and
x = terminal node

b) or left-linear

A → B x or
A → x

Generates: a^m b^n with m, n ≥ 1

Cannot guarantee that there are as many a's as b's; no embedding

A regular grammar for natural language sentences

S → the A
A → cat B
A → mouse B
A → duck B
B → bites C
B → sees C
B → eats C
C → the D
D → boy
D → girl
D → monkey

the cat bites the boy

the mouse eats the monkey

the duck sees the girl
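This right-linear grammar describes a finite language, so every sentence it generates can be enumerated. A sketch, with the dictionary encoding and helper name our own: each non-terminal maps to (word, next non-terminal) pairs, and None marks the end of a derivation.

```python
# The regular grammar above, encoded as
# non-terminal -> [(terminal, next non-terminal), ...];
# None means the rule has no non-terminal on its right-hand side.
GRAMMAR = {
    "S": [("the", "A")],
    "A": [("cat", "B"), ("mouse", "B"), ("duck", "B")],
    "B": [("bites", "C"), ("sees", "C"), ("eats", "C")],
    "C": [("the", "D")],
    "D": [("boy", None), ("girl", None), ("monkey", None)],
}

def sentences(symbol="S", prefix=()):
    """Yield every sentence the grammar generates."""
    for word, nxt in GRAMMAR[symbol]:
        if nxt is None:
            yield " ".join(prefix + (word,))
        else:
            yield from sentences(nxt, prefix + (word,))
```

The grammar generates 3 × 3 × 3 = 27 sentences, including the three shown above.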

Regular grammars

Grammar 1:          Grammar 2:
A → a               A → a
A → a B             A → B a
B → b A             B → A b

Grammar 3:          Grammar 4:
A → a               A → a
A → a B             A → B a
B → b               B → b
B → b A             B → A b

Grammar 5:          Grammar 6:
S → a A             A → A a
S → b B             A → B a
A → a S             B → b
B → b b S           B → A b
S → ε               A → a

Grammars: non-regular

Grammar 6:          Grammar 7:
S → A B             A → a
S → b B             A → B a
A → a S             B → b
B → b b S           B → b A
S → ε

Finite-State Automaton

[FSA diagram: NP --article--> NP1 --noun--> NP2, with an adjective self-loop on NP1]

NP → article NP1
NP1 → adjective NP1
NP1 → noun NP2
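The automaton can be sketched as a transition table over word categories. The state names follow the diagram above; the table encoding and `accepts` helper are our own.

```python
# NP finite-state automaton:
#   NP --article--> NP1,  NP1 --adjective--> NP1 (self-loop),
#   NP1 --noun--> NP2 (final state).
TRANSITIONS = {
    ("NP", "article"): "NP1",
    ("NP1", "adjective"): "NP1",
    ("NP1", "noun"): "NP2",
}
FINAL_STATES = {"NP2"}

def accepts(categories, start="NP"):
    """True iff the category sequence drives the FSA to a final state."""
    state = start
    for cat in categories:
        state = TRANSITIONS.get((state, cat))
        if state is None:  # no transition defined: reject
            return False
    return state in FINAL_STATES
```

The self-loop on NP1 is what lets the automaton accept any number of adjectives between the article and the noun.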

A parse tree

        S            ← root node
      /   \
    NP     VP        ← non-terminal nodes
    |     /  \
    N    V    NP
             /  \
           DET   N

(the words dominated by N, V and DET are the terminal nodes)

Rule Type – 2

Name: Context Free

Example:

Phrase Structure Grammars/

Push-Down Automata

Rule type:

A → γ

with:

A = auxiliary node
γ = any number of terminal or auxiliary nodes

Recursiveness (centre embedding) allowed:

A → … A …

CF Grammar

A Context Free grammar consists of:

a) a finite terminal vocabulary VT

b) a finite auxiliary vocabulary VA

c) an axiom S ∈ VA

d) a finite number of context-free rules of the form A → γ,

where A ∈ VA

and γ ∈ (VA ∪ VT)*

In natural language syntax S is interpreted as the start symbol for sentence, as in S → NP VP

Natural language

Is English regular or CF?

If centre embedding is required, then it cannot be regular

Centre Embedding:

1.[The cat][likes tuna fish]

ab

2.The cat the dog chased likes tuna fish

a a b b

3.The cat the dog the rat bit chased likes tuna fish

a a a b b b

4.The cat the dog the rat the elephant admired bit chased likes tuna fish

a a a a b b b b

ab

aabb

aaabbb

aaaabbbb

[The cat][likes tuna fish]

ab

2. [The cat] [the dog] [chased] [likes ...]

a a b b
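The formal skeleton of these sentences is the language a^n b^n, which requires counting: a finite-state device cannot guarantee equal numbers of a's and b's, while a context-free rule such as S → a S b can. A direct membership check, as a sketch (the function name is ours):

```python
# Membership test for a^n b^n with n >= 1, the pattern behind the
# centre-embedded cat/dog/rat sentences above.
def is_anbn(s):
    half, rem = divmod(len(s), 2)
    if rem != 0 or half == 0:
        return False  # odd length or empty string: reject
    # first half must be all a's, second half all b's
    return s == "a" * half + "b" * half
```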

Centre embedding

[Tree: S → NP VP; "the cat" = a, "likes tuna" = b]
= a b

[Tree with one embedded S inside the subject NP:
 [the cat [the dog chased]] likes tuna
 the cat = a, the dog = a, chased = b, likes tuna = b]
= a a b b

[Tree with two embedded S's:
 [the cat [the dog [the rat bit] chased]] likes tuna]
= a a a b b b

Natural language 2

More Centre Embedding:

1.If S1, then S2

a a

2.Either S3, or S4

b b

Sentence with embedding:

If either the man is arriving today or the woman is arriving tomorrow, then the child is arriving the day after.

a = [if
b = [either the man is arriving today]
b = [or the woman is arriving tomorrow]]
a = [then the child is arriving the day after]

= abba

CS languages

The following languages cannot be generated by a CF grammar (by pumping lemma):

a^n b^m c^n d^m

Swiss German:

A string of dative nouns (e.g. aa), followed by a string of accusative nouns (e.g. bbb), followed by a string of dative-taking verbs (cc), followed by a string of accusative-taking verbs (ddd)

= aabbbccddd

= a^n b^m c^n d^m

Swiss German:

Jan sait das (Jan says that) …

mer em Hans es Huus hälfed aastriiche
we Hans/DAT the house/ACC helped paint
'we helped Hans paint the house'
     a        b         c      d

NPdat NPdat NPacc NPacc Vdat Vdat Vacc Vacc
  a     a     b     b    c    c    d    d
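Although no context-free grammar generates the cross-serial pattern a^n b^m c^n d^m, membership is easy to verify directly by counting. A sketch (the function name is ours):

```python
import re

# Membership test for a^n b^m c^n d^m (n, m >= 1): the Swiss German
# pattern of dative NPs, accusative NPs, dative-taking verbs and
# accusative-taking verbs with matched counts.
def is_anbmcndm(s):
    m = re.fullmatch(r"(a+)(b+)(c+)(d+)", s)
    if m is None:
        return False  # wrong shape: not a+ b+ c+ d+
    a, b, c, d = (len(g) for g in m.groups())
    return a == c and b == d  # cross-serial dependencies
```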

Sets of rules expressing how symbols of the language fit together, e.g.
S -> NP VP
NP -> Det N
Det -> the
N -> dog

- LHS of a rule is just one symbol.
- Can have: NP -> Det N
- Cannot have: X NP Y -> X Det N Y

- Non Terminal Symbols
- Terminal Symbols
- Words
- Preterminals

- Symbols which have definitions
- Symbols which appear on the LHS of rules:
S -> NP VP
NP -> Det N
Det -> the
N -> dog

- The same non-terminal can have several definitions:
S -> NP VP
NP -> Det N
NP -> N
Det -> the
N -> dog

- Symbols which appear in the final string
- Correspond to words
- Are not defined by the grammar:
S -> NP VP
NP -> Det N
Det -> the
N -> dog

- NT symbols which produce terminal symbols are sometimes called pre-terminals:
S -> NP VP
NP -> Det N
Det -> the
N -> dog

- Sometimes we are interested in the shape of sentences formed from pre-terminals:
Det N V
Aux N V D N

A CFG is a tuple (N, Σ, R, S)

- N is a set of non-terminal symbols
- Σ is a set of terminal symbols disjoint from N
- R is a set of rules, each of the form A → β, where A is a non-terminal and β is a string over N ∪ Σ
- S is a designated start symbol

grammar:

S → NP VP
NP → N
VP → V NP

lexicon:

V → kicks
N → John
N → Bill

N = {S, NP, VP, N, V}
Σ = {kicks, John, Bill}
R = (see opposite)
S = “S”
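The tuple (N, Σ, R, S) for this grammar can be written down as plain data, and since the grammar has no recursion its language is finite and can be enumerated. A sketch, with the encoding and the `expansions` helper our own:

```python
from itertools import product

# The kicks/John/Bill CFG as an explicit (N, Sigma, R, S) tuple.
N = {"S", "NP", "VP", "N", "V"}
SIGMA = {"kicks", "John", "Bill"}
R = {
    "S": [("NP", "VP")],
    "NP": [("N",)],
    "VP": [("V", "NP")],
    "N": [("John",), ("Bill",)],
    "V": [("kicks",)],
}
START = "S"

def expansions(symbol):
    """All terminal strings derivable from `symbol`, as word tuples."""
    if symbol in SIGMA:
        return [(symbol,)]
    result = []
    for rhs in R[symbol]:
        # combine the expansions of each right-hand-side symbol
        for combo in product(*(expansions(s) for s in rhs)):
            result.append(tuple(w for part in combo for w in part))
    return result

language = {" ".join(words) for words in expansions(START)}
```

Because N rewrites to either John or Bill in both NP positions, the language has exactly four sentences.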

- Write grammars that generate the following languages, for m, n > 0:

(ab)^m

a^n b^m

a^n b^n

- Which of these are Regular?
- Which of these are Context Free?

S -> a b

S -> a b S

S -> a b

S -> a b S

S -> a X

X -> b Y

Y -> a b

Y -> S

S -> A B

A -> a

A -> a A

B -> b

B -> b B

S -> A B

A -> a

A -> a A

B -> b

B -> b B

S -> a AB

AB -> a AB

AB -> B

B -> b

B -> b B

        S
      /   \
    NP     VP
    |     /  \
    N    V    NP
    |    |    |
  John kicks  N
              |
            Bill

grammar:

S → NP VP
NP → N
VP → V NP

lexicon:

V → kicks
N → John
N → Bill

grammar:

S → NP NP
NP → N V
NP → N

lexicon:

V → kicks
N → John
N → Bill

        S
      /   \
    NP     NP
   /  \     |
  N    V    N
  |    |    |
John kicks Bill

- The structure assigned by the grammar should be appropriate.
- The structure should
- Be understandable
- Allow us to make generalisations.
- Reflect the underlying meaning of the sentence.

- A grammar is ambiguous if it assigns two or more structures to the same sentence.
NP → NP CONJ NP

NP → N

lexicon:

CONJ → and

N → John

N → Bill
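Under NP → NP CONJ NP, a coordination of k names has as many structures as there are binary bracketings of the names (the Catalan numbers), so ambiguity grows quickly. A small counting sketch (the helper names are ours):

```python
from functools import lru_cache

# Number of parse trees the ambiguous rules NP -> NP CONJ NP | N
# assign to a coordination of k names ("John and Bill and ...").
def count_np_parses(k):
    @lru_cache(maxsize=None)
    def spans(i, j):
        """Parses covering names i..j as a single NP."""
        if i == j:
            return 1  # NP -> N
        # split at each conjunction between names m and m+1
        return sum(spans(i, m) * spans(m + 1, j) for m in range(i, j))
    return spans(0, k - 1)
```

Two names give one structure, three give two, four give five, and so on: this is exactly the point that a grammar should not assign too many structures to the same sentence.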

- The grammar should not generate too many possible structures for the same sentence.

- Does it undergenerate?
- Does it overgenerate?
- Does it assign appropriate structures to sentences it generates?
- Is it simple to understand? How many rules are there?
- Does it contain just a few generalisations or is it full of special cases?
- How ambiguous is it? How many structures does it assign for a given sentence?

- The PATR-II formalism can be viewed as a computer language for encoding linguistic information.
- A PATR-II grammar consists of a set of rules and a lexicon.
- The rules are CFG rules augmented with constraints.
- The lexicon provides information about terminal symbols.

Grammar (grammar.grm)

Rule

s -> np vp

Rule

np -> n

Rule

vp -> v

Lexicon (lexicon.lex)

\w uther

\c n

\w sleeps

\c v

- A PCFG is a probabilistic version of a CFG where each production has a probability.
- String generation is now probabilistic where production probabilities are used to non-deterministically select a production for rewriting a given non-terminal.

- In a PCFG, the probability P(A → β) expresses the likelihood that the non-terminal A will expand as β.
- e.g. the likelihood that S → NP VP
- (as opposed to S → VP, or S → NP VP PP, or …)

- e.g. the likelihood that S → NP VP
- can be interpreted as a conditional probability:
- the probability of the expansion, given the LHS non-terminal
- P(A → β) = P(A → β | A)

- Therefore, for any non-terminal A, the probabilities of every rule of the form A → β must sum to 1
- If this is the case, we say the PCFG is consistent
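The consistency condition can be checked mechanically: group the rules by left-hand side and verify each group's probabilities sum to 1. A sketch using a toy rule table (the dictionary encoding is ours):

```python
from collections import defaultdict

# A PCFG is consistent iff, for each LHS non-terminal, the
# probabilities of its expansions sum to 1.
def is_consistent(rules, tol=1e-9):
    totals = defaultdict(float)
    for (lhs, _rhs), prob in rules.items():
        totals[lhs] += prob
    return all(abs(total - 1.0) <= tol for total in totals.values())

# Toy fragment of the S and PP rules from the grammar below.
rules = {
    ("S", ("NP", "VP")): 0.8,
    ("S", ("Aux", "NP", "VP")): 0.1,
    ("S", ("VP",)): 0.1,
    ("PP", ("Prep", "NP")): 1.0,
}
```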

Grammar                      Prob
S → NP VP                    0.8
S → Aux NP VP                0.1
S → VP                       0.1
NP → Pronoun                 0.2
NP → Proper-Noun             0.2
NP → Det Nominal             0.6
Nominal → Noun               0.3
Nominal → Nominal Noun       0.2
Nominal → Nominal PP         0.5
VP → Verb                    0.2
VP → Verb NP                 0.5
VP → VP PP                   0.3
PP → Prep NP                 1.0

Lexicon                                      Prob
Det → the | a | that | this                  0.6  0.2  0.1  0.1
Noun → book | flight | meal | money          0.1  0.5  0.2  0.2
Verb → book | include | prefer               0.5  0.2  0.3
Pronoun → I | he | she | me                  0.5  0.1  0.1  0.3
Proper-Noun → Houston | NWA                  0.8  0.2
Aux → does                                   1.0
Prep → from | to | on | near | through       0.25 0.25 0.1  0.2  0.2

(For each non-terminal, the probabilities of its rules sum to 1.0.)

- Assume productions for each node are chosen independently.
- Probability of a parse tree (derivation) is the product of the probabilities of its productions.
- Resolve ambiguity by picking most probable parse tree.
- Probability of a sentence is the sum of the probabilities of all of its derivations.

- S → NP VP 1.0
- NP → DT NN 0.5
- NP → NNS 0.3
- NP → NP PP 0.2
- PP → P NP 1.0
- VP → VP PP 0.6
- VP → VBD NP 0.4

- DT → the 1.0
- NN → gunman 0.5
- NN → building 0.5
- VBD → sprayed 1.0
- NNS → bullets 1.0
- P → with 1.0

- The gunman sprayed the building with bullets.

[Parse tree t1 — the PP "with bullets" attaches to the VP:
 S(1.0)
 ├─ NP(0.5): DT(1.0) the, NN(0.5) gunman
 └─ VP(0.6)
    ├─ VP(0.4): VBD(1.0) sprayed, NP(0.5): DT(1.0) the, NN(0.5) building
    └─ PP(1.0): P(1.0) with, NP(0.3): NNS(1.0) bullets]

P(t1) = 1.0 × 0.5 × 1.0 × 0.5 × 0.6 × 0.4 × 1.0 × 0.5 × 1.0 × 0.5 × 1.0 × 1.0 × 0.3 × 1.0 = 0.0045

- The gunman sprayed the building with bullets.

[Parse tree t2 — the PP "with bullets" attaches to the object NP:
 S(1.0)
 ├─ NP(0.5): DT(1.0) the, NN(0.5) gunman
 └─ VP(0.4)
    ├─ VBD(1.0) sprayed
    └─ NP(0.2)
       ├─ NP(0.5): DT(1.0) the, NN(0.5) building
       └─ PP(1.0): P(1.0) with, NP(0.3): NNS(1.0) bullets]

P(t2) = 1.0 × 0.5 × 1.0 × 0.5 × 0.4 × 1.0 × 0.2 × 0.5 × 1.0 × 0.5 × 1.0 × 1.0 × 0.3 × 1.0 = 0.0015

P(sentence) = P(t1) + P(t2) = 0.0045 + 0.0015 = 0.006
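Both parse probabilities, and the sentence probability, can be checked by multiplying out the rule probabilities each tree uses. A sketch; the factor lists are read off the two trees, and the variable names are ours:

```python
from math import prod

# Rule probabilities shared by both parses of
# "The gunman sprayed the building with bullets":
shared = [
    1.0,            # S -> NP VP
    0.5, 1.0, 0.5,  # NP -> DT NN, DT -> the, NN -> gunman
    1.0,            # VBD -> sprayed
    0.5, 1.0, 0.5,  # NP -> DT NN, DT -> the, NN -> building
    1.0, 1.0,       # PP -> P NP, P -> with
    0.3, 1.0,       # NP -> NNS, NNS -> bullets
]

p_t1 = prod(shared) * 0.6 * 0.4  # plus VP -> VP PP, VP -> VBD NP
p_t2 = prod(shared) * 0.4 * 0.2  # plus VP -> VBD NP, NP -> NP PP
p_sentence = p_t1 + p_t2         # sum over all derivations
```

Picking the most probable tree resolves the attachment ambiguity in favour of t1, the VP attachment.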

Great!

See you next time!