
Dependency structure and cognition

This text explores the nature of syntactic structure, specifically focusing on dependency structure and its inclusion of word-word dependencies. It discusses the familiarity of dependency structure in university courses, school grammar, and Czech children's education. Additionally, it examines the convenience of dependency structure in computational linguistics and the relevance of cognition in understanding syntactic structures.


Presentation Transcript


  1. Dependency structure and cognition Richard Hudson Depling2013, Prague

  2. The question • What is syntactic structure like? • Does it include dependencies between words (dependency structure)? • Or does it only contain part-whole links (phrase structure)? • [slide shows the example 'She looked after him' analysed both ways]
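The two analyses the slide contrasts can be written out as data. Below is a minimal sketch, in Python, of a dependency analysis (labelled word-word links) versus a phrase-structure analysis (part-whole bracketing only) for the example sentence; the relation labels and the bracketing are my own illustration, not taken from the talk.

```python
# A minimal sketch contrasting the two kinds of structure for the slide's
# example "She looked after him".  Relation labels and bracketing are
# illustrative assumptions, not the talk's own analysis.

# Dependency structure: one node per word, plus labelled word-word links.
# Each dependent is mapped to (relation, head); the root has no head.
dependency = {
    "She":    ("subject", "looked"),
    "after":  ("complement", "looked"),   # the exact analysis varies by theory
    "him":    ("complement", "after"),
    "looked": None,                        # root of the sentence
}

# Phrase structure: only part-whole (constituency) links, no word-word links.
phrase_structure = (
    "S",
    ("NP", "She"),
    ("VP", "looked", ("PP", "after", ("NP", "him"))),
)

print(dependency)
print(phrase_structure)
```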

  3. Relevant evidence: familiarity • University courses teach only one approach. • School grammar sometimes offers one, usually dependency structure: • even in the USA (Reed-Kellogg sentence-diagramming) • especially in Europe • and especially in the Czech Republic!

  4. What Czech children do at school [slide shows a school-style dependency diagram of a Czech sentence, glossed roughly 'yellow kingcups blossomed out by the stream'] (Jirka Hana & Barbora Hladká 2012)

  5. or even …

  6. Relevant evidence: convenience • Dependency structure is popular in computational linguistics. • Maybe because of its simplicity: • few nodes • nothing but orthographic words • Good for lexical cooccurrence relations
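To make the "few nodes, only words" point concrete, here is a minimal sketch (my own illustration, not from the talk) of a dependency parse stored as a flat, CoNLL-style table, from which head-dependent word pairs for lexical cooccurrence fall out directly; the indices and relation labels are assumptions made for the example.

```python
# A minimal sketch of why dependency structure is computationally convenient:
# one node per orthographic word, so a whole parse fits in a flat table.

tokens = [
    # (index, word, head_index, relation); head 0 means the sentence root
    (1, "She",    2, "subj"),
    (2, "looked", 0, "root"),
    (3, "after",  2, "prt"),
    (4, "him",    3, "obj"),
]

# Lexical cooccurrence (head, dependent) pairs fall straight out of the table.
words = {i: w for i, w, _, _ in tokens}
pairs = [(words.get(h, "ROOT"), w) for _, w, h, _ in tokens]
print(pairs)  # [('looked', 'She'), ('ROOT', 'looked'), ('looked', 'after'), ('after', 'him')]
```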

  7. Relevant evidence: cognition • Language competence is memory • Language processing is thinking • Memory and thinking are part of cognition • So what do we know about cognition? • A. Very generally, cognition is not simple • so maybe syntactic structures aren't in fact simple?

  8. B. Knowledge is a network [slide shows a family network linking Gretta, John, Colin, Gaynor, 'me', Lucy and Peter]

  9. C. Links are classified relations [slide shows a taxonomy of relations between persons: 'parent' (of a child) is-a 'relative'; 'mother' (a woman) and 'father' (a man) is-a 'parent']

  10. D. Nodes are richly related [slide shows the family network again, with each link now labelled by its relation: mother, father, son, brother, wife, husband, daughter, grandfather]
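Slides 8-10 describe knowledge as a network of nodes joined by classified links, with a single node carrying many relations at once. Here is a minimal sketch of that idea as a data structure; the names come from the slides, but the particular relations are assumptions made for the example.

```python
# Knowledge as a network of nodes joined by *classified* (labelled) links,
# with one node taking part in many relations at once.

from collections import defaultdict

# (node, relation, node) triples; the relation label is the classification.
links = [
    ("Gretta", "mother",   "me"),
    ("John",   "father",   "me"),
    ("Colin",  "brother",  "me"),
    ("Gaynor", "wife",     "me"),
    ("Lucy",   "daughter", "me"),
    ("Peter",  "son",      "me"),
]

# One node ('me') is richly related: it participates in many typed links.
relations_of = defaultdict(list)
for other, relation, node in links:
    relations_of[node].append((relation, other))

print(relations_of["me"])
```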

  11. E. Is-a allows default inheritance • Is-a forms taxonomies. • e.g. 'linguist is-a person', 'Dick is-a linguist' • Properties 'inherit' down a taxonomy. • But only 'by default' – exceptions are ok. • e.g. birds (normally) fly • but penguins don't.

  12. Penguins [slide shows the taxonomy: 'robin' and 'penguin' is-a 'bird'; 'bird' flies by default, but 'penguin' doesn't fly, so a robin token (robin*) inherits 'flies' while a penguin token (penguin*) inherits 'doesn't fly']
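Slides 11-12 describe default inheritance down an is-a taxonomy. The following is a minimal sketch of how such inheritance can work, assuming a simple "nearest statement wins" lookup; it is my own illustration of the mechanism, not Hudson's formalism.

```python
# Default inheritance over an is-a taxonomy: properties inherit down the
# hierarchy, but the nearest statement wins, so exceptions override defaults.

taxonomy = {             # node -> its is-a parent
    "robin":    "bird",
    "penguin":  "bird",
    "robin*":   "robin",     # tokens (exemplars) is-a their type
    "penguin*": "penguin",
    "bird":     None,
}

properties = {           # locally stated properties only
    "bird":    {"flies": True},
    "penguin": {"flies": False},   # exception to the default
}

def inherit(node, prop):
    """Walk up the is-a chain; the first node that states the property wins."""
    while node is not None:
        if prop in properties.get(node, {}):
            return properties[node][prop]
        node = taxonomy.get(node)
    return None

print(inherit("robin*", "flies"))    # True  (inherited from 'bird')
print(inherit("penguin*", "flies"))  # False (overridden at 'penguin')
```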

  13. Cognitivism • 'Cognitivism' • 'Language is an example of ordinary cognition' • So all our general cognitive abilities are available for language • and we have no special language abilities. • Cognitivism matters for linguistic theory.

  14. Some consequences of cognitivism • Word-word dependencies are real. • 'Deep' and 'surface' properties combine. • Mutual dependency is ok. • Dependents create new word tokens. • Extra word tokens allow raising. • But lowering may be ok too.

  15. 1. Word-word dependencies are real • Do word-word dependencies exist (in our minds)? • Why not? • Compare social relations between individuals. • What about phrases? • Why not? • But maybe only their boundaries are relevant? • Boundaries aren't classified, so there's no unary branching.

  16. Punctuation marks boundaries • At the end of the road, turn right. • Not: • At the end of the, road turn right. • At the end, of the road turn right. • At the end of the road turn right, • How do we learn to punctuate if we can't recognise boundaries?

  17. No unary branching • If S → NP + VP, then 'Cows moo.' needs a unary-branching tree: S over NP and VP, with NP dominating only the noun 'Cows' and VP dominating only the verb 'moo'. • But if a verb's subject is simply a noun, no unary branching is needed: 'Cows' depends directly on 'moo'.

  18. 2. 'Deep' and 'surface' properties combine. • Dependencies are relational concepts. • Concepts record bundles of properties that tend to coincide • e.g. 'bird': beak, flying, feathers, two legs, eggs • 'mother': bearer, carer • So one dependency has many properties: • semantic, syntactic, morphosyntactic • e.g. 'subject' ….

  19. 'subject' The typical subject is defined by • meaning • typically 'actor' or … • word order and/or case • typically before verb and/or nominative • agreement • typically the verb agrees with it • status • obligatory or optional, according to finiteness

  20. So … • Cognition suggests that 'deep' and 'surface' properties should be combined • not separated • They are in harmony by default • but exceptionally they may be out of harmony • this is allowed by default inheritance
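Slides 18-20 treat a dependency such as 'subject' as a single concept bundling 'deep' and 'surface' properties that may exceptionally fall out of harmony. A minimal sketch of that idea (my own illustration; the property names, values and the 'passive subject' label are assumptions):

```python
# The 'subject' dependency as one concept bundling default 'deep' and
# 'surface' properties, with exceptions overriding individual defaults.

subject_defaults = {
    "meaning":   "actor",                  # semantic ('deep')
    "position":  "before the verb",        # word order ('surface')
    "case":      "nominative",
    "agreement": "the verb agrees with it",
    "status":    "obligatory if the verb is finite, otherwise optional",
}

# An out-of-harmony case keeps the surface defaults but overrides the meaning,
# just as default inheritance allows.
passive_subject = {**subject_defaults, "meaning": "undergoer"}

print(passive_subject["meaning"], "/", passive_subject["case"])
```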

  21. 3. Mutual dependency is ok. • Mutual dependency is formally impossible in standard notation • And is formally impossible in phrase structure theory • So if it exists, we need to • resist PS theory • change the standard notation

  22. Mutual dependency exists • I wonder who came? • Who is subject of came, • so who depends on came. • But who depends on wonder • and came can be omitted: • e.g. Someone came – I wonder who. • So came depends on who.

  23. Standard notation • If A 'dominates' B, A is drawn above B • so B cannot also 'dominate' A.
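The point of slides 21-23 can be restated in data terms: if syntactic structure is a set of labelled word-word arcs rather than a dominance tree, mutual dependency is representable. A minimal sketch, with relation labels assumed for 'I wonder who came':

```python
# If structure is a set of labelled word-word arcs, mutual dependency can be
# recorded; a dominance tree cannot draw both arcs at once.

# Arcs for "I wonder who came", keyed as (head, dependent).
arcs = {
    ("wonder", "I"):    "subject",
    ("wonder", "who"):  "complement",
    ("came",   "who"):  "subject",      # 'who' depends on 'came' ...
    ("who",    "came"): "complement",   # ... and 'came' depends on 'who'
}

# 'who' and 'came' each depend on the other: a cycle, unproblematic in a graph,
# impossible in a tree where one node would have to dominate the other.
print(("came", "who") in arcs and ("who", "came") in arcs)  # True
```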

  24. 4. Dependents create new word tokens. • General cognition: • every exemplar needs a mental node. • no node carries contradictory properties. • so some exemplars need two nodes. • E.g. when we re-classify things. • NB we can remember both classifications

  25. What kind of bird? [slide shows an exemplar bird B, of unknown kind, given a second node B* that is re-classified as 'blackbird' once its mate is taken into account]

  26. And in language … [slide shows a token 'like', an instance of the verb LIKE (which is-a word); adding the subject 'I' creates a new token like*] • NB like* is a token of a token

  27. The effect of a dependent • When we recognise a dependent for W, we change W into a new token W*. • The classification of W* may change. • W* also has a new meaning • normally a hyponym of W • but may be idiomatic • If we add dependents singly, this gives a kind of phrase structure!

  28. typical French house [slide shows the word HOUSE with a token 'house' meaning 'house'; adding the dependent 'French' creates house*, meaning 'French house'; adding 'typical' creates house**, meaning 'typical French house']

  29. Notation [slide shows the same analysis in dependency notation over the string 'typical French house', with the tokens house* and house**]
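Slides 27-29 can be sketched as a chain of word tokens, each created by adding one dependent and each carrying a narrower meaning. The following is my own illustration of that idea; the field names and glosses are assumptions, not Hudson's notation.

```python
# Adding a dependent to a word W creates a new token W* with a narrower
# meaning, so adding dependents one at a time yields something like phrase
# structure.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Token:
    form: str                    # the word itself, e.g. 'house'
    based_on: Optional["Token"]  # the token this one is a new version of
    dependent: Optional[str]     # the dependent whose addition created it
    meaning: str                 # illustrative gloss

house  = Token("house", None,   None,      "house")
house1 = Token("house", house,  "French",  "French house")           # house*
house2 = Token("house", house1, "typical", "typical French house")   # house**

# house** is a token of a token: follow the chain back to the bare word.
t = house2
while t is not None:
    print(t.meaning, "| dependent added:", t.dependent)
    t = t.based_on
```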

  30. 5. Extra word tokens allow raising. [slide shows 'It keeps raining': 'it' is the subject of 'keeps' and, as the raised token it*, also the subject of 'raining'; 'raining' is the predicative of 'keeps']

  31. Raising in the grammar • A* is-a A, so A* wins. [slide shows a shared dependent with two tokens: A* linked to the higher parent B and A linked to the lower parent C]

  32. 6. But lowering may be ok too. • Raising is helpful for processing • the higher parent is nearer to the sentence root. • But sometimes lowering is helpful too • e.g. if it allows a new meaning-unit. • Eine Concorde gelandet ist hier nie. • gloss: 'a Concorde landed has here never' • i.e. 'A-Concorde-landing has never happened here.'

  33. German Partial VP fronting [slide shows 'Eine Concorde gelandet ist nie hier', with 'Eine Concorde' linked to a higher parent and its lowered token Eine Concorde* linked to the lower parent 'gelandet']

  34. Conclusions • Language is just part of cognition. • So syntactic dependencies are: • psychologically real • rich (combining 'deep' and 'surface' properties) • complex (e.g. mutual, multiple). • And dependency combines with • default inheritance • multiple tokens
