
Word Grammar in Theory


Presentation Transcript


  1. Word Grammar in Theory Dick Hudson Cardiff, May 2013

  2. History • 1963: PhD on Beja grammar at SOAS • Halliday or Chomsky? • Halliday was more convincing • 1964-71: worked with Halliday • 64-67 with Huddleston on a corpus study • 66-67 met Terry Winograd (AI) • read about Lamb's network theory • wrote first generative Systemic Grammar book

  3. Systemic Grammar

  4. Dependency theory • 1976: why no word-word dependencies? • Systemic + Dependency = Daughter-Dependency Grammar

  5. Psychological reality • AI (e.g. Winograd, Schank, Anderson) • Mental networks (Lamb, Halliday) • Linguists (Bresnan, Chomsky) • Sociolinguistics (e.g. Labov, Gumperz)

  6. Word Grammar 1984 • Why recognise phrases? • Daughter-Dependency Grammar + psychological reality – phrases = WG

  7. What am I? • a lapsed systemicist • a dependency grammarian • a cognitive linguist • a sociolinguist • a descriptive linguist • a theoretical linguist • an educational linguist

  8. What do I believe? • Theory matters and can be improved • Truth matters and can be aimed at • Minds matter and can be modelled • Texts matter and can be explained • Education matters and needs linguistics • Neighbouring disciplines matter, and can be learned from

  9. A nice quote One has to ask … whether a particular claim or assumption in one domain can really be true, given what we know about another domain. Wray 2009

  10. What don't I believe in? • macrofunctions in structure • system networks • the rank scale • constructions as signs

  11. Macrofunctions in structure • Many artefacts have multiple functions • But the functions are all 'realized' by the same structure. • e.g. chair: • comfortable for sitting • strong, durable • good-looking • affordable

  12. Why 'interpersonal'? • For instance, the hearer is part of the meaning of 'question', but not otherwise. • What does 'theme' mean?

  13. System networks • Why is this the first choice? • How might we organise knowledge so tidily? And why?

  14. The rank scale • Objections: unary branching, headedness • [Diagram: Hurry! analysed as a sentence consisting of a single clause (hurry), consisting of a single group (hurry), consisting of a single word (hurry) – pure unary branching] • The classification of the mother is always predictable from its head daughter.

  15. Constructions • CxG: a language consists of nothing but constructions. • A construction is a form-meaning pair. • So: every 'form' has a meaning • But some forms have no meaning, e.g. • extraposition • (arguably) all of morphology, e.g. {z}, {ing}

  16. What does WG actually claim? • Language is just ordinary knowledge applied to words. So: • Language is a mental network. • The network nodes (concepts) are atoms. • Concepts are defined solely by their links to other concepts. • Concepts are organised in 'isa' hierarchies.
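
A concrete illustration of these claims may help. The following is a minimal sketch, not part of the presentation: it assumes a simple Python representation in which a node carries nothing beyond its links and its place in the isa hierarchy, so that concepts really are atoms defined only by their links. The names Concept, add_link and inherited are invented for illustration.

```python
# A toy sketch (not from the slides) of a WG-style network:
# nodes are atoms, defined only by their links to other nodes,
# and organised in an 'isa' hierarchy with default inheritance.

class Concept:
    def __init__(self, label=None):
        # The label is a display convenience only; as the next slide
        # stresses, labels are NOT part of the analysis.
        self.label = label
        self.links = {}    # relation name -> another Concept
        self.isa = []      # supertypes in the isa hierarchy

    def add_link(self, relation, other):
        self.links[relation] = other

    def inherited(self, relation):
        # Default inheritance: a local link overrides anything inherited;
        # otherwise search up the isa hierarchy.
        if relation in self.links:
            return self.links[relation]
        for supertype in self.isa:
            value = supertype.inherited(relation)
            if value is not None:
                return value
        return None

# Tiny example: a token of 'cat' isa the lexeme CAT, which isa noun,
# so the token inherits its meaning by default.
word = Concept("word")
noun = Concept("noun"); noun.isa.append(word)
CAT = Concept("CAT");   CAT.isa.append(noun)
CAT.add_link("meaning", Concept("the concept 'cat'"))
cat_token = Concept("cat (token)")
cat_token.isa.append(CAT)
print(cat_token.inherited("meaning").label)   # -> the concept 'cat'
```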

  17. Labels are NOT part of the analysis • [Diagram: a network linking the forms {on}, {a} and {round} to the words on, round, aground and around, with an isa hierarchy of senses around1, around2, …] • For instance: … all around were …, … messing around …, … looked around …, … the soil around the base of the plant …

  18. Tokens and types • Tokens are created ad hoc; types are stored. • The type-token ratio (TTR) = types/tokens • e.g. for The cat sat on the mat, TTR = 5/6 • Tokens have distinct properties, so they must be distinct concepts. • Each token isa at least one type. • e.g. around in He wandered around isa around1 • Tokens show modifying effects of context.
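
As a quick check of the slide's figure, the short sketch below (mine, not the presentation's) computes the type-token ratio for the example sentence, assuming whitespace tokenisation and case-insensitive types, so that The and the count as one type.

```python
# Type-token ratio for the slide's example sentence.
from fractions import Fraction

def type_token_ratio(utterance):
    tokens = utterance.lower().split()      # 6 tokens
    return Fraction(len(set(tokens)), len(tokens))

print(type_token_ratio("The cat sat on the mat"))   # -> 5/6
```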

  19. An utterance as part of the grammar • [Diagram: the tokens of The cat sat on the mat linked by isa to the stored items THE, CAT, SIT, ON, MAT and SIT, past, and through them to the word classes word, noun, verb, preposition and to past]

  20. Tokens and syntax • Each word token is distinct from its type: • It shows the effects of its syntactic context. • the1 means 'the cat' • on means 'on the mat' • sat means 'sat on the mat' • but also 'the cat sat on the mat' • So phrases aren't needed.

  21. Tokens of tokens … • [Diagram: for The1 cat sat on the2 mat, derived token nodes (the11, the21, sat1, sat2, on1) carry the cumulative meanings 'cat', 'mat', 'the cat', 'the mat', 'on the mat', 'sat on the mat' and 'the cat sat on the mat']

  22. Usage • The network is 'usage based' • built out of episodic memories • It contains 'ex-tokens': words plus context • so formulae • and socially constrained • So any properties may be generalised • 'linguistic' – relations to other words • 'non-linguistic' – relations to social constructs

  23. Processing • Node creation: one node per token • Default inheritance to enrich token nodes • Binding tokens to existing nodes • Spreading activation • activation guides inheritance and binding • super-active tokens become permanent types
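
These processing steps can be sketched informally in code. The toy example below is my own illustration, not Hudson's algorithm: it shows activation spreading along links from a newly created token node, with the most active stored type taken as the node the token binds to. The function name, the decay parameter and the shape of the network are all assumptions made for the sketch.

```python
# Spreading activation over a small network (a rough sketch).
# network: node -> list of linked nodes; seeds: node -> initial activation.
def spread_activation(network, seeds, steps=2, decay=0.5):
    activation = dict(seeds)
    for _ in range(steps):
        updated = dict(activation)
        for node, level in activation.items():
            for neighbour in network.get(node, []):
                # Each linked node receives a decayed share of activation.
                updated[neighbour] = updated.get(neighbour, 0.0) + level * decay
        activation = updated
    return activation

# Toy network: a new token node for 'cat' is linked to the stored type CAT,
# which is linked upwards to the class noun; MAT is an unrelated competitor.
network = {"cat (token)": ["CAT"], "CAT": ["noun"], "MAT": ["noun"]}
act = spread_activation(network, {"cat (token)": 1.0})
best = max((n for n in act if n != "cat (token)"), key=act.get)
print(best)   # -> CAT: the most active stored type, to which the token binds
```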

  24. An utterance as part of the grammar • [Diagram: as on slide 19, but mid-processing – the tokens The, cat, sat, on, mat are being bound to the stored items THE, CAT, SIT, ON, MAT, SIT, past and the classes word, noun, verb, preposition, past, with one node still unidentified (?)]

  25. Summary • Mentalist – just cognitive structures • Just a network of atoms • Declarative network • Organised round an isa hierarchy • Procedures: • node creation • node enrichment (inheritance, binding) • spreading activation
