
Language, Rules, and Serial Processing



  1. Language, Rules, and Serial Processing Prof.dr. Jaap Murre University of Amsterdam University of Maastricht jaap@murre.com http://www.neuromod.org

  2. Overview • Speech • Symbols • Language • Language acquisition • Semantics

  3. Speech

  4. What is speech? • Speech is a modulated waveform produced by a source (the lungs and glottis) and filtered by the vocal tract, lips, and cheeks

  5. Source-filter model of speech
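
To make the source-filter idea concrete, here is a minimal sketch in Python; the impulse-train source, one-pole "vocal tract" filter, sample rate, and pitch are all illustrative assumptions, not the actual model in the figure:

    import numpy as np

    fs = 8000                     # sample rate in Hz (assumed)
    f0 = 100                      # glottal pulse rate (pitch) in Hz (assumed)

    # Source: a crude glottal model, an impulse train at the pitch period
    source = np.zeros(fs)         # one second of samples
    source[::fs // f0] = 1.0

    # Filter: a hypothetical one-pole resonator y[n] = x[n] + a*y[n-1]
    # standing in for the vocal tract
    a = 0.95
    speech = np.zeros(fs)
    for n in range(1, fs):
        speech[n] = source[n] + a * speech[n - 1]

Changing the filter while keeping the same source changes the timbre of the output; changing f0 changes only the pitch. That separation is the core of the source-filter model.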

  6. Speech perception is very difficult. It is also categorical.

  7. Categories, Symbols, Subsymbols

  8. Categories • Items within a category are considered equal with respect to that category • Categories thus induce their own similarity structure on whatever they represent • A dolphin seems to be a big fish, but as a mammal it is in a different category • Networks always impose a priori constraints on categories

  9. Induced categories • Categories are formed on the basis of regularities in the input (a posteriori constraints) and network architecture (a priori constraints) • Neural network categories are subsymbolic

  10. Symbol • What we now call a symbol is really what de Saussure calls a sign (his symbol is what we would now call an icon or logo) • Ferdinand de Saussure (1857-1913), Swiss founder of modern linguistics

  11. From Course in General Linguistics (1916), Nature of the Linguistic Sign • The linguistic sign unites, not a thing and a name, but a concept and a sound-image. The latter is not a material sound, a purely physical thing, but the psychological imprint of the sound, the impression it makes on our senses. (p.66)

  12. Generalization of the sign • I propose to retain the word sign [signe] to designate the whole and to replace concept and sound-image respectively by signified [signifié] and signifier [signifiant]; … (p.67)

  13. Principle I: The Arbitrary Nature of the Sign • The bond between the signifier and the signified is arbitrary (p.67) • … the individual does not have the power to change a sign in any way once it has become established in the linguistic community… (p.69)

  14. Principle II: The Linear Nature of the Signifier • The signifier, being auditory, is unfolded in time from which it gets the following characteristics: • (a) it represents a span, and • (b) the span is measurable in a single dimension; it is a line. (p.70)

  15. Problems for connectionism • Neural network categories are co-determined by input regularities and network architecture: • They are not arbitrary (symbols/signs are) • Neural network representations are parallel • Symbols/signs are linear (serial)

  16. Language

  17. Language • What is language? • Is it innate or learned? • Where is it located in the brain? • Can neural networks represent language processes?

  18. What is language? • De Saussure distinguished ‘langue’ from ‘parole’ • Chomsky distinguished ‘competence’ from ‘performance’ • Chomsky strongly defended the idea of the innateness of language

  19. Language is hierarchical and can be extremely ambiguous

  20. Grammar may be innate

  21. The essence of grammar is recursion • It allows an infinite number of sentences to be generated by just a few rules • Simple grammar G = {N, V, S, P} with productions S → aSa, S → bSb, S → c • E.g., c, aca, bcb, aacaa, aabacabaa • Derivation: S ⇒ aSa ⇒ aaSaa ⇒ aabSbaa ⇒ aabaSabaa ⇒ aabacabaa • The man lit his awful cigar • The man that you thought was old lit his awful cigar • The man that you thought that your mother had seen lit his awful cigar • et cetera
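
As a minimal sketch of how these few rules generate unboundedly many strings, the toy grammar above can be written as a recursive Python function (the depth bound is an illustrative assumption, added only to keep the recursion finite):

    import random

    def generate(depth=3):
        # Expand the start symbol S; depth caps the recursion for illustration
        if depth == 0:
            return "c"                              # forced stop: S -> c
        rule = random.choice(["a", "b", "c"])
        if rule == "c":
            return "c"                              # S -> c
        return rule + generate(depth - 1) + rule    # S -> aSa or S -> bSb

    # Possible outputs include 'c', 'aca', 'bcb', 'abcba', 'aacaa', ...
    print([generate() for _ in range(5)])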

  22. Where does language come from? • Certain aspects of the development of language and thought appear to be universal in that they • (i) precede any learning by the individual • (ii) are found in all individuals in the same way • These universals are often of a deep and abstract nature • It is not known at present how they are represented in the brain, or how they emerge from brain organization

  23. Universal constraints in thought development • Spelke shows that from a very early age, infants know about the continuity and solidity of objects • These constraints lie at the core of the developmental learning system • It is not clear how these are represented in the brain or how they emerge

  24. Selection versus instruction • Chomsky/Pinker: The child must select a grammar • Bickerton: The child is provided with a specific grammar, which it then modifies in the direction of the caretaker’s language

  25. Bickerton: Not all languages may be equally hard to learn • Children’s errors when learning English often resemble Creole, for example, the so-called double negative • Perhaps Creole is the ‘original mother language’

  26. Willem Levelt’s model of speech production and perception

  27. From concept to speech signal

  28. Very complicated transformations take place during speaking • A conceptual representation is a network of neurons that fire with a complex associative correlational pattern • This conceptual-semantic pattern is transformed into a hierarchical syntactic pattern • This pattern is in turn transformed into a serial speech pattern

  29. Language acquisition

  30. Simple recurrent networks: inducing syntactic structures

  31. Simple Recurrent Network [Figure: a feedforward network fed the sequence …, D, C, B, A, with a buffer holding a copy of the hidden layer]

  32. Simple Recurrent Network • Introduced by Jeff Elman as a simplification of the general recurrent backpropagation algorithm (Rumelhart, Hinton, & Williams, 1986). • Feedforward plus 1 buffer • Allows learning of sequences • Can learn simple grammars if embedding is not too deep (N.B. grammar induction is NP-Complete!)
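
A minimal forward-pass sketch of an Elman-style simple recurrent network (layer sizes and weight scales are assumptions; training via backpropagation is omitted):

    import numpy as np

    n_in, n_hid, n_out = 5, 10, 5        # assumed layer sizes
    rng = np.random.default_rng(0)
    W_ih = rng.normal(scale=0.1, size=(n_hid, n_in))   # input   -> hidden
    W_ch = rng.normal(scale=0.1, size=(n_hid, n_hid))  # context -> hidden
    W_ho = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden  -> output

    def step(x, context):
        # The hidden layer sees the current input plus a copy of the
        # previous hidden layer (the buffer in the figure above)
        hidden = np.tanh(W_ih @ x + W_ch @ context)
        return W_ho @ hidden, hidden     # new hidden state becomes next context

    context = np.zeros(n_hid)
    for x in np.eye(n_in):               # a toy sequence of one-hot symbols
        y, context = step(x, context)

The only difference from a plain feedforward network is the extra weight matrix from the copied hidden state, which is what lets the output depend on the sequence so far rather than on the current input alone.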

  33. Limitations (Sharkey et al., 2000) • Hard and unreliable to train • Extremely sensitive to initial weight configurations • Only 2 out of 90 networks learned to be Finite State Grammar Recognizers • Cannot handle remote dependencies • Useless as a psychological model for grammar acquisition

  34. Finite State Grammar L1 (Sharkey, Sharkey, and Jackson, 2000) [Figure: state-transition diagram of grammar L1, with numbered states 1-6, start state (S), halt state H, and arcs labeled A-E]

  35. Connectionism offers a battle ground for debate • McClelland and Rumelhart’s model of past tense learning has ignited a furious and fertile debate • Rather than rhetoric and assertions, models are used to support arguments • These models typically offer existence proofs at this point

  36. Existence proofs • You say: Your idea cannot work because you could never do X • I make a model that implements my idea and that can do X • Now I have an existence proof that my idea covers X • This does not in any way prove that my idea is correct or plausible from a psychological (or biological, etc.) perspective

  37. Much more of language can be induced than expected • Past tense, pluralization, and case systems can all be learned from examples • So can text-to-speech mappings • Exceptions can also be acquired in this way without disturbing the behavior of the network • Phonological, segmental, and prosodic regularities, e.g., word stress, can successfully be detected (induced)

  38. Semantics

  39. Semantic networks may be used to help think about the associative networks in the brain

  40. It is better to view concepts as vectors of abstract ‘features’
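
A minimal sketch of the feature-vector view, using made-up binary features; the point is that similarity between concepts falls out of vector geometry rather than category labels (cf. the dolphin example earlier):

    import numpy as np

    #                   swims  breathes_air  has_fins  nurses_young
    dolphin = np.array([1.0,   1.0,          1.0,      1.0])
    shark   = np.array([1.0,   0.0,          1.0,      0.0])
    cow     = np.array([0.0,   1.0,          0.0,      1.0])

    def cosine(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    # The dolphin looks fish-like on surface features but shares
    # the mammalian features with the cow
    print(cosine(dolphin, shark), cosine(dolphin, cow))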

  41. Acquisition of semantics • Can semantics be induced? • How much a priori structure must be present, and what form should it take? • Are rules superfluous? • What is the relationship between episodic and semantic knowledge?

  42. How can semantic knowledge become organized according to category? • Self-organizing maps in the brain can explain the emergence of topological mappings • Examples are: • the somatosensory homunculus (discussed in lecture 7) • retinotopic maps in V1 (area 17, discussed in lecture 3)
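
A minimal sketch of one Kohonen-style self-organizing map update step (map size, learning rate, and neighborhood width are assumed; the full training loop and decay schedules are omitted):

    import numpy as np

    rng = np.random.default_rng(0)
    grid = rng.random((10, 10, 3))       # 10x10 map of 3-dim weight vectors

    def som_step(x, grid, lr=0.1, sigma=1.5):
        # Find the best-matching unit: the node whose weights are closest to x
        d = np.linalg.norm(grid - x, axis=2)
        bi, bj = np.unravel_index(np.argmin(d), d.shape)
        # Pull the BMU and its grid neighbors toward x, with a Gaussian
        # falloff over grid distance; this is what creates topological order
        ii, jj = np.meshgrid(np.arange(10), np.arange(10), indexing="ij")
        h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
        grid += lr * h[:, :, None] * (x - grid)
        return grid

    grid = som_step(rng.random(3), grid)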

  43. Semantic organization can emerge on the basis of word context (Ritter and Kohonen, 1990)

  44. Example of a semantotopic map • It is interesting that words organize into both semantic and grammatical categories

  45. A recent extension: Latent Semantic Analysis by Landauer and colleagues • Meaning is determined by context • The semantic space is reduced by singular value decomposition • This improves generalization • Applications include automatic dictionaries and even essay grading!
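
A minimal sketch of the LSA idea: factor a word-by-context count matrix with SVD and keep only the top k dimensions (the tiny count matrix and the value of k are made up; real LSA uses large corpora):

    import numpy as np

    # Hypothetical word-by-document co-occurrence counts
    counts = np.array([[2., 1., 0., 0.],
                       [1., 2., 0., 0.],
                       [0., 0., 2., 1.],
                       [0., 0., 1., 2.]])

    U, s, Vt = np.linalg.svd(counts, full_matrices=False)
    k = 2                                # reduced dimensionality (assumed)
    word_vectors = U[:, :k] * s[:k]      # words as points in the latent space
    print(np.round(word_vectors, 2))

Words that share contexts end up near each other in the reduced space even if they never co-occur directly, which is where the improved generalization comes from.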

  46. Conclusions • Neural networks are able to do language and serial processing • They are not great at it • The attempts to have neural networks process, and above all learn, language have ignited an important debate with the proponents of rule-based methods
