CHRIST’S COLLEGE CAMBRIDGE, 2-4 September 2013. EMBODIED LANGUAGE II. The Natural Origin of Language: The Neuroscientific Base. Robin Allott. Video and Graphic Evidence. See also The Process Outlined and the Ascent of Intelligence, Oxford 2011.
Guided by this, the format of the presentation is:
to say (or read out) as little as possible of the text in the Hand-Out
to show as much as possible (in the time) of the videos and graphs
The Natural Origin of Language: The Neuroscientific Base
Nullius in Verba. This was and is the founding motto of the Royal Society in England. The literal translation is "Of No-one in Words": do not rely on the words of any single individual. The motto has been given different interpretations and applications. In science no individual can or should be treated as the unquestionable authority. Science cannot advance by speculative verbal theorising but must be based on observation and experiment, validated by independent repetition of experiments. Science cannot progress by diktats. In the study of speech and language there have been diktats: in the 19th century a French philological society banned discussion of the origin of language. A hundred years ago Ferdinand de Saussure declared that language in its origin is arbitrary, that words are symbols, conventional cultural constructs. Today this remains the orthodoxy for much of mainstream linguistics. It was the implicit basis for Noam Chomsky's assertion, decades ago, that in the study of language the emphasis had to be on the structures of syntax.
A third application of Nullius in Verba has become relevant with remarkable advances in neuroscientific experimental techniques. These go with a shift of emphasis from language as a mental construct to speech and the motor aspects of language. There has also been the much-delayed recognition of gesture and speech as parts of a single brain system. With this, and with articulatory gesture matching patterns of bodily gesture (work at Haskins Laboratories), there is a special difficulty of presentation. Presentation can hardly be simply verbal, simply written. The relationships have to be shown. Hence the videos in the presentation leading up to the graphics from Graziano's research at Princeton into the categorical structuring of stimulated hand and arm movements in the monkey cortex. The neuroscientific and evolutionary base of language - before syntax, before lexicon - has to be speech sounds, the most certain examples of language universals. Yet, as Darwin in The Descent of Man could not have known, our speech sounds are part of a classic evolutionary development, in full continuity with the bodily structure and motor organisation of our animal ancestors.
Max Muller: human language originated in the instinctive faculty of giving ‘articulate expression to the rational conceptions of the mind’.
Darwin: “CD is not worthy to be FMM’s adversary as he knows very little about language and, being fully convinced that man is descended from some lower animal, he is forced to believe a priori that language has developed from inarticulate cries”.
Lashley: “I am coming more and more to the conclusion that the rudiments of every human behavioral mechanism will be found far down in the evolutionary scale and also represented even in primitive activities of the nervous system”
William James: Ideomotor action
Liberman: Motor theory of speech perception and production
Kuhl: Categorical perception of phonemes by animals, infants and adult humans
Browman and Goldstein: Articulatory primitives as phonemes
Rizzolatti and Arbib: Language within our Grasp
Graziano: Action primitives in the cortex for arm and hand movements
Speech and gesture function and origin as part of a single system
Communication ... the frog vocalises and gestures
700 languages in New Guinea genetically unrelated to other languages
Speech generating the appropriate movements (describing unscrewing a thermos flask)
As a young man IW lost all motor feedback from the neck down. With a blindfold, not seeing his hands, he cannot control their movements. (Cole, Gallagher, McNeill)
Marcus du Sautoy, Fellow of New College Oxford, Professor of Mathematics and Simonyi Professor for the Public Understanding of Science. Luc Steels, Director of the Artificial Intelligence Laboratory of the Vrije Universiteit Brussel and head of the Sony Computer Science Laboratory Paris.
After the robots are left on all day, new words are created and the experimenters have to learn their meanings
Text extracts and stimulation graphic from Graziano, Michael S.A., Charlotte S.R. Taylor, and Tirin Moore. 2002. Complex movements evoked by microstimulation of precentral cortex. Neuron 34: 841-851. [Speech element table from Robin Allott, The Physical Foundation of Language, 2001.]
[See also the further account in Graziano, Michael S.A., Charlotte S.R. Taylor, Tirin Moore, and Dylan F. Cooke. 2002. The cortical control of movement revisited. Neuron 36: 349-362.
"One possibility is that the mechanisms for speech were built on a preexisting mechanism for motor control" (p. 355).]
Systematic representation of arm postures in motor cortex. The primate brain is thought to contain a map of the body that is used to control movement. This map is stretched across the cortex in front of the central sulcus, with the feet at the top of the brain and the face near the bottom.
Postures Illustrating the Topographic Map Found in Precentral Cortex of Monkey. The circle on the brain shows the area that could be reached with the electrode. The magnified view at the bottom shows the locations of the stimulation sites. The area to the left of the lip of the central sulcus represents the anterior bank of the sulcus. Stimulation on the right side of the brain caused movements mainly of the left side of the body. For all sites, stimulation trains were presented for 500 ms at 200 Hz.
Stimulation on a behaviorally relevant time scale evoked coordinated, complex postures that involved many joints. Stimulation always drove the joints towards the final posture, regardless of the direction of movement required to reach the posture.
Within the large arm and hand representation, the stimulation-evoked postures were organized across the cortex. Postures that involved the arm were arranged across the cortex to form a map of hand positions around the body. The map of hand locations was embedded in a larger, rough map of the monkey's body. Primary motor cortex corresponded mainly to the representation of the central space directly in front of the monkey's chest.
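The posture-attractor behaviour described in these extracts - stimulation drives the joints toward one final posture regardless of the starting configuration - can be caricatured as convergence to a fixed point. This is only an illustrative sketch: the target posture, gain, and starting angles are arbitrary invented values, not data from Graziano's experiments.

```python
# Caricature of the stimulation finding: joints are driven toward a
# fixed final posture regardless of the starting configuration.
# TARGET, GAIN, and the starting postures are arbitrary illustrative
# values, not measurements.

TARGET = {"shoulder": 40.0, "elbow": 90.0, "wrist": 10.0}  # degrees
GAIN = 0.3  # fraction of the remaining error closed per time step

def step(posture):
    """Move every joint a fraction of the way toward its target angle."""
    return {j: a + GAIN * (TARGET[j] - a) for j, a in posture.items()}

# Two very different starting postures...
for start in ({"shoulder": 0.0, "elbow": 0.0, "wrist": 0.0},
              {"shoulder": 80.0, "elbow": 160.0, "wrist": -40.0}):
    posture = start
    for _ in range(30):
        posture = step(posture)
    # ...converge on the same final posture (a point attractor).
    print({j: round(a, 1) for j, a in posture.items()})
```

Whatever the initial joint angles, the loop settles on the same end posture, which is the sense in which the evoked movements are organized around final postures rather than movement directions.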
Extracts from: "Motor Theory of Language." Allott R. In Language Origin: A Multidisciplinary Approach 1992 NATO/ASI Kluwer
The relation between motor programming and speech programming can be examined at each level, the phonemic, the lexical and the syntactic. At the phonemic level, the link is readily established. Each phoneme in turn may be thought of as an organised set of instructions to the muscles involved in speech production; Lindblom suggests that each phoneme has an invariant 'program' that is unaffected by changes in syllable stress or speaking rate (tempo), with co-articulation resulting from the temporal overlap of successive programs (Lindblom 1963). ... These ideas lead one to the conception of a motor-alphabet underlying speech, related in some way to the elementary motor-patterning underlying other forms of action. In this connection, one has to consider what Lieberman describes as the 'very odd' phenomena of categorical speech perception, the fact that we distinguish readily between /ba/ and /pa/ even though both sounds may merge imperceptibly into one another in terms of the physical parameters of the phoneme-production, when speech is synthesised. ...
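The 'very odd' categorical effect described above - a continuous acoustic parameter perceived as one of two discrete phonemes - can be sketched numerically. The voice-onset-time continuum and the 25 ms category boundary below are illustrative assumptions, not measured values.

```python
# Sketch of categorical speech perception: a continuously varying
# acoustic parameter (voice-onset time, VOT, in ms) is heard as one of
# two discrete phoneme categories. The 25 ms boundary is a hypothetical
# illustrative value, not a measured one.

BOUNDARY_MS = 25.0  # assumed /ba/-/pa/ category boundary

def perceive(vot_ms: float) -> str:
    """Map a continuous VOT value onto a discrete phonemic category."""
    return "/ba/" if vot_ms < BOUNDARY_MS else "/pa/"

# An evenly spaced acoustic continuum...
continuum = [0, 10, 20, 30, 40, 50, 60]
labels = [perceive(v) for v in continuum]

# Equal 10 ms steps within a category leave the percept unchanged;
# the same-sized step across the boundary (20 -> 30) flips it.
for vot, label in zip(continuum, labels):
    print(f"VOT {vot:2d} ms -> {label}")
```

The point of the sketch is that the physical signal changes smoothly while the percept changes all at once at the boundary, which is what the synthetic-speech experiments with /ba/ and /pa/ demonstrate.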
Much of this evidence was reviewed in the papers for the New York Conference in 1975. Dewson et al. had shown that rhesus monkeys could be taught to distinguish between a variety of speech-sounds in a categorical way. They could distinguish /i/ and /u/ and were able to transfer this discrimination from a male voice with a fundamental frequency of 136 Hz to a female voice with a fundamental frequency of 212 Hz (Dewson, Pribram & Lynch 1969). Burdick & Miller (1975) found that the chinchilla, a member of the rodent family, could not only learn to distinguish between /a/ and /u/ for different vocal productions by the same talker, but could generalise this discrimination to vowel productions by different talkers, to changes in pitch level and to changes in intensity. Kuhl & Miller (1975), using synthetic speech, had shown that chinchillas can distinguish between /ta/ and /da/, between /ka/ and /ga/ and between /pa/ and /ba/. In another study using natural speech, they had found that, following training, syllables containing either /t/'s or /d/'s can be discriminated by chinchillas despite variations in talkers, in the vowels following the plosives and in intensities. Work by Sinnott (1974) with monkeys had shown that they are able to distinguish between acoustic correlates of the place of human articulation with /ba/ and /da/. Morse's (1976) results using rhesus monkeys were also strikingly in agreement. Using the discriminations /b/ /d/ /g/, rhesus monkeys, despite their inability to produce the full range of human speech-sounds, nevertheless discriminate a between-category change in place of articulation better than a within-category change - an unexpected finding.
... These results with rhesus monkeys and chinchillas are indeed very surprising. ... Both Warren and Morse reached similar conclusions in the light of this evidence. "Since extended speech sounds can be differentiated by animals that are themselves incapable of producing such sounds, it appears probable that our ancestors had the potential for discriminating speech sounds we now use before they could produce them; there is evidence that human speech perception employs prelinguistic abilities shared with other animals to distinguish between phonemic groupings" (Warren 1976)
Why, and how, should a number of different animals have this very specific categorical capacity, without any apparent relation to the auditory needs of the animals concerned?
The explanation for this must be that the categorisation of speech-sounds is derived from organisation prior to language, and specifically from the categorisation of motor programs used in constructing and executing all forms of bodily action. What the rhesus monkey, or the chinchilla (or other animals) share with the young human infant is very similar skeletal and muscular organisation, with very similar processes for the neural control of movement generally. The specificity of the phoneme is the accidental result of the application of the different elementary motor subprograms to the muscles which went to form the articulatory system. The hierarchical structure of the motor system is built on the basis of a limited set of motor elements, which are combined in an unlimited number of ways (motor-words), just as phonemes can form an unlimited number of spoken words.
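The combinatorial point in the paragraph above - a small fixed inventory of motor elements yielding an open-ended set of "motor-words", just as a small phoneme inventory yields an open-ended lexicon - can be sketched with a few lines of counting. The primitive names below are invented purely for illustration.

```python
from itertools import product

# A small, fixed inventory of elementary motor subprograms.
# These names are invented for illustration only.
primitives = ["reach", "grasp", "rotate", "release"]

# Combining k primitives in sequence yields len(primitives) ** k
# distinct "motor-words"; the count grows without bound as sequences
# lengthen, just as phonemes combine into an unlimited stock of words.
for k in range(1, 4):
    sequences = list(product(primitives, repeat=k))
    print(f"length-{k} sequences: {len(sequences)}")
# prints 4, 16, 64 for lengths 1, 2, 3
```

Four elements already give 64 distinct three-element sequences; the hierarchical, open-ended character of the system comes from combination, not from the size of the primitive inventory.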
Articulatory phonemes can generate not only speech sounds but also can be expressed as bodily movements: specifically as a range of movements of the hand and arm which are related to the motor primitives involved in all bodily movement both for humans and a wide range of animals with basically similar anatomical structures.
Patterns of gesture are directly linked to patterns of articulatory gesture; that is, each word can be expressed as a pattern of movement of the hand and arm. Since the structures of words were originally derived from the transduction of gestures into motor articulatory patterns, and the gestures acquired their structure from the object or action they represented, it follows that from each word a bodily gesture can be generated which can be seen to be related to the meaning of the word. The word is not an arbitrary symbol but is founded on the pattern of gesture from which it was derived.
This was demonstrated at the St Petersburg Conference on Cognitive Science in 2006 for a set of Russian words.
φύσει ή θέσει: natural or conventional
Bow-wow Pooh-pooh Ding-dong: Onomatopoeic Interjectory (Inarticulate) Sound-mirroring sense.
Ideomotor action : imagine an action - withhold execution - release - action instantly takes place
Motor theory of speech perception: we comprehend spoken words by a transduction of the auditory input into the motor forms involved in the articulation of the words
Categorical perception of speech: synthetically analysed or produced speech shows a continuous, unbroken auditory record, yet we perceive speech as sharply divided into discontinuous phonemes.
Articulatory gesture: Motor patterns of utterance are structurally similar to motor patterns of the arm.
Speech origin 'within our grasp’: through imitation with motor mirror neurons
Rizzolatti: Animal cries and human language are radically distinct in terms of comparative brain anatomy.
The Golden Frog: The frog communicates with other golden frogs by gesture and also by sound. What a frog can do, a human being could have done, particularly once the arms were freed by the transition to bipedalism.
New Guinea gesture: Gesture and language go together in this contact with Western anthropologists. New Guinea has 700 different languages (without genetic relation to languages elsewhere). The languages originated in the deep valleys, mountains and swamps of New Guinea (Foley) in much the same way as Darwin's finches differentiated in response to the separate environments of the Galapagos Islands. Polygenesis of languages seems better supported than monogenesis.
Imagining action: the patient (with no control of hand and arm movement) was asked (in words) to imagine picking up a cup and drinking from it; the neurosurgeons recorded the neural firing pattern associated with the imagined action. The computer was programmed to replicate the action as it was being imagined by the patient.
Imagining phonemes: The patient imagined producing the specific phonemes. The distinct neural program for each phoneme was identified using direct electronic access to the (epileptic) patient’s brain. The patient, without articulation, differentially generated actions associated with each imagined phoneme.
Uta Frith from University College London was present for the video demonstrations and commented on the imagining of the phonemes.
The experimenter twisted and took off the top of a thermos flask. As the subject described in words the action (the blindfold prevented him seeing his hands and arms) the movements of his hands and arms matched the sequences of the observed action. (Cole, Gallagher, McNeill 2002)
du Sautoy observing two of Luc Steels' standing robots generating words for bodily actions and communicating with each other (and also with du Sautoy).
Graziano's research into the categorical structuring of monkey hand and arm movements in response to a systematic pattern of stimulation of the pre-central cortex.
Comparison of monkey arm postures from The Intelligent Movement Machine and phonemic arm movements from the Physical Foundation of Language (Allott 1973/2001).
Categorical speech perception - evidence that animals make the same phoneme discriminations as humans (linked to similar anatomy and brain organisation for movement)
Articulatory phonemes as motor primitives: the basis for the evolutionary origin of language from speech sounds, linked to gestures and formed into words patterned by gestures imitating the objects and actions to which the words and gestures related.
Demonstration of word/gesture/meaning relations from the 2nd Biennial Conference on Cognitive Science, June 2006.
Allott, R 1992. The Motor Theory of Language: Origin and Function. In Language Origin: A Multidisciplinary Approach (ed.) by J. Wind et al. NATO ASI. Dordrecht: Kluwer.
Browman, C. and L. Goldstein. 1992. Articulatory phonology: An overview. Phonetica 49: 155-180.
Burdick, C.K. and J.B. Miller. 1975. Speech perception by the chinchilla. Journal of the Acoustical Society of America 58: 415-427.
Cole, J., S. Gallagher and D. McNeill. 2002. Gesture following deafferentation: A phenomenologically informed experimental study. Phenomenology and the Cognitive Sciences 1: 49-67.
Eriksson, J.L. and A.E. Villa. 2006. Learning of auditory equivalence classes for vowels by rats. Behavioural Processes 73(3): 348-359.
Foley WA 1986 The Papuan Languages of New Guinea CUP
Graziano MS, Patel KT, Taylor CS. 2004. Mapping from motor cortex to biceps and triceps altered by elbow angle. J. Neurophysiol. 2004 Jul; 92(1):395-407.
Hienz, R.D., C.M. Aleszczyk and B.J. May. 1996. Vowel discrimination in cats: acquisition, effects of stimulus level, and performance in noise. Journal of the Acoustical Society of America 99(6): 3656-3668.
Hienz, R.D. and J.V. Brady. 1988. The acquisition of vowel discriminations by nonhuman primates. Journal of the Acoustical Society of America 84(1): 186-194. [baboons]
Jürgens, Uwe. 2000. A comparison of the neural systems underlying speech and non-speech vocal utterances. In Becoming Loquens ed. by Bichakjian BH et al. [In humans, a direct connection between the motor cortex and …the motoneurons innervating the laryngeal muscles ... lacking in the monkey.]
Kuhl, P.K. and J.D. Miller. 1975. Speech perception by the chinchilla. Science 190: 69-72.
Lashley, K.S. 1951. The problem of serial order in behavior. In Cerebral Mechanisms in Behavior. (ed.) L.A. Jeffress 112-135. New York: Hafner.
Liberman, A.M. et al. 1967. Perception of the speech code. Psychological Review 74: 431-461.
Lieberman, P. 1984. The Biology and Evolution of Language. Cambridge, Mass.: Harvard University Press.
Lindblom, B. 1991. The Status of Phonetic Gestures. In Modularity and the Motor Theory of Speech Perception. (ed.) by Mattingly, I.M. and M. Studdert-Kennedy, 7-24.
Loeb, E.P., S.F. Giszter, P. Saltiel, E. Bizzi and F.A. Mussa-Ivaldi. 2000. Output units of motor behavior: An experimental and modeling study. Journal of Cognitive Neuroscience 12(1): 78-97. “We found that a small set (as few as 23 out of 65,536) of all possible combinations of 16 limb muscles …. could stabilize the limb at predictable, restricted portions of the workspace”
Mesgarani, N., S.V. David, J.B. Fritz and S.A. Shamma. 2008. Phoneme representation and classification in primary auditory cortex. Journal of the Acoustical Society of America 123(2): 899-909. [ferrets]
Morse, P.A. 1976. Speech perception in the human infant and the rhesus monkey. Annals of the New York Academy of Sciences 280: 694-707.
Mussa-Ivaldi FA, Giszter SF, Bizzi E. 1994. Linear combinations of primitives in vertebrate motor control. Proc Natl Acad Sci 2;91(16):7534-8
Rizzolatti, G. and M.A. Arbib. 1998. Language within our grasp. Trends in Neurosciences. 21(5) 188-194.
Sinnott, J.M. and K.W. Mosteller. 2001. A comparative assessment of speech sound discrimination in the Mongolian gerbil. Journal of the Acoustical Society of America 110(4): 1729-1732.
Warren, R.M. 1976. Auditory perception and speech evolution. Annals of the New York Academy of Sciences 280: 708-717.
Zoloth, S.R., M.R. Petersen, M.D. Beecher, S. Green, P. Marler, D.B. Moody and W. Stebbins. 1979. Species-specific perceptual processing of vocal sounds by monkeys. Science 204: 870-873.
Giacomo Rizzolatti. The evolution of language: Educational tools for Cognitive Neuroscience.
David Attenborough. Life in Cold Blood, The Golden Frog : BBCWorldwide
David Attenborough. Life on Earth: Episode 13 The Compulsive Communicators: New Guinea Extract BBC DVD
Daniel Hochberg and Eric Leuthardt. Science Club Series 1: BBC 2
David McNeill New College 2011 Presentation [See Shaun Gallagher: How the Body Shapes the Mind, OUP 2005, pp. 109-126]
How Do You Programme Intelligence? The Hunt for AI: Luc Steels and du Sautoy: Horizon BBC Two 2012
NEW COLLEGE 2011 Embodied Language I Linked presentations [www.newcollegeembodiedlanguage.com/presentations.htm]
Robin Allott: “Embodied Language and the Ascent of Intelligence”
Leonardo Fogassi: “Mirror neurons and embodied language”
Maurizio Gentilucci: “Gesture and speech are controlled by the same system”
David McNeill: “How language began”
Riikka Möttönen: “Involvement of motor cortex in speech perception”
Jean-Luc Petit: “Three ways to bridge the gap between perception and action, and language”
Roel Willems: “How is language embodied?”