Speech perception

Speech perception

Introduction

What is auditory agnosia?

Types of auditory agnosia

Cognitive models

Perceptual or semantic deficit?

Brain imaging

Summary


Basic speech perception

  • Physical signal varies in amplitude, frequency and time.

  • From these parameters speech perception occurs - phonemes (buh, tuh) are extracted.

  • Perception of phonemes in normal speech is categorical, i.e. if the physical characteristics of the signal are changed gradually, the perceived phoneme changes abruptly at a category boundary.



Variability in the speech stream

  • How do we recognise the same sounds spoken in different ways e.g. by two different people?

  • Speech perception is categorical.

  • The speech recognition system can modify fuzzy input to give the listener the correct sound.


Speech perception

  • When one listens to a spoken word it is necessary to perform at least three operations:

  • 1) process a series of sound waves;

  • 2) make fine distinctions between similar patterns of sounds;

  • 3) extract meaning from the utterance.


Seamlessness of speech

  • There are no natural breaks in speech.

  • We “hallucinate” word boundaries

    • Oronyms

      • The stuffy nose can lead to problems

    • Speech perception errors

      • Eugene O'Neill won a Pullet Surprise
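The segmentation ambiguity behind oronyms can be sketched as a toy word-break search over a pause-free string. This is an illustrative sketch only: it works over letters with a made-up five-word lexicon, whereas real speech segmentation operates over phonemes.

```python
def segmentations(s, lexicon):
    """Enumerate every way to carve a pause-free string into lexicon words."""
    if not s:
        return [[]]
    parses = []
    for i in range(1, len(s) + 1):
        if s[:i] in lexicon:  # try every lexicon word that matches a prefix
            for rest in segmentations(s[i:], lexicon):
                parses.append([s[:i]] + rest)
    return parses

lexicon = {"god", "is", "now", "here", "nowhere"}
for parse in segmentations("godisnowhere", lexicon):
    print(" ".join(parse))  # two parses: "god is now here" and "god is nowhere"
```

Because there are no breaks in the signal, the listener's lexicon can license more than one parse; context normally settles which boundary we "hallucinate".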


What is auditory agnosia?

  • Auditory agnosia refers to the defective recognition of auditory stimuli in the context of preserved hearing - as tested with audiometry.

  • Primary signs of the disorder include difficulty in understanding the meaning of spoken words.

  • The term `auditory agnosia' can refer to a generalized disorder affecting perception of all types of auditory stimuli including non-verbal sounds, speech and music (e.g., Miceli, 1982).


Neuropathology

  • Auditory agnosias are associated with bilateral or unilateral lesions of the left superior temporal cortex (Auerbach et al., 1982; Varney and Damasio, 1986), although some cases have been described following unilateral right temporal lobe damage (Roberts et al., 1987).

  • By far the most common cause is cerebro-vascular accident but some cases have been reported following encephalitis (Arias et al., 1995), head injury (Franklin, 1989) and slowly progressive cortical atrophy (Otsuki et al 1998).


Types of auditory agnosia

  • Disorders of sound recognition can be divided into 'apperceptive' and 'associative' subtypes after Lissauer's (1890) visual agnosia distinction:

  • Apperceptive auditory agnosia refers to impaired acoustical analysis of the perceptual structure of an auditory stimulus (frequency, pitch, timbre).

  • Associative auditory agnosia refers to an inability to associate a successfully perceived auditory stimulus with a conceptual (semantic) meaning.


Spoken word recognition

  • Morton (1970) proposed a stage model of auditory word recognition distinguishing between 3 phases:

  • Auditory analysis system

    • identifies phonemes in the speech wave.

  • Auditory input lexicon

    • identifies the phonological properties of known words.

  • Semantic system

    • identifies the meanings of known words.


Apperceptive auditory agnosias

  • Apperceptive auditory agnosias can be broken down into at least three discrete types:

  • 1) Pure auditory agnosia refers to impairment of non-verbal, environmental sounds after damage to the right temporal lobe (Albert et al., 1972).

  • 2) Pure word deafness refers to the selective impairment of word perception (Albert & Bear, 1974; Saffran et al., 1976; Coslett et al., 1984).

  • 3) Amusia refers to deficits in the processing of musical melodies (e.g., Peretz et al., 1994).


Pure word deafness

  • Patient cannot understand spoken words but can discriminate verbal and nonverbal sounds and can speak without impairment (no anomia).

  • They are unable to repeat back words they cannot understand and can discriminate between vowels but not syllables suggesting a specific perceptual deficit (Saffran et al. 1976).

  • There seems to be a specific impairment to the perception of speech-like sounds, e.g. certain speech sounds (phonemes) are affected more than other types of speech sounds.


Pure word deafness

  • Klein & Harper (1956).

  • RC.

  • Stroke patient who could hear everything....

    • even a leaf falling, but it sounds far away. You think you can catch it and it fades away... jumbled together like foreign folk speaking in the distance. You feel it should be louder but when everyone shouts it's still more confusing.


Phonemic boundaries

  • Blumstein et al (1977) asked PWD patients (all with lesions to left superior temporal gyrus) to discriminate between two similar phonemes e.g., "da" and "ta” that were distorted so the acoustic properties fell on a continuum.

  • Patients had "fuzzier" boundaries compared with controls who showed a clear point when discriminations of phonemes changed.

  • Pure word deafness is a disorder at the stage where different acoustic characteristics are classified as instances of a single phoneme.
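The sharp-versus-fuzzy boundary contrast can be illustrated with a hypothetical logistic identification function over a voice onset time (VOT) continuum. The 30 ms boundary and the slope values below are illustrative assumptions, not fitted to Blumstein et al.'s data.

```python
import math

def identification_prob(vot, boundary=30.0, slope=1.0):
    """P(report /ta/ rather than /da/) as a logistic function of VOT in ms."""
    return 1.0 / (1.0 + math.exp(-slope * (vot - boundary)))

continuum = range(0, 61, 10)                                      # a /da/-/ta/ VOT continuum (ms)
control = [identification_prob(v, slope=1.0) for v in continuum]  # steep slope: categorical
patient = [identification_prob(v, slope=0.1) for v in continuum]  # shallow slope: "fuzzy" boundary
```

With a steep slope, responses jump from near 0 to near 1 across the boundary (categorical perception); with a shallow slope, responses change gradually, which is the "fuzzier" pattern reported for the patients.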


Word meaning deafness

  • Word meaning deafness is an impairment to the comprehension of words in the absence of an impairment to the auditory analysis of words.

  • Word meaning deafness patients at test show:

    • preserved repetition, phoneme discrimination and auditory lexical decision (unlike the PWD patients) showing the auditory input lexicon is dissociable.

    • impaired comprehension from spoken input only, whereas comprehension is perfectly normal when the patient is given written words to match with pictures (Franklin et al., 1996; Kohn and Friedman, 1986).


Bramwell (1897)

  • 26 year old woman who had a stroke.

  • Difficulty understanding speech but not deaf:

    • “Is it not a strange thing that I can hear the clock ticking and cannot hear you speak? Now let me think what that means”

  • When asked questions verbally:

    • e.g., "Do you like to come to Edinburgh?"

    • Could not understand but repeated words and wrote them down and could then answer the question!


Bypass route

  • One way to repeat a word would be to activate its entry in the auditory input lexicon, access semantics and then release its activation in the speech output lexicon.

  • BUT people can repeat aloud nonwords, e.g. "fep".

  • Therefore, cognitive models of spoken word processing must propose a bypass route from auditory analysis system to the phoneme level that would enable the processing of nonwords.


McCarthy and Warrington (1984)

  • Patient ORF.

  • Could repeat words better than nonwords of the same length.

  • Words: 85%

  • Nonwords: 39%

  • Nonwords can only be repeated via the bypass route, whereas words can be repeated by either route.

  • Therefore, suggests impairment to bypass route.


Evidence for a bypass route

  • Beauvois et al (1980) reported a patient JL who could repeat words but not nonwords aloud.

  • Martin and Saffran (1992) reported patients who also could not repeat nonwords and also made semantic errors when repeating words.

  • Repetition was more difficult for abstract words than concrete words - called deep dysphasia.

  • Supports the existence of a nonlexical bypass route for spoken word repetition as nonwords cannot be repeated via the lexical pathway.


Brain imaging studies of speech perception

  • Is there evidence for separate lexical and non-lexical processing centres in the human brain?

  • Mummery et al (1999) compared speech perception to silent rest using PET and found that speech perception activated the dorso-lateral temporal (auditory) cortex - bilaterally.

  • Suggests sound based representations stored in superior temporal lobe structures on both sides of the brain - however - recent MEG evidence shows the left hemisphere is specialised for processing rapid temporal events e.g. speech.


Summary

  • Studies of patients with auditory agnosia have contributed to the development of theories of normal spoken word recognition and to the identification of necessary cognitive processes.

  • Studies of auditory agnosia have shown that knowledge about spoken words and their meaning can be dissociated into modules.

  • Studies of auditory agnosia have led to the identification of speech and non-speech areas in the brain using brain imaging methods.


Reading

  • Parkin, A. Chapter 7.

  • Ellis, A.W. & Young, A.W. Ch 6.

  • McCarthy, R & Warrington, E. (1990) Cognitive Neuropsychology: A clinical introduction - 6.

  • Ellis, A. W. (1984) Introduction to Bramwell's case of word meaning deafness. Cognitive Neuropsychology, 1, 245-258.

  • Hickok, G & Poeppel, D (2000). Towards a functional neuroanatomy of speech perception. Trends in Cognitive Sciences, 4(4), 131-138.


Non-speech stimuli

  • The non-speech stimulus was signal-correlated noise (SCN), which forms a stimulus with the same instantaneous amplitude as the speech signal but contains none of the spectral information, such as transients and formants, that leads to intelligible speech.

  • SCN preserves segmental information about manner of articulation, rhythm and syllabicity of the speech without comprehension.
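One standard way to construct SCN is to flip the sign of each speech sample at random: the instantaneous amplitude envelope is preserved exactly while the spectral fine structure is scrambled. A sketch, with a synthetic tone standing in for a real speech recording:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000)
speech = np.sin(2 * np.pi * 120 * t) * np.exp(-3 * t)  # stand-in for a speech waveform

# Randomly flip the sign of each sample: instantaneous amplitude is
# preserved exactly, but the spectral structure (formants, transients)
# that makes speech intelligible is destroyed.
scn = speech * rng.choice([-1.0, 1.0], size=speech.shape)

assert np.allclose(np.abs(scn), np.abs(speech))
```

This is why SCN keeps the rhythm, syllabicity and manner cues carried by the amplitude envelope while remaining incomprehensible.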


Results

  • Brain regions that showed an increase in activation with increasing rates of presentation of both speech and SCN were bilateral primary auditory cortex.

  • Brain regions that increased in activation with increasing rate for speech only were bilateral superior temporal lobe regions anterior to the primary auditory cortex.

  • There was also a region of left/right asymmetry i.e. a greater level of activation for speech in the left hemisphere in the core of Wernicke’s area.


Auditory analysis deficits

  • Pure word deafness.

  • Impaired speech perception in the context of good speech production.

  • Hemphill & Stengel (1940).

  • 34 year old.

  • Fell from a bus.

  • First thought to be deaf.

  • BUT audiometric testing normal.


Pure word deafness

  • Intact perception of nonverbal environmental sounds.

  • Read well and with understanding

  • Could write, therefore, semantics intact.

  • Couldn't repeat - early processing problem

  • Claimed that much of what he heard conveyed no meaning to him:

    • I can hear you dead plain but I can't understand what you say. The noises are not quite natural. I can hear but not understand.


Word meaning deafness

  • Spoken word comprehension is graded by imageability/concreteness—words referring to concrete exemplars are better understood than abstract concepts—and errors are semantically related (Franklin et al., 1994, 1996).

  • Is word meaning deafness a language-specific deficit of the highest order (highly modular)?

  • Tyler and Moss (1997) suggest the behavioural pattern may be the by-product of a generalized (earlier) auditory processing impairment.


Issues in auditory agnosia

  • Are there separate modules for speech and non-speech sounds in the human brain?

  • Is verbal memory category specific?

  • What is the nature of the deficit in pure word deafness?

    • Pre-Phonemic

    • Phonemic (analyzing a string of speech sounds into constituent phonemes).

    • Left hemisphere specialized for making fine temporal discriminations between rapidly presented auditory stimuli.


Prephonemic level

  • Auerbach et al. (1982)

  • Patients with bilateral temporal lobe lesions

  • Impaired perception of spoken words

  • Found to be impaired at:

    • detecting gaps between stimuli

    • determining when two stimuli were simultaneous.

  • Argued for a basic (prephonemic) disorder of acoustic perception.

  • See also Tyler and Moss (1997).


Phonemic level

  • Blumstein et al (1977)

  • Patients with left hemisphere temporal lesions.

  • Asked to discriminate phonemes e.g., "da”/"ta”.

  • When phonemes were systematically distorted so the sounds fell on a continuum, normal subjects showed a clear point at which their discriminations changed.

  • Pure word deaf patients showed "fuzzier" boundaries.

  • Disorder at the stage where sounds of different acoustic characteristics are classified as being instances of a single phoneme.


Muddying the picture

  • Pure word deafness.

  • Some patients have difficulty with non speech sounds (e.g., environmental stimuli).

  • Performance is generally improved by slowing down (so not all-or-none).





  • Auerbach et al (1982) suggested that word deafness with bilateral temporal lobe lesions is due to the loss of pre-phonological auditory processing, whereas a unilateral left temporal lesion causes word deafness through a deficit in phoneme discrimination and identification.

  • A more recent clinical and psychophysical study described a patient who developed word deafness after a right temporal infarct (Praamstra et al, 1991).


Heilman et al (1976)

  • Left temporal lobe damage

  • Unable to understand spoken words (speech)

  • Hearing was normal

  • Could repeat - so not an auditory analysis problem

  • Could perform picture-word matching tasks in written form, so not a semantic impairment

  • Could also produce spoken names to pictures

  • Argues for an impairment at the level of the auditory input lexicon (or its connection to semantics - how might we test for this?).




Basic model

  • Three main components to comprehension and production of words:

  • 1. Analysis and representation of the sounds of incoming words

  • 2. Analysis and representation of meanings of words

  • 3. Synthesis and production of word sounds



  • 1. Problems identifying familiar word sounds – no hearing impairment, speech okay but can’t repeat a heard word.

  • The problem lies in the sensory and perceptual processes, the end result of which is normally the auditory percept (compare with the visual percept in agnosia).

  • Some authors distinguish between pre-phonemic (sensory) and phonemic (perceptual) disorders (Auerbach et al., 1982). Others don’t (McCarthy & Warrington, 1991).


The "two types" school says

  • There are failures in sensory processing - "word sound deafness" i.e. failures in the processing of frequency, amplitude, duration or temporal discrimination of the incoming speech sounds.

  • Albert & Bear (1974) reported that impaired temporal acuity led to words sounding like a foreign language; the patient couldn't tell whether there were one or two clicks unless they were very well separated; had difficulties in determining auditory sequences; improved if words were presented slowly.




Word meaning deafness

  • Problems identifying familiar word meanings

  • Interested in those cases where word sound perception is okay but meaning is still not understood (Gainotti, Caltagirone & Ibba, 1975).

  • The disorder applies to some, not all, words.

  • These patients can repeat words back so perception must be okay (Geschwind, Quadfasel & Segarra, 1968).

  • So, meaning problem must be in the semantic system or in access to it.



  • 2. Tested using speech-picture matching (can’t match picture of horse with word horse), speech-definition matching (can’t match definition of horse with word horse).


Characteristics of word meaning deafness

  • 1. Word frequency - massive effect on comprehension (Schuell, Jenkins & Landis, 1961).

  • 2. Conceptual areas selectively impaired (i.e. category specificity)



  • 1. Colour names, body parts, action names (Goodglass, Klein, Carey & Jones, 1966) (see W&M, p 130)

  • 2. Concrete/abstract. Loss of abstract words is common (Goldstein, 1948) but this could be due to difficulty.

  • Warrington (1975) and Warrington & Shallice (1984) report two patients (AB and SBY respectively) who showed greater loss for concrete words.

  • So, double dissociation between concrete and abstract words.



  • 3. Within concrete words there are many selective losses. Most common is that food and living things are impaired much more than objects (Warrington & Shallice, 1984, cases SBY, JBR; also patients KB and ING).

  • The Double Dissociation comes from patients VER & YOT, who were better on living things (Warrington & McCarthy, 1983, 1987). See W&M, page 132.


Causes of word meaning deafness

  • Two possible general causes (McCarthy & Warrington)

  • 1. Disconnection - Geschwind, Quadfasel & Segarra (1968) - repetition ok but both comprehension and spontaneous speech impaired - implies disconnection of perceptual system from meaning of words (affecting comprehension) and meaning of words from production of words (affecting spontaneous speech).







  • The category effects inform us about the organisation of semantic knowledge.

  • Early explanations were in terms of animate/inanimate distinction.

  • Replaced by sensory (e.g. food and living things) vs functional (objects) distinction.

  • Because the representations of these two classes are formed from different origins, the double dissociation is plausible.


Modality specific impairments

  • The argument here is that word meaning deafness can occur when input from other modalities is ok.

  • TOB (McCarthy & Warrington, 1988) had a deficit for living things but this was restricted to spoken word inputs, not picture inputs. e.g. he could define a rhinoceros from its picture but not from its spoken name.

  • Implies separate visual semantic memory and general verbal semantic memory.


References for auditory agnosia

  • Albert ML, Bear D. Time to understand: a case study of word deafness with reference to the role of time in auditory comprehension. Brain 1974; 97: 373–84.

  • Albert ML, Sparks R, Von Stockert T, Sax D. A case study of auditory agnosia: linguistic and non-linguistic processing. Cortex 1972; 8: 427–43.

  • Arias M, Requena I, Ventura M, Pereiro I, Castro A, Alverez A. A case of deaf-mutism as an expression of pure word deafness: neuroimaging and electrophysiological data. European Journal of Neurology 1995; 2: 583–5.

  • Auerbach SH, Allard T, Naeser M, Alexander MP, Albert ML. Pure word deafness. Analysis of a case with bilateral lesions and a defect at the prephonemic level. Brain 1982; 105: 271–300.

  • Coslett HB, Brashear HR, Heilman KM. Pure word deafness after bilateral primary auditory cortex infarcts. Neurology 1984; 34: 347–52.

  • Ellis AW. Introduction to Bramwell's (1897) case of word meaning deafness. Cognitive Neuropsychology 1984; 1: 245–58.

  • Franklin S. Dissociations in auditory word comprehension; evidence from 9 fluent aphasic patients. Aphasiology 1989; 3: 189–207.

  • Franklin S, Howard D, Patterson K. Abstract word meaning deafness. Cognitive Neuropsychology 1994; 11: 1–34.

  • Franklin S, Turner J, Lambon Ralph MA, Morris J, Bailey PJ. A distinctive case of word meaning deafness? Cognitive Neuropsychology 1996; 13: 1139–62.

  • Griffiths TD, Rees A, Green GR. Disorders of human complex sound processing. Neurocase 1999; 5: 365–78.

  • Hemphill RE, Stengel E. A study of pure word-deafness. Journal of Neurology and Psychiatry 1940; 3: 251–62.

  • Kale U, El-Naggar M, Hawthorne M. Verbal auditory agnosia with focal EEG abnormality: an unusual case of a child presenting to an ENT surgeon with `deafness'. The Journal of Laryngology and Otology 1995; 109: 431–2.

  • Kohn SE, Friedman RB. Word-meaning deafness: a phonological-semantic dissociation. Cognitive Neuropsychology 1986; 3: 291–308.

  • Lissauer H. Ein Fall von Seelenblindheit nebst einem Beitrage zur Theorie derselben. Archiv für Psychiatrie und Nervenkrankheiten 1890; 21: 222–70.

  • Mendez MF, Rosenberg S. Word deafness mistaken for Alzheimer's disease: differential characteristics. Journal of the American Geriatrics Society 1991; 39: 209–11.

  • Miceli G. The processing of speech sounds in a patient with cortical auditory disorder. Neuropsychologia 1982; 20: 5–20.

  • Nielsen JM. Agnosia, apraxia, aphasia. Their value in cerebral localization. New York: Hoeber, 1946.

  • Otsuki M, Soma Y, Sato M, Homma A, Tsuji S. Slowly progressive pure word deafness. European Neurology 1998; 39: 135–40.

  • Pearce PS, Darwish H. Correlation between EEG and auditory perceptual measures in auditory agnosia. Brain and Language 1984; 22: 41–8.

  • Peretz I, Kolinsky R, Tramo M, Labrecque R, Hublet C, Demeurisse G et al. Functional dissociations following bilateral lesions of auditory cortex. Brain 1994; 117: 1283–1301.

  • Roberts M, Sandercock P, Ghadiali E. Pure word deafness and unilateral right temporo-parietal lesion: a case report. Journal of Neurology, Neurosurgery and Psychiatry 1987; 50: 1708–9.

  • Saffran EMM, Marin OS, Yeni-Komshian GH. An analysis of speech perception in word deafness. Brain and Language 1976; 3: 209–28.

  • Shindo M, Kaga K, Tanaka Y. Speech discrimination and lip reading in patients with word deafness or auditory agnosia. Brain and Language 1991; 40: 153–61.

  • Stein LK, Curry FK. Childhood auditory agnosia. Journal of Speech and Hearing Disorders 1968; 33: 361–70.

  • Tyler LK, Moss HE. Imageability and category-specificity. Cognitive Neuropsychology 1997; 14: 293–318.

  • Varney N, Damasio H. CT scan correlates of sound recognition defect in aphasia. Cortex 1986; 22: 483–6.

  • Vignolo LA. Auditory agnosia. Philosophical Transactions of the Royal Society of London B: Biological Sciences 1982; 298: 49–57.

  • Wohlfart G, Lindgren A, Jernelius B. Clinical picture and morbid anatomy in a case of `pure word deafness'. Journal of Nervous and Mental Disease 1952; 116: 818–27.



  • PET: Mummery et al. (1999). Functional neuroimaging of speech perception in six normal and two aphasic subjects.

  • The question asked was: which areas of the brain are involved in perceiving speech?

  • Method

  • 6 normal subjects were PET scanned while listening to speech (bisyllabic nouns, matched for frequency, concreteness and imageability) or signal-correlated noise equivalents (nonspeech stimuli, similar to speech in complexity but not perceived as speechlike).



  • Results

  • Regions of activation common to both speech and signal-correlated noise: bilateral primary auditory cortex and superior temporal gyrus

  • Regions of activation specific to speech: bilateral superior temporal sulcus, both anterior and ventral to the primary auditory cortex

  • Only one region which showed asymmetry of activation for speech stimuli: left posterior ventral superior temporal gyrus/superior temporal sulcus i.e. the core of Wernicke’s area.



  • Discussion

  • These findings together suggest that the dorsolateral temporal cortex of both hemispheres can be involved in the processing of speech.

  • Evidence for some specialization of this area for speech processing in the left hemisphere (thought to correspond to Wernicke’s area)



  • MEG: Helenius et al. (1998): Distinct time courses of word and context comprehension in the left temporal cortex.

  • Kutas and Hillyard (1980) showed:

  • When a contextually-constrained sentence ends with a semantically inappropriate word (the girl spread some jam on her socks), an event related potential (recorded electrical activity associated with this stimulus) is elicited 300-500ms after the onset of the final word (termed an N400).



  • Later, it was shown that an ERP with the same latency would be elicited regardless of semantic congruity but semantically incongruous words would elicit an N400 which was bigger in amplitude.

  • Helenius et al. aimed to use MEG to identify the cortical source of the N400 signal; something which is not possible with EEG because of the problem of signal blurring at the skull and scalp.



  • Method

  • 10 subjects, all right handed with no history of reading disorders

  • Used 4 categories of sentence-ending words:

  • a) highly probable (the piano was out of tune)

  • b) not improbable but rare (when the power went out the house went quiet)

  • c) anomalous (the pizza was too hot to sing)

  • d) improbable but first syllable phonologically similar (the gambler had a streak of bad luggage)



  • 100 examples per category; randomly interspersed.

  • Sentences presented on a screen, one word at a time

  • Subjects required to read sentences silently, paying special attention to their meaning

  • Magnetic signals recorded for each subject, time locked to the onset of the final word of each sentence

  • Signals were averaged for each category separately
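The time-locked averaging step can be sketched with synthetic data: averaging many noisy epochs, each aligned to final-word onset, recovers the evoked response. All numbers below (trial counts, noise level, an N400-like Gaussian bump) are illustrative assumptions, not Helenius et al.'s parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_samples = 400, 600          # e.g. 600 samples = 600 ms at 1 kHz
times = np.arange(n_samples)            # ms relative to final-word onset

# A hypothetical N400-like response buried in trial-by-trial noise
evoked = 2.0 * np.exp(-((times - 400) ** 2) / (2 * 50.0 ** 2))
epochs = evoked + rng.normal(0.0, 5.0, size=(n_trials, n_samples))

average = epochs.mean(axis=0)           # time-locked average for one category
peak_ms = int(times[np.argmax(average)])
print(peak_ms)                          # peak latency in ms, close to 400
```

Averaging attenuates the noise by roughly the square root of the number of trials, which is why the per-category average reveals a response that is invisible in any single trial.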



  • Results

  • Sensors over the left temporal lobe recorded magnetic signal after final word onset which peaked around 400ms and was particularly strong for anomalous and phonologically similar but semantically anomalous sentence endings.

  • This activity was much smaller for semantically possible but rare sentence endings and absent for highly probable sentence endings.



  • Source analysis was conducted to establish which cortical regions were responsible for these magnetic signals

  • Left superior temporal cortex (5 subjects); left frontal cortex (2 subjects)

  • Additionally: left posterior sylvian fissure and/or right superior temporal gyrus

  • Time course of magnetic signal detected from left superior temporal cortex differed according to sentence ending:

  • Order of peak magnetic signal: rare endings (first to peak)/anomalous endings/anomalous but phonologically similar (last)



  • Discussion

  • Evidence from behavioural studies shows that visual processing of words is completed 200ms after word presentation; hence magnetic signal around 400 ms likely to be related to analysis of meaning of the words and its role in the context created by the sentence

  • Absence of this signal for probable endings likely a "top-down" effect of anticipation; the sentences led to the expectation of one final word; this expectation was fulfilled and the brain showed no detectable response: it is efficient for the brain to respond only to new, unexpected information.



  • Onset of signal from left superior temporal cortex began at about 250 ms for all sentence endings but diverged at around 350ms when signal associated with rare endings faded out

  • Signals associated respectively with anomalous and anomalous but phonologically similar continued until about 600ms.

  • Indicates two separate cognitive processes

  • First: analysis of word meaning

  • Second: extended analysis of meaning in context of sentence



  • Hence only the anomalous and anomalous but phonologically similar endings caused activation throughout both these periods.

  • The lag in signal for anomalous but phonologically similar endings from 350 ms onwards likely reflects attempts to integrate the sentence ending into the rest of the sentence.


Neuroimaging and language

  • Language is an example of a cognitive function which is highly distributed in the brain (i.e. relies on a network of interconnected areas).

  • Neuroimaging studies reveal many cortical areas are activated during simple language tasks, e.g. word reading, picture naming.

  • One of the shortcomings of neuroimaging techniques is that the activated areas may not be necessary for the task in question.

  • Increased blood flow in one area may be a non-functional by-product of activity in another cortical area.


Transcranial Magnetic Stimulation

  • Transcranial magnetic stimulation (TMS) can be used as a "lesion" technique to determine which areas are necessary for language processing.

  • One debate in the TMS/language literature concerns "speech arrest".

  • Several studies have used TMS over frontal areas and produced dramatic results.

  • The subject will be counting or reciting the letters of the alphabet; the TMS is applied and they will be unable to continue for the duration of the stimulation (normal after).


Stewart et al. (2001)

  • Stewart et al. (2001) asked the following: is this effect simply due to interference with the muscles innervating the throat, tongue and mouth, or are we seeing a TMS-induced Broca's-type aphasia?

  • 11 neurologically normal adults, all right handed.

  • TMS coil positioned over a site anterior and lateral to the site at which TMS could produce a muscle twitch in the contralateral hand.


Stewart et al. (2001)

  • TMS applied as subjects counted briskly upwards from 1 to 10 or recited the days of the week or nursery rhymes.

  • Intensity of TMS and positioning of coil adjusted until speech disrupted (with associated activity in mentalis muscle).

  • Coil then moved approximately 5cm anterior and 2cm lateral and procedure repeated.


Results

  • Subjects described the posteriorly produced disruption (associated with muscle activity) as a feeling they had "lost control of their facial muscles”.

  • The anteriorly produced speech disruption (not associated with muscle activity) as a feeling that they "could not get the word out”.

  • The posteriorly produced disruption could be elicited from both hemispheres; the anterior one only from the left hemisphere.


Discussion

  • TMS can elicit two types of speech disruption: one motoric in nature, which can be produced by stimulating the precentral gyrus of either hemisphere, and one non-motoric, which can be produced by stimulation over the middle frontal gyrus of the left hemisphere only.

  • The subjects' descriptions of the anteriorly-produced disruption are reminiscent of patients with left frontal damage.



  • This anterior site would be an interesting region to stimulate with lower levels of magnetic stimulation and using more cognitive paradigms of language function to investigate further the role of this area in speech production.

  • The study may also have implications clinically; possibly TMS-induced speech arrest might provide a non-invasive alternative or adjunct to the Wada test for determining hemispheric dominance for language functioning in patients requiring surgery, e.g. for epilepsy.


Normal data differentiating these processes

  • Speech perception.

  • Similar sounds can easily be confused when presented in isolation, e.g. B and D.

  • McGurk effect.

  • Access to meaning.

  • Difficult for words at the "edges" of one's vocabulary, e.g. alpaca, cistern.