
Non-auditory influences on speech perception


Presentation Transcript


  1. Non-auditory influences on speech perception • Presentation by: Emma Arbuckle, PSY 3108

  2. Before getting started… • When you think about decoding speech information, which of our five senses comes to mind first? (i.e. which sense is most involved?) • Can you think of any other senses that may also be involved in speech perception?

  3. Despite our intuition that speech is something we hear, there is overwhelming evidence that the brain treats speech as something we hear, see, and even feel • This implies that speech perception is multimodal

  4. Speech perception as something we see • We read lips to better understand someone speaking in a noisy environment or with a foreign accent • Even with clear speech … lip reading enhances our comprehension of a speaker • While individual differences exist in lip-reading skill, evidence suggests that all sighted individuals from every culture use visual speech info • Whenever we are speaking with someone in person, we use information from seeing the movement of their lips, teeth, tongue, and non-mouth facial features, and we have likely been doing so all our lives

  5. McGurk Effect • Visual speech automatically integrates with auditory speech in a number of different contexts • In the McGurk effect, an auditory speech utterance dubbed synchronously onto a video of a face articulating a discrepant utterance induces subjects to report “hearing” an utterance that is influenced by the mismatched visual component • The “heard” utterance can take a form in which… 1) the visual info overrides the auditory info, or 2) the visual and auditory components combine to create a new perceived utterance (in the classic demonstration, an auditory “ba” dubbed onto a visual “ga” is often heard as “da”)

  6. McGurk effect demonstrations: • http://www.youtube.com/watch?v=jtsfidRq2tw • http://www.youtube.com/watch?v=G-lN8vWm3m0

  7. Read my Lips: Asymmetries in the Visual Expression and Perception of Speech Revealed through the McGurk Effect (Nicholls et al., 2004) • Previous research: • The right side of the mouth moves more, opens earlier, and opens wider than the left during speech production (Wolf & Goodale, 1987) • This suggests that the right side of the mouth is more visually expressive than the left for speech (Campbell, 1986) • Nicholls et al.’s prediction: covering the right side of the mouth would reduce the number of McGurk errors relative to covering the left side

  8. Read my Lips: Asymmetries in the Visual Expression and Perception of Speech Revealed through the McGurk Effect (Nicholls et al., 2004) • The finding that fewer errors were made when the right side of the mouth was covered than when the left was covered demonstrates that the right side is more important than the left in generating the McGurk effect • Prediction: the McGurk effect would be strongest when both sides of the mouth were visible, because this condition maximizes visual input… • Instead, the error rate in the full-mouth condition was very similar to the error rate when the left side was covered • This suggests that the visual information provided by movements of the right side of the mouth is just as informative as the information provided by movements of the entire mouth!

  9. Speech perception as something we feel • Even felt speech, accessed either through touching a speaker’s lips, jaw, and neck or through the kinesthetic feedback from one’s own speech movements, readily integrates with heard speech • Somatosensory signals from the facial skin and muscles of the vocal tract provide a rich source of sensory input in speech production

  10. Somatosensory function in speech perception (Ito et al., 2009) • A robotic device was used to simulate the patterns of facial skin deformation (i.e., stretching) that would normally occur while speaking • By changing the configuration of the robotic device and its wire supports, the facial skin of the participants was stretched in different directions while they listened to words • This variation in skin stretching was found to influence what the participants believed they had heard

  11. Robotic device: [image of the skin-stretch apparatus]

  12. Somatosensory function in speech perception (Ito et al., 2009) • Speech sound classification changed in a predictable way depending on the direction in which the skin was stretched • When the skin was stretched upward, the stimulus was more often judged as “head” • When the skin was stretched downward, the stimulus sounded more like “had”

  13. Somatosensory function in speech perception (Ito et al., 2009) • Principal finding of the study: the perception of speech sounds is modified by stretching the facial skin, and the perceptual change depends on the specific pattern of deformation • Whether any perceptual change occurred depended on the temporal pattern of the stretch: perceptual change was present only when the timing of the skin stretch was comparable to that which occurs during speech production

  14. Important to remember: • Speech information is not equally available across all modalities… • Of course, a greater range of speech information is generally available through hearing than through vision or touch • However, the information that is available takes a common form across modalities … and as far as speech perception is concerned … the modalities are never really separate

  15. Take home messages: • Speech perception entails more than just auditory input • Non-auditory influences on speech perception include visual info and somatosensory info • Visual info through lip reading • Remember: the McGurk effect & right-mouth dominance in lip reading • Somatosensory info through facial deformation patterns • Remember: moving facial muscles while listening to speech influences what is heard • i.e., if the facial muscles are moved in the pattern that would be used to produce the word “head”, the participant is inclined to hear “head”
