Visual object categorization with conflicting auditory information
Ruth Adam & Uta Noppeney
Max Planck Institute for Biological Cybernetics, Tübingen
ruth.adam@tuebingen.mpg.de

Introduction
Visual object recognition is associated with category-selective activations in the ventral temporal cortices: while faces enhance activations in the fusiform gyrus, landmarks increase activations in the parahippocampal place area. However, in our natural environment objects often produce signals in multiple sensory modalities. To form a more robust and reliable percept, the human brain is challenged to integrate multiple sensory signals that emanate from the same object. Thus, the question arises whether category-selective activations in the ventral occipito-temporal cortex are influenced by auditory signals of congruent or incongruent object categories.

Scientific Aim
The present study investigates the influence of task-irrelevant and unattended auditory signals on visually evoked category-selective activations in the ventral occipito-temporal cortex.

Experimental Design
• In a visual selective attention paradigm, subjects categorized visual objects while ignoring the preceding congruent or incongruent sound.
• The 2 × 2 factorial design manipulated:
(i) Auditory prime: animal vocalization (auditory 'Face') vs. sound associated with a landmark (auditory 'Landmark').
(ii) Visual target: animal face vs. landmark.
• Visual stimuli were degraded using phase randomization techniques that preserve the mean luminance and RMS contrast of the original images (a sketch of this step follows the fMRI Acquisition and Analysis panel).
• 50% of the stimuli were congruent (e.g. an animal face paired with the sound produced by that animal); 50% were incongruent and combined auditory and visual stimulus components from opposite categories (e.g. an animal face paired with a landmark sound).
• [Figure: example run and timing of two trials]

fMRI Acquisition and Analysis
• Data acquisition:
- SIEMENS TimTrio 3T scanner, GE-EPI, TE = 40 ms, 38 axial slices, TR = 3.08 s, voxel size 3 × 3 × 3 mm.
- 26 subjects: 2 sessions, 208 volumes each.
• Data analysis:
- SPM5, event-related analysis (canonical HRF plus temporal derivative); an illustrative design-matrix sketch follows this panel.
- Random-effects analysis, 2nd-level ANOVA.
- Psychophysiological interaction (PPI) analysis.
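The event-related analysis above was run in SPM5; purely as an illustration, the following Python sketch rebuilds an analogous first-level design matrix with nilearn rather than the authors' SPM pipeline. The condition labels, onsets, and durations are invented placeholders, not the actual stimulus timings.

```python
# Illustrative only: an SPM5-style event-related design matrix rebuilt with
# nilearn. Condition names and onsets are placeholder values.
import numpy as np
import pandas as pd
from nilearn.glm.first_level import make_first_level_design_matrix

TR, N_SCANS = 3.08, 208                      # values from the acquisition panel
frame_times = np.arange(N_SCANS) * TR        # scan acquisition times (s)

# Hypothetical trial list for the 2 x 2 factorial (auditory prime x visual target)
events = pd.DataFrame({
    "onset":      [10.0, 25.0, 40.0, 55.0],             # placeholder onsets (s)
    "duration":   [0.0, 0.0, 0.0, 0.0],                 # modelled as brief events
    "trial_type": ["aFace_vFace", "aFace_vLand",
                   "aLand_vFace", "aLand_vLand"],
})

design = make_first_level_design_matrix(
    frame_times, events,
    hrf_model="spm + derivative",   # canonical HRF plus temporal derivative
    drift_model="cosine",
)
print(design.columns.tolist())      # one regressor pair per condition, plus drifts
```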
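The Experimental Design panel states that the visual targets were degraded by phase randomization while preserving mean luminance and RMS contrast. The exact procedure is not given in the transcript; the NumPy sketch below shows one common recipe under that assumption (the function name phase_randomize and the degrade parameter are ours): keep the Fourier amplitude spectrum, scramble the phases, and rescale the output to the original mean and standard deviation. degrade interpolates between the intact image (0) and a fully phase-scrambled version (1).

```python
import numpy as np

def phase_randomize(img, degrade=1.0, rng=None):
    """One common phase-randomization recipe (assumed, not the authors' code).

    The Fourier amplitude spectrum is kept, random phase noise is added to the
    phase spectrum, and the result is rescaled so that its mean luminance and
    RMS contrast (standard deviation) match the original image.
    """
    rng = np.random.default_rng() if rng is None else rng
    img = img.astype(float)
    f = np.fft.fft2(img)
    amp, phase = np.abs(f), np.angle(f)
    # Random phases with the conjugate symmetry of a real image, obtained by
    # taking the phase spectrum of a real-valued noise image.
    noise_phase = np.angle(np.fft.fft2(rng.standard_normal(img.shape)))
    scrambled = amp * np.exp(1j * (phase + degrade * noise_phase))
    out = np.real(np.fft.ifft2(scrambled))   # imaginary part is numerical noise
    out = (out - out.mean()) / out.std()     # normalize ...
    return out * img.std() + img.mean()      # ... then restore luminance/contrast
```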
Behavioural Results
[Figures: reaction time (mean ± SEM) and accuracy (mean ± SEM) per condition]
• Main effect of the auditory prime: n.s.
• Significant interaction of visual target × auditory prime (= incongruency effect); an analysis sketch follows the Summary & Conclusions panel.

fMRI Results
[Figure: main effects of visual category and auditory category]

Psychophysiological interactions
• Physiological factor: right superior temporal gyrus (auditory Face > auditory Landmark).
• Psychological factor: audiovisual incongruency.
• A simplified regressor sketch follows the Summary & Conclusions panel.

Summary & Conclusions
• Surprisingly, response times and BOLD responses showed different profiles:
• Response times showed a significant interaction between visual and auditory category, i.e. an incongruency effect. This effect emerged primarily from interference of incongruent auditory landmarks with the categorization of visual faces.
• At the neural level, face-selective activations in the FFA were not modulated by the congruency of an irrelevant auditory sound.
• Parahippocampal activations anterior to the PPA showed additive effects of visual and auditory information: activations were increased for (i) visual landmarks relative to faces and (ii) auditory landmarks relative to faces.
• Effective connectivity analyses indicated that visual responses are amplified by incongruent auditory inputs via enhanced coupling between auditory and occipito-temporal cortices.
• Collectively, these results suggest that face-selective responses in the FFA are more robust and less influenced by task-irrelevant sounds, even in the context of a behavioural audiovisual interference effect.
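The Behavioural Results panel reports a null main effect of the auditory prime and a significant visual × auditory interaction in reaction times. The statistics package used in the study is not stated in the transcript; the statsmodels sketch below merely illustrates how such a 2 × 2 repeated-measures ANOVA can be set up, with a long-format table whose column names and RT values are entirely made up.

```python
# Illustrative 2 x 2 repeated-measures ANOVA on reaction times (statsmodels).
# The data, column names, and effect size are invented for demonstration.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
rows = []
for subj in range(1, 27):                      # 26 subjects, as in the study
    for vis in ("face", "landmark"):
        for aud in ("face", "landmark"):
            rt = 520 + rng.normal(0, 20)       # toy baseline RT in ms
            if vis == "face" and aud == "landmark":
                rt += 25                       # toy interference on visual faces
            rows.append({"subject": subj, "visual": vis,
                         "auditory": aud, "rt": rt})
rt_long = pd.DataFrame(rows)                   # one row per subject x cell

# F tests for both main effects and the visual x auditory interaction
print(AnovaRM(rt_long, depvar="rt", subject="subject",
              within=["visual", "auditory"]).fit())
```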
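The psychophysiological-interaction analysis pairs a physiological regressor (the right superior temporal gyrus time course) with a psychological regressor (audiovisual incongruency). SPM5's PPI machinery additionally deconvolves the seed signal to the neuronal level before forming the product; the NumPy sketch below shows only the simplified BOLD-level approximation, with all variable names and toy signals invented for illustration.

```python
# Simplified (BOLD-level) PPI regressor, for illustration only.
# SPM5 also deconvolves the seed time course before the multiplication.
import numpy as np

TR, n_scans = 3.08, 208
rng = np.random.default_rng(1)

seed = rng.standard_normal(n_scans)            # stand-in for the right STG time course
# psychological variable: +1 for incongruent scans, -1 for congruent (toy labels)
psych = np.where(rng.random(n_scans) < 0.5, 1.0, -1.0)

seed_c = seed - seed.mean()                    # mean-centre the physiological regressor
ppi = seed_c * psych                           # interaction (PPI) regressor

# design matrix: [PPI, seed, psych, intercept]; the PPI column captures
# incongruency-dependent coupling between the seed and each target voxel
X = np.column_stack([ppi, seed_c, psych, np.ones(n_scans)])

y = rng.standard_normal(n_scans)               # stand-in target-voxel time course
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("PPI beta:", beta[0])
```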
