
Sound localization

What are the factors that determine how well we can tell where a sound is coming from?


Bottom line

Acoustics, peripheral coding, and central processing are all important in sound localization.

Importance of sound localization
  • Clear survival value: evolutionary importance
  • Involves comparisons between ears and across the spectrum: requires a brain
Sound localization: the important phenomena
  • Effects of position in azimuth
  • Effects of frequency
  • Effects of position in elevation
The minimum audible angle (MAA)

Threshold for detecting a change in spatial location of a sound source.

[Diagram: two speakers present Sound 1 and Sound 2 from slightly different positions; the listener judges which sound came from the right.]

MAA at different positions

From Gelfand (1998)

MAA in azimuth

[Figure: MAA as a function of judged position in azimuth.]

From Blauert (1983)

MAA in elevation

From Blauert (1983)

The MAA is
  • Better at midline than to the sides in azimuth
  • Not so good for sounds around 1500 Hz
  • Not as good in elevation as in azimuth

WHY?

Explanations for the characteristics of the MAA
  • Acoustic cues available
  • Peripheral coding
  • Central processing
Acoustic cues used in localization
  • Interaural differences
    • Interaural intensity differences (IIDs)
    • Interaural time differences (ITDs)
  • Spectral shape cues
Cone of confusion

From Gelfand (1998)

Cone of confusion

[Diagram: sources at 30 degrees and 150 degrees azimuth lie on the same cone of confusion and give the same interaural differences.]
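To make the confusion concrete, here is a minimal sketch assuming a simplified spherical-head model in which the ITD depends only on sin(azimuth); the head radius and speed of sound are illustrative values, not taken from the presentation.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate
HEAD_RADIUS = 0.0875    # m, illustrative average head radius (assumption)

def itd_seconds(azimuth_deg):
    """Simplified spherical-head model: the ITD depends only on sin(azimuth),
    so positions mirrored about the interaural axis give identical ITDs."""
    return (2 * HEAD_RADIUS / SPEED_OF_SOUND) * math.sin(math.radians(azimuth_deg))

# A source at 30 degrees (front-right) and one at 150 degrees (back-right)
# produce the same interaural time difference under this model:
print(itd_seconds(30.0), itd_seconds(150.0))  # both ~0.000255 s
```

The same symmetry holds for intensity differences, which is why front-back confusions occur along the cone.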

Why are we better at “straight ahead” than off to the sides?
  • Cone of confusion, front-back confusions
  • In the brain, more neurons respond to “straight ahead” than to “off-to-the side”.
The brain can afford to devote more neurons to localizing straight ahead because
  • we can turn our heads
  • most sounds come from straight ahead
  • it isn’t necessary to localize sounds accurately off to the side
  • none of the above; it can’t afford to do this
IIDs and frequency

From Gelfand (1998)

Why IIDs depend on frequency

[Figure: head shadow illustrated for a 1600 Hz tone.]

From Gelfand (1998)
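One way to see the frequency dependence is to compare wavelength with head size: long wavelengths diffract around the head, so the head casts little shadow. A rough sketch; the head diameter and speed of sound below are illustrative assumptions, not values from the presentation.

```python
SPEED_OF_SOUND = 343.0  # m/s, approximate
HEAD_DIAMETER = 0.175   # m, illustrative assumption

for freq in (200, 500, 1600, 6000):
    wavelength = SPEED_OF_SOUND / freq
    print(f"{freq:5d} Hz: wavelength = {wavelength:.3f} m "
          f"({wavelength / HEAD_DIAMETER:.1f} x head diameter)")

# Long low-frequency wavelengths bend (diffract) around the head, so IIDs
# are small; at high frequencies the wavelength is smaller than the head,
# which then casts an acoustic shadow and produces large IIDs.
```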

The reason that sound is more intense at the ear close to the sound source is
  • the inverse square law; sound has to travel farther to the far ear
  • it takes longer for sound to travel farther
  • the head absorbs and reflects sound on the side closer to the source
  • the head absorbs and reflects sound on the side farther from the source
If there is a talker on your left and a talker on your right, and you really want to hear the talker on your right, you should
  • turn your nose toward the talker on the right
  • turn your nose toward the talker on the left
  • turn your left ear to the left talker and your right ear to the right talker (and listen to your right ear)
Interaural time differences

From Gelfand (1998)

ITDs and frequency: How could your brain know if a sound arrived at one ear .2 ms later than at the other?
Phase ambiguity

[Figure: waveforms at the two ears with a .6 ms interaural delay; at 4000 Hz the same waveforms are consistent with a .1 ms delay.]

1000 Hz: phase difference is 216 degrees

4000 Hz: phase difference is 864 degrees, but looks like 144 degrees


Phase ambiguity

When the interaural delay is longer than the period of the tone, then interaural time comparison gives an ambiguous result.

The comparison gives a result of x degrees phase difference, but the real difference could be 360+x or 720+x, etc.
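A short worked sketch of this ambiguity using the slide's numbers (a .6 ms interaural delay):

```python
def phase_difference_deg(itd_ms, freq_hz):
    """True interaural phase difference for a given ITD and tone frequency."""
    period_ms = 1000.0 / freq_hz
    return (itd_ms / period_ms) * 360.0

itd = 0.6  # ms, as in the slide
for freq in (1000, 4000):
    true_phase = phase_difference_deg(itd, freq)
    apparent = true_phase % 360.0  # phase is only available modulo 360 degrees
    print(f"{freq} Hz: true {true_phase:.0f} deg, looks like {apparent:.0f} deg")

# 1000 Hz: true 216 deg, looks like 216 deg
# 4000 Hz: true 864 deg, looks like 144 deg  (ambiguous)
```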

Interaural onset time comparisons

[Figure: waveforms at the two ears with a .6 ms onset delay.]

How about the first (onset) response?

It can be used, but not all sounds have abrupt onsets, and it only happens once.

Interaural time differences

Maximum ITD = .65 ms

The frequency whose period equals this delay: F = 1/p = 1/(.65 ms) ≈ 1538 Hz

At this frequency (and any higher frequency) the interaural delay can exceed a full period, so interaural phase comparisons become ambiguous.

From Gelfand (1998)
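The same arithmetic, sketched for the maximum head-width delay given on the slide:

```python
MAX_ITD_MS = 0.65  # maximum interaural delay, from the slide

# Lowest frequency whose period equals the maximum ITD:
ambiguity_freq_hz = 1000.0 / MAX_ITD_MS
print(round(ambiguity_freq_hz))  # ~1538 Hz

# For tones at or above this frequency, the interaural delay can exceed a
# full period, so the interaural phase comparison becomes ambiguous.
```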

The reason that phase ambiguity occurs is
  • the ear does not encode the starting phase
  • the ear provides no information about phase
  • phase locking does not occur above 5000 Hz
  • people use the place code for frequencies above 2000 Hz
Interaural cues and frequency

[Schematic: "cue goodness" as a function of frequency (250–16000 Hz); ITDs are good at low frequencies, IIDs at high frequencies.]

Neither cue is so good around 1500 Hz.

Lateralization experiments

Interaural differences under earphones create the perception of a sound source located inside the head, at a position determined by the interaural difference.
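As an illustration of how such a stimulus can be built, here is a rough sketch that delays one earphone channel by a chosen ITD; the sample rate, tone frequency, and use of NumPy are my own assumptions, not part of the presentation.

```python
import numpy as np

def lateralized_tone(freq_hz=500.0, itd_ms=0.3, dur_s=0.5, fs=44100):
    """Return a stereo tone in which the right channel lags the left by itd_ms.
    Over earphones, this kind of stimulus is heard inside the head, displaced
    toward the leading (left) ear."""
    t = np.arange(int(dur_s * fs)) / fs
    left = np.sin(2 * np.pi * freq_hz * t)
    delay_samples = int(round(itd_ms * 1e-3 * fs))
    right = np.concatenate([np.zeros(delay_samples), left])[:left.size]
    return np.column_stack([left, right])

stereo = lateralized_tone()
print(stereo.shape)  # (22050, 2)
```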

Interaural intensity difference discrimination

Good performance.

[Figure: discrimination of interaural intensity differences of 0, 9, and 15 dB.]

From Gelfand (1998)


We know the limitation is acoustic in the case of IIDs, because if we artificially create IIDs at low frequencies, people do hear the sound source at different locations.

Interaural time (phase) difference discrimination

Still can't do high frequency.

Better at small phase separations (straight ahead).

From Gelfand (1998)


We know the limitation is in the auditory system in the case of ITDs, because if we “artificially” create ITDs at high frequencies, people still can't tell where the sound source is.

AM lateralization

From Yost (1994)


We know there is a contribution of the auditory system in the case of differences between positions, because when we artificially create different positions under earphones, people still “lateralize” better for midline positions than for lateral ones.

Explanations for the characteristics of the MAA
  • Acoustic cues available
  • Peripheral coding
  • Central processing
The MAA is
  • Better at midline than to the sides in azimuth
  • Not so good for sounds around 1500 Hz
  • Not as good in elevation as in azimuth

WHY?

What are the acoustic cues to sound elevation?

A sound source moving along an arc directly overhead at midline produces no interaural differences.
What are the acoustic cues to sound elevation?

[Figure: localization judgments under three cue conditions. Elevation - no, azimuth - yes (but with front-back confusions); elevation - no, azimuth - yes; elevation - yes, azimuth - yes.]

Localization in elevation requires pinnas.

From Blauert (1983)

What do pinnas do for us?

The acoustic cue used to localize in elevation is spectral shape.

From Gelfand (1998)


That localization is less precise in elevation than in azimuth suggests that spectral shape is not as good a cue to location as interaural differences.

Localization in azimuth with one ear is similar in precision to localization in elevation.

From Gelfand (1998)

Spectral shape cues are available, but used as supplemental information for localization in azimuth.

Another role of spectral shape cues

Unprocessed sound

Sound shaped by “HRTF filters”
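A rough sketch of the idea behind this kind of demo, assuming a pair of head-related impulse responses is available (the hrir_left / hrir_right arrays below are hypothetical placeholders; real ones would come from a measured HRTF set):

```python
import numpy as np

def apply_hrtf(mono, hrir_left, hrir_right):
    """Convolve a mono signal with left/right head-related impulse responses
    to produce a binaural (two-channel) signal."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.column_stack([left, right])

# Hypothetical placeholder HRIRs; a real pair would be measured or taken
# from a published HRTF database.
fs = 44100
mono = np.random.randn(fs)  # one second of noise
hrir_left = np.zeros(128)
hrir_left[0] = 1.0
hrir_right = np.zeros(128)
hrir_right[20] = 0.5        # delayed and attenuated relative to the left ear

binaural = apply_hrtf(mono, hrir_left, hrir_right)
print(binaural.shape)  # (44227, 2)
```

Played over earphones, the HRTF-shaped version tends to be heard outside the head, unlike the unprocessed sound.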

Reason that localization in azimuth is better straight ahead than off to the side
  • acoustic cues available
  • peripheral coding
  • central processing
Reason that localization in azimuth is not so good for sounds around 1500 Hz
  • acoustic cues available
  • peripheral coding
  • central processing
Reason that localization in elevation is not as good as in azimuth
  • acoustic cues available
  • peripheral coding
  • central processing
Conclusions
  • The cues to sound location are interaural time and intensity differences and spectral shape.
  • Interaural intensity cues are primarily used at high frequencies, due to acoustic limitations.
  • Interaural time cues are primarily used at low frequencies, due to limitations in peripheral coding of sound, although they can be used to localize amplitude modulated sounds.
Conclusions (continued)
  • Spectral cues are used in localization in elevation, to resolve front-back confusions, and to produce the perception of a sound in space.
  • Neural processing limitations make us more sensitive in sound localization straight ahead.
Text sources
  • Blauert, J. (1983). Spatial hearing: The psychophysics of human sound localization. Cambridge, MA: MIT Press.
  • Gelfand, S.A. (1998). Hearing: An introduction to psychological and physiological acoustics. New York: Marcel Dekker.
  • Yost, W.A. (1994). Fundamentals of hearing: An introduction. San Diego: Academic Press.