Enhanced auditory spatial localization in blind echolocators

Tiziana Vercillo, Jennifer L. Milne, Monica Gori, Melvyn A. Goodale

Robotics, Brain & Cognitive Sciences Department, Fondazione Istituto Italiano di Tecnologia, via Morego 30, 16163 Genoa, Italy
The Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada

Neuropsychologia 67 (2015) 35–40

Article history: Received 18 July 2014; received in revised form 2 December 2014; accepted 3 December 2014; available online 4 December 2014.

Abstract
Echolocation is the extraordinary ability to represent the external environment by using reflected sound
waves from self-generated auditory pulses. Blind human expert echolocators show extremely precise
spatial acuity and high accuracy in determining the shape and motion of objects by using echoes. In the
current study, we investigated whether or not the use of echolocation would improve the representation
of auditory space, which is severely compromised in congenitally blind individuals (Gori et al., 2014). The
performance of three blind expert echolocators was compared to that of 6 blind non-echolocators and 11
sighted participants. Two tasks were performed: (1) a space bisection task in which participants judged
whether the second of a sequence of three sounds was closer in space to the first or the third sound and
(2) a minimum audible angle task in which participants reported which of two sounds presented successively was located more to the right. The blind non-echolocating group showed a severe impairment
only in the space bisection task compared to the sighted group. Remarkably, the three blind expert
echolocators performed both spatial tasks with similar or even better precision and accuracy than the
sighted group. These results suggest that echolocation may improve the general sense of auditory space,
most likely through a process of sensory calibration.
© 2014 Elsevier Ltd. All rights reserved.

Keywords: Visual deprivation; Auditory localization
1. Introduction
Echolocation is the ability to “see” the environment by using
auditory rather than visual cues. It is a perceptual and navigational
strategy used by some animals, such as bats and dolphins, but its
use has also been described in some blind humans. Blind people
who echolocate – for example, by making mouth clicks and listening to the click echoes – demonstrate excellent spatial acuity in
discriminating the direction and distance of objects (Teng et al.,
2012; Thaler et al., 2011). A functional recruitment of deafferented visual areas, which are usually activated by visual stimuli in sighted people, apparently subserves this ability (Arnott et al., 2013; Thaler et al., 2011, 2014). Furthermore, echolocation-specific
activity has been reported in the calcarine cortex, a brain region
typically devoted to vision, suggesting that the visual cortex could
potentially process ‘supra-modal’ spatial functions after the loss of
visual sensory input (Thaler et al., 2011). Moreover, the activation
of the occipital cortex in response to echo stimuli seems to be
organized in a topographic manner (Arnott et al., 2013).
An interesting question is whether or not echolocation expertise plays any role in the representation of natural sounds in
the environment. Gori et al. (2014) recently reported for the first
time that in congenitally blind individuals, such representations
seem to be imprecise and inaccurate compared to those of sighted
people. The researchers tested a group of congenitally blind individuals in two different spatial auditory tasks: a space bisection
task (in which subjects had to report whether the second sound of
a sequence of three was closer in space to the first or to the third)
and a minimum audible angle task. The performance of all the
blind participants tested by Gori et al. (2014) was impaired only in
the bisection task, but no difference was found with respect to the
performance of sighted participants in the minimum audible angle
task, in which a simple comparison between a central and a lateral
sound was needed. The bisection task used in that study, however, required the representation of a complex auditory space in which the relationship between more than two sounds must be understood to perform the task correctly. Moreover, such representations must be assembled over time, since the three stimuli are presented to participants sequentially, indicating an involvement of spatial memory. Thus, the authors suggested that
early visual deprivation does not affect auditory localization per se,
but rather the integration of multiple spatial representations in a
complex map. Indeed, as several researchers have previously
shown, congenitally blind individuals can localize single sound
sources with similar or even higher precision than sighted individuals (Lessard et al., 1998). In support of the results from Gori
et al. (2014), many electrophysiological studies have found that visual deprivation interferes with the development of auditory maps in
animals (King et al., 1993, 1998; Knudsen and Knudsen, 1989;
Wallace and Stein, 1997; Withington-Wray et al., 1990). Moreover,
psychophysical reports have shown that visual information might
be used to calibrate auditory spectral pinna cues. Zwiers et al.
(2001) reported poor localization in early blind individuals on the
elevation plane, where binaural cues are poor and the benefits of
vision are normally maximal.
The study by Gori et al. (2014) raises the question of whether
echolocation could serve to recalibrate the ability of blind individuals to represent sounds in complex spatial configurations.
Specifically, we aimed to determine if blind individuals who have
mastered echolocation techniques show improved auditory spatial
representations. It seems plausible that the audio-motor feedback
provided by the relationship between self-motion and changes in
the echoic information might help in constructing accurate spatial
representations. In other words, prolonged echolocation practice
may compensate for the lack of cross-sensory spatial calibration of
the auditory system through a process of sensory-motor calibration. Moreover, the constant processing of acoustic echo cues – for
example, the time delay between the emitted sound and the echo
and the changes in the spectrum of the sound resulting from the
addition of the echo relative to the emission – might also improve
the sensitivity of sound perception in blind echolocators.
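The echo-delay cue mentioned above scales linearly with object distance, since the pulse travels to the object and back at the speed of sound. A worked sketch (343 m/s is an assumed room-temperature speed of sound; the distances are illustrative, not values from the study):

```python
# Echo delay for a self-generated click: sound travels to the object
# and back, so delay = 2 * distance / speed_of_sound.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def echo_delay_ms(distance_m: float) -> float:
    """Return the emission-to-echo delay in milliseconds."""
    return 2.0 * distance_m / SPEED_OF_SOUND * 1000.0

if __name__ == "__main__":
    for d in (0.5, 1.0, 2.0):
        print(f"object at {d} m -> echo after {echo_delay_ms(d):.1f} ms")
```

Doubling the distance doubles the delay; at the 2 m range reported by Rowan et al. (2013) the echo trails the click by roughly 12 ms.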
In the present study we investigated whether the echo feedback provided by the use of echolocation can interact with auditory spatial representations in a process of auditory recalibration.
We hypothesized that echo feedback associated with motor interaction with objects and the plasticity in the visual cortex reported in blind expert echolocators might improve the mapping of
auditory space in blind individuals.
2. Methods and procedures
2.1. Participants
Eleven sighted participants (mean age 33 ± 3 years; 6 female and 5 male), six congenitally blind non-echolocators (mean age 37 ± 5 years; 3 female and 3 male), and three congenitally blind expert echolocators (E1, E2 and E3; 57 years of age, male; 50 years of age, male; and 54 years of age, female, respectively) were tested on the two spatial tasks. Two of the participants from the congenitally blind group and two other congenitally blind individuals (21 and 25 years of age) were also tested in a control experiment. Sighted participants had normal vision and hearing. Two echolocators (E1, E2) use palatal mouth clicks, while the third prefers a finger snap. E1 and E3 have been using echolocation for as long as they can remember, while E2 started as an adult but has been echolocating for many years. All of them rely on echolocation on a daily basis for navigation and perception. Blind participants were recruited from the general population and tested at the University of Western Ontario (London, Ontario, Canada) and the Istituto David Chiossone in Genoa (Italy). The blind echolocator E3 was recruited at Durham University (Durham, UK). Table S1 (Supplemental material) reports details about the age, pathology and residual vision of the blind participants. Gender and age were similar across the three groups: a two-tailed unpaired t-test revealed no significant difference in age between the sighted and blind non-echolocating participants (t = 0.6356, P = 0.534), and modified t-tests (Crawford and Howell, 1998; see below for description) revealed no differences in age between the echolocator E1 and the groups of sighted (t = 1.971, P = 0.077) and blind participants (t = 1.630, P = 0.154), between the echolocator E2 and the groups of sighted (t = 1.376, P = 0.199) and blind participants (t = 1.051, P = 0.333), or between the echolocator E3 and the groups of sighted (t = 1.71, P = 0.11) and blind participants (t = 1.25, P = 0.26). All participants provided informed
consent. For the blind participants, the document was read aloud
by the experimenter, and the location to sign was indicated with
tactile markers.
2.2. Sound recording procedure
Sounds were recorded with binaural microphones in an echo-dampened room with participant E1. Auditory stimuli were 500 Hz tones of 75 ms duration at 60 dB sound pressure level (SPL). Since Gori et al. (2014) did not find differences in spatial precision during the localization of 500 Hz and 3000 Hz sounds and pink noise (ranging from 0 to 5 kHz) in either sighted or blind individuals, we decided to use the same validated experimental procedure to better compare the results between the two studies. The recording was made at 180 cm from the 0° position (the central location of the speakers). The testing room was equipped with 3.8-cm convoluted foam sheets on the four walls. We presented sounds via a loudspeaker positioned on a platform at the height of the subject's ears and recorded 23 different source sound locations. Each location was recorded several times so that we could select the best sound in terms of interaural time difference (ITD) and interaural level difference (ILD). We analysed the features of the sounds with the open-source software Audacity. The 0° position was located directly in front of the participant, at a distance of 180 cm. The remaining sound locations spanned from −25° to +25° of visual angle (Fig. 1a).
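The ITD/ILD screening of the recordings could be sketched as follows. This is an illustrative alternative, not the authors' actual analysis (they inspected the sounds in Audacity): ITD is estimated by cross-correlation and ILD as an RMS ratio, exercised here on a synthetic stereo tone rather than a real binaural recording.

```python
import numpy as np

def estimate_itd_ild(left, right, fs):
    """Estimate the interaural time difference (ITD, seconds; positive when
    the left channel lags the right) via cross-correlation, and the
    interaural level difference (ILD, dB) via the RMS ratio."""
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)  # lag in samples
    itd = lag / fs
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    ild = 20.0 * np.log10(rms(left) / rms(right))
    return itd, ild

# Synthetic stereo check (illustrative numbers, not the study's recordings):
# a 500 Hz, 75 ms tone that reaches the right ear 10 samples later and 6 dB
# quieter than the left ear, i.e. a source on the listener's left.
fs = 44100
t = np.arange(int(0.075 * fs)) / fs
tone = np.sin(2 * np.pi * 500.0 * t)
left = np.concatenate([tone, np.zeros(10)])
right = 0.5 * np.concatenate([np.zeros(10), tone])
itd, ild = estimate_itd_ild(left, right, fs)
```

For a pure tone the cross-correlation has peaks at every period; with a finite 75 ms burst the true-lag peak dominates because its channel overlap is largest, so the estimated ITD here is −10 samples (left leads) and the ILD is about 6 dB.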
Fig. 1. (a) Sound recording procedure. The sound was presented via a loudspeaker at 23 different locations, spanning ±25° of visual angle. The recording was made with binaural microphones with the subject seated 180 cm in front of the central speaker. (b) Space bisection task. A sequence of three sounds was presented. The locations of the first and the third sound were fixed at −25° and +25° respectively, while the position of the second stimulus (the probe) ranged over ±23° of visual angle. Participants had to report whether the probe stimulus sounded spatially closer to the first or the third sound. (c) Minimum audible angle task. Two auditory stimuli were presented: the standard always at the 0° position, and the probe ranging over ±25° of visual angle. The order of presentation of the two stimuli was randomized over trials. Participants had to report which of the two sounds was located more to the right side of space.
2.3. Procedure

During the experiment, participants sat in a silent room and listened to the binaural recordings over headphones. We asked the sighted participants to keep their eyes closed. We measured auditory thresholds in two spatial tasks: a space bisection task and a minimum audible angle task. The order of the two tasks was randomized over subjects. In the space bisection task (Fig. 1b), participants listened to a sequence of three sounds presented successively at 500 ms intervals. The position of the first sound was set at −25° (on the far left side of the participant), the third at +25° (on the far right side of the participant), and the second sound (the probe stimulus) at an intermediate position that spanned −24° to +24°, determined by a constant stimuli algorithm controlled by Matlab software. Each position of the second sound was repeated 8 times. We asked participants to report whether the second stimulus was spatially closer to the first or to the third sound. Participants performed 168 trials on this task. In the minimum audible angle task (Fig. 1c), only two sounds were presented successively, the standard at 0° and the probe stimulus at a different position that spanned −25° to +25°, again determined by a constant stimuli algorithm. Each position of the probe stimulus was repeated 8 times, for a total of 184 trials. Participants had to report which of the two stimuli was located more to the right side of space. We randomized the order of presentation of the two stimuli to avoid any temporal effect. The very brief inter-stimulus interval of 500 ms used in the minimum audible angle task might generate an illusory motion effect. Because blind individuals are reported to have superior ability in the perception of auditory motion (Lewald, 2013), we decided to replicate the minimum audible angle task in a small group of four congenitally blind participants using an inter-stimulus interval of 1 s instead of 500 ms, to avoid the illusory motion effect.

The experiment was performed according to the principles defined in the Declaration of Helsinki. All testing procedures were approved by the Ethics Board at the University of Western Ontario and by the ASL3 of Genoa (Italy).

2.4. Data analysis

The proportion of trials in which the probe stimulus was judged "closer to the third" in the space bisection task, or "more on the right" in the minimum audible angle task, was computed for each location of the probe stimulus and fitted with cumulative Gaussians (see Fig. 2), yielding the Point of Subjective Equality (PSE, given by the mean) and the threshold (the standard deviation). These two measurements allow direct comparisons between participants based on the percentage of correct responses. For the minimum audible angle task, accuracy represents the perceived location of the standard stimulus, while in the space bisection task it represents the perceived midpoint between the first and the third stimulus. The thresholds are based on the concept of sensitivity: the functioning of a perceptual process is well characterized by its ability to detect differences between stimuli. Indeed, precision is usually defined as the inverse of the threshold. Standard errors for the PSE and threshold estimates were obtained with a bootstrap procedure (Efron and Tibshirani, 1993).

Fig. 2. Psychometric functions for both the space bisection task (panels on the left) and the minimum audible angle task (panels on the right) for the three blind echolocators (E1, E2, E3; green curves), three representative blind non-echolocators (B1, B2, B3; red curves) and three sighted participants (S1, S2, S3; gray curves). (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

We used t-tests, modified by Crawford and Howell, to test the hypothesis that an individual did not come from a population of controls; under the null hypothesis, the individual is an observation from a distribution with the same mean and variance as the controls (Crawford et al., 2010; Crawford and Howell, 1998). Following this method, the control sample scores are treated as statistics rather than as population parameters. The formula for the test is the one set out by Sokal and Rohlf (1995); if the t-value obtained from this test falls below the (negative) one-tailed 5%
critical value, then it can be concluded that the case's score is
sufficiently low to reject the null hypothesis that it is an observation from the population of scores for controls. In this way,
we could compare the performance of each echolocator to that of
the sighted and the blind non-echolocating group. We also used an
analysis of variance (ANOVA) to compare the performance of the
sighted and the blind non-echolocating groups.
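As a concrete sketch of this analysis pipeline: fit a cumulative Gaussian to the response proportions to obtain the PSE (mean) and threshold (standard deviation), then compare an individual's score against the control sample with the Crawford and Howell (1998) modified t-test, t = (x − m) / (s·sqrt(1 + 1/n)) with n − 1 degrees of freedom. The grid-search fit and all numerical values below are illustrative assumptions, not the authors' code or data:

```python
import math

SQRT2 = math.sqrt(2.0)

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian: proportion of 'closer to the third' /
    'more on the right' responses at probe position x (degrees)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * SQRT2)))

def fit_psychometric(positions, proportions):
    """Least-squares grid search for the PSE (mu) and threshold (sigma).
    A coarse stand-in for a proper fitting routine."""
    best = (0.0, 1.0, float("inf"))
    for mu in (m / 10.0 for m in range(-100, 101)):      # -10 deg .. +10 deg
        for sigma in (s / 10.0 for s in range(5, 201)):  # 0.5 deg .. 20 deg
            err = sum((cum_gauss(x, mu, sigma) - p) ** 2
                      for x, p in zip(positions, proportions))
            if err < best[2]:
                best = (mu, sigma, err)
    return best[0], best[1]

def crawford_howell_t(case_score, controls):
    """Crawford & Howell (1998) modified t-test:
    t = (x - mean) / (sd * sqrt(1 + 1/n)), df = n - 1."""
    n = len(controls)
    mean = sum(controls) / n
    sd = math.sqrt(sum((c - mean) ** 2 for c in controls) / (n - 1))
    return (case_score - mean) / (sd * math.sqrt(1.0 + 1.0 / n))

# Synthetic observer with PSE = 2 deg and threshold = 5 deg (illustrative).
positions = [-24 + 4 * i for i in range(13)]             # probe positions, deg
proportions = [cum_gauss(x, 2.0, 5.0) for x in positions]
pse, threshold = fit_psychometric(positions, proportions)

# Compare the fitted threshold against hypothetical control thresholds.
t_case = crawford_howell_t(threshold, [6.0, 7.5, 5.5, 8.0, 6.5])
```

With five controls (df = 4) the one-tailed 5% critical value is about 2.13, so the illustrative case above (t ≈ −1.50) would not be classified as abnormally low.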
3. Results
In contrast to the poor performance of the congenitally blind non-echolocators, the blind echolocators and the sighted participants performed well on the space bisection task. Fig. 2 shows
psychometric functions for both the bisection and the minimum
audible angle tasks in three representative sighted participants
(gray curves), blind non-echolocators (red curves), and blind
echolocators (green curves). Blind participants who do not use
echolocation showed a severe impairment on the space bisection
task: the red curves show a particularly shallow slope, sometimes
near chance. However, the performance of this group of participants did not differ from the performance of sighted participants
in the minimum audible angle task (see red and gray curves in
Fig. 2, panels on the right side). These results are in agreement
with Gori et al. (2014). Measured thresholds for both the sighted
Fig. 3. Average thresholds (with 95% confidence intervals) for sighted (gray bars), congenitally blind (red bars), and blind expert echolocator (green bars) participants for the two spatial tasks: (a) the space bisection task and (b) the minimum audible angle (MAA) task. The dashed line represents the average threshold for the sighted group. (c and d) Individual thresholds (c) and PSEs (d) for the space bisection task plotted as a function of the measured thresholds/PSEs for the minimum audible angle task, calculated from the individual psychometric functions. A shape code identifies the congenitally blind participants (for clinical details see Table S1 in the Supplemental material); individual scores for the blind non-echolocators are indicated by different symbols. Arrows at the margin and filled symbols show the group means, while open symbols show individual data. The open green circle represents the echolocator E1, the open green triangle the echolocator E2, and the green upside-down triangle the echolocator E3. The shaded areas represent the 95% confidence intervals. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
and the blind non-echolocating participants were slightly higher than those previously reported by Gori et al. (2014).
On the other hand, the thresholds (standard deviation of the
curves) and the accuracy (PSE) of the three blind echolocators
were similar to (see green curves in the panels on the right) or
better (see green curves in the panels on the left) than those of the
sighted participants.
The average threshold (Fig. 3a and b) for the congenitally blind non-echolocators was substantially worse than that of the sighted participants (F(1, 15) = 13.760, P = 0.002). In contrast, modified t-tests revealed that echolocator E1's mean threshold was similar to that of the sighted participants (t = 1.235, P = 0.24) and significantly better than the threshold of the blind non-echolocators (t = 2.814, P = 0.03), and that E2's and E3's mean thresholds were significantly better than those of both the sighted participants and the blind non-echolocators (E2: t = 5.253, P = 0.001 and t = 3.772, P = 0.01; E3: t = 4.77, P < 0.001 and t = 5.57, P = 0.006, respectively). Only one congenitally blind participant (red apex-down triangle in Fig. 3c) showed performance similar to that of the three echolocators. We believe that the unusual performance of this blind participant might arise from the residual vision he exhibited early in life and/or from intense rehabilitation training.
Interestingly, not only spatial precision but also accuracy (the 50% point of each psychometric function, which represents the PSE) was higher for the three blind expert echolocators than for the sighted participants (E1: t = 3.730, P = 0.003; E2: t = 4.119, P = 0.002; E3: t = 5.97, P = 0.002). As can be seen in Fig. 3d, there was considerable variance in the PSEs of the congenitally blind non-echolocators. Nevertheless, the blind echolocators still performed better than this group (E1: t = 1.98, P = 0.05; E2: t = 1.90, P = 0.05; E3: t = 1.90, P = 0.05). These results suggest that echolocation expertise may contribute to and support the development of a sophisticated and well-calibrated auditory spatial map of the Euclidean relationships necessary to perform the bisection task.
Echolocation expertise was also associated with impressive improvements in both precision and accuracy in the minimum audible angle task. As Fig. 3b shows, the thresholds for the three blind echolocators on this task were almost half of those of the other participants. The modified t-tests confirmed that the echolocators' performance was significantly better than that of the sighted participants and the congenitally blind non-echolocators (E1: t = 4.881, P < 0.001 and t = 3.322, P = 0.01, respectively; E2: t = 4.633, P < 0.001 and t = 3.206, P = 0.01, respectively; E3: t = 2.872, P = 0.008 and t = 2.38, P = 0.03, respectively). There was no significant difference between the thresholds of the congenitally blind non-echolocators and the sighted participants (F(1, 15) = 1.5151, P = 0.237).
With respect to accuracy, both the congenitally blind non-echolocators and the sighted participants revealed a slight bias to the left (see Fig. 3d). In contrast, the blind echolocators showed impressive spatial acuity, with significantly higher accuracy than
the sighted participants (E1: t = 6.126, P < 0.001; E2: t = 4.126, P = 0.002; E3: t = 4.13, P = 0.002). In comparison to the accuracy of the congenitally blind non-echolocators, E1's performance was significantly better (t = 2.486, P = 0.03), but E2's and E3's performance did not differ from that group (t = 1.31, P = 0.42 and t = 1.32, P = 0.24). To summarize, blind echolocators showed similar or even greater precision than sighted participants in perceiving the relative position of sounds in the bisection task whereas, in agreement with previous research (Gori et al., 2014), the congenitally blind non-echolocators were impaired. Moreover, even on the simpler minimum audible angle task, the blind echolocators performed better than both the congenitally blind non-echolocators and the sighted participants, who did not differ from each other.

Fig. 4. Average thresholds for two groups of congenitally blind individuals in the minimum audible angle task with an inter-stimulus interval of 500 ms (N = 6, red bar) and 1 s (N = 4, striped red bar). The difference between the two conditions is not statistically significant. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
Fig. 4 shows average thresholds for the minimum audible angle task with an inter-stimulus interval (ISI) of 500 ms (red bar) and 1 s (red striped bar), measured in the two different groups of congenitally blind participants. Increasing the temporal separation between stimuli slightly improved the spatial precision of the blind individuals, reducing the average threshold to 12.8 ± 4°. This difference, however, was not statistically significant (unpaired two-sample t-test, t = 1.09, P = 0.30), suggesting that blind participants were not using motion information as a cue during the spatial task.
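The ISI comparison above is a standard unpaired two-sample t-test. A minimal pooled-variance sketch, with hypothetical per-subject thresholds (the numbers are assumptions for illustration, not the study's data):

```python
import math

def unpaired_t(a, b):
    """Pooled-variance unpaired two-sample t statistic
    (df = len(a) + len(b) - 2)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    ss = sum((x - ma) ** 2 for x in a) + sum((x - mb) ** 2 for x in b)
    pooled_var = ss / (na + nb - 2)
    return (ma - mb) / math.sqrt(pooled_var * (1.0 / na + 1.0 / nb))

# Hypothetical per-subject MAA thresholds (deg) for the two ISI groups.
thr_isi_500ms = [16.0, 18.5, 14.0, 17.0, 15.5, 19.0]   # N = 6
thr_isi_1s = [15.0, 17.5, 13.0, 16.5]                  # N = 4
t_stat = unpaired_t(thr_isi_500ms, thr_isi_1s)
```

With df = 8 this illustrative t falls well below the two-tailed 5% critical value (about 2.31), mirroring the non-significant group difference reported above.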
4. Discussion
Representing external space is a key function of the brain for
navigating (and understanding) the world. Without vision, accurate and precise sound localization becomes crucial since audition
provides the only source of information about objects beyond
reachable space. Gori et al. (2014) recently reported that congenitally blind individuals have a severe deficit in perceiving the
spatial relations of sounds in their immediate environment. The
authors proposed that early visual deprivation may disturb the
normal process of cross-sensory calibration between vision and
audition (Gori et al., 2008), affecting the development of complex
spatial auditory maps, leaving simple topography preserved. In the
current study, we not only confirmed the auditory spatial deficit
previously described by Gori et al. (2014), but also revealed that
the use of echolocation might help in preserving this auditory
spatial function.
The average thresholds measured in our study in the sighted and blind non-echolocating groups were slightly higher than those previously reported (Gori et al., 2014). This is true for both groups and both tasks, apart from the performance of blind non-echolocators in the space bisection task. We believe that these
differences may be due to the use of recorded sounds as opposed
to the natural sound sources used by Gori et al. (2014); in other
words, it may be the case that the procedure of sound recording
generates noisier stimuli. Even so, blind non-echolocators showed
impaired performance in the space bisection in both the current
and previous studies, despite differences in stimuli. Conversely,
the blind non-echolocators exhibited good performance, with
thresholds similar to those of the sighted group, in the minimum
audible angle task, in agreement with the previous research
(Lessard et al., 1998; Roder et al., 1999). It could be argued that the
short temporal separation between the two sounds in the minimum audible angle task induces a motion illusion which could be
used to deduce the spatial position of the sounds. For example a
sequence of sounds where the first stimulus is more to the right
than the second one might be interpreted as a sound moving to
the left. Thus, blind participants might have performed the localization task as a motion discrimination task, where it has already been shown that their performance exceeds that of sighted individuals (Lewald, 2013). We can exclude this hypothesis, however, since congenitally blind non-echolocators showed no difference in performance on the minimum audible angle task at the two inter-stimulus intervals (500 ms and 1 s). We suggest that early visual deprivation affects only the integration of multiple spatial auditory representations.
In contrast to the performance of the blind non-echolocating
participants, the performance of the three blind expert echolocators was impressively precise and accurate in both tasks. Indeed,
the three echolocators performed the bisection task with similar
precision to that of the sighted group and with even higher precision than the sighted and blind non-echolocating groups on the
minimum audible angle task. The results of the current study
suggest that the routine use of echolocation for navigation not
only preserves the auditory spatial ability in blind individuals who
have acquired this skill, but also leads to an improvement of auditory sensitivity. Early adoption of echolocation does not seem to
be essential for the improvement in auditory spatial processing
that we found. One of the echolocators who we tested learned to
echolocate later in life. Thus, the use of echolocation as a primary
source for navigation, independent of when it was adopted, appears to be the essential factor.
Based on the results of the current study, echolocation serves as
a valuable tool that not only allows blind people to navigate their
environment, but also provides audio-motor feedback that may
aid in sensory calibration, as previously suggested by Kolarik et al.
(2014). We believe that this sensory-motor feedback may promote
the development of an accurate spatial representation of auditory
space beyond peripersonal space.
Of course, blind non-echolocators could easily calibrate their
auditory representation in the immediate haptic workspace (by
touching the sound source), but they might be expected to show
difficulties with auditory stimuli in far space, unlike blind echolocators. During navigation, self-generated sound pulses provide
sensory feedback, the echo, which is fundamental for developing a
sensory-motor association similar to visual-motor exploration.
Moreover, the echo can provide accurate information about the
location of objects within the range of 2 m (Rowan et al., 2013), a
greater distance than the one used in our experiment (1.8 m). The
idea of sensory-motor recalibration is also supported by previous
research showing that the development of spatial capabilities is
driven by the interaction between visual perception and the execution of movements (Brambring, 2006) and by empirical data
suggesting that sensory substitution devices are effective only
when the person manipulates the device that is being used to
explore the environment (Bach-y-Rita, 1972).
The use of echolocation on a daily basis may result in higher
sensitivity to auditory cues simply due to the necessary reliance on
(and thus experience with) such cues to support perception via
echoes. For example, the use of interaural echo timing and level
differences (relative to the signal emissions) as well as comparisons of spectral information may also help to support improved
auditory spatial processing in general (i.e. with source-auditory
information as opposed to echo information). Furthermore, it is
possible that the more accurate calibration of auditory space in
echolocation experts is facilitated by the topographic mapping of
the echoes in what would normally be visual areas in the sighted
brain (Arnott et al., 2013; Teng et al., 2012; Thaler et al.,
2011, 2014). Future research should address these (and other)
possibilities in order to better understand the relationship between echolocation expertise and improved auditory spatial processing.

An impairment in spatial memory after visual deprivation may
be another possible explanation of our results. Indeed, the performance of the blind non-echolocators we tested was strongly
affected only in the space bisection task, in which the spatial locations of three sounds have to be memorized and then simultaneously recalled to evaluate the spatial configuration. Conversely, the minimum audible angle task has a smaller memory load, since participants had to memorize the positions of only two sounds. This hypothesis is supported by previous studies showing difficulties in spatial processing in blind adults due to the simultaneous processing of independent spatial representations (Vecchi et al., 2004) and deficits in spatial recall in blind children (Millar, 1975). According to this hypothesis, the use of echolocation could improve auditory localization by increasing the
spatial memory storage capacity.
In sum, the current findings suggest that the use of echolocation may have important benefits for the representation of auditory space in general, possibly eradicating deficits typically seen in
blind individuals. This study has important implications for early
mobility training in the blind. Our results also open the door for
future research in this area, which could clarify the role of auditory-motor recalibration in the development of visual-like spatial representations.
Appendix A. Supplemental material
Supplementary data associated with this article can be found in
the online version at http://dx.doi.org/10.1016/j.neuropsychologia.
References

Arnott, S.R., Thaler, L., Milne, J.L., Kish, D., Goodale, M.A., 2013. Shape-specific activation of occipital cortex in an early blind echolocation expert. Neuropsychologia 51 (5), 938–949. http://dx.doi.org/10.1016/j.neuropsychologia.2013.01.024.
Bach-y-Rita, P., 1972. Brain Mechanisms in Sensory Substitution. Academic Press, New York, NY.
Brambring, M., 2006. Divergent development of gross motor skills in children who are blind or sighted. J. Vis. Impair. Blind. 100 (10), 620–634.
Crawford, J.R., Garthwaite, P.H., Porter, S., 2010. Point and interval estimates of effect sizes for the case-controls design in neuropsychology: rationale, methods, implementations, and proposed reporting standards. Cogn. Neuropsychol. 27 (3), 245–260.
Crawford, J.R., Howell, D.C., 1998. Comparing an individual's test score against norms derived from small samples. Clin. Neuropsychol. 12 (4), 482–486.
Efron, B., Tibshirani, R.J., 1993. An Introduction to the Bootstrap. Chapman & Hall, New York, NY.
Gori, M., Del Viva, M., Sandini, G., Burr, D.C., 2008. Young children do not integrate visual and haptic form information. Curr. Biol. 18 (9), 694–698. http://dx.doi.
Gori, M., Sandini, G., Martinoli, C., Burr, D.C., 2014. Impairment of auditory spatial localization in congenitally blind human subjects. Brain 137 (Pt 1), 288–293.
King, A.J., Carlile, S., 1993. Changes induced in the representation of auditory space in the superior colliculus by rearing ferrets with binocular eyelid suture. Exp. Brain Res. 94 (3), 444–455.
King, A.J., Schnupp, J.W., Thompson, I.D., 1998. Signals from the superficial layers of the superior colliculus enable the development of the auditory space map in the deeper layers. J. Neurosci. 18 (22), 9394–9408.
Knudsen, E.I., Knudsen, P.F., 1989. Visuomotor adaptation to displacing prisms by adult and baby barn owls. J. Neurosci. 9 (9), 3297–3305.
Kolarik, A.J., Cirstea, S., Pardhan, S., Moore, B.C., 2014. A summary of research investigating echolocation abilities of blind and sighted humans. Hear. Res. 310, 60–68. http://dx.doi.org/10.1016/j.heares.2014.01.010.
Lessard, N., Pare, M., Lepore, F., Lassonde, M., 1998. Early-blind human subjects localize sound sources better than sighted subjects. Nature 395 (6699), 278–280. http://dx.doi.org/10.1038/26228.
Lewald, J., 2013. Exceptional ability of blind humans to hear sound motion: implications for the emergence of auditory space. Neuropsychologia 51 (1), 181–186. http://dx.doi.org/10.1016/j.neuropsychologia.2012.11.017.
Millar, S., 1975. Spatial memory by blind and sighted children. Br. J. Psychol. 66 (4),
Roder, B., Teder-Salejarvi, W., Sterr, A., Rosler, F., Hillyard, S.A., Neville, H.J., 1999. Improved auditory spatial tuning in blind humans. Nature 400 (6740), 162–166.
Rowan, D., Papadopoulos, T., Edwards, D., Holmes, H., Hollingdale, A., Evans, L., Allen, R., 2013. Identification of the lateral position of a virtual object based on echoes by humans. Hear. Res. 300, 56–65. http://dx.doi.org/10.1016/j.
Sokal, R.R., Rohlf, J.F., 1995. Biometry. W.H. Freeman, San Francisco, CA.
Teng, S., Puri, A., Whitney, D., 2012. Ultrafine spatial acuity of blind expert human echolocators. Exp. Brain Res. 216 (4), 483–488. http://dx.doi.org/10.1007/
Thaler, L., Arnott, S.R., Goodale, M.A., 2011. Neural correlates of natural human echolocation in early and late blind echolocation experts. PLoS One 6 (5), e20162. http://dx.doi.org/10.1371/journal.pone.0020162.
Thaler, L., Milne, J.L., Arnott, S.R., Kish, D., Goodale, M.A., 2014. Neural correlates of motion processing through echolocation, source hearing, and vision in blind echolocation experts and sighted echolocation novices. J. Neurophysiol. 111 (1), 112–127. http://dx.doi.org/10.1152/jn.00501.2013.
Vecchi, T., Tinti, C., Cornoldi, C., 2004. Spatial memory and integration processes in congenital blindness. Neuroreport 15 (18), 2787–2790.
Wallace, M.T., Stein, B.E., 1997. Development of multisensory neurons and multisensory integration in cat superior colliculus. J. Neurosci. 17 (7), 2429–2444.
Withington-Wray, D.J., Binns, K.E., Dhanjal, S.S., Brickley, S.G., Keating, M.J., 1990. The maturation of the superior collicular map of auditory space in the guinea pig is disrupted by developmental auditory deprivation. Eur. J. Neurosci. 2 (8),
Zwiers, M.P., Van Opstal, A.J., Cruysberg, J.R., 2001. A spatial hearing deficit in early-blind humans. J. Neurosci. 21 (9).