Multisensory Research 28 (2015) 195–226
brill.com/msr
Neural Correlates of Human Echolocation of
Path Direction During Walking
Katja Fiehler 1,∗,∗∗, Immo Schütz 1,∗, Tina Meller 1 and Lore Thaler 2,∗∗
1 Department of Psychology, Justus-Liebig-University Giessen, Otto-Behaghel-Str. 10F, 35394 Giessen, Germany
2 Department of Psychology, Durham University, South Road, Science Site, Durham DH1 3LE, UK
Received 15 October 2014; accepted 24 March 2015
Abstract
Echolocation can be used by blind and sighted humans to navigate their environment. The current
study investigated the neural activity underlying processing of path direction during walking. Brain
activity was measured with fMRI in three blind echolocation experts, and three blind and three sighted
novices. During scanning, participants listened to binaural recordings that had been made prior to
scanning while echolocation experts had echolocated during walking along a corridor which could
continue to the left, right, or straight ahead. Participants also listened to control sounds that contained
ambient sounds and clicks, but no echoes. The task was to decide if the corridor in the recording continued to the left, right, or straight ahead, or if they were listening to a control sound. All participants
successfully dissociated echo from no-echo sounds; however, echolocation experts were superior at
direction detection. We found brain activations associated with processing of path direction (contrast: echo vs. no echo) in superior parietal lobule (SPL) and inferior frontal cortex in each group.
In sighted novices, additional activation occurred in the inferior parietal lobule (IPL) and middle and
superior frontal areas. Within the framework of the dorso-dorsal and ventro-dorsal pathway proposed
by Rizzolatti and Matelli (2003), our results suggest that blind participants may automatically assign
directional meaning to the echoes, while sighted participants may apply more conscious, high-level
spatial processes. High similarity of SPL and IFC activations across all three groups, in combination
with previous research, also suggests that all participants recruited a multimodal spatial processing
system for action (here: locomotion).
Keywords
Blindness, vision, audition, space perception, navigation, PPC, fMRI
* These authors contributed equally to the work.
** To whom correspondence should be addressed. E-mail: [email protected]; [email protected]
© Koninklijke Brill NV, Leiden, 2015
DOI:10.1163/22134808-00002491
1. Introduction
Echolocation is the ability to sense the environment through reflection of
sound (Griffin, 1944). It is probably best known from bats and marine mammals (Thomas et al., 2004), but it is by now well established that humans are
able to use echolocation as well (Kolarik et al., 2014; Schenkman and Nilsson,
2010; Stoffregen and Pittenger, 1995), and that echolocation can be learned
by both blind (e.g., Worchel and Mauney, 1951) and sighted people (e.g., Ammons et al., 1953; Teng and Whitney, 2011). In fact, some blind humans who
echolocate using mouth-clicks can echolocate with an accuracy approaching
that of some bat species (Teng et al., 2012). Skilled echolocators can reliably
determine the distance and direction to objects (Rice and Feinstein, 1965; Rice
et al., 1965; Rosenblum et al., 2000; Schoernich et al., 2013), as well as their
azimuth (Thaler et al., 2011; Wallmeier et al., 2013). They can also use echolocation to determine the shape of sound reflecting surfaces in 3D (Arnott et al.,
2013; Thaler et al., 2011) and 2D (Milne et al., 2014a), as well as what materials a sound reflecting surface is made of (Arnott et al., 2013; Hausfeld et al.,
1982; Milne et al., 2014b).
Only recently have scientists started to investigate brain areas involved in
human echolocation. It has been reported that echolocation of objects and
scenes recruits calcarine cortex (i.e., primary visual cortex) in skilled blind
echolocators (Thaler et al., 2011). Following up on this initial finding, subsequent studies investigated the neural representation of specific echolocation
features, such as movement (Thaler et al., 2011, 2014), shape (Arnott et al.,
2013), or surface material (Milne et al., 2014b). From research to date it appears that there may be a feature-specific organization. For example, echolocation of moving surfaces leads to an increase in activation in temporal-occipital
brain areas, potentially encroaching on visual motion area MT+ (Thaler et
al., 2011, 2014). Furthermore, shape processing through echolocation is associated with activation in LOC (Arnott et al., 2013), and processing of surface
materials is associated with an increase in activity in parahippocampal cortex
(Milne et al., 2014b). It has also been shown that echolocation of surfaces
positioned at one side can lead to a relative increase in brain activity in contralateral calcarine cortex (Thaler et al., 2011), or (for moving surfaces) in
contralateral temporal-occipital brain areas (Thaler et al., 2014). There is also
evidence suggesting that surfaces located more towards the periphery lead to
more rostral activation in calcarine cortex, whereas more centrally located surfaces lead to a relative increase of activation at the occipital pole (Arnott et al.,
2013). In sum, evidence gathered in blind echolocation experts to date suggests that neural processing for echolocation may be organized in a feature-specific way and that it might include pathways typically associated with vision in sighted people.
One of the primary uses of echolocation is that it can provide information about the spatial environment useful for navigation. For example, bats
use echolocation to avoid obstacles, locate passageways or to detect prey
(Grunwald et al., 2004; Schnitzler et al., 2003; Weissenbacher and Wiegrebe,
2003). Blind echolocation experts also comment that a primary benefit of echolocation is that it provides information beyond reachable space, which improves their mobility and orientation. Accordingly, blind people who
echolocate report having significantly better mobility in unfamiliar places as
compared to blind people who do not echolocate (Thaler, 2013). Also consistent with this, behavioral studies have shown that echolocation can be used to
detect doorways (e.g., Carlson-Smith and Wiener, 1996) and obstacles (e.g.,
Cotzin and Dallenbach, 1950; Supa et al., 1944) during walking. It is not
known, however, which brain areas are involved when echolocation is used
to orient oneself in the environment, despite studies investigating how spatial
locations per se are represented in the echolocating brain (e.g., Arnott et al.,
2013; Thaler et al., 2011, 2014).
In sighted humans, visual information from the calcarine cortex onwards is
processed along two pathways: a ventral pathway projecting from the primary
visual cortex to the infero-temporal cortex, and a dorsal pathway projecting
from the primary visual cortex to the posterior parietal cortex (PPC). Based on lesion studies in monkeys and humans, the dorsal pathway
has been associated with visual spatial localization and goal-directed action,
whereas the ventral pathway has been associated with object identification
and conscious visual perception (Goodale and Milner, 1992; Ungerleider and
Mishkin, 1982). For example, patients with lesions in the superior parietal
lobule (SPL) are often impaired in reaching to visual targets in the periphery,
a deficit termed Optic Ataxia (Pisella et al., 2009). Patients with damage to
the inferior parietal lobule (IPL) commonly suffer from an inability to detect,
orient toward or respond to left (contralesional) stimuli, known as Neglect
(Heilman et al., 2000; Karnath and Perenin, 2005; Vallar and Perani, 1986). In
contrast, patients with lesions to the ventral stream, e.g., the LOC, suffer from
Visual Form Agnosia and are unable to identify objects, whilst still being able
to grasp them (Goodale et al., 1991; Westwood et al., 2002). A division of
labor between dorsal and ventral pathways has also been suggested within the
auditory system (Kaas and Hackett, 1999; Rauschecker, 2011; Rauschecker
and Tian, 2000). Thus, both for audition and vision, the PPC in the sighted
brain has been implicated in processing of spatial information with particular
relevance for action and spatial orientation.
Less is known about the neural underpinnings of spatial processing for action and orientation in blind humans. Loss of vision is typically associated
with loss in mobility and orientation skills (Brabyn, 1982; Brown and Brabyn,
1987; Deiaune, 1992; Long, 1990; Long et al., 1990; Roentgen et al., 2009;
Salive et al., 1994). This highlights just how much people rely on vision for
orienting themselves. Without vision, spatial information about the distal environment has to be received through other sensory modalities, in particular
audition (note that touch, temperature and smell/taste apply to the proximal
rather than distal environment). Another way to sense the distal environment is via sensory substitution devices, which transform information about the
distal environment obtained via artificial sensors into auditory or tactile information (Bach-y-Rita and Kercel, 2003; Brabyn, 1982; Roentgen et al., 2009).
In regard to spatial hearing on the behavioral level, blind people, in particular those who are early blind, are better than sighted people at discriminating the azimuth of peripheral sound sources (Voss et al., 2004) and at monaural sound localization (Lessard et al., 1998), and they also show better spatial tuning in the periphery (Röder et al., 1999; Voss et al., 2004). Most
notably, both early and late blind people are also better than sighted people
at discriminating distances of sound sources (Voss et al., 2004). However,
some investigations have also reported deficits in auditory–spatial tasks; for
example people who are congenitally blind are impaired relative to sighted
controls in detecting the elevation of an auditory target (Zwiers et al., 2001) or
when spatially bisecting an auditory target array (Gori et al., 2013). Interestingly, Vercillo et al. (2015) showed that the performance of congenitally blind
echolocators in an auditory spatial bisection task was similar or even better,
compared to the performance of sighted and non-echolocating blind participants, respectively. This suggests that echolocation experience may compensate for the lack of visual calibration of auditory spatial maps in congenitally
blind people.
Blindness is not only associated with complex changes on the behavioral
level, but also on the neural level (for reviews see, e.g., Bavelier and Neville,
2002; Burton, 2003; Merabet and Pascual-Leone, 2010; Noppeney, 2007;
Röder and Rösler, 2004). In regard to spatial auditory processing, improved
auditory performance in early and congenitally blind humans has been linked
to the recruitment of occipital brain areas (Collignon et al., 2009a; Gougoux
et al., 2005), and parts of the PPC associated with spatial processing of visually perceived objects in sighted people (Collignon et al., 2007, 2009b, 2011;
Lingnau et al., 2014). Also for tactile processing it has been shown repeatedly that blind people as compared to sighted people have superior ability to
read Braille and (possibly related to this) better tactile acuity (Goldreich and
Kanics, 2003; Grant et al., 2000; Van Boven et al., 2000; Wong et al., 2011).
In terms of brain activity, processing of tactile input, and in particular Braille
reading, has also been linked to activity in striate and extra-striate visual areas
(Büchel, 1998; Cohen et al., 1997; Sadato et al., 1996).
With respect to navigation and/or spatial orientation specifically, it has been
shown that blind people who have been trained to navigate in an environment
using a sensory substitution device that transforms visual information into
electrotactile stimulation on the tongue perform better than equally trained
sighted blindfolded controls (Kupers et al., 2010). Furthermore, in the same
study Kupers et al. (2010) also showed that brain activation during route
recognition in blind people coincided with locations of activations in sighted
people performing the task based on visual information, and that the largest
cluster of activation was in the PPC, in particular SPL, with other common
activations in superior occipital cortex, cuneus and parahippocampus. This
suggests that the ‘visual’ navigation system may be usurped by navigation
through other modalities.
In this study we investigated which brain areas are involved during echolocation of path direction during walking in a naturalistic setting inside and
outside a building. To this end, we compared brain activations as measured
with fMRI in three skilled blind echolocators to those measured in three blind
and three sighted control subjects who had rarely or never used echolocation
before. During fMRI scanning, participants listened to pre-recorded echolocation clicks and echoes that had been recorded when walking through a corridor
inside and outside a building. After sound presentation they had to decide
whether the walkway within the corridor continued to the left, straight ahead
or to the right. Participants also listened to control recordings that contained
clicks but not echoes.
2. Materials and Methods
2.1. Participants
Three early blind, male echolocation experts (BE1, BE2, BE3) participated in
this study. All reported using tongue-click echolocation on a daily basis. Both BE1 (age 41) and BE2 (age 42) were enucleated in infancy due to retinoblastoma (BE1 at 18 months (left eye) and 30 months (right eye); BE2 at 12 months (both eyes)) and had used echolocation since childhood, starting at ages 8–10 years and 4 years, respectively. BE3 (age 16) completely lost his sight due to congenital amaurosis at 36 months and started to use echolocation at 3.5 years of age. All echolocation experts were right-handed as measured with
the Edinburgh Handedness Inventory (EHI; Oldfield, 1971) and reported no
residual vision and normal hearing. We tested six male control participants
who reported being unfamiliar with echolocation prior to the study. They were
matched by gender, age, handedness and education to the three echolocation
experts (Table 1). The three blind novices (BN1–3, aged 33, 37, 22 years) also
lost their sight shortly after birth. BN1 and BN2 reported diffuse brightness detection, whereas BN3 lacked any light perception since he was enucleated in the
first months after birth. Sighted participants (SN1–3, aged 36, 38, 20 years) had normal or corrected-to-normal vision. The experiment was conducted in accordance with the Declaration of Helsinki (2008) and approved by the local ethics committees. All participants gave written informed consent.

Table 1.
Sample description of echolocation experts (BE), blind novices (BN) and sighted novices (SN). The handedness score was assessed with the Edinburgh Handedness Inventory (Oldfield, 1971; right-handed: maximum score +100, left-handed: maximum score −100)

Subject  Gender  Age  EHI  Education   Blindness since                             Cause of blindness                 Degree of blindness
BE1      Male    41   64   A-level     18 months first eye, 30 months second eye   Enucleation due to retinoblastoma  Total, no light detection
BE2      Male    42   91   A-level     12 months both eyes                         Enucleation due to retinoblastoma  Total, no light detection
BE3      Male    16   91   Highschool  36 months                                   Congenital amaurosis               Total, no light detection
BN1      Male    33   82   A-level     Birth                                       Genetic defect                     Detection of bright light
BN2      Male    37   100  A-level     Birth                                       Congenital amaurosis               Detection of bright light
BN3      Male    22   82   A-level     Both eyes first month                       Enucleation due to retinoblastoma  Total, no light detection
SN1      Male    36   92   A-level     –                                           –                                  –
SN2      Male    38   100  A-level     –                                           –                                  –
SN3      Male    20   100  A-level     –                                           –                                  –
2.2. Apparatus and Stimuli
2.2.1. Recording Procedure and Setup
Stimuli were created by recording echolocation clicks and echoes from each
echolocation expert in different spatial scenarios. Binaural recordings were
made both in an indoor and outdoor environment while each expert walked
through a corridor, which was constructed from four poster-boards made of
wood fibers and attached to metal stands. Corridors were 185 cm long and
110 cm wide and opened to the left, the right or continued straight ahead
(see Fig. 1 for exact dimensions), resulting in six different scenarios (left-indoor/outdoor, straight-indoor/outdoor and right-indoor/outdoor). Start and
end points of the corridor were marked haptically to assure the same walking
distance of approx. 150 cm for each participant in every trial. In the indoor
environment, the corridor was set up in the entrance hall of the university
building. Outdoors, the corridor was placed on grass next to the building. In
both environments, the echolocation experts walked along the corridor without
shoes in order to minimize additional acoustic information. For the same reason, the ground was covered with fleece blankets in the outdoor environment,
which were also used to cover surrounding objects (e.g., picture frames) in
the indoor setting. Consistent with previous studies (e.g., Thaler et al., 2011,
2014), in-ear omni-directional microphones (Sound Professionals-TFB-2; flat
frequency range 20–20 000 Hz) were placed at the opening of the participant’s
auditory canals and attached to a portable Edirol R-09 digital wave recorder
(24-bit, stereo, 96 kHz sampling rate). The experts were instructed to slowly
walk through the setup facing straight ahead, while clicking loudly with their
usual frequency and pausing for a short moment at the critical point where they recognized a change in the direction of the corridor, if present. For each echolocation expert, recordings were created when participants were walking and clicking, and whilst walking without making clicks. Participants were timed during walking to make sure that the start and end of the walking path would be traversed within 10 s at a steady pace. Recordings were made separately for BE1, BE2 and BE3, with six to eight recordings per expert and scenario. Only the blind experts traversed the corridor in the recording phase; the BN and SN groups never physically traversed the corridor.

Figure 1. Schematic representation of the stimulus recording setup for ‘left’, ‘straight ahead’ and ‘right’ corridor conditions. Dark grey bars denote building walls, light grey bars indicate the felt start and end positions of the walking paths and black bars the positions of mobile poster boards used to create a corridor. Blind experts (BE) slowly walked from the start position to the end position while producing click sounds.
2.2.2. Stimulus Processing and Selection
Sounds were processed in Audacity (2.0.2, 2012). Prior testing had revealed
a slight imbalance between right and left microphone channels. Thus, prior to
any further processing the left channel of sounds was amplified by 0.44 dB.
Because of specifications of the software used to present sound stimuli
(Presentation 16.1, Neurobehavioral Systems) sounds were downsampled to
44.1 kHz. For each scenario and echolocation expert, two recordings were selected based on objective (absence of interference sounds like a crossing car)
and subjective (identifiability of the directions as rated by the experts) criteria.
Control stimuli which did not contain the click echoes were created as follows.
First, we cut samples from recordings during which participants had walked
without clicking to a length of 10 s. Then, for the matching echolocation conditions (i.e., walking whilst clicking), we isolated the left channel,
and selected the clicks within that channel, whilst taking care to truncate the
main part of the echo (based on visual criteria). This truncation served to
remove monaural information contained in click-echoes. Subsequently, each
truncated click was inserted into an empty (i.e., silent) track so that the onset
of each truncated click matched the onset of its ‘partner’ click in the echolocation stimulus. Subsequently, the empty + click track (which up to this point was left-channel only) was duplicated to create a stereo track. We chose to
duplicate the left-truncated click instead of truncating and copying both the
left and right track from the original, in order to avoid binaural information
that could have possibly still been present in the truncated clicks. Then these
stereo empty-click trains were merged with the 10-s track from when participants had walked without clicking. Using this procedure, we created a control
clip for each echolocation clip. Importantly, control clips were matched to
echolocation clips both in terms of background and ambient sounds, as well
as in regard to the spectro-temporal features of clicks, whilst truncation and
channel-doubling essentially removed mono- and binaural echo information.
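As a rough illustration of this procedure, the following sketch (not the authors' actual script; file names and click-onset indices are hypothetical, and numpy + soundfile are assumed) builds a control clip by copying truncated left-channel clicks into a silent track, duplicating that track to both channels, and merging it with a recording of walking without clicking:

```python
# Hedged sketch of the control-stimulus construction described above.
import numpy as np
import soundfile as sf

def build_control_clip(walk_only_wav, echo_wav, click_spans, out_wav):
    """click_spans: (onset, offset) sample indices of each click in the
    echolocation recording, with offsets chosen to truncate the main echo."""
    walk, fs = sf.read(walk_only_wav)   # 10 s stereo recording, walking without clicks
    echo, fs2 = sf.read(echo_wav)       # echolocation recording (clicks + echoes)
    assert fs == fs2
    clicks = np.zeros(walk.shape[0])    # empty (silent) mono track
    for onset, offset in click_spans:
        clicks[onset:offset] = echo[onset:offset, 0]  # left channel only, echo truncated
    stereo = np.stack([clicks, clicks], axis=1)       # duplicated channel: no binaural cues
    sf.write(out_wav, walk + stereo, fs)              # merge click train into ambient track
```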
This resulted in 72 different stimuli, i.e., two per direction (3), environment
(2) and expert (3) both with and without echoes (2 × 3 × 2 × 3 × 2 = 72).
During behavioral training and fMRI scanning, each expert was presented with
his own clicks and clicks from another expert. The sighted and blind novices
heard the clicks from two different experts (BN1, SN1: BE1 and BE2; BN2, SN2: BE1 and BE3; BN3, SN3: BE2 and BE3). This resulted in 48 stimuli for each participant, i.e., two per direction (3), environment (2) and expert (2) for both with and without echoes (2 × 3 × 2 × 2 × 2 = 48). Table 2 lists average acoustic energy and clicking frequencies for each echolocation expert and condition. As an additional control, a silent baseline condition was introduced during the fMRI scanning.

Table 2.
Average acoustic energy of echolocation and control sounds broken down by participants (BE1, BE2, BE3) and condition (indoor vs. outdoor). Numbers in parentheses are standard deviations. ∗ The comparably large difference in average sound level between echo and control conditions for ‘BE1 − outdoor’ (and comparably large SD) is due to variation in background sounds that we could not match perfectly across echo and control conditions for this participant. Note that for all other stimuli, differences in sound intensity between echo and control conditions were below threshold for human listeners (Raab and Taub, 1969)

                 Control             Echo                Clicking
                 Average (dB RMS)    Average (dB RMS)    speed (Hz)
BE1 − indoor     −38.8 (0.2)         −38.7 (1.4)         2.4
BE1 − outdoor    −35.1 (4.8)∗        −31.8 (4.8)∗        2.2
BE2 − indoor     −36.8 (0.1)         −35.6 (0.4)         3.0
BE2 − outdoor    −35.4 (1.5)         −34.4 (1.8)         2.4
BE3 − indoor     −40.5 (0.2)         −40.2 (1.3)         3.8
BE3 − outdoor    −37.2 (1.1)         −36.4 (2.8)         3.5
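Table 2 reports levels in dB RMS. For reference, a minimal sketch of how such a value can be computed from a digital clip (assuming numpy/soundfile; this illustrates the metric, not the authors' analysis code):

```python
# Minimal dB RMS computation for an audio clip, as reported in Table 2.
import numpy as np
import soundfile as sf

def db_rms(path):
    x, _ = sf.read(path)                  # samples scaled to [-1, 1]
    rms = np.sqrt(np.mean(np.square(x)))  # root mean square over all samples/channels
    return 20.0 * np.log10(rms)           # negative values re. full scale, as in Table 2
```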
2.3. Task and Procedure
2.3.1. Training
To become familiar with the task and stimuli, each novice participant received
a circa 60-min training session before the scanning, which took place in a quiet
room at the University of either Gießen or Marburg. Participants were comfortably seated in front of a laptop equipped with MRI compatible stereo in-ear
headphones (Sensimetrics, Model S14, Malden, MA, USA), which were also
used during the scanning task. The headphones are surrounded by cone shaped
foam for noise attenuation and were adjusted in size and shape to fit each participant. In each run, 48 stimuli (see above) were presented in random order
via Presentation (16.1, Neurobehavioral Systems) software. Participants were
instructed to press the appropriate key as soon as they identified the direction
of the corridor as ‘echo left’, ‘echo straight ahead’, ‘echo right’ or ‘no echo’
(control). After each trial, acoustic feedback was given indicating the correct
stimulus. After three to four runs, all participants reached the criterion of 100% correct discrimination of echo versus no echo (irrespective of corridor direction) and at least 65% correct identification of the corridor direction with
echoes. This was followed by one to two runs without feedback to prepare for
the task procedure during scanning.
2.3.2. Functional Paradigm
Before the scanning session, participants performed one training run outside
the scanner. After the training, they were instructed and prepared for scanning
by adjusting earphone position and volume to a comfortable level. In order
to enable them to discriminate subtle auditory differences in the MR environment, the circulatory fan was turned off and participants were equipped with
additional headphones for noise protection. Participants were allowed to try
the four-button response box to which the four responses (‘echo left’, ‘echo
straight ahead’, ‘echo right’ and ‘no echo’) were assigned from left to right,
equivalent to the layout on the laptop keyboard used in the training. They performed the task in the dark inside the scanner while keeping their eyes closed
and wearing a blindfold. All participants were instructed to close their eyes
during scanning. The functional paradigm consisted of six runs (each lasting
about 10 min) with 36 active and 10 silent baseline trials each. The four conditions, Echo_Source1, noEcho_Source1, Echo_Source2, and noEcho_Source2,
were counterbalanced (latin square design) across four different clusters. Each
cluster contained four trials (one of each condition) and combined them in
a different order. Per functional run, nine clusters were presented with one
silent baseline trial preceding and following each cluster, as illustrated in
Fig. 2. Recording environment (indoor/outdoor) and direction (left, straight,
right) categories were distributed equally across and within stimulus conditions. The sparse-sampling design resulted in a 2 s scan, followed by a 10 s scanning pause in which, after a 0.5 s pause, the stimulus was presented for 9 s. The onset of the next scan after another 0.5 s pause cued the participant to provide their response via button-press. Training, experimental setup and scanning took about 120 min.

Figure 2. Exemplary overview of a single run during the experiment. Each participant performed six runs. Each run was split into nine clusters separated by silent baseline trials. Each cluster contained combinations of echo vs. no echo trials and source (i.e., the expert with whom the recording had been made). Path directions and indoor/outdoor environments were presented in pseudo-random order within each run. Note that the displayed example only shows a subset of all possible stimulus conditions.
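To make the trial structure concrete, here is a hypothetical sketch of how one run could be assembled under this design. The latin square yields four counterbalanced cluster orders; how the remaining five of the nine clusters were ordered is not specified in the text, so randomization is assumed here:

```python
# Sketch of one run: nine clusters of the four conditions, each cluster
# preceded and followed by a silent baseline (SB) trial (boundaries shared),
# giving 36 active + 10 silent baseline trials.
import random

CONDITIONS = ["Echo_Source1", "noEcho_Source1", "Echo_Source2", "noEcho_Source2"]

def build_run(seed=0):
    rng = random.Random(seed)
    # 4x4 cyclic latin square over the four conditions (four cluster orders)
    square = [CONDITIONS[i:] + CONDITIONS[:i] for i in range(4)]
    # remaining five cluster orders: assumption, randomly shuffled
    orders = square + [rng.sample(CONDITIONS, 4) for _ in range(5)]
    trials = ["SB"]
    for cluster in orders:
        trials.extend(cluster)
        trials.append("SB")
    return trials

print(build_run())
```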
2.3.3. Imaging Parameters
Imaging was performed at the Bender Institute of Neuroimaging (BION) at
Gießen University on a 1.5 Tesla scanner (Symphony Quantum; Siemens,
Erlangen, Germany) with a quantum gradient system and a standard singlechannel head coil. A gradient-echo field map was measured before the functional run to allow later correction for inhomogeneities in the static magnetic
field. Functional imaging was conducted using a T2∗-weighted gradient echo-planar imaging (EPI) sequence in combination with a sparse-sampling design (Hall et al., 1999), with a repetition time (TR) of 12 s (10 s silent gap + 2 s image acquisition) and an echo time (TE) of 43 ms (matrix size: 64 × 64; field of view: 192 mm; flip angle: 90°). In descending order, 24 contiguous axial 5 mm slices of the whole brain were measured with a resolution of 3 × 3 × 5 mm3. We acquired 47 functional volumes for each run. Anatomical images were acquired at a resolution of 1.0 × 1.0 × 1.4 mm3 using a T1-weighted magnetization-prepared rapid-acquisition gradient echo (MPRAGE) sequence (matrix size: 256 × 180; field of view: 250 mm; TE: 4.18 ms; TR: 1990 ms). Scanning time
in total was approximately 75 min.
2.3.4. Preprocessing
Functional MRI data were preprocessed and analyzed using the FMRIB Software Library (FSL version 5; Jenkinson et al., 2012, www.fmrib.ox.ac.uk/fsl).
Only runs with more than 50% correct responses were included in the MRI
analysis, leading to the exclusion of two sessions (BE1 run 5, SN2 run 5). The
first volume of each run was always a silent baseline trial and was removed from
further analysis. EPI volumes were corrected for B0 field inhomogeneities
using individual field maps recorded in each run. Motion correction was performed using FSL’s MCFLIRT with the middle volume as reference volume
(Jenkinson et al., 2002). Additionally, we used a custom-made FSL tool to
check for motion-related outlier volumes by calculating the mean squared difference to the respective adjacent volumes. No participant had to be excluded
due to motion artifacts. EPI volumes were corrected for differences in slice
acquisition time, and a high-pass filter cutoff of 360 s was applied to remove
slow linear trends from the data. Functional images were then coregistered
onto the high-resolution anatomical scan through boundary-based registration
(BBR; Greve and Fischl, 2009) using the FSL FLIRT tool. Subsequently, all
images were coregistered onto the MNI152 standard space template image at
2 mm resolution using linear (12 degrees of freedom) and additional non-linear
transformations (FSL FNIRT). Finally, spatial smoothing was applied using a
7 mm full width at half maximum (FWHM) Gaussian kernel.
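These steps map onto standard FSL command-line tools. The sketch below shows roughly how such a chain could be scripted; flags and file names are illustrative, not the authors' exact pipeline, and the field-map (B0) and slice-timing corrections are omitted for brevity. Note that the high-pass sigma for fslmaths -bptf is expressed in volumes: 360 s / (2 × 12 s TR) = 15.

```python
# Rough, assumption-laden sketch of the preprocessing chain using FSL CLI tools.
import subprocess

def run(cmd):
    print(" ".join(cmd))
    subprocess.run(cmd, check=True)

# motion correction (MCFLIRT defaults to the middle volume as reference)
run(["mcflirt", "-in", "run1", "-out", "run1_mc", "-plots"])
# high-pass filter: cutoff 360 s at TR = 12 s -> sigma = 15 volumes
run(["fslmaths", "run1_mc", "-bptf", "15", "-1", "run1_hp"])
# boundary-based registration of the EPI to the anatomical scan
run(["epi_reg", "--epi=run1_hp", "--t1=anat", "--t1brain=anat_brain",
     "--out=run1_bbr"])
# linear (12 DoF) plus nonlinear normalization to MNI152 2 mm space
run(["flirt", "-in", "anat_brain", "-ref", "MNI152_T1_2mm_brain",
     "-dof", "12", "-omat", "anat2mni.mat"])
run(["fnirt", "--in=anat", "--aff=anat2mni.mat",
     "--config=T1_2_MNI152_2mm", "--iout=anat_mni"])
# spatial smoothing: FWHM 7 mm -> sigma = 7 / 2.3548 ~ 2.97 mm
run(["fslmaths", "run1_bbr", "-s", "2.97", "run1_smooth"])
```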
2.4. Statistical Analysis
2.4.1. Behavioral Data
Behavioral response data were analyzed by calculating the percentage of correct responses for participants’ judgements about whether an echo was present
or not present (regardless of direction), as well as for judgements about direction within the stimuli that contained echoes. Due to technical problems,
participant BE1’s key press responses were not correctly recorded and had to
be excluded from behavioral data analyses. Trials without any response were
also dropped from further analyses (average: 4.3%; blind experts: 2.5%, blind
novices: 4.0%, sighted novices: 5.9%). The percentage of correct responses
was then compared to chance performance (echo detection: 50%, direction
discrimination: 33%) using binomial tests.
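For illustration, such comparisons against chance can be run with scipy's binomial test; the counts below are hypothetical, not the reported data:

```python
# Hedged example of the chance-level comparisons described above.
from scipy.stats import binomtest

# hypothetical: 45 of 48 echo-detection responses correct, chance = 50%
print(binomtest(45, n=48, p=0.50, alternative="greater"))
# hypothetical: 40 of 112 direction judgments correct, chance = 33%
print(binomtest(40, n=112, p=1/3, alternative="greater"))
```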
2.4.2. MRI Data
Statistical fMRI analysis of each separate run was carried out using FEAT
(FMRI Expert Analysis Tool) Version 6.00, part of FSL version 5.0 (Jenkinson et al., 2012). Analyses were based on a least-squares estimate using a
General Linear Model (GLM) for each run. Four regressors of interest were
specified for the conditions Echo_Source1, noEcho_Source1, Echo_Source2,
and noEcho_Source2. Silent baseline (SB) trials were not explicitly modelled,
thus serving as implicit model baseline. Due to the sparse sampling design,
regressors were not convolved with a template HRF, but rather defined as a
Boxcar function spanning the whole 2 s volume acquired after each stimulus.
The six motion parameters from MCFLIRT 6 DoF motion correction were
added to the GLM as regressors of no interest.
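A minimal sketch of such a design matrix, assuming one condition label per acquired volume (the helper name and data layout are hypothetical, for illustration only):

```python
# Illustrative design matrix for the sparse-sampling GLM: unconvolved boxcar
# regressors spanning each acquired 2 s volume, plus motion nuisance regressors.
import numpy as np

CONDS = ["Echo_Source1", "noEcho_Source1", "Echo_Source2", "noEcho_Source2"]

def design_matrix(trial_labels, motion_params):
    """trial_labels: one label per acquired volume ('SB' for silent baseline).
    motion_params: (n_volumes, 6) array from MCFLIRT."""
    X = np.zeros((len(trial_labels), len(CONDS)))
    for i, label in enumerate(trial_labels):
        if label != "SB":                   # silent baseline stays unmodelled
            X[i, CONDS.index(label)] = 1.0  # boxcar over the whole volume
    return np.hstack([X, motion_params])    # motion as regressors of no interest
```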
In a second level analysis, functional data from all six runs of each participant were coregistered and normalized to MNI standard space at 2 mm
resolution using FLIRT. Single-participant activations across all runs were
calculated by fitting a random-effects (RFX) GLM using FSL FLAME1. Additionally, an overall RFX GLM was fit to all recorded functional runs across
participants, allowing for the detection of activations common to all participants. For the RFX analysis across all nine participants, data within each
participant was treated as a fixed effects model. Contrasts were defined for
the effect of sound source (own > foreign click sounds, and vice versa), of
spatial echoes by comparing sounds that included echoes to control sounds
without echoes (echo > no echo) and of all sounds, contrasting sound trials
against the silent baseline (sounds > baseline). RFX fMRI results were corrected for multiple comparisons by applying Gaussian Random Field Theory
at the cluster level using z > 2.3 (z > 3.7 for the global analysis) and a cluster
probability threshold of p < 0.05 (p < 0.01 for the global analysis). To define common areas for echo-related activation in each group (BE, BN, SN),
we took RFX activation maps resulting from the echo > no echo contrast in
each participant and used these to calculate logical overlapping regions across
all three participants in each group. For these calculations we adopted a cluster size threshold of 100 contiguous voxels (instead of a cluster probability
threshold of p < 0.05) for all participants.
Labeling of activated areas was done using the Jülich Histological Cyto-Architectonic Atlas (Eickhoff et al., 2007) where possible; otherwise the Harvard-Oxford Subcortical Structural Atlas was used to assign labels to structures (Desikan et al., 2006).
3. Results
3.1. Behavioral Data
Figure 3A displays the percentage of correct responses for echo detection (regardless of direction). All participants successfully judged stimuli with echo as
echo sounds, as well as those without echoes as control sounds (overall mean:
96.8% ± 5.5% correct responses). Thus, participants were able to discriminate
echo from control stimuli. Binomial tests indicated that all participants’ responses were significantly above the 50% chance level, regardless of whether echoes were present or absent (all p < 0.001).

Figure 3. (A) Percentage of correct responses for stimuli with echoes (black bars) and without echoes (grey bars) in terms of echo detection, regardless of direction. The dotted line illustrates the chance level of 50%. (B) Percentage of correct responses for participants’ judgments of direction on those stimuli which contained echoes. The dotted line indicates the chance level of 33%. Stars mark results which were significantly above chance level, while (∗) marks a trend of p = 0.079. Error bars show ±1 standard error in both plots.
When participants discriminated path directions in trials which contained
echoes, performance was lower than for simple detection of echoes as illustrated in Fig. 3B. On average, directions were judged correctly in 36.8 ±
14.6% of all trials, with comparable mean performance for blind experts
(40.1%) and blind novices (39.2%) but lower performance for the sighted
novices (34.2%). Binomial tests showed significant above-chance performance in one blind expert (BE2: 41.1%, p = 0.048) and a trend in the other
(BE3: 40.0%, p = 0.079). When we excluded the first run of each expert,
BE3’s performance was also significantly better than chance (BE3: 43.7%,
p = 0.024) indicating a possible effect of training or familiarization. Such an
improvement was not present in any of the novices. Surprisingly, one of the
blind novices also performed significantly better than chance (BN3: 50.0%, p < 0.001). All other participants did not differ from chance level (BN1:
32.1%, p = 0.616; BN2: 35.7%, p = 0.318; SN1: 30.8%, p = 0.711; SN2:
39.3%, p = 0.103; SN3: 32.7%, p = 0.562).
3.2. Functional Imaging Data
3.2.1. Sound vs. Silence
We first tested whether the processing of sound stimuli depended on the person
who produced the click-sounds. The sound-source contrasts which compared
between the two different sound sources for each participant (own vs. foreign
clicks for BE, sound source 1 vs. 2 for BN and SN) did not show any differential activation between the two sources. The respective trials were therefore
pooled for further analyses and all reported activations are based on both sound
sources.
Activations resulting from both types of echolocation stimuli (clicks with
echoes present and clicks with echoes removed) compared to silent baseline
trials (sounds vs. baseline contrast) as assessed using RFX GLM across all
nine participants are shown in Fig. 4. It is evident that the global GLM analysis
based on all participants revealed activation in right and left primary auditory
cortices (for more details see online Supplementary Table S1). A breakdown
for each group and participant separately is shown in Fig. 5. Consistent with
the global GLM result, for this contrast we found bilateral activations in primary auditory cortex in all nine participants.

Figure 4. Global RFX GLM activations for the contrast sounds vs. silent baseline, overlaid on the MNI-Colin27 brain template (data shown in neurological convention, i.e., Right-is-Right). Shown activations are significant using a cluster-level threshold of z > 3.7 and a cluster probability threshold of p < 0.01.

Figure 5. Activations for all three participants in each group for the contrast sounds vs. silent baseline, overlaid on the MNI-Colin27 brain template (data shown in neurological convention, i.e., Right-is-Right). Shown activations are significant using a cluster-level threshold of z > 2.3 (for participant BN2, no cluster-level correction was applied due to generally low activations) and a cluster probability threshold of p < 0.05.

3.2.2. Echo vs. Control
In order to determine activations associated specifically with processing of path direction, we examined the echo vs. no echo contrast, which compared BOLD activity during listening to echolocation stimuli with clicks and echoes to BOLD activity during listening to control stimuli where echoes were absent. Please note that even though participants were not very accurate at judging
path direction (compare Fig. 3B), they were nearly perfect at judging when an
echo had been present or not (compare Fig. 3A). Importantly, the response
whether an echo was present or not was always tied to a direction judgment
(‘echo left’, ‘echo straight ahead’, ‘echo right’). Thus, participants engaged
in path direction judgments in echo conditions; in contrast to the control condition where responses were not tied to a direction judgment (‘no echo’). In
fact, upon questioning after scanning, participants said that they had tried to
determine the direction of the path when they had listened to what they felt
were echo stimuli, but that they had found the task difficult. Global GLM
RFX analysis showed activation in all participants in right Premotor Cortex
(PMC, BA6), right IFC (BA44) and right PPC (i.e., SPL and IPL) (Fig. 6).
Just as for the contrast sound vs. silence, the contrast echo vs. no echo also revealed bilateral activations in auditory cortices. However, for the contrast echo vs. no echo these activations are more superior/posterior, and also comprise the planum temporale (for more details see online Supplementary Table S2). Since individual participant analyses revealed that activations were more consistent within than between groups, we present results below separately for each group.

Figure 6. Global RFX GLM activations for the contrast echo vs. no echo, overlaid on the MNI-Colin27 brain template (data shown in neurological convention, i.e., Right-is-Right). Shown activations are significant using a cluster-level threshold of z > 3.7 and a cluster probability threshold of p < 0.01.

Figure 7 displays BOLD activations for the echo vs. no echo contrast for each participant according to the experimental groups. Detailed cluster-level results for each participant are shown in Table 3.

Figure 7. Activations for all three participants in each group for the contrast echo vs. no echo, overlaid on the MNI-Colin27 brain template (data shown in neurological convention, i.e., Right-is-Right). Displayed activations are significant using a cluster-level threshold of z > 2.3 and a cluster probability threshold of p < 0.05.
Table 3.
Activations found in all individual subjects for the echo > no echo contrast. Results are cluster-level corrected at z > 2.3 with a cluster probability threshold of p < 0.05. Z-values reference peak activations within each cluster; corresponding peak coordinates are reported in MNI space (mm)

Subject  Voxels  p          zmax  X    Y    Z    L/R  Area(s)
BE1      6259    7.09e−26   3.30  2    −76  40   R∗   Superior parietal lobule / Visual cortex V1, BA17; V2, BA18
BE1      2578    1.31e−13   3.52  54   18   26   R    Inferior parietal lobule / Inferior frontal gyrus / Premotor cortex
BE1      1860    1.24e−10   3.32  50   −72  14   R    Inferior parietal lobule / Visual cortex V4
BE1      1607    1.69e−09   3.29  −32  24   −4   L    Premotor cortex / Inferior frontal gyrus / Orbito-frontal cortex
BE1      558     0.0007     3.51  0    8    50   R    Superior frontal cortex / Premotor cortex
BE1      529     0.00109    2.88  32   −56  44   R    Superior parietal lobule / Anterior intra-parietal sulcus hIP1, hIP2, hIP3
BE1      517     0.00131    3.36  38   18   2    R    Inferior frontal gyrus / Insular cortex
BE1      501     0.00168    2.84  4    −76  −18  R    Cerebellum
BE1      418     0.00637    2.89  −30  −68  −22  L    Cerebellum
BE1      359     0.0173     2.91  −52  −24  10   L    Primary auditory cortex / Insular cortex
BE2      1838    1.12e−10   3.33  14   −72  54   R∗   Superior parietal lobule / Inferior parietal lobule
BE2      1183    1.5e−07    3.06  60   −30  28   R    Inferior parietal lobule / Superior parietal lobule
BE2      770     2.86e−05   3.41  −56  −16  2    L    Primary auditory cortex / Inferior parietal lobule
BE2      514     0.0012     3.24  54   −2   48   R    Premotor cortex
BE2      464     0.00267    3.26  24   −6   62   R    Premotor cortex
BE2      343     0.0208     3.15  −36  −2   56   L    Premotor cortex
BE2      300     0.0451     3.08  −58  −64  8    L    Lateral occipital cortex
BE3      1664    8.63e−08   3.21  32   −78  18   R    Superior parietal lobule / Visual cortex V2, BA18; V1, BA17
BE3      761     0.000495   3.22  56   20   30   R    Inferior frontal gyrus
BE3      485     0.0131     3.44  54   −42  14   R    Primary auditory cortex
BE3      387     0.0475     3.46  54   8    48   R    Inferior frontal gyrus
BN1      7063    1.30e−24   3.55  10   −96  18   R∗   Visual cortex V1, BA17; V2, BA18; V4 / Superior parietal lobule / Inferior parietal lobule / Anterior intra-parietal sulcus hIP1
BN1      2378    3.95e−11   3.60  58   18   30   R    Inferior frontal gyrus / Premotor cortex
BN1      1909    2.08e−09   3.22  −34  −64  54   L    Superior parietal lobule / Inferior parietal lobule
BN1      561     0.00259    3.41  68   −34  14   R    Inferior parietal lobule
BN1      429     0.0159     2.97  −34  −72  −16  L    Visual cortex V4
BN2      941     0.000237   3.31  46   −48  60   R    Inferior parietal lobule / Anterior intra-parietal sulcus hIP1, hIP3 / Superior parietal lobule
BN2      559     0.0111     3.32  52   6    42   R    Premotor cortex
BN2      463     0.0329     3.09  16   −80  16   R    Visual cortex V1, BA17; V2, BA18
BN2      449     0.0387     2.99  14   −74  54   R    Superior parietal lobule
BN3      8370    5.15e−29   3.87  30   −46  44   R    Superior parietal lobule / Inferior parietal lobule / Anterior intra-parietal sulcus hIP3 / Premotor cortex / Inferior frontal gyrus
BN3      1312    1.8e−09    3.54  −36  −46  38   L    Superior parietal lobule / Anterior intra-parietal sulcus hIP1
BN3      433     0.0107     3.29  36   −86  16   R    Visual cortex BA19
SN1      2291    1.15e−13   3.33  52   12   44   R    Inferior parietal lobule / Premotor cortex / Inferior frontal gyrus
SN1      1485    7.9e−10    3.40  −42  54   −6   L    Middle frontal gyrus / Frontal pole
SN1      895     1.5e−06    3.23  −44  6    24   L    Inferior frontal gyrus
SN1      818     4.48e−06   3.14  38   −50  48   R    Superior parietal lobule / Anterior intra-parietal sulcus hIP3
SN1      817     4.55e−06   3.34  −36  −58  44   L    Inferior parietal lobule / Superior parietal lobule / Anterior intra-parietal sulcus hIP1, hIP3
SN1      668     4.2e−05    3.22  16   −70  58   R    Superior parietal lobule
SN1      644     6.1e−05    3.34  −6   −82  −24  L    Cerebellum
SN1      583     0.000161   3.33  40   58   −2   R    Frontal pole
SN1      533     0.000365   3.41  0    34   44   R    Superior frontal gyrus
SN1      394     0.00413    3.03  64   −38  10   R    Inferior parietal lobule / Superior temporal gyrus
SN1      304     0.023      3.30  40   28   −4   R    Orbito-frontal cortex
SN1      286     0.033      2.91  −58  −28  12   L    Planum temporale
SN2      5008    5.42e−19   3.65  44   34   30   R    Inferior frontal gyrus / Premotor cortex
SN2      2198    2.38e−10   3.56  24   −76  54   R    Superior parietal lobule / Anterior intra-parietal sulcus hIP1, hIP3 / Inferior parietal lobule
SN2      1875    3.69e−09   3.42  −34  −44  40   L    Anterior intra-parietal sulcus hIP1, hIP2 / Primary somatosensory cortex / Inferior parietal lobule
SN2      1382    3.35e−07   3.31  −36  40   30   L    Inferior frontal gyrus
SN2      671     0.000699   3.48  −48  40   −14  L    Frontal pole
SN2      589     0.00195    3.37  −56  −14  6    L    Primary auditory cortex
SN2      553     0.00311    2.98  32   −44  46   R    Superior parietal lobule / Anterior intra-parietal sulcus hIP3 / Primary somatosensory cortex
SN2      471     0.00934    3.41  68   −20  0    R    Superior temporal gyrus
SN2      467     0.00986    3.28  30   0    56   R    Middle frontal gyrus
SN2      447     0.013      3.18  −12  −70  64   L    Superior parietal lobule
SN3      3306    3.7e−14    3.68  56   6    42   R    Premotor cortex
SN3      2531    1.31e−11   3.64  34   −40  40   R    Anterior intra-parietal sulcus hIP2, hIP3 / Inferior parietal lobule
SN3      1448    1.52e−07   3.53  −30  −52  42   L    Superior parietal lobule / Anterior intra-parietal sulcus hIP2, hIP3 / Inferior parietal lobule / Primary somatosensory cortex
SN3      1118    4.08e−06   3.79  −44  0    36   L    Premotor cortex / Inferior frontal gyrus
SN3      937     2.84e−06   3.37  64   8    8    R    Inferior frontal gyrus
SN3      783     0.000164   3.62  8    20   36   R    Premotor cortex
SN3      514     0.00474    3.43  68   −28  18   R    Primary auditory cortex

∗ Even though peak voxels are located in one hemisphere, clusters also extend into the other hemisphere.
In the blind expert echolocators, the most prominent activation was found in the SPL. All three experts
showed right-hemispheric SPL activity; BE1 and BE2 additionally activated
the left SPL. All three experts also showed activation in right PMC/IFC. BE1
and BE3 displayed activation in right primary visual cortex (BA17/18), and
BE1 and BE2 showed additional bilateral IPL activations. We observed a similar activation pattern in the blind novice group. BN1 and BN2 showed activation in
right V1. All three blind novice participants showed activation in right SPL,
comparable to the blind experts. Parietal activations in the BN group extended
further into the IPL/IPS as compared to the BE group. Furthermore, we observed right ventral PMC/IFC activations in all blind novices. The sighted
novice participants also showed activation in right ventral PMC/IFC. In contrast to the blind participants, even though they did show activations in right
SPL, their parietal activation was more bilateral and as a whole located more inferiorly, extending into the IPL and adjacent aIPS. Additionally, activation of the
left ventral IFC (BA44 and 45/Broca’s area) was found in all sighted participants, which was absent in the blind expert and blind novice groups.
In order to better qualify which activations were consistent within the
groups, we overlaid z-statistic maps of all three participants in each group,
and identified all clusters that were above threshold. Data used for participants’ individual maps are essentially those on which Fig. 7 and Table 3 are
based, with the exception that instead of using a cluster probability threshold
of p < 0.05, we adopted a minimum cluster size threshold of 100 contiguous
voxels for individual participants’ maps (compare also Section 2.4.2).
The overlapping clusters in each experimental group are reported in Table 4, sorted by the number of overlapping voxels. In both blind expert and blind novice participants, the only brain areas where activation overlapped across all three participants were right IFC/PMC and right SPL. Most notably, activation also overlapped in the same area in the sighted group. Furthermore, the sighted group also showed activation overlap in right IFC, showing the largest cluster there and in the IPL and IPS. The left-hemispheric IFC activation and additional frontal activations in middle frontal gyrus which were unique to the sighted novice group spatially overlapped in all three SN participants.

Table 4.
Areas of overlapping activations within each group, reported as contiguous clusters of >100 voxels. Coordinates are MNI coordinates in mm for the center of gravity (COG) of each cluster

Group  Voxels  X    Y    Z   L/R  BA    Area(s)
BE     1448    51   5    47  R    44/6  Premotor cortex/Inferior frontal cortex
BE     138     13   −78  52  R    7     Superior parietal lobule
BE     104     9    −83  45  R    7     Superior parietal lobule
BN     1272    15   −74  45  R    7     Superior parietal lobule
BN     972     36   −48  52  R    7/40  Superior parietal lobule/Anterior intra-parietal sulcus
BN     206     49   11   39  R    44/6  Inferior frontal cortex/Premotor cortex
SN     4428    46   23   32  R    45    Inferior frontal cortex
SN     2063    −36  −43  45  L    40    Anterior intra-parietal sulcus/Inferior parietal lobule
SN     1069    −45  8    30  L    44    Inferior frontal cortex
SN     1019    38   −42  47  R    40    Anterior intra-parietal sulcus/Inferior parietal lobule
SN     1010    40   −52  51  R    40    Anterior intra-parietal sulcus/Inferior parietal lobule
SN     360     11   −72  53  R    7     Superior parietal lobule
SN     349     31   9    61  R    9     Middle frontal gyrus
SN     270     34   28   1   R    11    Orbito-frontal gyrus
SN     108     66   −29  14  R    40    Inferior parietal lobule/Superior temporal gyrus
In sum, the analysis investigating groups separately highlights the involvement of right IFC/PMC and right SPL for BE, BN and SN. For BE and BN
it also highlights involvement of right V1 (four out of six BE and BN participants), and for SN participants the involvement of a more bilateral SPL/IPL
network, left IFC/PMC and additional frontal areas. Overall, this pattern of
results is consistent with results from the global GLM RFX analysis for this
contrast, but pinpoints areas of activation in parietal cortex and PMC more
precisely.
4. Discussion
We investigated the neural correlates of echolocation in blind human echolocation experts as well as in blind and sighted novices, using a spatial path direction detection task based on click-echoes recorded in a naturalistic setting. Participants heard
click-echo stimuli from one of two expert echolocators and had to determine
the direction in which a path continued (left, straight ahead, right). On the
behavioral level we found that all three groups were very good at detecting
echoes, but only the blind experts and one of the blind novices were better
than chance at deciding in which direction the path went. In regard to brain
activity as measured with fMRI we found that all participants showed higher
activation in the right IFC/PMC (BA6, 44 and 45) when listening to echoes as
compared to control sounds without echoes. In addition, there was an increase
of activity in the right SPL in each participant. While in the blind experts and blind novices this activation was primarily located in the SPL, in sighted participants it spread widely into the IPS and IPL of both hemispheres.
Moreover, additional activations in the left IFC (BA44 and 45) and superior
and middle frontal areas were found only in sighted participants.
4.1. Behavioral Performance
All participants, blind and sighted alike, were able to decide between echo and
control sounds with very high accuracy. This is in line with previous studies
showing that sighted people can easily learn to dissociate between click sounds
with and without echo (e.g., Thaler et al., 2011). It is important to note that
blind and sighted novices received training in the echo detection and direction
detection task before participating in the fMRI experiment, while the blind
expert echolocators received no such training. The higher performance of the
BE group without much familiarization with the sounds is therefore indicative
of their experienced use of click-echo sounds. However, one of the blind experts (BE2) showed comparably low performance in the echo detection task,
but only when classifying control sounds without echoes (Fig. 3A). To further
investigate this finding, we looked at his performance across scanning sessions
and found that he responded at chance for control sounds in the very first run
and then consistently improved in performance up to above 90% in the last
run. The discrepancy between echo and control sounds for BE2 might be due
to the artificial nature of the control stimuli. Therefore, even blind echolocation experts may need training or familiarization with unfamiliar sounds
before reaching optimal discrimination performance. However, none of the
nine subjects showed any trend in performance across scanning sessions when
discriminating path directions, indicating that a possible familiarization effect does not necessarily influence further spatial processing of auditory stimuli.
While participants were very good at dissociating echo-sounds from control sounds, the task of detecting path directions from echo stimuli proved to
be hard. As expected, the blind echolocation experts achieved above-chance
classification performance in the MRI experiment; however, one blind novice also performed better than chance. In general, direction detection accuracy was surprisingly low in the blind experts, although they had been able to tell path direction with a high success rate, and generally found the task easy, when they walked through the corridor setup while recording the stimuli and when screening the stimuli via headphones (compare Section 2.2.2). The low performance in the direction detection task during scanning was possibly caused by additional environmental sounds overlaid on the echoes, a consequence of recording in real-world settings, or by the unfamiliar MR environment, which might have distracted from the task.
Nonetheless, the marked difference in judgments between echo and no echo
conditions clearly shows that all participants engaged in the task during scanning. Specifically, the response whether an echo was present or not was always
tied to a direction judgment (‘echo left’, ‘echo straight ahead’, ‘echo right’).
Thus, even though participants were not accurate at judging path direction,
they nevertheless engaged in path direction judgments in echo conditions.
Upon questioning after scanning participants also said that they had tried to
determine the direction of the path when they had listened to what they felt
were echo stimuli. In contrast, since in control conditions responses were not
tied to a direction judgment (‘no echo’), participants did not engage in direction judgments in control conditions. Thus, the high accuracy in ‘echo left’,
‘echo right’ and ‘echo straight ahead’ vs. ‘no echo’ judgments behaviorally
validates our comparison of brain activity between echo and no echo conditions, even though accuracy of ‘echo left’, ‘echo right’ and ‘echo straight
ahead’ answers when evaluated by direction was low.
4.2. Interpretation of Activations in Parietal Cortex
We found that all nine subjects showed an increase in activation in right SPL
while they performed the path direction detection task as compared to the control condition. Similar activations have been reported in a study where blind
and blindfolded sighted subjects navigated a 2D virtual pathway using an electrotactile Tongue Display Unit, suggesting that the SPL is part of a navigation and/or route-recognition network (Kupers et al., 2010). Importantly, in that
study SPL was not only active during tactile route navigation but also when
sighted control subjects executed the same task with full vision, suggesting
that parietal brain areas involved in navigation using vision can be recruited
by other modalities in the blind. Our findings support and extend the results by
Kupers et al. (2010) by showing that the SPL is also involved in spatial navigation based on echo sounds in blind and sighted people, highlighting its function in multisensory spatial navigation.
Within the PPC, the blind experts and blind novices mainly activated the bilateral SPL (overlapping only in the right hemisphere) while activation in the
sighted novices was more widespread and centered in the bilateral IPS and IPL
extending into the SPL. In the well-known visual pathway model by Goodale
and Milner (1992), the PPC is seen as a structure of the dorsal visual pathway which is involved in visual spatial localization for the guidance of action.
These functions, however, were explicitly assigned to the SPL, leaving the role of the IPL largely unclear. The authors speculate that the IPL may subserve perceptual awareness by transforming information from both the dorsal and the
ventral pathway (Milner and Goodale, 1995). A later model by Rizzolatti and
Matelli (2003) extended the dorsal pathway and proposed two sub-streams,
a dorso-dorsal (d-d) stream projecting to the SPL and a ventro-dorsal (v-d) stream projecting to the IPL, including the anterior IPS. The
and causes Optic Ataxia after damage. The v-d stream is suggested to play
a crucial role in both perception and action and engages in high-level spatial
and motor functions. In contrast to the SPL where those functions seem to
be equally distributed across both hemispheres, the IPL shows a clear hemispheric difference: the right IPL is involved in space perception and action and
the left IPL engages in action organization, necessary for object manipulation,
grasping and tool use, and even in cognitive tasks, such as action recognition from preceding motor knowledge. Thus, lesions to the right v-d stream
lead to Neglect while lesions to the left v-d stream cause Limb Apraxia. Our
results show that the blind participants mainly activated the d-d stream bilaterally while the sighted participants relied also on the bilateral v-d stream. In
the context of this model, this may imply different task strategies depending
on vision. Blind subjects, in particular blind expert echolocators, may have
accessed on-line mechanisms of action control to ‘automatically’ assign directional meaning to the echoes, without having to consciously process the
click-echoes. Sighted participants, on the other hand, may have applied more
conscious, high-level spatial processes as they were untrained and thus unable
to automatically decode complex echo information, such as spatial directions.
The observed activation in the IPL is suggestive of the idea that sighted participants engaged a more cognitive route, possibly by retrieving memories of
sounds presented during training and their associated directions and comparing them to the current stimulus. In support of this assumption, the right IPL
has previously been found to mediate auditory working memory for monitoring and updating sound locations independent of motor acts (Alain et al.,
2008).
As mentioned in the Introduction, not only is visual processing split along
dorsal and ventral routes, but parietal cortex has also been implicated in
a dual-stream model of auditory processing. According to this model, there
is a dorsal ‘where’ and a ventral ‘what’ stream within the auditory system, with a stronger focus on spatial processing for action/sensorimotor control
along the dorsal pathway, which has its nodal point in the IPL, with a right-hemispheric preference, and further projections to the IFC (Kaas and Hackett,
1999; Rauschecker, 2011; Rauschecker and Tian, 2000). Since we did not include visual or regular ‘source’ hearing conditions in our study, we are unable
to determine to what degree the parietal areas we identified for processing of path
direction with echolocation map onto visual or auditory dorsal pathways. Future research is needed to address this issue.
4.3. Interpretation of Activations in Prefrontal Cortex
Sighted participants showed additional activations in superior frontal and middle frontal brain areas, which were absent in both blind groups. Together with
the activations we found in the left IPL and IPS in the sighted, these areas form
a parietofrontal circuit processing conceptual knowledge and the pragmatics
of action, also known as the ‘acting with’ system (Johnson and Grafton, 2003).
This is consistent with our suggestion that sighted people relied more strongly on
high-level spatial functions and the recognition of spatial memories. Similar findings were obtained in a study in which early blind and sighted people
learned to determine distance based on an ultrasound-based sensory substitution device, where sighted people showed stronger frontal activations
(Chan et al., 2012). Moreover, in the above-mentioned electrotactile navigation study, Kupers et al. (2010) demonstrated activations in frontal areas in
sighted participants that were not seen in the blind, and argued for the use of cognitive
strategies, such as decision making, in the sighted. Since the parietofrontal circuit has also been associated with spatial working memory (Silk et al., 2010),
this may underline the possibility that our sighted subjects reactivated and
maintained memory representations acquired during the training. However,
the lack of hippocampal and parahippocampal activations in our study makes
the involvement of spatial memory unlikely (see also the next section).
In sum, our results suggest that sighted participants used a different strategy
to resolve the direction detection task based on click-echoes than did the
blind echolocation experts and blind novices.
4.4. Absence of Activation in Hippocampus or Parahippocampus
The hippocampus has been implicated in spatial memory, for example as relevant
for navigation and route finding (Hartley et al., 2014), and the parahippocampus has been linked to related aspects of cognition, such as scene and route
recognition (Aminoff et al., 2013). Kupers et al. (2010) found that a navigation
and route-recognition task completed with an electrotactile sensory substitution
device led to an increase in activity not only in SPL, but also in parahippocampal cortex in blind people. They also found that this activity overlapped with
activity observed in sighted people performing the task visually. They suggested that the parahippocampal activation can be understood considering that
participants were presented with two routes on each trial and had to decide
which route had been presented previously. Thus, the task had a scene recognition component, likely mediated through parahippocampus. In our current
study, we did not find an increase in activation in parahippocampus (or hippocampus) during path direction detection as compared to control conditions.
This can be understood considering that our task did not contain a scene or
route recognition component like the task used by Kupers et al. (2010). Specifically, our task required online processing of spatial information mediated by
echo information, but there was no requirement to match any path or route to
a path or route traversed previously. Another possible explanation for the lack
of an increase in activation in parahippocampus (or hippocampus) in our study
as compared to Kupers et al. (2010) might be that subjects in Kupers et
al.’s study performed at much higher levels than our subjects and thus were,
on average, perceiving a spatial scene more successfully.
4.5. Activations in Primary Auditory Cortex and Planum Temporale
As expected, the contrast all sound vs. baseline revealed an increase in activation in primary auditory cortex. Unexpectedly, however, we also observed
an increase in activity in primary auditory cortex/planum temporale for the
contrast echo vs. no echo (compare Fig. 6 and Supplementary Table S2). The
activity in primary auditory cortex for this contrast was unexpected because
we had constructed the stimuli so as to minimize differences in acoustic properties between the two conditions, i.e., acoustic properties known
to drive A1, such as frequency or sound pressure level. Furthermore, previous
research using stimuli constructed in a similar way did not find an increase in
activity in primary auditory cortex for the comparison echo vs. no echo (Milne
et al., 2014b; Thaler et al., 2011). Nevertheless, in our study the absence of
echoes in the control stimuli led to a slight drop in sound pressure level in
control stimuli as compared to echo stimuli (compare Table 2), and it is possible that this is responsible for the activity difference we observed in A1.
The echo-related activity in planum temporale can be understood considering
that the planum temporale is involved in binaural perception of sound location
and movement (Arnott et al., 2004; Deouell et al., 2007; Griffiths and Warren,
2002; Krumbholz et al., 2005). Thus, binaural spatial properties of our echo
stimuli are likely to have driven the relative increase in activity in the planum
temporale for the echo vs. no echo contrast. This is consistent with previous
findings showing that echo information can drive activity in the planum temporale (Thaler et al., 2014).
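The kind of sound pressure level comparison summarized in Table 2 can be illustrated with a few lines of Python. The sketch below is a generic level check, not the study's analysis: the WAV file names are hypothetical placeholders, and integer PCM recordings are assumed.

import numpy as np
from scipy.io import wavfile

def rms_db(samples):
    # RMS level in dB relative to digital full scale (assumes integer PCM).
    x = samples.astype(np.float64) / np.iinfo(samples.dtype).max
    return 20.0 * np.log10(np.sqrt(np.mean(x ** 2)))

# Hypothetical file names for one echo and one control recording.
_, echo = wavfile.read("echo_stimulus.wav")
_, control = wavfile.read("control_stimulus.wav")

# A small positive difference would mirror the slight drop in level of the
# control stimuli relative to the echo stimuli described in the text.
print(f"echo minus control level: {rms_db(echo) - rms_db(control):.2f} dB")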
4.6. Occipital vs. Parietal Activations — Comparison to Previous Echolocation Studies
Past research comparing activations between conditions that required processing of an echo and echo-less control conditions has suggested that occipital
brain areas in particular are involved in echo processing in blind echo experts
(Arnott et al., 2013; Thaler et al., 2011, 2014). In the current study,
two out of three BE and two out of three BN showed increased activation in
right BA17/18 for processing echo as compared to control sounds. Nevertheless, the difference in activation between echo and control sounds was mainly
evident in parietal, not occipital areas. The main difference between the current and previous studies investigating spatial echo processing is that previous
studies focused on how spatial locations per se are represented in the blind
brain, with a focus on the perceptual appraisal of the stimulus (Arnott et al.,
2013; Thaler et al., 2011, 2014), whereas the current study required people to
engage in spatial processing as relevant for an action, i.e., locomotion, which is associated with activation of the SPL.
Acknowledgements
This project was supported by the IRTG 1901 ‘The Brain in Action’ and the
DFG grant Fi1567/3-1.
References
Aminoff, E. M., Kveraga, K. and Bar, M. (2013). The role of parahippocampal cortex in cognition, Trends Cogn. Sci. 17, 379–390.
Ammons, C. H., Worchel, P. and Dallenbach, K. M. (1953). “Facial vision”: the perception of
obstacles out of doors by blindfolded and blindfolded-deafened subjects, Am. J. Psychol. 66,
519–553.
Arnott, S. R., Binns, M. A., Grady, C. L. and Alain, C. (2004). Assessing the auditory dual-pathway model in humans, Neuroimage 22, 401–408.
Arnott, S. R., Thaler, L., Milne, J. L., Kish, D. and Goodale, M. A. (2013). Shape-specific
activation of occipital cortex in an early blind echolocation expert, Neuropsychologia 51,
938–949.
Bach-y-Rita, P. and Kercel, S. W. (2003). Sensory substitution and the human–machine interface, Trends Cogn. Sci. 7, 541–546.
Bavelier, D. and Neville, H. J. (2002). Cross-modal plasticity: where and how? Nat. Rev. Neurosci. 3, 443–452.
Brabyn, J. A. (1982). New developments in mobility and orientation aids for the blind, IEEE
Trans. Biomed. Eng. 29, 285–289.
Brown, B. and Brabyn, J. A. (1987). Mobility and low vision: a review, Clin. Exp. Optom. 70,
96–101.
Büchel, C. (1998). Functional neuroimaging studies of braille reading: cross-modal reorganization and its implications, Brain 121, 1193–1194.
Burton, H. (2003). Visual cortex activity in early and late blind people, J. Neurosci. 23, 4005–
4011.
Carlson-Smith, C. and Wiener, W. R. (1996). The auditory skills necessary for echolocation:
a new explanation, J. Vis. Impair. Blind. 90, 21–35.
Chan, C. C. H., Wong, A. W. K., Ting, K.-H., Whitfield-Gabrieli, S., He, J. and Lee, T. M. C.
(2012). Cross auditory–spatial learning in early-blind individuals, Hum. Brain Mapp. 33,
2714–2727.
Alain, C., He, Y. and Grady, C. (2008). The contribution of the inferior parietal lobe to auditory
spatial working memory, J. Cogn. Neurosci. 20, 285–295.
Cohen, L. G., Celnik, P., Pascual-Leone, A., Corwell, B., Falz, L., Dambrosia, J., Honda, M.,
Sadato, N., Gerloff, C., Catalá, M. D. and Hallett, M. (1997). Functional relevance of cross-modal plasticity in blind humans, Nature 389, 180–183.
Collignon, O., Lassonde, M., Lepore, F., Bastien, D. and Veraart, C. (2007). Functional cerebral
reorganization for auditory spatial processing and auditory substitution of vision in early
blind subjects, Cereb. Cortex 17, 457–465.
Collignon, O., Davare, M., Olivier, E. and De Volder, A. G. (2009a). Reorganisation of the right
occipito-parietal stream for auditory spatial processing in early blind humans. A transcranial
magnetic stimulation study, Brain Topogr. 21, 232–240.
Collignon, O., Voss, P., Lassonde, M. and Lepore, F. (2009b). Cross-modal plasticity for the
spatial processing of sounds in visually deprived subjects, Exp. Brain Res. 192, 343–358.
Collignon, O., Vandewalle, G., Voss, P., Albouy, G., Charbonneau, G., Lassonde, M. and Lepore, F. (2011). Functional specialization for auditory–spatial processing in the occipital
cortex of congenitally blind humans, Proc. Natl Acad. Sci. USA 108, 4435–4440.
Cotzin, M. and Dallenbach, K. M. (1950). ‘Facial vision’: the role of pitch and loudness in the
perception of obstacles by the blind, Am. J. Psychol. 63, 485–515.
De l'Aune, W. (1992). Low vision mobility problems — perceptions of O-and-M specialists and
persons with low vision, J. Vis. Impair. Blind. 86, 58–62.
Deouell, L. Y., Heller, A. S., Malach, R., D’Esposito, M. and Knight, R. T. (2007). Cerebral
responses to change in spatial location of unattended sounds, Neuron 55, 985–996.
Desikan, R. S., Ségonne, F., Fischl, B., Quinn, B. T., Dickerson, B. C., Blacker, D., Buckner,
R. L., Dale, A. M., Maguire, R. P., Hyman, B. T., Albert, M. S. and Killiany, R. J. (2006).
An automated labeling system for subdividing the human cerebral cortex on MRI scans into
gyral based regions of interest, Neuroimage 31, 968–980.
Eickhoff, S. B., Paus, T., Caspers, S., Grosbras, M.-H., Evans, A. C., Zilles, K. and Amunts, K.
(2007). Assignment of functional activations to probabilistic cytoarchitectonic areas revisited, Neuroimage 36, 511–521.
Goldreich, D. and Kanics, I. M. (2003). Tactile acuity is enhanced in blindness, J. Neurosci. 23,
3439–3445.
Goodale, M. A. and Milner, A. D. (1992). Separate visual pathways for perception and action,
Trends Neurosci. 15, 20–25.
Goodale, M. A., Milner, A. D., Jakobson, L. S. and Carey, D. P. (1991). A neurological dissociation between perceiving objects and grasping them, Nature 349, 154–156.
Gori, M., Sandini, G., Martinoli, C. and Burr, D. C. (2013). Impairment of auditory spatial
localization in congenitally blind human subjects, Brain 137, 288–293.
Gougoux, F., Zatorre, R. J., Lassonde, M., Voss, P. and Lepore, F. (2005). A functional neuroimaging study of sound localization: visual cortex activity predicts performance in early-blind individuals, PLoS Biol. 3, e27. DOI:10.1371/journal.pbio.0030027.
Grant, A. C., Thiagarajah, M. C. and Sathian, K. (2000). Tactile perception in blind Braille
readers: a psychophysical study of acuity and hyperacuity using gratings and dot patterns,
Percept. Psychophys. 62, 301–312.
Greve, D. N. and Fischl, B. (2009). Accurate and robust brain image alignment using boundary-based registration, Neuroimage 48, 63–72.
Griffin, D. R. (1944). Echolocation by blind men, bats and radar, Science 100(2609), 589–590.
Griffiths, T. D. and Warren, J. D. (2002). The planum temporale as a computational hub, Trends
Neurosci. 25, 348–353.
Grunwald, J.-E., Schörnich, S. and Wiegrebe, L. (2004). Classification of natural textures in
echolocation, Proc. Natl Acad. Sci. USA 101, 5670–5674.
Hall, D. A., Haggard, M. P., Akeroyd, M. A., Palmer, A. R., Summerfield, A. Q., Elliott, M. R.,
Gurney, E. M. and Bowtell, R. W. (1999). Sparse temporal sampling in auditory fMRI, Hum.
Brain Mapp. 7, 213–223.
Hartley, T., Lever, C., Burgess, N. and O’Keefe, J. (2014). Space in the brain: how the hippocampal formation supports spatial cognition, Philos. Trans. R. Soc. B Biol. Sci. 369(1635),
20120510. DOI:10.1098/rstb.2012.0510.
Hausfeld, S., Power, R. P., Gorta, A. and Harris, P. (1982). Echo perception of shape and texture
by sighted subjects, Percept. Mot. Skills 55, 623–632.
Heilman, K. M., Valenstein, E. and Watson, R. T. (2000). Neglect and related disorders, Semin.
Neurol. 20, 463–470.
Jenkinson, M., Bannister, P., Brady, M. and Smith, S. (2002). Improved optimization for the
robust and accurate linear registration and motion correction of brain images, Neuroimage
17, 825–841.
Jenkinson, M., Beckmann, C. F., Behrens, T. E. J., Woolrich, M. W. and Smith, S. M. (2012).
FSL, Neuroimage 62, 782–790.
Johnson, S. H. and Grafton, S. T. (2003). From ‘acting on’ to ‘acting with’: the functional
anatomy of object-oriented action schemata, Prog. Brain Res. 142, 127–139.
Kaas, J. H. and Hackett, T. A. (1999). ‘What’ and ‘where’ processing in auditory cortex, Nat.
Neurosci. 2, 1045–1047.
Karnath, H.-O. and Perenin, M.-T. (2005). Cortical control of visually guided reaching: evidence from patients with optic ataxia, Cereb. Cortex 15, 1561–1569.
Kolarik, A. J., Cirstea, S., Pardhan, S. and Moore, B. C. J. (2014). A summary of research
investigating echolocation abilities of blind and sighted humans, Hear. Res. 310, 60–68.
Krumbholz, K., Schönwiesner, M., Rübsamen, R., Zilles, K., Fink, G. R. and von Cramon, D. Y.
(2005). Hierarchical processing of sound location and motion in the human brainstem and
planum temporale, Eur. J. Neurosci. 21, 230–238.
Kupers, R., Chebat, D. R., Madsen, K. H., Paulson, O. B. and Ptito, M. (2010). Neural correlates
of virtual route recognition in congenital blindness, Proc. Natl Acad. Sci. USA 107, 12716–
12721.
Lessard, N., Pare, M., Lepore, F. and Lassonde, M. (1998). Early-blind human subjects localize
sound sources better than sighted subjects, Nature 395(6699), 278–280.
Lingnau, A., Strnad, L., He, C., Fabbri, S., Han, Z., Bi, Y. and Caramazza, A. (2014). Cross-modal plasticity preserves functional specialization in posterior parietal cortex, Cereb. Cortex 24, 541–549.
Long, R. G. (1990). Orientation and mobility research: what is known and what needs to be
known, Peabody J. Educ. 67, 89–109.
Long, R. G., Rieser, J. J. and Hill, E. W. (1990). Mobility in individuals with moderate visual
impairments, J. Vis. Impair. Blind. 84, 111–118.
Merabet, L. B. and Pascual-Leone, A. (2010). Neural reorganization following sensory loss: the
opportunity of change, Nat. Rev. Neurosci. 11, 44–52.
Milne, J. L., Goodale, M. A. and Thaler, L. (2014a). The role of head movements in the discrimination of 2-d shape by blind echolocation experts, Atten. Percept. Psychophys. 76,
1828–1837.
Milne, J. L., Goodale, M. A., Arnott, S. R., Kish, D. and Thaler, L. (2014b). Parahippocampal
cortex is involved in material processing through echolocation in blind echolocation experts,
Vis. Res. DOI:10.1016/j.visres.2014.07.004.
Milner, A. D. and Goodale, M. A. (1995). The Visual Brain in Action. Oxford University Press,
Oxford, UK.
Noppeney, U. (2007). The effects of visual deprivation on functional and structural organization
of the human brain, Neurosci. Biobehav. Rev. 31, 598–609.
Oldfield, R. C. (1971). The assessment and analysis of handedness: the Edinburgh inventory,
Neuropsychologia 9, 97–113.
Pisella, L., Sergio, L., Blangero, A., Torchin, H., Vighetto, A. and Rossetti, Y. (2009). Optic
ataxia and the function of the dorsal stream: contributions to perception and action, Neuropsychologia 47, 3033–3044.
Raab, D. H. and Taub, H. B. (1969). Click intensity discrimination with and without a background masking noise, J. Acoust. Soc. Am. 46, 965–968.
Rauschecker, J. P. (2011). An expanded role for the dorsal auditory pathway in sensorimotor
control and integration, Hear. Res. 271, 16–25.
Rauschecker, J. P. and Tian, B. (2000). Mechanisms and streams for processing of what and
where in auditory cortex, Proc. Natl Acad. Sci. USA 97, 11800–11806.
Rice, C. E. and Feinstein, S. H. (1965). Sonar system of the blind: size discrimination, Science
148, 1107–1108.
Rice, C. E., Feinstein, S. H. and Schusterman, R. J. (1965). Echo-detection ability of the blind:
size and distance factors, J. Exp. Psychol. 70, 246–251.
Rizzolatti, G. and Matelli, M. (2003). Two different streams form the dorsal visual system:
anatomy and functions, Exp. Brain Res. 153, 146–157.
Röder, B. and Rösler, F. (2004). Compensatory plasticity as a consequence of sensory loss,
in: The Handbook of Multisensory Processes, G. Calvert, C. Spence and B. E. Stein (Eds),
pp. 719–747. MIT Press, Cambridge, MA, USA.
Röder, B., Teder-Sälejärvi, W., Sterr, A., Rösler, F., Hillyard, S. A. and Neville, H. J. (1999).
Improved auditory spatial tuning in blind humans, Nature 400, 162–166.
Roentgen, U. R., Gelderblom, G. J., Soede, M. and de Witte, L. P. (2009). The impact of electronic mobility devices for persons who are visually impaired: a systematic review of effects
and effectiveness, J. Vis. Impair. Blind. 103, 743–753.
Rosenblum, L. D., Gordon, M. S. and Jarquin, L. (2000). Echolocating distance by moving and
stationary listeners, Ecol. Psychol. 12, 181–206.
Sadato, N., Pascual-Leone, A., Grafman, J., Ibañez, V., Deiber, M. P., Dold, G. and Hallett, M.
(1996). Activation of the primary visual cortex by braille reading in blind subjects, Nature
380, 526–528.
Salive, M. E., Guralnik, J., Glynn, R. J. and Christen, W. (1994). Association of visual impairment with mobility and physical function, J. Am. Geriatr. Soc. 42, 287–292.
Schenkman, B. N. and Nilsson, M. E. (2010). Human echolocation: blind and sighted persons’
ability to detect sounds recorded in the presence of a reflecting object, Perception 39, 483–
501.
Schnitzler, H. U., Moss, C. F. and Denzinger, A. (2003). From spatial orientation to food acquisition in echolocating bats, Trends Ecol. Evol. 18, 386–394.
Schörnich, S., Wallmeier, L., Gessele, N., Nagy, A., Schranner, M., Kish, D. and Wiegrebe, L.
(2013). Psychophysics of human echolocation, in: Basic Aspects of Hearing, B. C. J. Moore,
R. D. Patterson, I. M. Winter, R. P. Carlyon and H. E. Gockel (Eds), pp. 311–319. Springer,
New York, NY, USA.
Silk, T. J., Bellgrove, M. A., Wrafter, P., Mattingley, J. B. and Cunnington, R. (2010). Spatial
working memory and spatial attention rely on common neural processes in the intraparietal
sulcus, Neuroimage 53, 718–724.
Stoffregen, T. A. and Pittenger, J. B. (1995). Human echolocation as a basic form of perception
and action, Ecol. Psychol. 7, 181–216.
Supa, M., Cotzin, M. and Dallenbach, K. M. (1944). “Facial vision”: the perception of obstacles
by the blind, Am. J. Psychol. 57, 133–183.
Teng, S. and Whitney, D. (2011). The acuity of echolocation: spatial resolution in the sighted
compared to expert performance, J. Vis. Impair. Blind. 105, 20–32.
Teng, S., Puri, A. and Whitney, D. (2012). Ultrafine spatial acuity of blind expert human echolocators, Exp. Brain Res. 216, 483–488.
Thaler, L. (2013). Echolocation may have real-life advantages for blind people: an analysis of
survey data, Front. Physiol. 4, 98. DOI:10.3389/fphys.2013.00098.
Thaler, L., Arnott, S. R. and Goodale, M. A. (2011). Neural correlates of natural human echolocation in early and late blind echolocation experts, PLoS One 6, e20162.
DOI:10.1371/journal.pone.0020162.
Thaler, L., Milne, J. L., Arnott, S. R., Kish, D. and Goodale, M. A. (2014). Neural correlates
of motion processing through echolocation, source hearing, and vision in blind echolocation
experts and sighted echolocation novices, J. Neurophysiol. 111, 112–127.
Thomas, J. A., Moss, C. F. and Vater, M. (2004). Echolocation in Bats and Dolphins. University
of Chicago Press, Chicago, IL, USA.
Ungerleider, L. G. and Mishkin, M. (1982). Two cortical visual systems, in: Analysis of Visual
Behaviour, D. J. Ingle, M. A. Goodale and R. J. W. Mansfield (Eds), pp. 549–586. MIT Press,
Cambridge, MA, USA.
Vallar, G. and Perani, D. (1986). The anatomy of unilateral neglect after right-hemisphere stroke
lesions. A clinical/CT-scan correlation study in man, Neuropsychologia 24, 609–622.
Van Boven, R. W., Hamilton, R. H., Kauffman, T., Keenan, J. P. and Pascual-Leone, A. (2000).
Tactile spatial resolution in blind braille readers, Neurology 54, 2230–2236.
Vercillo, T., Milne, J. L., Gori, M. and Goodale, M. A. (2015). Enhanced auditory spatial localization in blind echolocators, Neuropsychologia 67, 35–40.
Voss, P., Lassonde, M., Gougoux, F., Fortin, M., Guillemot, J.-P. and Lepore, F. (2004). Early- and late-onset blind individuals show supra-normal auditory abilities in far-space, Curr. Biol.
14, 1734–1738.
Wallmeier, L., Geßele, N. and Wiegrebe, L. (2013). Echolocation versus echo suppression in
humans, Proc. R. Soc. B Biol. Sci. 280(1769), 20131428. DOI:10.1098/rspb.2013.1428.
Weissenbacher, P. and Wiegrebe, L. (2003). Classification of virtual objects in the echolocating
bat, Megaderma lyra, Behav. Neurosci. 117, 833–839.
Westwood, D. A., Danckert, J., Servos, P. and Goodale, M. A. (2002). Grasping two-dimensional images and three-dimensional objects in visual-form agnosia, Exp. Brain Res.
144, 262–267.
Wong, M., Gnanakumaran, V. and Goldreich, D. (2011). Tactile spatial acuity enhancement in
blindness: evidence for experience-dependent mechanisms, J. Neurosci. 31, 7028–7037.
Worchel, P. and Mauney, J. (1951). The effect of practice on the perception of obstacles by the
blind, J. Exp. Psychol. 41, 170–176.
Zwiers, M. P., van Opstal, A. J. and Cruysberg, J. R. (2001). A spatial hearing deficit in early-blind humans, J. Neurosci. 21, RC142: 1–5.